Overfitting (also called over-learning) occurs when a statistical analysis fits a particular set of data too closely, or even exactly. Such an analysis may not carry over to additional data and can fail to predict future observations reliably. An overfitted model is a statistical model that contains more parameters than the data can justify.
The same problem of overfitting exists in machine learning. It is usually caused by a poor choice of the size of the structure used to perform the classification: because its capacity to capture information exceeds what the data require, a structure in an overfitting regime struggles to generalize the characteristics of the data. It then behaves like a lookup table containing all of the samples used during training, and loses its predictive power on new samples.
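The lookup-table behaviour described above can be illustrated with a minimal sketch (an assumed example, not from the original text): a high-degree polynomial fitted to a handful of noisy samples of a simple function. With as many free parameters as data points, the model memorizes the training set almost exactly, yet its error on fresh points from the same function is worse than that of a lower-capacity model.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small, noisy training set drawn from an underlying function y = sin(x)
x_train = np.linspace(0.0, 3.0, 10)
y_train = np.sin(x_train) + rng.normal(0.0, 0.2, size=x_train.shape)

# A denser, noise-free test set from the same underlying function
x_test = np.linspace(0.0, 3.0, 50)
y_test = np.sin(x_test)

def fit_and_errors(degree):
    """Fit a polynomial of the given degree and return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

# Degree 2: limited capacity; degree 9: one parameter per training point,
# so the fit can interpolate the noisy samples like a lookup table.
for degree in (2, 9):
    train_err, test_err = fit_and_errors(degree)
    print(f"degree {degree}: train MSE {train_err:.4f}, test MSE {test_err:.4f}")
```

The degree-9 fit drives the training error toward zero by also fitting the noise, which is exactly the loss of prediction power on new samples that the paragraph describes.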