In statistics, the method of estimating equations is a way of specifying how the parameters of a statistical model should be estimated. This can be thought of as a generalisation of many classical methods, such as the method of moments, least squares, and maximum likelihood, as well as of some more recent methods such as M-estimators.
The basis of the method is to have, or to find, a set of simultaneous equations involving both the sample data and the unknown model parameters, which are solved to define the estimates of the parameters.[1] The components of these equations are defined in terms of the set of observed data on which the estimates are to be based.
Important examples of estimating equations are the likelihood equations.
Consider the problem of estimating the rate parameter, λ, of the exponential distribution, which has the probability density function:

f(x; λ) = λe^(−λx),  x ≥ 0, λ > 0.
Suppose that a sample of data is available from which either the sample mean, x̄, or the sample median, m, can be calculated. Then an estimating equation based on the mean is

x̄ = λ⁻¹,
while the estimating equation based on the median is

m = λ⁻¹ ln 2.
Each of these equations is derived by equating a sample value (sample statistic) to a theoretical (population) value. In each case the sample statistic is a consistent estimator of the population value, and this provides an intuitive justification for this type of approach to estimation.
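As a concrete sketch of the approach above (the simulated sample and the true rate λ = 2 are illustrative assumptions, not from the text), the two estimating equations can each be solved in closed form for λ once the corresponding sample statistic is computed:

```python
import math
import random
import statistics

# Hypothetical example: draw a sample from an exponential distribution
# with a known true rate, then recover the rate from each estimating
# equation. The true rate lambda = 2.0 is an assumption for illustration.
random.seed(0)
sample = [random.expovariate(2.0) for _ in range(10_000)]

# Mean-based estimating equation: sample mean = 1/lambda,
# so lambda-hat = 1 / sample mean.
lam_from_mean = 1.0 / statistics.fmean(sample)

# Median-based estimating equation: sample median = ln(2)/lambda,
# so lambda-hat = ln(2) / sample median.
lam_from_median = math.log(2) / statistics.median(sample)

print(f"mean-based estimate:   {lam_from_mean:.3f}")
print(f"median-based estimate: {lam_from_median:.3f}")
```

Both estimates converge on the true rate as the sample grows, consistent with the statistics being consistent estimators of their population values; in small samples the two equations generally give different (though similar) estimates.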