In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of a statistical model from observations, by finding the parameter values that maximize the likelihood of the observed data under the model. MLE can be seen as a special case of maximum a posteriori (MAP) estimation that assumes a uniform prior distribution over the parameters, or as a variant of MAP estimation that ignores the prior and is therefore unregularized.

The maximum likelihood estimates are the parameter values that maximize the likelihood that the process described by the model produced the data that were actually observed. The likelihood function L(μ, σ; data) can be interpreted as a measure of how plausible it is that the parameters μ and σ take particular values, given the data observed.
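As a concrete illustration, consider fitting a normal distribution N(μ, σ²) by maximum likelihood. The sketch below (using NumPy, with a small made-up data array for illustration) writes out the log-likelihood and evaluates it at the closed-form maximizers, which for the normal distribution are the sample mean and the (biased, ddof = 0) sample standard deviation:

```python
import numpy as np

def normal_log_likelihood(mu, sigma, data):
    """Log-likelihood of N(mu, sigma^2) parameters given observed data."""
    n = len(data)
    return (-n / 2 * np.log(2 * np.pi * sigma**2)
            - np.sum((data - mu) ** 2) / (2 * sigma**2))

# Illustrative observations (made up for this example).
data = np.array([2.1, 1.9, 2.5, 2.3, 1.8])

# For the normal distribution the MLE has a closed form:
# mu_hat is the sample mean; sigma_hat uses ddof=0 (the MLE,
# not the unbiased ddof=1 estimator).
mu_hat = data.mean()
sigma_hat = data.std()

print(mu_hat, sigma_hat)
print(normal_log_likelihood(mu_hat, sigma_hat, data))
```

Perturbing either estimate away from its closed-form value can only lower the log-likelihood, which is what "maximum likelihood" means in practice; for models without closed-form solutions, the same log-likelihood would be maximized numerically.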

MLE is asymptotically efficient: under standard regularity conditions, and provided the model is correctly specified, no consistent estimator has lower asymptotic mean squared error than the maximum likelihood estimator.
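The efficiency claim can be made precise via the Cramér–Rao bound, a standard result stating that any unbiased estimator $\hat{\theta}$ of a parameter $\theta$ satisfies

$$\operatorname{Var}(\hat{\theta}) \geq \frac{1}{I(\theta)}, \qquad I(\theta) = -\operatorname{E}\!\left[\frac{\partial^2}{\partial \theta^2} \log f(x;\theta)\right],$$

where $I(\theta)$ is the Fisher information. Under the regularity conditions mentioned above, the MLE attains this bound asymptotically, which is the sense in which no consistent estimator can do better.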