Monday, April 28, 2008

Iterative parameter finding

I should be getting ready for my viva, but instead, I reread this cool bit on how to find the best parameters for a model.

You have a model with parameters $\theta$, and you want to fit the parameters to the data $x$. For the maximum log-likelihood, you would find the zero of the derivative, $\partial \ln p(x \mid \theta) / \partial \theta = 0$. But suppose that solving this equation analytically is nontrivial. Then you can use the Newton-Raphson method for iterative parameter finding.

Using the multivariate Taylor series, we can expand the derivative around an initial guess $\theta_0$ for the best parameters:

$$\left.\frac{\partial \ln p(x \mid \theta)}{\partial \theta}\right|_{\theta_1} \approx \left.\frac{\partial \ln p(x \mid \theta)}{\partial \theta}\right|_{\theta_0} + \left.\frac{\partial^2 \ln p(x \mid \theta)}{\partial \theta \, \partial \theta^T}\right|_{\theta_0} (\theta_1 - \theta_0) + \text{small terms}$$

Setting the left-hand side to zero and solving for $\theta_1$ gives the update

$$\theta_1 = \theta_0 - \left[\left.\frac{\partial^2 \ln p(x \mid \theta)}{\partial \theta \, \partial \theta^T}\right|_{\theta_0}\right]^{-1} \left.\frac{\partial \ln p(x \mid \theta)}{\partial \theta}\right|_{\theta_0}$$

which is applied repeatedly until the parameters stop changing.
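As a quick illustration (my own sketch, not from the post), here is the update in code for a scalar parameter, assuming you supply the gradient and second derivative of the log-likelihood yourself. The Poisson example at the bottom is just a sanity check, since its maximum-likelihood answer is known to be the sample mean:

```python
import numpy as np

def newton_raphson(grad, hess, theta0, tol=1e-8, max_iter=100):
    """Find a zero of grad (the score function) starting from theta0."""
    theta = theta0
    for _ in range(max_iter):
        step = grad(theta) / hess(theta)   # scalar case; use a linear solve for vector parameters
        theta = theta - step
        if abs(step) < tol:
            break
    return theta

# Sanity check: MLE of a Poisson rate (analytically the sample mean).
data = np.array([2, 4, 3, 5, 1])
grad = lambda lam: data.sum() / lam - len(data)   # d/dlam of the log-likelihood
hess = lambda lam: -data.sum() / lam**2           # second derivative
print(newton_raphson(grad, hess, theta0=1.0))     # converges to data.mean() = 3.0
```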

This iterative procedure is nice because the functions involved often have simple forms (e.g. in logistic regression, the gradient is a combination of the inputs weighted by the classification errors), and it gives a solution in cases that are not analytically tractable.
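To make the logistic regression case concrete (again my own sketch, not from the post): the gradient of the log-likelihood is $X^T(y - p)$, the inputs weighted by the prediction errors, and the Hessian is $-X^T W X$ with $W = \mathrm{diag}(p(1-p))$, so the Newton-Raphson loop is only a few lines:

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Newton-Raphson (a.k.a. IRLS) for logistic regression coefficients."""
    theta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ theta))   # predicted probabilities
        grad = X.T @ (y - p)                   # inputs weighted by the errors
        W = p * (1 - p)
        hess = -(X.T * W) @ X                  # Hessian of the log-likelihood
        theta = theta - np.linalg.solve(hess, grad)
    return theta

# Toy data: intercept plus one feature, true coefficients [0.5, 2.0].
rng = np.random.default_rng(0)
x = rng.normal(size=200)
X = np.column_stack([np.ones_like(x), x])
y = (rng.random(200) < 1 / (1 + np.exp(-(0.5 + 2.0 * x)))).astype(float)
print(fit_logistic(X, y))   # roughly recovers [0.5, 2.0]
```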
