Performing optimization is a common data science task. In this recipe, we implement a simple optimization routine using the Marquardt algorithm. For more information on this, see http://mathworld.wolfram.com/Levenberg-MarquardtMethod.html or https://en.wikipedia.org/wiki/Levenberg%E2%80%93Marquardt_algorithm. In the process of optimizing, we will also discover how Julia handles linear algebra and numerical differentiation.
The basic idea of the procedure is the following. Given a twice differentiable function, $f\colon \mathbb{R}^n \to \mathbb{R}$, and some point, $x_{\mathrm{old}}$, we want to find another point, $x_{\mathrm{new}}$, such that $f(x_{\mathrm{new}}) < f(x_{\mathrm{old}})$. By repeating this process, we want to reach a local minimum of $f$. The rule for finding $x_{\mathrm{new}}$ is as follows:

$$x_{\mathrm{new}} = x_{\mathrm{old}} - \bigl(H(x_{\mathrm{old}}) + \lambda \operatorname{diag}(H(x_{\mathrm{old}}))\bigr)^{-1} \nabla f(x_{\mathrm{old}}),$$

where $\nabla f$ is the gradient and $H$ is the Hessian matrix of $f$, and $\operatorname{diag}(H)$ denotes the diagonal matrix built from the diagonal entries of $H$.
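A single step of this rule translates directly into Julia. The sketch below is a minimal illustration; it uses the ForwardDiff.jl package (automatic rather than numerical differentiation, an assumption on our part) to obtain the gradient and Hessian, and the name `lm_step` is purely illustrative:

```julia
using ForwardDiff, LinearAlgebra

# One step of the rule: x_new = x_old - (H + λ diag(H))⁻¹ ∇f(x_old).
# ForwardDiff.jl is assumed here for the derivatives; any method that
# produces the gradient and Hessian of f would work equally well.
function lm_step(f, x_old, λ)
    g = ForwardDiff.gradient(f, x_old)  # ∇f(x_old)
    H = ForwardDiff.hessian(f, x_old)   # Hessian of f at x_old
    # Marquardt damping: add λ times the diagonal of the Hessian,
    # then solve the resulting linear system with the `\` operator
    x_old - (H + λ * Diagonal(diag(H))) \ g
end
```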
The idea is that we mix two standard algorithms: the Newton algorithm, where the step is $-H(x_{\mathrm{old}})^{-1} \nabla f(x_{\mathrm{old}})$, and the gradient descent algorithm, where the step is proportional to $-\nabla f(x_{\mathrm{old}})$; the parameter $\lambda$ controls the mix between the two. We accept $x_{\mathrm{new}}$ if it leads to a decrease in the value of $f$. The additional rule is that if we have found a better point, we decrease $\lambda$ (making the next step closer to a Newton step); otherwise, we increase it (making the next step closer to gradient descent).
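Putting the step and the $\lambda$ update rule together, a complete loop might look like the sketch below. The names `marquardt`, `maxiter`, and `tol`, and the factor of 10 used to rescale $\lambda$, are illustrative choices of ours rather than fixed parts of the algorithm:

```julia
# Iterate lm_step, adapting λ as described above: shrink it after an
# accepted step, grow it after a rejected one.
function marquardt(f, x; λ = 1.0, maxiter = 1000, tol = 1e-8)
    fx = f(x)
    for _ in 1:maxiter
        x_new = lm_step(f, x, λ)
        f_new = f(x_new)
        if f_new < fx                      # the step decreased f: accept it
            norm(x_new - x) < tol && return x_new
            x, fx = x_new, f_new
            λ /= 10                        # closer to a Newton step
        else                               # the step failed: reject it
            λ *= 10                        # closer to gradient descent
        end
    end
    return x
end
```

For example, `marquardt(x -> sum(abs2, x .- 1), zeros(3))` should return a vector close to `[1.0, 1.0, 1.0]`, since that is where this quadratic test function attains its minimum.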