Wednesday 13 August 2008

Analytic Optimizers

Analysis (as in “real analysis” or “complex analysis”) is an extension of classical college calculus. Analytic optimizers involve the well-developed machinery of analysis, specifically differential calculus and the study of analytic functions, in the solution of practical problems.


In some instances, analytic methods can yield a direct (noniterative) solution to an optimization problem.

This happens to be the case for multiple regression, where solutions can be obtained with a few matrix calculations. In multiple regression, the goal is to find a set of regression weights that minimize the sum of the squared prediction errors. In other cases, iterative techniques must be used.
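By way of illustration, here is a minimal sketch in NumPy (the data and coefficients are made up, not from the text above) of that direct solution: the regression weights that minimize the sum of squared prediction errors drop out of the normal equations with a few matrix calculations, no iteration required.

```python
import numpy as np

# Hypothetical data: 100 observations, 3 predictor variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

# Add an intercept column, then solve the normal equations
# (X'X) w = X'y directly -- a few matrix calculations, no iteration.
X1 = np.column_stack([np.ones(len(X)), X])
weights = np.linalg.solve(X1.T @ X1, X1.T @ y)
print(weights)  # intercept followed by the three regression weights
```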


The connection weights in a neural network, for example, cannot be directly determined. They must be estimated using an iterative procedure, such as back-propagation.
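To make the contrast concrete, here is a toy sketch (mine, not from the text; the network size, learning rate, and XOR task are arbitrary choices) of back-propagation: the weights are adjusted repeatedly by small gradient-descent steps rather than solved for in one shot.

```python
import numpy as np

# Toy 2-4-1 network learning XOR by back-propagation.
rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(scale=0.5, size=(2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(scale=0.5, size=(4, 1)), np.zeros(1)   # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for step in range(20000):
    # Forward pass through the network.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # Backward pass: error signals propagated from output back to hidden layer.
    d_out = (y - t) * y * (1 - y)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    # Iterative gradient-descent updates of weights and biases.
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid;  b1 -= lr * d_hid.sum(axis=0)

print(np.round(y, 2))  # typically approaches [0, 1, 1, 0]
```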


Many iterative techniques used to solve multivariate optimization problems (those involving several variables or parameters) employ some variation on the theme of steepest ascent. In its most basic form, optimization by steepest ascent works as follows: A point in the domain of the fitness function (that is, a set of parameter values) is chosen by some means. The gradient vector at that point is evaluated by computing the derivatives of the fitness function with respect to each of the variables or parameters; this defines the direction in n-dimensional parameter space for which a fixed amount of movement will produce the greatest increase in fitness.
A small step is taken up the hill in fitness space, along the direction of the gradient.
The gradient is then recomputed at this new point, and another, perhaps smaller, step is taken. The process is repeated until convergence occurs.
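The procedure just described can be sketched in a few lines. The version below is only an illustration under my own assumptions (a finite-difference gradient estimate, a step that is halved whenever it fails to improve fitness, and a made-up concave fitness function); it is not code from the text.

```python
import numpy as np

def steepest_ascent(fitness, x0, step=0.5, tol=1e-8, max_iter=10000):
    """Basic steepest ascent with a central-difference gradient estimate."""
    x = np.asarray(x0, dtype=float)
    eps = 1e-6
    for _ in range(max_iter):
        # Estimate the gradient: derivative of fitness w.r.t. each parameter.
        grad = np.array([
            (fitness(x + eps * e) - fitness(x - eps * e)) / (2 * eps)
            for e in np.eye(len(x))
        ])
        if np.linalg.norm(grad) < tol:
            break
        # Take a small step uphill along the gradient direction.
        candidate = x + step * grad / np.linalg.norm(grad)
        if fitness(candidate) > fitness(x):
            x = candidate          # accept the step and recompute the gradient
        else:
            step *= 0.5            # take a smaller step if fitness did not improve
        if step < tol:             # convergence: steps have become negligible
            break
    return x

# Example: maximize a concave "fitness" whose peak sits at (1, -2).
peak = steepest_ascent(lambda p: -(p[0] - 1) ** 2 - (p[1] + 2) ** 2, [0.0, 0.0])
print(np.round(peak, 4))  # approximately [ 1. -2.]
```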
