Convex Global Underestimator

The convex global underestimator technique can be used if the search space roughly resembles 'mountains': many local maxima whose values tend to increase in the direction of the global maximum. Its basic idea is to evaluate the function at randomly chosen points and to fit a concave function (a convex one for minimization problems) that comes close to these points but stays above them (respectively below them for minimization) and has a simple structure, so that its maximum (minimum) can easily be determined. This hypothetical global maximum is then used to reduce the region of the search space in which new points are calculated iteratively.

For example, a concave function g can be defined by

$$g(x) = c_0 + \sum_i \left(c_i x_i - d_i x_i^2\right), \qquad d_i \ge 0,$$

where the constants $c_0$, $c_i$, and $d_i$ are yet to be found.
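Because g is separable and concave in each coordinate, its maximum has a closed form: setting the derivative of each term to zero gives $x_i = c_i / (2 d_i)$. A minimal sketch of this step, assuming already-fitted coefficient arrays c and d (the function names are illustrative, not taken from the source):

```python
import numpy as np

def g(x, c0, c, d):
    """Separable concave quadratic g(x) = c0 + sum_i (c_i * x_i - d_i * x_i**2)."""
    x = np.asarray(x, dtype=float)
    return c0 + np.sum(c * x - d * x**2)

def argmax_g(c, d, eps=1e-12):
    """Closed-form maximizer: dg/dx_i = c_i - 2*d_i*x_i = 0  =>  x_i = c_i / (2*d_i)."""
    c = np.asarray(c, dtype=float)
    d = np.asarray(d, dtype=float)
    return c / (2.0 * np.maximum(d, eps))  # guard against d_i = 0
```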

The optimization problem for the constants (so that g stays above, but as close as possible to, the randomly calculated solutions) is linear. Hence, the simplex algorithm described before can be used. Usually, the region around the maximum of g is chosen just large enough that the best solution calculated so far is still included. Of course, the algorithm works better if a local search routine, for example a gradient-based technique or simulated annealing, has first been applied to the previously calculated set of solutions.
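A compact sketch of the fitting step follows, assuming the form of g given above. It uses scipy.optimize.linprog as a stand-in for the simplex algorithm mentioned in the text (the solver choice, function names, and the test function in the example are illustrative assumptions, not prescribed by the source). It minimizes the total gap between g and the sampled values subject to g lying above every sample:

```python
import numpy as np
from scipy.optimize import linprog

def fit_concave_underestimator(points, values):
    """Fit g(x) = c0 + sum_i (c_i*x_i - d_i*x_i**2), d_i >= 0, so that
    g(x_k) >= f(x_k) at every sample x_k while the total slack
    sum_k (g(x_k) - f(x_k)) is minimized -- a linear program in
    the unknowns z = [c0, c_1..c_n, d_1..d_n]."""
    X = np.asarray(points, dtype=float)          # shape (m, n)
    f = np.asarray(values, dtype=float)          # shape (m,)
    m, n = X.shape

    # Row k of A expresses g(x_k) as a linear function of z.
    A = np.hstack([np.ones((m, 1)), X, -X**2])   # shape (m, 1 + 2n)

    # Minimizing sum_k g(x_k) is equivalent to minimizing the total slack,
    # since sum_k f(x_k) is a constant.
    objective = A.sum(axis=0)

    # g(x_k) >= f(x_k)  <=>  -A @ z <= -f ; only the d_i are sign-constrained.
    result = linprog(
        c=objective,
        A_ub=-A,
        b_ub=-f,
        bounds=[(None, None)] * (1 + n) + [(0, None)] * n,
        method="highs",
    )
    z = result.x
    return z[0], z[1:1 + n], z[1 + n:]           # c0, c, d

# Example: sample a bumpy 2-D function, fit g, and centre the reduced
# search region on the maximizer of g.
rng = np.random.default_rng(0)
def f(x):
    return -np.sum((x - 1.0)**2, axis=-1) + 0.3 * np.sin(5.0 * x).sum(axis=-1)

samples = rng.uniform(-3.0, 3.0, size=(40, 2))
c0, c, d = fit_concave_underestimator(samples, f(samples))
x_star = c / (2.0 * np.maximum(d, 1e-12))        # closed-form maximizer of g
```

The search box would then be shrunk around x_star just enough that the best sampled solution is still included, as described above.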
