The convex global underestimator technique can be used if the search space roughly resembles 'mountains': many local maxima that tend to increase toward the global maximum. Its basic idea is to evaluate the function at some randomly chosen points and to construct a concave function (respectively a convex one for minimization problems) that comes close to these points but stays above them (respectively below them) and has a simple structure, so that its maximum (minimum) can easily be determined. This hypothetical global maximum is then used to shrink the region of the search space in which new points are iteratively calculated.
For example, a concave function g can be defined by

g(x) = c_0 + Σ_i c_i x_i − Σ_i d_i x_i²,  with d_i ≥ 0,

where the c_i and d_i are constants that are yet to be determined.
The optimization problem for the constants (so that g stays above, but as close as possible to, the randomly calculated solutions) is linear. Hence, the simplex algorithm described before can be used. Usually, the region around the maximum of g is chosen just large enough that the best solution calculated so far is still included. Of course, the algorithm works better if a local search routine has first been applied to the previously calculated set of solutions, for example a gradient-based technique or simulated annealing.
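As a sketch of how this fitting step could be set up in practice, the following assumes the separable concave form g(x) = c_0 + Σ c_i x_i − Σ d_i x_i² and uses scipy.optimize.linprog as the linear-program solver in place of a hand-rolled simplex routine; the function names are illustrative, not part of any standard library:

```python
import numpy as np
from scipy.optimize import linprog

def fit_concave_overestimator(X, f):
    """Fit g(x) = c0 + sum_i c_i*x_i - sum_i d_i*x_i**2 with d_i >= 0,
    requiring g(x_k) >= f(x_k) at every sample x_k while minimizing the
    total gap sum_k (g(x_k) - f(x_k)).  Since g is linear in the unknown
    parameters (c0, c, d), this is a linear program.

    X : (m, n) array of sample points, f : (m,) array of function values.
    """
    m, n = X.shape
    # Parameter vector p = [c0, c_1..c_n, d_1..d_n]; each row below maps
    # p to the value g(x_k), so rows @ p stacks all sampled g-values.
    rows = np.hstack([np.ones((m, 1)), X, -X**2])
    # Minimizing sum_k g(x_k) is equivalent to minimizing the total gap,
    # because sum_k f(x_k) is a constant.
    obj = rows.sum(axis=0)
    # Overestimation constraints g(x_k) >= f(x_k), written as -g <= -f.
    res = linprog(obj, A_ub=-rows, b_ub=-f,
                  bounds=[(None, None)] * (1 + n) + [(0, None)] * n,
                  method="highs")
    c0, c, d = res.x[0], res.x[1:1 + n], res.x[1 + n:]
    return c0, c, d

def argmax_of_g(c, d, eps=1e-12):
    """The separable concave quadratic peaks where dg/dx_i = 0,
    i.e. at x_i = c_i / (2 d_i)."""
    return c / (2 * np.maximum(d, eps))

# Usage: sample a one-dimensional function with its maximum at x = 2.
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0]])
f = -(X[:, 0] - 2.0) ** 2
c0, c, d = fit_concave_overestimator(X, f)
x_star = argmax_of_g(c, d)
```

The region for the next iteration would then be centered on x_star and sized so that the best sampled solution is still contained, as described above.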