For differentiable objective functions, the optimization task is often comparatively easy. The main idea is to find the points where f'(v) = 0; provided V is closed, these points include all extremal values. They can either be computed directly or approximated by Newton's method, which, given an approximation of a root of f' (that is, a point close to some v where f'(v) = 0), computes the root of the tangent at that point and iteratively uses it as the next approximation.
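The iteration above can be sketched in a few lines. This is a minimal illustration, not a robust implementation: it assumes a twice-differentiable f, so Newton's method applied to f' uses the update x - f'(x)/f''(x); the example function f(v) = (v - 3)^2 + 1 is chosen here purely for demonstration.

```python
def newton_root(g, g_prime, x0, tol=1e-10, max_iter=100):
    """Approximate a root of g by Newton's method: x <- x - g(x)/g'(x)."""
    x = x0
    for _ in range(max_iter):
        step = g(x) / g_prime(x)  # root of the tangent at x, relative to x
        x -= step
        if abs(step) < tol:       # stop once the update is negligible
            break
    return x

# Example: minimize f(v) = (v - 3)^2 + 1 by finding the root of
# f'(v) = 2(v - 3), using f''(v) = 2 as the derivative of f'.
f_prime = lambda v: 2.0 * (v - 3.0)
f_second = lambda v: 2.0
v_star = newton_root(f_prime, f_second, x0=0.0)  # converges to 3.0
```

Because the example objective is quadratic, its derivative is linear and Newton's method lands on the exact root in a single step; for general functions it converges quadratically near a simple root.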
Other algorithms for differentiable problems exploit the fact that the gradient of the objective function always points in the direction of steepest ascent. The same idea can be applied to discrete search spaces by examining the slopes of all secants through a point instead of the tangent.
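A gradient-based scheme in the continuous case can be sketched as follows. This is a bare-bones illustration under assumed settings (a fixed step size and a hand-coded gradient for the sample objective f(v) = (v - 3)^2 + 1); it descends, i.e. moves against the gradient, to find a minimum.

```python
def gradient_descent(grad, v0, lr=0.1, steps=200):
    """Repeatedly step against the gradient, the direction of steepest descent."""
    v = v0
    for _ in range(steps):
        v = v - lr * grad(v)  # move opposite to the direction of highest slope
    return v

# Gradient of the sample objective f(v) = (v - 3)^2 + 1.
grad_f = lambda v: 2.0 * (v - 3.0)
v_min = gradient_descent(grad_f, v0=0.0)  # approaches 3.0
```

In the discrete analogue mentioned above, the gradient is unavailable, so one would instead evaluate the secant slope (f(v') - f(v)) / d(v', v) toward each neighbour v' and move to the neighbour with the steepest improving slope.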