Mathematical optimization involves maximizing or minimizing a function through clever choices of the input variables and analysis of the output. The algorithms are mostly iterative, since the underlying function usually cannot be analyzed in closed form.

Below is a comparison of the optimization algorithms we use and their general use cases.
Gradient Ascent uses the first-order derivative to choose the next point, moving in the direction in which the function increases (or, for minimization, decreases) fastest. As it passes the maximum, the step size decreases. Once the gradient is close enough to zero or the step size is small enough, the algorithm exits. A sketch of this loop follows the pros and cons below.
Pros:

- Simple to implement and cheap per iteration.
- Converges quickly on smooth, single-peaked functions.

Cons:

- Only finds a local maximum, so the result depends on the starting point.
- Needs a gradient, which must be estimated numerically when the function has no analytic derivative.
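To make the loop concrete, here is a minimal one-dimensional sketch in C++. It is not the team's actual implementation; the objective function, the central-difference derivative helper, and all tolerances are illustrative:

```cpp
#include <cmath>
#include <cstdio>
#include <functional>

// Estimate f'(x) numerically with a central difference.
double derivative(const std::function<double(double)>& f, double x, double h = 1e-6) {
    return (f(x + h) - f(x - h)) / (2 * h);
}

// Climb until the gradient is near zero or the step size is tiny.
double gradientAscent1D(const std::function<double(double)>& f, double x, double step = 0.5) {
    double prevGrad = derivative(f, x);
    while (std::fabs(prevGrad) > 1e-8 && step > 1e-10) {
        x += step * (prevGrad > 0 ? 1 : -1);  // move toward increasing f
        double grad = derivative(f, x);
        // Gradient changed sign: we passed the maximum, so shrink the step.
        if (grad * prevGrad < 0) step *= 0.5;
        prevGrad = grad;
    }
    return x;
}

int main() {
    // Maximize f(x) = -(x - 2)^2 + 3; the peak is at x = 2.
    double peak = gradientAscent1D([](double x) { return -(x - 2) * (x - 2) + 3; }, 0.0);
    std::printf("max near x = %f\n", peak);
}
```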
Parallel Gradient Ascent runs multiple independent Gradient Ascents and operates on them as a set. If two ascents' inputs come near each other, the two are merged into one. A sketch follows the pros and cons below.
Pros:

- More likely to find the global maximum of a multi-peaked function, since each ascent can climb a different hill.
- The independent ascents parallelize naturally.

Cons:

- Costs many more function evaluations than a single ascent.
- Still offers no guarantee of finding the global maximum if no seed starts near it.
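Continuing the sketch above, a hypothetical parallel version seeds several ascents across the search range and merges any that converge to the same point. The seeds, the objective, and the merge tolerance are all illustrative:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <functional>
#include <vector>

// Same helpers as the previous sketch.
double derivative(const std::function<double(double)>& f, double x, double h = 1e-6) {
    return (f(x + h) - f(x - h)) / (2 * h);
}

double gradientAscent1D(const std::function<double(double)>& f, double x, double step = 0.5) {
    double prevGrad = derivative(f, x);
    while (std::fabs(prevGrad) > 1e-8 && step > 1e-10) {
        x += step * (prevGrad > 0 ? 1 : -1);
        double grad = derivative(f, x);
        if (grad * prevGrad < 0) step *= 0.5;  // passed a peak: shrink the step
        prevGrad = grad;
    }
    return x;
}

int main() {
    // Multi-peaked objective: local maxima near multiples of 2*pi, each slightly
    // higher than the last because of the 0.02x term.
    auto f = [](double x) { return std::cos(x) + 0.02 * x; };

    // Seed several independent ascents across the search range.
    std::vector<double> results;
    for (double seed = -6.0; seed <= 6.0; seed += 3.0)
        results.push_back(gradientAscent1D(f, seed));

    // Merge ascents that converged to (nearly) the same input.
    std::sort(results.begin(), results.end());
    results.erase(std::unique(results.begin(), results.end(),
                              [](double a, double b) { return std::fabs(a - b) < 1e-3; }),
                  results.end());

    // Report the best surviving peak.
    double best = *std::max_element(results.begin(), results.end(),
                                    [&](double a, double b) { return f(a) < f(b); });
    std::printf("global max near x = %f, f = %f\n", best, f(best));
}
```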
Nelder-Mead uses a simplex that "flips" its way up the hill, shrinking and expanding when necessary. The simplex has N+1 vertices in N dimensions; in the two-dimensional case, for example, the simplex is a triangle. A sketch follows the pros and cons below.
Pros:

- Derivative-free: it only needs function evaluations, so it works on functions with no usable gradient.
- Generalizes directly to N dimensions.

Cons:

- Can stall or converge to a non-optimal point on difficult functions.
- Convergence slows as the number of dimensions grows.
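Here is a minimal two-dimensional sketch of the simplex loop, using the textbook reflection, expansion, contraction, and shrink coefficients (1, 2, 0.5, 0.5). The objective, starting triangle, and iteration cap are illustrative, not the project's implementation:

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdio>
#include <functional>

using Vec2 = std::array<double, 2>;

Vec2 add(Vec2 a, Vec2 b) { return {a[0] + b[0], a[1] + b[1]}; }
Vec2 sub(Vec2 a, Vec2 b) { return {a[0] - b[0], a[1] - b[1]}; }
Vec2 mul(Vec2 a, double s) { return {a[0] * s, a[1] * s}; }

// Maximize f over R^2 with a Nelder-Mead simplex (a triangle in 2D).
Vec2 nelderMead2D(const std::function<double(Vec2)>& f, std::array<Vec2, 3> simplex) {
    auto byValue = [&](Vec2 a, Vec2 b) { return f(a) > f(b); };
    for (int iter = 0; iter < 500; ++iter) {
        // Sort so simplex[0] is the best vertex and simplex[2] the worst.
        std::sort(simplex.begin(), simplex.end(), byValue);
        Vec2 best = simplex[0], worst = simplex[2];

        // Stop when the simplex has collapsed to a point.
        if (std::hypot(best[0] - worst[0], best[1] - worst[1]) < 1e-9) break;

        // Centroid of all vertices except the worst.
        Vec2 c = mul(add(simplex[0], simplex[1]), 0.5);

        // Reflect the worst vertex through the centroid (the "flip" up the hill).
        Vec2 xr = add(c, sub(c, worst));
        if (f(xr) > f(best)) {
            // Great direction: try expanding twice as far.
            Vec2 xe = add(c, mul(sub(xr, c), 2.0));
            simplex[2] = f(xe) > f(xr) ? xe : xr;
        } else if (f(xr) > f(simplex[1])) {
            simplex[2] = xr;  // accept the plain reflection
        } else {
            // Poor direction: contract the worst vertex toward the centroid.
            Vec2 xc = add(c, mul(sub(worst, c), 0.5));
            if (f(xc) > f(worst)) {
                simplex[2] = xc;
            } else {
                // Even contraction failed: shrink everything toward the best vertex.
                for (int i = 1; i < 3; ++i)
                    simplex[i] = add(best, mul(sub(simplex[i], best), 0.5));
            }
        }
    }
    std::sort(simplex.begin(), simplex.end(), byValue);
    return simplex[0];
}

int main() {
    // Maximize f(x, y) = -(x - 1)^2 - (y + 2)^2; the peak is at (1, -2).
    auto f = [](Vec2 p) { return -(p[0] - 1) * (p[0] - 1) - (p[1] + 2) * (p[1] + 2); };
    std::array<Vec2, 3> start = {{{0, 0}, {1, 0}, {0, 1}}};
    Vec2 best = nelderMead2D(f, start);
    std::printf("max near (%f, %f)\n", best[0], best[1]);
}
```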