1. An iterative method of approaching a minimum by taking an increment along the steepest (negative) gradient to arrive at the next approximation, the step length often being proportional to the magnitude of the gradient. Provision can be made to speed up convergence onto the minimum and to prevent oscillation about it. The function is assumed to be continuous and, if it has more than one minimum, the initial estimate is assumed to be close enough to the correct one. Sometimes called steepest ascent when used to approach a maximum. See Lines and Treitel (1984). 2. A method of computing the asymptotic behavior of an integral, also called the "saddle-point method"; see Morse and Feshbach (1967: 437).
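The iteration in sense 1 can be sketched as follows; this is a minimal illustration, in which the test function, step-size constant, and stopping tolerance are assumptions for the example, not taken from the source:

```python
# Sketch of steepest (gradient) descent on an illustrative function
# f(x, y) = (x - 1)^2 + 4*(y + 2)^2, whose minimum is at (1, -2).

def grad_f(x, y):
    # Analytic gradient of the example function.
    return 2.0 * (x - 1.0), 8.0 * (y + 2.0)

def steepest_descent(x, y, step=0.1, tol=1e-8, max_iter=10_000):
    for _ in range(max_iter):
        gx, gy = grad_f(x, y)
        # Increment along the negative gradient; the step length is
        # proportional to the gradient magnitude via the factor `step`.
        x, y = x - step * gx, y - step * gy
        # Stop once the gradient magnitude is below the tolerance.
        if gx * gx + gy * gy < tol * tol:
            break
    return x, y

x_min, y_min = steepest_descent(0.0, 0.0)
# The iterates approach the true minimum at (1, -2).
```

A fixed proportionality constant (`step` above) can cause oscillation about the minimum if it is too large; the "provision" the entry mentions is typically a line search or a decaying step size.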