A least-squares optimization technique used in the generalized linear inversion method. The system is described as d = g(m), where d is the data vector and m the model vector, and the method is iterative. Starting from an initial model m0 (which may be chosen arbitrarily), it calculates the Jacobian matrix and the Hessian (curvature) matrix to step toward the minimum of the misfit function. The best model can be identified in several ways, for example when the misfit function, iteration after iteration, falls below a bound value. See Lines and Treitel (1984).
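As a minimal sketch of such an iteration (not the exact formulation of Lines and Treitel), a Gauss-Newton update for d = g(m) replaces the full Hessian with the approximation J^T J built from the Jacobian J; the function names and the linear test problem below are illustrative assumptions:

```python
import numpy as np

def gauss_newton(g, jacobian, d, m0, tol=1e-8, max_iter=50):
    """Iteratively update the model m so that g(m) fits the data d in
    the least-squares sense, stopping once the misfit drops below tol."""
    m = np.asarray(m0, dtype=float)
    for _ in range(max_iter):
        r = d - g(m)                      # residual (data misfit)
        misfit = float(r @ r)             # sum of squared residuals
        if misfit < tol:                  # bound-value stopping criterion
            break
        J = jacobian(m)                   # Jacobian of g at the current m
        # Normal equations: (J^T J) dm = J^T r, with J^T J as the
        # Gauss-Newton approximation to the Hessian of the misfit.
        dm = np.linalg.solve(J.T @ J, J.T @ r)
        m = m + dm
    return m

# Illustrative forward model: g(m) = m[0] + m[1] * x at sample points x
x = np.array([0.0, 1.0, 2.0, 3.0])
g = lambda m: m[0] + m[1] * x
jac = lambda m: np.column_stack([np.ones_like(x), x])
d = 2.0 + 0.5 * x                         # synthetic data from m = (2, 0.5)
m_best = gauss_newton(g, jac, d, m0=[0.0, 0.0])
```

Because this test problem is linear, the first Gauss-Newton step already solves the normal equations exactly and recovers m = (2, 0.5); for a nonlinear g(m) the loop would refine the model over several iterations.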