Non-Linear Least Squares (NLS)
While linear least squares provides a closed-form solution for fitting linear models, many real-world problems involve non-linear models. A common example is fitting a Gaussian function to a set of observations.
Given a set of $m$ observations $(x_i, y_i)$, we seek the parameters $\mathbf{p} = (a, b, c)$ of the model $f(x; \mathbf{p}) = a \exp\!\left(-\frac{(x - b)^2}{2c^2}\right)$ that minimize the sum of squared residuals $\sum_{i=1}^{m} \left(y_i - f(x_i; \mathbf{p})\right)^2$.
Since this model is non-linear in its parameters, no closed-form solution exists. Instead, we must use an iterative optimization algorithm such as the Gauss-Newton method.
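As a minimal sketch of the setup, assuming the Gaussian parameterization $f(x; a, b, c) = a \exp(-(x-b)^2 / 2c^2)$ and NumPy (the function names and the synthetic data here are illustrative, not from the original):

```python
import numpy as np

def gaussian(x, a, b, c):
    """Gaussian model f(x; a, b, c) = a * exp(-(x - b)^2 / (2 c^2))."""
    return a * np.exp(-(x - b) ** 2 / (2.0 * c ** 2))

def residuals(params, x, y):
    """Residuals r_i = y_i - f(x_i; p), whose squared sum we minimize."""
    a, b, c = params
    return y - gaussian(x, a, b, c)

# Synthetic, noiseless observations from a known Gaussian for illustration:
x = np.linspace(-3.0, 3.0, 50)
y = gaussian(x, 2.0, 0.5, 1.0)

# At the true parameters, every residual is zero.
r = residuals(np.array([2.0, 0.5, 1.0]), x, y)
```

The residual vector is the quantity the iterative methods below repeatedly re-linearize and drive toward zero.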
The Gauss-Newton Algorithm
The core idea is to start with an initial guess for the parameters and iteratively refine it by solving a sequence of linear least-squares problems.
1. Start with an initial estimate for the parameters, $\mathbf{p}_0$.
2. At iteration $k$, linearize the non-linear function $f(x; \mathbf{p})$ around the current estimate $\mathbf{p}_k$ using a first-order Taylor series expansion:
   $$f(x_i; \mathbf{p}_k + \Delta\mathbf{p}) \approx f(x_i; \mathbf{p}_k) + J_i \, \Delta\mathbf{p},$$
   where $J_i = \left.\frac{\partial f(x_i; \mathbf{p})}{\partial \mathbf{p}}\right|_{\mathbf{p}_k}$ is the $i$-th row of the Jacobian matrix $J$.
3. We can now write the error (or residual) for each observation $i$ as:
   $$r_i = y_i - f(x_i; \mathbf{p}_k).$$
   This is a linear system of the form $J \, \Delta\mathbf{p} \approx \mathbf{r}$.
4. We solve this linear least-squares problem for $\Delta\mathbf{p}$ using the normal equations:
   $$\Delta\mathbf{p} = (J^T J)^{-1} J^T \mathbf{r}.$$
5. Update the parameter estimate for the next iteration:
   $$\mathbf{p}_{k+1} = \mathbf{p}_k + \Delta\mathbf{p}.$$
6. Repeat steps 2-5, linearizing around the new estimate $\mathbf{p}_{k+1}$, until the change in parameters is below a small threshold $\epsilon$, or a maximum number of iterations is reached.
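The steps above can be sketched as follows, again assuming the Gaussian model with parameters $(a, b, c)$ and using an analytic Jacobian (the helper names and initial guess are illustrative assumptions, not from the original):

```python
import numpy as np

def gaussian(x, p):
    a, b, c = p
    return a * np.exp(-(x - b) ** 2 / (2.0 * c ** 2))

def jacobian(x, p):
    """Jacobian J with J[i, j] = d f(x_i; p) / d p_j, p = (a, b, c)."""
    a, b, c = p
    e = np.exp(-(x - b) ** 2 / (2.0 * c ** 2))
    J = np.empty((x.size, 3))
    J[:, 0] = e                              # df/da
    J[:, 1] = a * e * (x - b) / c ** 2       # df/db
    J[:, 2] = a * e * (x - b) ** 2 / c ** 3  # df/dc
    return J

def gauss_newton(x, y, p0, tol=1e-8, max_iter=50):
    p = np.asarray(p0, dtype=float)
    for _ in range(max_iter):
        r = y - gaussian(x, p)                 # step 3: residuals r_i
        J = jacobian(x, p)                     # step 2: linearization
        # Step 4: solve the normal equations (J^T J) dp = J^T r.
        dp = np.linalg.solve(J.T @ J, J.T @ r)
        p = p + dp                             # step 5: p_{k+1} = p_k + dp
        if np.linalg.norm(dp) < tol:           # step 6: convergence check
            break
    return p

# Fit noiseless synthetic data; the estimate should recover (2.0, 0.5, 1.0).
x = np.linspace(-3.0, 3.0, 50)
y = gaussian(x, (2.0, 0.5, 1.0))
p_hat = gauss_newton(x, y, p0=(1.5, 0.3, 1.2))
```

Forming $(J^T J)^{-1}$ explicitly is avoided here in favor of `np.linalg.solve`; in practice a QR or SVD-based solve of $J \, \Delta\mathbf{p} \approx \mathbf{r}$ is numerically safer still, which foreshadows the conditioning issues the Levenberg-Marquardt section addresses.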
Levenberg-Marquardt (LM) Algorithm
The Gauss-Newton algorithm can be unstable if the Jacobian