# Interior-Point Method

Interior-point methods (also referred to as barrier methods or IPMs) are a class of algorithms for solving linear and nonlinear Convex Optimization problems.

I was trying to understand this, as opposed to Gradient Descent, for the Raceline Optimization problem.

https://coin-or.github.io/Ipopt/index.html#Overview

### IPOPT vs Gradient Descent

No, IPOPT is not exactly the same as gradient descent, although they share some similarities.

IPOPT does Constrained Optimization — it solves general nonlinear programs (NLPs), minimizing an objective subject to equality and inequality constraints.
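The general problem form that IPOPT targets (per its documentation) looks like this, where $f$ and $g$ can be nonlinear and possibly nonconvex:

$$
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{s.t.} \quad
g_L \le g(x) \le g_U,
\qquad
x_L \le x \le x_U
$$

Plain gradient descent has no natural way to handle the $g(x)$ constraints or the variable bounds; that is the gap interior-point methods fill.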

Gradient descent is an iterative optimization algorithm whose updates are determined solely by the gradient of the objective function with respect to the optimization variables.
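As a baseline, here is a minimal gradient descent sketch on a toy objective $f(x) = (x-3)^2$; the step size and iteration count are illustrative choices, not tuned values:

```python
# Minimal gradient descent sketch: minimize f(x) = (x - 3)^2.
# Each update uses ONLY the objective gradient -- no constraint information.

def grad_f(x):
    # derivative of (x - 3)^2
    return 2.0 * (x - 3.0)

x = 0.0      # starting point
step = 0.1   # fixed step size (illustrative)
for _ in range(100):
    x -= step * grad_f(x)

print(round(x, 4))  # converges toward the minimizer x = 3
```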

IPOPT, on the other hand, is a more advanced optimization algorithm that uses the concept of interior-point methods. It is also an iterative algorithm that seeks to minimize the objective function subject to constraints. However, it does not use the gradient of the objective function alone to update the optimization variables. Instead, it uses a combination of the gradient of the objective function and the gradient of the constraints, as well as information about the curvature of the optimization problem.
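To make "gradient of the objective plus gradients of the constraints plus curvature" concrete, here is a hedged sketch of a single Newton/KKT solve for a tiny equality-constrained quadratic program. This is not IPOPT's actual implementation, just the textbook KKT system for an assumed toy problem: minimize $\tfrac{1}{2}\lVert x \rVert^2$ subject to $x_1 + x_2 = 2$.

```python
# KKT solve for: minimize 0.5*||x||^2  s.t.  x1 + x2 = 2  (toy example).
# Q carries curvature of the objective, A carries the constraint gradient.
import numpy as np

Q = np.eye(2)               # Hessian of the objective
A = np.array([[1.0, 1.0]])  # constraint Jacobian (gradient of x1 + x2)
b = np.array([2.0])

# KKT system:  [Q  A^T] [x  ]   [0]
#              [A   0 ] [lam] = [b]
K = np.block([[Q, A.T], [A, np.zeros((1, 1))]])
rhs = np.concatenate([np.zeros(2), b])
sol = np.linalg.solve(K, rhs)
x, lam = sol[:2], sol[2]

print(x)    # optimizer: [1. 1.]
print(lam)  # Lagrange multiplier
```

The point of the sketch: the step comes from solving a linear system built out of both objective and constraint derivatives, not from the objective gradient alone.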

Specifically, IPOPT solves a sequence of barrier problems, where the barrier function penalizes violations of the constraints, and then uses a filter method to select candidate steps. The filter method determines whether a candidate step is acceptable based on whether it satisfies certain conditions, such as whether it reduces the objective function sufficiently or whether it violates the constraints too much.
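The "sequence of barrier problems" idea can be sketched with a classic log-barrier on an assumed one-dimensional toy problem (this omits IPOPT's filter line search and Newton machinery entirely): minimize $f(x) = x$ subject to $x \ge 1$, i.e. $g(x) = 1 - x \le 0$.

```python
# Log-barrier sketch: minimize x subject to x >= 1.
# Barrier subproblem: phi_mu(x) = x - mu * log(x - 1), solved in closed form,
# with the barrier parameter mu driven toward zero.

def barrier_minimizer(mu):
    # d/dx [x - mu*log(x - 1)] = 1 - mu/(x - 1) = 0  =>  x = 1 + mu
    return 1.0 + mu

mu = 1.0
path = []                # the "central path" of barrier minimizers
for _ in range(10):
    path.append(barrier_minimizer(mu))
    mu *= 0.1            # shrink the barrier parameter

print(path[-1])          # approaches the constrained minimizer x = 1
```

Each barrier minimizer stays strictly inside the feasible region (hence "interior point"), and as mu shrinks the iterates trace the central path toward the constrained solution.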

Overall, IPOPT is a more advanced optimization algorithm than gradient descent, and it can handle a wider variety of optimization problems. However, IPOPT can also be more computationally expensive than gradient descent, particularly for large-scale optimization problems.