
Multivariate Optimization with Equality Constraint


Wikipedia defines optimization as the problem of maximizing or minimizing a real function by systematically choosing input values from an allowed set and computing the value of the function. In other words, whenever we talk about optimization we are interested in finding the best solution. Say one has some functional form, e.g. f(x), and is trying to find the best solution for it. Now, what does "best" mean? It means either minimizing or maximizing that function.

Generally, an optimization problem has three components:

minimize f(x),
w.r.t. x,
subject to a < x < b

where,
f(x) : Objective function
x : Decision variable
a < x < b : Constraint
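
As a quick illustration of this general form, here is a minimal Python sketch using SciPy's minimize_scalar; the objective f(x) = (x - 2)^2 and the bounds a, b are made-up values chosen for the example.

from scipy.optimize import minimize_scalar

# Made-up objective for illustration: minimized at x = 2
def f(x):
    return (x - 2) ** 2

a, b = 0, 5  # constraint: a < x < b

# 'bounded' restricts the search to the interval (a, b)
result = minimize_scalar(f, bounds=(a, b), method="bounded")
print(result.x)  # ~2.0, the best decision-variable value inside (a, b)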
 



What’s a multivariate optimization problem? 

In a multivariate optimization problem, multiple variables act as decision variables.

z = f(x1, x2, x3, ..., xn)

In these problems the general function z could be some non-linear function of the decision variables x1, x2, ..., xn, so there are n variables one can manipulate to optimize z. Notice that univariate optimization can be illustrated with pictures in two dimensions, because the x-direction carries the decision-variable value and the y-direction carries the function value. Multivariate optimization with two decision variables already needs pictures in three dimensions, and with more than two decision variables it becomes difficult to visualize at all.
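
For concreteness, here is a minimal sketch of such a function in Python; the objective below is a made-up non-linear function of three decision variables, just to show the form z = f(x1, ..., xn).

import numpy as np

# A made-up multivariate objective z = f(x1, x2, x3)
def f(x):
    x1, x2, x3 = x
    return x1**2 + 2 * x2**2 + 3 * x3**2

print(f(np.array([1.0, 0.5, -0.5])))  # z at one candidate point: 2.25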


What’s multivariate optimization with an equality constraint?
In mathematics, equality is a relationship between two quantities or, more generally, two mathematical expressions, asserting that the quantities have the same value or that the expressions represent the same mathematical object. So, if an objective function has more than one decision variable and is subject to an equality constraint, the problem is called multivariate optimization with an equality constraint.
Example

min 2x_1^2 + 4x_2^2
subject to
3x_1 + 2x_2 = 12

Here x_1 and x_2 are the two decision variables, with the equality constraint 3x_1 + 2x_2 = 12.

Condition for identifying the optimum point in the case of an equality constraint

If there is a single equality constraint, the condition is

- \nabla f(x^*) = \lambda^* \nabla h(x^*)

If there are l equality constraints, the condition is

- \nabla f(x^*) = \sum_{i=1}^{l} \lambda_i^* \nabla h_i(x^*)

where,
f(x^*) = f(x_1, x_2, ..., x_n) : Objective function evaluated at the candidate optimum x^*
h(x^*) = h(x_1, x_2, ..., x_n) : Equality constraint evaluated at x^*
\lambda^* \in R : the Lagrange multiplier
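
To make the single-constraint condition concrete, here is a small symbolic sketch with SymPy. It forms -∇f = λ∇h together with h = 0 and solves for the stationary point; f and h are taken from the numerical example worked out below, so the result can be checked against it.

import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lambda")
f = 2 * x1**2 + 4 * x2**2      # objective from the example below
h = 3 * x1 + 2 * x2 - 12       # equality constraint h(x) = 0

grad_f = [sp.diff(f, v) for v in (x1, x2)]
grad_h = [sp.diff(h, v) for v in (x1, x2)]

# -grad f = lambda * grad h, together with the constraint h = 0
equations = [sp.Eq(-gf, lam * gh) for gf, gh in zip(grad_f, grad_h)] + [sp.Eq(h, 0)]
print(sp.solve(equations, (x1, x2, lam), dict=True))
# [{x1: 36/11, x2: 12/11, lambda: -48/11}], i.e. about (3.27, 1.09, -4.36)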
 



Let us quickly solve a numerical example to understand these conditions better. 

Numerical Example: 
 

Problem:
min 2x_1^2 + 4x_2^2

subject to 

3x_1 + 2x_2 - 12 = 0



Solution:

Here,

Objective function: f(x) = 2x_1^2 + 4x_2^2

Equality constraint: h(x) = 3x_1 + 2x_2 - 12 = 0



To identify the optimum point, we write the condition as

- \nabla f = \lambda \nabla h

Hence

- \nabla f = - \begin{bmatrix} \partial f/ \partial x_1\\ \partial f/ \partial x_2\\ \end{bmatrix} = \begin{bmatrix} -4x_1\\ -8x_2\\ \end{bmatrix}

Similarly,

\nabla h = \begin{bmatrix} \partial h/ \partial x_1\\ \partial h/ \partial x_2\\ \end{bmatrix} = \begin{bmatrix} 3\\ 2\\ \end{bmatrix}

According to the condition

\begin{bmatrix} -4x_1\\ -8x_2\\ \end{bmatrix} = \lambda \begin{bmatrix} 3\\ 2\\ \end{bmatrix}

This can be written as 

-4x_1 = 3 \lambda ---(1) 

-8x_2 = 2 \lambda ---(2)

and we already have the equality constraint equation

3x_1 + 2x_2 - 12 = 0 ---(3)

Solving these three equations gives the optimum solution, along with the value of the multiplier λ:

\begin{bmatrix} x_1 ^*\\ x_2 ^*\\ \lambda ^* \\ \end{bmatrix} = \begin{bmatrix} 3.27\\ 1.09\\ -4.36\\ \end{bmatrix}

Hence x_1^* = 3.27 and x_2^* = 1.09 is our optimum solution.
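
Since equations (1)-(3) are linear in x_1, x_2, and λ, the same answer can be obtained numerically. A minimal NumPy sketch:

import numpy as np

# Rows encode: -4*x1 - 3*lam = 0, -8*x2 - 2*lam = 0, 3*x1 + 2*x2 = 12
A = np.array([[-4.0,  0.0, -3.0],
              [ 0.0, -8.0, -2.0],
              [ 3.0,  2.0,  0.0]])
b = np.array([0.0, 0.0, 12.0])

x1, x2, lam = np.linalg.solve(A, b)
print(round(x1, 2), round(x2, 2), round(lam, 2))  # 3.27 1.09 -4.36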



Why are constraints important in optimization problems from a data science viewpoint? 

We look at optimization from a data science viewpoint because we are usually trying to minimize error. In many data science problems, we minimize the error with some kind of gradient-based algorithm, which we call a learning algorithm. Sometimes, while minimizing the error or objective function, we know extra information about the problem that we want the solution to incorporate. For example, suppose you are trying to uncover relationships between several variables: you do not know how many relationships there are, but you know for sure that certain relationships exist and what they are. When you solve the data science problem, you would then constrain it so that the known relationships are satisfied. That poses an optimization problem with constraints, in particular equality constraints, and there are several other cases where the constrained version of a problem arises in data science, so it is important to understand how these problems are solved. In practice, inequality-constrained problems are even more relevant than equality-constrained ones; for example, the algorithms for inequality constraints are central to the data science method known as support vector machines.
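
As an illustration of how such a problem is handled in practice, the sketch below feeds the same example from above to SciPy's general-purpose minimize with an equality constraint; when constraints are present, SciPy picks a suitable method such as SLSQP.

from scipy.optimize import minimize

objective = lambda x: 2 * x[0]**2 + 4 * x[1]**2
# Equality constraint: 3*x1 + 2*x2 - 12 = 0
constraint = {"type": "eq", "fun": lambda x: 3 * x[0] + 2 * x[1] - 12}

result = minimize(objective, x0=[0.0, 0.0], constraints=[constraint])
print(result.x)  # ~[3.27, 1.09], matching the hand-derived optimum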
 


