Master1022
- Homework Statement
- Given the primal problem below: (i) is it convex? (ii) find the dual problem
- Relevant Equations
- Lagrangian, Hessian
Hi,
I am working on the following optimization problem and am confused about how to apply the Lagrangian in this scenario:
Question:
Let us look at the following problem
\min_{\vec{x} \in \mathbb{R}_{+}^{n}} \sum_{j=1}^{n} x_j \log(x_j)
\text{subject to } A\vec{x} \leq \vec{b} \rightarrow A\vec{x} - \vec{b} \leq \vec{0}
\sum_{j=1}^{n} x_j = 1
(i) Is this problem convex?
(ii) Find the dual problem
Attempt:
(i) For convexity, I argued that the problem is convex using the Hessian of the objective (although I am not sure I applied the argument correctly):
\frac{\partial^2 (\text{obj})}{\partial x_k^2} = \frac{1}{x_k} > 0 \quad \forall \, x_k > 0
\frac{\partial^2 (\text{obj})}{\partial x_k \partial x_j} = 0 \quad (j \neq k)
Thus the Hessian is \operatorname{diag}(1/x_1, \dots, 1/x_n), which is positive definite on the interior of the domain, so the second-order convexity condition is satisfied. Since the constraints are affine, the feasible set is convex as well, and the problem is convex.
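As a quick numerical sanity check of this Hessian computation (my own sketch, not part of the assignment), a central finite-difference approximation should reproduce a diagonal Hessian with entries 1/x_j at any strictly positive point:

```python
# Finite-difference check (my own sketch) that the Hessian of
# f(x) = sum_j x_j * log(x_j) is diag(1/x_1, ..., 1/x_n) on the
# positive orthant, matching the hand computation above.
import numpy as np

def f(x):
    return np.sum(x * np.log(x))

def numerical_hessian(func, x, h=1e-4):
    """Central-difference approximation of the Hessian of func at x."""
    n = len(x)
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (func(x + ei + ej) - func(x + ei - ej)
                       - func(x - ei + ej) + func(x - ei - ej)) / (4.0 * h * h)
    return H

x = np.array([0.1, 0.2, 0.3, 0.4])  # an arbitrary strictly positive test point
H = numerical_hessian(f, x)
print(np.round(H, 3))  # diagonal approx (10, 5, 3.333, 2.5), off-diagonal approx 0
```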
(ii) This is where I get confused as I am not sure how to deal with the equality constraint. We can define the Lagrangian to be:
L(\vec{x}, \vec{\lambda}, \vec{\nu}) = \sum_{j=1}^{n} x_j \log(x_j) + \vec{\lambda}^{T} \left( A\vec{x} - \vec{b} \right) + \text{something with the equality constraint}
The first thing that comes to mind is a term of the form -1 + \sum_{j=1}^{n} \nu_j x_j, but I am not sure how the constant 1 should be handled in that expression. Other options I have thought about include:
(a) Introducing extra slack variables
(b) Reformulating the equality constraint
But I am not sure how to make any progress.
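For reference, my current best guess (following the usual convention, e.g. in Boyd & Vandenberghe, of attaching a single scalar multiplier to each equality constraint rather than one per variable) would be:
L(\vec{x}, \vec{\lambda}, \nu) = \sum_{j=1}^{n} x_j \log(x_j) + \vec{\lambda}^{T} \left( A\vec{x} - \vec{b} \right) + \nu \left( \sum_{j=1}^{n} x_j - 1 \right), \quad \vec{\lambda} \geq \vec{0}
with the dual function g(\vec{\lambda}, \nu) = \inf_{\vec{x}} L(\vec{x}, \vec{\lambda}, \nu). Is this the right way to handle the equality constraint?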