Piecewise Linear Programming with Multiple Functions

Discussion Overview

The discussion revolves around a linear programming problem involving piecewise linear functions. Participants explore the formulation of the problem, constraints, and potential methods for solving it, particularly focusing on the minimization of a sum of multiple piecewise linear functions.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant presents a linear programming problem with a matrix ##\textbf{X}## that is not symmetric and seeks to minimize a sum ##Y## of functions ##y_j## subject to certain constraints.
  • Another participant questions the meaning of the constraint ##X_i \geq 0## and suggests redefining the variables to introduce functions ##g_j## that account for dependencies.
  • A later reply clarifies that ##x_{i,j} \geq 0## was intended and describes the function ##f## as a convex increasing piecewise linear function, expressing optimism about finding a single minimum.
  • One participant proposes that if ##f## is not decreasing, the constraints could lead to equality, allowing for a direct inclusion of this condition in the parameter space.

Areas of Agreement / Disagreement

Participants express differing views on the interpretation of constraints and the properties of the function ##f##. There is no consensus on the best approach to solve the problem, and multiple perspectives on the formulation and implications of the constraints are presented.

Contextual Notes

Participants note potential ambiguities in the notation and constraints, particularly regarding the definition of ##X_i \geq 0## and the nature of the function ##f##. The discussion reflects a range of assumptions about the properties of the functions involved and the implications for the optimization process.

smehdi
Dear Friends
I have a question about linear programming. It would be great to have your comments or suggestions.
Consider the following:
\begin{equation}
\\
Y = [y_{1}, y_{2}, \cdots, y_{n}]
\\
G = [g_{1}, g_{2}, \cdots, g_{n}]
\\
\textbf{X} =
\begin{pmatrix}
0 & x_{1,2} & \cdots & x_{1,n} \\
x_{2,1} & 0 & \cdots & x_{2,n} \\
\vdots & \vdots & \ddots & \vdots \\
x_{n,1} & x_{n,2} & \cdots & 0
\end{pmatrix}
\\
\end{equation}
##\textbf{X}## is not symmetric. Here I show the ##i##th row of matrix ##\textbf{X}## by subscript ##(i,:)##, e.g., ##X_{(2,:)} = [x_{2,1}, 0, \cdots, x_{2,n}]## is the second row vector of matrix ##\textbf{X}## and subscript ##(:,i)## shows the ##i##th column vector.

The problem is
\begin{equation}
\begin{aligned}
& \min &Y = \sum_{j=1}^{n} y_{j}\\
& \text{subject to}
& X_{(i,:)} G^T \geq A \\
&& X_i \geq 0
\end{aligned}
\end{equation}
in which ##i## is fixed, say ##i=2##; the vector ##G## is known; the elements of ##\textbf{X}## are also known except for the ##i##th row, and ##X_{(i,:)}## is to be found. Each element of ##Y## is a function of the column vector of ##\textbf{X}## with the same index, i.e.,
$$
y_{j} = f(X_{(:,j)})
$$
I googled and found that the function ##f## I am working with is a piecewise linear function! Actually, the problem here is not to find the minimum of a single piecewise linear function, but to find the minimum of a sum over multiple piecewise linear functions!
Since I am totally new to the field of linear programming, I don't know how such a problem should be solved.
Does anybody have any comments, or know some references on how I could solve this?

Thanks in advance.
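For a convex piecewise linear ##f##, this kind of problem can be cast as a single linear program via the standard epigraph reformulation: replace each ##y_j## by an auxiliary variable ##t_j## constrained to lie above every linear piece, and minimize ##\sum_j t_j##. A minimal sketch in Python (the pieces, ##G##, and ##A## below are made-up illustrative data, not from the thread):

```python
# Epigraph reformulation sketch: minimize sum_j g_j(z_j) subject to
# sum_j G_j z_j >= A and z_j >= 0, where each g_j is a convex increasing
# piecewise linear function (max of affine pieces). All data is invented.
import numpy as np
from scipy.optimize import linprog

n = 3
G = np.array([1.0, 2.0, 1.5])   # known coefficients (assumed data)
A = 4.0                          # right-hand side (assumed data)

# g_j(z) = max_k (slopes[k]*z + icpts[k]); same pieces for all j here.
slopes = [0.5, 1.0, 2.0]
icpts = [0.0, -0.5, -2.0]

# Variables: [z_1..z_n, t_1..t_n]; objective = sum of the t_j.
c = np.concatenate([np.zeros(n), np.ones(n)])

# Epigraph constraints t_j >= s*z_j + b, rewritten as s*z_j - t_j <= -b.
A_ub, b_ub = [], []
for j in range(n):
    for s, b in zip(slopes, icpts):
        row = np.zeros(2 * n)
        row[j] = s          # coefficient on z_j
        row[n + j] = -1.0   # coefficient on t_j
        A_ub.append(row)
        b_ub.append(-b)
# Coupling constraint: -sum_j G_j z_j <= -A  (i.e. sum_j G_j z_j >= A).
row = np.concatenate([-G, np.zeros(n)])
A_ub.append(row)
b_ub.append(-A)

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0, None)] * (2 * n))
z = res.x[:n]
print(res.status, z, res.fun)
```

Because each ##g_j## is a maximum of affine functions and ##\sum_j t_j## is being minimized, the solver pushes ##t_j## down onto ##g_j(z_j)## at the optimum, so the LP solves the original piecewise linear problem exactly.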
 
What does ##X_i \geq 0## mean? It does not fit the other notation.

Since all but one element of each ##X_{(:,j)}## is known, it is possible to introduce functions ##g_j## that absorb those dependencies. Define ##Z = X_{(i,:)}## and ##Z_j = x_{i,j}##.

$$\text{Minimize}~ Y = \sum_{j=1}^{n} g_j(Z_j)$$
with the constraint
$$\sum_{j=1}^{n} Z_j G_j \geq A$$
and maybe some other constraints I don't understand.

Depending on how ugly ##f## is, a usual minimization algorithm could work.
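This reduction can be sketched numerically: with every entry of column ##j## known except ##x_{i,j}##, each ##y_j = f(X_{(:,j)})## collapses to a one-dimensional function ##g_j(z_j)##, and a general-purpose constrained solver can be tried directly, as suggested. Everything below (the stand-in ##f## and the random data) is an illustrative assumption, not from the thread:

```python
# Sketch of the reduction to one-dimensional functions g_j, solved with
# a general-purpose constrained minimizer. The function f and all data
# here are invented stand-ins for illustration.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, i = 4, 1                         # fix row i = 2 (0-based index 1)
X = rng.random((n, n))
np.fill_diagonal(X, 0.0)            # zero diagonal, as in the problem
G = rng.random(n) + 0.5             # known coefficients
A = 2.0                             # right-hand side

def f(col):
    # stand-in convex increasing piecewise linear f (assumed example)
    s = col.sum()
    return max(0.5 * s, s - 1.0, 2.0 * s - 4.0)

def g(j, zj):
    # y_j as a function of the single unknown entry x_{i,j} = zj
    col = X[:, j].copy()
    col[i] = zj
    return f(col)

def objective(z):
    # y_i = f(X_{(:,i)}) is a known constant (its row-i entry is the
    # diagonal zero), so it is dropped from the objective.
    return sum(g(j, z[j]) for j in range(n) if j != i)

cons = [{"type": "ineq", "fun": lambda z: z @ G - A}]  # sum z_j G_j >= A
bounds = [(0, None)] * n
bounds[i] = (0.0, 0.0)              # diagonal entry x_{i,i} stays 0

res = minimize(objective, x0=np.ones(n), bounds=bounds,
               constraints=cons, method="SLSQP")
print(res.x, res.fun)
```

SLSQP assumes a smooth objective, so it can stall at the kinks of a piecewise linear function; for this convex case the LP (epigraph) formulation is the more robust route, but a generic solver often works in practice.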
 
mfb said:
What does ##X_i \geq 0## mean? It does not fit the other notation. [...]

Thank you mfb for your hint.
I meant ##x_{i,j} \geq 0##.
##f## is a convex increasing piecewise linear function. I think it is not that ugly :)
 
Oh great, then I would expect a single minimum without additional local minima. Every minimization algorithm should find that. If ##f## is non-decreasing, your constraint on the sum should always hold with equality at the optimum, which means you can include it directly in the parameter space (for example, let one of the unknowns be fixed by all the others via this condition).
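The elimination idea above can be sketched as follows (all data here is illustrative, not from the thread): fix, say, the last unknown from the others via the tight constraint ##\sum_j G_j z_j = A## and minimize over the remaining ##n-1## variables, clipping at zero to keep the eliminated variable non-negative.

```python
# Variable elimination sketch: since each g_j is non-decreasing, the
# coupling constraint sum_j G_j z_j >= A binds at the optimum, so the
# last variable can be expressed via the others:
#   z_n = (A - sum_{j<n} G_j z_j) / G_n
# leaving a problem in n-1 variables with only z >= 0 bounds.
# The cost function and data are invented example values.
import numpy as np
from scipy.optimize import minimize

G = np.array([1.0, 2.0, 1.5])
A = 4.0

def g(z):
    # shared convex increasing piecewise linear cost (example pieces)
    return np.maximum.reduce([0.5 * z, z - 0.5, 2.0 * z - 2.0])

def objective(z_free):
    z_last = (A - G[:-1] @ z_free) / G[-1]   # eliminated variable
    z = np.append(z_free, max(z_last, 0.0))  # keep z_n >= 0
    return g(z).sum()

res = minimize(objective, x0=np.zeros(2), bounds=[(0, None)] * 2,
               method="Nelder-Mead")
print(res.x, res.fun)
```

A derivative-free method like Nelder-Mead is used here because the objective is nonsmooth at the kinks; the reduced problem is still convex, so any reasonable solver should land near the single minimum.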
 
