MHB What is the Canonical Form of Linear Programming?

AI Thread Summary
A linear programming problem is in canonical form when it maximizes a linear objective function subject to equality constraints and non-negativity conditions on the variables. The matrix A, which defines the constraints, must be of size m x n, where m is the number of equations and n is the number of variables, with the rank of A being m and less than n. This rank condition ensures that the equations are linearly independent, leading to infinitely many solutions due to having more variables than equations. The feasible solution set consists of all variable combinations that satisfy the constraints. Understanding these concepts is crucial for solving linear programming problems effectively.
evinda
Hello! (Wave)

A linear programming problem is in canonical form if it's of the following form:

$$\pm \max (c_1 x_1+ \dots + c_n x_n), \quad c_1, \dots, c_n \in \mathbb{R} \\ Ax=b, \quad A \in F^{m \times n}, \quad x=\begin{bmatrix} x_1\\ \vdots \\ x_n \end{bmatrix}, \quad b=\begin{bmatrix} b_1\\ \vdots \\ b_m \end{bmatrix}$$

$(A=[a_{ij}], a_{ij} \in \mathbb{R})$

$b_j \geq 0, j=1, \dots, m \\ x_i \geq 0, i=1, \dots, n$

$r(A)=m<n$

where $r(A)$ is the rank of the matrix $A$, i.e. the maximum number of linearly independent rows of $A$.
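
For concreteness, here is a minimal sketch of a problem in this canonical form, with made-up numbers and taking $F = \mathbb{R}$, set up with SciPy's `linprog` (which minimizes by default, so the maximization is done by negating $c$; the bounds supply the $x_i \geq 0$ conditions):

```python
# A made-up example in canonical form: m = 2 equations, n = 4 variables,
# so rank(A) = 2 < 4.  Maximize 3*x1 + 2*x2 subject to Ax = b, x >= 0.
import numpy as np
from scipy.optimize import linprog

c = np.array([3.0, 2.0, 0.0, 0.0])       # objective coefficients c_1, ..., c_n
A = np.array([[1.0, 1.0, 1.0, 0.0],      # A is m x n with real entries
              [1.0, 2.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])                 # b_j >= 0

assert np.linalg.matrix_rank(A) == A.shape[0] < A.shape[1]   # r(A) = m < n

# linprog minimizes, so negate c to maximize; the bounds enforce x_i >= 0.
res = linprog(-c, A_eq=A, b_eq=b, bounds=[(0, None)] * 4, method="highs")
print(res.x, -res.fun)                   # an optimal feasible x and the maximum value
```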

First of all, could you explain to me why it has to hold that $A \in F^{m \times n}$, where $F$ is the set of feasible solutions?
Doesn't $F$ contain only the $x$'s for which $Ax=b$ holds? Or am I wrong?

Also, does the condition $r(A)=m<n$ imply that none of the $m$ equations we get from $Ax=b$ is a multiple of another? Or have I understood it wrong? (Thinking)
 
evinda said:
First of all, could you explain to me why it has to hold that $A \in F^{m \times n}$, where $F$ is the set of feasible solutions?
Doesn't $F$ contain only the $x$'s for which $Ax=b$ holds? Or am I wrong?

Hey evinda! (Smile)

It seems to me that the definitions are a bit mixed up. :eek:
I think $F$ is just a field, typically $\mathbb R$.
The set of feasible solutions is the set of all $x$ that satisfy $Ax=b, x_i \geq 0$, which is a subset of $F^n$.
And the matrix $A$ is an $m\times n$ matrix with elements of $F$, so $A \in F^{m\times n}$. (Mmm)
evinda said:
Also, does the condition $r(A)=m<n$ imply that none of the $m$ equations we get from $Ax=b$ is a multiple of another? Or have I understood it wrong? (Thinking)

Even stronger: none of the equations can be a linear combination of the other equations.
The equations have to be linearly independent.
And there have to be infinitely many solutions $x$ for $Ax=b$ (Nerd)
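
To make those points concrete, here is a small sketch with $F = \mathbb{R}$ and made-up $A$, $b$ (the helper `is_feasible` is just an illustrative name): it checks that the rows of $A$ are linearly independent and that several different $x$'s lie in the feasible set, hinting at the non-uniqueness:

```python
# Sketch with F = R and made-up data: check rank(A) = m < n and test
# membership in the feasible set {x : Ax = b, x >= 0}.
import numpy as np

A = np.array([[1.0, 1.0, 1.0, 0.0],   # A in R^{m x n}, here m = 2, n = 4
              [1.0, 2.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
m, n = A.shape

def is_feasible(x, tol=1e-9):
    """x is feasible iff Ax = b and every x_i >= 0 (illustrative helper)."""
    return np.allclose(A @ x, b, atol=tol) and bool(np.all(x >= -tol))

print(np.linalg.matrix_rank(A) == m, m < n)          # True True: independent rows, more variables
print(is_feasible(np.array([4.0, 0.0, 0.0, 2.0])))   # True
print(is_feasible(np.array([0.0, 0.0, 4.0, 6.0])))   # True  (a different feasible point)
print(is_feasible(np.array([1.0, 1.0, 1.0, 1.0])))   # False (Ax != b)
```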
 
I like Serena said:
Hey evinda! (Smile)

It seems to me that the definitions are a bit mixed up. :eek:
I think $F$ is just a field, typically $\mathbb R$.
The set of feasible solutions is the set of all $x$ that satisfy $Ax=b, x_i \geq 0$, which is a subset of $F^n$.
And the matrix $A$ is an $m\times n$ matrix with elements of $F$, so $A \in F^{m\times n}$. (Mmm)

Ah, I see... (Nod)

I like Serena said:
Even stronger: none of the equations can be a linear combination of the other equations.

Any of the $m$ equations, right? (Thinking)

I like Serena said:
And there have to be infinitely many solutions $x$ for $Ax=b$ (Nerd)

Why does this hold?
 
evinda said:
Any of the $m$ equations, right? (Thinking)

Exactly. (Nod)

evinda said:
Why does this hold?

Because $\operatorname{rank} A = m < n$.
That means more columns than rows, while the rows are independent.
In turn that means more variables than equations; and since the rows are independent, $Ax=b$ is consistent, leaving $n-m \geq 1$ free variables and therefore infinitely many solutions.

As an example, let's pick $F=\mathbb R, m=1, n=2$ and $2x+3y = 6$.
How many solutions does it have? (Wondering)
 
I like Serena said:
As an example, let's pick $F=\mathbb R, m=1, n=2$ and $2x+3y = 6$.
How many solutions does it have? (Wondering)

Infinitely many... We can write the above equivalently as follows, right?

$\begin{bmatrix}
2 & 3
\end{bmatrix} \begin{bmatrix}
x\\
y
\end{bmatrix}=6$

How can we justify then that there are infinitely many solutions? (Thinking)
 
evinda said:
Infinitely many... We can write the above equivalently as follows, right?

$\begin{bmatrix}
2 & 3
\end{bmatrix} \begin{bmatrix}
x\\
y
\end{bmatrix}=6$

How can we justify then that there are infinitely many solutions? (Thinking)

Right. (Nod)

If we have $m$ independent equations with $m$ variables, there is a unique solution.
We can tell because if $A$ is square, the fact that the rows are independent means that it's invertible.
So $Ax=b \Rightarrow x = A^{-1}b$.
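
A quick sketch of that square case, with made-up numbers: for an invertible $2 \times 2$ system, `numpy.linalg.solve` returns the single solution $A^{-1}b$:

```python
# Sketch of the square case (made-up numbers): m = n = 2 independent equations.
import numpy as np

A_sq = np.array([[2.0, 3.0],
                 [1.0, -1.0]])          # rank 2, so A_sq is invertible
b_sq = np.array([6.0, 1.0])

x_unique = np.linalg.solve(A_sq, b_sq)  # the single solution A^{-1} b
print(x_unique, np.allclose(A_sq @ x_unique, b_sq))
```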

Now that we have more than $m$ variables, we can for starters set the extra $n-m$ variables to zero and solve the remaining square system, giving us a first solution.
And since the $n$ columns are linearly dependent, we can add any null-space vector of $A$ to that solution without changing $Ax$, giving us infinitely many solutions.
This corresponds to moving along a boundary line of the feasible region. (Thinking)
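
Here is a sketch of the $m < n$ case with the example $2x + 3y = 6$ from above: setting the extra variable to zero gives a first (basic) solution, and adding multiples of a null-space vector $v$ of $A$ (computed with `scipy.linalg.null_space`) gives infinitely many more, since $A(x_0 + t\,v) = Ax_0 = b$:

```python
# Sketch of the m < n case with the example above: A = [2 3], b = [6].
import numpy as np
from scipy.linalg import null_space

A = np.array([[2.0, 3.0]])
b = np.array([6.0])

# Set the "extra" variable y to zero and solve for x: x = 3, a first (basic) solution.
x0 = np.array([3.0, 0.0])
print(np.allclose(A @ x0, b))          # True

# Any v with A v = 0 can be added without changing A x.
v = null_space(A)[:, 0]                # v is proportional to (-3, 2)
for t in (0.0, 1.0, 2.5):
    x = x0 + t * v
    print(x, np.allclose(A @ x, b))    # one solution for every t: infinitely many
# Requiring x >= 0 as well restricts t to an interval: the feasible edge mentioned above.
```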
 
