MHB Symmetric Polynomials s1,s2,s3

Summary
The discussion focuses on expressing the sum of squares $r_1^2 + r_2^2 + \cdots + r_n^2$ in terms of the elementary symmetric polynomials $s_1, s_2, \ldots, s_n$. It is established that $s_1$ is the sum of the variables, while $s_2$ is the sum of products of pairs of variables. The key identity derived is $s_1^2 = r_1^2 + r_2^2 + \cdots + r_n^2 + 2s_2$, leading to the conclusion that $r_1^2 + r_2^2 + \cdots + r_n^2 = s_1^2 - 2s_2$. The discussion also covers how the elementary symmetric polynomials are constructed for each degree.
mathjam0990
Express $r_1^2 + r_2^2 + \cdots + r_n^2$ as a polynomial in the elementary symmetric polynomials $s_1, s_2, \ldots, s_n$.

I'm sure the expression we are dealing with is $(r_1 + r_2 + \cdots + r_n)^2$, which is tedious to expand but should yield $r_1^2 + r_2^2 + \cdots + r_n^2 + (\text{other terms})$.

I believe $s_1 = r_1 + r_2 + \cdots + r_n$

$s_2 = \sum r_{i_1} r_{i_2}$ for $1 \le i_1 < i_2 \le n$

$s_3 = r_1 r_2 \cdots r_n$

So the answer should be $r_1^2 + r_2^2 + \cdots + r_n^2 = s_1^2 - (\text{something involving } s_2, \ldots, s_n)$. Sorry, I am not sure what to use here to break this all the way down.

If there is anyone who could provide an explanation, that would be amazing. Thank you!
 
mathjam0990 said:
Express $r_1^2 + r_2^2 + \cdots + r_n^2$ as a polynomial in the elementary symmetric polynomials $s_1, s_2, \ldots, s_n$.

I'm sure the expression we are dealing with is $(r_1 + r_2 + \cdots + r_n)^2$, which is tedious to expand but should yield $r_1^2 + r_2^2 + \cdots + r_n^2 + (\text{other terms})$.

I believe $s_1 = r_1 + r_2 + \cdots + r_n$

$s_2 = \sum r_{i_1} r_{i_2}$ for $1 \le i_1 < i_2 \le n$

$s_3 = r_1 r_2 \cdots r_n$

So the answer should be $r_1^2 + r_2^2 + \cdots + r_n^2 = s_1^2 - (\text{something involving } s_2, \ldots, s_n)$. Sorry, I am not sure what to use here to break this all the way down.

If there is anyone who could provide an explanation, that would be amazing. Thank you!

Now $s_1 = r_1 + r_2 + \cdots + r_n$.

Because we need to evaluate $r_1^2 + r_2^2 + \cdots + r_n^2$, I am tempted to square $s_1$, which will give the square terms plus additional ones.

We get $s_1^2 = r_1^2 + r_2^2 + \cdots + r_n^2 + r_1 r_2 + r_2 r_1 + r_1 r_3 + \cdots$
$= r_1^2 + r_2^2 + \cdots + r_n^2 + 2 \sum_{m=2}^{n} \sum_{p=1}^{m-1} r_p r_m$ (one copy comes from $r_p r_m$ and the other from $r_m r_p$, for $p \ne m$)
$= r_1^2 + r_2^2 + \cdots + r_n^2 + 2 s_2$

Hence the given sum $= s_1^2 - 2s_2$.
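The identity $r_1^2 + \cdots + r_n^2 = s_1^2 - 2s_2$ can be sanity-checked numerically. A minimal sketch (the variable names are ours, not from the thread):

```python
from itertools import combinations
import math

# Check sum(r_i^2) == s1^2 - 2*s2 for a sample list of values,
# where s1 = sum of the r_i and s2 = sum of all pairwise products.
r = [2.0, -1.0, 3.5, 0.5, 4.0]
s1 = sum(r)
s2 = sum(a * b for a, b in combinations(r, 2))
power_sum = sum(x * x for x in r)
assert math.isclose(power_sum, s1**2 - 2*s2)
```

Since the identity is an algebraic one, it holds for any choice of sample values, not just this one.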
 
mathjam0990 said:
Express $r_1^2 + r_2^2 + \cdots + r_n^2$ as a polynomial in the elementary symmetric polynomials $s_1, s_2, \ldots, s_n$.

I'm sure the expression we are dealing with is $(r_1 + r_2 + \cdots + r_n)^2$, which is tedious to expand but should yield $r_1^2 + r_2^2 + \cdots + r_n^2 + (\text{other terms})$.

I believe $s_1 = r_1 + r_2 + \cdots + r_n$

$s_2 = \sum r_{i_1} r_{i_2}$ for $1 \le i_1 < i_2 \le n$

$s_3 = r_1 r_2 \cdots r_n$

So the answer should be $r_1^2 + r_2^2 + \cdots + r_n^2 = s_1^2 - (\text{something involving } s_2, \ldots, s_n)$. Sorry, I am not sure what to use here to break this all the way down.

If there is anyone who could provide an explanation, that would be amazing. Thank you!

It's easier to illustrate what the $s_i$ are with a particular $n$. Let's use $n = 4$. Suppose our variables are $r_1,r_2,r_3,r_4$. Then our polynomials are in the polynomial ring:

$F[r_1,r_2,r_3,r_4]$ (where $F$ is our underlying field).

Given $f \in F[r_1,r_2,r_3,r_4]$, we can have $\sigma \in S_4$ operate on $F[r_1,r_2,r_3,r_4]$ by:

$\sigma(f(r_1,r_2,r_3,r_4)) = f(r_{\sigma(1)},r_{\sigma(2)},r_{\sigma(3)},r_{\sigma(4)})$.

We say a polynomial $f \in F[r_1,r_2,r_3,r_4]$ is *symmetric* if $\sigma(f) = f$ for every $\sigma \in S_4$.
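This invariance can be tested numerically by evaluating a polynomial at every permutation of a sample point. A sketch (the helper `is_symmetric` and the sample point are our own, purely for illustration):

```python
from itertools import permutations

# A polynomial f(r1, r2, r3, r4) is symmetric if permuting its
# arguments never changes its value; we test this at one sample point.
def is_symmetric(f, sample=(1.0, 2.0, 3.0, 5.0)):
    base = f(*sample)
    return all(abs(f(*perm) - base) < 1e-9 for perm in permutations(sample))

s1 = lambda a, b, c, d: a + b + c + d          # symmetric
not_sym = lambda a, b, c, d: a + 2*b + c + d   # swapping a and b changes it

assert is_symmetric(s1)
assert not is_symmetric(not_sym)
```

A single sample point is only a heuristic check, of course, not a proof of symmetry.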

The ELEMENTARY symmetric polynomials are the "basic" symmetric polynomials of any given degree > 0. Let's see how they are constructed, for $n = 4$, by seeing what they ought to be (note we don't care about constant terms, since $\sigma$ never affects them, so for simplicity's sake, we'll always assume all constant terms are 0).

For degree $1$, our polynomials are just linear combinations of the $r_i$. So the "simplest" symmetric combination would be:

$s_1 = r_1 + r_2 + r_3 + r_4$.

This can't be improved on: eliminating any term would break the symmetry.

For degree $2$, our polynomials would be linear combinations of the $r_ir_j$ (where $i = j$ is allowed) plus terms of lower degree. Since we can capture symmetric terms of lower degree with the polynomial above, we will only look at symmetric combinations of $r_ir_j$.

At first, it might seem the best candidate would be:

$f = r_1^2 + r_1r_2 + r_1r_3 + r_1r_4 + r_2^2 + r_2r_3 + r_2r_4 + r_3^2 + r_3r_4 + r_4^2$ (all possible degree 2 terms summed).

However, note that:

$s_1^2 = (r_1 + r_2 + r_3 + r_4)^2 =$

$r_1^2 + r_1r_2 + r_1r_3 + r_1r_4 + r_2r_1 + r_2^2 + r_2r_3 + r_2r_4 + r_3r_1 + r_3r_2 + r_3^2 + r_3r_4 + r_4r_1 + r_4r_2 + r_4r_3 + r_4^2$

$= r_1^2 + r_2^2 + r_3^2 + r_4^2 + 2(r_1r_2 + r_1r_3 + r_1r_4 + r_2r_3 + r_2r_4 + r_3r_4)$

If we call the "cross-terms" $2g$, we have:

$f = s_1^2 - 2g + g = s_1^2 - g$, where $s_1^2 - 2g$ gives us just the degree 2 terms in $f$ that are squares.

Thus all we need to get $f$ is $s_1$ and $g$ (both of these are still symmetric) and $g$ is "more minimal" than $f$ (fewer terms). So we should pick $g$ as $s_2$:

$s_2 = r_1r_2 + r_1r_3 + r_1r_4 + r_2r_3 + r_2r_4 + r_3r_4$
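The relation $f = s_1^2 - g$ for $n = 4$ (all degree-2 monomials equal the square of $s_1$ minus the cross-term sum) can be checked numerically. A sketch with sample values of our own choosing:

```python
from itertools import combinations, combinations_with_replacement
from math import isclose

# For n = 4: f sums ALL degree-2 monomials r_i r_j with i <= j
# (including the squares), while g = s_2 sums only the cross terms i < j.
r = [1.5, -2.0, 3.0, 0.5]
f = sum(a * b for a, b in combinations_with_replacement(r, 2))
g = sum(a * b for a, b in combinations(r, 2))
s1 = sum(r)

# s1^2 = (squares) + 2g, so f = (squares) + g = s1^2 - g.
assert isclose(f, s1**2 - g)
```

This is why $g$ (fewer terms, and still enough to generate $f$ together with $s_1$) is the better choice for $s_2$.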

The same sort of logic applies in the higher degrees (but the algebra is horrendous), so:

$s_3 = r_1r_2r_3 + r_1r_2r_4 + r_1r_3r_4 + r_2r_3r_4$

(Do you see the pattern? We pick $s_k = \sum\limits_{i_1 < i_2 < \cdots < i_k} r_{i_1}r_{i_2}\cdots r_{i_k}$)

and finally $s_4 = r_1r_2r_3r_4$.
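The pattern $s_k = \sum_{i_1 < i_2 < \cdots < i_k} r_{i_1}r_{i_2}\cdots r_{i_k}$ translates directly into code. A minimal sketch (`elem_sym` is our own name, not from the thread):

```python
from itertools import combinations
from math import prod

# k-th elementary symmetric polynomial: sum of products of the
# values in r taken over all strictly increasing index k-tuples.
def elem_sym(r, k):
    return sum(prod(c) for c in combinations(r, k))

r = [1.0, 2.0, 3.0, 4.0]
assert elem_sym(r, 1) == 10.0  # s_1 = r1 + r2 + r3 + r4
assert elem_sym(r, 2) == 35.0  # six pairwise products
assert elem_sym(r, 3) == 50.0  # four triple products
assert elem_sym(r, 4) == 24.0  # s_4 = r1 r2 r3 r4
# The identity derived earlier in the thread:
assert elem_sym(r, 1)**2 - 2*elem_sym(r, 2) == sum(x*x for x in r)
```

`combinations` enumerates exactly the index sets $i_1 < i_2 < \cdots < i_k$, so each monomial is counted once.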
 
