Roots of polynomials as nonlinear systems of equations

galoisjr
Ok, to start off I have been examining the structure of polynomials. For instance, consider the general polynomial

P(x)=\sum^{n}_{k=0}a_{k}x^{k} (1)

Given some polynomial, the coefficients are known. Without loss of generality, we can force the coefficient of the highest term to be one, since we can always divide by it.

Next, we examine the same polynomial in its factored form, which, by the fundamental theorem of algebra (FTA), must exist and be equal to (1):

P(x)=\sum^{n}_{k=0}a_{k}x^{k}=\prod^{n}_{i=1}(x-x_{i}) (2)

Now, we can expand the product and rewrite it:

P(x)=\sum^{n}_{k=0}a_{k}x^{k}=\sum^{n}_{k=0}(-1)^{k}e_{k}(x_{1},x_{2},...,x_{n})x^{n-k}

Where the

e_{k}(x_{1},x_{2},...,x_{n})=\sum_{1\leq j_{1}<j_{2}<...<j_{k}\leq n}x_{j_{1}}x_{j_{2}}\cdots x_{j_{k}}

are the elementary symmetric polynomials. (If this looks odd to you: the coefficient of x^{n-k} is, up to the sign (-1)^{k}, the sum over all combinations of the n roots taken k at a time; combinations, not permutations, since each product x_{j_{1}}\cdots x_{j_{k}} appears exactly once.)
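To make the combinatorics concrete, here is a minimal Python sketch (my own illustration; the function names are made up for this example) that builds each e_k straight from the definition and checks it against the expanded product \prod(x-x_{i}):

```python
from itertools import combinations
from math import prod  # Python 3.8+

def elem_sym(roots, k):
    # e_k: sum, over all k-element combinations of the roots,
    # of the product of each combination; e_0 = 1 by convention.
    return sum(prod(s) for s in combinations(roots, k)) if k else 1.0

def coeffs_from_roots(roots):
    # Expand prod(x - x_i) one linear factor at a time; returns the
    # coefficients in descending powers of x, leading coefficient 1.
    coeffs = [1.0]
    for r in roots:
        out = coeffs + [0.0]          # multiply current polynomial by x
        for i, c in enumerate(coeffs):
            out[i + 1] -= r * c       # subtract r times current polynomial
        coeffs = out
    return coeffs

roots = [1.0, 2.0, 3.0]
coeffs = coeffs_from_roots(roots)     # x^3 - 6x^2 + 11x - 6
for k in range(len(roots) + 1):
    assert coeffs[k] == (-1) ** k * elem_sym(roots, k)
```

So the coefficient of x^{n-k} really is (-1)^{k}e_{k}, as the expansion claims.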

So, equating the known coefficients of the like powers of x, we have a system of n nonlinear equations in n unknowns:

a_{n}=1
a_{n-1}=-\sum_{1\leq j_{1}\leq n}x_{j_{1}}=-(x_{1}+x_{2}+...+x_{n})
a_{n-2}=\sum_{1\leq j_{1}<j_{2}\leq n}x_{j_{1}}x_{j_{2}}=x_{1}x_{2}+...+x_{1}x_{n}+x_{2}x_{3}+...+x_{2}x_{n}+...+x_{n-1}x_{n}

and so on.
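For example, for n=2 the system is just Vieta's formulas for the monic quadratic x^{2}+a_{1}x+a_{0}:

a_{1}=-(x_{1}+x_{2})
a_{0}=x_{1}x_{2}

Note that eliminating x_{2}=-a_{1}-x_{1} from the second equation gives x_{1}^{2}+a_{1}x_{1}+a_{0}=0, i.e. the original polynomial again, so solving the system is exactly equivalent to finding the roots.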


Now, I am assuming that for n>4 this system of nonlinear equations cannot be solved in closed form by radicals, which would make sense because of Galois theory and the obvious relation between the elementary symmetric polynomials and the symmetric group on n letters. However, I have never read an algebra book that goes into the analysis as I just have here. I've seen some that mention permutations of the coefficients, but not of the actual roots. So, I was wondering if anyone has any suggestions on where to go from here from an abstract algebra viewpoint, and hoping someone can recommend a good book on nonlinear algebra.
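One remark worth separating out: "no formula in radicals" is not the same as "no way to solve the system". As a sketch (my own, not from any particular text), the Durand-Kerner (Weierstrass) iteration refines all n root estimates simultaneously and in effect solves exactly this nonlinear system numerically, even for n > 4:

```python
def durand_kerner(coeffs, iters=200):
    # Roots of a monic polynomial with descending coefficients
    # [1, a_{n-1}, ..., a_0], refined all at once by fixed-point iteration.
    n = len(coeffs) - 1
    roots = [(0.4 + 0.9j) ** k for k in range(n)]  # distinct complex guesses
    def p(x):
        v = 0j
        for c in coeffs:
            v = v * x + c                          # Horner's rule
        return v
    for _ in range(iters):
        for i in range(n):
            denom = 1 + 0j
            for j in range(n):
                if j != i:
                    denom *= roots[i] - roots[j]
            roots[i] -= p(roots[i]) / denom
    return roots

# A quintic with known roots 1..5: no general radical formula exists for
# degree 5, yet the iteration recovers the roots numerically.
roots = durand_kerner([1, -15, 85, -225, 274, -120])
```

Sorting the real parts of the result gives approximations of 1, 2, 3, 4, 5, which is the distinction Galois theory actually draws: the solution exists, only a closed-form expression in radicals does not.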
 
galoisjr said:
Ok, to start off I have been examining the structure of polynomials. For instance, consider the general polynomial

P(x)=\sum^{n}_{k=0}a_{k}x^{k} (1)

Given some polynomial, the coefficients are known. Without loss of generality, we can force the coefficient of the highest term to be zero, since we can always divide by it.

Next, we examine the same polynomial in its factored form, which, from the FTOA, must exist and be equal to (1):

P(x)=\sum^{n}_{k=0}a_{k}x^{k}=\prod^{n}_{i=1}(x-x_{i}) (2)

Now, we can expand the product and rewrite it:

P(x)=\sum^{n}_{k=0}a_{k}x^{k}=\sum^{n}_{k=0}(-1)^{k}e_{k}(x_{1},x_{2},...,x_{n})x^{n-k}

Where the

e_{k}(x_{1},x_{2},...,x_{n})=\sum_{1\leq j_{1}<j_{2}<...<j_{k}\leq n}x_{j_{1}}x_{j_{2}}\cdots x_{j_{k}}

are the elementary symmetric polynomials. (If this looks odd to you: the coefficient of x^{n-k} is, up to the sign (-1)^{k}, the sum over all combinations of the n roots taken k at a time.)

So, equating the known coefficients of the like powers of x, we have a system of n nonlinear equations in n unknowns:

a_{n}=1
a_{n-1}=-\sum_{1\leq j_{1}\leq n}x_{j_{1}}=-(x_{1}+x_{2}+...+x_{n})
a_{n-2}=\sum_{1\leq j_{1}<j_{2}\leq n}x_{j_{1}}x_{j_{2}}=x_{1}x_{2}+...+x_{1}x_{n}+x_{2}x_{3}+...+x_{2}x_{n}+...+x_{n-1}x_{n}

and so on.


Now, I am assuming that for n>4 the system of nonlinear equations is unsolvable, which would make sense because of Galois theory and the obvious relation between the symmetric polynomials and the symmetric group on n letters. However, I have never read an algebra book that goes into the analysis as I just have here.


Any reasonably good algebra book that deals with field extensions/Galois theory covers the above. You can check under "symmetric polynomials" or "symmetric rational functions" in the books by Hungerford, Lang, Dummit & Foote, Fraleigh, etc. (Gallian is an exception!)



I've seen some that mention permutation of coefficients, but not the actual roots. So, I was wondering if anyone has any suggestions on where to go from here from an abstract algebra viewpoint, and hoping someone can recommend a good book on nonlinear algebra.
 
galoisjr said:
Ok, to start off I have been examining the structure of polynomials. For instance, consider the general polynomial

P(x)=\sum^{n}_{k=0}a_{k}x^{k} (1)

Given some polynomial, the coefficients are known. Without the loss of generality, we can force the coefficient of the highest term to be zero, since we can always divide by it.
Typo- you clearly mean "we can force the coefficient of the highest term to be one", not zero.
 
Thank you, Don Antonio. And you're right. The book that my class used in college just mentioned them. However, I did read a little of Artin's book and I don't remember seeing anything about them, though I most likely overlooked it. I really enjoyed the book by Rotman. It is a very well-organized book in my opinion, and it's also written very concisely and to the point even though it's 1000 pages. That's where I found information on them. And he actually makes it a point to say that although "no classical formula analogous to the quadratic formula for the quintic exists, we did not say that no solution to the system exists." Which answered my question.

And thank you Halls of Ivy, I'll edit that. Obviously there is no way I would have made it this far into my degree if I thought that I could force a coefficient to be zero and then divide by it... shaking my head.
 
