
Set of polynomials is infinite dimensional

  1. Jun 5, 2013 #1
    How does one show that the set of polynomials is infinite-dimensional? Does one begin by assuming that a finite basis for it exists, and then derive a contradiction?

    Could someone check the following proof for me, which I just wrote up?

    We prove that V, the set of all polynomials over a field F, is infinite-dimensional. To do so, assume on the contrary that it is finite-dimensional, having dimension n. Then there exists a basis for V having n elements.
    Since the following set is linearly independent and has n elements, it is also a basis for V:
    [itex] β = \{ 1, x, x^{2}, \dots, x^{n-1} \} [/itex]

    Thus every polynomial is expressible as a linear combination of the vectors in this set.
    But then [itex]x^{n} \in span(β) [/itex], which implies that [itex] β \cup \{x^{n}\} [/itex] is linearly dependent. This is clearly false, since [itex] \{ 1, x, x^{2}, \dots, x^{n} \} [/itex] is linearly independent; hence a contradiction. Thus the vector space of polynomials is infinite-dimensional.

    Is it completely correct?

  3. Jun 5, 2013 #2
    That seems correct!
  4. Jun 6, 2013 #3
    Thanks for the feedback, micro. But one small thing is bothering me now.
    I never actually proved that [itex] β = \{ 1, x, x^{2}, \dots, x^{n-1} \} [/itex] is linearly independent. How would I do that? The only way I can think of is using the Wronskian, but is there perhaps a way to do it without calculus?

  5. Jun 6, 2013 #4
    It depends on how you define polynomials. In a lot of situations, you define the polynomials such that the set you mention is linearly independent. This is the algebraic approach.
    The more familiar approach is to define polynomials as actual functions ##f:\mathbb{R}\rightarrow \mathbb{R}##. If you do that, then using calculus really is the best and easiest approach. I suppose you can also show it directly. For example, for n=2: assume that

    $$\alpha +\beta x + \gamma x^2 = 0$$

    Substitute in some values for x, for example, by substituting 0, 1 and 2, we get

    $$\left\{\begin{array}{l} \alpha = 0\\ \alpha +\beta +\gamma = 0\\ \alpha +2\beta + 4\gamma = 0 \end{array}\right.$$

    This system of equations has the unique solution ##\alpha = \beta = \gamma = 0##, and thus we get linear independence. The situation for ##n>2## is similar but a bit hairier. A relevant result that will help you here is the Vandermonde determinant: http://www.proofwiki.org/wiki/Vandermonde_Determinant
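    To see the n=2 case concretely, here is a small numerical sketch (using numpy; the substitution points 0, 1, 2 are the ones from the system above):

    ```python
    import numpy as np

    # Matrix of the system above: row i is [1, x_i, x_i**2]
    # for the substitution points x_i = 0, 1, 2.
    points = np.array([0.0, 1.0, 2.0])
    A = np.vander(points, N=3, increasing=True)

    # This is a Vandermonde matrix; its determinant is the product
    # of (x_j - x_i) over i < j, nonzero because the points are distinct.
    det = np.linalg.det(A)
    print(det)  # (1-0)*(2-0)*(2-1) = 2, up to rounding

    # A nonzero determinant means A @ (alpha, beta, gamma) = 0 has only
    # the trivial solution, i.e. alpha = beta = gamma = 0.
    solution = np.linalg.solve(A, np.zeros(3))
    print(solution)
    ```

    For general n, one substitutes n+1 distinct points and gets an (n+1)×(n+1) Vandermonde matrix whose determinant ##\prod_{i<j}(x_j - x_i)## is again nonzero.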
  6. Jun 6, 2013 #5
    I am rather curious about your direct approach. How would I go about it for n>2? I don't have much knowledge of Vandermonde matrices.

    Would it involve some sort of induction?

  7. Jun 6, 2013 #6



    There are many ways to do this. For simplicity, let's assume any basis contains only polynomials of different degrees. Prove that a linear combination of polynomials has degree at most the highest degree among the combined polynomials; then no finite set of polynomials can span polynomials of higher degree.
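    This degree bound is easy to check concretely (a small numpy sketch; the particular polynomials are just illustrative):

    ```python
    from numpy.polynomial import Polynomial

    # Three polynomials of distinct degrees 0, 1, 2 (coefficients are
    # listed in increasing order of degree).
    p0 = Polynomial([3])        # 3
    p1 = Polynomial([1, 2])     # 1 + 2x
    p2 = Polynomial([0, 0, 5])  # 5x^2

    # Any linear combination has degree at most 2, the highest degree
    # involved, so x^3 can never appear in the span of these three.
    combo = 2 * p0 - p1 + 4 * p2
    print(combo.degree())  # 2
    ```

    The same bound applied to a supposed finite basis with maximum degree d shows that a polynomial of degree d+1 is outside its span.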
  8. Jun 9, 2013 #7



    Notice that if a linear combination of the elements is 0, it is zero as a function, i.e. the zero polynomial, and not merely the number zero at some point. But a (nonzero) polynomial can have only finitely many zeros, so a polynomial that vanishes everywhere must have all of its coefficients equal to zero.
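    A quick numerical illustration of that last point (a numpy sketch; the particular quadratic is arbitrary):

    ```python
    import numpy as np

    # A nonzero quadratic has at most 2 roots; here x^2 - 3x + 2
    # has exactly the roots 1 and 2.
    roots = np.roots([1, -3, 2])
    print(sorted(roots))  # the roots 1 and 2

    # So a quadratic vanishing at 3 distinct points must be the zero
    # polynomial: fitting a quadratic through (0,0), (1,0), (2,0)
    # recovers all-zero coefficients.
    fitted = np.polyfit([0, 1, 2], [0, 0, 0], deg=2)
    print(fitted)
    ```

    In general, a polynomial of degree n that vanishes at n+1 distinct points is identically zero, which is exactly why the linear combination above forces all coefficients to be zero.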