
Zero experience with proofs, does this count?

  1. Jul 13, 2015 #1

    Nathanael

    Homework Helper

    1. The problem statement, all variables and given/known data
    I'm trying to show that ##\Sigma_j(\vec a_j \cdot \vec B)=\vec B\cdot\Sigma_j(\vec a_j)##
    (I need this to be true to derive some angular momentum properties.)

    2. The attempt at a solution
    Let's say that in some coordinate system we can express the vectors as ##\vec B=<B_1,B_2,...>## and ##\vec a_j=<a_{j.1},a_{j.2},...>##

    Then the sum of the dot products will be ##\Sigma_j(\vec a_j \cdot \vec B)=\Sigma_j(B_1a_{j.1})+\Sigma_j(B_2a_{j.2})+...=B_1\Sigma_j(a_{j.1})+B_2\Sigma_j(a_{j.2})+...=\vec B\cdot\Sigma_j(\vec a_j)##

    I think the crux of my proof is that, since ##\Sigma_j(\vec B\cdot \vec a_j)## is a sum of a bunch of scalars, it should be independent of the coordinate system (because that's the definition of a scalar, right?).

    I know this is a simple problem, but I have two questions:
    First, is this a valid proof? (I've never proven a thing in my life :redface:)
    Second, is there a way to show this without using a coordinate system?
     
  3. Jul 14, 2015 #2

    andrewkirk

    Science Advisor
    Homework Helper
    Gold Member

    The proof actually follows easily, without using coordinates, from the axioms of an inner product, which state that it is linear in each argument:
    $$\vec{a}\cdot(\vec{b}+\vec{c})=\vec{a}\cdot\vec{b}+\vec{a}\cdot\vec{c}$$
    $$\vec{a}\cdot(\lambda\vec{b})=\lambda(\vec{a}\cdot\vec{b})$$

    By using just the first one together with induction on the number of vectors ##\vec{a}_j##, you can get your desired result (that's a hint at how to build a proof, not a proof, of course).

    Your proof leaves out too much. Doing it with coordinates, you need to write out all the summations. You will have two-level nested sums, one level for summing across ##j## and the other for summing across the components of each vector: use an index ##i## for that. Your LHS above has the ##j## summation outside the ##i## summation, and for the RHS it's the other way around. If you write both sides out longhand, using explicit summation limits like ##\sum_{i=1}^n a_{ji}B_i## rather than ellipses ('....'), then you can show they are equal by just switching the order of summation.

    The fact that the order of summation of finite sums can be switched is just a consequence of the associative and commutative properties of addition, which you can take for granted at this level.
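
    For concreteness, writing ##N## for the number of vectors ##\vec{a}_j## (each with ##n## components), the longhand version might look something like
    $$\sum_{j=1}^{N}\vec{a}_j\cdot\vec{B}=\sum_{j=1}^{N}\sum_{i=1}^{n}a_{ji}B_i=\sum_{i=1}^{n}\sum_{j=1}^{N}a_{ji}B_i=\sum_{i=1}^{n}B_i\sum_{j=1}^{N}a_{ji}=\vec{B}\cdot\sum_{j=1}^{N}\vec{a}_j,$$
    where the middle equality is the switch of summation order just described.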

    Note however that switching order of summation is often invalid when infinite sums are involved.
     
  4. Jul 14, 2015 #3

    Nathanael

    Homework Helper

    Right, I don't know why I thought the (...) would fly. (Like I said, no experience with proofs.) All I meant by the (...) was:
    ##\vec B=<B_1,B_2,...,B_n>##
    ##\vec a_j=<a_{j.1},a_{j.2},...,a_{j.n}>##
    ##\Sigma_j(\vec a_j \cdot \vec B)=\Sigma_{i=1}^n\Sigma_j(B_ia_{j.i})=\Sigma_{i=1}^nB_i\Sigma_j(a_{j.i})=\vec B\cdot\Sigma_j(\vec a_j)##

    Is it still leaving anything out? I think this being-exact stuff is not for me.

    It seems apparent that ##\vec{a}\cdot\vec{b}+\vec{a}\cdot\vec{c}=\vec{a}\cdot(\vec{b}+\vec{c})## is equivalent to ##\Sigma_j(\vec a_j \cdot \vec B)=\vec B\cdot\Sigma_j(\vec a_j)##; I just don't know how to turn that into a proof...
    I'm curious about how one would go from one statement to the other in a "strict" way.
    The only way I could think of was to use a coordinate system, which seems unnecessary. (But obviously I can't even do that right... o0))

    Not sure what you are referring to by "it."
     
  5. Jul 14, 2015 #4

    andrewkirk

    Science Advisor
    Homework Helper
    Gold Member

    It's closer. In your first double sum, the sum over ##j## needs to start on the outside, because the sum over ##i## is the inner (dot) product. Once you've written it that way, you can swap the order in the next step. So you need one extra step.

    The "it" to which I refer is the 'dot' of the inner product, which is actually a bilinear functional, ie a linear function from ##\mathscr{V}\times \mathscr{V}## to ##F## where ##\mathscr{V}## is a vector space over field ##F##. That is, it (the dot) is a function that takes two vectors as inputs and gives a scalar as output.
     
  6. Jul 14, 2015 #5

    PeroK

    Science Advisor
    Homework Helper
    Gold Member

    Whenever you have a defined property involving the addition of two things, you can show that the property applies to the addition of ##n## things using induction. An example is the distributive law for real numbers:

    ##\forall a,b, c \ \ a(b+c) = ab + ac \ \Rightarrow a(b_1+ b_2 + \dots + b_n) = ab_1 + ab_2 + \dots +ab_n ##

    You should try to prove this using induction (a proof by induction is a good place to start).
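
    The key step in such an induction would be of the shape
    $$a(b_1+\dots+b_n+b_{n+1})=a\big((b_1+\dots+b_n)+b_{n+1}\big)=a(b_1+\dots+b_n)+ab_{n+1},$$
    after which the inductive hypothesis (applied to ##a(b_1+\dots+b_n)##) finishes the job.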

    Your proof for the inner product is then just the same, although, as you have changed the order of the terms, you need an extra step using the commutativity of the operation.

    After that, you probably wouldn't ever "prove" this again; you'd just note that by induction you can extend any similar property to more than two terms. E.g.

    Once you have proved that:

    If ##f## and ##g## are continuous at a point ##x_0##, then ##f+g## is continuous at ##x_0##.

    Then, you would extend this property to any finite sum of functions simply by quoting "induction".

    Proofs in general depend on properties, so that's how you have to think. What properties make this statement true?
     
  7. Jul 14, 2015 #6

    Nathanael

    Homework Helper

    Oops, thanks.

    I have heard of induction. The idea is to show that truth for n implies truth for n+1, then show it's true for a specific case (like n=1), and therefore it's true for all n=1,2,...

    Thank you both for reminding me about induction because I did not remember this idea.

    So I guess it would just go like this:
    ##\Sigma_{j=1}^{n+1}\vec B \cdot \vec a_j=\Sigma_{j=1}^n\vec B \cdot\vec a_j+\vec B \cdot \vec a_{n+1}=\vec B \cdot\Sigma_{j=1}^n\vec a_j+\vec B \cdot \vec a_{n+1}=\vec B \cdot (\Sigma_{j=1}^n\vec a_j+\vec a_{n+1})##
    ##\vec B \cdot \vec a_1+\vec B \cdot \vec a_2=\vec B \cdot (\vec a_1+\vec a_2)##
    An embarrassingly trivial use of induction, but we all have to start somewhere.

    Thanks guys.
     
  8. Jul 14, 2015 #7

    Mark44

    Staff: Mentor

    The usual order is to start from the base case (typically n = 1, but not always), then assume the statement is true for n, and finally show that the statement is true for n + 1.
     