Mastering Linear Vector Spaces: Exploring Bases, Dependence, and Operators

In summary: if a vector space has a basis of n vectors, then any set containing more than n vectors is linearly dependent; and the differential operator p = (h/i) d/dx is linear and Hermitian on the space of all differentiable wave functions that vanish at both ends of an interval.
  • #1
fahd
Hi there, I'm stuck on these two problems from Mathematical Methods for Physicists; the topic is "linear vector spaces".

Q1) If S = {|1>, |2>, ..., |n>} is a basis for a vector space V, show that every set with more than n vectors is linearly dependent. (Here |> denotes a Dirac ket.)

Q2) Show that the differential operator
[tex]
p = \frac{h}{i}\frac{d}{dx}
[/tex]
is linear and Hermitian in the space of all differentiable wave functions
{phi(x)} that, say, vanish at both ends of an interval (a, b).


I'm totally confused by these two questions. We weren't taught this topic very well, and similar questions will be on the test tomorrow, so please help; I don't want to lose marks. I also know that, according to the rules, I need to show what I've done so far, but I'm not sure what to show because I'm completely stuck. Please reply!
 
  • #2
To get started

Hey Fahd,

If you're going to continue on in physics, particularly quantum mechanics, this is important stuff to know. To get you started, think about the definitions of the terms involved. Linearly dependent means that at least one of the vectors in the 'more than n' set can be written in terms of the others in the set. Also recall that the definition of a basis is that any vector in the space can be written in terms of the basis vectors. Start by thinking about how you can write every vector in the 'more than n' set in terms of the n vectors given as the basis.
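To make that concrete, here is one way the setup might look (the kets |w_j> and coefficients a_{ij} are just labels introduced for this sketch, they're not in the problem statement). Take any set of m > n vectors and expand each one in the basis:

[tex]
|w_j\rangle = \sum_{i=1}^{n} a_{ij}\,|i\rangle, \qquad j = 1, \dots, m
[/tex]

Then asking for a linear combination that vanishes,

[tex]
\sum_{j=1}^{m} c_j\,|w_j\rangle = \sum_{i=1}^{n}\Big(\sum_{j=1}^{m} a_{ij} c_j\Big)|i\rangle = 0,
[/tex]

gives n homogeneous equations for the m > n unknowns c_j, and such a system always has a nontrivial solution, which is exactly linear dependence.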


For the momentum operator, recall that linearity just means

[tex]
O[a f(x) + b g(x)] = a\,O f(x) + b\,O g(x)
[/tex]

and Hermitian just means
[tex]
\int_a^b f(x)^*p g(x) \,dx = \int_a^b (p f(x))^* g(x) \,dx
[/tex]

Do a little integration by parts and you should be set.
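For the linearity part, a quick sketch using the operator as written above (nothing beyond the linearity of the derivative is needed):

[tex]
p\,[a f(x) + b g(x)] = \frac{h}{i}\frac{d}{dx}\big(a f(x) + b g(x)\big) = a\,\frac{h}{i}\frac{df}{dx} + b\,\frac{h}{i}\frac{dg}{dx} = a\,p f(x) + b\,p g(x)
[/tex]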


gabe
 
  • #3
Allday said:
...and Hermitian just means
[tex]
\int_a^b f(x)^*p g(x) \,dx = \int_a^b (p f(x))^* g(x) \,dx
[/tex]
Do a little integration by parts and you should be set.

Thanks Allday, just wondering: what do I take as f(x) and g(x) in the second question, as you've stated it?
Thanks
 
  • #4
You can't use any particular function, because the relation has to hold for the entire vector space, i.e. for every differentiable wave function. The only objects you can use are those functions and their derivatives. The important thing to know (and this comes up all the time in derivations) is that integration by parts lets you move a derivative under an integral from one function to the other, at the cost of a boundary term and a minus sign.
I'll show you some of the steps in the last part.
[tex]
\int_a^b f^*(x)\frac{h}{i}\frac{dg}{dx} \,dx
[/tex]
[tex]
= \frac{h}{i}\bigl[f^*(b)\,g(b) - f^*(a)\,g(a)\bigr] - \int_a^b \frac{h}{i}\frac{df^*}{dx}\,g(x)\,dx
[/tex]
How is this related to the RHS? The tricky part about these problems is dealing only with the abstract label of the function, which stands for every function in a certain family. Here you'll have to make an assumption about how the functions behave at the boundaries of the region where they're defined (x = a and x = b).
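To spell out where that assumption leads (just a sketch of the final step): if the wave functions vanish at x = a and x = b, the boundary term above drops out, and since (h/i)* = -h/i the leftover minus sign is absorbed by the complex conjugation:

[tex]
-\int_a^b \frac{h}{i}\frac{df^*}{dx}\,g(x)\,dx = \int_a^b \left(\frac{h}{i}\frac{df}{dx}\right)^{*} g(x)\,dx = \int_a^b \big(p f(x)\big)^* g(x)\,dx
[/tex]

which is exactly the right-hand side of the Hermiticity condition.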
 
  • #5
Thanks

Hey Allday,
Thanks a lot for your help. I finally understood it well and got both questions. I was initially wondering where g(x) and f(x) came from, since they don't appear in the question, but now I know!
Thanks again!
 
  • #6
Glad to hear that it makes sense. I know it feels great to finally understand a particularly abstract concept.

gabe
 

1. What is a linear vector space?

A linear vector space is a mathematical concept that describes a collection of objects, called vectors, that can be added together and multiplied by scalars to form new vectors. It follows a set of axioms, including closure, commutativity, and associativity, which allow for mathematical operations to be performed on the vectors in a consistent and predictable manner.
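For concreteness, a few of these axioms written out for vectors u, v, w and scalars a, b (this is the standard list, abbreviated):

[tex]
u + v = v + u, \qquad (u + v) + w = u + (v + w), \qquad a(u + v) = a u + a v, \qquad (a + b)u = a u + b u
[/tex]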

2. What are the applications of linear vector spaces?

Linear vector spaces have a wide range of applications in various fields, including physics, engineering, economics, and computer science. They are used to model and solve problems related to linear systems, such as calculating forces in a structure or predicting stock market trends. They are also essential in the development of algorithms and machine learning techniques.

3. How is a basis defined in a linear vector space?

A basis is a set of linearly independent vectors that can be used to express any vector in the linear vector space. This means that every vector in the space can be written as a unique linear combination of the basis vectors. The number of basis vectors in a linear vector space is called the dimension of the space.
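As a simple illustration (using R^2, which is not part of the thread above, just a standard example):

[tex]
e_1 = (1, 0), \quad e_2 = (0, 1), \qquad (a, b) = a\,e_1 + b\,e_2
[/tex]

so {e_1, e_2} is a basis, the expansion coefficients a and b are unique, and the dimension of the space is 2.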

4. What is the difference between a vector space and a linear vector space?

In standard usage there is no real difference: "vector space" and "linear vector space" are two names for the same structure, a set that is closed under vector addition and scalar multiplication and satisfies the usual axioms. The word "linear" is often added, particularly in physics texts, to emphasize that these operations obey properties such as distributivity and associativity; it does not denote a separate, more restrictive kind of space.

5. How can I determine if a set of vectors forms a linear vector space?

To determine whether a set of vectors forms a linear vector space, check whether it satisfies the vector space axioms: closure under addition and scalar multiplication, associativity and commutativity of addition, distributivity of scalar multiplication over addition, and the existence of a zero vector and additive inverses. If the set satisfies all of these properties, it is a linear vector space.
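Checking closure alone is often enough to rule a set out. For example (a sketch in R^2, not tied to any particular problem above):

[tex]
S = \{(x, y) : x \ge 0,\ y \ge 0\}, \qquad v = (1, 2) \in S, \qquad (-1)\cdot v = (-1, -2) \notin S
[/tex]

so S is not closed under scalar multiplication and is not a vector space, whereas all of R^2 satisfies every axiom.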
