Proving W^a = 0 in Tensor Multiplication: A Critical Analysis

In summary, if [itex]W^aX_a=0[/itex] for arbitrary [itex]X_a[/itex], then [itex]W^a[/itex] must be zero, since the zero vector is the only vector orthogonal to every vector. One way to see this is to choose [itex]X_a[/itex] equal to [itex]W_a[/itex], which gives [itex]W^aW_a=0[/itex] and hence [itex]W^a=0[/itex].
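As a minimal worked version of the last step (assuming a positive-definite metric in orthonormal components, as with the ordinary dot product used in the discussion below), the contraction of [itex]W^a[/itex] with itself is a sum of squares, so it can only vanish if every component vanishes:

[tex]W^a W_a = \sum_a \left(W^a\right)^2 = 0 \;\Longrightarrow\; W^a = 0 \text{ for every } a.[/tex]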
  • #1
PhyPsy
This book says that if [itex]W^aX_a=0[/itex] and [itex]X_a[/itex] is arbitrary, then I should be able to prove that [itex]W^a=0[/itex]. I don't see how this is possible. This is the equivalent of the vector dot product, so if, say, [itex]X_a=(1,0,0,0)[/itex], then [itex]W^a[/itex] could be (0,1,1,1), and the dot product would be [itex]1*0+0*1+0*1+0*1=0[/itex]. Why would [itex]W^a[/itex] have to be 0?
 
  • #2
It means for any [itex]X_a[/itex], not for some [itex]X_a[/itex]. So in your vector example, the only vector orthogonal to all vectors is the zero vector.
 
  • #3
If [itex]X_\alpha[/itex] can be any vector, it can in particular be [itex]W_\alpha[/itex]. If [itex]W^\alpha X_\alpha = 0[/itex] for [itex]X_\alpha[/itex] any vector, then [itex]W^\alpha W_\alpha = 0[/itex], which immediately gives [itex]W^\alpha = 0[/itex].
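A quick numerical illustration of the point made in posts #2 and #3, using the example from post #1 (a minimal NumPy sketch; the variable names and the Euclidean dot product are assumptions for illustration):

[code=python]
import numpy as np

# W from post #1: orthogonal to X = (1, 0, 0, 0), but not to every X.
W = np.array([0.0, 1.0, 1.0, 1.0])
X = np.array([1.0, 0.0, 0.0, 0.0])

print(np.dot(W, X))  # 0.0 -- W is orthogonal to this particular X
print(np.dot(W, W))  # 3.0 -- choosing X = W shows W^a X_a != 0 for *all* X

# Only the zero vector passes the test for every X: with a Euclidean dot
# product, W.W = 0 forces every component of W to vanish.
W_zero = np.zeros(4)
print(np.dot(W_zero, W_zero))  # 0.0
[/code]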
 

1. What is tensor multiplication?

Tensor multiplication is a mathematical operation that involves multiplying two tensors (multidimensional arrays) to produce a new tensor. It is a fundamental operation in linear algebra and is used in many areas of science and engineering.
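For instance, the simplest tensor product (the outer product) of a rank-1 and a rank-2 tensor yields a rank-3 tensor. A minimal NumPy sketch (the shapes and values are illustrative assumptions):

[code=python]
import numpy as np

v = np.array([1.0, 2.0])          # shape (2,)   -- a rank-1 tensor
M = np.arange(6.0).reshape(2, 3)  # shape (2, 3) -- a rank-2 tensor

# Outer (tensor) product: every element of v multiplies every element of M.
T = np.tensordot(v, M, axes=0)    # shape (2, 2, 3) -- a rank-3 tensor
print(T.shape)                    # (2, 2, 3)
[/code]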

2. How is tensor multiplication different from matrix multiplication?

Tensor multiplication is an extension of matrix multiplication to higher dimensions. While matrix multiplication involves multiplying two 2D arrays, tensor multiplication can involve tensors with any number of dimensions. In addition, a tensor multiplication must specify which indices, if any, are contracted (summed over), whereas matrix multiplication always contracts the single shared inner index; a sketch of this generalization follows.
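As a sketch of that generalization (using NumPy's einsum; the shapes are illustrative assumptions), ordinary matrix multiplication contracts one index, and the same contraction pattern extends to a higher-rank tensor:

[code=python]
import numpy as np

A = np.random.rand(2, 3)
B = np.random.rand(3, 4)
# Matrix multiplication: contract (sum over) the shared index j.
C = np.einsum('ij,jk->ik', A, B)    # same as A @ B, shape (2, 4)

T = np.random.rand(2, 3, 5)
# The same contraction applied to a rank-3 tensor: again sum over j.
U = np.einsum('ijm,jk->ikm', T, B)  # shape (2, 4, 5)
print(C.shape, U.shape)
[/code]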

3. What are the different types of tensor multiplication?

There are two main types of tensor multiplication: element-wise multiplication and the tensor dot product. Element-wise multiplication multiplies corresponding elements of two tensors of the same shape, while the tensor dot product (contraction) sums over one or more paired indices, generalizing matrix multiplication.
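A short sketch of the two operations side by side (NumPy; the shapes are illustrative assumptions):

[code=python]
import numpy as np

A = np.arange(6.0).reshape(2, 3)
B = np.ones((2, 3))

# Element-wise multiplication: shapes must match, result keeps the same shape.
E = A * B                                       # shape (2, 3)

# Tensor dot product (contraction): sum over the paired axes.
D = np.tensordot(A, B, axes=([0, 1], [0, 1]))   # a scalar here: 15.0
print(E.shape, D)                               # (2, 3) 15.0
[/code]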

4. What are the applications of tensor multiplication?

Tensor multiplication has many applications in various fields of science and engineering, including image and signal processing, machine learning, quantum mechanics, and general relativity. It is also used in the representation and manipulation of data in deep learning and neural networks.

5. Are there any limitations to tensor multiplication?

While tensor multiplication is a powerful mathematical operation, it can become computationally expensive when dealing with large tensors. Additionally, the interpretation of the results of tensor multiplication can be challenging due to the high dimensionality of tensors. Therefore, it is essential to have a good understanding of the underlying concepts and properties of tensor multiplication before using it in practical applications.
