Saw
Gold Member
@PeroK, you misquoted me, please edit your quote in post 48. My wink accompanied the joke directed at PeterDonis, where I gave him back the recommendation to go and study vector spaces. I included it to show that I was saying that in a playful way, with constructive intention, following his own suggestion. But placed where you put it, it makes me sound patronizing, which is far from my intention.
"Naive" is an implicit assumption that the analysis is basically correct. So the burden of disproving it is displaced to you.
The two vectors at the extremes are orthogonal, i.e. totally linearly independent, and this is a 2D space, so forcefully they span the whole space, including the vector in the middle, which is therefore fully redundant... I don´t see how this contradicts what I am saying, rather it looks like a confirmation thereof.
Furthermore, it is not enough if you list differences btw the two concepts, you should mention why such differences are relevant to the discussion. I brought up the idea of independence btw basis vectors just to show that it makes sense to assign the same units to T and X because, no matter if they seem independent from each other, since that does not prevent one from seeing them as ways to look at the same thing. However, if as I already mentioned, we adopted an operational method to measure T and X where these axes were only linearly independent but not orthogonal, that would not undermine the argument...
PeroK said: The two concepts are, therefore, more subtly different than your naive analysis involving "shadows" would suggest.
"Naive" is an implicit assumption that the analysis is basically correct. So the burden of disproving it is displaced to you.
PeroK said: From a pure mathematical perspective, linear independence is an algebraic property. It depends only on the addition of vectors and multiplication by scalars. Whereas, orthogonality is an analytic property, as it depends on the inner product.

In what sense is the inner product analytic? Is "analytic" here referring to calculus? If so, that would be the case when the dot product is an integral, as in a Hilbert space, but not here.
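To be concrete about what I mean (the ##L^2[a,b]## example is mine, not something raised earlier in the thread): in a function space such as ##L^2[a,b]## the inner product is the integral
$$\langle f, g \rangle = \int_a^b f(x)\, g(x)\, dx,$$
which genuinely involves calculus, whereas in ##\mathbb R^2## the dot product ##\langle u, v \rangle = u_1 v_1 + u_2 v_2## is plain algebra.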
PeroK said: Moreover, linear independence of a set of (more than two) vectors is not a case of mutual linear independence only. E.g. the vectors ##(0,1), (1,1), (1,0)## in ##\mathbb R^2## are all pairwise linearly independent, but form a linearly dependent set. Orthogonality, on the other hand, is only a pairwise concept. A set of vectors is orthogonal if and only if every pair of vectors in the set is orthogonal.
The two vectors at the extremes are orthogonal, i.e. totally linearly independent, and this is a 2D space, so they necessarily span the whole space, including the vector in the middle, which is therefore fully redundant... I don't see how this contradicts what I am saying; rather, it looks like a confirmation of it.
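Spelling that out with your own three vectors:
$$(1,1) = 1\cdot(1,0) + 1\cdot(0,1),$$
so the middle vector is a combination of the two orthogonal ones at the extremes: every pair is linearly independent, yet the set of three is dependent. That is exactly the redundancy I am describing.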
Furthermore, it is not enough to list the differences between the two concepts; you should also say why those differences are relevant to the discussion. I brought up the idea of independence between basis vectors just to show that it makes sense to assign the same units to T and X: even if the axes seem independent of each other, that does not prevent one from seeing them as two ways of looking at the same thing. However, as I already mentioned, if we adopted an operational method for measuring T and X under which these axes were only linearly independent but not orthogonal, that would not undermine the argument...
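As a minimal sketch of that last scenario (the angle ##\theta## is just an illustrative placeholder, not taken from any actual measurement procedure): take ##e_T = (1,0)## and ##e_X = (\cos\theta, \sin\theta)## with ##0 < \theta < \pi## and ##\theta \neq \pi/2##. Then ##e_T \cdot e_X = \cos\theta \neq 0##, so the axes are not orthogonal, yet they remain linearly independent and still span ##\mathbb R^2##: every event decomposes uniquely as ##v = a\,e_T + b\,e_X##, with both coordinates carrying the same units.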