# Orthogonality and physical applications

1. Aug 4, 2014

### 16universes

Just as we have orthogonal vectors/vector spaces/etc., we can have orthogonal functions/function spaces/etc. I'm trying to apply these concepts to physical processes. Here's a general idea of what I'm doing:

Suppose you have a physical quantity you are trying to measure, $F$, and it depends on two other physical processes, $A$ and $B$.

Consider the case where $A$ and $B$ are completely independent from one another:

We apply process $A$ only and measure $F_A$.
Next, we apply process $B$ and measure $F_B$.
We then apply processes $A$ and $B$ simultaneously and measure $F_{AB}$.
Since the processes are independent, we expect to find that $F_A + F_B = F_{AB}$.

We could say that processes $A$ and $B$ are "orthogonal", and that their inner product is

$$\langle A \cdot B \rangle = 0$$

Contrast this to the case where $A$ and $B$ are not completely independent:

$$F_A + F_B \neq F_{AB}$$

and their inner product is

$$\langle A \cdot B \rangle \neq 0$$

Is it accurate to say that the inner product is a quantity that represents "dependency" between processes $A$ and $B$? Or is there a more specific physical interpretation? And, can the value of the inner product be used to say anything in particular about the processes $A$ and $B$?

When dealing with vectors, the inner product gives information about the angle between the two vectors, which has physical significance for the vectors. When we apply concepts of "dependency" and "orthogonality" to sets of functions, what information is the inner product revealing to us?
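To make the function case concrete: the usual inner product of two real functions on an interval is $\langle f, g \rangle = \int_a^b f(x)\,g(x)\,dx$, and "orthogonal" means that integral is zero. A minimal numerical sketch (the choice of $\sin$, $\cos$, and the interval $[0, 2\pi]$ is just illustrative):

```python
import math

def inner_product(f, g, a, b, n=100_000):
    """Approximate <f, g> = integral of f(x) g(x) over [a, b] by a midpoint sum."""
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h) for k in range(n)) * h

# sin and cos are orthogonal on [0, 2*pi]; sin is not orthogonal to itself.
print(inner_product(math.sin, math.cos, 0, 2 * math.pi))  # ~ 0
print(inner_product(math.sin, math.sin, 0, 2 * math.pi))  # ~ pi
```

Here the inner product being zero says the two functions "overlap" not at all in the averaged sense, which is the analogue of two vectors at right angles.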

2. Aug 4, 2014

### MrAnchovy

You are putting the cart before the horse: first you must define what you mean by the inner product. You will then know what its significance is.

Avoid the word "dependency" as it misleadingly implies causality in a way that "independence" does not.

3. Aug 4, 2014

### 16universes

Physical processes are described by functions, so the inner product I had in mind was the one defined for functions.

@MrAnchovy: Based on the way you respond to people on the forums, I can tell that non-mathematicians really get under your skin. I'm not a mathematician, and I know I haven't adequately described my question mathematically. You're an intelligent guy and I respect your level of expertise. I'd appreciate it if you could look past the inaccuracies in my question, though, and focus on the underlying problem. I know you understood my question, and I'm sure you have a great answer.

4. Aug 4, 2014

### jbunniii

It is worth noting that, in the context of probability theory, statistical independence is a stronger condition than orthogonality. Independence implies orthogonality (for zero mean random variables) but not vice versa.

For example, consider tossing two coins, and enumerate each outcome as +1 for heads, and -1 for tails.

Define two random variables as follows:

X = the sum of the two outcomes
Y = the difference of the two outcomes

Thus the values of X and Y depend on the coin toss results as follows:

HH: X = 2, Y = 0
HT: X = 0, Y = 2
TH: X = 0, Y = -2
TT: X = -2, Y = 0

Definition of orthogonality: E[XY] = 0; related notion of uncorrelatedness in the non-zero mean case: E[XY] = E[X]E[Y]

Definition of independence: P(X = x and Y = y) = P(X = x)P(Y = y)

Note that E[X] = E[Y] = 0, and XY = 0 in all four cases, so E[XY] = E[X]E[Y] = 0, in other words, the random variables are orthogonal (and uncorrelated).

However, X and Y are not independent: if X = 0, then Y cannot be 0, and vice versa. Thus for example, P(X = 0 and Y = 0) = 0, whereas P(X = 0)P(Y = 0) = (1/2)(1/2) = 1/4.
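Enumerating the four equally likely outcomes makes this easy to check directly; a small Python sketch using exact arithmetic:

```python
from fractions import Fraction
from itertools import product

# The four equally likely toss pairs, coded +1 = heads, -1 = tails.
outcomes = list(product([1, -1], repeat=2))
p = Fraction(1, 4)

X = {o: o[0] + o[1] for o in outcomes}   # sum of the two outcomes
Y = {o: o[0] - o[1] for o in outcomes}   # difference of the two outcomes

def E(f):
    return sum(p * f[o] for o in outcomes)

E_XY = sum(p * X[o] * Y[o] for o in outcomes)
print(E(X), E(Y), E_XY)   # 0 0 0  -> orthogonal and uncorrelated

# But not independent: P(X = 0 and Y = 0) = 0, while P(X = 0) P(Y = 0) = 1/4.
p_joint = sum(p for o in outcomes if X[o] == 0 and Y[o] == 0)
p_marg = sum(p for o in outcomes if X[o] == 0) * sum(p for o in outcomes if Y[o] == 0)
print(p_joint, p_marg)    # 0 1/4
```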

Last edited: Aug 4, 2014
5. Aug 4, 2014

### 16universes

Hmmm. Interesting.

6. Aug 8, 2014

### Stephen Tashi

What are some specific examples of the above situation?

7. Aug 8, 2014

### 16universes

Here's one example: Suppose you were cleaning a surface. You can use soap, water, or both, before wiping down the surface.

You apply water only, then wipe the surface. You then apply soap only and wipe the surface. Is the "cleanliness" of the surface equivalent to the case where you apply both soap and water before wiping the surface?

Suppose it was possible to define an "inner product" for these cleaning "processes". If we found no difference between using soap and water separately as compared to using soap and water simultaneously, then this inner product would be zero (we might say that using soap and water are "orthogonal processes" with respect to cleaning our surface). If we found that there was some difference in the "cleanliness" when we apply soap and water separately as compared to applying both simultaneously, then our inner product would be non-zero.

Is there a generalized meaning for non-zero inner products (assuming that an inner product can be defined)? In other words, what can be said about processes $A$ and $B$ if their inner product is non-zero? Or does this require a specific application before anything can be said at all?

Does this help to clarify the question?

8. Aug 8, 2014

### MrAnchovy

I think you must be confusing me with someone else, but to show there's no hard feelings I'll be more explicit:

> Is it accurate to say that the inner product is a quantity that represents "dependency" between processes A and B?

No. The word "dependency" implies causality and is not used in this context. If the two processes are not independent we might say that they are "correlated".

> Or is there a more specific physical interpretation? And, can the value of the inner product be used to say anything in particular about the processes A and B?

Possibly, but you haven't actually defined how to calculate the value of the inner product of two processes. I'm not criticising you for any mistake or lack of knowledge here; I'm simply saying that the words "inner product" don't have a universal meaning, so if you want to apply them to something they are not normally used for, you have to give them a meaning yourself. Once you have stated how to calculate the inner product of two processes (and before you do that, it's probably a good idea to state exactly what you mean by a "process"), you will almost certainly have an insight into the significance of its value.

9. Aug 8, 2014

### 16universes

I appreciate the help. I naively assumed that "inner product" had one specific definition. My experience revolves around vectors representing physical quantities, quantum mechanics, and single-variable functions. On some level I assumed that an inner product revealed the amount of "interaction" between two abstract objects.

Do you know of any sources that discuss inner products on more of an abstract level? Is there even such a thing? All sources I've come across have definitions that involve vectors and vector spaces. Is it even possible to make it more abstract than that?

Are there any examples (or is it possible to create an example) of an inner product that is defined differently than what we usually encounter with vectors/functions?

10. Aug 8, 2014

### Stephen Tashi

In your original post, are you literally taking $F_A + F_B$ to mean arithmetic addition, or does it stand for the vague idea of applying process B to something that process A has already been applied to?

I think you are expressing an idea like "linearity", not an idea like "orthogonality".

For a real valued linear transformation $F$ of a real number argument, $F(A+B) = F(A) + F(B)$.

If you want to generalize the idea of linearity, you must generalize the meaning of "+". I think you must give the "+" in the expression $F(A + B)$ a different meaning than the "+" in the expression $F(A) + F(B)$.
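To illustrate the point about linearity: additivity $F(A+B) = F(A) + F(B)$ is a property some transformations have and others lack. A trivial sketch (the specific functions are just illustrative):

```python
def F_linear(x):
    return 3.0 * x    # a linear transformation: F(a + b) = F(a) + F(b)

def F_nonlinear(x):
    return x ** 2     # nonlinear: additivity fails

a, b = 2.0, 5.0
print(F_linear(a + b) == F_linear(a) + F_linear(b))           # True
print(F_nonlinear(a + b) == F_nonlinear(a) + F_nonlinear(b))  # False
```

Note that both checks use ordinary arithmetic "+" on both sides; the question above is whether the "+" acting on processes has been given any meaning at all.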

You also have used the notation $F_{AB}$, so you need to say what you mean by the multiplicative notation $AB$.

In your example, suppose that when we apply a certain amount of soap twice, we get the surface exactly as clean as when we apply twice that amount of soap once. Would we say soap was "orthogonal" to itself?

11. Aug 8, 2014

### 16universes

Hmmm. You've raised some interesting points I haven't thought about. I need some time to think this through. I'll post my thoughts sometime in the next few days.

Thanks for the help everyone.

12. Aug 8, 2014

### MrAnchovy

No, the definition of an inner product depends on the domain - and there can even be different definitions for the same domain.

However any transformation that is called an inner product will usually satisfy the following:
1. $\langle u + v, w \rangle = \langle u, w \rangle + \langle v, w \rangle$
2. $\langle \alpha v, w \rangle = \alpha \langle v, w \rangle$
3. $\langle v, w \rangle = \langle w, v \rangle$ (or $\langle v, w \rangle = \overline{\langle w, v \rangle}$ if the range of the transformation is complex)
4. $\langle v, v \rangle \ge 0$ and $\langle v, v \rangle = 0$ if and only if $v = 0$
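These four properties can be sanity-checked numerically for the familiar dot product on $\mathbb{R}^3$; a sketch with arbitrarily chosen vectors:

```python
def ip(u, v):
    """The standard dot product on R^n: one concrete inner product."""
    return sum(x * y for x, y in zip(u, v))

def add(a, b):
    return [x + y for x, y in zip(a, b)]

def scale(c, a):
    return [c * x for x in a]

u, v, w = [1.0, 2.0, -1.0], [0.5, -3.0, 2.0], [4.0, 0.0, 1.0]
alpha = 2.5

print(ip(add(u, v), w) == ip(u, w) + ip(v, w))     # property 1: True
print(ip(scale(alpha, v), w) == alpha * ip(v, w))  # property 2: True
print(ip(v, w) == ip(w, v))                        # property 3: True
print(ip(v, v) > 0)                                # property 4: True (v != 0)
```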

However as Stephen indicated, it is important to deal with addition and multiplication first before looking at an inner product.

Well, that is not completely wide of the mark. If the inner product induces a norm $\lvert v \rvert = \sqrt{\langle v, v \rangle}$, we should have $\langle u, v \rangle = \lvert u \rvert \lvert v \rvert \cos \psi$ - and we can call $\psi$ a measure of correlation (again, I don't like the word "interaction"; two values can be correlated without interacting).
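As a quick illustration of $\cos \psi$ as a correlation measure: for zero-mean data vectors, the cosine of the angle between them is exactly the Pearson correlation coefficient. A sketch with illustrative vectors:

```python
import math

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def cos_angle(u, v):
    """cos(psi) = <u, v> / (|u| |v|), with the Euclidean norm."""
    return dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))

# Zero-mean "data" vectors: cos(psi) is then the Pearson correlation.
u = [-1.0, 0.0, 1.0]
v = [-2.0, 0.0, 2.0]    # a scaled copy of u: perfectly correlated
w = [1.0, -2.0, 1.0]    # orthogonal to u: uncorrelated
print(cos_angle(u, v))  # ~ 1.0
print(cos_angle(u, w))  # 0.0
```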

I don't know of any abstract sources; the definition of an inner product is really only a step on the way to creating an inner product space, and I don't think it would really help understanding of e.g. a Hilbert space to draw comparisons of the operation of the inner product transformation with e.g. 3D Euclidean geometry.

Have you studied the tensor dot and double dot products?

Last edited: Aug 8, 2014
13. Aug 8, 2014

### WWGD

I think one can restrict the space of random variables in some way so that correlation, or a slightly changed version of it, is an inner product.

14. Aug 8, 2014

### strangerep

16universes,

Have you studied Quantum Mechanics? (If not, I think you might like it.)

15. Aug 9, 2014

### 16universes

@MrAnchovy: Thanks. I'm going to read through this later. The equations aren't showing up on my phone so it's difficult to read.

Correlation is probably a better choice of wording. I'm thinking of these abstract "processes" as operators of some kind operating on some abstract "state". The word "interaction" came to mind because I'm interested in what happens when the "processes" are simultaneously applied to a "state". Do they "see" each other while operating on the same state? That was my line of thinking.

I might have encountered double dot and tensor dot products while taking a course in general relativity, but I don't exactly remember. Are those used when dealing with manifolds?

@strangerep: Yes, I studied QM back in grad school, but at the time it was a little over my head. Ha, I wasn't a big fan of it. I'm very much a visual learner and have horrible retention from reading words/equations. Bra-ket notation made it a little easier because of its "visual" way of communicating the mathematics, but overall I struggled with QM. If I took it again, I'd probably get more out of it though.

Last edited: Aug 9, 2014
16. Aug 10, 2014

### WWGD

Last edited: Aug 10, 2014
17. Aug 10, 2014

### strangerep

Try Ballentine, chapters 1 and 2, and see how you go this time...

18. Aug 10, 2014