# Max min distribution problem

1. Jun 2, 2012

### S_David

Hello,

I have this probability:

$$\text{Pr}\left\{\underset{i,j}{\max}\,\underset{n}{\min}\left[X_i(n)+X_j(n)\right]<a\right\}$$

where X_i(n) and X_j(n) are i.i.d. for all i,j, and n. Can I find the distribution of

$$X_i(n_{\text{min}})$$

where:

$$\underset{n}{\min}\left[X_i(n)+X_j(n)\right]=X_i(n_{\text{min}})+X_j(n_{\text{min}})$$

??

2. Jun 4, 2012

### Stephen Tashi

I find your notation mysterious. Why is the index $n$ in parenthesis versus being a subscript like $i$ and $j$ are?

3. Jun 4, 2012

### S_David

Basically, I have a set of random variables $$X_i(n)$$ for i=1,...,K and n=1,...,N. So, X_i(n) means the nth sample of X_i. It is hard to explain abstractly; it is easier in communication-systems terms.

4. Jun 4, 2012

### Stephen Tashi

Are you saying that you only know the above probability and do not know the common distribution of the $X_i(n)$?

5. Jun 4, 2012

### S_David

I know the distribution of X_i(n), but I do not know the distribution of X_i(n_min), because the minimization is done over X_i(n)+X_j(n).

6. Jun 5, 2012

### haruspex

I don't understand how
$$\underset{n}{\min}\left[X_i(n)+X_j(n)\right]=X_i(n_{\text{min}})+X_j(n_{\text{min}})$$
serves as a definition of
$$n_{\text{min}}$$
Isn't $n_{\text{min}}$ a function of $i$ and $j$?

7. Jun 5, 2012

### S_David

OK, let me state the problem in another way: suppose I have N×K i.i.d. random variables $$X_{i,n}$$ for i=1,...,K and n=1,...,N.

Define

$$X_{ij}=\underset{n}{\min}\left[X_{i,n}+X_{j,n}\right]$$

for

$$i\neq j$$

Now I can find the distribution of $X_{ij}$, but I need the distribution of:

$$\underset{i,j}{\max\,}X_{ij}$$

Is that doable?
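In the meantime, the probability in post 7 can at least be checked numerically. Below is a minimal Monte Carlo sketch; the thread never fixes a concrete distribution for the $X_{i,n}$, so the standard normal used here (and the values of K, N, and the threshold a) are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example parameters; the thread does not specify them.
K, N = 4, 5          # i, j = 1..K, n = 1..N
a = 0.0              # threshold in Pr{ max_{i != j} X_ij < a }
trials = 20_000

count = 0
for _ in range(trials):
    X = rng.standard_normal((K, N))      # X[i, n] plays the role of X_{i,n}
    S = X[:, None, :] + X[None, :, :]    # S[i, j, n] = X_{i,n} + X_{j,n}
    Xij = S.min(axis=2)                  # X_ij = min over n of the sums
    np.fill_diagonal(Xij, -np.inf)       # exclude the pairs with i == j
    if Xij.max() < a:                    # event { max_{i != j} X_ij < a }
        count += 1

print(count / trials)  # Monte Carlo estimate of the probability
```

A histogram of the per-trial values `Xij.max()` would likewise give an empirical picture of the max-min distribution itself, dependence and all.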

8. Jun 5, 2012

### chiro

This looks like a standard order statistics problem. Are the domains for i and j fixed?

9. Jun 5, 2012

### S_David

i=1,...,K and j=1,...,K, with i not equal to j.

The problem is that the X_{ij} are not independent.
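To spell out why that matters (a standard fact, stated in the notation of post 7): if the $X_{ij}$ were mutually independent, the CDF of the maximum would factor as in the usual order-statistics result,

$$\text{Pr}\left\{\underset{i\neq j}{\max}\,X_{ij}<a\right\}=\prod_{i\neq j}\text{Pr}\left\{X_{ij}<a\right\},$$

but here, for example, $X_{12}$ and $X_{13}$ both depend on the same samples $X_{1,n}$, so this factorization does not hold.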

10. Jun 5, 2012

### chiro

Perhaps you could create an uncorrelated basis and go from there. Are you aware of Principal Component Analysis?

11. Jun 6, 2012

### S_David

Not really, what is that?

12. Jun 6, 2012

### chiro

It's the idea behind principal components:

http://en.wikipedia.org/wiki/Principal_component_analysis

The idea is to create an orthogonal (though not necessarily orthonormal) basis in which each basis vector is a linear combination of your random variables. Essentially, you solve an optimization problem where one constraint forces the off-diagonal entries of the covariance matrix in the new basis to zero.

This gives an uncorrelated basis, and from there you can use techniques that assume uncorrelated random variables.

This isn't enough to solve your problem on its own, but I think it's worth looking into as one part of the solution, especially since you are faced with dependencies between the variables.
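A minimal sketch of the decorrelation step described above, via eigendecomposition of the sample covariance matrix (the data here are synthetic, made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic correlated data: rows are observations, columns are variables.
n_samples, n_vars = 1000, 3
A = rng.standard_normal((n_vars, n_vars))
data = rng.standard_normal((n_samples, n_vars)) @ A.T  # correlated columns

# Center the data, then eigendecompose its sample covariance matrix.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# Project onto the eigenvector basis: the new coordinates (the principal
# components) are uncorrelated -- their sample covariance is diagonal.
components = centered @ eigvecs
new_cov = np.cov(components, rowvar=False)

print(np.round(new_cov, 6))  # off-diagonal entries are ~0
```

Note this only removes correlation, not dependence in general, which is one reason it is a partial step here rather than a full solution.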

13. Jun 6, 2012

### S_David

OK, I will have a look at it. Thanks for the help.