Can Principal Component Analysis Solve the Max Min Distribution Problem?

  • Context: Graduate
  • Thread starter: EngWiPy
  • Tags: Distribution, Max

Discussion Overview

The discussion revolves around the application of Principal Component Analysis (PCA) to solve a problem related to the distribution of random variables defined by a max-min distribution scenario. Participants explore the mathematical formulation of the problem, the nature of the random variables involved, and the potential utility of PCA in addressing dependencies among these variables.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant presents a probability expression involving the maximum and minimum of i.i.d. random variables, seeking to find the distribution of a specific variable defined by these operations.
  • Another participant questions the notation used, particularly the indexing of the random variables, and seeks clarification on the definitions involved.
  • Several participants discuss the implications of the minimization operation on the distribution of the random variables and express uncertainty about how to define the index for the minimum.
  • A participant suggests that the problem resembles a standard order statistics problem, prompting further exploration of the dependencies between the variables.
  • There is mention of the potential to create an uncorrelated basis using PCA, with one participant expressing unfamiliarity with PCA and another providing a brief explanation of its principles and relevance to the problem.
  • Participants acknowledge that while PCA may not fully resolve the problem, it could be a useful part of the solution given the dependencies present in the random variables.

Areas of Agreement / Disagreement

Participants express varying levels of understanding regarding the notation and the implications of the problem. There is no consensus on the best approach to find the distribution of the specified random variables, and multiple perspectives on the use of PCA and its relevance are presented.

Contextual Notes

The discussion highlights limitations in the clarity of notation and definitions, as well as the complexity introduced by the dependencies among the random variables. The exact nature of the distributions involved remains unresolved.

EngWiPy
Hello,

I have this probability:

[tex]\text{Pr}\left\{\underset{i,j}{\max\,}\underset{n}{\min\,}\left[X_i(n)+X_j(n)\right]<a\right\}[/tex]

where X_i(n) and X_j(n) are i.i.d. for all i,j, and n. Can I find the distribution of

[tex]X_i(n_{\text{min}})[/tex]

where:

[tex]\underset{n}{\min\,}\left[X_i(n)+X_j(n)\right]=X_i(n_{\text{min}})+X_j(n_{\text{min}})[/tex]

??

Thanks in advance
 
S_David said:
where X_i(n) and X_j(n) are i.i.d. for all i,j, and n. Can I find the distribution of

I find your notation mysterious. Why is the index [itex]n[/itex] in parentheses, rather than a subscript like [itex]i[/itex] and [itex]j[/itex]?
 
Stephen Tashi said:
I find your notation mysterious. Why is the index [itex]n[/itex] in parentheses, rather than a subscript like [itex]i[/itex] and [itex]j[/itex]?

Basically, I have a set of random variables [tex]X_i(n)[/tex] for i=1,...,K and n=1,...,N. So X_i(n) denotes the nth random variable in the family X_i. It is hard to explain abstractly; it is easier in the language of communication systems.
 
S_David said:
I have this probability:

[tex]\text{Pr}\left\{\underset{i,j}{\max\,}\underset{n}{\min\,}\left[X_i(n)+X_j(n)\right]<a\right\}[/tex]

where X_i(n) and X_j(n) are i.i.d. for all i,j, and n.

Are you saying that you only know the above probability and do not know the common distribution of the [itex]X_i(n)[/itex]?
 
Stephen Tashi said:
Are you saying that you only know the above probability and do not know the common distribution of the [itex]X_i(n)[/itex]?

I know the distribution of X_i(n), but I do not know the distribution of X_i(n_min), because the minimization is done over the sum X_i(n)+X_j(n).
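The effect can be checked numerically. A minimal Monte Carlo sketch (the choice of Exp(1) for the samples is purely illustrative, as are N and the trial count):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8             # illustrative number of indices n
trials = 100_000

# Illustrative assumption: X_i(n) and X_j(n) are i.i.d. Exp(1).
xi = rng.exponential(1.0, size=(trials, N))
xj = rng.exponential(1.0, size=(trials, N))

# n_min is the index minimizing the sum X_i(n) + X_j(n)
n_min = np.argmin(xi + xj, axis=1)
xi_at_min = xi[np.arange(trials), n_min]   # X_i(n_min)

# Selecting n through the sum biases X_i(n_min) toward small values,
# so its mean falls below the unconditional mean E[X_i(n)] = 1.
print(xi_at_min.mean())   # noticeably below 1
print(xi.mean())          # close to 1
```

So X_i(n_min) is not distributed like X_i(n); the selection also couples it to the X_j samples.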
 
S_David said:
Can I find the distribution of
[tex]X_i(n_{\text{min}})[/tex]
where:
[tex]\underset{n}{\min\,}\left[X_i(n)+X_j(n)\right]=X_i(n_{\text{min}})+X_j(n_{\text{min}})[/tex]

I don't understand how
[tex]\underset{n}{\min\,}\left[X_i(n)+X_j(n)\right]=X_i(n_{\text{min}})+X_j(n_{\text{min}})[/tex]
serves as a definition of
[tex]n_{\text{min}}[/tex]
Isn't [itex]n_{\text{min}}[/itex] a function of i and j?
 
haruspex said:
I don't understand how
[tex]\underset{n}{\min\,}\left[X_i(n)+X_j(n)\right]=X_i(n_{\text{min}})+X_j(n_{\text{min}})[/tex]
serves as a definition of
[tex]n_{\text{min}}[/tex]
Isn't [itex]n_{\text{min}}[/itex] a function of i and j?

OK, let me state the problem in another way: suppose I have N×K i.i.d. random variables [tex]X_{i,n}[/tex] for i=1,...,K and n=1,...,N.

Define

[tex]X_{ij}=\underset{n}{\min\,}\left[X_{i,n}+X_{j,n}\right][/tex]

for

[tex]i\neq j[/tex]

Now I can find the distribution X_{ij}, but I need the distribution of:

[tex]\underset{i,j}{\max\,}X_{ij}[/tex]

Is that doable?
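Even without a closed form, the target CDF can be estimated by simulation. A sketch under an illustrative Exp(1) assumption on the X_{i,n} (K, N, and the evaluation point a are arbitrary choices here):

```python
import numpy as np

rng = np.random.default_rng(1)
K, N = 4, 8          # illustrative sizes
trials = 20_000

# Illustrative assumption: X_{i,n} i.i.d. Exp(1).
x = rng.exponential(1.0, size=(trials, K, N))

# X_ij = min_n (X_{i,n} + X_{j,n}) for every pair i != j
sums = x[:, :, None, :] + x[:, None, :, :]   # shape (trials, K, K, N)
x_ij = sums.min(axis=-1)                     # shape (trials, K, K)

off_diag = ~np.eye(K, dtype=bool)            # exclude i == j
max_ij = x_ij[:, off_diag].max(axis=1)       # max over i != j

# Empirical estimate of Pr{ max_{i,j} X_ij < a }
a = 1.0
print((max_ij < a).mean())
```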
 
S_David said:
OK, let me state the problem in another way: suppose I have N×K i.i.d. random variables ... Is that doable?

This looks like a standard order statistics problem. Are the domains for i and j fixed?
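If the X_{ij} were independent, the order-statistics identity would give the CDF of the maximum directly; since X_{ij}=X_{ji}, there are K(K-1)/2 distinct variables, so (a sketch, valid only under that independence assumption):

[tex]\text{Pr}\left\{\underset{i,j}{\max\,}X_{ij}<a\right\}=\prod_{i<j}\text{Pr}\left\{X_{ij}<a\right\}=\left[F_{X_{ij}}(a)\right]^{K(K-1)/2}[/tex]

The question is whether this factorization survives the dependence among the X_{ij}.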
 
chiro said:
This looks like a standard order statistics problem. Are the domains for i and j fixed?

i=1,...,K and j=1,...,K, with i ≠ j.

The problem is that X_ij are not independent.
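The dependence is easy to exhibit numerically: X_12 and X_13 share the samples X_1(n), so they are positively correlated. A sketch (Exp(1) is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 8
trials = 100_000

# Illustrative assumption: all samples i.i.d. Exp(1).
x1 = rng.exponential(1.0, size=(trials, N))
x2 = rng.exponential(1.0, size=(trials, N))
x3 = rng.exponential(1.0, size=(trials, N))

x12 = (x1 + x2).min(axis=1)   # X_12 = min_n (X_1(n) + X_2(n))
x13 = (x1 + x3).min(axis=1)   # X_13 = min_n (X_1(n) + X_3(n))

# Both minima depend on the same x1, so the correlation is positive
# and the product rule for the CDF of the max does not apply.
print(np.corrcoef(x12, x13)[0, 1])
```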
 
S_David said:
i=1,...,K and j=1,...,K, with i ≠ j.

The problem is that X_ij are not independent.

Perhaps you could create an uncorrelated basis and go from there. Are you aware of Principal Component Analysis?
 
chiro said:
Perhaps you could create an uncorrelated basis and go from there. Are you aware of Principal Component Analysis?

Not really, what is that?
 
S_David said:
Not really, what is that?

It's the main idea of principal components.

http://en.wikipedia.org/wiki/Principal_component_analysis

The idea is to create an orthogonal (but not necessarily orthonormal) basis in which each basis vector is a linear combination of your random variables. In effect you solve an optimization problem whose constraint is that the covariance matrix of the new basis is diagonal, i.e. all off-diagonal covariances are zero.

This gives an uncorrelated basis, and from there you can use techniques that would otherwise require uncorrelated random variables.

This isn't enough to solve your problem, but I think it's worth looking into as one part of the solution, especially since you are faced with the dependencies between the variables.
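As a concrete sketch of that diagonalization step (on toy correlated Gaussian data, not the X_ij of this thread), PCA amounts to an eigendecomposition of the covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy correlated data standing in for dependent random variables
# (illustrative only; not the X_ij above).
z = rng.normal(size=(10_000, 3))
mix = np.array([[1.0, 0.5, 0.2],
                [0.0, 1.0, 0.4],
                [0.0, 0.0, 1.0]])
data = z @ mix.T                 # columns are now correlated

# PCA: eigenvectors of the covariance matrix form an orthogonal basis
# in which the covariance matrix becomes diagonal (uncorrelated axes).
cov = np.cov(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
scores = (data - data.mean(axis=0)) @ eigvecs

new_cov = np.cov(scores, rowvar=False)
print(np.round(new_cov, 3))      # off-diagonal entries are ~0
```

The projected coordinates (scores) are uncorrelated by construction; their variances are the eigenvalues of the original covariance matrix.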
 
chiro said:
This isn't enough to solve your problem, but I think it's worth looking into as one part of the solution, especially since you are faced with the dependencies between the variables.

OK, I will have a look at it. Thanks for the help.
 
