
Max min distribution problem

  1. Jun 2, 2012 #1

    I have this probability:

    [tex]\Pr\left[\min_n \left(X_i(n)+X_j(n)\right)\le x\right][/tex]

    where X_i(n) and X_j(n) are i.i.d. for all i, j, and n. Can I find the distribution of

    [tex]X_i(n_{\min}),\quad\text{where } n_{\min}=\arg\min_n\left[X_i(n)+X_j(n)\right]?[/tex]

    Thanks in advance
  3. Jun 4, 2012 #2

    Stephen Tashi

    Science Advisor

    I find your notation mysterious. Why is the index [itex] n [/itex] in parentheses rather than a subscript, as [itex] i [/itex] and [itex] j [/itex] are?
  4. Jun 4, 2012 #3
    Basically, I have a set of random variables [tex]X_i(n)[/tex] for i=1,...,K and n=1,...,N. So X_i(n) means the nth random variable of the sequence X_i. It is hard to explain; it is easier in the context of communication systems.
  5. Jun 4, 2012 #4

    Stephen Tashi


    Are you saying that you only know the above probability and do not know the common distribution of the [itex] X_i(n) [/itex] ?
  6. Jun 4, 2012 #5
    I know the distribution of X_i(n), but I do not know the distribution of X_i(n_min), because the minimization is done over X_i(n)+X_j(n).
  7. Jun 5, 2012 #6


    Science Advisor
    Homework Helper
    Gold Member

    I don't understand how [itex] n_{\min} = \arg\min_n \left[ X_i(n)+X_j(n) \right] [/itex] serves as a definition of [itex] X_i(n_{\min}) [/itex]. Isn't [itex] n_{\min} [/itex] a function of [itex] i [/itex] and [itex] j [/itex]?
  8. Jun 5, 2012 #7
    OK, let me state the problem in another way: suppose I have K×N i.i.d. random variables [tex]X_{i,n}[/tex] for i=1,...,K and n=1,...,N. For each pair, define

    [tex]X_{ij}=\min_{n=1,\dots,N}\left(X_{i,n}+X_{j,n}\right),\qquad i\neq j[/tex]

    Now I can find the distribution of X_{ij}, but I need the distribution of:

    [tex]\max_{i\neq j}X_{ij}[/tex]

    Is that doable?
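For what it's worth, the max-min quantity above can always be estimated by Monte Carlo. The sketch below assumes, purely for illustration, that the X_{i,n} are Exponential(1); the function name `sample_max_min` and that distribution are my own choices, not part of the problem statement:

```python
import numpy as np

def sample_max_min(K=4, N=5, trials=20000, seed=0):
    """Draw Monte Carlo samples of max_{i != j} min_n (X_{i,n} + X_{j,n})."""
    rng = np.random.default_rng(seed)
    # Illustrative assumption: X_{i,n} ~ Exponential(1); substitute the
    # actual common distribution of the X_{i,n} here.
    X = rng.exponential(1.0, size=(trials, K, N))
    # Pairwise sums: S[t, i, j, n] = X[t, i, n] + X[t, j, n]
    S = X[:, :, None, :] + X[:, None, :, :]
    Xij = S.min(axis=-1)            # min over n for each pair (i, j)
    idx = np.arange(K)
    Xij[:, idx, idx] = -np.inf      # exclude the i == j pairs from the max
    return Xij.max(axis=(1, 2))     # max over all pairs i != j

Z = sample_max_min()
# The empirical CDF of Z approximates the distribution being asked about.
```

This does not give a closed form, but it is a useful check against any candidate analytic answer.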
  9. Jun 5, 2012 #8


    Science Advisor

    This looks like a standard order statistics problem. Is the domain for i and j fixed?
  10. Jun 5, 2012 #9
    i=1,...,K and j=1,...,K, and i does not equal j.

    The problem is that the X_{ij} are not independent.
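The dependence is easy to see numerically: X_{12} and X_{13} both involve the same row X_{1,n}, so their pairwise minima are correlated. A quick sketch (again assuming Exponential(1) entries purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
trials, N = 50000, 5
# Three independent rows of the array (illustrative Exponential(1) entries)
X1 = rng.exponential(1.0, size=(trials, N))
X2 = rng.exponential(1.0, size=(trials, N))
X3 = rng.exponential(1.0, size=(trials, N))
X12 = (X1 + X2).min(axis=1)   # min_n (X_{1,n} + X_{2,n})
X13 = (X1 + X3).min(axis=1)   # min_n (X_{1,n} + X_{3,n})
rho = np.corrcoef(X12, X13)[0, 1]
# rho comes out clearly positive: the minima share X1 and are dependent,
# so standard i.i.d. order statistics results do not apply directly.
```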
  11. Jun 5, 2012 #10



    Perhaps you could create an uncorrelated basis and go from there. Are you aware of Principal Component Analysis?
  12. Jun 6, 2012 #11
    Not really, what is that?
  13. Jun 6, 2012 #12



    It's the main idea of principal component analysis.

    The idea is to create an orthogonal (though not necessarily orthonormal) basis in which each basis vector is a linear combination of your random variables. You solve an optimization problem whose constraint sets the off-diagonal entries of the covariance matrix in the new basis to zero.

    This creates an uncorrelated basis, and from there you can use techniques that would otherwise require uncorrelated random variables.

    This isn't enough to solve your problem on its own, but I think it's worth looking into as one part of the solution, especially since you are faced with dependencies between the variables.
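A minimal sketch of the decorrelation step (not a solution to the max-min problem itself): projecting centered data onto the eigenvectors of its covariance matrix yields components whose covariance matrix is diagonal. The mixing matrix and sample sizes here are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(1)
# Make three correlated variables by mixing independent normals
A = rng.normal(size=(3, 3))
data = rng.normal(size=(10000, 3)) @ A.T

cov = np.cov(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)            # orthogonal eigenbasis
components = (data - data.mean(axis=0)) @ eigvecs  # principal components

# Covariance in the new basis is diagonal: the components are uncorrelated
new_cov = np.cov(components, rowvar=False)
off_diag = new_cov - np.diag(np.diag(new_cov))
```

Note that uncorrelated is weaker than independent except in the jointly Gaussian case, so this decorrelation is only a first step for the problem above.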
  14. Jun 6, 2012 #13
    OK, I will have a look at it. Thanks for interacting