Minimisation over random variables

In summary, the conversation asks whether ##\mathbb{E}x\leq\mathbb{E}y## implies ##\mathbb{E}F(x)\leq\mathbb{E}F(y)## for a function ##F:\mathbb{R}_+\to\mathbb{R}_+## with ##F(y)/y## decreasing. The answer is no: taking ##F(y)=1/y##, a variable ##x## supported on ##[1,2]## and a variable ##y## supported on ##[3,4]## satisfy ##\mathbb{E}x<\mathbb{E}y## yet ##\mathbb{E}F(y)<\mathbb{E}F(x)##. A counterexample can still be constructed even under the additional assumptions that ##F## is increasing and ##F(0)=0##.
  • #1
TaPaKaH
Suppose we have a function ##F:\mathbb{R}_+\to\mathbb{R}_+## such that ##\frac{F(y)}{y}## is decreasing.
Let ##x## and ##y## be some ##\mathbb{R}_+##-valued random variables.
Would ##\mathbb{E}x\leq\mathbb{E}y## imply that ##\mathbb{E}F(x)\leq\mathbb{E}F(y)##?
 
  • #2
Suppose we take ##F(y) = 1/y##. Then certainly ##F(y)/y = 1/y^2## is decreasing for positive ##y##.

Now let ##x## be some random variable which is restricted to the interval ##[1,2]## and let ##y## be some other random variable restricted to the interval ##[3,4]##. Thus ##E[x] < E[y]##. But ##F(x)## is restricted to ##[1/2, 1]## and ##F(y)## is restricted to ##[1/4, 1/3]##. So ##E[F(y)] < E[F(x)]##.
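A quick numerical check of this counterexample (a sketch only: it assumes ##x## and ##y## are uniform on the stated intervals, which the post does not specify — any distributions supported there would do):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Assumption for illustration: x ~ Uniform[1, 2] and y ~ Uniform[3, 4]
# (the post only requires them to be supported on these intervals).
x = rng.uniform(1.0, 2.0, n)
y = rng.uniform(3.0, 4.0, n)

F = lambda t: 1.0 / t  # F(t)/t = 1/t^2 is decreasing on (0, inf)

print("E[x]    ~", x.mean())      # ~1.5
print("E[y]    ~", y.mean())      # ~3.5
print("E[F(x)] ~", F(x).mean())   # ~log(2)   ~ 0.693
print("E[F(y)] ~", F(y).mean())   # ~log(4/3) ~ 0.288
# E[x] < E[y], yet E[F(y)] < E[F(x)].
```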
 
  • #3
Your example makes perfect sense.

But what if we assume that ##F(x)## is increasing in ##x## and ##F(0)=0##?
 
  • #4
TaPaKaH said:
Your example makes perfect sense.

But what if we assume that ##F(x)## is increasing in ##x## and ##F(0)=0##?
What if we take
$$F(x) = \begin{cases}
\sqrt{x} & \text{ if }0 \leq x \leq 1 \\
1 & \text{ if } x > 1
\end{cases}$$
Then ##F(x)/x## is decreasing for all positive ##x##.

Let ##x## be uniformly distributed over ##[1,2]##. Let ##y## be 0 or 3, each with probability 1/2. Then ##E[x] = E[y] = 1.5##.

But ##F(x) = 1## with probability 1, so ##E[F(x)] = 1##. And ##F(y)## is 0 or 1, each with probability 1/2, so ##E[F(y)] = 1/2##.

If you want ##F## to be strictly increasing, then give it a tiny positive slope for ##x > 1##, and define ##x## and ##y## as above. The result will still be ##E[F(y)] < E[F(x)]##.
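A minimal numerical check of this construction (a sketch; the piecewise ##F## above is implemented directly, and the distributions of ##x## and ##y## are exactly those stated in the post):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

def F(t):
    # F(t) = sqrt(t) on [0, 1] and 1 for t > 1; F(t)/t is decreasing on (0, inf).
    return np.where(t <= 1.0, np.sqrt(t), 1.0)

x = rng.uniform(1.0, 2.0, n)          # x ~ Uniform[1, 2]
y = rng.choice([0.0, 3.0], size=n)    # y = 0 or 3, each with probability 1/2

print("E[x]    ~", x.mean())      # ~1.5
print("E[y]    ~", y.mean())      # ~1.5
print("E[F(x)] ~", F(x).mean())   # = 1, since x >= 1 almost surely
print("E[F(y)] ~", F(y).mean())   # ~0.5
# E[x] = E[y], yet E[F(y)] < E[F(x)].
```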
 
  • #5


I would approach this question by first understanding the definitions and properties of the variables and function involved. In this case, we have a function F that maps positive real numbers to positive real numbers and has the property that F(y)/y is decreasing. This means that as y increases, the ratio F(y)/y decreases.

Next, we have two random variables, x and y, which are also positive real numbers. The notation E[x] represents the expected value of x, which is a measure of the average value of x over all possible outcomes. Similarly, E[y] represents the expected value of y.

Based on this information, we can see that the question is asking if the expected value of F(x) is less than or equal to the expected value of F(y) when the expected value of x is less than or equal to the expected value of y. In other words, does the ordering of the expected values of x and y carry over to the expected values of F(x) and F(y)?

To answer this question, we can use a counterexample. Take ##F(y) = 1/y## (as in post #2), so that ##F(y)/y = 1/y^2## is decreasing, and let ##x = 1## and ##y = 2## be deterministic. Then ##E[x] = 1 \leq 2 = E[y]##, but ##E[F(x)] = F(1) = 1 > 1/2 = F(2) = E[F(y)]##, which contradicts the proposed implication. Note that the decreasing-ratio condition alone only gives ##F(2)/2 \leq F(1)##, i.e. ##F(2) \leq 2F(1)##; it does not force ##F(2) \leq F(1)##, so a specific choice of ##F## is needed to produce the reversal.

In conclusion, the statement does not hold in general: the expected value of F(x) can exceed the expected value of F(y) even when the expected value of x is less than or equal to that of y. Stronger assumptions about the relationship between x and y can rescue the implication; for instance, if ##x \leq y## almost surely and ##F## is increasing, then ##F(x) \leq F(y)## almost surely and hence ##E[F(x)] \leq E[F(y)]##.
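For contrast, here is a small sketch of the positive direction just mentioned; the coupling of ##x## and ##y## below is my own illustrative choice, not taken from the thread:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Couple x and y so that x <= y almost surely (illustrative choice).
x = rng.uniform(0.0, 1.0, n)
y = x + rng.uniform(0.0, 1.0, n)   # y >= x pointwise by construction

F = np.sqrt                        # increasing, F(0) = 0, F(t)/t decreasing

# Since x <= y pointwise and F is increasing, F(x) <= F(y) pointwise,
# so the inequality survives taking expectations.
print(F(x).mean() <= F(y).mean())  # True
```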
 

1. What is minimisation over random variables?

Minimisation over random variables refers to the process of finding the minimum value of a quantity that depends on one or more random variables, typically the expected value of a function of those variables. In practice this means choosing decision parameters so that the expected output of the function is as small as possible.
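As a concrete, purely illustrative example (not from the thread): choose a threshold ##t## to minimise ##\mathbb{E}[(X-t)^2]## for a random ##X##; the true minimiser is ##\mathbb{E}X##, and a Monte Carlo grid search over ##t## recovers it approximately.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.exponential(scale=2.0, size=100_000)   # samples of the random variable

# Objective: t -> E[(X - t)^2], estimated from the samples.
def expected_cost(t):
    return np.mean((X - t) ** 2)

ts = np.linspace(0.0, 5.0, 501)
best_t = ts[np.argmin([expected_cost(t) for t in ts])]

print("grid-search minimiser:", best_t)   # close to E[X] = 2.0
print("sample mean:          ", X.mean())
```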

2. Why is minimisation over random variables important?

Minimisation over random variables is important because it allows us to find the most efficient or optimal solution to a problem in situations where there is uncertainty or variability involved. This can help us make better decisions and improve the performance of our systems or processes.

3. What are some common methods used for minimisation over random variables?

Some common methods used for minimisation over random variables include gradient descent, stochastic gradient descent, and simulated annealing. These methods iteratively adjust the decision parameters, using random samples of the underlying variables, in order to drive the expected value of the function toward its minimum.
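A minimal sketch of one of these methods, stochastic gradient descent, applied to the objective ##\min_\theta \mathbb{E}[(\xi-\theta)^2]## whose true minimiser is ##\mathbb{E}\xi##; the distribution of ##\xi##, the step size, and the iteration count are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 0.0          # initial guess for the decision parameter
lr = 0.01            # step size (arbitrary illustrative value)

for step in range(50_000):
    xi = rng.exponential(scale=2.0)   # one fresh random sample per step
    grad = 2.0 * (theta - xi)         # unbiased gradient of E[(xi - theta)^2]
    theta -= lr * grad                # stochastic gradient step

print("SGD estimate:", theta)   # should be close to E[xi] = 2.0
```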

4. How does minimisation over random variables relate to machine learning?

Minimisation over random variables is a crucial component of machine learning algorithms. Many machine learning models are trained by choosing parameter values that minimise an expected loss over random data; the training set provides samples from that distribution, so minimising the average loss over the data approximates minimising the expectation. This process allows the model to learn and improve its performance over time.

5. Are there any challenges or limitations to minimisation over random variables?

Yes, there can be challenges and limitations to minimisation over random variables. One challenge is that the function being minimised may have multiple local minima, making it difficult to find the global minimum. Additionally, the process can be computationally intensive and may require a large amount of data. There may also be limitations in the types of problems that can be effectively solved through minimisation over random variables.
