For the log function, is this true?

  • #1
Hi,

I have the numbers x1, ..., xN, and I need to find the max of log2(1+xi). Is this equivalent to finding log2(1 + max xi), since the log function is a monotonically increasing function of its argument?

Thanks
 

Answers and Replies

  • #2
I believe so, yes; the log being monotonically increasing is the reason. :smile:

Is that log2(1+xi) by the way?
 
  • #3
Yes, it is exactly as you wrote it. To be clearer, is:

[tex]\underset{i=1,\ldots,N}{\max\,}\log_2(1+x_i)=\log_2(1+\underset{i=1,\ldots,N}{\max\,}x_i)[/tex]?

Thanks
 
  • #4
Sure. As Infinitum mentioned, log is monotonically increasing. Assuming each argument 1 + x_i is positive, that is, so the logs are defined.
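
If a quick numerical sanity check is useful, here is a small Python sketch (the values are made up; any x_i with 1 + x_i > 0 behaves the same way):

[code]
import numpy as np

# Made-up sample values; the only requirement is 1 + x_i > 0 so log2 is defined.
x = np.array([0.3, 2.5, 7.0, 1.2])

lhs = np.max(np.log2(1.0 + x))   # max over i of log2(1 + x_i)
rhs = np.log2(1.0 + np.max(x))   # log2(1 + max over i of x_i)

print(lhs, rhs)                  # both equal 3.0 here (log2 of 8)
assert np.isclose(lhs, rhs)
[/code]

The same reasoning also tells you the index is preserved: the i that attains the max of x_i is the same i that attains the max of log2(1 + x_i).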
 
  • #5
Ok, in the same line, is this true:

[tex]\underset{i}{\min\,}\sum_n\frac{1}{1+f_i(n)}\geq \sum_n\underset{i}{\min\,}\frac{1}{1+f_i(n)}[/tex]
where f_i(n)>=0?
 
  • #6
What are you summing over? It seems you are summing over n for some set of integers, but I'm unsure.

Also, are you trying to find the minimal value of f_i(n) over all n for a fixed i? The reason I ask is that you write min over i, which reads as minimizing over the index i, but I don't think that is what you meant.

One way I can think of approaching this is to use a triangle-inequality-type identity, where each term is expressed in terms of minimums.

Do you know about triangle inequalities for norms?
 
  • #7
For each i there is a summation over n, right? I need to find the i that minimizes the summation. I am not sure what triangle inequalities you are talking about, but I was thinking of Jensen's inequality. Does it work here?
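
To make it concrete, here is the kind of numerical check I have in mind (just a sketch with made-up, non-negative values for f_i(n), not a proof):

[code]
import numpy as np

rng = np.random.default_rng(0)

# Made-up data: 5 choices of the index i and 20 values of n, with f_i(n) >= 0.
f = rng.uniform(0.0, 10.0, size=(5, 20))   # rows are i, columns are n

terms = 1.0 / (1.0 + f)                    # the summands 1/(1 + f_i(n))

lhs = terms.sum(axis=1).min()              # min over i of the sum over n
rhs = terms.min(axis=0).sum()              # sum over n of the min over i

print(lhs, rhs)
assert lhs >= rhs                          # taking the per-n minimum can only shrink each term of the sum
[/code]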
 
  • #8
You would have to establish whether the function is convex or not; if it is convex you can use the inequality in one form, and if it is concave, in the other.

Your function is going to look like a weird set of planes, kind of like an inverted pyramid (in the 2D case), which at least visually seems convex, but I would double check.

From this you will be able to prove either one inequality (for convex) or its converse (for concave), so using Jensen's inequality is a really good idea.

Based on the definition of convexity, you can test convexity for two arguments and then apply induction on that (which will give you something similar to Jensen's inequality, but proven specifically for your case), since you will get lots of inequalities that add together into one final inequality.
 
  • #9
I think it is convex if f_i(n) is strictly positive, since the second derivative will then be positive, right?
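
Writing out a single term to check myself, with g(x) = 1/(1 + x):

[tex]g'(x)=-\frac{1}{(1+x)^2},\qquad g''(x)=\frac{2}{(1+x)^3}>0\quad\text{for }x\geq 0,[/tex]

so each term 1/(1+f_i(n)) is convex in f_i(n) for f_i(n)>=0.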
 
  • #10
I think it will be convex, but I'm guessing this is for work (I have replied in your other threads), so I'd double check to be sure. The inequality does hold, though, if you can prove the function is convex.

What you should do is prove the statement for two variables and then use an inductive argument to prove it for n variables. The two-variable case is where you only deal with two choices of your index i, and from there you can be sure that the inequality holds (the basic idea of Jensen's inequality).

If you have any trouble, post your work here so one of the readers can give hints or a helping hand.
 
  • #11
I have a very strong feeling it is true, because on the right-hand side you basically sum the smallest possible value for each n. I do not know how to prove it mathematically, though, even for two variables (actually we need at least four variables, I think).

Thanks
 
  • #12
The proof will probably be best if you do it inductively.

Start off by proving it for min(x, y). For three variables, prove it for min(x, min(y, z)). Once you have proven it for min(x, y) you are essentially done, since min(x, y, z) = min(x, min(y, z)), and so on.
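
For what it's worth, here is a sketch of the two-variable step in your notation, writing a_n = 1/(1+f_1(n)) and b_n = 1/(1+f_2(n)): since a_n >= min(a_n, b_n) and b_n >= min(a_n, b_n) for every n, summing over n gives

[tex]\sum_n a_n\geq\sum_n\min(a_n,b_n)\quad\text{and}\quad\sum_n b_n\geq\sum_n\min(a_n,b_n),\qquad\text{hence}\qquad\underset{i\in\{1,2\}}{\min\,}\sum_n\frac{1}{1+f_i(n)}\geq\sum_n\underset{i\in\{1,2\}}{\min\,}\frac{1}{1+f_i(n)}.[/tex]

The inductive step then extends this to any finite number of choices of i.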
 
