# A question about Young's inequality and complex numbers

In summary, the conversation discusses an attempt to demonstrate that ##\Omega## is real by way of showing that ##u<0##. However, it is shown that ##u##, being the square of a real number, must be greater than or equal to zero, making ##\Omega## either zero or imaginary. The fallacy in the given proof is that the last step tacitly relies on ##a^2## and ##b^2## being nonnegative, which is precisely the property of squares the argument set out to contradict.

#### VX10

TL;DR Summary
Here, I present a question about the validity of Young's inequality.
Let ##\Omega=\sqrt{-u}##; it is not difficult to see that ##\Omega## is real if ##u<0## and imaginary if ##u>0##. Now, suppose further that ##u=(a-b)^2## with ##a<0## and ##b>0## real numbers. Bearing this in mind, I want to demonstrate that ##\Omega## is real. To that end, we must demonstrate that ##u<0##, or equivalently,
$$a^2+b^2-2ab<0$$.
Going through some straightforward algebraic manipulations, we then have
$$ab>\frac{1}{2}\left(a^{2}+b^{2}\right)$$.
Nevertheless, on recalling that ##a<0##, we then are led to conclude that
$$ab<\frac{1}{2}\left(a^{2}+b^{2}\right)$$.

Based on the above, I ask:
1. Would that last statement hold true by virtue of Young's inequality?
2. Is there any fallacious step in that given proof?
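As an aside, the sign question can be sanity-checked numerically. The short Python sketch below is my own addition (the sample values ##a=-3##, ##b=2## are chosen arbitrarily, consistent with ##a<0## and ##b>0##); it confirms that ##u## comes out nonnegative and that ##ab \le \tfrac{1}{2}(a^2+b^2)## holds:

```python
import cmath

# Sample values with a < 0 and b > 0, as in the setup above.
a, b = -3.0, 2.0

u = (a - b) ** 2                         # the square of a real number
assert u >= 0                            # so u can never be negative

# Young's inequality with p = q = 2: ab <= (a^2 + b^2) / 2
assert a * b <= 0.5 * (a ** 2 + b ** 2)

# Omega = sqrt(-u) is therefore zero or purely imaginary.
omega = cmath.sqrt(-u)
print(omega)                             # 5j: purely imaginary
```

Trying other sign combinations for `a` and `b` leaves both assertions intact, which is the point of the thread.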


I have no idea what you are trying to do there, I'm sorry to say.

PeroK said:

I have no idea what you are trying to do there, I'm sorry to say.
Hi, PeroK. I hope you are doing well. I want to demonstrate that ##u<0##.

VX10 said:
Hi, PeroK. I hope you are doing well. I want to demonstrate that ##u<0##.
But if ##u = (a-b)^2##, isn't ##u## guaranteed to be ##\ge 0##?

VX10 said:
Hi, PeroK. I hope you are doing well. I want to demonstrate that ##u<0##.
And what is ##u##?

VX10 said:
TL;DR Summary: Here, I present a question about the validity of Young's inequality.

Let ##\Omega=\sqrt{-u}##; it is not difficult to see that ##\Omega## is real if ##u<0## and imaginary if ##u>0##. Now, suppose further that ##u=(a-b)^2## with ##a<0## and ##b>0## real numbers. Bearing this in mind, I want to demonstrate that ##\Omega## is real.

If ##a## and ##b## are real, of any sign, then ##a - b## is real and ##u = (a-b)^2 \geq 0##. Hence ##\Omega## is imaginary.

FactChecker said:
But if ##u = (a-b)^2##, isn't ##u## guaranteed to be ##\ge 0##?
Thanks for commenting, FactChecker. This is the point of my doubt. As you can see, when I expand the square and use the fact that ##a<0##, I arrive at the final statement presented above.

pasmith said:
If ##a## and ##b## are real, of any sign, then ##a - b## is real and ##u = (a-b)^2 \geq 0##. Hence ##\Omega## is imaginary.
Hi, pasmith. I hope you are doing well. This is the point of my doubt. As you can see, when I expand the square and use the fact that ##a<0##, I arrive at the final statement presented above. Is the "mathematical development" presented above fallacious? Thanks for commenting.

VX10 said:
Thanks for commenting, FactChecker. This is the point of my doubt. As you can see, when I expand the square and use the fact that ##a<0##, I arrive at the final statement presented above.
Any real number, when squared, is positive. It is pointless to look at how that real number was obtained.

VX10 said:
Hi, pasmith. I hope you are doing well. This is the point of my doubt. As you can see, when I expand the square and use the fact that ##a<0##, I arrive at the final statement presented above. Is the "mathematical development" presented above fallacious? Thanks for commenting.

There's no development. To show ##u<0## you decided it was equivalent to ##ab> 0.5(a^2+b^2)##, but you concluded that actually the opposite is true. This means you proved ##u>0##.
But this is all nonsense, since that last step relies on ##a^2## and ##b^2## being positive, which is something you didn't want to assume to begin with.
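To make the direction of the step above explicit, the equivalences actually run
$$(a-b)^2 \ge 0 \;\Longleftrightarrow\; a^2 - 2ab + b^2 \ge 0 \;\Longleftrightarrow\; ab \le \frac{1}{2}\left(a^{2}+b^{2}\right),$$
which holds for all real ##a## and ##b## regardless of sign. Rearranged, the rightmost statement is exactly the ##p=q=2## case of Young's inequality, so the "final statement" of the original post is true, and its negation ##u<0## is impossible.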

VX10 said:
Hi, pasmith. I hope you are doing well. This is the point of my doubt. As you can see, when I expand the square and use the fact that ##a<0##, I arrive at the final statement presented above. Is the "mathematical development" presented above fallacious? Thanks for commenting.
As I said before your steps make little or no sense. ##(a-b)^2 \ge 0## for all real ##a, b##. It's not clear how or why you think you have shown that ##(a - b)^2 < 0##.

FactChecker said:
Any real number, when squared, is positive.
Or zero...

Office_Shredder said:
There's no development. To show ##u<0## you decided it was equivalent to ##ab> 0.5(a^2+b^2)##, but you concluded that actually the opposite is true. This means you proved ##u>0##.
But this is all nonsense, since that last step relies on ##a^2## and ##b^2## being positive, which is something you didn't want to assume to begin with.
Thanks for commenting. Now it has become clear to me. Thanks again.

VX10 said:
Let ##\Omega=\sqrt{-u}##; it is not difficult to see that ##\Omega## is real if ##u<0## and imaginary if ##u>0##. Now, suppose further that ##u=(a-b)^2## with ##a<0## and ##b>0## real numbers. Bearing this in mind, I want to demonstrate that ##\Omega## is real.
To summarize what others have said, if ##u = (a - b)^2##, with ##a## and ##b## being any real numbers, then ##u \ge 0##. So ##\Omega## is zero if ##a = b## or is imaginary otherwise. Period.
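That summary can be checked in a few lines of Python. This is a throwaway sketch of my own (the helper name `omega` is mine, not from the thread):

```python
import cmath

def omega(a, b):
    """sqrt(-u) with u = (a - b)**2: zero when a == b, purely imaginary otherwise."""
    u = (a - b) ** 2
    return cmath.sqrt(-u)

assert omega(2.0, 2.0) == 0          # a == b  ->  Omega is zero
assert omega(-3.0, 2.0).real == 0    # a != b  ->  purely imaginary
assert omega(-3.0, 2.0).imag == 5.0  # here sqrt(-25) = 5j
```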