StoneTemplePython
Science Advisor
Gold Member
julian said: I think I have solved problem 1:
I split the proof into the parts:
Part (a) A few facts about real skew symmetric matrices.
Part (b): Proof for ##n## even.
(i) Looking at case ##n = 2##.
(ii) Proving a key inequality. This will prove case ##n=2##
(iii) Proving case for general even ##n## (then easy).
Part (c) Case of odd ##n##.
(i) Proving case for ##n = 3## (easy because of part (b)).
(ii) Proving case for general odd ##n## (easy because of part (b)).
I went through it fairly granularly and did not see any flaws.
A couple thoughts:
1.) If you are so inclined, in your first spoiler you may make use of the rule:
QuantumQuest said: 2) It is fine to use nontrivial results without proof as long as you cite them and as long as it is "common knowledge to all mathematicians". Whether the latter is satisfied will be decided on a case-by-case basis.
I would be happy to accept basic results about skew symmetric matrices' eigenvalues having zero real component by spectral theory. On the other hand, your workthrough may be more instructive for third parties reading it who don't know the underlying spectral theory – so both approaches have merit.
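As a side note, the 2x2 building block of that spectral fact is easy to check directly. Here is a minimal Python sketch (my own illustration, not part of julian's proof): for the matrix ##\begin{pmatrix} 0 & a \\ -a & 0 \end{pmatrix}## the characteristic polynomial is ##\lambda^2 + a^2##, so the eigenvalues are ##\pm ia##, purely imaginary as claimed.

```python
import cmath

# Illustrative check (not part of the proof): the 2x2 real
# skew-symmetric matrix [[0, a], [-a, 0]] has characteristic
# polynomial lam^2 + a^2 = 0, so its eigenvalues are +/- i*a,
# i.e. they have zero real component.
def skew_2x2_eigenvalues(a):
    root = cmath.sqrt(-a * a)  # equals i * |a|
    return root, -root

for a in (0.5, 1.0, 3.25):
    for lam in skew_2x2_eigenvalues(a):
        assert abs(lam.real) < 1e-12           # purely imaginary
        assert abs(abs(lam.imag) - a) < 1e-12  # magnitude |a|
```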
2.) The key insight for this problem, in my view, is figuring out the ##n = 2## case; everything can be built off of it. The other insight is relating it to ##\text{GM} \leq \text{AM}## in some way. I think you essentially re-created Cauchy's forward-backward induction proof of ##\text{GM} \leq \text{AM}## in Part (b), albeit for superadditivity rather than for vanilla ##\text{GM} \leq \text{AM}##. Since we are at month end, I will share a much simpler idea: 'regular' ##\text{GM} \leq \text{AM}## implies this result.
My take is that in Part (b)(ii), when you are seeking to prove:
##(\theta^2 + x_1^2) \dots (\theta^2 + x_k^2) \geq [\theta^2 + (x_1^2 \dots x_k^2)^{1/k}]^k##
or equivalently
##\Big((\theta^2 + x_1^2) \dots (\theta^2 + x_k^2)\Big)^{1/k} \geq \theta^2 + (x_1^2 \dots x_k^2)^{1/k}##
multiply each side by
##\big(\theta^2\big)^{-1}##
(which is positive for ##\theta \neq 0## and so preserves the inequality) and define
##z_i := \frac{x_i^2}{\theta^2} \gt 0##
The inequality thus becomes:
##\Big(\prod_{i=1}^k (1 + z_i)\Big)^{1/k}= \Big((1 + z_1) \dots (1 + z_k)\Big)^{1/k} \geq 1 + (z_1 \dots z_k)^{1/k} = \Big(\prod_{i=1}^k 1\Big)^{1/k} + \Big(\prod_{i=1}^k z_i\Big)^{1/k}##
which is true by the superadditivity of the geometric mean (which, incidentally, was a past challenge problem – but since it is not this challenge problem, I think it is fine to treat it as common knowledge to mathematicians).
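For anyone who wants to sanity-check that step numerically, here is a small Python sketch (the helper name `geo_mean` is mine) that samples random ##z_i > 0## and confirms ##\big(\prod (1 + z_i)\big)^{1/k} \geq 1 + \big(\prod z_i\big)^{1/k}##:

```python
import random

# Numeric spot check of the superadditivity step: for z_i > 0,
# (prod (1 + z_i))^(1/k) >= 1 + (prod z_i)^(1/k),
# and hence the original inequality with z_i = x_i^2 / theta^2.
def geo_mean(values):
    p = 1.0
    for v in values:
        p *= v
    return p ** (1.0 / len(values))

random.seed(0)  # deterministic sampling
for _ in range(1000):
    k = random.randint(1, 6)
    z = [random.uniform(0.01, 10.0) for _ in range(k)]
    lhs = geo_mean([1.0 + zi for zi in z])
    rhs = 1.0 + geo_mean(z)
    assert lhs >= rhs - 1e-12  # inequality (within float tolerance)
```
Of course a random search proves nothing, but it is a quick way to catch a sign or exponent slip before committing to the algebra.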
- - - -
To cover the case where some eigenvalues equal zero, we can verify the inequality directly (it holds, with equality when all eigenvalues vanish), and chain this onto the above.
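A quick numeric check of that boundary case (the helper `check_case` is my own illustration): when some ##x_i = 0## the geometric-mean term vanishes and the right side collapses to ##\theta^{2k}##, which each factor ##\theta^2 + x_i^2 \geq \theta^2## on the left immediately dominates.

```python
# Boundary-case check: if some x_i = 0 then (prod x_i^2)^(1/k) = 0
# and the claimed inequality
#   prod(theta^2 + x_i^2) >= (theta^2 + (prod x_i^2)^(1/k))^k
# reduces to prod(theta^2 + x_i^2) >= theta^(2k), which is immediate.
def check_case(theta, xs):
    k = len(xs)
    lhs = 1.0
    prod_sq = 1.0
    for x in xs:
        lhs *= theta**2 + x**2
        prod_sq *= x**2
    rhs = (theta**2 + prod_sq ** (1.0 / k)) ** k
    return lhs >= rhs - 1e-9

assert check_case(2.0, [0.0, 3.0, 1.5])  # one zero eigenvalue
assert check_case(2.0, [0.0, 0.0])       # all zero: equality, 16 >= 16
```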
- - - -
I have a soft spot for proving this via ##2^r## for ##r \in \{1, 2, 3, \dots\}## and then filling in the gaps. Really well done. Forward-backward induction is a very nice technique, but a lot of book-keeping!