Real Analysis (Rudin) exercise with inequalities

SUMMARY

The discussion centers on proving that for k > 2, given points x and y in R^k with |x - y| = d > 0 and r > 0, if 2r > d there exist infinitely many points z in R^k such that |z - x| = |z - y| = r. The problem is 16(a) on page 23 of "Principles of Mathematical Analysis" by Walter Rudin. The original poster outlines showing that at least one z exists and asks whether linear combinations of such a z yield further solutions; the reply suggests expanding the two distance equations with the binomial theorem so that the quadratic terms cancel, leaving a linear condition in k > 2 unknowns, which has infinitely many solutions.


murmillo

Homework Statement


Suppose k > 2, x, y in R^k, |x - y| = d > 0, and r > 0.
Prove that if 2r > d, there are infinitely many z in R^k such that
|z - x| = |z - y| = r.

(In Principles of Mathematical Analysis, it is problem 16(a) on page 23.)

Homework Equations


|ax| = |a||x|
|x - z| <= |x - y| + |y - z|
|x + y| <= |x| + |y|
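These norm properties can be checked numerically on sample vectors (a quick sanity check with made-up numbers, not part of the proof):

```python
import math

def norm(v):
    # Euclidean norm in R^k
    return math.sqrt(sum(c * c for c in v))

def sub(u, v):
    return [p - q for p, q in zip(u, v)]

def add(u, v):
    return [p + q for p, q in zip(u, v)]

# Arbitrary sample vectors in R^3 and a scalar
x = [1.0, 2.0, 3.0]
y = [4.0, 0.0, -1.0]
z = [0.5, -2.0, 2.0]
a = -3.0

# |ax| = |a||x|
assert math.isclose(norm([a * c for c in x]), abs(a) * norm(x))

# |x - z| <= |x - y| + |y - z|
assert norm(sub(x, z)) <= norm(sub(x, y)) + norm(sub(y, z))

# |x + y| <= |x| + |y|
assert norm(add(x, y)) <= norm(x) + norm(y)
```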


The Attempt at a Solution


I'm not quite sure how to tackle this proof. Here's the general outline I have:
- Show that there exists at least one z.
- Suppose there existed one and only one z. Is there a contradiction? Or, can I find z', a linear combination of z that also works and then from z' use the same rule to get z'', ad infinitum?

As you can see, I don't really know how to tackle "show there are infinitely many solutions" proofs.
One thought I had was: suppose z satisfies the necessary conditions.
Can I show that there exists c in R or w in R^k such that z' = cz + w also satisfies them? (I'll write w rather than d, since d already denotes the distance |x - y|.)

But all I can get is |z' - y| = |z - y + w| <= |z - y| + |w|,
and |z' - y| has to equal |z - y| = r, so that doesn't tell me anything.

Can someone give a hint or two to point me in the right direction? How should I tackle this problem?
 
Once you show that some z satisfies the equations, the definition of the distance function in R^k gives you two quadratic equations in at least three unknowns. Expand with the binomial theorem and subtract one equation from the other: the |z|^2 terms cancel, leaving a linear equation in k > 2 unknowns, which has infinitely many solutions.
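As a numerical illustration of why the solution set is infinite (not from the thread; the specific vectors and r are made up), here is the standard geometric construction in R^3: all solutions z lie on a circle in the plane that perpendicularly bisects the segment from x to y, centered at the midpoint, with radius sqrt(r^2 - (d/2)^2), which is real precisely because 2r > d.

```python
import math

# Hypothetical setup: x, y in R^3 with d = |x - y| and 2r > d
x = [1.0, 0.0, 0.0]
y = [-1.0, 0.0, 0.0]
r = 2.0
d = math.dist(x, y)          # here d = 2, and 2r = 4 > d
assert 2 * r > d > 0

# Midpoint of x and y; every solution is equidistant from x and y,
# so it lies in the bisecting plane through this point
m = [(a + b) / 2 for a, b in zip(x, y)]

# Radius of the circle of solutions (Pythagoras in the bisecting plane)
rho = math.sqrt(r * r - (d / 2) ** 2)

# Two unit vectors orthogonal to x - y and to each other
# (x - y points along e1 here, so e2 and e3 work)
u = [0.0, 1.0, 0.0]
v = [0.0, 0.0, 1.0]

# z(t) traces infinitely many distinct solutions as t varies
def z(t):
    return [m[i] + rho * (math.cos(t) * u[i] + math.sin(t) * v[i])
            for i in range(3)]

for t in (0.0, 1.0, 2.5, 4.0):
    zt = z(t)
    assert math.isclose(math.dist(zt, x), r)
    assert math.isclose(math.dist(zt, y), r)
```

Since distinct values of t in [0, 2*pi) give distinct points z(t), this exhibits infinitely many solutions; for general k > 2 the same construction works with any pair of orthonormal vectors perpendicular to x - y.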
 
