MHB Solving Problem w/ Norm Space Proof: Advice & Resources

  • Thread starter: SamJohannes
  • Tags: Norm Proof Space
SamJohannes
View attachment 3159

Hi guys, I've attached a problem that I've been struggling with for a while now. I was wondering if anyone had some advice on how to approach it (in particular part a), or some resources they could recommend to me?

Thanks in advance, Sam
 

Attachments

  • MTH2015_Assign_2_S2_2014_pdf__page_1_of_2_.png
Hi Sam, and welcome to MHB!

I assume that $\|\mathbf x\|_1$ is defined as $\sum|x_j|$, where $x_j\ (1\leqslant j\leqslant n)$ are the coordinates of $\mathbf x$ with respect to some basis. It is not clear to me whether that basis is meant to be the given basis $\{\mathbf e_1,\ldots,\mathbf e_n\}$, or the standard basis for $\mathbb{R}^n$?

In the first of those two cases, define $$C = \max_{1\leqslant j\leqslant n}\|\mathbf e_j\|$$. Then $ \|\mathbf x\| = \left\| \sum x_j\mathbf e_j\right\| \leqslant \sum|x_j|\|\mathbf e_j\| \leqslant C\sum|x_j| = C\|\mathbf x\|_1.$ A similar proof will work if the norm $\|\mathbf x\|_1$ is defined with respect to some other basis (such as the standard basis).
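As a concrete (made-up) example of the constant: in $\mathbb{R}^2$ with the Euclidean norm, take the basis $\mathbf e_1 = (1,0)$, $\mathbf e_2 = (1,1)$, so that $C = \max(\|\mathbf e_1\|,\|\mathbf e_2\|) = \sqrt2$. For $\mathbf x = \mathbf e_1 + \mathbf e_2 = (2,1)$ we have $\|\mathbf x\|_1 = |1| + |1| = 2$, and indeed $$\|\mathbf x\| = \sqrt5 \approx 2.24 \leqslant \sqrt2\cdot 2 = C\|\mathbf x\|_1 \approx 2.83.$$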
 
Thanks for the response Opalg, it's good to be here.

You're right, $\|\mathbf x\|_1$ is the 1-norm.
I don't understand the bit $\|\mathbf x\| = \left\|\sum x_j\mathbf e_j\right\|$. Is this true for all norms? Sorry if the question sounds silly, I'm relatively new to the topic.

-Cheers, Sam
 
SamJohannes said:
I don't understand the bit $\|\mathbf x\| = \left\|\sum x_j\mathbf e_j\right\|$. Is this true for all norms?
You are told that $\{\mathbf e_1,\ldots,\mathbf e_n\}$ is a basis. So every vector $\mathbf x$ can be (uniquely) written as a linear combination of the basis vectors: $\mathbf x = \sum x_j\mathbf e_j$. Then $\|\mathbf x\| = \left\|\sum x_j\mathbf e_j\right\|$ simply because both sides are the same vector written in two ways, so yes, this holds for any norm. The next step is to use the triangle inequality, together with the homogeneity property $\|x_j\mathbf e_j\| = |x_j|\,\|\mathbf e_j\|$, to say that this is $\leqslant \sum|x_j|\|\mathbf e_j\|.$
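For instance (a made-up illustration, not from the assignment): with $n = 2$ and $\mathbf x = 3\mathbf e_1 - 2\mathbf e_2$, the triangle inequality and homogeneity give $$\|3\mathbf e_1 - 2\mathbf e_2\| \leqslant \|3\mathbf e_1\| + \|{-2}\mathbf e_2\| = 3\|\mathbf e_1\| + 2\|\mathbf e_2\| \leqslant (3+2)\max(\|\mathbf e_1\|,\|\mathbf e_2\|) = C\|\mathbf x\|_1,$$ since $\|\mathbf x\|_1 = |3| + |-2| = 5$.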
 
Thanks Opalg. That's helped a lot.
 
Any thoughts on part b?
 
SamJohannes said:
Any thoughts on part b?

Hi Sam,

To prove part (b), fix $\varepsilon > 0$; by continuity of $f$ at $(a,b)$, we can choose a $\delta > 0$ such that for all $(x,y)$, $\|(x,y) - (a,b)\| < \delta$ implies $|f(x,y) - f(a,b)| < \varepsilon$.

Here's where I'll use the result of part (a). Let $\eta := \frac{\delta}{C}$, where $C$ is the constant from part (a). For all $x$, $|x - a| < \eta$ implies $\|(x,b) - (a,b)\|_1 = |x - a| < \eta$. So, by part (a), $\|(x,b) - (a,b)\| \leqslant C\,\|(x,b) - (a,b)\|_1 < C\eta = \delta$. Hence, $|f_b(x) - f_b(a)| = |f(x,b) - f(a,b)| < \varepsilon$. Since $\varepsilon > 0$ was arbitrary, $f_b$ is continuous at $a$.
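In one line, for $|x - a| < \eta$ the estimates chain together as $$\|(x,b) - (a,b)\| \leqslant C\,\|(x,b) - (a,b)\|_1 = C\,|x - a| < C\eta = \delta \;\Longrightarrow\; |f_b(x) - f_b(a)| < \varepsilon.$$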
 