Exploring Affine Bases in $\mathbb{R}^n$

In summary, we discussed affine bases of $\mathbb{R}^n$ and showed that for $1\leq n\leq 3$ the points of an affine basis have a simple geometric description. We also proved that adding a fixed vector to each element of an affine basis yields another affine basis, and that multiplying each element by an invertible matrix does as well. Using the fact that every isometry of $\mathbb{R}^n$ can be written as an orthogonal transformation followed by a translation, we showed that the image of an affine basis under an isometry is again an affine basis. Finally, we showed that an isometry which fixes every element of an affine basis must be the identity.
  • #1
mathmari
Hey! :giggle:

Let $1\leq n\in \mathbb{N}$ and let $(p_0,\ldots , p_n)$ be an affine basis of $\mathbb{R}^n$ (that means that the vectors $p_1-p_0, p_2-p_0,\ldots ,p_n-p_0$ form a basis of $\mathbb{R}^n$).
(a) Give a geometric description of affine bases of $\mathbb{R}^n$ for $1\leq n\leq 3$.
(b) For all $v\in \mathbb{R}^n$ show that $(p_0+v,\ldots , p_n+v)$ is an affine basis of $\mathbb{R}^n$.
(c) Let $a$ be an invertible matrix. Then show that $(ap_0,\ldots , ap_n)$ is an affine basis of $\mathbb{R}^n$.
(d) For each isometry $\beta\in \text{Isom}(\mathbb{R}^n)$ show that $(\beta (p_0),\ldots , \beta( p_n))$ is an affine basis of $\mathbb{R}^n$.
(e) Let $p_0=0$ and $\beta\in \text{Isom}(\mathbb{R}^n)$, with $\beta (p_i)=p_i$ for all $0\leq i\leq n$. Then show that $\beta=\text{id}_{\mathbb{R}^n}$.
(f) Let $\beta\in \text{Isom}(\mathbb{R}^n)$, with $\beta (p_i)=p_i$ for all $0\leq i\leq n$. Then show that $\beta=\text{id}_{\mathbb{R}^n}$.

For (b) I have done the following:
We have that the vectors \begin{align*}&(p_1+v)-(p_0+v)=p_1+v-p_0-v=p_1-p_0 \\ &(p_2+v)-(p_0+v)=p_2+v-p_0-v=p_2-p_0 \\ &\ldots \\ &(p_n+v)-(p_0+v)=p_n+v-p_0-v=p_n-p_0 \end{align*}
form a basis of $\mathbb{R}^n$, since $(p_0, p_1, \ldots ,p_n)$ is an affine basis of $\mathbb{R}^n$.

Therefore $(p_0+v, p_1+v, \ldots ,p_n+v)$ is an affine basis of $\mathbb{R}^n$.
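As a quick numerical sanity check of (b), here is a small sketch (using numpy) with an arbitrary affine basis of $\mathbb{R}^3$ and an arbitrary $v$; the specific points are my own choice, not part of the exercise:

```python
import numpy as np

# An arbitrary affine basis of R^3: p_0 together with three points whose
# differences p_i - p_0 are linearly independent.
p = [np.array([1.0, 0.0, 0.0]),
     np.array([2.0, 1.0, 0.0]),
     np.array([1.0, 3.0, 1.0]),
     np.array([0.0, 1.0, 2.0])]
v = np.array([5.0, -2.0, 3.0])           # an arbitrary translation vector

def difference_matrix(points):
    """Columns are p_i - p_0 for i = 1, ..., n."""
    return np.column_stack([q - points[0] for q in points[1:]])

# (p_i + v) - (p_0 + v) = p_i - p_0, so both difference matrices coincide
# and both have full rank n = 3.
D = difference_matrix(p)
Dv = difference_matrix([q + v for q in p])
assert np.allclose(D, Dv)
assert np.linalg.matrix_rank(Dv) == 3    # still a basis of R^3
```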
For (c) I have done the following:

We have that \begin{align*}&ap_1-ap_0=a(p_1-p_0) \\ &ap_2-ap_0=a(p_2-p_0) \\ &\ldots \\ &ap_n-ap_0=a(p_n-p_0) \end{align*}
We have that \begin{align*}\lambda_1[a(p_1-p_0)]+\lambda_2[a(p_2-p_0)]+\cdots +\lambda_n[a(p_n-p_0)]=0 &\Rightarrow a\left (\lambda_1(p_1-p_0)+\lambda_2(p_2-p_0)+\cdots +\lambda_n(p_n-p_0)\right )=0 \ \\ & \overset{a\in \text{GL}_n(\mathbb{R})}{\Rightarrow } \ a^{-1}a\left (\lambda_1(p_1-p_0)+\lambda_2(p_2-p_0)+\cdots +\lambda_n(p_n-p_0)\right )=a^{-1}\cdot 0 \\ & \Rightarrow \lambda_1(p_1-p_0)+\lambda_2(p_2-p_0)+\cdots +\lambda_n(p_n-p_0)=0\end{align*}
Since $(p_0, p_1, \ldots ,p_n)$ is an affine basis of $\mathbb{R}^n$, the vectors $p_1-p_0, \ p_2-p_0, \ \ldots , \ p_n-p_0$ form a basis of $\mathbb{R}^n$, so they are linearly independent and hence $\lambda_1=\lambda_2=\ldots =\lambda_n=0$.

So the $n$ linearly independent vectors $a(p_1-p_0), \ a(p_2-p_0), \ \ldots , \ a(p_n-p_0)$ in the $n$-dimensional space form a basis. Therefore $(ap_0, ap_1, \ldots ,ap_n)$ is an affine basis of $\mathbb{R}^n$.

Is that correct so far? :unsure:
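A matching check of (c), again with made-up data; the matrix $a$ below is just one invertible example:

```python
import numpy as np

p = [np.array([1.0, 0.0, 0.0]),
     np.array([2.0, 1.0, 0.0]),
     np.array([1.0, 3.0, 1.0]),
     np.array([0.0, 1.0, 2.0])]
a = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])          # invertible, det(a) = 5
assert not np.isclose(np.linalg.det(a), 0.0)

# a p_i - a p_0 = a (p_i - p_0): an invertible matrix maps the basis of
# differences to another basis.
D = np.column_stack([q - p[0] for q in p[1:]])
aD = np.column_stack([a @ q - a @ p[0] for q in p[1:]])
assert np.allclose(aD, a @ D)
assert np.linalg.matrix_rank(aD) == 3    # (a p_0, ..., a p_n) is an affine basis
```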

Could you give me a hint for the remaining parts? For example, for isometries the distances are preserved, but can we say that the differences $\beta(p_i)-\beta(p_0)$ are equal to $p_i-p_0$? :unsure:
 
  • #2
mathmari said:
Could you give me a hint for the remaining parts? For example, for isometries the distances are preserved, but can we say that the differences $\beta(p_i)-\beta(p_0)$ are equal to $p_i-p_0$?
Hey mathmari!

Every isometry in $\mathbb R^n$ with the usual metric can be written as an orthogonal transformation plus a translation.
We can use that for (d), (e), and (f), can't we? 🤔
 
  • #3
Klaas van Aarsen said:
Every isometry in $\mathbb R^n$ with the usual metric can be written as an orthogonal transformation plus a translation.
We can use that for (d), (e), and (f), can't we? 🤔

How do we use that? I'm stuck right now. :unsure:
 
  • #4
mathmari said:
How do we use that? I'm stuck right now.
Use that $\beta(x)=ax+v$ where $a$ is an invertible matrix and use the same logic as in (b) and (c)? 🤔
We also have that $a^Ta=\text{id}_{\mathbb R^n}$.
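For concreteness, here is a small sketch of such a $\beta$: a random orthogonal matrix (the $Q$ factor of a QR factorization) together with a random translation. Only the form $\beta(x)=ax+v$ with $a^Ta=\text{id}$ is from the hint; the construction itself is just an illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthogonal part a (the Q factor of a QR factorization) and translation v.
a, _ = np.linalg.qr(rng.standard_normal((3, 3)))
v = rng.standard_normal(3)

def beta(x):
    return a @ x + v                     # beta(x) = a x + v

assert np.allclose(a.T @ a, np.eye(3))   # a^T a = id

# beta preserves Euclidean distances, so it is an isometry.
x, y = rng.standard_normal(3), rng.standard_normal(3)
assert np.isclose(np.linalg.norm(beta(x) - beta(y)), np.linalg.norm(x - y))
```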
 
  • #5
Klaas van Aarsen said:
Use that $\beta(x)=ax+v$ where $a$ is an invertible matrix and use the same logic as in (b) and (c)? 🤔
We also have that $a^Ta=\text{id}_{\mathbb R^n}$.

For (d) I have done the following :

We have that \begin{align*}&\beta (p_1)-\beta (p_0)=(ap_1+v)-(ap_0+v)=ap_1-ap_0=a(p_1-p_0) \\ &\beta (p_2)-\beta (p_0)=(ap_2+v)-(ap_0+v)=ap_2-ap_0=a(p_2-p_0) \\ &\ldots \\ &\beta (p_n)-\beta (p_0)=(ap_n+v)-(ap_0+v)=ap_n-ap_0=a(p_n-p_0) \end{align*}
We have that \begin{align*}&\lambda_1[a(p_1-p_0)]+\lambda_2[a(p_2-p_0)]+\cdots +\lambda_n[a(p_n-p_0)]=0 \\ &\Rightarrow a\left (\lambda_1(p_1-p_0)+\lambda_2(p_2-p_0)+\cdots +\lambda_n(p_n-p_0)\right )=0 \ \\& \overset{a\in \text{GL}_n(\mathbb{R})}{\Rightarrow } \ a^{-1}a\left (\lambda_1(p_1-p_0)+\lambda_2(p_2-p_0)+\cdots +\lambda_n(p_n-p_0)\right )=a^{-1}\cdot 0 \\ & \Rightarrow \lambda_1(p_1-p_0)+\lambda_2(p_2-p_0)+\cdots +\lambda_n(p_n-p_0)=0\end{align*}
Since $(p_0, p_1, \ldots ,p_n)$ is an affine basis of $\mathbb{R}^n$, it follows that $p_1-p_0, \ p_2-p_0, \ \ldots , \ p_n-p_0$ is a basis of $\mathbb{R}^n$, and so these are linearly independent, hence $\lambda_1=\lambda_2=\ldots =\lambda_n=0$.
So it follows that the $n$ linearly independent vectors $a(p_1-p_0), \ a(p_2-p_0), \ \ldots , \ a(p_n-p_0)$, i.e. $\beta (p_1)-\beta (p_0), \ \beta (p_2)-\beta (p_0), \ \ldots , \ \beta (p_n)-\beta (p_0)$, in $n$-dimensional space form a basis.
Therefore $(\beta (p_0), \beta (p_1), \ldots ,\beta (p_n))$ is an affine basis of $\mathbb{R}^n$.
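A numerical spot check of (d), with the made-up points from the earlier checks and a randomly generated isometry:

```python
import numpy as np

rng = np.random.default_rng(1)

p = [np.array([1.0, 0.0, 0.0]),
     np.array([2.0, 1.0, 0.0]),
     np.array([1.0, 3.0, 1.0]),
     np.array([0.0, 1.0, 2.0])]

a, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # orthogonal part
v = rng.standard_normal(3)                          # translation part

def beta(x):
    return a @ x + v

# beta(p_i) - beta(p_0) = a (p_i - p_0): the differences again form a basis,
# so the images form an affine basis.
D = np.column_stack([beta(q) - beta(p[0]) for q in p[1:]])
assert np.linalg.matrix_rank(D) == 3
```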
For (e) I have done the following :

We have that $\beta(p_i)=p_i \Rightarrow ap_i+v=p_i\Rightarrow (a-I)p_i=-v$.
How do we continue? :unsure:
 
  • #6
mathmari said:
We have that $\beta(p_i)=p_i \Rightarrow ap_i+v=p_i\Rightarrow (a-I)p_i=-v$.
How do we continue?
Suppose we substitute $p_0=0$? :unsure:
 
  • #7
Klaas van Aarsen said:
Suppose we substitute $p_0=0$? :unsure:

From $\beta(p_i)=p_i \Rightarrow ap_i+v=p_i\Rightarrow (a-I)p_i=-v$ for $i=0$, since $p_0=0$, we get $(a-I)\cdot 0=-v \Rightarrow v=0$. Therefore $ap_i=p_i$. Do we have to show that $a$ is the identity matrix? :unsure:
 
  • #8
mathmari said:
From $\beta(p_i)=p_i \Rightarrow ap_i+v=p_i\Rightarrow (a-I)p_i=-v$ for $i=0$, since $p_0=0$, we get $(a-I)\cdot 0=-v \Rightarrow v=0$. Therefore $ap_i=p_i$. Do we have to show that $a$ is the identity matrix?
Yep. (Nod)
 
  • #9
Klaas van Aarsen said:
Yep. (Nod)

Do we consider for that the system $(a-I)p_i=0$? Since the $p_i$'s cannot be zero because they form an affine basis, it follows that $a-I=0$, i.e. $a=I$.

Is that correct? :unsure:
 
  • #10
mathmari said:
Do we consider for that the system $(a-I)p_i=0$? Since the $p_i$'s cannot be zero because they form an affine basis, it follows that $a-I=0$, i.e. $a=I$.

Is that correct?
It's not sufficient that the $p_i$ are non-zero. We actually need that the $p_i$ for $i=1,\ldots,n$ form a basis of $\mathbb R^n$.
It follows from there that $a$ must indeed be the identity matrix. 🤔
 
  • #11
Klaas van Aarsen said:
It's not sufficient that the $p_i$ are non-zero. We actually need that the $p_i$ for $i=1,\ldots,n$ form a basis of $\mathbb R^n$.
It follows from there that $a$ must indeed be the identity matrix. 🤔

We have that the $p_i$ for $i=1,\ldots,n$ form a basis of $\mathbb R^n$. But how does it follow that $a$ must indeed be the identity matrix? :unsure:
 
  • #12
mathmari said:
We have that the $p_i$ for $i=1,\ldots,n$ form a basis of $\mathbb R^n$. But how does it follow that $a$ must indeed be the identity matrix? :unsure:
Suppose we put the $p_i$ as column vectors in a basis matrix $b$, which is invertible.
Then $ap_i=p_i\implies ab=b \implies abb^{-1}=bb^{-1}\implies a=I$. 🤔
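In code, the same argument with one concrete (made-up) basis matrix $b$:

```python
import numpy as np

# Columns of b are basis vectors p_1, p_2, p_3 of R^3 (made-up values).
b = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
assert not np.isclose(np.linalg.det(b), 0.0)

# The only matrix a with a b = b is a = b b^{-1} = I:
# solve b^T a^T = b^T for a^T and transpose back.
a = np.linalg.solve(b.T, b.T).T
assert np.allclose(a, np.eye(3))
```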
 
  • #13
Klaas van Aarsen said:
Suppose we put the $p_i$ as column vectors in a basis matrix $b$, which is invertible.
Then $ap_i=p_i\implies ab=b \implies abb^{-1}=bb^{-1}\implies a=I$. 🤔

Ahh I see!

So for (f) we do that in a similar way:
We have that $\beta(p_i)=p_i \Rightarrow ap_i+v=p_i\Rightarrow (a-I)p_i=-v$. But how do we get that $v=0$? We don't have the additional condition $p_0=0$ that we could use before.
Could you give me a hint? :unsure:
 
  • #14
What is $\beta(p_i-p_0)$? 🤔
 
  • #15
Klaas van Aarsen said:
What is $\beta(p_i-p_0)$? 🤔

We have that $\beta (p_i-p_0)=a(p_i-p_0)+v=ap_i-ap_0+v$, or not? :unsure:
 
  • #16
How about $\beta (p_i) - \beta (p_0)$?
And in particular for $i=0$? 🤔
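Following that hint through numerically: if $\beta$ fixes every $p_i$, then the orthogonal part $a$ fixes the basis of differences $p_i-p_0$ and the translation $v$ must vanish. Here is a small sketch with made-up points (my own illustration, not part of the original discussion):

```python
import numpy as np

# A made-up affine basis and the values q_i = beta(p_i) of an isometry
# beta(x) = a x + v that fixes every p_i.
p = [np.array([1.0, 0.0, 0.0]),
     np.array([2.0, 1.0, 0.0]),
     np.array([1.0, 3.0, 1.0]),
     np.array([0.0, 1.0, 2.0])]
q = [pi.copy() for pi in p]              # beta(p_i) = p_i

# beta is determined by its values on an affine basis: a maps the
# differences p_i - p_0 to q_i - q_0, and v = q_0 - a p_0.
P = np.column_stack([pi - p[0] for pi in p[1:]])
Q = np.column_stack([qi - q[0] for qi in q[1:]])
a = Q @ np.linalg.inv(P)
v = q[0] - a @ p[0]

assert np.allclose(a, np.eye(3))         # a = id
assert np.allclose(v, np.zeros(3))       # v = 0, hence beta = id
```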
 

1. What is an affine basis of $\mathbb{R}^n$?

An affine basis of $\mathbb{R}^n$ is a tuple of $n+1$ points $(p_0,\ldots,p_n)$ such that the difference vectors $p_1-p_0,\ldots,p_n-p_0$ form a basis of $\mathbb{R}^n$. Equivalently, every point of $\mathbb{R}^n$ can be written in exactly one way as an affine combination $\sum_i \lambda_i p_i$ with $\sum_i \lambda_i=1$. The points themselves need not be linearly independent; they serve as a reference frame for describing positions in $\mathbb{R}^n$.
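A direct way to test this definition numerically (a sketch; the helper `is_affine_basis` is a name I made up):

```python
import numpy as np

def is_affine_basis(points, tol=1e-12):
    """True if the given n+1 points form an affine basis of R^n, i.e. the
    differences p_i - p_0 are linearly independent."""
    pts = [np.asarray(q, dtype=float) for q in points]
    n = pts[0].shape[0]
    if len(pts) != n + 1:
        return False
    diffs = np.column_stack([q - pts[0] for q in pts[1:]])
    return abs(np.linalg.det(diffs)) > tol

# Three non-collinear points form an affine basis of R^2 ...
print(is_affine_basis([[0, 0], [1, 0], [0, 1]]))    # True
# ... but three collinear points do not.
print(is_affine_basis([[0, 0], [1, 1], [2, 2]]))    # False
```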

2. How do affine bases differ from vector bases?

A vector basis of $\mathbb{R}^n$ consists of $n$ linearly independent vectors, while an affine basis consists of $n+1$ points whose differences from one of them are linearly independent. An affine basis has no distinguished origin: translating all of its points gives another affine basis (as in part (b) above), whereas translating the vectors of a vector basis need not leave a basis at all.

3. What is the importance of affine bases in geometry and computer graphics?

Affine bases are essential in geometry and computer graphics because they provide a way to represent and manipulate objects in space. They allow for transformations such as translation, rotation, and scaling, which are necessary for creating realistic and dynamic graphics.

4. Can an affine basis be used in any dimension?

Yes, the definition works in every dimension $n\geq 1$. In $\mathbb{R}^1$ an affine basis is two distinct points, in $\mathbb{R}^2$ it is three points not lying on a common line (the vertices of a triangle), and in $\mathbb{R}^3$ it is four points not lying in a common plane (the vertices of a tetrahedron), which is exactly the geometric description asked for in part (a).
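Continuing the sketch from question 1 (the same hypothetical helper, redefined here so the snippet runs on its own), the three-dimensional case:

```python
import numpy as np

def is_affine_basis(points, tol=1e-12):
    pts = [np.asarray(q, dtype=float) for q in points]
    diffs = np.column_stack([q - pts[0] for q in pts[1:]])
    return abs(np.linalg.det(diffs)) > tol

# Four vertices of a non-degenerate tetrahedron form an affine basis of R^3 ...
print(is_affine_basis([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]]))   # True
# ... while four coplanar points (all in the plane z = 0) do not.
print(is_affine_basis([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]]))   # False
```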

5. How are affine bases related to linear transformations?

Closely. As parts (c) and (d) above show, applying an invertible linear map (possibly followed by a translation) to an affine basis yields another affine basis. Conversely, any two affine bases of $\mathbb{R}^n$ are related by a unique invertible affine transformation $x\mapsto ax+v$, and this map is what converts coordinates with respect to one affine basis into coordinates with respect to the other.
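As an illustration of that last point, here is a sketch (with made-up bases) that computes the unique affine map $x\mapsto ax+v$ sending one affine basis of $\mathbb{R}^2$ to another:

```python
import numpy as np

# Two affine bases of R^2 (made-up points).
P = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])]
Q = [np.array([2.0, 1.0]), np.array([3.0, 2.0]), np.array([1.0, 2.0])]

# The unique affine map x -> a x + v sending p_i to q_i: a maps the
# differences p_i - p_0 to q_i - q_0, and v = q_0 - a p_0.
Pd = np.column_stack([p - P[0] for p in P[1:]])
Qd = np.column_stack([q - Q[0] for q in Q[1:]])
a = Qd @ np.linalg.inv(Pd)
v = Q[0] - a @ P[0]

for p_i, q_i in zip(P, Q):
    assert np.allclose(a @ p_i + v, q_i)
```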
