# Linear Algebra Proof: If dim(X) = n, Show the Space of k-Linear Forms Has Dimension n^k

tiger2030

In summary: the ##n^k## forms defined below are linearly independent and span the space of all ##k##-linear forms, so they constitute a basis and that space has dimension ##n^k##.

## Homework Statement

If dim(X) = n, show that the vector space of k-linear forms on X is of dimension ##n^k##.

## The Attempt at a Solution

So I know we need to let ##x_1, x_2, \dots, x_n## be a basis for X. My professor then said to "show that the functions ##f_{j_1,\dots,j_k}##, ##1 \leq j_l \leq n##, defined by ##f_{j_1,\dots,j_k}(x_{i_1},\dots,x_{i_k}) = \delta_{i_1 j_1}\cdots\delta_{i_k j_k}## are ##k##-linear forms, and then extend multilinearly." This is where I am lost on what to do. Any help would be much appreciated.

Let's work in one variable first. Say you have a basis ##e_1,...,e_n## and any vectors ##y_1,...,y_n##. You can define a unique linear map ##T## by saying that ##T(e_i) = y_i##.
Indeed, the actual definition of ##T## is

$$T(\sum \alpha_i e_i) = \sum \alpha_i y_i$$

This is what it means to extend a function linearly: you start by defining it on a basis, and then use linear combinations to define the function on the entire space.
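As a quick illustration (my own sketch, not part of the thread), this extension can be written in a few lines of Python, with vectors represented as coordinate lists in the basis ##e_1,...,e_n##:

```python
# Illustrative sketch: defining a linear functional by its values on a basis
# and extending it linearly to all coordinate vectors.

def extend_linearly(images):
    """Given images[i] = T(e_i), return T defined on every coordinate vector."""
    def T(v):
        # T(sum alpha_i e_i) = sum alpha_i * T(e_i)
        return sum(alpha * y for alpha, y in zip(v, images))
    return T

T = extend_linearly([10.0, 20.0, 30.0])  # T(e_1)=10, T(e_2)=20, T(e_3)=30
print(T([1.0, 0.0, 0.0]))  # 10.0, since this is just e_1
print(T([2.0, 1.0, 3.0]))  # 2*10 + 1*20 + 3*30 = 130.0
```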

Does that make sense?

Yes, this makes sense because to be linear, you must be able to pull the constant out and still yield the same answer.

tiger2030 said:
Yes, this makes sense because to be linear, you must be able to pull the constant out and still yield the same answer.

Good, so does that explain the professor's hint:

"show that the functions ##f_{j_1,...,j_k}##, ##1\leq j_l\leq n##, defined by ##f_{j_1,...,j_k}(x_{i_1},...,x_{i_k}) = \delta_{i_1j_1}\cdots\delta_{i_kj_k}## are ##k##-linear forms, and then extend multilinearly."

It's just the same but in multiple dimensions.

Ok so first I need to check that they are well defined k-linear forms, correct?

tiger2030 said:
Ok so first I need to check that they are well defined k-linear forms, correct?

That's one thing you need to verify, yes.

So all ##f_{j_k}## map to either ##1## (for ##f_{j_k}(x_{i_k})##) or ##0## (for ##f_{j_k}(x_{i_m})##, where ##k \neq m##).

Also, if ##x_{i_k} = y_{i_k}##, then ##f_{j_k}(x_{i_k}) = 1 = f_{j_k}(y_{i_k})##.

Therefore all f are well defined.

tiger2030 said:
So all ##f_{j_k}## map to either ##1## (for ##f_{j_k}(x_{i_k})##) or ##0## (for ##f_{j_k}(x_{i_m})##, where ##k \neq m##).

Also, if ##x_{i_k} = y_{i_k}##, then ##f_{j_k}(x_{i_k}) = 1 = f_{j_k}(y_{i_k})##.

Therefore all f are well defined.

How are the ##f_{i_1...i_k}## defined on non-basis elements?

As a linear combination of basis elements?

tiger2030 said:
As a linear combination of basis elements?

Can you give an exact definition like I did in my post 2?

So ##f_{j_k}(\sum \alpha_i e_i) = \sum \alpha_i \delta_i = \sum \alpha_i##?

Say ##x_{31} = \alpha e_1 + \beta e_2 + \gamma e_3##.
Then ##f_{j_1}(x_{31}) = f_{j_1}(\alpha e_1 + \beta e_2 + \gamma e_3) = \alpha f_{j_1}(e_1) + \beta f_{j_1}(e_2) + \gamma f_{j_1}(e_3) = \alpha + \beta + \gamma##?

So going back to the original problem. Take ##x_1,...,x_n## a basis for ##X##. Take ##y_1,...,y_k## any ##k## elements in ##X##. How do we define

$$f_{i_1,...,i_k}(y_1,...,y_k)$$

That would equal ##\sum \alpha_i##, where ##y_i = \sum \alpha_i x_i##.

No, that's not correct.

OK, so I am just going to try an example to see where my thought process is wrong, and then apply that to the general case.
Say ##y_2 = 3x_1 + 2x_2 + 4x_3##. Then ##f_{j_2}(y_2) = f_{j_2}(3x_1 + 2x_2 + 4x_3) = f_{j_2}(3x_1) + f_{j_2}(2x_2) + f_{j_2}(4x_3) = 2##.

tiger2030 said:
OK, so I am just going to try an example to see where my thought process is wrong, and then apply that to the general case.
Say ##y_2 = 3x_1 + 2x_2 + 4x_3##. Then ##f_{j_2}(y_2) = f_{j_2}(3x_1 + 2x_2 + 4x_3) = f_{j_2}(3x_1) + f_{j_2}(2x_2) + f_{j_2}(4x_3) = 2##.

It seems you misunderstand the indices ##j_1## and so on.
Let's work in one variable. In that case, you are given an index ##j_1## and you know that ##1\leq j_1 \leq n##. So ##j_1## could be anything from ##1## to ##n##. And for each value of this ##j_1##, you have a map.

So you have maps

$$f_1,~f_2,~f_3,~f_4,...$$

What ##f_4## (for example) does is pick out the coefficient of the fourth basis vector. So, for example

$$f_4(3x_1 + 2x_2 + 4x_3 + 6x_4) = 6$$

Now, in the case of two variables, you have two indices ##j_1## and ##j_2## which can take on values anything from ##1## to ##n##. Let's say ##n=3##, then you have maps

$$f_{1,1},~f_{1,2},~f_{1,3},~f_{2,1},~f_{2,2},~f_{2,3},~f_{3,1},~f_{3,2},~f_{3,3}$$

So there are ##9## maps. (with general ##n##, there are ##n^2## maps).

Let's look at a specific map like ##f_{2,1}##. This map is a bilinear map, meaning it takes in two elements of ##X##. What it does is select the 2nd coordinate of the first element and the 1st coordinate of the second element and multiply them. So

$$f_{2,1}(3x_1 + 6x_2 + 4x_3,x_1 + 2x_2 + 5x_3) = 6\cdot 1 = 6$$

Likewise for example,

$$f_{1,3}(3x_1 + 6x_2 + 4x_3,x_1 + 2x_2 + 5x_3) = 3\cdot 5 = 15$$
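A short Python sketch (my own addition; vectors are coordinate lists in the basis ##x_1, x_2, x_3##) reproducing the two computations above:

```python
# Illustrative sketch of the bilinear forms f_{j1,j2} discussed above.

def f(j1, j2):
    """The bilinear form f_{j1,j2}: multiply the j1-th coordinate of the
    first argument by the j2-th coordinate of the second argument."""
    return lambda u, v: u[j1 - 1] * v[j2 - 1]

u = [3, 6, 4]   # 3x_1 + 6x_2 + 4x_3
v = [1, 2, 5]   # x_1 + 2x_2 + 5x_3
print(f(2, 1)(u, v))  # 6 * 1 = 6
print(f(1, 3)(u, v))  # 3 * 5 = 15
```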

OK, that clears things up a lot more. So instead I would take the coefficient of the ##i_1##-th basis vector from ##y_1##, multiplied by the coefficient of the ##i_2##-th basis vector from ##y_2##, and so on, up to the coefficient of the ##i_k##-th basis vector from ##y_k##.

Right, so what we do is decompose each ##y_j## in the basis ##x_1,...,x_n##. Let's write

$$y_j = \alpha_{1,j} x_1 + ... + \alpha_{n,j} x_n$$

Then

$$f_{i_1,...,i_k}(y_1,...,y_k) = \alpha_{i_1,1}\cdot ... \cdot \alpha_{i_k,k}$$

I know the notation is very awkward, but you need to get used to it.
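The general formula can be sketched the same way (again my own illustration, with each ##y_j## given as its coordinate list ##(\alpha_{1,j},\dots,\alpha_{n,j})## in the basis ##x_1,...,x_n##):

```python
from math import prod

# Illustrative sketch of the general k-linear form f_{i_1,...,i_k}.

def f_multi(indices):
    """indices = (i_1, ..., i_k); returns the k-linear form that multiplies
    together the i_j-th coordinate of the j-th argument."""
    def form(*ys):
        assert len(ys) == len(indices)
        return prod(y[i - 1] for i, y in zip(indices, ys))
    return form

# Reproducing the bilinear example: f_{2,1}(3x_1+6x_2+4x_3, x_1+2x_2+5x_3) = 6
print(f_multi((2, 1))([3, 6, 4], [1, 2, 5]))  # 6
print(f_multi((1, 3))([3, 6, 4], [1, 2, 5]))  # 15

# With n = 3 and k = 2 there are n**k = 9 such index tuples, hence 9 forms.
n, k = 3, 2
print(n ** k)  # 9
```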

The explanation of the notation cleared a lot of stuff up for me. So now I know that we have taken apart each ##y_j## and are multiplying the coefficients together, but how does this show the dimension is ##n^k##?

I understand there would be ##n^k## different choices of the index tuple ##(i_1,\dots,i_k)##, but is this enough to prove the dimension?

The idea is to show that the set of all ##f_{i_1,...,i_k}## forms a basis for the vector space of all ##k##-forms.
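A sketch of where that hint leads (my own summary, using the formula above): any ##k##-linear form ##T## is determined by its values on tuples of basis vectors, and one checks that

$$T = \sum_{i_1,\dots,i_k=1}^{n} T(x_{i_1},\dots,x_{i_k})\, f_{i_1,\dots,i_k},$$

since both sides agree on every tuple ##(x_{j_1},\dots,x_{j_k})## by the delta property, so the ##f_{i_1,\dots,i_k}## span. Evaluating a vanishing combination ##\sum c_{i_1,\dots,i_k} f_{i_1,\dots,i_k} = 0## at ##(x_{j_1},\dots,x_{j_k})## forces ##c_{j_1,\dots,j_k} = 0##, so they are also linearly independent. Hence they form a basis of size ##n^k##.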

## 1. What is the definition of "Dim(X)" in linear algebra?

Dim(X) refers to the dimension of a vector space X, which is the number of vectors in any basis of X, i.e. the number of linearly independent vectors needed to span the entire space.

## 2. How is the dimension of a vector space related to the number of vectors in a basis?

The dimension of a vector space is equal to the number of vectors in any basis of that space. This means that any set of n linearly independent vectors in a vector space with dimension n can serve as a basis for that space.

## 3. How do you prove that the space of k-linear forms has dimension n^k?

To prove it, exhibit ##n^k## linearly independent ##k##-linear forms that span the space, that is, construct a basis of ##n^k## forms such as the ##f_{i_1,\dots,i_k}## above.

## 4. Can the dimension of the space of k-linear forms differ from n^k?

No. Every ##k##-linear form on an ##n##-dimensional space is determined by its ##n^k## values on tuples of basis vectors, and the ##n^k## forms ##f_{i_1,\dots,i_k}## are linearly independent, so the dimension is exactly ##n^k##.

## 5. What is the significance of this result in linear algebra?

It shows that the space of ##k##-linear forms has a concrete basis and therefore a well-defined dimension, ##n^k##. This is useful in further applications of linear algebra, such as solving systems of linear equations or finding eigenvalues and eigenvectors.
