Representing a dot product with sums

Disowned
Is it possible to represent the dot product (matrix multiplication) with sums? For example, I know the dot product of one polynomial with another [i.e. 2 + 5x and 3x + 7x^2] would be the sum of the products [i.e. 2(3x) + 5x(7x^2)].

Could this also be written as \sum_{i=1}^{n} a_{1i}b_{i1}? I'm asking because I have a really strict teacher who wants you to show all your work on a test or else he marks the entire thing wrong, and showing the work for the product of a 5×2 by a 2×5 matrix is too much work during a timed exam.
 
Disowned said:
Is it possible to represent the dot product (matrix multiplication) with sums?

The dot product can't always be described with matrices. With R^n, this is fine. But for other inner product spaces, it's not always possible. For example, function spaces often use a dot product that looks like this: f \cdot g = \int f(x)g(x) dx
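To make the function-space example concrete, here's a minimal sketch that approximates that integral inner product numerically with a midpoint Riemann sum. The interval [0, 1] and the two polynomials from the original post are just illustrative choices, not part of any standard definition:

```python
# Sketch: inner product of two functions as an integral,
# f . g = integral of f(x) * g(x) dx, approximated on [0, 1]
# with a midpoint Riemann sum. Interval and n are arbitrary choices.

def inner_product(f, g, a=0.0, b=1.0, n=100_000):
    """Approximate the integral of f(x) * g(x) over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h)
                   for i in range(n))

f = lambda x: 2 + 5 * x          # the polynomial 2 + 5x
g = lambda x: 3 * x + 7 * x**2   # the polynomial 3x + 7x^2

print(round(inner_product(f, g), 4))   # ~ 21.4167, i.e. 3 + 29/3 + 35/4
```

Note this gives a completely different number than any coefficient-pairing scheme, which is the point: the inner product depends on which space (and which definition) you're working in.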

For example, I know the dot product of one polynomial with another [i.e. 2 + 5x and 3x + 7x^2] would be the sum of the products [i.e. 2(3x) + 5x(7x^2)].

There are a few issues with this. First, you're not using R^n, so it's unclear what a "dot product" means between polynomials. Second, even if you were to use a definition similar to this, you'd need to multiply corresponding terms. What you're doing is similar to saying that the dot product between [2,5,0] and [0,3,7] is 2 * 3 + 7 * 5 (which is nonsensical because the pairs are not multiplied in the correct way).
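To see the pairing concretely, here's a minimal sketch where the vectors are just the coefficient lists from the example above:

```python
# Sketch: the dot product multiplies *corresponding* entries, then sums.
def dot(u, v):
    assert len(u) == len(v), "dot product needs vectors of equal length"
    return sum(a * b for a, b in zip(u, v))

u = [2, 5, 0]   # coefficients of 2 + 5x
v = [0, 3, 7]   # coefficients of 3x + 7x^2

print(dot(u, v))   # 2*0 + 5*3 + 0*7 = 15
# Pairing mismatched entries, like 2*3 + 5*7, is NOT the dot product.
```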

Could this also be written as \sum_{i=1}^{n} a_{1i}b_{i1}? I'm asking because I have a really strict teacher who wants you to show all your work on a test or else he marks the entire thing wrong, and showing the work for the product of a 5×2 by a 2×5 matrix is too much work during a timed exam.

A dot product is a special operation between two vectors, not two matrices. It sounds like you're looking for shortcuts in matrix multiplication. Sadly, it IS time consuming, and it's work best left to a calculator if you need numeric answers. You should talk to your teacher about his test policy, and to other students in your class, to see what he expects and how your classmates handle the same situation.
 
Well yeah, I was just talking about vector spaces in R^n. I'll see if he'll give us some leeway with matrix multiplication. Also, isn't a matrix just composed of vectors?
 
Disowned said:
Well yeah, I was just talking about vector spaces in R^n. I'll see if he'll give us some leeway with matrix multiplication. Also, isn't a matrix just composed of vectors?

A matrix is its own thing. It's not "composed" of vectors in any literal sense.

However, you can define two useful operations on matrices. If M is an m-by-n matrix, then you can define Row_i(M) to be the i-th "row vector" and Col_j(M) to be the j-th "column vector". Row vectors will be vectors in R^n and column vectors will be vectors in R^m.

Then, matrix multiplication is a little easier. For two matrices, M and N,

(M * N)_{i,j} = Row_i(M) · Col_j(N)

Let me clarify what this means. The result of M * N is a matrix. To find out what M * N is, you just need to know what its components are. So the component of M * N at row i and column j is equal to the right hand side of the equation. The right hand side says "take the i-th row of M and the j-th column of N and find their dot product".
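In code, the row-times-column rule above might look like this sketch (the helper names row, col, and dot are just illustrative, not any standard library):

```python
# Sketch: matrix multiplication via (M * N)_{i,j} = Row_i(M) . Col_j(N).
def row(M, i):
    return M[i]                      # i-th row, a vector in R^n

def col(M, j):
    return [r[j] for r in M]         # j-th column, a vector in R^m

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matmul(M, N):
    # entry (i, j) of the product is the dot product of
    # the i-th row of M with the j-th column of N
    return [[dot(row(M, i), col(N, j)) for j in range(len(N[0]))]
            for i in range(len(M))]

M = [[1, 2], [3, 4]]
N = [[5, 6], [7, 8]]
print(matmul(M, N))   # [[19, 22], [43, 50]]
```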

This is the way I like to think about matrix multiplication, but just find a way that you can remember for tests, I guess. It's hard stuff. I still can't keep rows and columns straight. But really, it's an arbitrary decision anyway! It works just as well if you flip "column" with "row" in the definition as long as you keep it consistent.
 