Showing the derivative of a vector is orthogonal to the vector

In summary, the thread discusses a curve on a sphere and the relationship between the position vector and its derivative. The statement "a curve lies on a sphere centered at the origin" is equivalent to "|r(t)| = c (constant)", since it means the position vector stays at a constant distance c from the origin. If |r(t)| is not constant, the position vector and the derivative vector need not be orthogonal; an example is r(t) = <t, t>.
  • #1
davidp92

Homework Statement


http://i.imgur.com/6j8W6.jpg
I'm trying to understand that example in the text. I can imagine a curve on a sphere having its derivative vector orthogonal to the position vector. What I don't understand is: how does "a curve lies on a sphere with center the origin" mean the same thing as "|r(t)| = c (constant)"?

Homework Equations


The problem is I don't understand why the statement says "Show that if |r(t)| = c ..."
Doesn't the example mean that every derivative vector of a curve is orthogonal to the position vector, since |r(t)| = c for all t as long as the curve is continuous? (Thinking of it in terms of sqrt(x^2+y^2+z^2).)
When is |r(t)| not equal to a constant?
 
  • #2
If |r(t)| is a constant, then you are essentially talking about a radius. Think about what the definition of |r(t)| is (it looks like you have an idea in the relevant equations section). If it is not equal to a constant, then the position vector and the derivative are not necessarily orthogonal. It is easy to find an example of this (for example, r(t) = <t,t>).
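A quick numeric check of that example (a sketch, not from the thread; the helper names are mine): for r(t) = <t, t> the magnitude grows with t, and r · r' is nonzero.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# r(t) = <t, t>: |r(t)| = sqrt(2) * |t|, which is not constant.
def r(t):
    return (t, t)

def r_prime(t):
    return (1.0, 1.0)  # derivative of <t, t>

t = 3.0
print(dot(r(t), r_prime(t)))  # r . r' = 2t = 6.0, not zero
print(math.hypot(*r(1.0)), math.hypot(*r(2.0)))  # |r| grows with t
```

So the curve is perfectly smooth and continuous, yet its distance from the origin changes, and the derivative is not orthogonal to the position vector.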
 
  • #3
I don't quite get your question. |r(t)|=c means the position r(t) is at a constant distance of c from the origin at all times. If it's always at a distance of c from the origin, then it lies on a sphere of radius c around the origin. For a general curve, the distance from the origin will vary with time.
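For reference, the standard computation behind this (written out here, not quoted from the thread):

```latex
% |r(t)| = c for all t means r(t) . r(t) = c^2. Differentiate both sides
% using the product rule for the dot product:
\frac{d}{dt}\bigl(\mathbf{r}(t)\cdot\mathbf{r}(t)\bigr)
  = \mathbf{r}'(t)\cdot\mathbf{r}(t) + \mathbf{r}(t)\cdot\mathbf{r}'(t)
  = 2\,\mathbf{r}(t)\cdot\mathbf{r}'(t)
  = \frac{d}{dt}\,c^{2} = 0
% hence r(t) . r'(t) = 0: the derivative is orthogonal to the position vector.
```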
 
  • #4
lineintegral1 said:
If |r(t)| is a constant, then you are essentially talking about a radius. Think about what the definition of |r(t)| is (it looks like you have an idea in the relevant equations section). If it is not equal to a constant, then the position vector and the derivative are not necessarily orthogonal. It is easy to find an example of this (for example, r(t) = <t,t>).

How is |r(t)| not equal to a constant for r(t)=<t,t>?

Thanks for replying!
 
  • #5
Dick said:
I don't quite get your question. |r(t)|=c means the position r(t) is at a constant distance of c from the origin at all times. If it's always at a distance of c from the origin, then it lies on a sphere of radius c around the origin. For a general curve, the distance from the origin will vary with time.

Ohh, I get it now. I wasn't thinking of it the way you put it in bold.

Thanks!
 

Related to Showing the derivative of a vector is orthogonal to the vector

1. What is a derivative of a vector?

The derivative of a vector function describes how the vector changes as its parameter (often time or distance) changes. It captures the rate of change of both the vector's magnitude and its direction.

2. How is the derivative of a vector calculated?

The derivative of a vector function can be calculated using the limit definition of the derivative: take the limit of the change in the vector over a vanishingly small interval of the parameter. In practice, this is done component by component; the derivative vector collects the derivatives of each component.
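As a sketch of the component-by-component idea (the curve and function names here are illustrative, not from the FAQ), a central finite difference applied to each component approximates the derivative vector:

```python
import math

def r(t):
    # example curve: a helix in R^3
    return (math.cos(t), math.sin(t), t)

def numeric_derivative(f, t, h=1e-6):
    # central difference, applied to each component separately
    fp, fm = f(t + h), f(t - h)
    return tuple((a - b) / (2 * h) for a, b in zip(fp, fm))

t = 1.0
approx = numeric_derivative(r, t)
exact = (-math.sin(t), math.cos(t), 1.0)
# each component of approx matches the analytic derivative closely
```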

3. What does it mean for the derivative of a vector to be orthogonal?

When the derivative of a vector is orthogonal to the vector itself, it means the two vectors are perpendicular to each other; equivalently, their dot product r(t) · r'(t) is zero.
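A small check of this (a sketch; the curve and helpers are my own choice): for r(t) = <cos t, sin t>, which has |r(t)| = 1, the dot product with the derivative vanishes at every t.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def r(t):
    return (math.cos(t), math.sin(t))   # unit circle: |r(t)| = 1

def r_prime(t):
    return (-math.sin(t), math.cos(t))  # derivative of r

for t in (0.0, 0.7, 2.5):
    print(dot(r(t), r_prime(t)))  # 0.0 each time: r' is orthogonal to r
```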

4. Why is it important to show that the derivative of a vector is orthogonal to the vector?

Showing that the derivative is orthogonal to the position vector whenever |r(t)| is constant is a fundamental fact of vector calculus: it says that motion on a sphere (or circle) has velocity tangent to that sphere. This comes up throughout physics and engineering, for example in uniform circular motion, where velocity is perpendicular to the radius.

5. Can the derivative of a vector ever be parallel to the vector?

Yes. For example, r(t) = e^t <1, 0> has derivative r'(t) = e^t <1, 0>, which is parallel to r(t) for every t. The orthogonality result holds only when |r(t)| is constant. In general, the component of r'(t) along r(t) measures how fast |r(t)| is changing, so it vanishes exactly when the magnitude is constant.
