What is the proper time interval in relativity and how is it determined?

  • Thread starter: SecretSnow
  • Tags: Proper time, Time
SUMMARY

The discussion centers on the concept of proper time in the context of relativity, specifically addressing how it is measured and its relationship to relativistic time. Proper time, denoted as t0, is defined as the time interval measured by an observer at rest relative to the events being timed. It is established that proper time is always less than the time measured by an observer in a different inertial frame, such as an observer on Earth measuring the time of an airplane in flight. The confusion arises when considering scenarios involving acceleration and constant speed, highlighting the need for clarity in defining the proper time interval in varying frames of reference.

PREREQUISITES
  • Understanding of Einstein's theory of relativity
  • Familiarity with the concept of inertial frames of reference
  • Knowledge of time dilation and its mathematical representation
  • Basic principles of measuring time intervals in physics
NEXT STEPS
  • Study the mathematical formulation of time dilation in special relativity
  • Explore the concept of simultaneity in different inertial frames
  • Learn about the implications of acceleration on time measurement
  • Investigate practical applications of proper time in GPS technology
USEFUL FOR

Students of physics, educators teaching relativity, and anyone interested in the implications of time measurement in different frames of reference.

SecretSnow
Hi guys, I'm confused about proper time. Since t = γ·t0 (where γ is the Lorentz factor), the proper time t0 is always smaller than the dilated time, right? Also, proper time is measured by an observer whose frame of reference is at rest, right? (Can it be measured by an observer in an inertial frame of reference, like one moving at constant speed?) In the case of an observer on Earth measuring the time taken by an airplane traveling at constant speed, which time is the proper time? At first I wanted to take the proper time as the time measured by the observer at rest on Earth, but that would give a greater time than the plane's, which seems wrong, since time runs slower on the plane (so the plane should record a smaller time for a given interval measured by a non-moving observer). If that is the case, then the plane should have the proper time, right? Why is this so? Is my definition of proper time wrong? Also, if the plane initially accelerates to reach its cruising speed, how can I still take its time as the proper time? Thanks a lot!
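The time-dilation relation in the question, t = γ·t0, can be sketched numerically. This is a minimal illustration, not from the original thread; the function names are my own.

```python
import math

C = 299_792_458.0  # speed of light in m/s (exact by definition)

def lorentz_gamma(v):
    """Lorentz factor gamma = 1 / sqrt(1 - v^2 / c^2) for speed v < c."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def dilated_time(t0, v):
    """Coordinate time t = gamma * t0 measured in a frame where the
    clock that reads proper time t0 moves at speed v."""
    return lorentz_gamma(v) * t0

# At everyday speeds gamma is barely above 1:
print(lorentz_gamma(300.0))        # airplane speed: gamma ~ 1 + 5e-13
print(dilated_time(1.0, 0.6 * C))  # at 0.6c, 1 s of proper time -> 1.25 s
```

Because γ ≥ 1, the moving clock's proper time t0 is always the smaller of the two intervals, which is the point the question is circling around.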
 
SecretSnow said:
Hi guys, I'm confused about proper time. Since t = γ·t0, the proper time t0 is always smaller than the dilated time, right?
I don't know what you mean by one time being "smaller" than another. I think you mean that the time interval measured by a person at rest relative to the events is smaller than the interval measured by another observer.

Also, proper time is measured by an observer whose frame of reference is at rest, right?
Well, every observer is at rest in his own frame of reference. But, yes, the "proper time" interval between two events is the time measured in a frame in which the two events occur at the same place, so there is no motion between them.
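That definition can be made precise with the invariant interval; the following is the standard textbook relation, restated here for reference:

```latex
% Invariant interval between two timelike-separated events:
c^2\,\Delta\tau^2 \;=\; c^2\,\Delta t^2 \;-\; \Delta x^2 .
% In the frame where both events happen at the same place, \Delta x = 0,
% so the clock sitting there reads \Delta\tau directly.
% If that clock moves at speed v in another frame, then \Delta x = v\,\Delta t:
\Delta\tau \;=\; \Delta t\,\sqrt{1 - v^2/c^2} \;=\; \frac{\Delta t}{\gamma}.
```

Since the square root is at most 1, the proper time Δτ is never larger than the coordinate time Δt measured in any other inertial frame.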

(Can it be measured by an observer in an inertial frame of reference, like one moving at constant speed?)
Well, not directly "measured", but an observer can certainly calculate what the time interval would be in a moving frame.

In the case of an observer on Earth measuring the time taken by an airplane traveling at constant speed, which time is the proper time?
The "proper time" aboard an airplane would be the time according to a clock traveling on that airplane.

At first I wanted to take the proper time as the time measured by the observer at rest on Earth, but that would give a greater time than the plane's, which seems wrong, since time runs slower on the plane (so the plane should record a smaller time for a given interval measured by a non-moving observer). If that is the case, then the plane should have the proper time, right? Why is this so? Is my definition of proper time wrong? Also, if the plane initially accelerates to reach its cruising speed, how can I still take its time as the proper time? Thanks a lot!
We are, after all, talking about relativity. Your "proper time", standing on the Earth, would be different from the "proper time" of a person traveling on the airplane. If you are thinking that "proper time" is some sort of "absolute" time that everyone would agree on, you have misunderstood the whole idea of "relativity": there is no such absolute time.
 
SecretSnow

Ah yes, the keyword is "interval"; I missed that. I know there's no such thing as absolute time, but I'm still confused about how to tell which time is the proper time. I've encountered a question that says, for example, a plane travels at 300 m/s across 3000 km, and I don't know which time interval is the proper one. Of course, if I take the proper time as the time on the plane, meaning I think of the Earth as moving away from the plane at 300 m/s while the plane is at rest, this is correct. But how do I know which proper time to take if I choose my frame of reference as an observer on Earth? At that point I'll be confused unless I reason that a moving object, with respect to my frame of reference, records a smaller time interval. I might wrongly assume that, because I'm at rest, the time I measure is the proper time interval.
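The 300 m/s over 3000 km example can be worked through explicitly. A rough sketch (my own numbers, assuming the distance is measured in the Earth frame):

```python
import math

C = 299_792_458.0   # speed of light, m/s
v = 300.0           # plane's speed in the Earth frame, m/s
d = 3_000_000.0     # trip distance in the Earth frame, m (3000 km)

# Earth-frame (coordinate) time for the trip:
t_earth = d / v                      # 10_000 s

# Proper time read by a clock on the plane: tau = t / gamma
tau_plane = t_earth * math.sqrt(1.0 - (v / C) ** 2)

# The difference is tiny; the series expansion t * v^2 / (2 c^2)
# gives the same answer without floating-point precision worries:
delta_ns = t_earth * v**2 / (2.0 * C**2) * 1e9
print(t_earth)    # 10000.0 s in the Earth frame
print(delta_ns)   # the plane's clock lags by about 5 nanoseconds
```

So the Earth observer's 10,000 s is the coordinate time, and the plane's clock (the proper time for the trip, since departure and arrival both happen at the plane) reads about 5 ns less.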

Or should I just take it that, if I'm at rest on Earth, the proper time is the time measured by the moving object? That is, I take the moving object's rest frame as the reference, so the proper time interval is the one measured in the moving object's frame, and every other reference frame sees a larger time interval.

Most importantly, though, is the proper time interval I take truly justified? After all, before the object can move at constant speed it must accelerate, and what are the consequences of that? Can its time still be considered a proper time interval? Should I ignore the part where it accelerates and take the proper time interval only while it is at constant speed?

If so, how can I deal with the time interval while accelerating?
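The acceleration question has a clean answer in special relativity: a clock's proper time along any trajectory is the integral τ = ∫√(1 − v(t)²/c²) dt over coordinate time, acceleration phase included. A minimal numerical sketch (the flight profile is invented for illustration):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def proper_time(v_of_t, t_total, dt=0.01):
    """Numerically integrate tau = integral of sqrt(1 - v(t)^2/c^2) dt
    from 0 to t_total (coordinate time in one inertial frame)."""
    steps = int(round(t_total / dt))
    tau = 0.0
    for i in range(steps):
        v = v_of_t(i * dt)
        tau += math.sqrt(1.0 - (v / C) ** 2) * dt
    return tau

# Invented profile: accelerate at 3 m/s^2 for 100 s, then cruise at 300 m/s.
def plane_speed(t):
    return 3.0 * t if t < 100.0 else 300.0

tau = proper_time(plane_speed, 10_000.0, dt=1.0)
# tau comes out just slightly below 10_000 s; the acceleration phase
# simply contributes its own piece of the integral -- nothing special
# is needed, and there is no need to ignore it.
```

For a constant-speed leg the integral reduces to the familiar Δτ = Δt/γ, so ignoring the short acceleration phase is a good approximation here, but not a requirement.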
 
