Non-Euclidean Geometry question

RelativeQuanta
I'm trying to prove that the interior angles of a spherical triangle sum to \pi + \frac{A}{a^2}, where A is the area of the triangle and a is the radius of the spherical space.
I think I know how to prove it, but there is one part that has me stumped.

I'm using the General Relativity book by James Hartle, which states that any geometry can be built up from a line element. The book claims that in Euclidean geometry you can use the definition \theta = \delta C / r (where \delta C is the arc length) to prove that the sum of the angles of a triangle is \pi. I reason that I can use that proof, with the modifications for spherical space, to prove that the interior angles of a spherical triangle sum to \pi + \frac{A}{a^2}. My problem is, I haven't been able to prove that a planar triangle's angles sum to 180 degrees using the definition of an angle!
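(For concreteness, here is the setup I have in mind, written in standard polar-type coordinates; this is my own summary rather than Hartle's exact wording. The two line elements being compared are

dS^2 = dr^2 + r^2 \, d\phi^2 \qquad \text{(plane)}

dS^2 = dr^2 + a^2 \sin^2(r/a) \, d\phi^2 \qquad \text{(sphere of radius } a\text{)}

where r is the distance from the circle's center measured along the surface, so a circle of radius r has circumference C = 2\pi r in the plane and C = 2\pi a \sin(r/a) on the sphere.)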

So my questions are:
1. Is this how I should be trying to make my proof?
2. If so, how do I prove that the angles of a triangle sum to \pi on a plane just using C = 2 \pi r and \theta = \delta C / r?

Thanks
 
Well, of course you can't prove that a planar triangle's angles sum to 180 "using the definition for an angle" because the definition of an angle is the same even in spherical geometry where that's not true. In order to prove that the angles of a planar (Euclidean) triangle sum to 180 degrees you have to use the parallel postulate. The parallel postulate is not true for a sphere so that's not going to work.
 
So, was the author of my text incorrect in claiming that from the line element dS = [(dx)^2 + (dy)^2]^{1/2} and the definition \theta = \delta C/R you can prove that the interior angles of a triangle in plane space sum to 180 degrees?

HallsofIvy said:
Well, of course you can't prove that a planar triangle's angles sum to 180 "using the definition for an angle" because the definition of an angle is the same even in spherical geometry . . .

I believe your statement here is incorrect. Remember, the angles in a 2D spherical space lie on the surface of the sphere. The circumference of a circle on a sphere is defined like this:
C = 2 \pi a \sin(r/a), where r is the radius of the circle and a is the radius of the sphere. Wouldn't this make the definition of the angle look like this?
\theta = \frac{\delta C}{a \sin(r/a)}
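(As a sanity check on that form: for circles much smaller than the sphere, \sin(r/a) \approx r/a, so

\theta = \frac{\delta C}{a \sin(r/a)} \approx \frac{\delta C}{r},

which recovers the flat-space definition, so the two definitions agree locally.)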

I'll explore other ways to make the proof.
 
That's NOT the "definition of the angle", that's a formula for the measure of the angle.
 
I'm sorry if I offended you, but what you are saying is in direct contrast with what my text is saying. Either way, it's a moot point.

All I want to know is how to prove that the interior angles of a spherical triangle sum to \pi + \frac{A}{a^2}.

I felt the textbook "Gravity: An Introduction to Einstein's General Relativity" by James B. Hartle was leading me toward this proof when it said,
The angle between two intersecting lines, for example, can be defined as the ratio of the length \Delta C of the part of the circle centered on their intersection that lies between the lines to the circle's radius R.
\theta \equiv \frac{\Delta C}{R} (radians).

With this definition we could prove that the sum of the interior angles of a triangle is \pi.
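(A trivial check of this definition: for a full circle \Delta C = 2\pi R, so \theta = 2\pi, and for a straight line the arc lying on one side of the line is half the circle, giving

\theta = \frac{\pi R}{R} = \pi,

which is the straight angle in radians.)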
 
I proved it! Though not in the way I thought.

Since a spherical triangle is formed by three intersecting great circles, you can use the area enclosed by each pair of circles (the lune-shaped sections closed off by two of the three intersecting circles). It helps to draw it out. When you do, it becomes apparent that adding the three areas together gives the area of one hemisphere plus twice the area of the triangle they make. Using that, you can easily show that the sum of the angles of that triangle is \pi + \frac{A}{a^2}, as spelled out below.
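Spelled out, using the standard result that a lune of angle \theta on a sphere of radius a has area 2\theta a^2 (that fact isn't stated above, so take it as an assumed ingredient):

2a^2(\theta_1 + \theta_2 + \theta_3) = 2\pi a^2 + 2A \quad\Longrightarrow\quad \theta_1 + \theta_2 + \theta_3 = \pi + \frac{A}{a^2}.

As a quick numerical check, the triangle with a vertex on each of three mutually perpendicular axes has three right angles and area \frac{1}{8}(4\pi a^2) = \frac{\pi a^2}{2}, and indeed \frac{3\pi}{2} = \pi + \frac{\pi a^2/2}{a^2}.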

 