What is the relationship between angles and pixels?

In summary: the three cameras have different image planes, so there is no overlap in the pixels they capture.
  • #1
Justice Hunter
Wasn't really sure how to ask this question since it's kind of niche.

I have 3 cameras, which have the property that they occupy no physical space, and have no lens. They are defined only by the angle of their view.

Now these cameras capture a certain field and take an image. Each camera always creates an image that's 1280 pixels wide and 720 pixels high. The other two cameras, Left and Right, are rotated about the center camera's axis by an arbitrary amount, elongating the center camera's image outward. So when the Left and Right cameras are each rotated away from the center by an angle equal to the center camera's angle of view, the field of view extends to 3 × 1280 = 3840 pixels. Below is an illustration that reflects this.
[Attachment: anamorphic1.png]

Additionally, if the Left and Right cameras are placed at zero rotation from the center camera, the minimum field of view is 1280 pixels.

So that's the setup. The issue arises when I want to find the relationship between the light blue lines (lines of interest) and the angle of view of the L and R cameras. Essentially, as the angle of the Left and Right cameras increases or decreases while the center camera's angle stays the same, the position of the lines of interest changes, as does the total extent of the field of view (the purple line).

My problem is finding the overlap in terms of pixels when the angle of view of the Left and Right cameras changes. For example, if the L and R cameras are rotated 10 degrees from the center camera, the position of those blue lines changes to some number between 0 and 1280 (0 meaning no change in angle, and 1280 meaning a change in angle equal to the angle of the center camera).

I searched online for anything remotely close to this, but most of what I found dealt with cameras that take up real space, which involves the real physical behaviour of light and lenses, and that's not what I'm looking for. I'm looking for a much more mathematical explanation, mostly in terms of relationships between triangles.
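A minimal sketch of the geometry being asked about, assuming each camera is an ideal pinhole projecting onto a flat image plane (the helper name and Python framing are mine, not from the thread): the mapping from a view angle to a pixel column goes through tangents, so it is close to, but not exactly, a linear proportion.

```python
import math

def angle_to_pixel(alpha_deg, fov_deg, width_px=1280):
    """Pixel column where a ray at angle alpha_deg (degrees off the
    optical axis) lands, for a pinhole camera with horizontal angle of
    view fov_deg and a flat image plane perpendicular to that axis."""
    half_tan = math.tan(math.radians(fov_deg / 2.0))
    ray_tan = math.tan(math.radians(alpha_deg))
    # Normalize the tangent to [0, 1] across the sensor, then scale.
    return (ray_tan / half_tan + 1.0) / 2.0 * width_px

# With a 30-degree angle of view, a ray 10 degrees off-axis lands at
# pixel ~1061, not the ~1067 a purely linear angle/pixel rule would give.
print(angle_to_pixel(10, 30))
```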
 
  • #2
Generally a camera needs to have a lens, and if the object is far away, the detector array is in the focal plane of the lens. There is a pinhole type camera that is an exception to this, but otherwise, an array of pixels as a detector is not going to be able to image anything.
 
  • #3
Apart from the real-life point that Charles addressed above, it sounds to me like all you need to solve this type of problem is some basic geometry. There are some triangles and angles involved, but you shouldn't need anything more complicated than basic trigonometry to calculate everything.
 
  • #4
I agree with the previous comments. For your analysis, any camera can be considered just a "pinhole" camera. The rest is plane geometry. I don't really understand, from your description, the exact question you have.
 
  • #5
Thanks, guys, for the replies, but I ended up finding the answer to my question while attempting to write a comment further articulating the problem. The above is described by a proportion:

LOI/Hpixel = AOS/TAoV

where LOI is the line of interest (our variable),
Hpixel is the fixed number of horizontal pixels,
AOS is the angle of separation from the center camera, ∠[(L+R)-C], and
TAoV is the total angle of view: the center camera's angle plus the angles of separation of L and R, ∠[L+C+R].

So, for example: LOI/1920 = 20/50. If my center camera has an angle of 30 degrees (assuming all cameras have the same angle of view), then tilting the two side cameras 10 degrees away from the center gives a total of 50 degrees and sets the AOS to 20 in the numerator. The LOI always comes out between 0 and 1280 (in this case it's 768), which surprised me, but it's what confirms that the relationship holds.
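A quick sketch of that proportion in Python (the helper name is mine, and the Hpixel value of 1920 is taken from the worked example as posted):

```python
def line_of_interest(aos_deg, center_aov_deg, h_pixels):
    """The proportion above: LOI / Hpixel = AOS / TAoV, where
    TAoV = angle of the center camera + angles of separation of L and R."""
    taov = center_aov_deg + aos_deg
    return h_pixels * aos_deg / taov

# Worked example from above: a 30-degree center camera with each side
# camera tilted 10 degrees gives AOS = 20 and TAoV = 50.
print(line_of_interest(20, 30, 1920))  # 768.0
```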

I know the answer could be articulated a bit better, but for now this solves my current problem. If anyone wants to clean up that answer, that would be nice.

Thanks again.
 
  • #6
We view our spherical world through a lens by projecting it onto our spherical retinas.

There is a problem with your model of the camera. The view is being considered in spherical coordinates, but the rectangular image is formed on the plane surface of an image sensor.

The edges of adjacent images will not match unless you distort the Cartesian images to make the directional lines of longitude vertical.
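A rough illustration of that distortion, under the flat-image-plane (rectilinear) pinhole assumption, with a hypothetical helper name: evenly spaced pixel columns correspond to evenly spaced tangents, not evenly spaced angles, and remapping columns to angles is the cylindrical-style correction that lets adjacent rotated views meet cleanly.

```python
import math

def column_to_view_angle(x_px, fov_deg, width_px=1280):
    """View angle (degrees) of the ray through pixel column x_px on a
    flat image plane with horizontal angle of view fov_deg."""
    half_tan = math.tan(math.radians(fov_deg / 2.0))
    return math.degrees(math.atan((2.0 * x_px / width_px - 1.0) * half_tan))

# Evenly spaced columns are not evenly spaced in angle, which is why
# adjacent flat images need remapping before their edges can match:
print([round(column_to_view_angle(x, 60), 1) for x in (0, 320, 640, 960, 1280)])
# -> [-30.0, -16.1, 0.0, 16.1, 30.0]
```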
 
  • #7
As I understand it, you have three pinhole cameras sharing the same pinhole.
So the only difference among those cameras is their image planes.
Your two lines of interest land on a plane that we will call the subject plane.

Let's say that all three cameras have a view that extends 30 degrees to the left and right of the center-of-view axis - a total 60-degree view.

We will also say that the center camera's view just barely includes your two lines of interest. It will divide the space between those lines into 1280 parts - and because the subject plane is parallel to the center camera's image plane, those pixels will be evenly spaced along the subject plane.

Now your right and left cameras are not looking directly at the subject plane. You do not show their image planes, but normally the image plane is perpendicular to a ray extending from the focal point (pinhole) to the center of the camera view - and so I will assume that to be the case.

Let's say that each of them is rotated 30 degrees to the side (relative to the center camera). So they will have a view that runs from 0 to 60 degrees or 0 to -60 degrees.

Since each side camera is catching half the frame of the center camera, it will have 640 pixels of overlap with that center camera. But be careful: even though there are 640 pixels of overlap, those 640 center-camera pixels and 640 side-camera pixels are not distributed across the subject plane in the same way. Also, the width of the subject plane captured by the center camera will be less than that captured by either side camera. This is especially true when the side cameras are rotated far enough to catch the subject plane's horizon in their view; when that happens, the amount of subject-plane surface area captured by each side camera becomes infinite.
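A sketch of that footprint effect, under the same pinhole assumptions (the helper is hypothetical): project each camera's pixel boundaries onto a subject plane one unit in front of the shared pinhole and compare the spacing.

```python
import math

def footprint_on_plane(rotation_deg, fov_deg=60.0, n=5):
    """Where n+1 evenly spaced pixel boundaries land on a subject plane
    one unit from the pinhole, for a camera whose optical axis is
    rotated rotation_deg away from the plane's normal."""
    half_tan = math.tan(math.radians(fov_deg / 2.0))
    rot = math.radians(rotation_deg)
    points = []
    for i in range(n + 1):
        # Boundary i sits at this angle off the camera's optical axis...
        alpha = math.atan((2.0 * i / n - 1.0) * half_tan)
        # ...and that ray meets the subject plane at x = tan(rot + alpha).
        points.append(math.tan(rot + alpha))
    return points

print([round(x, 2) for x in footprint_on_plane(0)])   # evenly spaced
print([round(x, 2) for x in footprint_on_plane(30)])  # bunched, then spread
# [-0.58, -0.35, -0.12, 0.12, 0.35, 0.58]
# [0.0, 0.19, 0.43, 0.74, 1.15, 1.73]
# Rotate far enough for the view to include the plane's horizon and
# tan() diverges: the "infinite surface area" case described above.
```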
 

1. What is the relationship between angles and pixels?

Pixels are the smallest units of a digital image, while angles describe directions of view. In a camera, the two are linked by the angle of view: the angular extent of the scene is divided across the image's pixels, so each pixel corresponds to a small range of directions.

2. How do angles affect the appearance of pixels in an image?

Angles can affect the appearance of pixels in a few ways. The angle of a line or edge relative to the pixel grid determines how it is sampled, so lines at shallow angles show visible staircase (aliasing) artifacts. The angle at which light strikes a surface also affects the brightness and color recorded by each pixel.

3. Is there a specific angle that is best for capturing high-quality pixel images?

There is no specific angle that is universally considered the best for capturing high-quality pixel images. The optimal angle for capturing an image will depend on various factors such as lighting, subject matter, and the desired effect. Experimenting with different angles can help determine the best approach for capturing a high-quality pixel image.

4. How do angles and pixels relate to image resolution?

Image resolution is a measure of how many pixels an image contains. Spreading a fixed angle of view across more pixels means each pixel covers a smaller range of directions, so the relationship between angles and pixels directly determines how much angular detail an image can resolve.

5. Can angles and pixels be used to enhance or manipulate images?

Yes, angles and pixels can be used to enhance or manipulate images in various ways. For example, changing the angle of light hitting a subject can create different shadows and highlights, which can enhance the overall appearance of an image. Additionally, manipulating the arrangement and distribution of pixels can alter the resolution and clarity of an image, allowing for creative effects and enhancements.
