Do we see in digital or analog?

In summary, the conversation discusses whether we see in digital or analog. Some argue it is a combination of both, given the way our eyes detect light and the brain processes information: the brain is highly adept at recognizing stimuli in a continuous flow rather than in snapshots. There is also discussion of the rate at which we can detect changes, with some saying it is at least 1000 frames per second, and of the analog and digital nature of circuits and cameras, with the conclusion that the answer lies somewhere in the middle.
  • #1
PeteGt
simple question... do we see in digital or analog?
 
  • #2
There's a baby in a cradle. Some person bends down over the cradle, and smiles at the baby. There are 2 possible responses:
0) This is mom.
1) This is not mom.
So, clearly: digital - just jokin'!
 
  • #3


Originally posted by PeteGt
simple question... do we see in digital or analog?

Actually, that's an interesting question. Since our eyes detect light with a set of rods and cones, I'd say it's a little of both, but to be accurate the question becomes: how do rods and cones detect light/photons?

Pete
 
  • #4
Something in the middle -- more akin to fuzzy logic. Fuzzy logic deals not just with two states (truth and falsehood, one and zero, etc.) but with graded classes of stimuli (very purple, sort of purple, a little purple, not very purple, etc.). The brain seems to do a lot of its work in fuzzy logical terms.
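
As a toy sketch of that idea in Python (the "purple" center hue and the tolerance below are invented for illustration, not measured values):

    # A minimal sketch of fuzzy membership: instead of a binary
    # "purple / not purple" decision, each stimulus gets a grade in [0, 1].
    # The center hue and tolerance are invented for illustration.
    def purpleness(hue_degrees):
        """Grade how 'purple' a hue angle is, from 0.0 to 1.0."""
        PURPLE_CENTER = 280.0  # hypothetical hue of 'pure purple'
        TOLERANCE = 60.0       # hues this far away count as 'not purple'
        distance = abs(hue_degrees - PURPLE_CENTER)
        return max(0.0, 1.0 - distance / TOLERANCE)

    for hue in (280, 300, 330, 30):
        print(hue, round(purpleness(hue), 2))
    # 280 -> 1.0 (very purple), 300 -> 0.67 (sort of purple),
    # 330 -> 0.17 (a little purple), 30 -> 0.0 (not purple)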

You can analyze the pieces individually: the rods & cones, the optic nerve fibers that carry messages, the sections of the visual cortex used for edge detection, concavity, closure, and so on -- and the end result is that the brain sees in neither a purely digital nor a purely analog style.

- Warren
 
  • #5
Which is an interesting point. We do not "see" with our eyes; the brain actually constructs the image for us. We "see" with our brains.

But I guess the same can be said for hearing, where the hairs in your cochlea are each tuned to a specific wavelength, transmit to the brain, and can carry amplitude information as well.

I'm not totally certain it's anything we've said, but I still debate with my friend whether it is or not.

So should we debate why we really can see more than 30 frames per second?

Pete
 
  • #6
We see in analog but the messages are sent digitally?
 
  • #7
There's no need to debate the number of frames per second we can see -- it's at least 1000 per second. Experiment can show us this.

However, "seeing" involves the brain. The brain is highly adept at recognizing certain stimuli unfold in millisecond scale (for example, a strobe), but not others. I'm not entirely sure, but I think it's a bit incorrect to think of the eye-brain system as taking "snapshots" at all. The flow of data from eye -> visual cortex -> conscious white matter may be more or less continuous.

- Warren
 
  • #8
I would believe 1000 fps myself. I didn't think it was that high, but I'm sure it is. I remember a study done by the Navy where they were showing things for 1/200th of a second and people could see the objects and recognize them.

Hmmm, would the constant "seeing" that you say we do (instead of snapshots) be analog in itself?
 
  • #9
Eyes are digital: CCD-camera-like sampling. Even more so, aren't the eye's sensors excited by the particle aspect of photons? That makes eyes even more digital.

But the messages to the brain are indeed more like analog signals per pixel.

As regards "seeing" itself, the brain's focus of attention has quite an impact. Things in focus get more processing, and that reduces the "fps". Our peripheral vision can detect small changes at up to 1000 fps, but they are not clearly perceived images.
30 fps is the rate at which the brain is able to construct a fluid movie; it's not the rate at which it can detect changes. Still, the fps in focus is variable and depends on the complexity of the image.
 
  • #10
If 30 fps is what we consider gives us movie-like sight, then why is the movie standard 24 fps?

If you consider a rod or cone responding to a photon, and thus digital, then everything should be digital. Most analog circuitry is just the excitement of an electron to another state, and then another, and another.

Just asking questions, not harassing, believe me :-)

Pete
 
  • #11
wimms,

Do you have any references to support your assertion that eyes operate analogously to CCDs (i.e. an array of discrete pixels)? I was under the impression that they were far, far different.

- Warren
 
  • #12
As far as I know, digital circuits operate with two levels of voltage, while analog circuits operate with a continuous range of voltage. Even modern ICs are difficult to categorize as purely digital or purely analog; they are more of a mix between the two.
Not even the CCD is entirely digital, since you have to sample the value of the charge in the CMOS capacitors and use converters for that. So I'd say the answer is somewhere in the middle.
 
  • #13
In a sense you are correct, Guybrush.

There is no such thing as a "purely digital" circuit. The transistors that go into a digital circuit are inherently analog devices, but they are designed so that their analog character is negligible in the circuit. Essentially, if the transistor is fast enough, you can just assume it's "purely digital."

And you're also correct about CCDs -- the sense amps used to amplify the charge contained in each pixel are analog circuits. Shortly afterward, the analog voltage (corresponding to the amount of charge in a bucket) is converted to a digital word by an analog-to-digital converter, a mixed-mode device. The digital words are then sent off-chip.
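
A minimal sketch of that final conversion step, assuming an arbitrary 8-bit converter and a 1 V full-scale reference (neither figure comes from a real CCD):

    # Sketch of the A/D conversion stage: an analog sense-amp voltage
    # is mapped to one of 2**BITS discrete codes. BITS and V_REF are
    # arbitrary illustrative values, not taken from a real CCD.
    BITS = 8
    V_REF = 1.0  # assumed full-scale voltage

    def adc(voltage):
        """Quantize an analog voltage into an n-bit digital word."""
        levels = 2 ** BITS
        code = int(voltage / V_REF * levels)
        return min(max(code, 0), levels - 1)  # clamp to valid codes

    word = adc(0.37)
    print(word, format(word, '08b'))  # 94 01011110 -- the word sent off-chip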

- Warren
 
  • #14
Originally posted by chroot
Do you have any references to support your assertion that eyes operate analogously to CCDs (i.e. an array of discrete pixels)? I was under the impression that they were far, far different.
You didn't take it literally, I hope.
short: http://hyperphysics.phy-astr.gsu.edu/hbase/vision/eye.html#c1
with ccd image ;) http://users.rcn.com/jkimball.ma.ultranet/BiologyPages/V/Vision.html
more than you ever wanted: http://webvision.med.utah.edu/
The eye is such a mess :smile:

Guybrush, when we talk about digital ICs, it isn't the voltages that matter (these fluctuate) but the meaning assigned to them. Each digital circuit has two threshold voltage levels, crossing which means a flip of a bit. It's the information that is digital. The voltage can fluctuate above and below the "1" level and is still considered a digital 1. There is also an uncertainty region between the 1/0 thresholds where the digital value is unspecified. To avoid errors from transient fluctuations, digital ICs are usually clocked: values are sampled when the voltages are stable. The clock speeds of CPUs are limited by these uncertainty fluctuations, for example.
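
A sketch of that thresholding in Python, using the classic TTL input thresholds of 0.8 V and 2.0 V (other logic families use different values):

    # Sketch of how a digital input interprets an analog voltage:
    # two thresholds with an uncertainty region between them.
    # 0.8 V / 2.0 V are the classic TTL input thresholds.
    V_IL = 0.8  # at or below this, a valid logic 0
    V_IH = 2.0  # at or above this, a valid logic 1

    def logic_level(voltage):
        if voltage <= V_IL:
            return 0
        if voltage >= V_IH:
            return 1
        return None  # uncertainty region: digital value unspecified

    for v in (0.3, 1.4, 3.1, 4.9):
        print(v, '->', logic_level(v))
    # 0.3 -> 0, 1.4 -> None (unspecified), 3.1 -> 1, 4.9 -> 1
    # (a voltage fluctuating between 3.1 and 4.9 is still a digital '1')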

Digitizing an analog signal means sampling it with a stable clock (time-domain discretization) and measuring the analog values at those instants with A/D converters, which have limited precision and discrete output values. (Note that each "digital transistor" is equivalent to a 1-bit A/D converter.) CCDs can do this, as you said, and they provide a digital stream of data at the output. But the first (analog) cameras had CCDs and sampling, yet didn't convert the samples into digital data; they recorded them on tape in PAL/NTSC format, which is analog.
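
A sketch of the time-domain half of this, sampling an assumed 50 Hz sine at an assumed 1 kHz clock (both values are arbitrary):

    # Sketch of time-domain discretization: a continuous signal is
    # only looked at when the sample clock ticks. The 50 Hz sine and
    # the 1 kHz sample rate are arbitrary illustrative choices.
    import math

    SAMPLE_RATE = 1000.0  # samples per second

    def signal(t):
        """A stand-in continuous analog signal: a 50 Hz sine wave."""
        return math.sin(2 * math.pi * 50.0 * t)

    # The clock turns a continuous function of time into a discrete list;
    # each sample would then go through an A/D converter (amplitude
    # discretization) to give the fully digital stream a camera outputs.
    samples = [signal(n / SAMPLE_RATE) for n in range(20)]
    print([round(s, 2) for s in samples[:5]])  # [0.0, 0.31, 0.59, 0.81, 0.95]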

Digital data is only part of the game. The image itself is already discretized by pixels (the spatial domain); the image is not continuous, it is limited by resolution.
You can view it this way and that, but if there is a limit to the possible resolution (spatial, in the case of a CCD), it can be considered digitized. That's what I meant. Eyes also consist of "pixels", more like analog CCD cameras, transferring pixel data to the brain-"tape" in neural format instead of PAL. Individual pixels offer an analog signal, though. Counting photons is an exaggeration, of course :smile:

PeteGt, 30 fps is approximate; it depends on the individual. 24 fps was picked for cost reasons; 100 years ago it was considered good enough. There are also many tricks that help create the illusion of smoothness in movies.
 

FAQ: Do we see in digital or analog?

1. Do our eyes see in digital or analog?

Our eyes see in analog. Analog vision is the ability to perceive continuous gradations of light and color, as opposed to digital vision, which is based on discrete pixels.

2. Are digital images more accurate than analog images?

It depends on how you define accuracy. Digital images have a fixed resolution and can capture more detail, but analog images can capture a wider range of colors and tones, making them more true-to-life in some cases.

3. Can we change our vision from analog to digital?

No, our eyes are naturally designed to see in analog. However, digital technology such as cameras and screens can mimic analog vision and display images in a digital format.

4. How does digital vs analog vision affect our perception?

Digital vision can create sharper, more precise images, while analog vision may have a softer, more natural appearance. This can affect our perception of depth, color, and overall visual experience.

5. Is there a benefit to seeing in digital or analog?

Both types of vision have their own advantages. Digital vision allows for precise and consistent image reproduction, while analog vision allows for a more natural and immersive visual experience. Ultimately, it depends on the context and purpose of the vision.
