Simple question... do we see in digital or analog?
There's a baby in a cradle. Some person bends down over the cradle, and smiles at the baby. There are 2 possible responses:
0) This is mom.
1) This is not mom.
So, clearly: digital - just jokin'!
Actually, that's an interesting question. Since our eyes detect light with a set of rods and cones, I'd say it's a little of both. But to be accurate, the question becomes: how do rods and cones detect light/photons?
Something in the middle -- more akin to fuzzy logic. Fuzzy logic deals not just with two states (truth and falsehood, one and zero, etc.) but with classes of stimuli (very purple, sorta purple, a little purple, not very purple, etc.). The brain seems to do a lot of work in fuzzy logical terms.
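The fuzzy-logic idea can be sketched in a few lines. This is a minimal illustration, not from any particular fuzzy-logic library; the hue value for "purple" and the fall-off width are made-up assumptions:

```python
def membership_purple(hue):
    """Return a degree of 'purpleness' in [0, 1] for a hue in degrees.
    Pure purple is assumed to sit at 280 degrees, fading linearly
    to zero over +/- 40 degrees (illustrative numbers)."""
    distance = abs(hue - 280)
    return max(0.0, 1.0 - distance / 40.0)

# Classes of stimuli rather than two states:
print(membership_purple(280))  # very purple   -> 1.0
print(membership_purple(300))  # sorta purple  -> 0.5
print(membership_purple(330))  # not purple    -> 0.0
```

The point is that the output is a continuous degree of membership rather than a single true/false bit.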
You can analyze the pieces individually: the rods & cones, the optic nerve fibers that carry messages, the sections of the visual cortex used for edge detection, concavity, closure, and so on -- and the end result is that the brain sees in neither a purely digital nor a purely analog style.
Which is an interesting point. We do not "see" with our eyes, our brain actually constructs the image for us, we "see" with our brains.
But I guess the same can be said for hearing. The hair cells in your cochlea are each tuned to a specific wavelength and transmit to the brain, and they can carry amplitude information as well.
I'm not totally certain it is any of the things we've said. But I still debate with my friend whether it is or not.
So should we debate why we really can see more than 30 frames per second?
We see in analog but the messages are sent digitally?
There's no need to debate the issue of the number of frames per second we can see -- it's at least 1000 per second. Experiments can show us this.
However, "seeing" involves the brain. The brain is highly adept at recognizing certain stimuli unfold in millisecond scale (for example, a strobe), but not others. I'm not entirely sure, but I think it's a bit incorrect to think of the eye-brain system as taking "snapshots" at all. The flow of data from eye -> visual cortex -> conscious white matter may be more or less continuous.
I would believe 1000 fps myself. I didn't think it was that high, but I'm sure it is. I remember a study done by the Navy where they showed things for 1/200th of a second, and people could see the objects and recognize them.
Hmmm, would the constant "seeing" that you say we do (instead of snapshots) be analog in itself?
Eyes are digital -- CCD-camera-like sampling. Even more so, aren't the eye's sensors excited by the particle aspect of photons? That makes eyes even more digital.
But the messages to the brain are indeed more like analog signals per pixel.
In regards to "seeing" itself, the brain's focus of attention has quite an impact. Things in focus get more processing, and that reduces the "fps". Our peripheral vision can detect small changes at up to 1000 fps, but those are not clearly perceived images.
30 fps is the rate at which the brain is able to construct a fluid movie, not the rate at which it can detect changes. Still, the fps in focus is variable and depends on the complexity of the image.
If 30 fps is what we consider gives us fluid movie sight, then why is the movie standard 24 fps?
If you consider a rod or cone responding to a photon of light and thus being digital, then everything should be digital. Most analog circuitry is just the excitement of an electron to another state, and then another, and another.
Just stating questions, not harassing, believe me :-)
Do you have any references to support your assertion that eyes operate analogously to CCDs (i.e. an array of discrete pixels)? I was under the impression that they were far, far different.
As far as I know, digital circuits operate with two levels of voltage while analog circuits operate with a continuous range of voltage. Even modern ICs are difficult to categorize as purely digital or purely analog; they are more of a mix between the two.
Not even the CCD is entirely digital, since you have to sample the value of the charge in its capacitors, and you have to use some converters for that. So I'd say the answer is somewhere in the middle.
In a sense you are correct, Guybrush.
There is no such thing as a "purely digital" circuit. The transistors that go into a digital circuit are inherently analog devices, but they are designed so that their analog character is negligible in the circuit. Essentially, if the transistor is fast enough, you can just assume it's "purely digital."
And you're also correct about CCDs -- the sense amps used to amplify the charge contained in each pixel are analog circuits. Shortly afterward, the analog voltage (corresponding to the amount of charge in a bucket) is converted to a digital word by an analog-to-digital converter, a mixed-mode device. The digital words are then sent off-chip.
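That conversion step is easy to sketch: the continuous voltage from the sense amp is mapped to the nearest of 2^n discrete codes. The full-scale voltage (3.3 V) and bit depth (8) below are illustrative assumptions, not from any particular CCD datasheet:

```python
def adc(voltage, full_scale=3.3, bits=8):
    """Quantize an analog voltage into an n-bit digital word,
    clamping to the valid code range."""
    levels = 2 ** bits
    code = round(voltage / full_scale * (levels - 1))
    return max(0, min(levels - 1, code))  # clamp to [0, 2^n - 1]

print(adc(0.0))   # bottom of the range -> 0
print(adc(3.3))   # full scale          -> 255
print(adc(1.65))  # mid-scale, roughly half of 255
```

Every voltage in between gets rounded to one of only 256 codes -- that rounding is exactly where the signal stops being analog.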
You didn't take it literally I hope.
with ccd image ;) http://users.rcn.com/jkimball.ma.ultranet/BiologyPages/V/Vision.html
more than you ever wanted: http://webvision.med.utah.edu/
The eye is such a mess.
Guybrush, when we talk about digital ICs, it isn't the voltages that matter -- those fluctuate -- but the meaning assigned to them. Each digital circuit has two threshold voltage levels; crossing them means a bit flip. It's the information that is digital. The voltage can fluctuate above and below the "1" level and still be considered a digital 1. There is also an uncertainty region between the 1/0 thresholds where the digital value is unspecified. To avoid errors from transient fluctuations, digital ICs are usually clocked: values are sampled when the voltages are stable. CPU clocks are limited by these uncertainty fluctuations, for example.
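The two-threshold idea can be sketched directly. The threshold values below (0.8 V and 2.0 V, TTL-like) are illustrative assumptions, and holding the previous value in the uncertainty region stands in loosely for what a clocked latch does:

```python
V_LOW, V_HIGH = 0.8, 2.0  # illustrative TTL-like thresholds

def read_bit(voltage, previous=0):
    """Interpret a fluctuating analog voltage as a digital value.
    Between the thresholds the value is unspecified; we keep the
    previously sampled bit."""
    if voltage >= V_HIGH:
        return 1
    if voltage <= V_LOW:
        return 0
    return previous  # uncertainty region: value unspecified

# Voltage can fluctuate well above the "1" threshold and still read as 1:
print(read_bit(3.1))  # -> 1
print(read_bit(2.4))  # -> 1
print(read_bit(0.3))  # -> 0
print(read_bit(1.4, previous=1))  # in the uncertainty region -> holds 1
```

The information is in which side of the thresholds the voltage sits, not in the exact voltage -- which is the sense in which the circuit is "digital".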
Digitizing an analog signal means sampling it with a stable clock (time-domain discretization) and measuring the analog values at those instants with A/D converters (which have limited precision and discrete output values; note that each "digital transistor" is equivalent to a 1-bit A/D converter). CCDs can do this, as you said, and they provide a digital stream of data at the output. But the first (analog) cameras had a CCD and sampling, yet didn't convert the samples into digital data; they recorded them on tape in PAL/NTSC format, which is analog.
Digital data is only part of the game. The image itself is already discretized by pixels (spatial domain); the image is not continuous, it is limited by resolution.
You can view it this way or that, but if there is a limit to the possible resolution (spatial, in the case of a CCD), it can be considered digitized. That's what I meant. Eyes also consist of "pixels", more like analog CCD cameras, transferring pixel data to the brain "tape" in a neural format instead of PAL. Individual pixels still provide an analog signal, though. Counting photons is an exaggeration, of course.
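Spatial discretization is worth a sketch of its own: a continuous scene sampled onto a finite pixel grid, the way both a CCD and the retinal mosaic limit resolution, while each pixel's value stays continuous (analog). The scene function and grid size here are made up for illustration:

```python
import math

def scene(x, y):
    """A continuous 'scene': brightness varies smoothly in [0, 1]."""
    return 0.5 + 0.5 * math.sin(x) * math.cos(y)

def sample(width, height, extent=math.pi):
    """Sample the continuous scene at width x height pixel centres
    (spatial-domain discretization)."""
    return [[scene(extent * i / width, extent * j / height)
             for i in range(width)]
            for j in range(height)]

pixels = sample(4, 4)
print(len(pixels), len(pixels[0]))  # 4 4 -- resolution is now finite
# But each pixel value is still a continuous quantity, like the
# signal from an individual rod or cone:
print(all(0.0 <= v <= 1.0 for row in pixels for v in row))  # True
```

So the grid is "digital" in space even though each sample remains analog in amplitude -- exactly the mixed picture described above.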
PeteGt, 30 fps is approximate; it depends on the individual. 24 fps was picked for cost reasons -- 100 years ago it was considered good enough. There are also many tricks that help create the illusion of smoothness in movies.