This is not about seeing an electron per se, but about the notion that seeing something with our eyes is the end-all requirement for the validity of anything. I will show that the human eye, as a light detector, is NOT a very good detector at all in many respects, and thus, using it as the standard detector to validate anything is utterly irrational.
The motivation for this is that I often see ignorant statements on PF that either question or dismiss something just because we can’t “see” it. A prime example that often pops up is the claim that we “haven’t seen an electron”.
I could go on and on, but you get the idea. So now, I will show that “seeing” is over-rated!
Of course, there are several ways to attack such stupid (yes, STUPID) arguments. The first is the question of what we mean by “seeing”. Most people simply mean seeing something with human eyes. But what exactly does that mean? If these people were to think carefully, it means a series of events that must occur: (i) visible light from some source hits an object; (ii) light from that object travels to our eyes; (iii) our eyes then transmit electrical impulses to our brain; and (iv) we detect that object visually. That, my friends, is what is meant by seeing with our own eyes.
Next, by the above description, it is clear that our eyes can only detect electromagnetic radiation, and even then, only within the visible spectrum, which isn’t very much. Thus, if something either does not emit EM radiation, or if the radiation is outside of the visible spectrum, we can’t see it! Let’s go back to our friend the electron. It is a charged particle, and our eye cannot “see” it even if it hits our eyeball! But can we still see it? Sure we can! Enter the cloud chamber! When an electron, especially a high-energy one, moves through a cloud chamber, it ionizes some of the air/gas/water vapor molecules. Each ion creates a nucleation site for water vapor condensation, leaving a cloud trail in the chamber. There, you have seen an electron.

One could also argue that our eyes are not the only “detectors” around. We can also use our other senses. We can’t see the wind, but we can hear and feel the moving air. We can’t see heat/IR, but we can certainly feel it on our skin. Our eyes are only ONE of the “detectors” that came with our bodies.
And speaking of the human eye as a detector, anyone who has worked with detection instruments can tell you that the eye is a very bad detector in many respects. Sure, it has very high spatial resolution, but man, it sucks everywhere else. For example, look at this figure, which shows the response sensitivity of the human eye over the range of wavelengths it can detect.
Compared to other devices, the human eye has two very clear shortcomings: (i) the range of wavelengths it responds to is extremely small; and (ii) its sensitivity (i.e. quantum efficiency, or QE) is quite low, with a peak QE of ~1% at around 550 nm. What this means is that out of 100 photons that come in, it detects, on average, only 1. Compare the range and QE of a Vidicon or a CCD with those of the eye, and our eye is a very poor light detector! And this is what some people are using as the sole criterion of what’s real and what isn’t? Is this rational?
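As a quick back-of-the-envelope illustration, here is a small Python sketch of what a 1% QE means for photon counting. The ~90% CCD figure is an assumed, illustrative value for a modern device, not a number taken from the figure above:

```python
import random

def detected_photons(n_photons, qe, rng=random.Random(42)):
    """Count how many of n_photons a detector with quantum efficiency qe registers."""
    return sum(1 for _ in range(n_photons) if rng.random() < qe)

n = 100
eye_hits = detected_photons(n, 0.01)  # human eye: peak QE ~1% near 550 nm
ccd_hits = detected_photons(n, 0.90)  # modern CCD: QE ~90% (illustrative assumption)
print(f"Of {n} photons: eye detects ~{eye_hits}, CCD detects ~{ccd_hits}")
```

Run it a few times with different seeds and the eye typically registers a photon or two out of a hundred, while the CCD catches nearly all of them.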
Next, we will deal with the response time, which determines the time resolution, of the human eye. We all know that when we go see a movie, it is nothing more than a series of still frames, moving past us fast enough that we do not see them as distinct, but rather perceive the image as continuous. Standard film (at least until all the recent advancements in movie projection) ran at 24 frames per second (FPS), which translates to about 0.04 seconds per frame. We also know that the human visual system holds an image for about 0.02 seconds, meaning that events arriving in our visual system less than 0.02 seconds apart will not be perceived as distinct. So 0.02–0.04 seconds is roughly the time resolution of the human eye.
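The arithmetic above can be checked in a couple of lines; the 0.02 s “hold” time used below is just the rough figure quoted in the text:

```python
fps = 24
frame_time = 1 / fps  # standard film rate: ~0.042 s per frame
persistence = 0.02    # rough image-retention time of human vision, in seconds

def eye_resolves(dt_seconds, resolution=persistence):
    """True if two events dt_seconds apart are separated enough for the eye to see as distinct."""
    return dt_seconds >= resolution

print(f"{frame_time:.4f} s per frame")  # 0.0417
print(eye_resolves(1e-9))  # nanosecond-scale separation: False, the eye merges them
print(eye_resolves(0.1))   # a tenth of a second apart: True, perceived as distinct
```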
Now, compare this to other devices. I’ve listed before some typical photocathodes used in accelerators. Note the time responses for the various types of photocathodes. The worst of these are in nanoseconds, which is still orders of magnitude shorter than the time resolution of the human eye! One example is GaAs, which is a common photocathode used in both accelerators and photodetectors. On Pg. 25 of this presentation, one can see the measurement of the time response. The full-width-at-half-maximum of this photocathode’s response is of the order of picoseconds!
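To put these numbers side by side, here is a short Python comparison. The values are rounded order-of-magnitude figures taken from the text above, not measured specs for any particular device:

```python
import math

# Illustrative time resolutions in seconds (rounded from the figures quoted above)
resolutions = {
    "human eye": 2e-2,           # ~0.02 s persistence of vision
    "slow photocathode": 1e-9,   # worst photocathodes respond in nanoseconds
    "GaAs photocathode": 1e-12,  # FWHM response of order picoseconds
}

eye = resolutions["human eye"]
for name, t in resolutions.items():
    # how many orders of magnitude faster than the eye this detector is
    factor = math.log10(eye / t)
    print(f"{name:20s} {t:.0e} s  ({factor:.1f} orders of magnitude vs. the eye)")
```

Even the “slow” nanosecond photocathode beats the eye by more than seven orders of magnitude, and GaAs by more than ten.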
So the human eye is not only a bad detector in terms of its wavelength range and its sensitivity, it is also a very SLOW detector that can’t separate a series of events occurring less than about 0.02 seconds apart!
As with many things that a lot of people spew without thinking, debunking such claims is often quite simple IF one has a little bit of knowledge and the ability to analyze the situation. Analyze what is meant by “seeing”, and then analyze the “detector” that is being used as the criterion. And apply the same technique to the pile of manure that one often hears in the media from politicians, etc., assuming you have such patience. If you are using “seeing” as your sole criterion for accepting the validity of something, then you need to seriously examine this “detector” that you hold in such high regard, because it is a very poor detector!
Related: See what an atom looks like
Accelerator physics, photocathodes, field-enhancement, tunneling spectroscopy, superconductivity