The first brain-reading technology was developed way back in the 1970s.
It began with the discovery that sharks were digging up buried telephone lines and attacking them. It was then that researchers realized that some sharks can sense certain electromagnetic signals, and that they hunt schools of fish by their EM emissions.
Humans broadcast their thoughts via EM to a limited extent as well.
By the '80s (I think) they had a little program with a skier on a screen avoiding oncoming obstacles. You could place two fingers on a pad and think left or right to sorta control the skier.
In recent years, declassified programs have allowed pilots to control aircraft simulations via a headband, simply by willing the plane to move. These are presented as still only moderately successful, but I certainly hope the US military doesn't just tip its hand on every new technology...
And then there's the next step in that field: injecting EM into the brain for it to interpret. I know that back in the early '80s they had already inserted electrodes into the visual cortex of a blind man and hooked him up to a video camera pointed at a simple black-and-white screen, which flashed representations of Braille letters. Eventually, once his brain adapted to the new stimuli, the man was able to make out some of the letters...
Please excuse me while I present some interesting conjecture: Is it all that much of a leap to suspect that humans, who already have highly sensitive EM-radiation-interpreting devices (eyes), might have evolved some small ability to sense and interpret other frequencies of the EM spectrum before they evolved speech? (I've always wondered what sinuses were good for.) A sort of vestigial ability, largely abolished via witch trials? LOL. Perhaps some other animals have developed it further. It lends some alternative thoughts to the behavior of fish moving as a school and birds as a flock.
"While people's attention switched between the two images, the researchers used fMRI (functional Magnetic Resonance Imaging) brain scanning to monitor activity in the visual cortex.
It was found that focusing on the red or the blue patterns led to specific, and noticeably different, patterns of brain activity.
The fMRI scans could reliably be used to predict which of the images the volunteer was looking at, the researchers found. "
Telling the difference between a subject looking at one thing or another by brain scan is interesting, but hardly constitutes "reading their thoughts". I can tell the same thing about people without a brain scanner by watching their eyes. I find this story headline to be sensational to the point of being misleading.
I imagine that's why the term was placed in scare quotes. It could be conceived as a kind of thought reading if one took a suitably wide meaning of the word 'thought.' If they could discern the same sort of things they do in these studies about an internal visual image you were imagining (which seems quite possible to me, as such imagining would presumably activate many of the same brain areas as 'actually' seeing), would that make you any more amenable to the term 'reading thoughts'?
In any case, it might be that we can eventually do sophisticated things like reading thoughts in terms of, say, decoding the contents of one's inner speech, with an approach similar to the one used in these studies. For now they are just taking baby steps.
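Since the studies in question decode which stimulus a subject is attending to from patterns of brain activity, here is a toy sketch of the general idea (multivoxel pattern analysis) on simulated data. Everything in it — the voxel count, noise levels, and the nearest-centroid classifier — is an illustrative assumption of mine, not the actual method or data from the study:

```python
import numpy as np

# Toy sketch of fMRI pattern decoding on simulated data.
# Assumption: each attention condition ("red" vs "blue") evokes a
# slightly different mean activity pattern across voxels, and single
# trials are that pattern plus noise.
rng = np.random.default_rng(0)
n_voxels = 50

pattern_red = rng.normal(0.0, 1.0, n_voxels)
pattern_blue = rng.normal(0.0, 1.0, n_voxels)

def simulate_trials(pattern, n_trials, noise=1.0):
    """Simulate noisy single-trial voxel patterns for one condition."""
    return pattern + rng.normal(0.0, noise, (n_trials, n_voxels))

train_red = simulate_trials(pattern_red, 20)
train_blue = simulate_trials(pattern_blue, 20)
test_red = simulate_trials(pattern_red, 10)
test_blue = simulate_trials(pattern_blue, 10)

# Nearest-centroid "decoder": label a test trial by whichever
# training-set mean pattern it lies closer to.
centroid_red = train_red.mean(axis=0)
centroid_blue = train_blue.mean(axis=0)

def decode(trial):
    d_red = np.linalg.norm(trial - centroid_red)
    d_blue = np.linalg.norm(trial - centroid_blue)
    return "red" if d_red < d_blue else "blue"

correct = sum(decode(t) == "red" for t in test_red)
correct += sum(decode(t) == "blue" for t in test_blue)
accuracy = correct / 20
print(f"decoding accuracy: {accuracy:.2f}")
```

With well-separated condition patterns like these, even this crude classifier decodes the attended condition well above chance, which is the sense in which the researchers could "predict which of the images the volunteer was looking at."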
Why be so broad and imprecise?
This sentence of the article is an outright lie:
What they did was establish that a scan of a person's brain shows certain features when they view one of two visual images, and that the pattern changes when they switch their attention between the two.
No. It would be quite remarkable if they figured out how to do this, but a mental image is fundamentally different from the interior verbal monolog. They happen in different parts of the brain and involve different circuits.
The term "reading thoughts" or "thought reading" is understood, I believe, to mean the ability to "hear" someone's internal monolog as if they were speaking it aloud. What's important in this concept is that you hear not only the words but the tone of voice so that you understand the emotional valence the person places on the words.
It might happen if there is some remarkable breakthrough that leads in that direction, but what these researchers have done isn't it. It's too rudimentary. We should expect and predict that the scans would differ according to the type of pattern the subject is looking at, because we already know (from EEGs) that brain activity changes according to the type of thing the brain is doing, and I would have been disappointed if they'd been unable to detect a difference.
We might just as well call it "thought reading" when we detect the difference between alpha and beta waves with EEGs, or when we see someone's eyes move from one object to another, for that matter. That's how overblown this claim is. When your baby takes its first step, you don't announce that it has won the Boston Marathon, because, really, it might not happen.