Advances in Machine Intelligence

  • #51
Buzz Bloom
Should say "all possible" moves...
Hi jon:
Do you have a source for this quote saying "all possible"? I corrected the typo to "promising".

Regards,
Buzz
 
  • #52
Do you have a source for this quote saying "all possible"? I corrected the typo to "promising".
No, I just meant that would be the ideal. I don't think it's really possible, as the number of possible moves could be astronomical...
 
  • #53
chroot
Interesting thread!

It is true that researchers had figured out 90% of the pieces of a modern neural net in the 1980s. It is true that most modern deep learning is just the same ol' backpropagation. In some sense, it is true that modern AI is not all that modern.

It is also true that computer hardware has enabled today's AI renaissance, because deep learning requires immense amounts of data. No one in the 1980s had any idea of just how much data would be required. Many researchers in the 1980s gave up on their ideas because they couldn't make them work, even though they were right! They just needed 1,000x or 1,000,000x more data, which wasn't even conceivable at the time.

Big data isn't enough, though. The remaining 10% of the pieces were not at all obvious, but they were utterly necessary for good performance. Some problems, like the exploding/vanishing gradient problem, vexed researchers for a decade. It turns out that it pretty much goes away if you just use ReLU instead of sigmoid activation... and ReLU is actually simpler and much faster to compute!
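You can see why in a few lines of NumPy (my own toy illustration, not from this thread): the sigmoid's derivative is at most 0.25, so the gradient signal shrinks geometrically as it flows back through many layers, while the ReLU's derivative is exactly 1 wherever the unit is active.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy model of backprop through n identical layers: each layer
# multiplies the backward gradient by the activation's derivative
# at pre-activation z (weights ignored for simplicity).
def backprop_gain(activation_grad, n_layers, z):
    gain = 1.0
    for _ in range(n_layers):
        gain *= activation_grad(z)
    return gain

z = 0.5  # a typical pre-activation value
sig_grad = lambda z: sigmoid(z) * (1.0 - sigmoid(z))  # always <= 0.25
relu_grad = lambda z: 1.0 if z > 0 else 0.0           # 1 when active

print(backprop_gain(sig_grad, 20, z))   # vanishingly small
print(backprop_gain(relu_grad, 20, z))  # stays exactly 1.0
```

Twenty sigmoid layers already crush the gradient by more than ten orders of magnitude; the ReLU passes it through untouched (for active units), which is the heart of why the problem "goes away".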

The landscape of AI research today feels a lot like life hunting for better enzymes by evolution. Life struggles along for millennia with a poor enzyme, until some random mutation changes one amino acid and -- BOOM! -- the enzyme is suddenly 10,000x better. Everyone in the AI community feels like just about any little tweak here or there could actually win the lottery and prove to be an incredible breakthrough. So, AI researchers are exploring every little idea like it might be the next big thing. It is very common today to see a 10x or 100x improvement in the speed or quality of an AI algorithm in a single year, solely because of some neato trick that no one expected would work. The field is fast-paced, intellectually curious, and a lot of fun.
 
  • #55
Fantastic work by IBM scientists that I just heard about: Abu Sebastian et al. have managed to reliably collocate computation and memory at the nanometer scale, performing computational tasks in a different way from the usual von Neumann architecture. They do this by exploiting the crystallization dynamics of phase change memory devices and thereby perform in-memory computation (it reminded me of memristor-based architectures, but this is a different approach). This tech should also allow for massively parallel computing systems--super useful for machine learning!

Here's the paper's abstract: "Conventional computers based on the von Neumann architecture perform computation by repeatedly transferring data between their physically separated processing and memory units. As computation becomes increasingly data centric and the scalability limits in terms of performance and power are being reached, alternative computing paradigms with collocated computation and storage are actively being sought. A fascinating such approach is that of computational memory where the physics of nanoscale memory devices are used to perform certain computational tasks within the memory unit in a non-von Neumann manner. We present an experimental demonstration using one million phase change memory devices organized to perform a high-level computational primitive by exploiting the crystallization dynamics. Its result is imprinted in the conductance states of the memory devices. The results of using such a computational memory for processing real-world data sets show that this co-existence of computation and storage at the nanometer scale could enable ultra-dense, low-power, and massively-parallel computing systems."
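To get an intuition for "collocated computation and storage" (my own sketch of the general crossbar idea, not IBM's specific phase-change method): if stored device conductances form a matrix G and you apply a voltage vector V to the rows, Ohm's and Kirchhoff's laws deliver the output currents I = G·V directly at the columns, so the matrix-vector product happens right where the data lives.

```python
import numpy as np

# Hypothetical 4x3 crossbar: each cell's conductance G[i, j]
# stores one matrix entry (units: siemens).
rng = np.random.default_rng(42)
G = rng.uniform(0.0, 1.0, size=(4, 3))

# Voltages applied to the three input lines (units: volts).
V = np.array([0.2, 0.5, 0.1])

# Physics does the multiply-accumulate: current summed on each
# output line is I = G @ V, with no data shuttled to a separate CPU.
I = G @ V
print(I)
```

This analog matrix-vector multiply is the usual motivating example for in-memory computing; the Nature Communications paper goes further, using crystallization dynamics of the devices themselves as the computational primitive.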

Here's the link to the Nature Communications article, where you can get the PDF if you'd like: https://www.nature.com/articles/s41467-017-01481-9
 
