Deep Learning, learning completion, silicon chip implementation

  • Thread starter kris kaczmarczyk
In summary, the conversation discussed the possibility of using "deep learning" and back propagation to create static coefficients for thousands of nodes, such as in a game of chess. It was questioned whether these coefficients could be hard-wired onto hardware for tasks like driving cars or translating languages. The article mentioned the world's largest computer chip with two trillion transistors, but it was noted that even this would not be enough to handle the vast number of board positions in chess.
  • #1
kris kaczmarczyk
TL;DR Summary
Back propagation, the learning process, and the coefficients at the hidden-layer nodes. Can they be hard-coded in silicon?
After a lengthy process of "deep learning" and back propagation, would we end up with static coefficients for the thousands of nodes? For example, for playing a game of chess, would that state be good to hard-wire onto a silicon chip, so that we would have a perfect chess player?

Or are the coefficients always changing? In other words, can we stop learning at a certain error percentage, and then have a fixed set of numbers we can hard-code onto the hardware (driving cars, translating, painting)?

https://www.popularmechanics.com/technology/design/a28816626/worlds-largest-computer-chip/
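The "train, then freeze and hard-wire" idea in the question can be sketched in a few lines: train until the error is small, then treat the learned coefficients as constants that never change again. A minimal, hypothetical example (a single linear neuron in plain Python, not any real chip toolchain):

```python
# Hypothetical sketch: train one linear neuron by gradient descent,
# then freeze its coefficients for inference only.
def train(samples, epochs=200, lr=0.1):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, target in samples:
            pred = w * x + b
            err = pred - target      # error signal, as in back propagation
            w -= lr * err * x        # weight update
            b -= lr * err
    return w, b                      # static coefficients from here on

# Learn y = 2x + 1 from a few noiseless points.
samples = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = train(samples)

# "Hard-wired" phase: w and b are now constants; inference is pure
# arithmetic, which is exactly what could be baked into fixed silicon.
def infer(x):
    return w * x + b
```

So yes, nothing forces the coefficients to keep changing: once training stops, inference uses only fixed multiplies and adds. Whether the frozen network plays *perfect* chess is a separate question; it is only as good as the error level you stopped at.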
 
  • #2
https://en.wikipedia.org/wiki/Solving_chess#Predictions_on_when/if_chess_will_be_solved

Two trillion (2e12; since the article is inch-based, it must come from one of those countries) transistors still isn't much if you have to deal with roughly 1e43 board positions...

But it'll be a nice step forward

You would need a lot of those chips! I wonder if anyone can make a Shannon-like guess?

Note that one layer of chips covering the entire surface of the Earth only gets you about 1e16 chips.
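The back-of-the-envelope numbers above can be checked directly (assuming, very generously, one transistor per board position):

```python
# Rough check of the figures in the post; one transistor per position
# is already wildly optimistic as a storage model.
transistors_per_chip = 2e12       # the "world's largest chip"
board_positions = 1e43            # rough upper bound for chess positions

chips_needed = board_positions / transistors_per_chip
print(f"chips needed: {chips_needed:.0e}")        # ~5e30 chips

chips_per_earth_layer = 1e16      # one chip layer covering Earth's surface
layers = chips_needed / chips_per_earth_layer
print(f"Earth-covering layers: {layers:.0e}")     # ~5e14 layers
```

So even wallpapering the planet with these chips leaves you short by some fourteen orders of magnitude.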
 

1. What is Deep Learning?

Deep Learning is a subset of machine learning that involves training artificial neural networks to learn and make predictions from data. It is inspired by the structure and function of the human brain and has shown great success in various fields such as computer vision, natural language processing, and robotics.
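A minimal sketch of the forward pass such a network computes, using made-up weights and no framework (one hidden layer with a ReLU activation, the building block deep networks stack many times):

```python
# Hypothetical tiny network: 2 inputs -> 2 hidden ReLU units -> 1 output.
def relu(v):
    return [max(0.0, x) for x in v]

def dense(inputs, weights, biases):
    # weights[i] is the weight vector feeding output neuron i
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

x = [1.0, 2.0]
hidden = relu(dense(x, [[0.5, -0.3], [0.8, 0.1]], [0.0, -0.5]))
output = dense(hidden, [[1.0, 1.0]], [0.0])
print(output)   # [0.5]
```

Training by back propagation only adjusts the numbers in those weight lists; the computation itself stays this simple.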

2. What is Learning Completion?

Learning Completion is a technique used in Deep Learning to train neural networks with missing or incomplete data. It involves predicting the missing values based on the available data, and using them to update the network's parameters. This allows for more efficient and accurate learning, especially in cases where data is scarce or incomplete.
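Note that "Learning Completion" is not a standard textbook term; the idea described, filling in missing values before or during training, is usually called imputation. It can be sketched with the simplest possible imputer (a per-column mean; a real system would *learn* the predictions instead):

```python
# Minimal sketch of the idea above: replace missing entries (None)
# with a prediction, here just the mean of the observed column values.
def impute_means(rows):
    cols = list(zip(*rows))
    means = [sum(v for v in col if v is not None) /
             max(1, sum(v is not None for v in col)) for col in cols]
    return [[m if v is None else v for v, m in zip(row, means)]
            for row in rows]

data = [[1.0, 2.0], [None, 4.0], [3.0, None]]
print(impute_means(data))   # [[1.0, 2.0], [2.0, 4.0], [3.0, 3.0]]
```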

3. How is Deep Learning implemented on silicon chips?

Deep Learning can be implemented on silicon chips through the use of specialized hardware known as Neural Processing Units (NPUs). These chips are designed specifically for executing deep learning algorithms, and their parallel architecture allows for faster and more efficient processing of neural networks.
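One reason fixed hardware works well here is that inference reduces to huge numbers of multiply-accumulate operations, often performed on weights quantized to small integers. A hypothetical sketch of that arithmetic (8-bit symmetric quantization; real NPU pipelines are considerably more involved):

```python
# Hypothetical sketch: quantize trained float weights to 8-bit integers,
# then do an integer dot product, the kind of MAC operation NPUs execute.
def quantize(weights, scale=127):
    m = max(abs(w) for w in weights)
    q = [round(w / m * scale) for w in weights]
    return q, m / scale      # integer weights plus a dequantization factor

w = [0.5, -0.25, 1.0]
q, factor = quantize(w)

x = [2, 4, 6]                # toy integer activations
acc = sum(qi * xi for qi, xi in zip(q, x))   # pure integer accumulate
result = acc * factor        # dequantize; close to the float dot product
print(result)
```

The float dot product here is 0.5*2 - 0.25*4 + 1.0*6 = 6.0, and the integer pipeline recovers it; the small rounding error introduced by quantization is the usual price for cheap fixed hardware.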

4. What are some applications of Deep Learning?

Deep Learning has a wide range of applications, including image and speech recognition, natural language processing, autonomous vehicles, and predictive analytics. It is also being used in healthcare for disease diagnosis and drug discovery, and in finance for fraud detection and stock market forecasting.

5. What are the limitations of Deep Learning?

Although Deep Learning has shown great success in many areas, it also has some limitations. One of the main challenges is the need for large amounts of labeled data for training, which can be time-consuming and expensive. Additionally, deep learning models can be complex and difficult to interpret, making it challenging to understand the reasoning behind their predictions.
