Why backpropagation dominates neural networks instead of interpolation

SUMMARY

The discussion centers on the advantages of backpropagation in neural networks compared to classical interpolation and extrapolation methods. Participants clarify that while interpolation relies on known equations to minimize error, backpropagation allows neural networks to learn complex functions through the adjustment of weights and biases across multiple layers. This flexibility enables neural networks to model sophisticated decision processes, as the intermediate layers can interpret data in ways that traditional methods cannot. The consensus is that backpropagation's ability to optimize weights for hidden layers is what distinguishes it from simple interpolation techniques.

PREREQUISITES
  • Understanding of neural network architecture and components, including layers, weights, and biases.
  • Familiarity with backpropagation algorithm and its role in training neural networks.
  • Knowledge of interpolation and extrapolation methods in data fitting.
  • Basic concepts of machine learning and model training.
NEXT STEPS
  • Study the backpropagation algorithm in detail, focusing on weight adjustment techniques.
  • Explore neural network architectures, particularly the role of hidden layers in decision-making.
  • Investigate the differences between supervised learning and interpolation methods.
  • Learn about advanced optimization techniques used in training neural networks, such as gradient descent.
USEFUL FOR

Machine learning practitioners, data scientists, and anyone interested in understanding the mechanics of neural networks and the significance of backpropagation in model training.

jonjacson
TL;DR
I don't understand what the advantages of backpropagation in a neural network are versus classical interpolation/extrapolation methods.
Hi guys,

I was learning machine learning and I found something a bit confusing.
When I studied physics I learned the method of least squares for finding the best parameters for given data; in that case we assume we know the equation and just minimize the error. So for a straight-line model we compute the best slope and intercept.
With machine learning we don't know the underlying equation or model, so we use a general method to "fit" the real data as well as we can.
But isn't that what we do with interpolation/extrapolation?
What is so special about a neural network and the backpropagation method that we can't achieve with interpolation?
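To make the least-squares setup above concrete, here is a minimal sketch (my own illustrative code, not from the thread) of fitting a known straight-line model to data, assuming NumPy:

```python
import numpy as np

# Known model: a straight line y = m*x + b. Least squares finds the
# best slope and intercept for the given data, as described above.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0  # noiseless data generated from the line y = 2x + 1

m, b = np.polyfit(x, y, deg=1)  # minimizes the squared error
print(m, b)  # recovers slope ~2 and intercept ~1
```

The key point of the question: here the functional form (a line) is fixed in advance, and only two parameters are fit.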
 
Technology news on Phys.org
jonjacson said:
TL;DR Summary: I don't understand what the advantages of backpropagation in a neural network are versus classical interpolation/extrapolation methods.

With machine learning we don't know the underlying equation or model
I don't think this is correct. Each artificial neuron implements a definite function with parameters (weights and bias). The network's overall function is built by composing the neurons' functions, and the parameters are determined by minimizing a definite target function.
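A minimal sketch of this point (my own illustrative code, not from the thread): each neuron is a definite parametric function, and the network is simply a composition of such functions.

```python
import numpy as np

def neuron(x, w, b):
    """One artificial neuron: a definite function of x with parameters w, b."""
    return np.tanh(np.dot(w, x) + b)  # tanh is one common activation choice

def network(x, params):
    """A two-layer network: the composition of the neurons' functions."""
    w1, b1, w2, b2 = params
    hidden = np.array([neuron(x, w, b) for w, b in zip(w1, b1)])
    return np.dot(w2, hidden) + b2  # linear output neuron

rng = np.random.default_rng(0)
params = (rng.normal(size=(3, 2)), rng.normal(size=3),  # 3 hidden neurons, 2 inputs
          rng.normal(size=3), rng.normal())
y = network(np.array([0.5, -1.0]), params)
print(y)  # a definite number, fully determined by the parameters
```

Training means choosing the parameters so that this composed function minimizes a target (loss) function over the data.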
 
Hill said:
I don't think this is correct. Each artificial neuron implements a definite function with parameters (weights and bias). The network's overall function is built by composing the neurons' functions, and the parameters are determined by minimizing a definite target function.
The weights are used to compute how much of one neuron's output enters the next ones, but that is not a function.
 
jonjacson said:
The weights are used to compute how much of one neuron's output enters the next ones, but that is not a function.
Here is one way to define the function (sorry, I'd need to dig to find the source of this paper):

[Attached images showing the paper's equations defining the network's function; not reproduced here.]
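In case the attachments don't render, a standard way to write the function a feedforward network computes (my notation, which may differ from the paper's) is:

```latex
a^{(0)} = x, \qquad
a^{(l)} = \sigma\!\left(W^{(l)} a^{(l-1)} + b^{(l)}\right), \quad l = 1, \dots, L, \qquad
f(x) = a^{(L)}
```

Here $W^{(l)}$ and $b^{(l)}$ are the weights and biases of layer $l$, and $\sigma$ is the activation function; $f$ is the definite, parametric function the network implements.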
 
jonjacson said:
With machine learning we don't know the underlying equation or model so we use a general method to "fit" as best as we can the real data.
But, isn't that what we do with interpolation/extrapolation?
What is it so special about a neural network and the backpropagation method that we can't achieve with interpolation?
I am not an expert, but here is my understanding:
Backpropagation in neural networks develops weights for the intermediate layers, which lets the network model a sophisticated decision process. Those intermediate layers are not given data the way interpolation points are; they are free for interpretation. Sometimes they are obscure, and other times we can imagine a meaning for them. The only hard training data are the inputs and outputs at the first and last layers.
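A minimal sketch of this idea (my own illustrative code, not from the thread): only inputs and targets are supplied, yet backpropagation adjusts the hidden layer's weights, even though no "correct" hidden values are ever given.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
Y = np.sin(3 * X)                        # training data: inputs and outputs only

H = 16                                   # hidden layer size
W1 = rng.normal(scale=0.5, size=(1, H))  # hidden weights: learned, never given
b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, 1))
b2 = np.zeros(1)
lr = 0.1

for _ in range(3000):
    # forward pass through the network
    h = np.tanh(X @ W1 + b1)             # hidden activations
    pred = h @ W2 + b2
    err = pred - Y                       # error signal at the output layer

    # backward pass: the chain rule propagates the error to every weight
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)       # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)

    # gradient-descent updates
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2))
print(mse)  # typically much smaller than the variance of Y after training
```

Nothing in the data says what the hidden units should compute; backpropagation finds hidden weights purely by pushing the output error backwards, which is exactly the freedom described above.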
 
