Why backpropagation dominates neural networks instead of interpolation


Discussion Overview

The discussion centers on the differences between backpropagation in neural networks and classical interpolation/extrapolation methods. Participants explore the nature of neural networks, the role of weights and biases, and the implications of not knowing the underlying model in machine learning.

Discussion Character

  • Exploratory
  • Debate/contested
  • Conceptual clarification

Main Points Raised

  • One participant expresses confusion about the advantages of backpropagation in neural networks compared to interpolation/extrapolation methods.
  • Another participant argues that each artificial neuron implements a definite function with parameters that need to be determined, suggesting that the claim about not knowing the underlying equation is incorrect.
  • The original poster responds that the weights determine how much of one neuron's output feeds into the next, and argues that this does not constitute a function.
  • One participant proposes that backpropagation allows for the development of weights for intermediate layers that model complex decision processes, contrasting this with interpolation, which does not provide such flexibility.

Areas of Agreement / Disagreement

Participants express differing views on the nature of neural networks and the role of backpropagation versus interpolation. There is no consensus on the advantages of backpropagation or the characterization of neural network functions.

Contextual Notes

Some participants highlight the ambiguity in defining functions within neural networks and the implications of using weights and biases without a clear underlying model. The discussion reflects varying interpretations of the concepts involved.

jonjacson
TL;DR
I don't understand the advantages of backpropagation in a neural network compared with classical interpolation/extrapolation methods.
Hi guys,

I was learning machine learning and I found something a bit confusing.
When I studied physics I learned the method of least squares for finding the best parameters for given data. In that case we assume we know the equation and just minimize the error: if the model is a straight line, we compute the best slope and intercept.
With machine learning we don't know the underlying equation or model, so we use a general method to "fit" the real data as well as we can.
But isn't that what we do with interpolation/extrapolation?
What is so special about a neural network and the backpropagation method that we can't achieve with interpolation?
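To make the least-squares case concrete, here is a minimal sketch (NumPy, with toy data invented for illustration) of fitting the best slope and intercept of a straight-line model:

```python
import numpy as np

# Toy data invented for illustration: roughly y = 2x + 1 plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=x.size)

# Least squares with a known model y = m*x + c:
# build the design matrix [x, 1] and solve for (m, c) directly.
A = np.column_stack([x, np.ones_like(x)])
(m, c), *_ = np.linalg.lstsq(A, y, rcond=None)

print(f"best slope m = {m:.3f}, best intercept c = {c:.3f}")
```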
 
jonjacson said:
TL;DR Summary: I don't understand the advantages of backpropagation in a neural network compared with classical interpolation/extrapolation methods.

With machine learning we don't know the underlying equation or model
I don't think this is correct. Each artificial neuron implements a definite function with parameters (weights and a bias); those parameters are determined by minimizing a definite target function, which is built by composing the functions of the neurons.
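To put the point in code, here is a minimal sketch (parameters made up for illustration): a single artificial neuron really is a definite function of its input once its weights and bias are fixed.

```python
import numpy as np

def neuron(x, w, b):
    """A single artificial neuron: a definite function of the input x,
    parameterized by weights w and bias b (sigmoid activation assumed)."""
    z = np.dot(w, x) + b              # weighted sum of inputs plus bias
    return 1.0 / (1.0 + np.exp(-z))   # nonlinearity applied to that sum

# Arbitrary, made-up parameters: once w and b are fixed, the output is
# fully determined by x -- i.e. the neuron is a function of x.
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.1, 0.4, -0.3])
b = 0.2
print(neuron(x, w, b))
```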
 
Hill said:
I don't think this is correct. Each artificial neuron implements a definite function with parameters (weights and a bias); those parameters are determined by minimizing a definite target function, which is built by composing the functions of the neurons.
The weights are used to compute how much of an output from one neuron enters the next ones, but that is not a function.
 
jonjacson said:
The weights are used to compute how much of an output from one neuron enters the next ones, but that is not a function.
Here is one way to define the function (sorry, I'd need to dig to find the source of this paper):

[Attached: a series of images showing the paper's equations that define the network's function.]
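For reference, a standard textbook way to write such a definition (a sketch in common notation; not necessarily the formulation in the attached excerpts) is:

$$ a^{(0)} = x, \qquad a^{(l)} = \sigma\!\left(W^{(l)} a^{(l-1)} + b^{(l)}\right), \quad l = 1, \dots, L, $$
$$ f_\theta(x) = a^{(L)}, \qquad \theta = \{ W^{(l)}, b^{(l)} \}_{l=1}^{L}, $$

and training chooses ##\theta## to minimize a target (loss) function such as
$$ \mathcal{L}(\theta) = \frac{1}{N} \sum_{i=1}^{N} \ell\!\left( f_\theta(x_i),\, y_i \right). $$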
 
jonjacson said:
With machine learning we don't know the underlying equation or model, so we use a general method to "fit" the real data as well as we can.
But isn't that what we do with interpolation/extrapolation?
What is so special about a neural network and the backpropagation method that we can't achieve with interpolation?
I am not an expert, but here is my understanding:
Backpropagation in a neural network develops weights for the intermediate layers that help model a sophisticated decision process. Those intermediate layers are not given data the way interpolation is; they are open to interpretation. Sometimes they are obscure, and other times we can imagine a meaning for them. The only hard training data are the inputs and outputs at the first and last layers.
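As a concrete illustration of that point, here is a minimal sketch (NumPy, a made-up XOR-style toy problem, all names invented) in which backpropagation trains the hidden-layer weights even though no data is ever supplied for the hidden layer itself, only for the inputs and outputs:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy training data (made up): only the inputs X and targets Y are given;
# nothing at all is specified for the hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 units, randomly initialized.
W1 = rng.normal(size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))
b2 = np.zeros(1)

lr = 1.0
for step in range(10000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)      # hidden activations: never given as data
    P = sigmoid(H @ W2 + b2)      # network output

    # Backward pass (backpropagation): push the output error back so that
    # the hidden-layer weights W1, b1 also receive gradients.
    dZ2 = (P - Y) * P * (1 - P)   # gradient at the output (squared error)
    dW2 = H.T @ dZ2
    db2 = dZ2.sum(axis=0)
    dZ1 = (dZ2 @ W2.T) * H * (1 - H)  # error propagated through the hidden layer
    dW1 = X.T @ dZ1
    db1 = dZ1.sum(axis=0)

    # Gradient-descent update of every parameter, hidden layer included.
    W1 -= lr * dW1
    b1 -= lr * db1
    W2 -= lr * dW2
    b2 -= lr * db2

print(np.round(P, 2))  # typically close to [[0], [1], [1], [0]]
```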
 
