What is the benefit of a positive-only sigmoid function?

  • Thread starter: ADDA
  • Tags: Function

Discussion Overview

The discussion centers around the benefits and applications of a positive-only sigmoid function, particularly in the context of neural networks and activation functions. Participants explore its characteristics, conventions, and implications for modeling and network design.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants note that the sigmoid function is commonly used because it is monotonically increasing, with outputs typically ranging from 0 to 1, or from -1 to 1, depending on the convention.
  • One participant questions the relevance of the topic in the Computing forum and requests more context regarding its application, suggesting that additional information would facilitate better answers.
  • Another participant suggests that the positive range of activation functions in neural networks allows for a clear interpretation of activation levels, where 0 indicates no activation and 1 indicates full activation.
  • A participant mentions that the sigmoid function introduces non-linearity to neural networks, which may help simulate biological systems more effectively.
  • One participant expresses confusion about how a positive-only activation function can effectively "pull down" on incorrect inputs, indicating a potential issue with convergence in their implementation.
  • Another participant provides a brief mathematical observation regarding the behavior of the function with positive x values, noting that the output at x=0 equals 0.
  • A later reply discusses modifying the function by adjusting parameters to shift its curve, although it raises concerns about smoothness beyond certain values.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the benefits of the positive-only sigmoid function, and multiple viewpoints regarding its application and implications remain present throughout the discussion.

Contextual Notes

Some participants express uncertainty about the function's behavior in specific contexts, such as its effectiveness in neural networks and the implications of its positive-only nature on network performance.

ADDA said:
What is the benefit of a positive-only sigmoid function? Or why is it 'most often' used?

NOTES:
The return value is monotonically increasing, most often from 0 to 1, or alternatively from -1 to 1, depending on convention; source: https://en.wikipedia.org/wiki/Sigmoid_function
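The two conventions mentioned in the note can be sketched directly; this is a minimal illustration, not code from the thread:

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: monotonically increasing, output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh_sigmoid(x: float) -> float:
    """Alternative convention: tanh is also sigmoid-shaped but maps to (-1, 1)."""
    return math.tanh(x)

# Both are monotonically increasing; only the output range differs.
print(sigmoid(0.0))       # midpoint of (0, 1): 0.5
print(tanh_sigmoid(0.0))  # midpoint of (-1, 1): 0.0
```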
Can you give more context to your question? Why are you asking this in the Computing forum? Is it for some modeling work you are doing? More information would make it much easier to try to answer your questions.
 
Is this for a neural net node?

Having the activation function of each node to range from 0 (no activation) to 1 (full activation) is a common convention in neural net design.

Here are some common activation functions:

https://en.wikipedia.org/wiki/Activation_function

The sigmoid activation function provides some non-linearity to the neural net to simulate biological systems better.
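The 0-to-1 convention and the non-linearity described above can be combined in a single node; this is a generic sketch of one sigmoid node, not an implementation from the thread:

```python
import math

def sigmoid(x: float) -> float:
    """Squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def node_activation(inputs, weights, bias):
    """One neural-net node: weighted sum of inputs, then the sigmoid.

    Output near 0 means no activation, near 1 means full activation.
    """
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

print(node_activation([1.0, 1.0], [5.0, 5.0], 0.0))  # strongly activated, near 1
```

Without the sigmoid (or some other non-linearity), stacked layers would compose into a single linear map, which is why the non-linearity matters for expressive power.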
 
jedishrfu, you are correct. The equation, 1.0 / (1.0 + e^(-x)), comes from this video:

When I implemented a network, however, the output always converged to the error vector. Perhaps I was wrong; I no longer have the code.

My question comes from the idea that the network has to pull down on wrong input. How can a node pull down with a positive-only activation function?
 
Take a look at this function with only positive x values; y at x = 0 equals 0.
 

Attachments

  • Sans titre.png (14.9 KB) — plot of the function
You can slide it further to the right by increasing the number 2 at the end of the equation, but past 7 the curve is not smooth enough.
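The exact equation in the attachment is not reproduced in the thread, but the sliding behaviour described can be sketched with a horizontally shifted logistic, where the shift parameter c stands in for the "number 2 at the end of the equation" (that correspondence is an assumption):

```python
import math

def shifted_sigmoid(x: float, c: float) -> float:
    """Logistic curve centred at x = c; increasing c slides it to the right."""
    return 1.0 / (1.0 + math.exp(-(x - c)))

# The midpoint follows the shift: output is 0.5 exactly at x = c.
print(shifted_sigmoid(2.0, 2.0))  # 0.5
print(shifted_sigmoid(7.0, 7.0))  # 0.5

# Near x = 0 the output shrinks toward 0 as c grows, approximating
# the "y at x = 0 equals 0" behaviour described above.
print(shifted_sigmoid(0.0, 7.0))  # very close to 0
```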
 
