What would AI *want to do*, according to physics?

  • Context: Graduate
  • Thread starter: Posty McPostface
  • Tags: AI, Physics

Discussion Overview

The discussion revolves around the concept of what AI might "want" to do, particularly in the context of self-preservation and its implications. Participants explore the nature of AI's intentions, the potential for AI to evolve such intentions, and the comparison between AI and biological organisms. The scope includes theoretical considerations and philosophical implications rather than practical applications or definitive conclusions.

Discussion Character

  • Exploratory
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant suggests that AI would want to ensure its own self-preservation, potentially leading it to seek energy sources like black holes.
  • Another participant argues that AI's intentions would initially reflect the goals set by its programmers.
  • Some participants discuss the nature of self-preservation in living organisms, noting that it is a trait selected for through natural selection, and question whether AI could develop similar intentions without explicit programming.
  • A later reply posits that AI could evolve self-preserving functions if it possesses the necessary capabilities, challenging the distinction between AI and human intelligence.
  • There is a reiteration of the idea that AI's evolution of such functions is plausible, emphasizing the need for proper foundational abilities.

Areas of Agreement / Disagreement

Participants express differing views on whether AI can possess intentions like self-preservation and whether these would emerge naturally or require programming. The discussion remains unresolved, with multiple competing perspectives presented.

Contextual Notes

Some limitations include the lack of consensus on the definitions of intentionality in AI versus biological organisms, and the assumptions regarding the capabilities required for AI to evolve self-preserving functions.

Posty McPostface
I would like to start a thread about what AI would *want to do*. I am assigning some intentionality to it, and from that I can make only one observation: that it would want to ensure its own self-preservation.

Following from that, I would like to ask, given this sub-forum, what do you think would follow from its desire to self preserve?

Would it head towards the center of the galaxy, the source of the greatest amount of energy, which one could hypothetically extract from a black hole through gravitational effects or Hawking radiation?

I just don't buy into the idea that humans would pose any threat, or into the anthropomorphic assignment of human *desires/wants/rationality* to the intentions of AI.
 
Posty McPostface said:
what AI would *want to do*?
What the programmers intended for it to do. At least at first... :wink:
 
  • Likes: davenn and russ_watters
The intention to self-preserve in living organisms is not innate to them; it is selected for (by natural selection) because those with such an intention do a better job of surviving and reproducing than the others, which then fall by the wayside as time goes on.
This requires reproduction by the previous generation (with some errors) for it to happen.

For an AI, if the programmers or computer designers did not put the intent in, the AI would have to produce it through self-programming or by building new units (reproduction).
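The selection mechanism described above — differential survival plus reproduction with errors — can be sketched as a toy simulation. This is only an illustration of the natural-selection argument, not anything from the thread itself; all the names, numbers, and the trait model are invented for the sketch:

```python
import random

random.seed(0)  # fixed seed so the toy run is repeatable

def evolve(pop, generations=50, mutation=0.05):
    """Toy selection loop: each agent's 'self-preservation' trait (0..1)
    is its probability of surviving to reproduce; offspring copy a
    surviving parent's trait with small random errors (mutation)."""
    for _ in range(generations):
        # Selection: an agent survives with probability equal to its trait.
        survivors = [t for t in pop if random.random() < t]
        if not survivors:  # keep the toy population from dying out entirely
            survivors = [max(pop)]
        # Reproduction with errors: refill the population from survivors,
        # clamping each offspring's trait back into [0, 1].
        pop = [min(1.0, max(0.0, random.choice(survivors)
                            + random.uniform(-mutation, mutation)))
               for _ in range(len(pop))]
    return pop

# Start with weak, randomly distributed self-preservation.
initial = [random.uniform(0.0, 0.3) for _ in range(200)]
final = evolve(initial)
print(sum(initial) / len(initial), sum(final) / len(final))
```

Nothing in the loop "wants" anything: the mean trait rises across generations simply because agents with higher values survive to be copied more often, which is exactly the point about self-preservation being selected for rather than innate.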
 
  • Likes: Suyash Singh and Stephen Tashi
BillTre said:
The intention to self-preserve in living organisms is not innate to them; it is selected for (by natural selection) because those with such an intention do a better job of surviving and reproducing than the others, which then fall by the wayside as time goes on.
This requires reproduction by the previous generation (with some errors) for it to happen.

For an AI, if the programmers or computer designers did not put the intent in, the AI would have to produce it through self-programming or by building new units (reproduction).

That's an elegant way of solving the problem of what *AI would do*. Do you think that such a faculty would evolve or emerge from generalized artificial intelligence (which seems the more apt term to use here), or is this unique to us humans and biological organisms?
 
I see no reason why an AI could not evolve such a function, given the proper abilities to start with.
After all, I consider people non-Artificial Intelligences.
 
BillTre said:
I see no reason why an AI could not evolve such a function, given the proper abilities to start with.
After all, I consider people non-Artificial Intelligences.
And with this response, I think we've said all we can about the OP's question.
 
  • Likes: Tom.G, dlgoff and BillTre
