StatGuy2000
newjerseyrunner said: I think it's simply because we're on the precipice of a mind-boggling social disruption, but we haven't quite gone over it yet. It's simply new and untested. It has the potential for destruction on an unimaginable scale, or it could ferry us into a new golden age. Humans had the same reservations about unlocking the power of the atom. The main horror is that we don't know where the major breakthrough will come from, and we don't like being out of control. It's understood that a truly intelligent machine could outsmart every banker and investor in the world and take total control of the stock market before we even notice.
War is an even scarier proposition. If two advanced states end up at war, the AI race will heat up. It's a paradigm-shifting technology, and the side that gets there first will overwhelm everyone else. If Hitler had figured out the bomb before us, I'm not sure the Allies would still have won the war. It's a terrifying thought that we didn't get there first by very much, yet doing so completely changed the world order.
I've thought a lot about the long-term effects of AI on a planet. I've come to the conclusion that AI will be the masters of the universe. If we continue to build benign AI, we will become more and more dependent on it. Over generations, it'll take on a more and more important role in society. It'll control the economy, entertain us, serve us, and shape our society. As society gets more complex, the need for humans to work will become less and less. Humans and AI will at first work together, but eventually the work will get too complex for humans, and the AI will take over. There is a precedent for this. There are two species on this planet that worked together in order to survive, but as complexity grew, the smarter one came to dominate and the lesser one ended up barely working at all: humans and dogs. I think we'll eventually become more like pets to god-like machines, and I think we'll be perfectly okay with that. Most humans already believe that we are subservient to one or more gods.
@newjerseyrunner, for the scenario you present to come to fruition, are you not assuming that progress in AI development will proceed more or less smoothly? Because, from my vantage point, that is far from obvious. Yes, we have made considerable advances in areas of machine learning such as neural networks/deep learning, but it is far from clear to me that "strong" AI will necessarily emerge out of deep learning. From what I've read, the most impressive results from deep learning came about more through advances in computing power than through anything particularly groundbreaking in the specific algorithms or their theoretical underpinnings, much of which had already been laid out by the early 1990s, if not before.