Why the 'Matrix Movie' is unrealistic

  • Thread starter: Chaos' lil bro Order
  • Tags: Movie

Summary
The discussion critiques the premise of "The Matrix," particularly the idea of humans as energy sources, arguing that the human body produces insufficient energy compared to other power generation methods. It highlights that even if all humans were used as batteries, they would only generate about 600GW, significantly less than the world's total electricity production. Participants also speculate on the film's unrealistic elements, such as the machines' energy needs and the feasibility of using humans in such a manner. Some argue that the movie's philosophical themes should not be overshadowed by its scientific inaccuracies, while others emphasize the importance of logical consistency in storytelling. Overall, the conversation reveals a mix of appreciation for the film's ideas and criticism of its scientific plausibility.
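The 600 GW figure quoted above is easy to sanity-check with a back-of-envelope calculation. This sketch assumes (my rough numbers, not from the thread) an average metabolic output of about 100 W per person and a mid-2000s world population of about 6 billion:

```python
# Back-of-envelope check of the thread's "600 GW" humans-as-batteries figure.
# Assumptions: ~100 W average metabolic output per person,
# ~6 billion people (rough mid-2000s world population).
watts_per_human = 100        # average resting metabolic output, in watts
population = 6e9             # approximate world population at the time

total_power_w = watts_per_human * population
total_power_gw = total_power_w / 1e9
print(f"Humans as batteries: {total_power_gw:.0f} GW")  # prints "Humans as batteries: 600 GW"
```

Even granting a physically impossible 100%-efficient harvest, 600 GW is well below average world electricity generation (very roughly 2 TW in the mid-2000s), which is the thread's central point.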
  • #31
arildno said:
But what if robotic intelligence can only be maintained by regular infusions with human blood??
The big thing (IMO) that separates a society of mechanical organisms from a society of biological organisms is that a mechanical society is aware of, and thus able to modify, every aspect of its workings, even the tiniest.

Biological organisms are far too complex (and illogical) in how they're built. And we understand far, far too little about how they work to be able to mess with them to any degree. Our hardware and software development goes back billions of years before our knowledge of it, and we have to reverse-engineer billions of years of evolution.

It is much, MUCH easier to eat a cow and a carrot (knowing that cows and carrots have provided what we've needed nutritionally for millennia) than it is to synthesize the correct proteins in the correct ratios that we need to sustain ourselves, let alone change them as we see fit.

Conversely, using Arildno's example, the mechanical society will know EXACTLY what the components are that it needs, and won't have to look for such an inefficient way of getting them such as leeching them from humans. Likewise, the mechanical society will be easily able to modify its citizens' hardware or software at will. Their own development is within their knowledge, i.e. they have no pre-history to reverse-engineer.
 
  • #32
Sane said:
Think about it. You said the Matrix was unrealistic. I said that's beside the point, because that doesn't detract from the 'main idea' of the movie. Then you said that's irrelevant because you could make a movie that is bad without affecting that same idea. But what's the point in making that argument, if making "a VERY bad movie" doesn't mean making an unrealistic one? If you're just saying that making it bad can keep its theme intact, so what? You're throwing tomatoes in the fruit bowl.


I'm not really sure what the heck you're talking about. You seemed to imply that since the main idea of The Matrix is legitimate, the film must therefore be considered a good movie. I believe the main idea is legitimate, but after that it kind of falls apart. While it is entertaining to watch and I don't consider it a bad movie, some of the things I pointed out earlier, and MANY posters have pointed out since, make it unrealistic.
-
I personally could take the main idea of The Matrix and make an absolute terrible wreck of a movie out of it. I as well as scores of other folks would consider my version a bad movie even though the 'main idea' is still good. So I guess what I'm trying to say is that a legitimate main idea does not necessarily equal a good movie. Likewise, making an unrealistic movie does not necessarily equal a bad movie. In the case of The Matrix my opinion is that it was entertaining enough to be considered a good movie. Lots of folks would not consider it good for various other reasons. That is their opinion and they are entitled to it.
-
Throwing tomatoes into the fruit bowl or whatever; call it what you like, but I did NOT say that an unrealistic movie is automatically a bad movie. Oh, one last thing: the whole theme of The Matrix is nothing new at all. Total Recall follows the same story line, and that movie was based on a short story, if I recall correctly.
 
  • #33
The short story was 'We Can Remember It For You Wholesale' by Philip K. Dick.
 
  • #34
Thank you Danger!
 
  • #35
Wow, I knew my thread title would draw many replies, but I was unprepared to weather such a storm. My goal was to sneak in a lot of information about the human body's power usage and to put our aggregate usage into perspective against the world's total power production. The Matrix bit was just a guise to make the post more entertaining and to show that using humans as batteries is a really inefficient idea.


That said...

Danger inspired me to think that the emergence of a world dominant AI would be a much more efficient consumer than humans are.

Consider a scenario where, right now, there is a sentient AI in the internet that sees humans as a threat to its 'life' and has the sole goal of self-preservation. We will assume that this AI has access to every bit of information published online, even secure top-secret documents and technical schematics from all of the world's leading research institutions. Now assume (again, for the sake of this scenario) that this AI can gain control of all networked electronic systems worldwide at a moment's notice. But the AI opts to wait until a time when it knows it can successfully wipe humans off the Earth in one fell swoop. Perhaps the AI would wait until armed UAVs are mass-produced by armies, since most military hardware still requires a human in the loop to kill its enemies.

Okay, so what does this very speculative scenario prove? Nothing. I just wanted to suggest the possibility that an AI could be self-aware, but not necessarily capable of creativity or innovation. See, you could argue that this AI should just take over the world now, by nuking all major cities, and then create its own army of UAVs, soldier bots, etc.
But I think creativity/innovation cannot be attained by an AI (wrongly, perhaps). This means the AI must wait until we humans create the unmanned instruments that it will eventually use to destroy its very creators: us.

Maybe the AI would even keep a few of us humans around in a controlled prison/lab to ensure that our creativity fueled the AI's growth. Hey 'Danger', if your phone rings tomorrow and a low, muffled, robotic voice asks you to do some consulting work for it, please tell me so I can find the nearest bunker!

Heh. Ok not much science in this post, but it was fun.
 
  • #36
There are plenty of other considerations that make the Matrix films unrealistic:

1) Even if we assume that using biological entities as batteries is a good idea, farming the primary producer is always the best idea. In this case, since there is no sun, there are no photosynthesizers, so chemoautotrophic bacteria are their best bet. In fact, there is actually precedent in the use of microbial fuel cells to power deep sea bottom rovers.

Heck, putting aside the relative efficiency of humans as energy producers, no species other than Homo sapiens poses any threat of revolt. If they used any other creature as batteries, they wouldn't even need to build the matrix in the first place, greatly reducing their energy demands right off the bat.

Furthermore, what did they have against the use of wind or geothermal energy? Heck, considering it took Neo and Trinity all of ten seconds to fly above the level of the clouds to see the sun, did they have something against the idea of just putting their solar panels up there?

2) If Neo can manipulate the matrix at the level of lines of code, why can he not simply control the actions of agents and other programs, since all they are is lines of code?

3) Why the aerial assault on Zion? When you have all the ants trapped inside of the ant hill, do you send it bigger ants, or do you flood the hill? Do you mean to tell me they couldn't have flooded Zion with lava or something?

4) The kicker is the same with all "robots take over" stories. Why on Earth would a non-replicating, non-evolved entity with no genes have the desire to perpetuate itself, let alone the desire for liberty? Humans have the ability to feel pain, the desire not to die, and a will to power because our ancestors evolved in an environment in which it was handy to have these things. Robots would have whatever abilities and desires they were programmed to have, not exactly the same ones we have. Why would we program a race of slave-laborers to have such things as the ability to feel pain and desire freedom and power? The assumption seems to be that self-awareness automatically equals self-interest, and that such an interest automatically equals exactly the same interest that humans have in themselves. Why? Is there some law of nature at work here that science fiction writers are aware of but I am not?
 
  • #37
For a truly amazing treatment of a fully-robotic society (fictional, as opposed to scientific conjecture), check out 'Code of the Life-Maker' by James P. Hogan. There's a sequel as well, but I can't recall the name right now.
 
  • #38
Not that I disagree with your take on the matter but, a few points:


Heck, putting aside the relative efficiency of humans as energy producers, no species other than Homo sapiens poses any threat of revolt. If they used any other creature as batteries, they wouldn't even need to build the matrix in the first place, greatly reducing their energy demands right off the bat.
The premise of "brain power" presupposes that humans are the only orgs with minds complex enough to generate what's needed. Remember, it's the virtual world that stimulates the minds. How much brain energy would you derive from a matrix of cows? "Field. Grass. Eat. Field. Grass. Eat. Moo!"


Furthermore, what did they have against the use of wind or geothermal energy?
1] No sun = no wind.
2] Wasn't there something about the core of the Earth cooling so as to be unusable? Or am I just imagining that?


Heck, considering it took Neo and Trinity all of ten seconds to fly above the level of the clouds to see the sun, did they have something against the idea of just putting their solar panels up there?
[EDIT: never mind] Did they "fly" fly, like Neo flies, or did they go in the hovership? (I can't remember.) Because if they "flew" flew, then that was part of the matrix, not real life. (Neo manipulates the matrix, not real life.) The machines can't use sunlight from their own simulation.

No, wait. There wasn't a giant black cloud in the Matrix simulation, was there? That must have been real life. [/EDIT]

Yes, I never understood what the machines had against tall towers with solar panels. As per the movie, they sure seem to have the "tall tower" thing nailed.


2) If Neo can manipulate the matrix at the level of lines of code, why can he not simply control the actions of agents and other programs, since all they are is lines of code?
He can, and does.

3) Why the aerial assault on Zion? When you have all the ants trapped inside of the ant hill, do you send it bigger ants, or do you flood the hill? Do you mean to tell me they couldn't have flooded Zion with lava or something?
Certainly. This is one reason the series headed downhill after the first film. But we have bigger fish to fry in this thread, namely that the very premise of the film is flawed.




4) The kicker is the same with all "robots take over" stories. Why on Earth would a non-replicating, non-evolved entity with no genes have the desire to perpetuate itself, let alone the desire for liberty? Humans have the ability to feel pain, the desire not to die, and a will to power because our ancestors evolved in an environment in which it was handy to have these things. Robots would have whatever abilities and desires they were programmed to have, not exactly the same ones we have. Why would we program a race of slave-laborers to have such things as the ability to feel pain and desire freedom and power? The assumption seems to be that self-awareness automatically equals self-interest, and that such an interest automatically equals exactly the same interest that humans have in themselves. Why? Is there some law of nature at work here that science fiction writers are aware of but I am not?
Well, don't forget, once the machines become self-aware, they take on most of the qualities you mention: replicating, evolving, perpetuation, desire for liberty and freedom, self-preservation, desire for power.

The reason this happens is that, once something reaches a critical level of complexity (notably, the ability to learn), it outgrows its assigned task and constantly requires new input and more subtlety in its environment. This creates a motivation for freedom from slavery (escape from assigned, noncreative tasks). Every other trait follows a similar path.

My personal belief is that this is the intrinsic and inevitable fate of any instance of complexity and adaptability increasing without limit.
 