Suppose that in a couple of hundred years we build an AI robot that is not only capable of making decisions for itself, but is also self-sustaining and capable of reproducing (by whatever means). Would switching it off be considered murder? (I'm aware that murder is, by definition, the killing of a human being, not a robot.)

If we accept that we are just flesh and bones, and that our consciousness has no significant bearing on the universe, then surely we are as unimportant and insignificant as the robot. Surely, once we have given AI life to a robot, we have no right to take that life away, because the only real difference between us is that we are biological rather than mechanical.

Mainstream science seems to like the idea that we are insignificant beings, and I'd be interested to hear its moral position on the insignificance of AI compared to ourselves. I'm sure most people will simply take the view that a robot is a robot and nothing more, but supposing they also hold that we are flesh and bones and nothing more, I'd be interested to hear how they justify switching off an AI, and how it is any less significant than we are.