Banana Tree House

This is a blog on my incoherent thoughts and painstaking details of my life. Welcome and please consider this the disclaimer...

Tuesday, July 20, 2004

The Perfect World - A Post Inspired by "I, Robot"

(Warning: If you have not yet watched the movie "I, Robot" and do not want it spoiled for you, do not read any further. Skip to the next entry. :)

The movie "I, Robot" prompted me to think about this: a robot-led world has a lot to offer.

I don't know why people immediately think of all the downsides of this: "but they're trying to enslave us humans." Hah! Wrong! Remember, in the movie, VIKI was trying to protect humans; enslavement was never mentioned. What is it that machines would need humans to do for them that they couldn't accomplish themselves, and do a better job of? Is it that they'll get lazy and tired, so they'll need slaves? A better question to ask is: did that idea spring into your mind first because you are thinking like a human?

If you really think about it from an objective point of view, we are the ones who manage to f up everything. The earth was fine until we came along. No other species is so intelligent that it destroys its own living environment. No other species has hunted another down to extinction, for food or just for entertainment; there have always been checks and balances. We are the ones who mess up everything. The robots were fine (in the movie) until we wanted to make them "more human." See? THAT was the real problem. We couldn't leave well enough alone. We want to play God ("God created Adam in His own image...."); we can't help but make robots more humanlike.

Now back to what I was originally saying. Here's how I envision a robot-ruled world (based on the assumption in the movie that the purpose was to PROTECT humans, not to ENSLAVE us):

Start with the basics: everyone will be taken care of, including disabled people and mentally challenged individuals. There will be no more homeless bums. That's socialism to its fullest.

Take it a step further: since big brother is watching all the time and is practically omnipresent, there will also no longer be child abuse, spousal abuse, or elder abuse.

The world can be practically crime-free. Homicides can be stopped before they happen. Behaviors such as drunk driving and reckless driving can be completely weeded out.

Terrorism? Gone. No one group of fundies killing another group. No group trying to make its religion the nationally or internationally acclaimed one. And mind you, also no leader of the free world waging unjustified war on another country.

Rest assured, there will be no greed, no corruption, no abuse of power. (Once again, those asking "how do you know?" are totally thinking like a human, making the assumption that robots are capable of being as evil as humans. These behaviors are unique to humans. There will be no greed and corruption simply because robots don't need money or material goods, and they have no ego to get in the way of making decisions.)

Sure, the movie also suggested that some casualties will occur during the transition from a human-led to a robot-led world. But as I have said, there's always bloodshed in a revolution. It's a small price to pay for a perfect world. After all, haven't we decided for the Iraqis that 10,000-plus of their lives are just a small price to pay for freedom? And we're not just talking about freedom here; we're talking about a crime-free, perfect world.

Now, I know a lot of people simply find this idea repugnant. But that's just because most of us have been poisoned by all these science fiction stories since childhood. "Oooh, machines will take over and bad things will happen." Dude! It's a STORY, and by definition you need conflict to make it a story. Who wants to read about an ideal world where people live happily ever after, taken care of by machines? There always has to be a villain, an antagonist, much like Cinderella's or Snow White's evil stepmothers. They are necessary to make the story work; without them there is no story: so one day this beautiful princess was born, she grew up happily and uneventfully, married well, and lived happily ever after. You would pay money to watch that? What's the lesson in that story? (Not that Cinderella and Snow White have a very good lesson: be pretty and the prince will choose you.)

So in the science fiction stories, the robots are inevitably made the "bad guys," and the only way the story can end is for humans to have, once again, defeated the bad guys and restored this concept of "free will."

A lot of people might argue that they'd be "uncomfortable" being monitored all the time. But why? Most fundies believe that their Gods are watching all the time anyway. Just think of the robots as a sort of pseudo-God, there to ensure our safety. After all, it's not a voyeur on the other end watching. You'd be surprised at how fast you can adapt to that.

So I was telling this idea to a friend last night (said friend shall be referred to as "DD" from now on). DD's argument was very interesting (and probably represents some of the most common (mis)conceptions). He used the scene in the movie where the robot "decided" to save Will Smith instead of the little girl in the car accident. I wasn't sure what his point was: there were two victims (probably more; there's also the girl's dad, who was allegedly driving their car), and apparently the robot could save only one. It made an assessment that Will had better odds of surviving. That makes PERFECT sense. Had it gone for the girl, it might have lost both. Isn't saving one better than losing both? I don't understand why the life of a little girl is inherently more valuable than that of an adult male. Aren't we taught to think that all lives are equal? (Or was I thinking of Animal Farm?) According to DD, the girl's life is more precious because Will has no parents, never mind that his grandmother would be devastated to lose her only kin at her age (apparently she's old, so her loss is acceptable). I found that logic completely bizarre. Regardless, I do realize that my values/standards/logic sometimes deviate from those of the norm. If AGE (and surviving kin) should be the determining factors instead of the chance of survival, then for Pete's sake, program the robots that way. Don't blame them for doing what they are programmed to do!

For your information, one of the questions asked during medical school interviews is: if there's a 24-year-old patient and a 67-year-old patient, and you can only tend to one (or there's only one bed), what would you do? Mind you, the correct answer is NOT "Screw the old man 'cause he's lived long enough, and go to the 24-year-old, of course."

Had we grown up reading science fiction stories that promote a robot-controlled ideal world, we might not be as resistant to the idea today.