
Daft Punk's Robot Rock is blasting my eardrums. I have a growing list of science and science fiction themed music that one day I shall make into the most awesome collection of such themed music ever known. For the moment, Robot Rock seems the right song to accompany my thoughts of sentience in machines.
Speaking of sentient machines, I was thinking of one of my favorite episodes of Star Trek, one which features my personal favorite sentient, the android known as Data. The episode is called The Measure of a Man. In that episode, cybernetics expert Commander Bruce Maddox wants to disassemble Data to understand his positronic brain. He claims that if he can understand how Data is made, Starfleet can build more like him, which would be an incredible benefit. Data may be damaged in the process, but Maddox feels the loss of the android would be a small price to pay for all that Starfleet would gain. Data asserts his autonomy, stating that he would rather resign from Starfleet than submit to the procedure. Maddox then informs Data that he is Starfleet property without rights. A trial is arranged to determine Data's legal status.
Picard is to act as Data's defense, but how will he prove that Data is more than a mere collection of programming and electronic components? He seeks the sage advice of the Enterprise's bartender, Guinan. (Sometimes you have to wonder why they even had a Counsellor.) Guinan's advice was invaluable, as usual.

Captain Picard defends Data by arguing that if Starfleet were to build many androids like Data, to work for Starfleet and be its property, would they not be creating a slave race? Sentience in robotics does tend to raise many ethical questions.

Isaac Asimov may be the master when it comes to sentient robots in fiction. His Three Laws of Robotics deal with the other side of ethics in sentient machines: they act as a moral code to guide a robot's behaviour.
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Although these laws are designed to protect humans from harm by providing robots with a set of moral rules, they conflict with the ability to make moral decisions based on self-determined choices. Technically, all robots are well behaved when these three laws are applied. The Second Law states that robots must obey humans, which is not a very cool way to treat a sentient being! No wonder robots rise up against humans in the majority of science fiction stories!
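The strict priority ordering of the laws can be sketched as a simple rule hierarchy. This is purely a hypothetical illustration (the `Action` type and its fields are invented for this sketch, not anything from Asimov's stories):

```python
# A minimal sketch of the Three Laws as a prioritized rule check.
# The Action class and all its fields are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False           # would this action injure a human?
    inaction_harms_human: bool = False  # would *not* acting allow a human to come to harm?
    ordered_by_human: bool = False      # was this action ordered by a human?
    endangers_self: bool = False        # does it risk the robot's own existence?

def permitted(action: Action) -> bool:
    # First Law outranks everything: never harm a human,
    # and never allow harm through inaction.
    if action.harms_human:
        return False
    if action.inaction_harms_human:
        return True  # the robot must act, overriding the lower laws
    # Second Law: obey human orders (harmful orders were already rejected above).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, subordinate to the first two laws.
    return not action.endangers_self
```

Note how obedience and self-preservation never get a say once the First Law is triggered, which is exactly why the laws leave no room for a robot's own moral choices.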

According to futurist Ray Kurzweil, we are on the verge of the age of sentient artificial intelligence.

If sentience in machines is an unavoidable part of our future, is humanity doomed? Or should we perhaps banish all technology, shunning advancement to avoid this grim future? Should roboethics be a priority in the discussions of robotic engineers? Can kindness be part of AI programming? Is it possible that the day may arrive when we find ourselves faced with defining equality for a new race?
At the end of Data's trial, Captain Phillipa Louvois sums it up nicely:
"It sits there looking at me; and I don't know what it is. This case has dealt with metaphysics - with questions best left to saints and philosophers. I am neither competent nor qualified to answer those. But I've got to make a ruling, to try to speak to the future. Is Data a machine? Yes. Is he the property of Starfleet? No. We have all been dancing around the basic issue: does Data have a soul? I don't know that he has. I don't know that I have. But I have got to give him the freedom to explore that question himself. It is the ruling of this court that Lieutenant Commander Data has the freedom to choose."