Saturday, July 24, 2010

Should Robots Be Considered as Humans?

1 September 2008


As I read the synopsis from Star Trek: The Next Generation, I was somewhat amazed at the plot, but not really surprised, since Gene Roddenberry was a humanist. This is typical of his material, but it does pose some good questions, such as the ones this assignment presents.
I believe that man may be able to mimic certain behaviors in robots and machines, placing beliefs and desires in them, or at least the ability to have desires, but it would not be the same as what God gives to His people. We start from nil at our births and build a worldview entirely from our experiences, our influences, and our observations. Should robots ever gain the ability to have desires or beliefs, those would stem from the beliefs of the person who created them. For instance, I believe that a left-wing engineer would create a robot with liberal tendencies, and vice versa if the designer were a conservative. In other words, the robot's thought process would only be as good as what the software developer planted in the machine, whereas we develop our own ideas, free of anything pre-planted in our brains.
Should it ever occur that we one day see machines with thoughts, ideas, determinations, and desires, I do not believe they should be given rights or be owed ethical treatment. The entire thought process of such a machine would be superficial, because robots do not have souls. If we do not give rights to animals (though of course we must treat them ethically, as they are God's creation, with feelings and emotions), then why should a creation made by man, which stands lower than the animals that rank just under man in God's created order, have rights? Robots are created to do jobs, and that is what they should be used for.
Turning off a robot would not constitute murder. The Bible describes murder as the shedding of innocent blood. First, in order to kill a robot, only the flick of a switch would be necessary, and no blood would be involved. Second, the Bible describes the death of man in Ecclesiastes 3:21: "Who knoweth the spirit of man that goeth upward, and the spirit of the beast that goeth downward to the earth?" So one would have to have a spirit in order to die.
In the episode, the judge (a JAG officer) said that she does not know whether the robot, Data, has a soul, and that she does not even know whether she herself has one. This is answered in Genesis 7:22, where we are told, "All in whose nostrils was the breath of life, of all that was in the dry land, died." Another verse is Job 27:3, which states, "All the while my breath is in me, and the spirit of God is in my nostrils." Clearly, if robots do not have breath, given by God, in their nostrils, they are not considered to have life in them.
To answer the question, "What view of the mind/body problem do you think is exhibited by Picard?" I must say that I believe he expresses the dualist view. I say this because Hasker states, "Dualism begins by taking quite seriously the fact that human beings have both physical properties and mental properties." Clearly, this agrees with Captain Picard's view in the episode's synopsis, when he says, "Data has rights, among them the right to refuse to undergo an experimental procedure such as this." Picard is giving the robot both physical and mental attributes. Dualism contends that these two are separate yet work together, which is just what Picard indicates here: the robot unmistakably has a mechanical physical body and an artificial mind, which also seem to be separate yet somehow join to perform as one.
Maddox, on the other hand, sees the robot, Data, as a material object lacking any real mental elements. He constantly refers to Data as "it" and is unconcerned about whether he will be able to reassemble him. He believes that Data is merely a self-operating computer.
I believe that Maddox is justified in calling Picard emotional in his behavior. Obviously, we know that Data has no soul, because God has not created him, nor breathed a soul into him. Therefore, he cannot die, nor can he be murdered. He has no free will, by God's definition, nor does he have the capability of making any decisions or carrying any emotions, apart from what man has programmed him to do. Maddox is obviously correct in his view.
Picard's argument about slavery is invalid, because Data is not a human. He was created to do a specific job. His purpose was to process information and come to rational conclusions. He was not created to have a free life, nor to marry or enjoy life at all. He was created to perform a job. Since he is not human, he cannot realistically have the feelings of one who is in bondage. Such a feeling would be artificial, a figment of Data's computerized mind.
I do not agree with the judge's decision. She said that she was not sure whether Data had a soul, nor was she sure that she herself had one. In the paragraphs above, it has been established that in order for one to have a soul, God must create him out of living flesh and breathe life into his nostrils. Obviously, this is not the case with Data, and therefore she had no right to "play God" by giving man's creation the right to choose.

© 2010 Kimberly Padilla, A.A. Religion
