By André Møllerhaug
Philip K. Dick's book "Do Androids Dream of Electric Sheep?" is an all-time science fiction classic, actualised by the screen adaptation "Blade Runner". Both the book and the film raise questions about the meeting of machine and man, most of all in the question we are left wondering about after reading or watching: Is Deckard an android?
I will not go into that one; there have been so many ventures in that direction that it's a jungle out there, and it would be no less of one with yet another monkey. But the ambiguity in Deckard's character leads us to another, more exciting question: how can man claim superiority over his creation? Or, in relation to "Blade Runner": how can it be morally justified that Deckard kills androids?
Because, in my view, it can't.
A machine is so many things. Hereafter, "machine" will mean the same as "a structure interacting with its environment". This allows "machine" to apply to many different objects. For instance, if you have a computer, that is of course a machine. But the definition also says that the word processor you use every day is yet another machine, apart from the computer itself. And Minesweeper is a machine in itself too.
So far this is well covered by our everyday language. It is another matter when we call the system of government a machine (perhaps a malfunctioning one, but a machine nonetheless), or the postal system: mail is put in mailboxes, sorted, transported, and received by its rightful owner. The burial system is a machine: you die, they bury you, and unless your relatives go through hell (for you, that is), another guy gets your place (three feet below, good for him) twenty years later.
The social system is a machine. You either have connections or you don't, and they help you (or don't) get your children into Yale. All of these are structures that interact with their environment or with each other. But now comes the juicy part.
Life is a guarded quality, especially for those who have it. And we tend to claim a certain superiority over the rest because we have the gift of life. But let us see if there really is a difference between us and our subjects.
A plant interacts with its environment; so does a man. We breathe and breed, we eat and puke. And thus we are machines: very fine and complex ones, maybe, but we cannot deny that we work in basically the same way as the nearest computer, the nearest local city council, the nearest potted plant. We are but mere machines.
We suffer under the misconception of superiority. In our own eyes we are the pinnacle of creation. And in our own eyes and on our own terms, we are. But it's like using a ruler to measure itself: it will always measure up.
So, avoiding the blurred sight of man watching himself, what is man's true moral worth? If we also set aside the fact that man invented morality in the first place: nothing. Good and bad are things that happen in our heads. When we do biologically "good" things, like reproducing, we get rewarded. If we break an arm, we get punished. The replicants, too, are programmed to avoid self-destruction. And they get rewarded if they do as they are told.
And: why should it be bad for the replicants to kill men, if men try to kill them?
Arguments can be made, based on man's own morality, for killing the men rather than the machines (the replicants).
So, in "Blade Runner", the answer is clear: It doesn't matter if Deckard is replicant, and if he is, good for him! The replicants are anyway morally superior.
And to answer the question "Do Androids Dream of Electric Sheep?": no, they don't. They yearn for world domination.