27 February 2014

A Comment from RogerHub, Metastasized

A comment I began writing on a great post over at RogerHub kind of became too long-winded to politely dump in that man's comment section, so now it's going to live here. It even fits in with the focus of this blog!

Comment Transcript:
Just to add to this article from the point of view of someone who was educated in the liberal arts: technology is a sort of odd binary in the minds of most people: it is seen as savior and satan simultaneously. As in this blog post, anyone who stops to think about it long enough must inevitably conclude that technology's moral standing must necessarily be a reflection of its users.

That's where a lot of the fun in Science Fiction is - when technology is used in a plot to elevate the moral quandaries of characters to epic heights, or to make moral issues that seem unimportant to us now more concrete and pressing. When technology is presented as inherently good or evil, that's often when we lose interest in the story; such technologies seem like too much of a crutch propping up a certain decision (stories are always problems, and problems are always decisions).

Humans like verisimilitude in our stories (our "virtualizations" of others' lives, as you term them in another post). You can be as fantastic as you like, but on some level - literal or metaphorical - you have to jibe with reality. So it should be no surprise we like it when technology simply magnifies the decisions of people (or other sorts of moral agents like intelligent robots, aliens, etc.); that's because this is basically how technology works in the real world.

So when you say "Progress exists indifferently: the problem is mankind. Science moves forward, yes, but humans don’t. We don’t mature. We just don’t learn, and this is why we have problems with technology." you have hit the crux of the issue. The user is the reason technology seems two-faced. But what does it mean that the power technology grants us is being put into the hands of a proportionally ever-more inept species? You do not give a rifle to a child, because although the child may choose between right and wrong, they are not as good at it as an adult, and most think the consequences too severe to risk letting them make a bad decision with such a technology. Other people believe that even an adult is in no position to make proper moral choices with a firearm, and this debate rages in our society these days.

There is no technology for improving human moral decision-making; it's extremely doubtful that there ever could be, given how much of a premium we place on free will. Human beings seem to think that the good life can only be lived when they are free to make their own choices; consider the plot of Brave New World, by Aldous Huxley. It is considered dystopian fiction: a picture of a world in which technology is used to subjugate and control humans, where drugs and prenatal conditioning are used to create individuals with certain moral capacities and inclinations. Few mature people would conceive of that world as ideal, yet it is essentially the gist of what any attempt to improve humans would look like - a removal of choice, of free will. A subversion of the good.

Technology isn't going to get us a solution to our moral quandaries; neither is Science. There is nothing particularly new to be discovered about what humans find moral and immoral; only changing fashion. Basically, however revolting it might at first seem to some, a technological society is in need of something like the solution in Arthur C. Clarke's Childhood's End - something paternal, that is, which comes very close to being - perhaps even is - a religion.

However, we seem to be in the process of banishing religion from society - in part because of what it has to say about how technology should be used (stem cell research, abortion, etc.). Potential irony aside, there's only one further resolution to the conflict. When humans are removed from the equation and there's only mindless technology (true AI would just be a repeat performance in silicon), then technology will at last have had its apotheosis. Most people don't like the sound of that either; so conflict it is for us. We will never be at peace with technology, because we cannot be at peace with ourselves and our choices.
