17 June 2009

Ray Kurzweil, SF Author

I do not believe in The Singularity.

Those of us who spend too much time on the internet, and in the blogosphere in particular, cannot avoid excited talk of the upcoming Singularity, when man and machine merge into something more than just men and their peripherals. It is not really a moment so much as a stretch of time in which telling 'manmade' apart from 'computer-made' becomes less and less easy, in which Turing Test-defeating machines are not only possible but easy to manufacture, if indeed they do not already manufacture themselves. The Singularity is simultaneously hoped for and worked toward.
If there is one thing that it lacks, it might be humanity, but really, it has humanity built in. Literally.

The problem is this: exponential progression is impossible when attempted by men. Predictions of the year 2000 once had steam horses and rigid airships and other similarly advanced things flying through the aether to Mars and beyond, perhaps to land on Jupiter. These are linear predictions, the kind that Kurzweil tells us are not only impractical but worthless.

I would suggest that men working towards a future Singularity are laboring under the same misconceptions. I think that Kurzweil moves his predictions closer, unbelievably close, in hopes of circumventing this lack of imagination, but in the end his predictions are still functionally worthless. Take his prediction of a Turing Test-capable machine: we know what that means from a technological standpoint, but remember that years ago it was discovered that the simplest of looped response scripts can be incredibly convincing. This does not show a limitation in the technology, but it does demonstrate the critical failure point of any such prediction: while we understand the technology, the mechanics of humanity continue to elude us.
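Those "looped response scripts" are programs in the ELIZA tradition: match a keyword, reflect part of the user's own words back inside a canned template, and many people read understanding into it. A minimal sketch of the trick (the patterns and replies here are illustrative inventions, not taken from any particular program):

```python
import re

# ELIZA-style keyword reflection: find a trigger phrase, then echo the
# tail of the user's sentence back inside a canned template. There is
# no comprehension anywhere in this loop.
RULES = [
    (re.compile(r"\bI feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bbecause (.+)", re.I), "Is that the real reason?"),
]
FALLBACK = "Tell me more."

def respond(user_input: str) -> str:
    """Return a scripted reply to one line of user input."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            # Strip trailing punctuation so the echo reads naturally.
            return template.format(match.group(1).rstrip(".!?"))
    return FALLBACK
```

So `respond("I feel lost.")` gives back "Why do you feel lost?", and anything unmatched draws a vague prompt to keep the human talking. A few dozen such rules were enough, in Weizenbaum's day, to convince some users they were being understood.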

This is where I think Kurzweil's predictions suffer. He predicts artificial blood that increases oxygen-carrying capacity by thousands of percent or more, while I suggest that what REALLY may happen is something completely different, unpredictably different. Artificial blood is the hyper-advanced airship, and whatever happens for real is the Concorde, because not only might new uses come up, but other challenges may evolve that were literally unthinkable ten years before. Remember the idea of videophones? Who even thinks about those any more? We have devices that carry video, often shockingly high-def video, and phones that can transmit it in realtime--but we don't have videophones because we don't want them! Who wants an unavoidable video connection? Voice is one thing, but getting a midnight call on a videophone is quite another.

Basically, I think that Kurzweil is thinking like an SF author: a possibly misguided, probably genius, and certainly prolific SF author. He takes what he knows and adjusts it to what he predicts. While others publish novels, however, he publishes his predictions as non-fiction. Same thing, different presentation--and which one gets taken seriously? All negativity about his ideas aside, I think he is on to something, and something big, but it's not something unique to him.


Adam Wykes said...

Not entirely sure what you're trying to say here, although I would be the first to admit that accepting the probability of a technological singularity in the near future takes a sort of quasi-cult-like leap, one that is enough to make even the least reasonable among us slightly squeamish.

Say what you like, but to ignore his extrapolations is to ignore all of the impressive statistics he has behind what he says. If we want to deny the likelihood of Mr. Kurzweil's future, we have to come out and say precisely how we think the present day trends he extrapolates from are going to deviate significantly in the near future, and why.

The same can be said for the likes of Al Gore, another significant prognosticator. He has a wealth of convincing statistical data behind his arguments. Anyone who wants to deny that global warming will follow the path he predicts for it is going to have to show how he is doing his math wrong; otherwise you simply can't say much.

These guys are making predictions based on models that have so far been proven true more than once. The denier had better come up with something better.

Geoffrey Wykes said...

Don't get me started on Al Gore. His global warming talk is overdone in the extreme. He's in like company, though; they all do that.

Geoffrey Wykes said...

The main thrust of my argument is this: math based on wrong numbers is simply wrong, and so are predictions based on wrong assumptions. This isn't math; it's untidy reality.

Adam Wykes said...

Let me see your numbers then?

Geoffrey Wykes said...

This is the only time I am going to do this, because I dislike the whole argument, especially out of its proper context.

Al Gore asserts that Global Warming is both happening and is caused by humanity. He's right in at least one respect, and partially correct in another: there has been measurable warming, and human activities have increased, and are known to be capable of increasing, the known greenhouse gases. This is where the exaggeration comes into play, however: Al Gore and others assert that Global Warming is mostly caused by humans. Temperatures have indeed gone up, even granting that the past 10 years or so have been statistically flat, but some is not all!

The proof that man is the main culprit comes down to a correlation between human presence and rising temperatures. However, we run into the problem of measurement: higher temperatures as compared to what? An idealized past, or an imagined happy average? Things ARE warmer than the recent past, but go far enough back and suddenly they aren't warmer. Where do we stop on that? Al Gore and others certainly seem to operate on the assumption that there is an ideal temperature, and that it is the temperature of the near past.

How do we measure this? We can hardly be sure that models based on measurements are accurate because, for example, cities tend to encroach upon rural measurement stations, so the urban heat island effect skews them increasingly--which tells us about the UHI but hardly helps with general measurements. Many of the models, too, have inherent biases towards massive positive feedbacks that have yet to be proven, much less quantified.

Bottom line: Al Gore has a point to push, which is fine, but it's not a point that should be listened to without a great deal of skepticism. Oh, and it being a point of morals? Seriously?

Adam Wykes said...

Don't recall saying it was a point of morals... models, maybe. Did you read that wrong?

Ignoring the entire temperature debate and cutting to the meat of the issue: human activities significantly increase the amount of carbon dioxide (and other gases) in the atmosphere. It has been determined that CO2 is a major greenhouse gas, the thermal effects of which are known. Any effects this additional CO2 is having on the climate will be deleterious because it is not natural, and therefore the natural cycles of the earth will have difficulty adjusting to it.

Don't get me wrong: at various points in the Earth's past CO2 levels may naturally have been much higher than now. Temperatures may have been much higher. But as human beings that evolved within the last 120,000 years, it behooves us to do what we can to keep things amenable to the climate and ecology we evolved in.

So yes, Gore and others like him operate on an assumption that the temperature of the near past is the ideal temperature because it IS. Given that one hundred thousand years ago man was essentially as bright as he is now, it's a little mystifying as to why he didn't do better much earlier on. Could it be because they had to live through the bullshit ice ages? Yes.

Geoffrey Wykes said...

Al was the one who made the comment about it being a moral issue, by way of clarification.