Over at evhead he's got a portion of A Conversation with Virtual Reality Pioneer Jaron Lanier posted, originally from here:
Aren't bugs just a limitation of human minds? No, no, they're not. What's the difference between a bug and a variation or an imperfection? If you think about it, if you make a small change to a program, it can result in an enormous change in what the program does. If nature worked that way, the universe would crash all the time. Certainly there wouldn't be any evolution or life. There's something about the way complexity builds up in nature so that if you have a small change, it results in sufficiently small results; it's possible to have incremental evolution.
It's all wrong, buddy. You're stretching the metaphor a bit far if you think you can compare a program and the universe. They aren't equivalent. Or even close. Of course a program can crash and bug out from a simple change in the code - so can humans. Schizophrenia, Diabetes, Cancer.. need I go on? Bugs? Maybe not. But small changes resulting in large results - Yes. For that person. For that program, if you will. Now let's call something large and complex "The Universe". Oh.. I dunno. Ideas, anyone? Like.. oh.. the INTERNET? Yeah. That sounds about right. Still a stretch, but a lot better than calling 3,000 lines of code "The Universe". Anyhow, now we have many thousands of servers (worlds?) interacting.. all running many programs (individuals?).. And what happens when there is a small, tiny change on the internet? Not much. How about when someone drops a worm and lets it wiggle through everything? Well.. some denial of service. Some trashing of networks. Does the whole internet crash? No. Does it self-implode, never to work again? No. But it might evolve.
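To be fair, the narrow point about tiny changes is real enough. Here's a throwaway Python sketch (mine, not Lanier's) - change a single character, a minus to a plus, and a program that counts down and stops becomes one that would run away forever:

    # Toy example: one character separates a program that halts
    # from one that never does.
    def countdown(n, step=-1):
        """Count down from n to 0, moving by 'step' each pass."""
        while n != 0:
            print(n)
            n += step

    countdown(3)            # prints 3, 2, 1 -- then stops
    # countdown(3, step=1)  # the one-character "variation": never reaches 0

That's the kind of brittleness he's talking about - the disagreement is over how far up the scale it goes.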
But by the second half of the interview he has a lot of interesting things to say:
Here's the problem with computers: it's just so much work to think about programs that people treat the details of software as if they were acts of God. When you go to school and learn how to program, you are taught about an idea like a computer file as if it were some law of nature. But if you go back in history, files used to be controversial. The first version of the Macintosh before it was released didn't have files. Instead, they had the idea of a giant global soup of little tiny primitives like letters. There were never going to be files, because that way, you wouldn't have incompatible file formats -- right?
The important thing to look at is how files became the standard. It just happened that UNIX had them, IBM mainframes had them, DOS had them, and then Windows. And then Macintosh came out with them. And with the Internet, because of the UNIX heritage, we ended up thinking in terms of moving files around and file-oriented protocols like FTP. And what happened is that the file just became a universal idea, even though it didn't start out as one.
So, now, when you learn about computer science, you learn about the file as if it were an element of nature, like a photon. That's a dangerous mentality. Even if you really can't do anything about it, and you really can't practically write software without files right now, it's still important not to let your brain be bamboozled. You have to remember what's a human invention and what isn't. And you have to think about files in the same way you think about grocery carts. They are a particular invention with positive and negative elements. It's very important to keep that sense of skepticism alive. If you do that, it will really have an influence on the quality of code that you create today.
Plus, considering he invented the term Virtual Reality.. well.. he might have a few things on me.