Let's start with a little bit of background. The supercomputing industry is in trouble. For the last few years it's been showing about 3% growth per year. As ignorant as I am of business, even I know that 3% doesn't leave much room to do interesting things, sometimes including paying your employees. In the past 15 years we've seen a lot of the companies that used to make the Big Iron shrink, be acquired, or simply vanish.
Lots of people are looking for the killer app that will make everyone jump up and say "I need a supercomputer and I need it NOW!". We've all seen them happen: Halo for the Xbox, the Web for broadband access at home, TiVo for those execrable "male enhancement" commercials. Well, according to Mr. Rattner, the killer app for supercomputing is the 3D Internet -- or, as he calls it, the 3D Web.
Here is what he was talking about. Virtual worlds like Second Life, World of Warcraft, Final Fantasy Online, EVE Online and the like get very computationally demanding when there's a lot going on in one place. The number of interactions you have to track is proportional to the square of the number of avatars within shouting distance of one another. Some systems (WoW especially) handle this by running several instances of the game world to try to keep the population manageable. Others such as EVE Online throw heavy-duty hardware at the problem and run all the players in the same world all the time. So far this is perfectly normal.
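That quadratic blow-up is easy to see in a toy sketch. Assuming a naive all-pairs update (real engines use instancing, as noted above, or spatial partitioning to do better), the number of interactions to check with n avatars in range of one another is n(n-1)/2:

```python
def pairwise_interactions(n: int) -> int:
    """Interactions a naive simulation must check: one per pair of avatars."""
    return n * (n - 1) // 2

# Doubling the crowd roughly quadruples the work:
for n in (10, 100, 1000):
    print(n, pairwise_interactions(n))
# 10 avatars -> 45 pairs; 100 -> 4,950; 1,000 -> 499,500
```

This is exactly why a crowded battle or market square brings a server to its knees while the same population spread across the map is no problem at all.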
According to his slides, Mr. Rattner's vision is that supercomputers that run between $3M and $300M a shot will run all of these. Moreover, the supercomputer will be responsible for computing what your avatar can see at any time as well as computationally expensive things like accurate simulations of cloth, water and sound. Never mind the graphics card, CPU or physics coprocessor that your computer may have. It'll all happen remotely. By this point I was shaking my head in disbelief. In Gandalf's words, he has left the path of wisdom. In mine, he's out of his mind. But wait! It gets better!
In his next slides Rattner noted that there are a couple of zillion sites out there on the "3D web" today, none of which talk to one another. He compared this to the old days when the Internet was a big research project and "online" meant a walled garden like CompuServe, AOL, Prodigy, GEnie and the like. He didn't mention BBSes at all, which I think is quite a shame, but that's a different rant.
Okay. The early Internet. Walled gardens. He said that two supercomputing guys were responsible for changing all that: Tim Berners-Lee, a researcher at CERN, better known as the inventor of HTTP and HTML; and Marc Andreessen, then a staff member at NCSA, who wrote the first graphical Web browser. He said that when he first saw their work he knew the walled gardens were in trouble. Then he suggested that we (as supercomputing researchers) should go forth and do likewise. The Cloud, nebulous thing that it is, would handle commerce, authentication, identity and such. The 3D Web would live on supercomputers and all sites would interoperate. You'd be able to take your avatar from one site to the next. Tauren from World of Warcraft could boogie alongside Hello Kitty Online avatars in an all-night club in Second Life.
I scarcely know where to begin here. First, it wasn't just the invention of HTML and HTTP that undid AOL and CompuServe, though that was a part of it: it was the ease with which anyone could put up a Web server and start writing pages. You didn't need an elaborate distributed infrastructure, a gigantic, power-hungry supercomputer, or the funding of a major corporation. The Web's innovation was that all you needed was a text editor, a decent paint program, and a computer hooked up to the network.
Second, he's presuming a standard of interoperability that just doesn't exist. Right now, with a very few exceptions, you have a different identity and corresponding credentials on every single web site you visit. You can't take your Amazon account and use it to browse eBay. Things you post on Groklaw don't show up attached to your account on Calculated Risk.
Third, the creation of a lingua franca for 3D worlds has already been tried at least three times -- Open Inventor, VRML and VRML97 -- and has failed each time. Some of this was because of the limited graphics capability and network bandwidth of computers at the time; detailed virtual worlds use a lot of data. Some was because of the lack of good tools for building these worlds, a problem that still isn't solved. These, however, are just technical reasons. If technology were the main obstacle we could knock it out in five years.
The worst hurdle is the fourth one. 3D isn't an app. 3D is a display method. The crucial question he ignored was this: if interoperable 3D with realistic everything on the Internet is the killer app for supercomputers, what's the killer app for 3D with realistic everything on the Internet?
Anyone? Anyone? Bueller?
The fact of the matter is that for most of the things you do with your computer 3D isn't useful. Games? Absolutely. Bring it on. Email? Web browsing? Booking flights? Buying books? Writing papers or presentations? Doing your taxes? Not so much. Looking at a house or a car you might buy? Sure, that would be great! But how much time do you spend doing those things in the average month?
The point is that in my estimation Mr. Rattner was fundamentally wrong on just about every point he made. 3D is shiny, makes pretty pictures, and has totally revolutionized several industries but has not yet proven to be essential to what you and I want to do on the Internet every day. Gigantic, multi-million-dollar, multi-megawatt supercomputers can do some pretty astonishing things, but the Internet runs just fine without using them for front-line servers. At this point the kind of interoperability he envisions is far more of a social problem than a technical one.
If he truly believes that the "3D Internet" is the future of supercomputing then he's already hosed. I hope Mr. Rattner is a better businessman than he is a visionary.
So. That was my morning. The next two should be considerably more interesting. We'll see.