3pointD.com has more on that mainframe project, and I still don’t get it, because of some rather odd statements in the interview.
It will “rely on the Cell’s processor for rendering.” Uh, a metaverse with a specific processor requirement for clients? Or do they mean server-side rendering, which seems unlikely and a bad idea anyway: streaming fully rendered frames to every client eats bandwidth and puts a network round trip into every camera move.
“We see this technology moving into banking and retail and anything where the consumer is involved in a transaction of commerce that they would today do over the Web, online shopping, online banking.” Banking? Does anyone at all want to do their banking in a virtual world? What value does a virtual world add to banking? Being able to interact with a menu on an ATM was a vast improvement in efficiency and accuracy for the typical person going to a bank.
Their big concern? “The problem is that rendering is kind of weak. We haven’t figured out how to accelerate that yet, and how to marry that to transactions.” Rendering is the weak point for online worlds? Even though it’s been shown over and over that customers just don’t give a damn about the visual representation? On the other hand, there’s a comment about clients for mobile phones, music players, TVs, and so on. Or maybe they mean the timeliness of rendering under latency, which would fit their remark about marrying rendering to transactions…
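If latency really is the issue, the usual answer in online worlds is to decouple presentation from authority: render the outcome optimistically so the world stays responsive, then reconcile once the authoritative server commits the transaction. Here’s a minimal sketch of that pattern in TypeScript; every name in it (StorefrontClient, LedgerService, and so on) is my own illustration, not anything from IBM’s or Hoplon’s actual systems:

```typescript
// Hypothetical sketch: rendering never waits on the transaction;
// only the finality of the result does.

interface Purchase {
  itemId: string;
  price: number;
}

interface LedgerService {
  // Authoritative server-side commit; resolves true on success.
  commit(txId: string, p: Purchase): Promise<boolean>;
}

class StorefrontClient {
  private pending = new Map<string, Purchase>();

  constructor(private ledger: LedgerService) {}

  async buy(txId: string, p: Purchase): Promise<void> {
    // Show the result immediately, despite network latency.
    this.pending.set(txId, p);
    this.renderProvisional(p);

    // Round-trip to the authoritative server.
    const committed = await this.ledger.commit(txId, p);
    this.pending.delete(txId);

    if (committed) {
      this.renderConfirmed(p);
    } else {
      this.renderRollback(p);
    }
  }

  private renderProvisional(p: Purchase) { /* grey out item, show spinner */ }
  private renderConfirmed(p: Purchase) { /* hand item to the avatar */ }
  private renderRollback(p: Purchase) { /* restore shelf, show error */ }
}
```

In that reading, “marrying rendering to transactions” is less a rendering-speed problem than a reconciliation problem, which would make the quote a little less strange.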
From details in the article, it seems this is both a hardware solution and a product suite designed to run on that hardware: mention is made of Hoplon’s messaging and physics solutions, of a billing system, and so on. I’m not clear to what degree you’re expected to write your own virtual world atop all this, or whether IBM plans to build the server software too.
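For what it’s worth, here’s how I’d guess the layering looks if the platform supplies messaging, physics, and billing as services and the customer supplies the world logic. Every interface below is a guess on my part, not IBM’s actual API:

```typescript
// Guesses at what a hosted metaverse platform might expose;
// nothing here is taken from IBM's or Hoplon's real products.

interface MessagingService {
  broadcast(channel: string, payload: unknown): void;
}

interface PhysicsService {
  step(dtSeconds: number): void; // advance the server-side simulation
}

interface BillingService {
  charge(accountId: string, cents: number): Promise<boolean>;
}

// The layer a customer would presumably still have to write:
// world rules sitting on top of the platform services.
class MyVirtualWorld {
  constructor(
    private msg: MessagingService,
    private physics: PhysicsService,
    private billing: BillingService,
  ) {}

  tick(dtSeconds: number): void {
    this.physics.step(dtSeconds);
    this.msg.broadcast("world-state", { t: Date.now() });
  }

  sellItem(accountId: string, priceCents: number): Promise<boolean> {
    // Commerce bottoms out in the platform's billing system,
    // not in the world code itself.
    return this.billing.charge(accountId, priceCents);
  }
}
```

Whether IBM intends to write that top layer too, or leave it to customers, is exactly the question the article leaves open.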