In the past I have written and spoken about what I called “Moore’s Wall,” which could be summarized as the notion that expanding computing capabilities keep raising the bar for content, which in turn drives up costs and development times without actually producing better products.
Well, Toshiba just announced a TV at CES that circumvents this wall in a clever way. The TV has a Cell chip in it, which makes it outrageously powerful for a TV. So powerful, in fact, that it can do silly things with the extra processing power, such as interpolating frames or applying special video effects.
Or rendering the image twice at full speed, so that it can turn any signal into a 3D image.
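To make “interpolating frames” concrete: the simplest version is synthesizing new in-between frames by blending adjacent ones, doubling the frame rate of existing footage. Here is a minimal Python/numpy sketch of that naive approach; to be clear, this is purely illustrative, and real sets (Toshiba’s presumably included) use far fancier motion-compensated algorithms that Toshiba hasn’t published.

```python
import numpy as np

def blend_frames(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    """Naively interpolate between two frames.

    frame_a, frame_b: H x W x 3 uint8 RGB frames.
    t: position between the frames; 0.0 gives frame_a, 1.0 gives frame_b.

    Note: a real interpolator estimates per-pixel motion vectors instead
    of blending in place; this only shows the basic idea.
    """
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    return ((1.0 - t) * a + t * b).astype(np.uint8)

def double_frame_rate(frames: list) -> list:
    """Turn, e.g., 30 fps footage into 60 fps by inserting a midpoint
    blend between every pair of consecutive frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(blend_frames(a, b, 0.5))  # synthesized in-between frame
    out.append(frames[-1])
    return out
```

The point of the sketch is just that nothing about the source content changes; the extra compute manufactures new frames from frames you already had.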
In effect, this sidesteps the problem in Moore’s Wall: instead of demanding higher-caliber content, the TV uses the extra computing power to transform the content we already have.
I like this approach, in part because it has a lot in common with ideas about standard formats and the like. But it also suggests a parlor game: what would <insert device here> be like with insane computing power but no changes to the rest of the technology? We have started to see glimmers of this in the way phones and iPods have been changing, of course, and the idea of networked fridges that detect spoiling food has been around forever… but I am wondering about things like this TV, which seem to magically upgrade everything we already had.