OnLive: digital distribution play
Mar 24, 2009
Just yesterday I was talking with a reporter from the Wall Street Journal about how more publishers needed to get on the digital distribution bandwagon… and today, I see an article on CNet entitled OnLive could threaten Xbox, PS3, and Wii.
What is it? It’s a tiny box for your TV, or a service client for your PC. With a high enough bandwidth network connection, it lets you simply play standard PC games remotely. The games run on remote servers, and you are streamed the rendered screen, so your client hardware doesn’t matter. The company claims no lag thanks to amazing compression.
A whole bunch of publishers have signed up…
26 Responses to “OnLive: digital distribution play”
It sounds almost like a video version of the Citrix remote connection client/server architecture. It’s going to have to have some SERIOUSLY good compression to make streamed games playable. I remain very skeptical.
Yeah, I’m not sure you’d ever be able to play a game like Crysis or Far Cry remotely (in our time, anyway :P), but for smaller games it could work. I’m not against the idea; I just think that when games are upwards of 20 GB installed on your machine for content and graphics reasons, it’d be pretty hard to stream all that.
MMOs have enough issues with just sending character update information.
They must have just done a press conference or something. I have received info on this from at least five different vectors inside of one hour.
ArsTechnica.com
Twitter.com
RaphKoster.com
Email from a friend pointing to a PC World blog
Email and two Twitters hit my mobile phone.
Impressive press release. They seem to be lighting up the Internets.
Wait a second on the compression required. There are plenty of companies out there doing HD video, and games are no different from a streaming perspective than those movies. The reported 5 Mbit requirement is about right if you compare their service to Netflix HD streams and the like. You’d want an 8 – 10 Mbit+ connection to your house, but this service could use 5 Mbit easily. If we can stream 720p movies, then we can stream 720p games.
The real question is: can it be done in a way where the game remains playable? Input latency, especially in a shooter, is paramount. Can they reduce the input latency so that it “feels” like you’re playing the game on a local computer? Forget synchronizing multiplayer access to the stream if the single-player version can’t be accomplished. It’s the input lag that concerns me the most.
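To make the input-lag concern concrete, here is a rough latency budget for one streamed frame. Every number below is an illustrative assumption for the sake of the sketch, not a measurement of OnLive’s system; the point is that the stages add up, and the sum has to stay under what players perceive as “local” responsiveness (often cited at around 100 ms).

```python
# Illustrative (not measured) latency budget for a streamed game frame.
# All stage timings are assumptions chosen for the example.
budget_ms = {
    "input capture + uplink": 15,
    "game simulation + render": 16,   # one frame at ~60 fps
    "video encode": 10,
    "network transit (one way)": 25,
    "video decode": 10,
    "display refresh": 16,
}

total = sum(budget_ms.values())
print(total)  # 92 ms under these assumed numbers
```

Even under these fairly generous assumptions, almost the entire perceptual budget is spent, and any one stage running slow (a distant datacenter, a slow encoder) pushes it over.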
That’s a move I was waiting for.
It is indeed not an easy technical challenge. But if they manage to pull it off for simple games, it should then be relatively easy to scale up to more graphics-intensive games.
And with this kind of architecture, security becomes an easier task, multiplayer within the host should be easy and efficient to implement, and huge optimizations could be made: a 3D model, for instance, only needs to be loaded once for all players.
Then of course, it all depends on whether you can get the right hardware to begin with and create a viable business.
Sounds quite like G-Cluster, who did this close to 10 years ago for mobile devices. Dunno what’s up with them these days; I just remember it because a couple of ex-colleagues were involved with developing the tech for it back in the day…
I tested this technology a couple of years ago. The service had PlayStation and Xbox games, which worked surprisingly well. Granted, there was some input lag.
They had something like Colin McRae Rally and other fast-paced games, which actually were playable. Graphics were a little fuzzier than when actually playing on the console itself.
I’m sceptical…
I’ll keep my physical games.
HOWEVER…
If they were to get backing from a company… like Sega…
I might be able to get access to games that are usually very, very hard to find, even on eBay.
Dreamcast games…. Mmmm
There’s a little more info in this EE Times article:
http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=216200305&pgno=1
Looks like they’re using a huge array of ASICs (100+) to do the compression of the video stream. So this is going to be a very expensive server environment, but I don’t see any real technical issues that sound insurmountable.
The codec for decoding the stream will be very interesting, though.
100 cores (which is what the article mentions) is not the same as 100 ASICs. The Nvidia GTX 280 in my computer has 216 cores in one ASIC.
Yeah, honestly, input lag is the single biggest hurdle here. Nothing else is particularly difficult as long as the server is fast enough and the codec is good. The bandwidth requirements are high enough that this will put it out of the reach of a lot of people right now (not all broadband is capable of 5 Mbit yet), but that’ll improve over time.
Even if this one attempt doesn’t succeed wildly, something like this seems inevitable in the long term. It solves so many problems for developers and consumers all at once, but it’ll basically put the hardware folks out of business in the process, which could be bad. On the other hand, using a superpowered cluster for game development has its upsides; you can start building stuff that exceeds anything we’d be able to do with discrete hardware, and still have a market for it.
Ye olde diskless workstation.
I remember (circa 1987?) when someone (IBM, Novell, etc.) was touting computers without hard drives that booted off the LAN. They didn’t require DOS. Never worked.
Then in 1997(?) someone (Oracle? Sun?) tried the same thing, commenting about total cost of ownership of PCs, and how if they didn’t have HDs and were just dumb terminals, they’d be cheaper to run. They ran Java. Never worked.
And internet set-top boxes (1998?), where people would browse the internet on their TV.
Basically, if you want to play games on your TV (without an Xbox, etc.), just go buy one of Asus’s(?) micro PCs (basically a DVD drive, Atom processor, a bit of flash memory) for $200-$300(?). I’m sure they can make one without the DVD drive for $50 less. The only difference is that PCs have a sucky UI compared to video game machines.
Uhm, best of luck to them and all, but like everyone else here it seems, I’m skeptical at best. That said, I do highly support digital distribution in general, but I cannot help but see this as a new platform that is going to have to do a lot of work in selling itself. Although, good for them in getting a lot of publishers, if that is indeed the case.
I prefer to play games on PC
[…] GDC so far, and while I may not be attending this year, everyone else on the Internet seems to be talking about it so far, so I feel like I should […]
It’s all a matter of timing, though. Already some of the news articles are leading with “Play XXX hot game on your netbook!” Before, there wasn’t much demand for any of those setups. Now, the demand is already there (if the trend towards netbooks and cheaper computers continues).
@Mike, eh. How about wanting to play games on your PC without spending 500 dollars on a high end video card?
Or wanting to use your cheap, super power-efficient netbook as a gaming device?
And honestly, terminal machines have had a long and successful history in certain contexts, especially back when it was too expensive to have decent processing power at every local machine. With rising hardware costs, we’re running into a similar situation (gaming hardware is too expensive for many people to afford), but now we can broadly distribute our terminals from the main servers thanks to broadband internet.
Eolirin…
Netbooks are a separate issue. I think games should be designed for them. I think game developers will feel the same as the number of netbooks increases.
Rising hardware costs??? Hardware costs are falling all the time. It’s the latest games that are designed for the latest graphics cards that require you to buy new hardware.
The games that OnLive is talking about (to be flippant) require so little CPU that they could run on an iPhone. I suspect they’re spending more CPU compressing at 30 FPS than the games take generating the 30 FPS in the first place. Plus, they are using HUGE amounts of bandwidth, which still costs money.
On a server, every gigabyte is around $0.10. From the customer’s POV, every gigabyte costs them $2-$3 (at least in Australia, where plans are limited to 14 GB/month for $50-ish, with extra charges beyond that and/or severely limited bandwidth).
A family’s 2.2 kids can easily use 4 hours per day x 31 days ≈ 120 hours. Assume 1 gig of video per hour (DVD quality is more like 2 gig/hr… and by the way, DVD quality isn’t good enough for fast action games). That’s 120 gigs. Will your ISP let you use 120 gigs a month at the standard rate? At $2-$3 per gig, that’s $240-$360.
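The monthly-cost arithmetic can be sketched in a few lines, taking the comment’s own rough assumptions (about 120 hours of play a month, roughly 1 GB of streamed video per hour, and metered bandwidth at $2-$3 per GB on an Australian-style plan):

```python
# All figures are the comment's own rough assumptions, not measured data.
hours_per_month = 120      # ~4 hours/day of play
gb_per_hour = 1            # ~1 GB of streamed video per hour
cost_per_gb = (2, 3)       # $2-$3/GB on a metered plan

total_gb = hours_per_month * gb_per_hour
low = total_gb * cost_per_gb[0]
high = total_gb * cost_per_gb[1]
print(total_gb, low, high)  # 120 GB/month, costing $240-$360
```

Any of those inputs could easily be off by a factor of two, but the conclusion is robust: on metered bandwidth, the data charges dwarf a plausible subscription fee.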
And that doesn’t include the cost of the hardware at the remote site, where you’d need one game device per 4(?) subscribers to meet peak-time demand. Don’t forget, you also need an extra CPU (or more) to compress the images to send over the internet, and AC to cool the server room. It means that, at best, OnLive gets a 75% hardware cost saving, but probably more like 50%.
For those interested, here is the link to the GameSpot video:
http://au.gamespot.com/shows/on-the-spot/?series=on-the-spot&event=on_the_spot20090324
It’s good news for people with unmetered fat pipe. Does that describe a big enough segment of the base to really represent a serious challenge to traditional game distribution? I don’t doubt that it’s viable, but I do question whether it’s going to be mainstream any time soon.
I hope I’m wrong. Something like Second Life would be a LOT smoother if it was predigested by a high-octane rendering machine and fed out as video. Any environment that is subject to frequent and unpredictable changes in topography could benefit, which might be a boon for user-created content.
Well if it is viable, we might expect optical fiber to spread faster in response to their offer.
This sounds very intriguing, but as far as the video streaming goes, 720p streaming works because of caching, and in a real-time game, caching is out of the question.
I could see this for single player games, maybe. But for Multi-User games I think synchronizing the different clients so that it plays “real time” will be a very daunting task.
If they accomplish it I would be very impressed.
I’m watching the video now…
– 1.5 megabit = 675 megabytes per hour of play. This is slightly cheaper than my 1 gig/hour estimate. 720p at 4 megabits = 1.8 gigabytes per hour.
– This is no $10M startup. A LOT of money has been spent on this.
– They seem to have done a good job doing what they set out to.
– But until bandwidth becomes cheap enough to stream 120 hours of video per month, I don’t see how this will work.
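The bitrate-to-data conversion in the first bullet above is easy to check: megabits per second, times seconds per hour, divided by 8 bits per byte.

```python
# Back-of-the-envelope conversion from stream bitrate to data per hour,
# matching the figures quoted in the comment (1.5 Mbit/s SD, 4 Mbit/s 720p).

def megabytes_per_hour(mbit_per_sec: float) -> float:
    """Megabits/sec -> megabytes per hour of play (8 bits per byte)."""
    return mbit_per_sec * 3600 / 8

print(megabytes_per_hour(1.5))  # 675.0 MB/hour
print(megabytes_per_hour(4.0))  # 1800.0 MB, i.e. ~1.8 GB/hour
```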
Indeed, G-cluster has done this already for almost 10 years, so it’s nothing new. Their strategy is a bit different, though: the target audience is not hard-core gamers, since they have a PS3 anyway and the quality of a 1.5 Mbit stream is not sufficient for those gamers. Secondly, G-cluster sells through IPTV operators that have dedicated networks, and there is no need for yet another box (microconsole), since the existing IPTV STB acts as the console.
Let’s say that the microconsole and gamepad cost 200 USD to manufacture and the monthly subscription fee of the service is 10 USD; it takes 20 months to even pay back the hardware in the home, never mind the hardware in the datacenter…
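The payback arithmetic, using the comment’s assumed numbers (a $200 build cost and a $10/month fee, both guesses rather than known figures):

```python
hardware_cost_usd = 200   # assumed manufacturing cost of microconsole + gamepad
monthly_fee_usd = 10      # assumed subscription price

months_to_payback = hardware_cost_usd // monthly_fee_usd
print(months_to_payback)  # 20 months, before counting any datacenter hardware
```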
I have not heard any facts on how many servers they need to run this service for, let’s say, 10k simultaneous users. Just for comparison, you need only one VOD server for 10,000 concurrent streams.
Or where this super datacenter is located that can serve all users, i.e. who can actually access the service with decent latency.