Using my iPhone while looking at a PS3 console in Selfridges made me think about what the future could hold for games. While there are companies looking at running games server-side with a VDI-style solution streaming to the end device, I just don't think that stacks up against Moore's Law. But equally, the model of downloading and installing loads of games onto a single device, each with its own anti-piracy tools, doesn't seem to make sense either.
Given that "bare-metal" concepts have been around for a while now, BEA had their bare-metal version of the application server, wouldn't it make sense for games to be a full single virtual image? So you don't boot the OS then start the game, you just boot the image which just contains the game and what ever minimal drivers it needs.
Now some people will point out "what about the drivers?", and they have a slight point. But would it really be so hard to define an image where you select your graphics card et al and it constructs a fully-optimised image just for you? Upgrade your kit and you simply rebuild the image; a rough sketch of that build step is below.
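As a rough illustration, assuming a catalogue that maps hardware components to driver modules (all the names, fields and module files here are hypothetical; this is just a sketch of the selection logic, not any real build tool):

```python
from dataclasses import dataclass

# Hypothetical catalogue mapping a hardware component to the driver
# module that would be baked into the boot image for it.
DRIVER_CATALOGUE = {
    "nvidia-gtx-285": "drv_nv_285.ko",
    "ati-radeon-4870": "drv_ati_4870.ko",
    "creative-xfi": "drv_xfi.ko",
    "realtek-hd": "drv_rtl_hd.ko",
}

@dataclass
class HardwareProfile:
    gpu: str
    audio: str

def build_image_manifest(game: str, profile: HardwareProfile) -> list:
    """List the parts of a single-purpose boot image: a micro-kernel,
    the game itself, and only the drivers this particular box needs."""
    drivers = [DRIVER_CATALOGUE[profile.gpu], DRIVER_CATALOGUE[profile.audio]]
    return ["microkernel.bin", game + ".img"] + drivers

# Upgrading your kit is just a rebuild with a new profile:
print(build_image_manifest("some_game", HardwareProfile("nvidia-gtx-285", "creative-xfi")))
```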
Now others will bleat "but how will I pirate it if it's a custom environment?", and to be honest I don't care; that's your problem.
What are the advantages? Well, from a piracy perspective it clearly reduces the potential, as the game runs on a custom micro-kernel and is tied to a specific license created when you build the image. From a performance perspective it can be tuned to the nines, as there is nothing else running. And from an end-user perspective it means using your PC like a console, or indeed your console like a console, and selecting the game to play at boot-up.
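To make the license-tying concrete, here's a minimal sketch, assuming the image builder can read component serial numbers and holds a vendor signing key (both of those are assumptions, and the function names are mine):

```python
import hashlib
import hmac

def derive_license(game_id, gpu_serial, board_serial, vendor_key):
    """Bind a license to this machine's component serials at image-build time.
    vendor_key is a bytes secret held by whoever builds the image."""
    material = "{}|{}|{}".format(game_id, gpu_serial, board_serial).encode()
    return hmac.new(vendor_key, material, hashlib.sha256).hexdigest()

def license_valid(stored, game_id, gpu_serial, board_serial, vendor_key):
    """Re-derive at boot and compare; an image copied to other hardware fails."""
    expected = derive_license(game_id, gpu_serial, board_serial, vendor_key)
    return hmac.compare_digest(stored, expected)

key = b"vendor-secret"  # hypothetical vendor signing key
lic = derive_license("some_game", "GPU123", "MB456", key)
print(license_valid(lic, "some_game", "GPU123", "MB456", key))  # True on this box
```

The point is only that the check is baked into the image itself rather than bolted on as yet another separate anti-piracy tool.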
Once you start thinking like this, it of course becomes a question of which other applications really require an OS at all, and of isolation via VMs more generally. Windows 7 is doing this with its XP Mode, which really opens up the question of when you don't need Windows for certain applications.
Virtualisation is a server technology that is going to have a massive impact on the desktop.
3 comments:
I can see at least one problem here. Take, as a totally non-random example, the Wii console. An update to the Wii system software a while ago added support for SDHC cards. Games, however, basically have no real access to the system software, and so need to provide all their own hardware-access routines to flash memory. This poses a problem: if you buy an SDHC card because you believe you'll be able to store a ton of downloadable content (DLC), you'll soon discover that most, if not all, games do _not_ support SDHC cards, since most were developed before the Wii itself did. So games like Rock Band 2, Guitar Hero World Tour, etc., while running on a system that nominally supports SDHC cards, still can't access them, period. Which, you know, is kind of a pain if you bought an SDHC card thinking that since the Wii itself supports them, games would too.

With more OS support, this kind of thing is abstracted away. Apparently the Wii took this "why do games need an OS" thing to heart, and since games on the Wii can't download patches like those on the Xbox 360 or PS3, consumers are left in the lurch, with somewhat unusable peripherals that they have every legitimate expectation of working.
I say this as someone in exactly this position, with a mostly worthless 32GB SDHC card...
Another great example: when I got my first PC, before hard drives were even available, several games I had booted straight off floppies. When all the hardware was basically the same (CGA graphics, a clock speed maxing out at 4.77 MHz, the PC speaker as the only sound option), this wasn't much of an issue. With today's diverse hardware, from video to sound to network cards, expecting games to support every variation puts an immense burden on developers.
Can you imagine the amount of bad press when popular game X doesn't support video card variant Y? Especially if Y came out just a few weeks before X's release? Or worse, the added cost for X to support Y, Z, A, B, and C, because it can't make use of standard OS-level drivers?
I would suspect that would be cost-prohibitive...
And from the piracy perspective... there are emulators for every proprietary game system, from arcade hardware (MAME) to the N64, the Atari, etc...
I'd expect the virtual machine environment to be maintained by a third party (maybe even Microsoft), which is where the drivers et al piece would live. The point would be for the card manufacturers to work with that supplier, and then for the games suppliers to create the image.
Effectively, the company that owned the VM creation would have a "console"-type model, but based around PC hardware.
From a piracy perspective, the point is that the environment can be hardware-tied in a way that makes emulation significantly more difficult to achieve.