
Thursday, October 01, 2009

Why do games need an operating system?

Using my iPhone and looking at a PS3 console in Selfridges made me think about what the future could hold for games. While there are companies looking at doing server-side games with a VDI-style solution to the end device, I just don't think that stacks up against Moore's Law. But equally, the model of downloading and installing loads of games onto a single device, each with its own anti-piracy tools, doesn't seem to make sense either.

Given that "bare-metal" concepts have been around for a while now (BEA had a bare-metal version of their application server), wouldn't it make sense for a game to be a full single virtual image? So you don't boot the OS and then start the game; you just boot the image, which contains only the game and whatever minimal drivers it needs.

Now some people will point out "what about the drivers?", and there is a slight point there. But would it really be so hard to define an image where you select your graphics card et al. and it constructs a fully-optimised image just for you? Upgrade your kit and you just upgrade the image.
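As a purely illustrative sketch (the class name, driver table and hardware ids below are all invented, not any real tool), an image builder along these lines would just pick the driver modules that match the player's hardware and bundle them with the game and a micro-kernel:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical sketch only: everything here is made up to illustrate
// "select your graphics card et al and get a custom-built image".
public class GameImageBuilder {

    // Invented lookup table: detected hardware id -> driver module to bundle.
    private static final Map<String, String> DRIVER_MODULES = Map.of(
            "nvidia-gtx280", "nv_video.ko",
            "ati-hd4870",    "ati_video.ko",
            "realtek-hda",   "hda_audio.ko"
    );

    // Keep only the drivers that match the player's actual hardware.
    static List<String> selectDrivers(List<String> detectedHardware) {
        return detectedHardware.stream()
                .filter(DRIVER_MODULES::containsKey)
                .map(DRIVER_MODULES::get)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> hardware = List.of("nvidia-gtx280", "realtek-hda");
        // The bootable "image" is just the micro-kernel, the game and these drivers.
        System.out.println("micro-kernel + game.bin + " + selectDrivers(hardware));
    }
}
```

Swap the graphics card and you simply rebuild the image against the new hardware list; nothing else changes.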

Now others will bleat "but how will I pirate it if it's a custom environment?" and to be honest I don't care; that's your problem.

What are the advantages? Well, from a piracy perspective it clearly reduces the potential if the game is running on a custom micro-kernel and is tied to a specific licence created when you build the image. From a performance perspective it can be tuned to the nines, as there is nothing else running. From an end-user perspective it means using your PC like a console, or indeed your console like a console, and selecting the game to play at boot-up.

Once you start thinking like this, it becomes a question of which other applications don't really require an OS and could rely instead on isolation via VMs. Windows 7 is doing this with its XP Mode, which really opens up the question of when you don't need Windows for certain applications.

Virtualisation is a server technology that is going to have a massive impact on the desktop.

Friday, May 09, 2008

JPC - the most mentally brilliant thing I saw at JavaOne

At JavaOne you always see some crap presentations, and you see some great presentations on things that you will never actually use in the real world. Then occasionally you wander into a presentation where people have done something in Java that is truly mental but actually has a point.

Welcome to JPC, the Java PC emulator. Yup, you can run an x86 PC on top of a JVM, including running Linux. Okay, so it isn't fast, but it does give a great demonstration of how mentally powerful modern machines are. The clever bit is the work they have done on compiling the x86 code into JVM code.

The presenters did a good job of describing a very complex area, compiler design, and the pieces that they had built.
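To make the general idea concrete (this is not JPC's actual design or API, just a toy sketch of dynamic translation): translate a straight-line block of guest code once into something the JVM can run directly, cache it by address, and reuse it instead of interpreting instruction by instruction.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

// Toy illustration of dynamic binary translation, not JPC itself: guest blocks
// are "compiled" once into host-executable form and cached by guest address.
public class ToyTranslator {

    // Minimal guest state: a program counter and a single register.
    static class GuestState {
        int pc;
        int reg;
    }

    // Fake guest program: opcode 1 = increment reg, opcode 0 = halt.
    private final int[] guestCode;
    // Translation cache: guest address -> compiled block.
    private final Map<Integer, Consumer<GuestState>> cache = new HashMap<>();

    ToyTranslator(int[] guestCode) {
        this.guestCode = guestCode;
    }

    // Translate the run of increments starting at 'addr' into one host action.
    private Consumer<GuestState> translateBlock(int addr) {
        int increments = 0;
        int end = addr;
        while (end < guestCode.length && guestCode[end] == 1) {
            increments++;
            end++;
        }
        final int inc = increments;
        final int nextPc = end;
        return state -> {       // the "compiled" block
            state.reg += inc;   // whole block collapsed into a single add
            state.pc = nextPc;
        };
    }

    void run(GuestState state) {
        while (state.pc < guestCode.length && guestCode[state.pc] != 0) {
            cache.computeIfAbsent(state.pc, this::translateBlock).accept(state);
        }
    }

    public static void main(String[] args) {
        GuestState s = new GuestState();
        new ToyTranslator(new int[]{1, 1, 1, 0}).run(s);
        System.out.println("reg = " + s.reg);   // prints 3
    }
}
```

The real thing obviously has to cope with the full x86 instruction set, self-modifying code and device emulation, which is what makes the performance figure further down so impressive.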

Why isn't this just another crazy concept that you will never use? Well, first off, it means that you can run x86 on any JVM platform. This is important because, in 20 years' time, will x86 code from 1990 still run on the modern hardware of the day? Quite probably not, so it is a great addition for the long-term security of archives. The other place where it wins over a virtualisation solution is that you can run it as a minor slave on a box rather than having to virtualise everything. If you have some ancient DOS programme that just processes and dumps a file, or has a very basic green-screen interface, then you don't have to virtualise the entire platform before you can run it; you can just run a JVM and have the application running alongside the main OS. This could be quite a nice way of deploying those old crappy DOS applications on the new shiny hardware, and doing that in a way that doesn't require expensive virtualisation of all those new shiny terminals.

It also has a great case (which is why a bunch of physics people did it) around grid computing, as a way to provide a more scalable approach to distributing applications that can utilise downtime without requiring a local install, while giving a nice secure environment (the JVM) for that grid code to operate in. This is one of the few (hell, I think it's the ONLY) practical ways I've seen to deploy a multi-purpose grid in a secure, sandboxed environment on any hardware (they demoed it on mobile phones, sort of like the iPod supercomputer concept but without the hardware hack).

In the presentation they actually set up a grid with people in the room; that is confidence in quality.

Oh, and they are claiming they can get to 50% of native machine performance.... Java is so slow that it runs a PC at half speed.... in a browser window.

Mental, very clever and something you could even see a use for.


Thursday, May 01, 2008

VM backup problems

One of the best things about doing my work in a virtual machine environment is that you can take a full backup of the machine and, if there are issues, roll back to it. This has worked really well when I've been installing software that tended to trash Windows, but I came across a big problem today as I tried to resolve my space problem.

I decided to revert to a saved VM from 12 months ago, then just take the security updates and copy the more recent files I had over the top.

Err, slight problem. My work policy is a new password every month, and I pride myself on having passwords that are tough to break.

So yes, I have the VM. Yes, it starts up... but can I remember the password? Can I bollocks. So it's into work to connect to the network to get the "right" tokens and password. The point, however, is clear.

If you are backing up a VM... make sure that you have a local admin account set up as well, otherwise it's just a nice set of files that give you a pointless login screen.




Friday, April 25, 2008

Software licensing in a virtual world

Reading an article at El Reg on Oracle's licensing model for Sun, and thinking about my comments on SOA pricing models, there appears to be a fundamental disconnect between the direction that IT is taking and the pricing to get there.

When you move into the virtualisation world the problem becomes even greater. Let's say that I have a blade server with 1,000 x 2-core CPUs, and on that I virtualise up ten different grid environments, each made up of 500 x 4-CPU boxes. Now at peak (e.g. for the analytics piece at Tax time) I want to run one of those environments at 100% for 1 day, but the rest of the time it takes 1% of the grid or less. Different loading profiles occur for all the environments, but overall I average about 90% utilisation on the blade server.

A smart, and environmentally sound, use of the hardware. But here is where the business case breaks down.

Every single software vendor has a scale-up, not a scale-down, model.

This means that if you are going to use 2,000 cores at peak then you have to license for 2,000 cores, even if that peak is only 60 seconds long. The result is that the hardware savings are completely eliminated by the software costs. So you just build some "adequate" solutions that can't scale to the peak in the most efficient way (so the Tax calculation takes two weeks instead of one day), but at least you aren't being burnt.
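To put some entirely made-up numbers on it (the per-core licence fee below is an assumption for illustration, not any vendor's actual price), compare licensing the one-day peak against licensing the capacity the environment actually averages:

```java
// Made-up numbers purely to show the gap between peak-based and
// capacity-based licensing; the per-core fee is an assumption.
public class LicenceMaths {
    public static void main(String[] args) {
        int peakCores = 2000;          // the one-day Tax-time burst
        double pricePerCore = 1000.0;  // hypothetical per-core licence fee

        // Scale-up model: pay for the peak, however briefly you hit it.
        double peakLicensed = peakCores * pricePerCore;

        // A capacity-oriented model: pay for what the environment averages,
        // here 1% of the grid for 364 days and 100% for 1 day.
        double averageCores = (0.01 * peakCores * 364 + 1.00 * peakCores * 1) / 365;
        double usageLicensed = averageCores * pricePerCore;

        System.out.printf("Licensed for peak:    %,.0f%n", peakLicensed);   // 2,000,000
        System.out.printf("Licensed for average: %,.0f%n", usageLicensed);  // about 25,000
    }
}
```

That is roughly an eighty-fold gap between what the environment actually uses and what a peak-based model charges for, and that is before the other nine environments sharing the same blades are counted.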

This is one of the biggest challenges right now in deploying Services Fabrics and in consolidating enterprise infrastructures. The solution is a more capacity-oriented model, which of course the vendors tend not to like, as it will inevitably mean less revenue: there is currently a lot of paid-for, but rarely used, capacity.

Open Source software clearly changes the dynamic somewhat, but it covers only some of the estate. The question therefore is: how will software licensing change in a heavily virtualised world? Because if it doesn't, that virtualised world won't happen.

