Reading an article at El Reg on Oracle's licensing model for Sun, and thinking back to my comments there on SOA pricing models, it struck me that there is a fundamental disconnect between the direction IT is taking and the pricing to get there.
When you move into the virtualisation world the problem becomes even greater. Let's say I have a blade server with 1,000 x 2-core CPUs, and on top of that I virtualise ten different grid environments, each presented as 500 x 4-CPU boxes. At peak (e.g. for the analytics piece at tax time) I want to run one of those environments at 100% for one day, but the rest of the time it takes 1% of the grid or less. The loading profiles differ across the environments, but overall I average about 90% utilisation on the blade server.
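To make the oversubscription concrete, here is a rough back-of-the-envelope sketch using the numbers from the scenario above (the figures are illustrative, straight from the example, not a real sizing exercise):

```python
# Back-of-the-envelope numbers for the virtualised scenario above.
physical_cores = 1000 * 2        # blade server: 1,000 x 2-core CPUs
envs = 10                        # virtual grid environments
virtual_cpus_per_env = 500 * 4   # each environment sees 500 x 4-CPU boxes

total_virtual_cpus = envs * virtual_cpus_per_env
oversubscription = total_virtual_cpus / physical_cores

print(f"{total_virtual_cpus} virtual CPUs on {physical_cores} physical cores "
      f"({oversubscription:.0f}:1 oversubscription)")
# → 20000 virtual CPUs on 2000 physical cores (10:1 oversubscription)
```

Note that a single environment at full tilt already maps onto all 2,000 physical cores, which is exactly where the licensing problem below bites.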
A smart, and environmentally sound, use of the hardware. But here is where the business case breaks down.
Every single software vendor has a scale-up, not a scale-down, model.
This means that if you are going to use 2,000 cores at peak then you have to license all 2,000 cores, even if that peak lasts only 60 seconds. The hardware savings are completely eliminated by the software costs. So instead you build "adequate" solutions that can't scale to the peak in the most efficient way (the tax calculation takes two weeks instead of one day), but at least you aren't being burnt.
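The gap between the two models is easy to put numbers on. A minimal sketch, assuming the scenario above (one day at 100% of 2,000 cores, the rest of the year at 1%) and a purely hypothetical per-core licence fee:

```python
# Illustrative comparison of peak-based vs capacity-based licensing.
# The per-core price is a made-up figure; only the ratio matters.
CORES = 2000            # cores needed at the one-day peak
PRICE_PER_CORE = 1000   # hypothetical annual licence fee per core
DAYS = 365

# Peak-based model: license every core you might ever use.
peak_cost = CORES * PRICE_PER_CORE

# Capacity-based model: pay for average core usage instead.
# One day at 100% of 2,000 cores, the other 364 days at 1%.
avg_cores = (1 * CORES + (DAYS - 1) * CORES * 0.01) / DAYS
capacity_cost = avg_cores * PRICE_PER_CORE

print(f"peak-based licence:     {peak_cost:>12,.0f}")
print(f"capacity-based licence: {capacity_cost:>12,.0f}")
print(f"ratio: {peak_cost / capacity_cost:.0f}x")
```

Under these (invented) numbers the peak-based licence costs roughly 80 times the capacity-based one, which is why the analytics job ends up running for two weeks on a small, fully-licensed box instead of one day on the grid.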
This is one of the biggest challenges right now in deploying Services Fabrics and in consolidating enterprise infrastructures. The solution is a more capacity-oriented model, which of course the vendors tend not to like, as it will inevitably mean less revenue: there is currently a lot of paid-for, but rarely used, capacity.
Open Source software changes the dynamic somewhat, but it covers only part of the estate. The question, therefore, is how software licensing will change in a heavily virtualised world. Because if it doesn't, that virtualised world won't happen.