Microsoft has been driving thought leadership in this area by charging the same amount per processor, regardless of how many cores are in the processor. In contrast, Oracle asks customers to multiply each “core” by different factors depending on processor type. IBM has a dual policy where customers with x86 platforms are charged per processor and customers on IBM’s POWER5-based systems are charged per core.
[From: SQL Server 2008 pricing and licensing, July 2008]
Simplification of the licensing structures is, in my opinion, inevitable. Due to virtualisation and cloud computing, traditional concepts such as servers, CPUs, or individual installations will take on completely new meanings.
Putting the software in the cloud and letting people and businesses pay per usage is one way licensing will change. When software is consumed as a service, you can't really talk about licenses per processor or per PC anymore.
In the past I used IBM's Rational software. The software was locked to a specific PC and you needed to register the PC online with the vendor. This gave the vendor optimal control over the usage of the software but made it practically unworkable for you as the customer. As soon as you wanted to use another PC, you had to go through the drama of releasing the lock and then allocating it to another PC.
Licensing per individual user or per processor doesn't make sense anymore in this day and age. “Concurrent users” was the answer to having a larger but low-intensity group of users. But if you don't let the software block access once this limit is reached, there is no way to control the usage according to the agreement; the IT department wouldn't even know that such occurrences of temporary over-usage had happened.
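To make the point concrete, here is a minimal sketch in Python of what a non-blocking concurrent-use register could look like: it records over-usage events for later audit instead of refusing access. The class, the seat limit, and the log format are all illustrative assumptions, not any vendor's actual mechanism.

```python
from datetime import datetime, timezone

class ConcurrentUseRegister:
    """Hypothetical tracker: logs over-usage instead of blocking access."""

    def __init__(self, licensed_limit: int):
        self.licensed_limit = licensed_limit
        self.active_sessions: set[str] = set()
        self.over_usage_log: list[tuple[datetime, int]] = []

    def start_session(self, user_id: str) -> None:
        self.active_sessions.add(user_id)
        if len(self.active_sessions) > self.licensed_limit:
            # Record the breach so IT (and the vendor) can audit it later,
            # rather than locking the user out.
            self.over_usage_log.append(
                (datetime.now(timezone.utc), len(self.active_sessions))
            )

    def end_session(self, user_id: str) -> None:
        self.active_sessions.discard(user_id)

# Example: 2 licensed seats; a third simultaneous user is let in,
# but the temporary over-usage is recorded instead of going unnoticed.
register = ConcurrentUseRegister(licensed_limit=2)
for user in ("alice", "bob", "carol"):
    register.start_session(user)
print(register.over_usage_log)  # one recorded breach at 3 concurrent users
```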
In the early days of the web, I noticed that it took software vendors a while to respond to the undefined number of users coming in from the Internet.
Now we have the challenge of virtualisation, where Oracle, for example, ignores the whole concept and simply makes you pay for the underlying physical hardware, unless you buy its own virtualisation technology.
I think it is time licensing is simplified, for example based upon volume of data processed or frequency of use – the intensity of use.
Frequency of use (occurrences per annum) is ideal for software that end-users interact with directly, such as desktop software. Heavy users pay more, and if you only use it intermittently, you pay less. There should be no problem registering each occurrence of use in a central register that can be audited by the IT department and the vendor alike. Automatic alerts could be sent to the IT department (or even the vendor) when usage reaches a certain threshold.
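As a sketch of such a central register, consider the following Python snippet. The class name, the per-annum threshold, and the alert hook are hypothetical; the point is only that counting occurrences and raising an alert at a threshold is trivial to implement.

```python
from collections import Counter

class UsageRegister:
    """Hypothetical central register: counts occurrences of use per user."""

    def __init__(self, alert_threshold: int):
        self.alert_threshold = alert_threshold
        self.occurrences: Counter[str] = Counter()

    def record_use(self, user_id: str) -> None:
        self.occurrences[user_id] += 1
        if self.occurrences[user_id] == self.alert_threshold:
            self.alert(user_id)

    def alert(self, user_id: str) -> None:
        # In practice this could notify the IT department or even the vendor.
        print(f"ALERT: {user_id} reached {self.alert_threshold} uses this year")

    def total_per_annum(self) -> int:
        return sum(self.occurrences.values())

register = UsageRegister(alert_threshold=3)
for _ in range(3):
    register.record_use("alice")  # the third use triggers the alert
```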
For databases, application servers and the like, you can easily measure the amount of data being transacted (inserted or retrieved) and base the licenses upon that. This automatically solves the issue of licenses for standby and failover systems (though Microsoft’s approach of providing this for free is even better). If the volume of transactions stays below a certain threshold (and provided you already hold a license somewhere), you should be able to use the software “for free”. If you need to activate your standby system, you essentially transfer the volume of transactions, so there would be no costs involved. Luckily, some vendors have already adopted this logic.
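A minimal sketch of how this could work, again in Python: the free threshold, the per-gigabyte rate, and the failover transfer are assumed figures for illustration only, not any vendor's actual terms.

```python
FREE_THRESHOLD_GB = 100  # assumption: below this, usage is "for free"
PRICE_PER_GB = 0.50      # assumption: illustrative rate

class MeteredDatabase:
    """Hypothetical meter: bills on data transacted, not on CPUs or servers."""

    def __init__(self, name: str):
        self.name = name
        self.gb_transacted = 0.0

    def transact(self, gigabytes: float) -> None:
        self.gb_transacted += gigabytes

    def license_cost(self) -> float:
        billable = max(0.0, self.gb_transacted - FREE_THRESHOLD_GB)
        return billable * PRICE_PER_GB

def fail_over(primary: MeteredDatabase, standby: MeteredDatabase) -> None:
    # Activating the standby transfers the metered volume instead of
    # requiring a second license, so the failover itself costs nothing.
    standby.gb_transacted += primary.gb_transacted
    primary.gb_transacted = 0.0

primary = MeteredDatabase("prod")
standby = MeteredDatabase("standby")
primary.transact(250.0)
fail_over(primary, standby)
print(standby.license_cost())  # same bill as before: (250 - 100) * 0.50
```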
For operating systems, the story is a bit trickier, but for PCs the license is mostly sold as part of the device already. For servers, a simplification should be possible as well. You can’t rely on the definition of a server or a CPU anymore, so I believe the answer should also be sought in something like a transaction-volume or data-volume based pricing model.
I am not the first one to come up with this idea, and there will be a bit more to consider than I describe here, but I don’t believe that the current license models are sustainable in the long run. Organisations need to be able to move their systems around within their own cloud without running the risk of suddenly being in breach of the license agreement. Don’t forget that IT and software is all about virtualisation; it does not live in the real world. Until recently the operating system was at least aware of the physical hardware, but even that has gone now. Software runs somewhere; you no longer know exactly on which server or CPU, and it shouldn’t matter. The only thing that remains is the intensity of use.
Just as in the music industry, the software vendors will resist as much as possible, but eventually Plato’s revenge will catch up with Larry Ellison as well.
Some further reading:
- An argument in relation to processing capacity by Devon Hillard
- Discussion on CIO.com by Thomas Wailgum: The End of Traditional Software Licensing? Not So Fast
- Holger Kisker from Forrester: Traditional Software Licensing Comes To An End