This is a classic technical argument, and one that is hugely divorced from business reality. Having the perfect solution for a business problem does not mean that the solution has the "best" technical architecture; it means that it is good enough for the job.
Scalability is the easiest place to over-engineer solutions. Back in 2001 I architected a solution that used stateful web services, because the web service provided a security session into the back-end application (basically, if you didn't authenticate then the service wasn't connected to the backend). It worked a treat and scaled fine, because there was only one consumer of the service: a call centre with a single point of interaction, and all their requests went through that one session. It worked, it went live. Would it scale to 10,000 users? Nope. But that was NEVER in the business case and was NEVER going to happen.
By separating the backend from the information exchange, it becomes possible to have different interfaces on the same logic that provide different scaling approaches. All too often, however, people want to architect the whole system around that information exchange.
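A minimal sketch of that separation, with all names, credentials, and token checks invented purely for illustration: the business service knows nothing about transport, so a stateful session facade (like the 2001 call-centre case) and a stateless per-request facade can sit on the same logic and scale independently.

```python
class OrderService:
    """Business service: pure logic, no information-exchange concerns."""

    def place_order(self, customer_id, item):
        return {"customer": customer_id, "item": item, "status": "accepted"}


class StatefulSessionFacade:
    """Information exchange for one trusted consumer (e.g. a call centre).

    Holds a session: authenticate once, then all requests flow through it.
    """

    def __init__(self, service):
        self.service = service
        self.authenticated = False

    def login(self, credentials):
        # Toy credential check, stands in for the real security session
        self.authenticated = credentials == "call-centre-secret"
        return self.authenticated

    def place_order(self, customer_id, item):
        if not self.authenticated:
            raise PermissionError("authenticate first")
        return self.service.place_order(customer_id, item)


class StatelessFacade:
    """A different exchange style on the same logic: token checked per
    request, so it can be load-balanced across many instances."""

    def __init__(self, service):
        self.service = service

    def place_order(self, token, customer_id, item):
        if token != "valid-token":  # toy per-request check
            raise PermissionError("bad token")
        return self.service.place_order(customer_id, item)
```

Either facade could be swapped in later without touching `OrderService`; the scaling decision lives entirely in the information-exchange layer.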
Split the information exchange from the business services, and worry about the scaling that is appropriate for your information exchange. Don't worry about technical purity or some "wonder" architectural approach. Don't over-engineer because doing X (or R) would scale to 100,000 users when your requirements say "6".
Business requirements should drive your decisions on scalability, not a technical discussion on what is possible. Scale to what is needed, not to what is dreamt.
Obsession with technical purity is a major challenge in IT, and it's unlikely the business will take IT people seriously while they continue to obsess about things that have no business requirement.
Technorati Tags: SOA, Service Architecture
2 comments:
It seems that having "full security", "massive scalability" or "generic services" (another popular ideology among seniors: "no, we are experienced enough to make it generic; it will work with any data you want to use years from now") in mind should also raise an alarm about bad requirements or, more commonly, about bad sharing/explaining of requirements. Which requirement is more frequent: "there are 10 users or something, maybe a bit more" or "there are no more than 20 users"? If I were the client, I wouldn't constrain myself that easily and would prefer the first variant :)
I agree about the requirements challenge, but when you put the cost tag for them it tends to focus the mind wonderfully on non-functional requirements. All too often in IT I see people allow loose requirements like "more than 100", which means they scale the system for 500 to give it "room to grow" and after 3 years of operation the average is around 10 and the peak 20.
Just remind the person creating the requirement that for every X% they have over-estimated it will cost Y, and that the cost of under-estimating is Z. Make it visible who created the requirement, rather than something that just evolves.
My point was that scaling should be based on the requirements, not based on "what can be done" with a given technology approach.