Thursday, May 31, 2007
Anyway, after a quick bit of hacking around, and thanks to having the backend reasonably well done, up popped the mapplet.
All this takes is a simple Carbon Counter Mapplet XML file. Feel free to give it a go.
Monday, May 28, 2007
Why is this relevant for SOA? Well, here is an example of an incredibly efficient batch system which is based around time synchronisation and which is therefore able to move large numbers of people successfully around. A "real time", process-based solution would be the car: start at A, do lots of steps, arrive at B. The railway system, which is much more event based, is able to achieve the same end game (indeed there are parts of Switzerland where cars cannot reach) but does so in a way that reduces stress and increases efficiency.
So when you are looking at replacing those old batch processes which look so inefficient, ask yourself whether they are batch processes because that was all that could be done, or whether you have a Swiss Railway implementation that you are about to replace with a butt ugly SUV.
Tuesday, May 22, 2007
Which raises an interesting question about using Atom and RSS, and indeed HTTP, because while you can indeed set a cache directive (or no-cache directive) on the document, there is nothing to stop the other end making an arbitrary decision. Now this certainly makes Google quicker to respond to subsequent requests, but it doesn't make the information more reliable. Clearly the folks at Google have made a smart trade-off between performance and information latency. But by making that trade-off they've made my service less responsive.
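For what it's worth, the publisher's side of that trade-off is just a handful of response headers. A minimal sketch (the class and method names are mine, not from any particular feed library), showing the headers a feed publisher can send to ask caches not to reuse the response; as noted above, nothing obliges an aggregator to honour them:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FeedCacheHeaders {
    // Headers that ask downstream caches not to reuse this feed response.
    // An aggregator is still free to ignore them and serve its own cached
    // copy, which is exactly the latency problem described above.
    public static Map<String, String> noCacheHeaders() {
        Map<String, String> headers = new LinkedHashMap<>();
        headers.put("Cache-Control", "no-cache, no-store, must-revalidate");
        headers.put("Pragma", "no-cache"); // for HTTP/1.0 intermediaries
        headers.put("Expires", "0");
        return headers;
    }
}
```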
What a cracking example of the problems of the networked world... easier my arse.
Technorati Tags: Google
Scenario: I have an XML doc (that I have converted to a String) which represents the KML file, and I want to return a zipped version of that file to save on bandwidth and the like. First of all you need somewhere to get the KML file from (in this case counter.getKML()).
ZipOutputStream zipOutputStream = new ZipOutputStream(response.getOutputStream());
zipOutputStream.putNextEntry(new ZipEntry("geoblog.kml"));
zipOutputStream.write(counter.getKML().getBytes("UTF-8"));
zipOutputStream.close();
And that is it. Mind bogglingly simple.
So the next challenge was obvious: take the RSS feed and put it up on a Google Map. Hey, it's a mashup, and we all know that they MUST use Google Maps :) For this next challenge I decided to abandon Yahoo Pipes for a while (mainly because it absolutely NAILS the server if you are looping through a big blog feed) and do it in the old "traditional" way of processing data in the single lump it comes in (the RSS feed).
My task was as follows:
- Put the Geo tagged feed onto a map with the Google Maps API
- Generate a KML file for Google Earth
- Use Ajax in someway
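Generating the KML itself is little more than string assembly. A minimal sketch (the Point class is a stand-in for whatever the feed parsing produces, not code from the actual mashup):

```java
import java.util.List;

public class KmlBuilder {

    // Stand-in for whatever a parsed geo-tagged feed entry gives you.
    public static class Point {
        final String name;
        final double lat;
        final double lon;
        public Point(String name, double lat, double lon) {
            this.name = name;
            this.lat = lat;
            this.lon = lon;
        }
    }

    // Build a minimal KML document, one Placemark per point. Note that KML
    // wants coordinates as lon,lat - the reverse of the usual lat/long order.
    public static String toKml(List<Point> points) {
        StringBuilder kml = new StringBuilder();
        kml.append("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n");
        kml.append("<kml xmlns=\"http://earth.google.com/kml/2.1\"><Document>\n");
        for (Point p : points) {
            kml.append("<Placemark><name>").append(p.name).append("</name>")
               .append("<Point><coordinates>").append(p.lon).append(",").append(p.lat)
               .append("</coordinates></Point></Placemark>\n");
        }
        kml.append("</Document></kml>\n");
        return kml.toString();
    }
}
```

Feed the result through the zip snippet above and you have the KMZ that Google Earth opens directly.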
To get you going, try this simple UK to Wengen trip, or there is a much messier one that goes all over the globe; that latter one takes quite a while for Google to render. If you don't see it after 30 seconds just hit the button; I haven't done a "waiting for Google" bit yet, so if you get bored just hit the button again. Over on the right are a bunch of links, including one to the KMZ file, which should pop up directly in Google Earth (if you haven't got it, download it).
Now I feel I can safely attack PowerPoint for another couple of weeks.
Sunday, May 20, 2007
Then there was the code from the muppets, the people who cut and paste code from one place to another, and you got slammed because now you were calling THEIR function instead of your own when it got deployed into system test.
Then we get the slight syntax error breaking everything problem... nice, reminds me of the old "make" scripts.
Friday, May 18, 2007
Basically I have my big monitor on one computer which does the searches and my Gmail etc and I have the dev station where I'm doing some coding to stop my brain going to mush while I am a bit ill (Shingles). The scenario is simple. I search on one machine, I find a good link and a good page and I want to cut and paste the link from one computer to another.
Yes, I know that the mouse is a "physical" device, but my mind thinks of it as the thing that controls the pointer on the screen I am currently looking at, so why should it matter for cut and paste? Now back in the old X Windows days I would often have 2 or more machines on the go, but I would have Emacs running on both boxes at the same time so there was never a problem.
I want a magic mouse that can link to two computers, respond based on where I am looking and just work like I want.
Technorati Tags: Mice
Tuesday, May 15, 2007
The single most depressing thing for me in IT is how many applications are really just Mainframe data processing solutions with better screens. Applications for which the word "intelligence" is limited to data validation and there is no actual algorithmic or interactional element to the system beyond just data being lobbed into storage.
Looking at the latest raft of CRUD "tools" for .NET, Ruby, Java and the like really is pretty depressing, not so much because they are bad (they aren't) but because people seem to still be insisting on coding this dull and uninteresting crap and looking for yet more ways to "optimise" their code for a task that should be tooled.
Sure there are occasions where you can't tool the CRUD bit because it won't fit into the rest of the application, but are you really sure? Or is it just that it would look a bit "ugly"? Worst of all is the fact that many tools still can't handle "complex" elements like foreign keys and want to do single tables.
CRUD is dull, boring and uninteresting. Can we please just get this stuff tooled and move on to the interesting stuff?
So are formal ontologies the way to go, or smart contextual searches? Me I'm backing the searches.
Monday, May 14, 2007
Now I've blogged before about the "semantic" web not really being semantic for web services but I've been thinking even more about the problem that the semantic web with its descriptions of information tries to solve and I'm really not convinced that from a business scenario this is something for anyone to be worrying about today. Sure the concept of "automatic" consumption and transformation sounds beguiling at first, but isn't this pretty much the same vision that was promoted around UDDI at the start for Web Services?
What I mean here is that the thing this tries to solve is people's understanding of information, and to automate that process. From some reviews I've done recently, and a conference or two I've attended, the accuracy of these transformations is still pretty ropey, and they are more about helping people at design time than being something you would rely on in a production runtime.
So really here we have a way of adding "hints" for people about what a given field means, so it can help them understand what it maps to and maybe make a suggestion that might, or might not, be accepted. But is RDF/OWL and the like really the way to go about this? Or should we think more in terms of the sort of free-form association that Google gives us? What I mean here is: think about the way Google Maps works with "Hotels near London", where it looks for the term "hotel" and a geo location that is around London (another inference); in effect they create a semantic tree for those terms based on the probability that this is what you meant. Now I've never used an RDF file to help me describe "Hotel" or "London" to Google, I'm just relying on it having built up a contextual reference that means it takes a good guess at the answer.
So are RDF and OWL really required? Or is the solution to have a Google contextual search?
Thursday, May 10, 2007
Where I do disagree though is whether having these camps is a good or a bad thing. Now I'm clearly biased as I'm on the contract side, but I thought I'd put the case as to why contracts and enforcement, and static languages, represent the engineering approach to IT delivery, while dynamic languages and late validation is the approach taken by those who consider IT to be an art. This doesn't make the latter inherently wrong, but it does mean that it is hugely predicated on the talent of the person doing the job.
This for me is the problem. When I used to do a lot of "proper" user interface design there was often talk about the challenges of WILI v KISS. Everyone knows KISS, everyone recites it as an empty mantra, but most people end up doing WILI. WILI is "Well I Like It" and is used as the rationalisation of lots of bad decisions; with UI design this was added to by the "well, Microsoft do that" school of justification (which is fine if you are doing a Word extension, but not if you are doing radar displays).
The same WILI and "appeal to authority" approach is part of the great Technology Delusion that runs straight through IT; part of the problem is simply one of talent. If I could have a team in support of Stefan and Mark Baker I'd have no trouble whatsoever doing REST; if I could have Larry Wall in support I'd be reasonably happy to have Perl code there, and so the list of the truly talented goes on. Give me Dan Creswell in support and that Jini idea looks fantastic; hell, I'd even say go for a massive project using Spring 1.0 if Rod Johnson had to maintain the XML files.
The appeal of dynamic languages to people who genuinely understand how computer systems work is clear, as is the appeal of pushing the bounds of technology and creating new and exciting ways that are just a little bit better than what came before. These are the people to whom IT is an art form, something that requires imagination and craft, and to whom that improvement is important as they are trying to push the boundaries.
I've used dynamic languages, I mess around with them on my personal projects, but I never tend to recommend them on projects and indeed actively ban them when I am the architect.
So why do I choose to have strict contracts, static languages, early validation of everything and extremely rigorous standards applied to code, build and test? The answer was nicely highlighted by Bill Roth in his post around JavaOne Day 1: there are SIX MILLION Java developers out there. This is a clearly staggering number and means a few things.
- There are lots of jobs out there in Java
- Lots of those people are going to be rubbish or at least below average
The quality of dynamic language people out there has a similar profile (as a quick trawl of the groups will show), and herein lies the problem with IT as art. If you have Da Vinci, Monet or even someone half-way decent who trained at the Royal College of Art then the art is pretty damned fine. But would you put a paintbrush in the hand of someone you have to tell which way round it goes and still expect the same results?
IT as art works for the talented, which means it works as long as the talented are involved. As soon as the average developer comes in it quickly degrades and turns into a hype cycle with no progress and a huge amount of bugs. The truly talented people are extremely rare in application support; that is where the average people live, along with, if you are lucky, a couple of the "alright" ones.
This is why the engineering and "I don't trust you" approach to IT works well when you consider the full lifecycle of a solution or service. I want enforced contracts because I expect people to do something dumb maybe not on my project but certainly when it goes into support. I want static languages with extra-enforcement because I want to catch the stupidity as quickly as possible.
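That "catch the stupidity as quickly as possible" point can be sketched as eager validation at the service boundary; the class and method names here are illustrative, not from any real system:

```java
public class AccountService {
    // An enforced contract: reject a bad call at the boundary, immediately,
    // rather than letting the bad data surface months later in support.
    public static long deposit(long balancePence, long amountPence) {
        if (amountPence <= 0) {
            throw new IllegalArgumentException(
                "deposit must be positive, got " + amountPence);
        }
        return balancePence + amountPence;
    }
}
```

A compiler plus this kind of fail-fast check means the mistake shows up at the point it is made, not three layers away in someone else's stack trace.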
But most of all I restrict choices because the number of people talented enough to make them is vanishingly small. I count myself lucky in my career because I've worked with some pretty spectacular people, but even in the best of cases it was around 10% of the project, and in none of the cases did those people move into support. People will no doubt bleat that dynamic interfaces give some sort of increased flexibility; my experience however is that it just leads to a right pain in the arse which is a bitch to debug.
Unfortunately in IT the level of self-perception is not strong, so far too many people think they are at the top table when in reality they should be left in the sandpit. This leads to people taking on dynamic languages, late validation, multi-threading, async and the like with a single-minded belief that either
- They will do this because Expert X said it was the right thing to do and they worship at the altar of Expert X, and even when it sucks they will say "but X says it's right so you are wrong"
- They will do something because it looks easy and not understand the consequences
IT as art can be beautiful, but IT as engineering will last.
Friday, May 04, 2007
I've said it before and I'll say it again... business process isn't everything. Right now this focus on BPM is driven by one thing and one thing alone, the fact that every single vendor's stack tops out at business process. Now most of these don't even have a decent way of handling services (i.e. one interface = one process = what a load of crap interfaces you have) but that is beside the point. What they are proposing is that the IT/Business model looks like this
It's a simple stack-based view of the world: business at the top, techy IT at the bottom and BPM as the medium for communication "at the business level". People who talk about this view tend to talk "bottom up", with "services exposing legacy and BPM orchestrating services". It's pretty amazing how this view just happens to match the product vendors' stacks; this means that either
- This is the end of IT product development: we have fixed it all, we are done
- It's bollocks
This is where SOA really earns its keep, not as the bit that delivers the solution but as the contextual framework within which that delivery can sit. SOA, and in particular a business service architecture, is all about understanding the various different "blobs" of the enterprise, how and why they interact and then choosing the right delivery approach for that service.
My view of the world has BSA being important, but as a contextual framework. When you get down to implementation you are still going to think about the specifics of the requirements or demands on an area, and this means you will still have to speak to the business. The difference is that the BSA means you are talking within a business context where it has been decided that BPM/Technical SOA/GDA/EDA/People/Flying Monkeys/etc is the best way to solve that problem.
One size doesn't fit all, and BPM is not the culmination of all IT. The challenge in IT and business remains the same, namely getting a contextual framework within which the problem domain can be understood and then choosing the right way to solve that problem. BSA isn't a hammer; it's the plan that helps you decide when you use the hammer and when you use the saw.
BPM as the language of business is, IMO, snakeoil. I've heard many CEOs report on how their business is doing, I've heard sales directors report on sales... and I've never heard any of them step through a process of their business to describe where they are at.
Think first, plan first, then decide the way to go. Starting with BPM is as silly as starting with WSDL.