While flying to Amsterdam (a very nice walking city) on business the other day, I was reading through the British Airways in-flight magazine "High Life". On pages 125 and 126 was a double-page advert from Microsoft entitled
"Everything My Computer Science Teacher Taught Me Was Wrong". It's an article by Andrew Herbert, one of their R&D leads. It's basically trying to say how technology is rapidly changing and what that means, and it makes five predictions for major change in IT, which refer back to the CS teacher being wrong.
The first one is "The Single-Threaded Program", of which he says this is a "20th century idea that has become obsolete". Now I have a bit of an issue here, because I'm sitting in my home office right now and, looking behind me, I can see one of my university text books, "Communicating Sequential Processes", which was published in 1985. I don't know what they were teaching at Cambridge in the 20th century, but certainly up at York it was assumed that multi-threaded was the way to go. So this is certainly an old prediction, as it's predicting that the past will happen. It's a good point to stress, however, that single-threaded applications are very limiting, but I hope that Mr Herbert isn't right when he says "it's going to change the way we teach programming", as I'd hope that all good universities have been teaching multi-threading for decades.
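The CSP model that textbook describes — independent sequential processes communicating over channels rather than sharing state — maps straight onto any modern threading library. A minimal sketch in Python (the names are mine, purely illustrative):

```python
import threading
from queue import Queue

def producer(out: Queue) -> None:
    # Send a few messages down the channel, then a sentinel to say "done".
    for i in range(5):
        out.put(i)
    out.put(None)

def consumer(inp: Queue, results: list) -> None:
    # Receive and process messages until the sentinel arrives.
    while True:
        msg = inp.get()
        if msg is None:
            break
        results.append(msg * 2)

channel: Queue = Queue()   # the CSP "channel" between the two processes
results: list = []
t1 = threading.Thread(target=producer, args=(channel,))
t2 = threading.Thread(target=consumer, args=(channel, results))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # [0, 2, 4, 6, 8]
```

No locks, no shared mutable state beyond the channel itself — which is precisely the point the 1985 book was making.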
Next up is an even less bold statement when Andrew predicts the end of "Low-level programming languages", which contains the cracking line (this is 2007, remember): "Once considered an extravagant use of memory, compilers are now essential tools". This is even less of a prediction than the multi-threading one. The debate on assembler v compiler was pretty much settled in the 1970s, and by the 1980s it was a question of how high-level the programming language was, rather than assembler v C/Ada/Smalltalk/LISP etc. This area finishes up with an amazingly bold prediction, however: "We are moving towards designing yet higher level languages with greater levels of automation and self-checking that eliminate programming mistakes"... hang on, did I get that right?
"moving towards [...] higher level languages [...] that eliminate programming mistakes". Yes, I did read it right... Oh boy, moving from the old to the practically and mathematically impossible. Apart from issues like the Halting Problem and the Busy Beaver, there is the basic challenge that nothing can eliminate wilful stupidity. Reduce, yes... eliminate, no.
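For anyone who hasn't seen it, the classic diagonal argument behind the Halting Problem fits in a few lines, and it's the standard reason "eliminate" is too strong a word. This is just the textbook sketch, not anything from Herbert's article: given any claimed halting oracle, you can build a program it must misjudge.

```python
def make_paradox(halts):
    """Given a claimed oracle where halts(f) returns True iff f() terminates,
    construct a program that the oracle necessarily gets wrong."""
    def paradox():
        if halts(paradox):
            while True:   # oracle said "halts", so loop forever
                pass
        # oracle said "loops forever", so halt immediately
        return None
    return paradox

# Whatever concrete answer an oracle gives for `paradox`, it is wrong.
# E.g. an oracle that answers "loops forever" for everything:
p = make_paradox(lambda f: False)
p()  # ...returns immediately, contradicting the oracle
```

The same construction works against any would-be "self-checking language that eliminates mistakes": feed the checker to itself and it has to be wrong somewhere.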
The next one is the first real prediction, as Andrew predicts the end of "Screens on Desks", basically saying that future displays will just be "there" as part of the wall or surface, by either projection or direct integration. Now here I'm with him. The ability to ditch this monitor and use all of the wall space in front of me would be cracking when I'm working. It's not overly bold, as some of this technology exists today, but it's a pretty good prediction that can be measured in a reasonable (10 years or less) time frame, and you can see the business case for it.
"Virtual Memory", or the swapping of memory out to disk, is the next thing that is going to die, as memory is ceasing to be an issue. Fair enough really, as again it's a solid prediction that can be measured, for instance by the next version of Windows after Vista not having Virtual Memory support. What are the odds on that, though?
In the last paragraph for Virtual Memory is another sub-prediction (like the programming languages one) that boldly claims two things: "Within a few years we will probably be able to carry a terabyte of personal storage, enough to hold all the audio and video you'd want to use in a lifetime". So it's a "few years" (let's say five) to get 1TB. This seems fair enough, as you can already get 1TB hard drives, and the iPod is already at 80GB, with other players significantly higher. So 1TB in five years seems a rock solid prediction that can be measured by having 1TB personal players available in that time. The second part, though, is whether 1TB really is a lifetime's worth, and here the numbers don't stack up. Assume that in future everyone records and watches in HD (or better), which means a post-compression bit rate of around 750KB/s. At that rate 1TB holds only around 370 hours of video; the 30+ hours I've shot myself already come to around 80GB. So the question is whether I'll watch more than 370-odd hours of video/film/TV in my lifetime... and the answer clearly has to be yes. So it's a prediction, but it's definitely not a valid one. 1TB is a lot, but it's not all you will ever need.
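Running those numbers quickly (assuming decimal units, as the drive makers do, and the 750KB/s rate above):

```python
KB = 1000                       # decimal kilobyte, as drive vendors count
SECONDS_PER_HOUR = 3600
bitrate = 750 * KB              # assumed post-compression HD rate, bytes/sec

# 30 hours of footage at that rate:
hours_recorded = 30
recorded_bytes = bitrate * SECONDS_PER_HOUR * hours_recorded
print(recorded_bytes / 10**9)   # 81.0 -> about 80 GB

# How many hours of HD video does 1 TB actually hold?
one_tb = 10**12
hours_in_a_tb = one_tb / (bitrate * SECONDS_PER_HOUR)
print(round(hours_in_a_tb))     # 370 -> roughly 370 hours
```

370 hours is a couple of months of normal TV viewing, never mind a lifetime.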
The final prediction is the death of "Hierarchical File Systems", by which he means the OS storing stuff in that sort of format and users accessing it like that. His prediction, again measurable by the next version of Windows (and Linux), is that this will be replaced by "Modern Database technology", which puts his predictions at odds with Google, who seem fine with just leaving stuff lying around and letting search find it. And isn't this what BeOS sort of did?
It's always brave when people predict the future, so good on Andrew Herbert for doing that. But to describe these things as the "top five obsolete software ideas" says much more about the mindset of an organisation that thought they were valid approaches in the late 20th century than it does about a radical shift happening today. Out of the five predictions, two are already mainstream and have been for over 20 years; two are predictions for the future, but about hardware (screens and Virtual Memory); and the other one is about operating system specifics that had already been released in a product in the 20th century, and which was originally meant to be in Windows Vista.
But remember, folks: "1TB of storage is all you will ever need" can now be added to the list of foolish predictions.
Technorati Tags: Microsoft