
Monday, March 03, 2014

The next big wave of IT is Software Development

I can smell a change coming. The last few years have seen cloud and SaaS on the rise, a fragmentation in application development (thanks in large part to the appalling stewardship of Java) and a real focus of budgets around BI and 'vanilla' package approaches.  Now this is a good thing, partly because I jumped out of the Java boat onto the BI boat a few years ago, but also because it's really helped shift the investment focus away from 'Big iT' towards 'Big It' - by which I mean the focus has shifted very firmly towards Information over technology.

Now, however, companies are waking up to the fact that however good Salesforce.com or SAP might be, it really doesn't matter if everyone in your industry is using them: these are not your differentiators.  Your differentiators are what matter in terms of accelerating growth and outperforming your competitors.

This is where software development comes back, and I predict the following waves:
Big Data will drive the new Software Development explosion
Big Data is the hype today and it will be the foundation of this new era, as information is the key. Fast data, however, will rapidly become just as, if not more, important than 'Big', which means that the ability to integrate into transactions and deliver differentiation will be critical.  This is why we'll see a resurgence in software development based on new approaches and new platforms, but we'll see the same sort of consolidation phases that we saw around Java.  Maybe this wave will last 10 years, maybe 5, maybe 20, but what appears certain is that the wave is coming.
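To make 'fast data' concrete, here's a minimal sketch of the kind of in-transaction check I mean: a sliding one-minute aggregate maintained as events arrive, rather than computed in an overnight batch. This is plain Java rather than any particular streaming product, and all names are invented:

```java
import java.time.Instant;
import java.util.ArrayDeque;
import java.util.Deque;

// A sliding one-minute revenue window over a transaction stream: the
// sort of 'fast data' question that has to be answered inside the
// transaction path, not in tomorrow's batch run.
public class FastDataWindow {
    private static final long WINDOW_MS = 60_000;
    private final Deque<long[]> events = new ArrayDeque<>(); // {timestampMs, amountCents}
    private long totalCents = 0;

    public synchronized long record(long timestampMs, long amountCents) {
        events.addLast(new long[]{timestampMs, amountCents});
        totalCents += amountCents;
        // Evict anything that has fallen out of the window before answering.
        while (!events.isEmpty() && events.peekFirst()[0] < timestampMs - WINDOW_MS) {
            totalCents -= events.removeFirst()[1];
        }
        return totalCents; // revenue seen in the last minute
    }

    public static void main(String[] args) {
        FastDataWindow w = new FastDataWindow();
        long now = Instant.now().toEpochMilli();
        w.record(now, 1999);
        System.out.println(w.record(now + 1000, 500)); // prints 2499
    }
}
```

The point isn't the dozen lines of logic; it's that this sits in the request path, which is exactly where the packaged, batch-oriented world struggles.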

This isn't the old wave though, it never is in IT; it's the same sort of thing but now applied to the next generation of problems.  This is about information, collaboration and the digitisation of organisations; it's about taking all these vast amounts of internal and external information and driving better outcomes, and crucially about collaborating between organisations.

Let's be clear: this is going to be hard.

Back with Java we had a Database, we had SQL, we had (eventually) an App Server... that, my friends, was a walk in the park compared with what is coming.  I'll write a new post shortly on what the new world could look like, but suffice it to say you need the following:

1) An inherent understanding of what makes information work - Master Data, Big Data, Analytics
2) An understanding of how to drive impact - how to engage users
3) An understanding of how to work with people outside your organisation

You thought doing a Java project with offshore delivery and a bunch of legacy integration points was hard?  Hang on to your hats... 

Thursday, February 16, 2012

How Apple will beat Facebook

Looking at the extension of iMessage to the desktop made me think about how Apple could take on Facebook and win.  Let's see what Facebook have over Apple technically...

  1. Multiple people in one conversation
  2. Broadcast conversations with 'followers'
Now Apple have already integrated Twitter into the iPhone, but let's assume that long term the folks from Cupertino want total control.  What do they need to do? (A quick data-model sketch follows the list below.)
  1. Add a 'message thread' function to iMessage so it's not just a stream between two people
  2. Add the ability to talk in groups
  3. Add the ability to broadcast status updates
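As a sketch of how small the data-model jump actually is (invented names, nothing to do with Apple's real APIs), the move from two-party iMessage to threads, groups and broadcasts is mostly a change of addressing, from (sender, recipient) to (sender, conversation):

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Illustrative conversation model: DIRECT is today's iMessage,
// GROUP adds members, BROADCAST is status updates to followers.
public class Conversation {
    enum Kind { DIRECT, GROUP, BROADCAST }

    final Kind kind;
    final Set<String> members = new LinkedHashSet<>();   // who may post
    final Set<String> followers = new LinkedHashSet<>(); // who only receives
    final List<String> messages = new ArrayList<>();

    Conversation(Kind kind) { this.kind = kind; }

    void post(String sender, String text) {
        if (!members.contains(sender))
            throw new IllegalArgumentException(sender + " cannot post here");
        messages.add(sender + ": " + text);
    }

    public static void main(String[] args) {
        Conversation group = new Conversation(Kind.GROUP);
        group.members.add("alice");
        group.members.add("bob");
        group.post("alice", "Lunch?");
        System.out.println(group.messages); // [alice: Lunch?]
    }
}
```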
Applications can compete easily by having some form of multiplayer iCloud sync, or in the same way they already do via 3rd party servers.  What more, however, could Apple do that Facebook can't?
  1. Integrate the status update piece into Contacts so before you call you see the status and can see the recent messages
  2. Integrate the group chat dimension by having 'groups' in Contacts (umm, almost Circles-like)
  3. Provide multi-participant Facetime to switch from iMessage to group comms
The point here is that technically Facebook don't have much that Apple couldn't make standard Mac OS X and, more importantly, iOS functionality.  Indeed much of this would be a welcome, integrated upgrade to those things (rather than a clear market grab like Google+), so people would 'naturally' start using these facilities as they are on their phone/desktop.  This would increase the pull towards Apple products in those communities (much as BlackBerry used to see).

An added advantage of Apple's approach is that it can remove the perception of a 'big central server' and instead create a more federated view of inclusion than Facebook.  This is liable to increase people's engagement levels, and unlike Facebook Apple doesn't need to drive revenue via advertising or selling people's data; it wants more people on its platform, because those people hand over real hard cash.

Facebook's big risk is that its network ceases to be the cool and only place to network and that other social approaches take off.  Apple are ideally placed in the consumer space and have the platform-control mentality to drive this.  iMessage is only the start of the evolution; the question is just how much engagement Apple wants to have.



Monday, February 13, 2012

Why Broadband, Apps and Moore's Law will beat Server-based HTML5

The browser has had its day... Apps are going to win.  Now these Apps could be like the Chrome store pieces, developed in HTML5 but with local storage and offline access added, but they will fundamentally be local things.  Why?
  1. Moore's Law isn't going away any time soon. 
    1. In a couple of years we will have smartphones with quad or even octa-cores, 8GB of main RAM and 256GB of storage... and you are seriously just using that as a browser?
    2. Your TV will have that as well
    3. Your Fridge will be a bit dumber, say dual core, 8GB storage, 100MB RAM... it's a ruddy Fridge
  2. Connections to the home will be a normal thing
    1. Mobile phone companies will start offering 'VPN to the home' as a standard offering so you can unify your control of internet access
    2. This doesn't require a 'home server' just a VPN link
    3. Your home devices will then be accessible directly or via the cloud
    4. Current 'TV via 3G' offers will be linked back to your home connection
  3. Rich Clients beat thin clients when there is a choice
    1. Look at the App Stores, look at games...
  4. The network is never something to bet on being 'always' there
    1. Do you want a Sat Nav that fails due to network connections?
    2. Do you want a Fridge that turns off because it can't download the right temperature for fish?
  5. The speed of light isn't just a good idea... it's the law.
    1. Ping lag is an issue of immediacy.  Even if processing takes zero time there is still 100ms+ of ping lag to contend with, plus server lag, etc, etc. (A worked example follows below.)
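Rough numbers: light in optical fibre covers about 200,000 km/s (roughly two-thirds of c), so distance alone sets the floor:

```java
// Distance alone sets a floor under ping times, before any routing,
// queueing or server work is added on top.
public class PingFloor {
    static double minRoundTripMs(double distanceKm) {
        final double FIBRE_KM_PER_SEC = 200_000.0; // ~2/3 of c
        return 2 * distanceKm / FIBRE_KM_PER_SEC * 1000.0;
    }

    public static void main(String[] args) {
        // London to New York is ~5,500 km as the photon flies.
        System.out.printf("Minimum round trip: %.0f ms%n", minRoundTripMs(5_500)); // ~55 ms
    }
}
```

No amount of server-side cleverness gets under that number; a local app simply doesn't pay it.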

This isn't a retrograde step, it's actually a great challenge, because what it means is that federation is going to increase.  Social centralisers like Twitter and Facebook are liable to start facing competition from social aggregators which work on federated information served from your devices via your home network.  Cloud providers will remain focused on functionality and capacity, and the blurring of the cloud between the physical and the virtual will be complete: you won't even know if your TV is running locally or via the cloud... except when it borks... which is why in reality it will run locally.

HTML5 is a great technology, but for it to win it needs everyone to sign up for 'open' on all devices, and that includes TVs, mobiles, tablets and motor cars.  Applications are so much the 'thing' that Google are even promoting applications that can be downloaded and run from Chrome, meaning that Chrome isn't really a browser anymore but instead a hosting platform for applications.

Server-side HTML has had its day; the only question now is whether the industry will unite behind a single 'open' client-side approach for applications or whether every platform will have its own approach.  Apple's current success and the Android marketplace seem to indicate the latter.

Server-side HTML - 1991 to 2015.

Tuesday, January 25, 2011

Cloud providers and software vendors aren't a great long term bet

I'm noticing a bunch of cloud providers attracting massive funding rounds, and people are talking about mega-billion industries and everyone getting hugely rich.

I'd like to sound a note of caution, not on the idea that cloud is important or not going to happen, but on the idea that there are loads of companies that are going to make loads of money from it. Let me tell you a quick story about a company that believed in Telecoms in the late 20th century. The company was called GEC and was one of the giants of UK industry, a GE of the UK with a very strong defence arm. The company had billions in the bank and was one of the most solid stocks in the FTSE 100. Now this company had some new leaders who loved the idea of Telecoms and its "better multiples" and wanted to get out of that boring, profitable defence industry and go heavy into Telecoms. In the five years from 1997 to 2001 these new leaders invested all of the cash pile, sold off the defence arm and turned a once-towering industrial into a bankrupt shell.

How about another? Let's take Vodafone and their share price chart across the Telecom bubble.

[Vodafone share price chart]

Want another? Alcatel-Lucent. Note that here I'm talking about two companies who survived the bubble, as well as one huge company that bit the dust as a result of it. One that never recovered would be Nortel, a company that during the bubble was at one stage worth a third of the total value of Canadian listed companies! Startups like Winstar were allegedly worth over $4bn but went pop within a year. Throw in AOL's merger with Time Warner and the picture is pretty complete: massive over-investment in infrastructure providers and technologies with a view that the market was basically infinite.

This isn't the first time that an infrastructure play has fundamentally failed to make long-term money. Roads, rail and even canals had their own booms and busts as it became clear that building all that infrastructure was too expensive for demand that fundamentally wasn't there. This is especially true in something like Telco, and the cloud, where the cost of provision is being driven relentlessly downwards. Investing $10bn in IT infrastructure today buys you what $2.5bn will buy in 4 years' time; in other words your investment is worth a quarter of its retail value within 4 years. Even today, with the boom in Mobile Internet, you could argue that the large providers aren't massive growth stocks but are instead acting as traditional infrastructure providers, and many aren't back to their peak of ten years ago.
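A quick back-of-the-envelope check on that quarter-in-four-years claim, assuming (as I am, via Moore's Law) that hardware price/performance halves roughly every two years:

```java
// Value of $10bn of kit, measured as what it would cost to buy the
// equivalent capacity new, if price/performance halves every 2 years.
public class KitDepreciation {
    public static void main(String[] args) {
        double investmentBn = 10.0;
        for (int years = 0; years <= 8; years += 2) {
            double valueBn = investmentBn * Math.pow(0.5, years / 2.0);
            System.out.printf("Year %d: $%.2fbn%n", years, valueBn);
        }
        // Year 4 prints $2.50bn, i.e. a quarter of the original outlay.
    }
}
```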

So what does this mean for cloud? Well, this is another infrastructure play. SaaS and end-user propositions like Facebook are different types of companies, but cloud companies are fundamentally about infrastructure, so there are a few things to note:

1) It's probably too late to get in at the ground floor with startups, although a few will see spectacular growth and then pop
2) It's still worth getting into a cloud startup
3) Start looking for the exits when you compare your company with a "dull" company and think "hell, we could be worth as much as Walmart soon"... that is the time to jump

Stock and investment wise, it's fine to ride the wave that these companies represent, as we should never avoid making money from the up-curve of a bubble.

In the long term it's a Telco model à la Vodafone or AT&T, so expect the big investments from Microsoft, IBM and Amazon to yield minor returns initially and then a steady long-term income, but at the sort of levels that would make people just hold onto the cash if they sat back and thought about it.


Tuesday, January 18, 2011

Public Cloud is temporary, virtual cloud will move compute to the information

This is another of my "prior art" patent ideas; it's something I've talked about before, but reading pieces about increasing data volumes has made me think about it more and more.

The big problem with public cloud is that the amount of data that needs to move around is growing exponentially. This doesn't mean that public cloud is wrong, it just means that we will need to look more and more at what needs to be moved. At the moment a public cloud solution consists of storage + processing, and it's the storage that we move around; by that I mean we ship data to the cloud and back down again. Amazon have recognised the challenge, so you can actually physically ship storage to them for large volumes. With the continuing rise of Moore's Law and virtualisation, however, there is another option.

Your organisation has lots of desktops, servers, mobiles and other pieces. The information is created and stored fairly close to these things. The data centre will also contain lots of unused capacity (it always does), so why don't we view it differently? Rather than shipping storage, why not ship processing? You virtually provision a grid/Hadoop/etc. infrastructure across your desktop/server/mobile estate as close as possible to the bulk data.

This is when it really gets cloudy, as you now move compute to where it can most efficiently process information (Jini folks can now say "told you so") rather than shifting storage to the cloud.

The principle here is that the amount of spare capacity in a corporate desktop estate will outstrip that in a public cloud (on a cost/power ratio) and, thanks to its faster network connections to the raw data, it will be able to process the information more efficiently.

So I predict that in future people will develop technologies that deploy VMs and specific process pieces (I've talked about this with BPEL for years) to the point where they can most efficiently process the information.

Public clouds are just new data centre solutions; they don't solve the data movement problem. A truly cloud-based processing solution would shift the lightest thing (the processing units) to the data rather than moving the data to the processing units. The spare capacity in desktop and mobile estates could well be the target environment for these virtual clouds.
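A sketch of what such a scheduler might do, invented purely to illustrate the idea (systems like Hadoop apply the same data-locality principle inside a cluster): given where a job's input bytes already live, run the compute on the estate that holds the most of them.

```java
import java.util.Map;

// Place the job where the data is, so the network carries a VM image
// (megabytes) instead of the raw data (terabytes).
public class LocalityScheduler {
    record Node(String name, int spareCores) {}

    static Node placeJob(Map<Node, Long> bytesHeldByNode) {
        return bytesHeldByNode.entrySet().stream()
                .filter(e -> e.getKey().spareCores() > 0)  // needs spare capacity
                .max(Map.Entry.comparingByValue())         // most local data wins
                .map(Map.Entry::getKey)
                .orElseThrow(() -> new IllegalStateException("no capacity anywhere"));
    }

    public static void main(String[] args) {
        Node desktops = new Node("desktop-estate", 400);
        Node dataCentre = new Node("dc-spare", 64);
        Node publicCloud = new Node("public-cloud", 10_000);
        // 2TB of input sits on the desktops, 200GB in the DC, none in the cloud.
        Node chosen = placeJob(Map.of(desktops, 2_000_000_000_000L,
                                      dataCentre, 200_000_000_000L,
                                      publicCloud, 0L));
        System.out.println("Run the compute on: " + chosen.name()); // desktop-estate
    }
}
```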


Monday, December 20, 2010

When clouds really will be cloudy

People are talking about clouds and SaaS as the future, and I really believe that they are; in fact I'd say they are the present reality for leading companies. However, one of the questions is always "where does this go?" Now there is one world view that says "everything on the cloud and delivered via HTML5". This is an interesting view, but it misses a couple of key questions:
  1. When does Moore's Law go away?
  2. When is it really a cloud?
The first point is that I'm sitting here with an iPad, iPhone, MacBook Pro and AppleTV (I am a fanboi) with miles more processing at my disposal than the commercial systems and websites I put live late in the last century. Clouds talk about dynamic deployment and portability... but normally within a specific data centre environment. When we think about services being consumed and co-ordinated, and assume that this is being done over the internet, two questions arise:
  1. What decides where a service is deployed?
  2. Why can't it be deployed to my phone?
What is the point of these questions? Well, my son and I can play Need for Speed: Undercover with one of us "hosting" the game on the iPhone or iPad. This is therefore an example of a piece of Software being delivered "as a Service" from one mobile device to another. Sure it's a specific use case, but it's a very real one to scale up.

Why wouldn't the "Rich" interface still be deployed to the device but now as a client service? Why wouldn't the information cache and some clever software that proactively populates the cache be deployed to the local device?

Now folks like RightScale already do deployment and management across multiple cloud platforms, so why wouldn't this be extended to ever more powerful mobile devices, laptops and other devices? Why couldn't my operating system be deployed as part of the cloud rather than being just a consumer of it, with elements such as latency determining the most effective deployment for each service in a network? Think about all those Apple iPhone apps running in the background on millions of devices... who needs more capacity than that, and what latency problems remain when the app is actually spread across a few devices in the local area? (A sketch of that placement decision follows below.)
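To make it concrete, here's a hypothetical placement rule, nothing more than a sketch with invented names: among every device that could host a service, pick the one with spare capacity and the lowest measured latency to the consumer.

```java
import java.util.List;

// Latency-driven service placement across phones, laptops and clouds.
public class LatencyPlacement {
    record Host(String name, double pingMs, boolean hasSpareCapacity) {}

    static Host choose(List<Host> candidates) {
        return candidates.stream()
                .filter(Host::hasSpareCapacity)
                .min((a, b) -> Double.compare(a.pingMs(), b.pingMs()))
                .orElseThrow(() -> new IllegalStateException("nowhere to deploy"));
    }

    public static void main(String[] args) {
        Host phone = new Host("iPhone next door", 4, true);
        Host laptop = new Host("living-room laptop", 2, false); // busy
        Host cloud = new Host("public cloud", 90, true);
        System.out.println(choose(List.of(phone, laptop, cloud)).name()); // iPhone next door
    }
}
```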

Now there are challenges to this, but there are also big advantages: your data centres are cheap because you don't need them anymore, you just deploy to your clients' devices.

This clearly isn't a solution for 2011, but it is something I firmly believe will happen, and it's driven by the power of devices. Sure HTML5 is cool, sure Amazon AWS is neat and sure SaaS is wonderful... but the day that clouds really become cloudy is when no-one can point at the great big data centre that it ultimately all connects to.



Thursday, October 01, 2009

Why do games need an operating system?

Using my iPhone and looking at a PS3 console in Selfridges made me think about what the future could hold for games. While there are companies looking at doing server-side games with a VDI solution to the end device, I just don't think that matches up against Moore's Law. But equally the model of downloading and installing loads of games onto a single device, each with multiple different anti-piracy tools, doesn't seem to make sense either.

Given that "bare-metal" concepts have been around for a while now, BEA had their bare-metal version of the application server, wouldn't it make sense for games to be a full single virtual image? So you don't boot the OS then start the game, you just boot the image which just contains the game and what ever minimal drivers it needs.

Now some people will point out "what about the drivers?" and there is a slight point there. But would it really be so hard to define an image where you select your graphics card et al and it constructs a fully-optimised image just for you? Upgrade your kit and then you just upgrade the image.

Now others will bleat "but how will I pirate it if it's a custom environment?" and to be honest I don't care; that's your problem.

What are the advantages? Well, from a piracy perspective it clearly reduces the potential if the game is running on a custom micro-kernel and is tied to a specific licence created when you create the image. From a performance perspective it can be tuned to the nines, as there is nothing else running. From an end-user perspective it means using your PC like a console, or indeed your console like a console, and selecting the game to play at boot-up.

Once you start thinking like this it then, of course, becomes a question of what other applications don't really require an OS, with isolation provided via VMs. Windows 7 is doing this with its XP Mode, which really opens up the question of when you don't need Windows for certain applications.

Virtualisation is a server technology that is going to have a massive impact on the desktop.

Monday, May 25, 2009

Will Apple dominate the cloud?

At dinner the other night with a friend, we were talking about what we wanted the iPhone to do. We agreed that a new camera would be good, but neither of us thought that a forward-facing camera had any point, as we don't know anyone who has made more than two video calls despite having phones that could do it.

So what did we think would be really good for the 4th generation iPhone? In particular, what could Apple do to reinforce their strengths over the telcos and use it to leverage people onto the Mac? The answer, we decided, lies in how Apple look to exploit iTunes, MobileMe and the store. Increasing storage is of course something that will happen, but really, whether it's 32GB, 64GB or even 128GB, you still won't have enough storage for every video that you might want to see, and it's video that really takes up the space. Podcasts come in a close second with their huge files, and the challenge of course is that you can only sync when connected to your base machine. The addition of buying over both WiFi and now 3G potentially heralds a new direction, however, and one in which Apple can use the cloud in a positive way, one that sets up an additional contractual revenue stream for Apple outside of the telcos' control.

The answer is MobileMe. Currently it is (IMO) a pretty basic service with minimal value; sure, you can do contacts and the like, but people who use an iPhone at work already get that sort of service via Exchange.

The other piece is Apple's other "cloud" solution... iTunes. Millions of connections, billions of downloads; it's a cloud of music, video and application content, like a pre-populated S3. In addition Apple know what you have bought from them (they don't currently recognise what you haven't bought), but the point here is to increase the Apple lock-in.

So how could they combine a cloud-like solution such as MobileMe with the iPhone/iPod Touch and iTunes? Well, how about a very simple concept... infinite storage. Your iPhone no longer just has 32GB; it has a series of rules that say what you want locally, e.g. the last 5 unlistened-to podcasts, the 5 oldest unwatched episodes, etc. Now these rules get applied on your machine and you fill up the 32GB iPhone. You then go out and start watching videos and listening to podcasts.

MobileMe then kicks in and starts, in the background (it's Apple so they can do this; you can't, they can), downloading the latest versions and locally deleting the episodes you've just listened to. Now sure it takes a while, but you've got 5 episodes to get through, and in the next 5 hours it should be able to get it all down.

The point here is that MobileMe doesn't need huge amounts of storage. If it's an iTunes-purchased element (e.g. an "unlistened to but purchased" list) then it just pulls from there; if it's a podcast then Apple can get it in a fraction of a second and just stream it up to you and cache it in your MobileMe account. When you get back to your base station the same sort of sync occurs, and not only is all your music et al backed up automatically to the cloud, it now gives you the feeling of "infinite" storage. Now the easy option for Apple would be to limit it to iTunes purchases, or at least do so for a basic MobileMe account; then for a Premium account you could back up and sync everything that you have, of course through a properly authorised account, which answers the challenges of file sharing.
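Mechanically, 'infinite storage' is just an eviction policy plus a cloud master copy. A minimal sketch, with invented rules and names rather than anything Apple has announced:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Keep what the rules say you want locally; everything else lives only
// in the cloud and is fetched (or streamed) on demand.
public class LocalLibrary {
    record Episode(String title, long sizeBytes, boolean played, long publishedAt) {}

    // Rule: the N newest unplayed episodes that fit in the device budget.
    static List<Episode> selectLocal(List<Episode> all, int keepCount, long budgetBytes) {
        List<Episode> chosen = new ArrayList<>();
        long used = 0;
        List<Episode> unplayedNewestFirst = all.stream()
                .filter(e -> !e.played())
                .sorted(Comparator.comparingLong(Episode::publishedAt).reversed())
                .toList();
        for (Episode e : unplayedNewestFirst) {
            if (chosen.size() == keepCount || used + e.sizeBytes() > budgetBytes) break;
            chosen.add(e);
            used += e.sizeBytes();
        }
        return chosen;
    }

    public static void main(String[] args) {
        List<Episode> eps = List.of(new Episode("Ep 10", 900_000_000, false, 10),
                                    new Episode("Ep 9", 900_000_000, true, 9),
                                    new Episode("Ep 8", 900_000_000, false, 8));
        // Keeps Ep 10 and Ep 8; Ep 9 has been played and gets evicted.
        System.out.println(selectLocal(eps, 5, 2_000_000_000L));
    }
}
```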

The UI side should be pretty simple for Apple as well; the Time Machine interface of multiple "levels" sounds like an ideal piece to replicate for a cloud browser. So you look at an album where you sync a track, then "go back" to the full album, then "go back" to all the albums you own by that artist. Throw in Genius and Apple can start flogging you new tracks by indicating other levels that are available via the iTunes store.

What else could you do once you hit the cloud? Well, things like Animoto show what the cloud can do from a compute perspective, so why not have that synced directly from your phone via MobileMe? View it, then post directly to YouTube without requiring any additional local storage. You can decide to save it locally if you want, but you still have the video in your overall cloud view of content.

So after that, what do you need? Well, HD video is liable to become more normal on compact cameras, and where cameras go the phones follow, so why not HD on an iPhone next year? Again this hits the storage side, but again a cloud solution would mean that you could quickly offload it to the cloud. If you want to do some local edits (iMovie for iPhone?) then you get a transcoded short clip back; this lets you edit away on the device in a basic way. Upload to the cloud "iMovie" and a full HD rendered video is then available for your viewing or sharing.

Now of course what you need here is iMovie, and how do you get iMovie? Oh yes, you have to buy a Mac. So you buy the iMovie4iPhone application (à la Keynote) and the iMovie4MobileMe subscription to give you a nice joined-up experience. If you don't want the rendering pieces then it's all included, but you have to sync from your Mac to create and view the final HD video.

So suddenly there are a few pieces on your iPhone that just seem to demand that your next computer is a Mac. Not because the iPhone just works, but because it's really nice the way the two worlds synchronise via the cloud. Hell, throw in Time Machine via the cloud as a weekly offsite to go with your local daily, and suddenly Apple are selling you a media cloud. Use Apple TV in your lounge to view all the stuff from this cloud and have it "just work". So you create a video on your iPhone, edited from low-res transcoded clips; this is then pumped via your main Mac as an HD version which you can then watch on your Apple TV.

So while people are clamouring for lots more whizbang elements on the device, we thought that this missed the point. The iPhone isn't technically the most advanced device out there; it's just the best device for users. Asking for forward-facing cameras and other technical gizmos that would let the iPhone match other devices on a tickbox misses the reasons why it is the current leader in its space.

The opportunity for Apple here is to use the cloud, and in particular MobileMe, as the central co-ordination point for all your media life. Apple should be able to commercialise that connection ($100 a year for this sort of functionality? I'd go for it) and can also use it to leverage themselves into a stronger position in the home computer market.

Previously people talked about the "central hub" in a home, whether it be a MediaCentre or a PS3; the challenge with these is that they were always very physically centric solutions and they didn't move with you as you left the building. A cloud solution gets rid of that problem and offers a whole heap of other opportunities.

Maybe Apple will go down the cram-in-the-features route, but personally I hope they open their eyes to the world they've already created and look to leverage it in new and interesting ways. You can keep your 10 megapixel camera; I want my media cloud.

Others can build clouds and hope that people will come; Apple can build a cloud that will have them queuing up and demanding to be locked in.



Thursday, June 26, 2008

Temporalisation or why location is important

The folks at ZapThink have written a report on what they think the "next big thing" will be: they call it location independence, and it's about the merger of the Web/SOA and mobile. As so often with bold predictions in IT... Bill Joy got there first, almost 12 years ago in fact. In 2006 he presented at MIT on "Six Webs: 10 years on", returning in particular to the term he'd coined, the "here" web: the web that you carry around with you and via which you interact when you are on the move.

One of the bits that I think doesn't get enough focus right now is temporalisation, a sort of made-up term that I take to mean "giving something a relevance in time and space". Most applications today are "location independent" and "location unaware", and the closest they get to understanding anything about time is "is it Tuesday?" or lobbing something into a log file.

If I were going to say what the next big thing would be, I think it would be much more about applications becoming temporalised than about them becoming location independent. If I'm a shop owner with special offers, why would I want to put the application anywhere in particular? What I want to do is deploy it to the cloud and have the cloud determine where to put it (for instance in the nearest mobile cell) so it's then available as a service only in that location... in other words it's only available "here".

This basically means that rather than applications becoming large and learning everything about geo-data (like trying to work out where all the war criminals are in Washington DC, a wonderful example of how searching can produce comedy), the other option is to go small. So a small application (e.g. an ad server) could be configured and deployed to a local market. This means that when you search for "shops" you have not only the traditional centralised approach but also the ability to "geo search", by which I mean make a query for services available at a given location. This federated model enables greater targeting but has issues in terms of service comprehension that would need to be addressed.

The final piece in the puzzle is then time. "Here" services would need to know not only where "here" is but also when "here" is. I don't want to be told about a show that finished yesterday or about one that is six months away. If it's 11pm, tell me about the kebab shop; if it's 9am, tell me where to get aspirin.
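Here's a sketch of what a 'here and now' lookup could look like, all names and numbers invented for illustration: each service registers a position, a radius and an opening window, and a query returns only what is relevant at this place and time.

```java
import java.time.LocalTime;
import java.util.List;

// A toy geo-temporal service registry.
public class HereRegistry {
    record Service(String name, double lat, double lon, double radiusKm,
                   LocalTime opens, LocalTime closes) {
        boolean openAt(LocalTime t) {
            return closes.isAfter(opens)
                    ? !t.isBefore(opens) && t.isBefore(closes)
                    : !t.isBefore(opens) || t.isBefore(closes); // window spans midnight
        }
    }

    // Equirectangular approximation: plenty accurate at city scale.
    static double distanceKm(double lat1, double lon1, double lat2, double lon2) {
        double x = Math.toRadians(lon2 - lon1) * Math.cos(Math.toRadians((lat1 + lat2) / 2));
        double y = Math.toRadians(lat2 - lat1);
        return Math.sqrt(x * x + y * y) * 6371; // Earth's radius in km
    }

    static List<Service> hereAndNow(List<Service> all, double lat, double lon, LocalTime now) {
        return all.stream()
                .filter(s -> distanceKm(lat, lon, s.lat(), s.lon()) <= s.radiusKm())
                .filter(s -> s.openAt(now))
                .toList();
    }

    public static void main(String[] args) {
        List<Service> services = List.of(
                new Service("Kebab shop", 51.5074, -0.1278, 1.0, LocalTime.of(18, 0), LocalTime.of(3, 0)),
                new Service("Pharmacy", 51.5080, -0.1290, 1.0, LocalTime.of(8, 0), LocalTime.of(18, 0)));
        // At 11pm only the kebab shop comes back; at 9am only the pharmacy would.
        System.out.println(hereAndNow(services, 51.5076, -0.1280, LocalTime.of(23, 0)));
    }
}
```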

So I think that ZapThink are on to something from an infrastructure perspective, but I think that location and time awareness, and in particular the federation of services to the "here", could really be the next big thing. Of course what I really want to know is what Bill Joy thinks the next big thing is today, so I can understand what I'll be doing in 2020!





Monday, July 02, 2007

Defeating software patents using ideas and the internet

There has been much talk in IT about the evils of software patents, most specifically the stark raving obvious patents that abound. Now looking at the US patent definitions (the worst of the global ones), the key bit that invalidates a patent is of course Prior Art, where Prior Art is defined as
"known or used by others in this country, or was patented or described in a printed publication in this or a foreign country"

Now to me it seems that a blog post can be classified in this day and age as the modern equivalent of a "printed publication".

So here is a simple idea: any time you have an idea, just quickly write a blog post on it. Then in later years you will never think "but I thought of that" when you see a patent application that is just an obvious thing with paperwork. Most of the dumb ideas are things that people either implemented or thought about and just didn't bother getting patented, because that would be silly, and didn't even publish, because the information was so obvious.

If every idea that is had is blogged about, then rapidly all of the obvious ideas will have clear and documented prior art. You think it's obvious that you could use a PVR to create targeted ads? Blog about it, use the Technorati tag "prior art", and suddenly there is prior art to cite. By publishing it like this you are clearly saying "I'm not going to patent this idea" and you aren't going to get into some silly licence argument. The reason to publish your ideas is that you don't like the current patent regime and you want to get rid of silly patents.

Now clearly I'm not a lawyer, but under the definition of the US Patent Office why wouldn't that count?


Friday, June 22, 2007

Bold and old predictions from Microsoft

Remember all those Bill G predictions that turned out to be not so visionary? Well, it appears that this ability isn't limited to the top of the company; BillG has managed to instill it throughout the organisation.

While flying to Amsterdam the other day (a very nice walking city) on business, I was reading through the British Airways in-flight magazine "High Life". On pages 125 and 126 was a double-page advert from Microsoft entitled
"Everything My Computer Science Teacher Taught Me Was Wrong"
It's an article by Andrew Herbert, who is one of their R&D leads. It's basically trying to say how technology is rapidly changing and what that means, and it makes 5 predictions for major change in IT, which refer back to the CS teacher being wrong.

The first one is The Single Threaded Program, of which he says this is a "20th century idea that has become obsolete". Now I have a bit of an issue here, because I'm sitting in my home office right now and looking behind me I can see one of my university text books, "Communicating Sequential Processes", which was published in 1985. Now I don't know what they were teaching at Cambridge in the 20th century, but certainly up at York it was assumed that multi-threaded was the way to go. So this is certainly an old prediction, as it's predicting that the past will happen. It's a good point to stress, however, that single-threaded applications are very limiting, but I hope that Mr Herbert isn't right when he says "it's going to change the way we teach programming", as I'd hope that all good universities have been teaching multi-threading for decades.

Next up is an even less bold statement, in which Andrew predicts the end of "Low-level programming languages" and which contains the cracking line (this is 2007, remember) "Once considered an extravagant use of memory, compilers are now essential tools". This is even less of a prediction than the multi-threading one. The debate of assembler v compiler was pretty much answered in the 1970s, and by the 1980s it was a question of how high-level the programming language was rather than assembler v C/Ada/Smalltalk/LISP etc. This area finishes up with an amazingly bold prediction, however: "We are moving towards designing yet higher level languages with greater levels of automation and self-checking that eliminate programming mistakes"... hang on, did I get that right?
moving towards [...] higher level languages [...] that eliminate programming mistakes
Yes, I did read it right... Oh boy, moving from the old to the practically and mathematically impossible. Apart from issues like the Halting Problem and Busy Beaver, there is the basic challenge that nothing can eliminate wilful stupidity. Reduce, yes... eliminate, no.

The next one is the first real prediction, as Andrew predicts the end of "Screens on Desks", basically saying that future displays will just be "there" as part of the wall or surface, by either projection or direct integration. Now here I'm with him. The ability to ditch this monitor and use all of the wall space in front of me would be cracking when I'm working. It's not overly bold, as some of this technology exists today, but it's a pretty good prediction that can be measured in a reasonable (10 years or less) time frame, and you can see the business case for it.

"Virtual Memory" or Disk swapping of memory to disk is the next thing that is going to die as memory is ceasing to become an issue. Fair enough really again as its a solid prediction that can be measurable, for instance by the next version of Windows after Vista not having Virtual Memory support. What are the odds on that though?

In the last paragraph for Virtual Memory is another sub-prediction (like the programming languages one) that boldly predicts two things: "Within a few years we will probably be able to carry a terabyte of personal storage, enough to hold all the audio and video you'd want to use in a lifetime". So it's a "few years" (let's say 5) to get 1TB. This seems fair enough, as you can get 1TB USB drives these days and the iPod is already at 80GB, with other players significantly higher. So 1TB in 5 years is a rock-solid prediction that can be measured by having 1TB personal players available in that time. The second part, though, is whether 1TB is enough, and here the arithmetic falls apart. Assume that in future everyone will shoot and watch video in HD (or more), with a post-compression bit rate of around 750KB/s: that is roughly 2.7GB per hour, so 1TB holds only about 370 hours. I've already shot 30+ hours of video myself, and as for whether I'll watch more than 370 hours of video/film/TV in my lifetime... the answer clearly has to be yes. So it's a prediction, but it's definitely not a valid one. 1TB is a lot, but it's not all you will ever need.
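For the record, the arithmetic at that 750KB/s rate:

```java
// Hours of video that fit in 1TB at ~750KB/s post-compression.
public class TerabyteCheck {
    public static void main(String[] args) {
        double bytesPerSec = 750e3;
        double gbPerHour = bytesPerSec * 3600 / 1e9;     // ~2.7 GB per hour
        double hoursPerTb = 1e12 / (bytesPerSec * 3600); // ~370 hours
        System.out.printf("%.1f GB/hour, %.0f hours per TB%n", gbPerHour, hoursPerTb);
    }
}
```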

The final prediction is the death of "Hierarchical File Systems", by which he means the OS storing stuff in that sort of format and users accessing it like that. His prediction, again measurable by the next version of Windows (and Linux), is that this will be replaced by "Modern Database technology", which puts his predictions at odds with Google, who seem fine with just leaving stuff around and letting search find it. And isn't this what BeOS sort of did?

It's always brave when people predict the future, so good on Andrew Herbert for doing that. But to describe these things as the "top five obsolete software ideas" says much more about the mindset of an organisation that thought they were valid approaches in the late 20th century than it does about a radical shift happening today. Out of the 5 predictions, two are already mainstream and have been for over 20 years, two are predictions for the future but about hardware (screens and Virtual Memory), and the other one is about operating system specifics that were already released in a product in the 20th century, and which were originally meant to be in Windows Vista.

But remember, folks: "1TB of storage is all you will ever need" can now be added to the list of foolish predictions.


Monday, December 11, 2006

Predictions for 2007

Okay, it's time to don my cap of predictions for 2007. The following should be thought of as about as reliable as the average horoscope, i.e. I'm making them up off the top of my head.

Products and vendors

  1. First off, 2007 will be the year of the SOA technical "platform". Not exactly news, as quite a few people are claiming this today, but 2007 will be the year that really sees this become useful. With Oracle, BEA and IBM all looking like having pretty major releases next year, it's going to be an entertaining marketplace.
  2. Smaller vendors will struggle even more, the justifications for picking niche products will increase.
  3. The "rich and fat" ESB model will die on its arse. Products will start to clearly split the communications infrastructure from the application infrastructure.
  4. Microsoft will slip further behind in the enterprise space while they wait for Longhorn server
  5. IBM will finally admit that it does have a cohesive strategy based around J2EE and that the non-J2EE bits are going to EOL.
  6. Rumours of BEA being bought out by Chelsea FC will abound, then die away once Roman Abramovich realises they don't possess a top-quality international striker.
  7. SCA will become the "accepted" enterprise way of doing things
  8. REST will be delivered into the stacks so they can remain buzzword-compliant; REST advocates will denounce them both as heretical and as proof that REST is the "answer".
  9. Business level modelling will continue to be a pipe dream, filled either with overly complex tools or insanely technical ones.
  10. Oracle will buy some more companies, probably including some sort of registry
  11. IBM will buy some more companies, probably focused around provisioning and management
  12. Windows Workflow exploits will show up in the wild
  13. Some product vendors will finally get the difference between product and application interfaces and stop confusing the two.
  14. Questions will be asked about why you have to pay so much money for an invoicing process in an ERP.
  15. Java being Open Sourced will not be the big "wow" that Slashdot predicted
  16. ERP vendors will start to get their heads around SaaS licensing models
  17. Hardware virtualisation will become the "norm" for new deployments
WS-*
  1. WS-Contract and WS-SLA will remain in the future while WS-* concentrates on more technical challenges.
  2. WS-* will continue to be plagued by insanely simple bugs in various implementations, but vendors will hopefully each have just one WS stack (rather than all having multiples like they do now).
  3. BPEL 2.0 will go up a hype curve like almost no technology in history... people will then complain about their visual COBOL applications being unmaintainable.
  4. WS-* will split into competing factions: those that think everything must be done in "pure" WS-*, and those that think that sometimes it's okay to not use the standard way if it's actually simpler.
REST
  1. REST will start re-creating the sorts of things that WS-* has, so await "RESTful security" and "RESTful reliability" as well as "RESTful resource descriptions" being bandied about.
  2. REST will aim for a MIME type explosion; this won't get very far and will lead to lots of "local" standards that are nothing of the sort.
  3. REST will split into competing factions, those that hold to the "literal truth" of the REST paper, and a more progressive sect who treat it as a series of recommendations that are to be applied with thought.
IT/Business
  1. IT will continue to not care about the TCO and will focus on the cost of development
  2. Some major IT horror stories will emerge based on SOA and REST "failures"; the reality will be that the project was screwed from the start, but it was a damned fine scapegoat
  3. More engineering and measurable solutions will be required by the business
  4. Business will demand more visible value from IT
  5. Offshoring will continue, and South America will rise further as an offshore location
  6. The business will want to see IT clearly split the utility from the value
  7. IT will continue to focus on technical details and miss the business big picture
Well, that is it for starters. Like all good horoscopes there are some specific elements that won't come true and a bunch of generalities that I can claim did. But my big prediction for 2007?
  1. Sun will finally get their act together and pull all those brilliant minds into a cohesive enterprise strategy and ditch all the fan-boy bells and whistles that have dogged their recent past.
Well, I can dream, can't I?
