As everyone knows, when computers were first invented, they were the size of rooms. Slowly, over the following half century, they shrank to the size we expect them to be today. Often, people interacting with these early computers didn’t talk to them directly – they used something that merely looked like a computer – a “dumb terminal” or a “thin client” – essentially a keyboard and screen (and maybe a very basic computer) that left all of the clever processing to a main server computer – the sort that could have filled a room.
As time went on, the home computer market developed and terminals and clients were no longer needed by most people – their computer could do all of the processing there and then. This allowed people to use computers at home – hence “home computer”!
What’s interesting is that, with the move towards “the cloud” in recent years, we’re slowly shifting back to this “dumb terminal” model of computing. And this makes me wonder – will we eventually reach a point where how fast our CPU is, or how much storage our computers have, doesn’t actually matter any more?
Cloud services are already slowly taking over. I’m writing this in Google Drive, for example – rather than using Microsoft Word. Services like Drive as well as the likes of Dropbox mean that I don’t have to worry about where my files are stored – I just know they’re somewhere on “the cloud”. Similarly – all of my music is stored on Spotify, and my videos on Netflix – I don’t need a big hard disk, just an internet connection.
Even gaming is joining the cloud. OnLive has been going for a few years now and works by sending your controls over to the server; then, rather than processing what should happen on your local computer, the game shows a ‘video stream’ of the screen rendered on the server. This requires a fast internet connection, but it does mean that you can use your phone to play a game that looks as good as a top-end PC game.
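To make the idea concrete, here is a toy sketch of that thin-client model – not OnLive’s actual protocol, and all the class and method names are invented for illustration. The client knows nothing about the game: it just forwards control inputs and displays whatever “frame” comes back from the server, which owns all of the game state.

```python
class GameServer:
    """Runs all game logic and 'rendering' remotely (toy example)."""

    def __init__(self):
        self.x = 0  # player position: the only game state in this sketch

    def process(self, control):
        # Apply the client's control input to the game state.
        if control == "right":
            self.x += 1
        elif control == "left":
            self.x -= 1
        # "Render" a frame: here just a string describing the screen.
        return f"frame: player at x={self.x}"


class ThinClient:
    """Holds no game state; only forwards input and shows frames."""

    def __init__(self, server):
        self.server = server

    def press(self, control):
        # Send the control upstream, receive the rendered frame back.
        return self.server.process(control)


server = GameServer()
client = ThinClient(server)
print(client.press("right"))  # frame: player at x=1
print(client.press("right"))  # frame: player at x=2
```

In the real system the `press` call would go over the network and the frame would be compressed video, which is why the connection speed – not the client’s processor – becomes the bottleneck.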
And that’s all it requires – fast internet connections. In a world driven by the cloud, you could conceivably have the same “dumb terminal” with a keyboard, mouse and a screen for the rest of your life – with all of the processor and hard disk upgrades (and so on) taken care of by the employees at the cloud datacentre, with no perceptible difference to you at home.
Living in the cloud like this makes a lot of sense – not only could you conceivably access all of your stuff on any device, be it tablet, computer, or mobile (or, thinking futuristically, Google Glass and Oculus Rift?), but imagine logging into a friend’s dumb terminal and having your stuff appear. Not just your emails in their browser, with their shortcuts – but literally the same screen you see everywhere.
There are perhaps only two things delaying this retro vision of computing from becoming a plausible reality.
The first is fast internet connections. Whilst it’s possible to get home broadband that runs at the sorts of speeds required (I imagine over 30 meg), it’s still not hugely common. It’s only a matter of time though – remember that only ten years ago a not-insignificant number of people were still on dial-up. Even mobile broadband speeds are getting better – 4G is delivering some amazing speeds. It’s just a case of the phone companies putting up enough masts and allocating enough bandwidth to make it ubiquitous.
The other stumbling block is perhaps mobile. As we carry our mobiles everywhere, internet connections cannot be guaranteed (ever been to northern Scotland?) – do we really want our mobiles to become lifeless bricks when an internet connection isn’t there? I mean – I guess mine is already like this (who makes calls these days?), but at least running apps on the phone means I can still play games when the train is held at a red signal on the Northern Line.
But assuming these problems can be overcome – which I think it is reasonable to assume that they will be – what will that mean for our gadgets and the upgrade cycle?
If my phone outsources all of the processing and clever stuff to a server somewhere, then there’s no point in upgrading to a faster processor as it won’t make any difference. Same for my laptop and my games console. Once they’ve reached a basic level of sophistication (wifi, the ability to send control data and receive the screen image), there’s no need to get a new device, as the devices will always have the latest software.
If this future is realised it could cause huge disruption for hardware companies used to making a profit on handsets – but we’re already seeing the shift towards charging for services rather than for physical lumps of plastic, and towards locking users into your app ecosystem rather than letting them install whatever they like on their phone.
So could the cloud be the next big thing? Or have I got the forecast all wrong?
By James O'Malley | November 5th, 2013