Amazon makes a terabyte of public data available on its servers


Amazon’s got quite a bit of spare server capacity. In its quest to become the world’s top online retailer, it bought so many servers that it now also runs a cloud computing business on the side, one that’s actually rather cheap.

Last night, Amazon announced on its Amazon Web Services blog that it would be making a terabyte of public data available to its cloud computing users, for them to do whatever they like with.

The data includes stats from the US Bureau of Transportation Statistics, an entire dump of Wikipedia, the DBpedia knowledge base (which includes info on 2.6 million people, places, films, albums and companies) and all publicly available DNA sequences, including the entire human genome.

There’s also a bunch of other stuff, and it’s all being made available at lightning-fast speed in machine-readable databases to Amazon’s cloud computing customers. It’ll take a while for the internet to really get to grips with this stuff and use it, but anything that’s about freeing up data and information is wholly supported around here. Three cheers for Amazon.
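For the technically curious, here’s roughly how a cloud customer gets at one of these datasets: Amazon exposes them as Elastic Block Store snapshots, so you create a volume from a snapshot and attach it to your own EC2 instance. The sketch below is purely illustrative, using Python with the boto3 library; the snapshot and instance IDs are hypothetical placeholders, not real dataset identifiers.

import boto3

# Minimal sketch: turn a public dataset snapshot into a disk attached to
# your own EC2 instance. All IDs below are hypothetical placeholders.
ec2 = boto3.client("ec2", region_name="us-east-1")

# Create a fresh EBS volume from the public dataset's snapshot
volume = ec2.create_volume(
    SnapshotId="snap-0123456789abcdef0",  # hypothetical dataset snapshot ID
    AvailabilityZone="us-east-1a",        # must match your instance's zone
)

# Wait until the volume is ready, then attach it to your instance
ec2.get_waiter("volume_available").wait(VolumeIds=[volume["VolumeId"]])
ec2.attach_volume(
    VolumeId=volume["VolumeId"],
    InstanceId="i-0123456789abcdef0",     # hypothetical EC2 instance ID
    Device="/dev/sdf",
)

# Once attached, mount /dev/sdf on the instance and the dataset is just
# files on disk: a Wikipedia dump, genome sequences, whatever you picked.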

What would you do with the data? Work out why your trains are always late? Work out how many degrees of link separation lie between one random Wikipedia article and another? Use the human genome to create a clone army and take over the world? Share your ideas in the comments, and make me your second-in-command as world leader.

Amazon Blog (via ReadWriteWeb)

Duncan Geere