Why governments should outsource open data to ESRI and Google

Mapping, environment, demographics or transport data: From the White House to Europe and Canberra, it seems like governments are suddenly throwing the whole lot over the fence in the name of transparency. #OpenData is the new buzzword in the land of open web awesomeness, and everybody shouts, hooray!

In the UK the government has so far published over 3000 datasets on data.gov.uk. This is truly great news and there are some interesting nuggets in there. The media somewhat naively refer to it as a huge “database” but, in reality (with the exception of Ordnance Survey mapping), it’s an unsightly bucket of wonky PDFs, random Word documents, grainy JPEGs, and dubious spreadsheets that look like they fell off the back of a truck. That’s because they did. Which is exactly the point.

Open data reminds me of the tree house I recently built. It looks like a garden shed dropped from 30ft. But the kids love it, so it’s fit-for-purpose. It’s just not very usable for other purposes. And the same goes for government data.

But the quality of open data will improve, you say. Indeed, some of it will improve. But governments primarily exist to govern, not to create beautiful datasets. People would rather have more police on the streets and not get mugged than see awesome data showing all the muggings.

But good open data will improve those very public services, you say. True, but I doubt that professional experts in public agencies need to be told by amateur Joe Bloggs how to do their jobs. Transparency and accountability are great, but “like citizen dentistry, some things are best left to the experts.” It’s the same with open data. Let’s leave that to the experts too.

In the US, the White House and ESRI have done a major deal on hosting all federal geodata through a proprietary ‘open’ portal. A contradiction in terms? Anti-competitive? Who cares – both parties are very clever and you can read more about it on James Fee’s blog. The government gets a free service, ESRI gets the data format advantage and everyone else can piggy-back off it using industry standards like OGC. This nicely sorts out the 21.5% I mentioned in my previous blog post.
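To make that “piggy-backing” concrete: any OGC-compliant endpoint, whoever hosts it, can be queried with plain HTTP. Here is a minimal sketch in Python using a standard WFS GetFeature request; the endpoint URL and layer name are made up for illustration, not a real government service.

```python
# Minimal sketch: fetching features from a hypothetical OGC WFS endpoint.
# The URL and layer name are placeholders, not a real service.
import requests

WFS_URL = "https://geodata.example.gov/wfs"  # hypothetical endpoint

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "public:crime_incidents",  # hypothetical layer name
    "outputFormat": "application/json",     # ask for GeoJSON back
    "count": 100,                           # limit to the first 100 features
}

response = requests.get(WFS_URL, params=params, timeout=30)
response.raise_for_status()

features = response.json()["features"]
print(f"Fetched {len(features)} features")
for feature in features[:5]:
    print(feature["properties"])
```

The point being: once the data sits behind an open standard like WFS, it no longer matters much who operates the server.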

And while we’re at it, why don’t governments just give the rest to Google to host and manage, on behalf of everyone else, so we can actually find it and slice & dice it off the shelf? That way governments can get on with the job of governing, and the economy can get on with the job of innovating.

But of course there are many commercial and other reasons why this would be undesirable. Besides, it wouldn’t pass competition or privacy laws in Europe. So instead there will be continued piecemeal data creation, management, duplication, and tendering. As someone once said, “there’s a lot of money to be made in prolonging the data problem.”

We can’t have it both ways. So just forget everything I wrote. Instead, how about you enjoy some truly beautiful data in a new book by http://www.InformationIsBeautiful.net. Now that is awesome. And, interestingly, its main source is… Wikipedia.

2 thoughts on “Why governments should outsource open data to ESRI and Google”

  1. Thierry, worthy comment as always. You know in Europe it just ain’t gonna happen. Well, not in the medium term.
    People are still not savvy enough to use the data fully, and too many people just don’t care. They will, but it’s gonna take a while. To be honest, at the moment GeoInformation is the opiate of the techno-masses.
