Open Data in Government
I've been thinking a lot recently about the accessibility of information that the government collects and sometimes even distributes. We've blogged in the past about the problems of getting postal code to riding data. We ended up purchasing the data, but each time we do, the process is so inefficient that it seems like it must cost the government more to sell it to us than to give it away. Fortunately, there's been a lot of movement in this area around the world and there are a lot of good things to be inspired by.
In the USA today, the Obama administration launched Data.gov, which aims to "increase public access to high value, machine readable data-sets generated by the Executive Branch of the Federal Government." Now, government data-sets seem pretty boring at the outset, but if you are trying to understand a problem they could be just the information you need. They are offering data in XML, CSV, text, KML & other map formats. No restrictions (that I could see) on how I use the data. They've even got a rating system, and a call for other suggestions. This is a great step forward.
It's being propelled by the great folks at Sunlight Labs, who along with Google & O'Reilly are sponsoring Apps for America 2: The Data.gov Challenge. What better way to make government more accountable to its citizens than to ensure transparency by opening up data for public scrutiny? The challenge is also going to inspire a bunch of new open source software to be developed and released to help others make good use of this data too. A lot of software is going to be developed for under $20K in prize money.
Even without new software being developed, there is so much that you can do to make data more meaningful with existing visualization tools like IBM's Many Eyes. There are also a great many open source tools to help with this, ranging from the SIMILE Widgets to OpenGeo mapping tools. We're looking forward to doing more mapping with Drupal's GMap or OpenLayers modules.
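Just to give a sense of how little it takes, here's a quick sketch, not using any of the tools above, of charting a downloaded data-set with a few lines of Python and matplotlib. The file name and column names are made up for the example:

```python
# A hypothetical few lines showing that charting a downloaded data-set
# doesn't require custom software. "spending.csv" is an invented file
# with "year" and "amount" columns.
import csv
import matplotlib.pyplot as plt

years, amounts = [], []
with open("spending.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        years.append(int(row["year"]))
        amounts.append(float(row["amount"]))

plt.bar(years, amounts)        # simple bar chart of spending by year
plt.xlabel("Year")
plt.ylabel("Amount ($)")
plt.title("Spending by year")
plt.savefig("spending.png")    # or plt.show() for an interactive window
```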
A bit closer to home, I was impressed to hear about the results from Change Camp Ottawa. Unfortunately I wasn't able to attend, but I was very happy to hear about the energy & ideas that went into this unconference. One of the sessions was about a new initiative, StimulusWatch.ca, to watch government stimulus package spending. We've begun giving them a bit of support to help them move this project along. Other groups like CivicAccess & VisibleGovernment are also advocating for better access to this data.
I was also encouraged by receiving an RFP today from a Canadian government agency that wanted a site that could produce RDFa. This has been a big push in the Drupal community since Dries' keynote address at the Boston DrupalCon in March 2008. So much of the time it seems that government IT is behind the leading edge, but perhaps times are changing.
Jeni Tennison has an interesting blog, and in her article "Your Website is Your API: Quick Wins for Government Data" she makes a good case for an approach to developing more accessible frameworks for government data using existing technology. Her approach is to identify the data that you control, represent that data in a way that people can use, and expose it to the wider world. Her challenge is to do it now rather than waiting until you've got the resources to do it "right". With a thoughtful approach, information can be exposed so that it is useful to those outside your department.
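To make that concrete, here's roughly what one of those quick wins could look like. This is my own minimal sketch rather than anything from her article, and the file names are invented: take a spreadsheet you already publish and put a machine-readable copy of the same data alongside it, so nobody has to screen-scrape.

```python
# A rough illustration of the "quick win" idea: expose data you already
# publish in a machine-readable form. File names are hypothetical.
import csv
import json

# Read the spreadsheet export that already sits behind the web page.
with open("grants-2009.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Write a JSON copy next to the human-readable page so other people's
# tools can pick it up directly.
with open("grants-2009.json", "w", encoding="utf-8") as f:
    json.dump(rows, f, indent=2)
```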
Now I was thinking about where I could get open data from the government and use it to track something that concerns most Canadians. Knowing that the Weather Office of Environment Canada is an active supporter of open source, and that there is nothing Canadians like to complain about more than the weather, I thought this was a great place to start. I decided that I would run an evaluation of the weather reports by parsing the RSS feeds that are produced for Ottawa's weather conditions.
It's true, it's hardly something as interesting as tracking the amount of $$ that the Canadian government is spending to support women's groups in Afghanistan and being able to plot it against the number of times that our politicians have used the plight of women to explain why we have put our troops there. But we don't have ready access to that data to be able to do that type of analysis.
The good news is that the weather office is using a standardized format and that I can use common parsing tools to extract much of the data out of their feeds.
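Here's the kind of thing I mean: a minimal sketch using the Python feedparser library. The feed URL is the Ottawa city feed as I understand it, so treat it as an assumption and check the Weather Office site for the current address for your city.

```python
# A rough sketch of pulling items out of Environment Canada's city weather
# feed with the feedparser library. The URL below is assumed to be the
# Ottawa feed and may change over time.
import feedparser

FEED_URL = "https://weather.gc.ca/rss/city/on-118_e.xml"  # assumed Ottawa feed

feed = feedparser.parse(FEED_URL)
print(feed.feed.title)

for entry in feed.entries:
    # Entries cover watches/warnings, current conditions and forecast periods.
    print(entry.title)
```

From there it's mostly a matter of storing each day's forecast entries and comparing them against the conditions that actually show up.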
I will do a bit more work on this over the weeks to come to see if I can't flesh out this tool to give more accurate reporting on the forecasts. I'll also be extending it to other cities, especially Whistler, where I just found out that Environment Canada has placed a million-dollar Doppler system for weather prediction for the 2010 Games. Crazy the things we've got money for and those we don't.
About The Author
Mike Gifford is the founder of OpenConcept Consulting Inc, which he started in 1999. Since then, he has been particularly active in developing and extending open source content management systems to allow people to get closer to their content. Before starting OpenConcept, Mike had worked for a number of national NGOs including Oxfam Canada and Friends of the Earth.