Open data, by its nature, should be for everyone. That’s why many good people, such as Chris Taggart, Will Perrin and The Open Knowledge Foundation, are trying to get data to the people who care, and who will care enough to look into it that bit further: the people who will find the sources, ask the questions and dig out the real story behind the numbers.
For me, this is the real key to citizen journalism. Those mentioned above are the mavericks of data journalism because they are hyperlocal: they aim to reach further by breaking data down into what people care about and what they can understand.
Mainstream data journalism moves hand in hand with interactives and visualizations, both of which I love and believe have not yet been given their rightful place in the news sphere. The Guardian Data Store is a great example, but it paints a painting-by-numbers picture of open data rather than contextualizing it.
My journey will be to discover how far one minuscule individual can go with the mountain of gigabytes freely available. I’m looking to read the numbers, but first I must manage them. So it’ll be a combination of MySQL, Excel, Many Eyes and many more things I find along the way. I will share it with you, if you’d like to accompany me on my journey.
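To give a flavour of what “managing the numbers” might look like before anything reaches MySQL or a chart, here is a minimal Python sketch of a first sanity check on a downloaded CSV. The councils, years and spending figures below are invented for illustration, not real open data:

```python
import csv
import io
from collections import Counter

# A toy slice of "open data" standing in for a real downloaded CSV
# (e.g. council spending files). All values here are made up.
sample = """council,year,spend_gbp
Camden,2009,1200000
Camden,2010,1350000
Hackney,2009,980000
Hackney,2010,1010000
"""

rows = list(csv.DictReader(io.StringIO(sample)))

# Before importing into MySQL or visualizing in Many Eyes,
# check the row count and total spend per council.
totals = Counter()
for row in rows:
    totals[row["council"]] += int(row["spend_gbp"])

print(len(rows))         # 4 rows
print(totals["Camden"])  # 2550000
print(totals["Hackney"]) # 1990000
```

A quick pass like this catches broken rows and implausible totals before hours are sunk into queries and charts built on bad data.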
It will be time-consuming but not money-consuming. I am a recent graduate and a hair’s breadth away from rummaging in bins. Martin Moore of the Media Standards Trust blogs: “Soon every news organization will have its own ‘bunker’ – a darkened room where a hand-picked group of reporters hole up with a disk/memory stick/laptop of freshly opened data, some stale pizza and lots of coffee.”
May I someday earn said stale pizza…