Here is a timeline of my data journey, starting from when I first heard about this thing called Computer-Assisted Reporting (CAR).
Well, computers have moved on since journalists were hacking away at spreadsheets a decade ago, so I decided to see how far CAR has come along. This proved puzzling: in almost all news institutions it has been overlooked.
I had been at ITN and the BBC, and was working at CNN, during this period of data curiosity on my part. Social media provided some sort of platform to explore data in the newsroom, it being the latest buzzword that execs are actually interested in (unlike data, which, in my opinion, is a much more fruitful venture when it comes to generating actual news).
So I did my data journalism stuff out of hours and gathered a lot of news from social media during hours. This was made possible by the many web applications built by developers (because they make money, which is the sore point in data journalism).
At the beginning of this year, I up and left the newsroom for the programming terminal, to look at applications for data, serious data. I’m now at ScraperWiki. The thinking behind this: The Times paired a programmer with a journalist to start working on stories for the web, so the programmer had the newspaper as his playing field. But what if you pair a journalist with programmers in the programming playing field? You can make the field. You create the platform for a purpose, and then repurpose it for the data, not the story.
It’s hard to explain, but hopefully this blog about my progress will reveal whether this experiment will ultimately work.
In the meantime, I should endeavor to be more punctual with my blog. My excuse is that I’ve been working on my first news project involving data. There will be a post. There will also be two more Hacks and Hackers Days with ScraperWiki this month, so hopefully new horizons will become clear.
Oh, and if you want to find out how I got the timeline up, it’s in this post.