Posts Tagged ‘MozNewsLab’

So I’m back from Berlin and in the US. I met some amazing people at the Knight Mozilla Hacktoberfest, a four-day hackathon with people from all over the world and from all walks of life. It was the most fun I’ve had all year and I’ve made some friends for life. The project ideas were brilliant and the discussion inspiring. Having the news partners (Al Jazeera, BBC, Guardian, Boston Globe and Zeit) be active participants was a great move on Mozilla’s part. That big news organisations look outside for ideas and solutions shows they realise news is out there, not solely within structured organisations.

I remember first seeing a blog post about this partnership process and thinking: “Wow, I wish I could apply. Shame I’m not a developer”. I went along to the application process out of curiosity and thankfully my creative juices got the better of me.

Even then, my scepticism told me not to expect any part of my MozNewsLab pitch, the Big Picture, to be built in four days, so I made a little side project, MoJoNewsBot. On the third day of the hackathon I presented my data-stream-connected chat bot via the Big Discussion part of Big Picture. Thanks to an amazing participant, David Bello, we got a conference with website submission, approval and iframe designed and coded in two days. Only just before presenting did I find out that he is in management at a university in Colombia and doesn’t code for a living. I was truly blown away by how an idea, once developed, designed and pitched, can be made a reality owing solely to the goodwill of someone who “plays” with code.

You can keep track of both projects, Big Picture and MoJoNewsBot, on the Mozilla wiki. I’m looking to make the first and third parts of Big Picture with further help and advice from the participants. Thanks to the magic of GitHub and DotCloud, I have a local version of Big Picture running on my computer. I’m going to learn JavaScript and add to/clean up Big Picture before I present it formally on my blog. As for my chat bot, I need to add error messages and tidy up the code a bit. Then I’ll relocate him from the #botpark to #HacksHackers on IRC. During events in the US I’m going to add more modules with interesting data for journalists to reference.

To all my viewers, whoever you are, I recommend you hop on the MoJo bandwagon next year. It’ll be the ride of your life! Almost as eventful as driving the ScraperWiki digger 😉


Things have been quiet on the blog front and I apologize. What began as a tumultuous year with a big risk on my part has become even more turbulent, though happily with opportunities rather than uncertainties. Trips to Germany and the US have landed in my lap, both hugely challenging and exciting.

I completed the Knight Mozilla Learning Lab successfully and have been invited to Berlin for the MoJoHackfest next week. I’m really looking forward to meeting all the participants and getting some in-depth, hands-on experience of creating applications built around a better news flow.

This sits at a level between the hack days ScraperWiki ran and the development of the ScraperWiki platform itself (I play no part in that, but work closely with those who do), which is more akin to the development newsroom.

My pitch for the Learning Lab, Big Picture, asks a lot of developers who are coming with their own great ideas and prototypes. I would love to get some of the functionality working, but that very much depends on the goodwill, skills and availability of a small group of relative strangers.

I have a tendency to bite off more than I can chew and to ask a lot of people who have no vested interest in my development. I am acutely aware that I cannot build any part of the Big Picture project myself. That being said, I have built a new project that can be added to with a basic knowledge of Python. I give you MoJoNewsBot:

If you want to know more about how the Special Advisers’ query was done, read my ScraperWiki blog post. Also, I fixed the bug in the Google News search so the links match the headlines.
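For the curious, here is a minimal sketch of what a news-lookup module along these lines might look like; the feed URL, function names and the wiring into the bot are my assumptions for illustration, not MoJoNewsBot’s actual code. The point is the fix mentioned above: take the title and link from the same feed item so they always match.

    # Hypothetical sketch of a Google News lookup for a chat bot module.
    # The RSS endpoint and all names here are assumptions, not MoJoNewsBot.
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    def google_news(query, limit=3):
        """Return (headline, link) pairs for a news search."""
        url = "https://news.google.com/rss/search?q=" + urllib.parse.quote(query)
        with urllib.request.urlopen(url) as response:
            tree = ET.parse(response)
        results = []
        for item in tree.findall(".//item")[:limit]:
            # Title and link are read from the same <item>, so the link
            # always matches the headline.
            results.append((item.findtext("title"), item.findtext("link")))
        return results

    if __name__ == "__main__":
        for headline, link in google_news("special advisers"):
            print(headline, "-", link)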

Come October I will be heading to the US to help fulfill part of ScraperWiki’s obligations to the Knight News Challenge. I am honoured to be one of ScraperWiki’s first full-time employees and actually get paid to further the field of data journalism!

Being part of a startup has its risks. No one’s role is ever fully defined. This really is a huge experiment and I’m not sure I can even describe what it is I am doing. I am not a noun, however; I am a verb. My definition is in my functionality, and defining this through ScraperWiki, MoJo and any other opportunities that come my way will be the basis of this blog from now on. So my posts will be sporadic, but I hope you look forward to them.

The Big Picture

Posted: August 4, 2011 in My Data Journey

The Big Challenge:

The final outcome of the Knight-Mozilla learning lab is a one to two page (800 to 1,000 words max) proposal – that includes a “show and sell” pitch, design document and business brief – for a software product to be integrated into a news organization.

Build upon an idea you submitted as part of one of the challenges (open video, comment systems or web apps) or pitch an entirely new idea. We encourage you to think big and bold…but, with one caveat: consider the viability of your project for the newsrooms and end users of the product.

The Big Sell:

The Big Design:

The design consists of three separate but interconnected web applications. The submission application (the Big Bucket) allows crowd-sourced curation of online content. The live show (the Big Discussion) is a webinar format that allows multiple video links, a voting and question panel, and content viewing. The end product (the Big Picture) is automatically generated from the previous two applications.

The Big Bucket

Content is uploaded along with a justification (word-limited), an email address (for the link to approve the submission) and an online contact (to be added as a link). The moderator will have algorithmic support in the form of meta-moderation: weighting, social metadata and user grading. Once approved, the content is added to the Big Bucket and the contributor, along with his/her details, is added to the ‘Contributors’ list. Those on the list get video and audio access. Invites to the Big Discussion will be sent via email along with screencast tutorials and spec testing.
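To make the meta-moderation idea concrete, here is a minimal sketch of how a submission and its moderation score could be modelled; the field names, weights and the 1-5 grading scale are illustrative assumptions on my part, not part of the proposal.

    # Hypothetical model of a Big Bucket submission; fields and weights
    # are illustrative assumptions.
    from dataclasses import dataclass, field

    @dataclass
    class Submission:
        url: str
        justification: str   # word-limited case for inclusion
        email: str           # receives the approval link
        contact: str         # online contact, shown as a link once approved
        shares: int = 0      # social metadata, e.g. share counts
        grades: list = field(default_factory=list)  # 1-5 grades from users
        approved: bool = False

    def moderation_score(sub, w_social=0.4, w_grades=0.6):
        """Weighted score to help the moderator rank the submission queue."""
        avg_grade = sum(sub.grades) / len(sub.grades) if sub.grades else 0.0
        # Cap shares so one viral link does not drown out user grading.
        social = min(sub.shares, 100) / 100.0
        return w_social * social + w_grades * (avg_grade / 5.0)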

The Big Discussion

During this live session there will be three types of participant: viewers, who can watch the live show, vote and submit questions to the presenter; ‘Big Fish’, who can be called up on audio or video to justify why the content they submitted should be part of the Big Picture; and experts, who will be discussing the content (permanently on video). The final Big Picture page will be automatically generated from this session, with the single video recording behaving like the main window in Google Hangouts, which shows the loudest speaker.

The Big Cheese

The presenter, or ‘Big Cheese’, will be in control of the time allocated to each item of content and can change the content by moving a Big Bucket item into the content window. The items in the Big Bucket and the ‘Question stream’ can be deleted and reordered during the Big Discussion. All questions are wiped when a new content item is selected, and a viewer can only submit one question at a time. The ‘Big Cheese’ and moderators can add content to the Big Bucket during the Big Discussion, but once it starts the Big Bucket will be closed to the public.
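Those question-stream rules are simple enough to sketch in a few lines of Python; the class and method names below are mine, purely for illustration.

    # Hypothetical sketch of the Big Discussion question-stream rules.
    class QuestionStream:
        def __init__(self):
            self.questions = {}  # viewer_id -> question text

        def submit(self, viewer_id, text):
            if viewer_id in self.questions:
                return False     # one question per viewer at a time
            self.questions[viewer_id] = text
            return True

        def remove(self, viewer_id):
            self.questions.pop(viewer_id, None)  # moderator deletion

        def new_content_item(self):
            self.questions.clear()  # wiped when new content is selected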

My main concern is that I have built my proposal thinking of the Minimum Viable Person rather than the Minimum Viable Product. The lowest level of engagement with Big Picture is the Big Picture page itself. This adds news value online by providing the readability and finishability of a news issue. It makes the conversation the navigation and gives people relevancy, with the option to read the content items.

Not articulating content directly may seem inconsistent with the news media; however, we can no longer assume an entry point of knowledge consistent with a single publication. More and more people are getting news and updates from social media, but also from TV and good old-fashioned conversation. Rather than assuming value lies in delivering what is new, news organisations are taking on a curatorial role and need to deliver depth and understanding, but also a complete product where people don’t feel lost and consumed by the tidal flow of information streaming onto a sea of devices.

The next level of participation, the Median Viable Person, is the viewer of the Big Discussion. This involves logging in, voting and possibly submitting questions. The entry threshold would be minimal, but one would expect a smaller audience than that of the end product (the Big Picture). The top level of participation, the Maximum Viable Person, is the person who submits content. This will require getting to know the command console and setting up audio and video. This commitment raises the threshold of participation; however, I see this as a bonus, the soft power that ensures content is of value to the news issue and lowers the energy spent on moderation.

So the value to the audience is weighted and the participation ladder is set up to aid the quality of the content. My concern on the product side, however, is that three separate web apps need to be built in conjunction for the Big Picture to work. Dedication, belief and commitment will be required. In return, three tiers of value are added to the newsroom, one with each application. The Big Bucket brings crowd-sourced curation into the newsroom while the journalists are busy working on the story. The Big Discussion not only creates a platform for UGC while allowing the social value to remain within the users’ space (thus encouraging UGC to come to you), but also gives journalists and experts a platform to interact with the audience and elevate their standing. The Big Picture automatically creates a ‘netcast’ built around a news story that works across multiple devices, keeping the conversation, insight and depth in one place, i.e. not spread across articles, across news outlets, across media.

Two major concerns are scalability testing of the software and the legalities of republishing content that is not the intellectual property of the publishing organisation. These would need to be discussed with the developers and the news organisation. Others include avoiding duplication of online content and making the control of the Big Discussion flexible yet simple, to accommodate the pliability required for working in a live news environment. All the applications will be built with existing web infrastructure in mind.

The (not so big) Brief:

The Big Picture is a platform, not a product. It is built using human consumption rather than content management as the template for storytelling. It is to enlighten rather than inform. It is to source rather than supply. It is to finish rather than continue. It is to build on what is open rather than compartmentalise what is closed. It is to cross platforms rather than cater for them. It is to find the voices that add authority rather than echo the voices in authority. It is the studio discussion for online news, and its sole purpose is to provide everyone with the Big Picture.

The Big Pictures:

The Little Links:

  • How I got the idea for The Big Picture
  • Where I got my extra thinking from, with a talk through of the design slides

So I say this is a journey, I say this is an experiment. In that case this blog is my journal/lab book. I am coming to the end of the Knight-Mozilla Learning Lab and you’ll read my proposal shortly. Although it wasn’t coding-led, it did highlight the importance of coding for the future of journalism, and I was very happy to be invited to take part, as it brought me back to my student days of lectures and homework. I’m learning to code from the bottom up, and UX and design sit nearer the top of the stack; I wanted to leave them until later, but the opportunity arose sooner.

Opportunities are something I haven’t been short of lately. Conferences, webinars and workshops: I am now giving them rather than just attending. I’ve had to turn down an invite to the NUJ seminars. My blog has gone from sink to source. The top two search terms are ‘datamineruk’ and ‘data miner uk’. I never planned for all this, nor the 1,000+ Twitter followers. This was all supposed to be an internal tool, for learning. But learning has turned to teaching, and teaching turns to knowing. And knowing is what turns you from a sink to a source. But the cycle must continue, and that’s my big concern. That’s why I am attracted to data journalism: you can never know it all.

Maybe that’s what deters traditional journalists from the niche. Or maybe they don’t feel learning is part of the job. I am a typical nerd: I need to study. But I’m disillusioned with institutional education, so I’m setting up my own coursework with no real qualification other than satisfaction. For me, it’s about the knowing, and my downfall is being able to mediate that. I come from a strong scientific background, where discovery is my strong point and mediation my sore point. My communication was good when I was in science, but in journalism it was used as a stick to beat me. I didn’t see the point of making stories ‘punchy’ when they came from press releases.

I now find myself a data journalism advocate, which puts me in the limelight more than I had intended. I haven’t put my picture or CV on this site for a reason: it’s not about me. However, a really lovely journalism graduate interviewed me for her blog, datajournalismblog, so here I am:

It’s a very good site. I recommend you join. For most people who get into broadcast journalism, it’s all about them and getting their face known. For me, it was because I am more articulate in conversation than in writing, being used to academic writing. That, and I liked the constructive nature of filming and editing, although I am now enjoying that aspect of programming more, along with data mining using ScraperWiki. Or ‘the pursuit of facts in plain sight’, as I now like to call it thanks to Evan Hansen, Chief Editor of Wired.com.

That being said, the Knight-Mozilla Learning Lab has taught me some great lessons that apply to journalism as much as to code. Chris Heilmann, Mozilla Developer Evangelist, said “The web is amazing but where is the amazing?”; the same can be said for journalism. Jesse James Garrett, co-founder of Adaptive Path, said “Good design has human experience as the outcomes and engagement as the goal”; so should journalism. Oliver Reichenstein, CEO of iA, said “Really understand what you need to do. If you don’t you can’t work” in terms of prototyping, but the same can be said for journalism. Echoing my blog, Mohamed Nanabhay, Head of Online at Al Jazeera English, said “Any [news] technology project should solve problems journalists have, even [ones] they don’t know they have”. Reflecting my life mantra, Shazna Nessa, Director of Interactive at the Associated Press, said “Frustration is part of the challenge, don’t let it poison your mission”.

Building a new product, working for a new business, exploring a new area of journalism means taking risks. I like taking risks. If you don’t take risks you can’t get lucky.

I’m over halfway through the Knight Mozilla Learning Lab, with lofty goals of changing the way we experience news online. I came up with my idea after filming a rant by my friend and colleague. I threw something down very quickly on paper and have since made wireframe mock-ups using MockFlow (thanks, Chris Keller, for suggesting that software).

I have no experience in UX (user experience) and no design skills to speak of. I’ve just started learning to program, but my objective was never to build sites or applications; it was to get at data for story generation and knowledge in the newsroom.

The learning lab so far has included some really big hitters, and even though I don’t write directly about their words of wisdom (the criteria for forming my software proposal have come from blogs and an article by Jeff Jarvis, who will be giving the last webinar), their advice manifests itself as the cogs in my idea-making machine. So I give you my idea so far, for which I will have to write a proposal. It’s a bit long and rambling, but I don’t like to edit out the person so much. These are my ‘workings’, which will be refined and made succinct in the traditional media fashion.

Here is Phil Gyford’s blog post, Andy Rutledge’s blog post, Brad Colbow’s blog post, Anil Dash’s blog post and Jeff Jarvis’ article. Highly recommended reading. Also, here is the blog post about the calculated and strategic killing of future party leaders by someone whose thinking was so frighteningly designed he could not be insane. Here also is a blog post about comments on the Norwegian massacre in Hebrew, which sheds another horrific light on anti-Muslim sentiment worldwide. This is a particularly good example of how I need user-generated content to access parts of the web which would not otherwise be available to me.

This week, Shazna Nessa, Director of Interactive at AP, spoke about making changes in the newsroom and working with staff who have a multitude of different skills. Now, I work with geeks. They make my world a better place to live in. I’ve been introduced to MakerNight, where I’m building a hamster feeder that looks for a Twitter hashtag, and GeekUp, where mostly we go to the pub! That being said, I presented “The Big Picture” at the last GeekUp in Liverpool to get some geeky guidance. Here’s the result; pardon the modulating audio, I didn’t have a mic:

I see ‘The Big Picture’ as a major collaborative effort between the public, the newsroom community managers, journalists (as they’ll know the topic and should be amongst the invited guests), and experts in the field, who may even be the presenter. The minimum viable person for this project is Joe Bloggs, so only minimal technical skills are required. Just as a discussion show requires producers, journalists, researchers, directors and studio hands, so everyone should be involved in providing the big picture.

Here’s a video of my friend Francis saying how online news is s**t. He started giving out, so, having a background in broadcast journalism, I filmed him for the purposes of the Knight-Mozilla News Challenge. Now, Francis doesn’t have a TV or radio and doesn’t read the newspaper (he gets The Economist). For daily news, his laptop does it all. It is his only outlet to the world wide media monster. Yet he thinks no one has got online news right:

So here is Today’s Guardian. I would highly recommend you read Phil Gyford’s post about his creation. He has tried to recreate a printed newspaper online, and it’s beautiful in its simplicity. Why try to reinvent the wheel? Newspaper structure, design and user experience have been fine-tuned for centuries now. It is the oldest medium. Why are we trying to kill it with graphics and clicks and a maze of navigation tunnels in an attempt to decipher what is relevant?

This obsession with personalised personalisation is distracting from the fact that people want to get together and combine their knowledge to understand what is new in the world and how it affects them. Online news separates you from the crowd; it isolates your knowing by assuming an entry point (and placing background and foreground somewhere within each article), and it gives you no relevancy by limiting conversation to a stream of comments from unidentified sources at the bottom of each article. The article is not the right space. The principal element of news should be the story, not the article (there’s a difference). The article is no longer the atomic unit of news. So why are we trying to put everything there?

In Phil’s sense of Friction, Readability and Finishability, why don’t we try to take news pre-mediation? Why don’t we take it back to the conversation? You engage, you communicate, you understand. Social reduces friction, people’s understanding is what is most readable, and the conversation gives you something tangible in your head. It creates a “thing” which you can take away and personalise in your own head, in the form of enlightenment.

In that vein, I give you my proposal for the Knight-Mozilla News Challenge. I call it “The Big Picture”, and it is a mashup of the studio discussion, Storify, Big Blue Button, a Reader and Phil’s creation. Sounds crazy, right? But the key point is: the conversation is the navigation, and you comment with content to create an editorial, crowd-sourced democracy for a news issue.

Here’s a quick video going through the drawings I threw down later that night, after filming the above video (I’ll try to make a better MVP for the proposal; I’m not sure a prototype can be made in time):

Here are pictures of my musings late one night:

The Big Idea

The Big Bucket (of content)

The Big Discussion

The Big Picture

Burt Herman, former bureau chief and correspondent for The Associated Press, CEO of Storify and founder of Hacks/Hackers, gave the following webinar to the participants of the Knight-Mozilla Learning Lab:

Here is my take on how the elements needed to build a business for a newsroom are also what you need to make building a virtual newsroom your business.

Follow your passion

My passion is data journalism: data and journalism in equal measure. Burt established his career before he took the big step of starting his own business. I had yet to start my career when I decided to make data journalism my business. My foray into coding is not to build an end-of-line product to mediate information, but to build machines for the factory: to build things for journalists that machine-read information, unearthing stories that cannot be caught by the human eye. It may be a quick way to gauge where Cabinet Office money is going (click on the image to get to the view); every time the data is updated, the visual will update automatically.

Or a way to get information to the public such that you can catch the conversation, such as judges who have been reprimanded over personal conduct (read the blog post).

Or an email alert system to bring potential stories to the journalist.
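As a rough illustration of that last idea, here is a minimal sketch of such an alert, assuming new rows from a scraper have already been collected into a list of stories; the addresses, field names and mail server are placeholders, not a real system.

    # Hypothetical email alert; addresses and server are placeholders.
    import smtplib
    from email.mime.text import MIMEText

    def send_alert(stories, to_addr="journalist@example.com"):
        body = "\n".join("%s\n%s\n" % (s["headline"], s["url"]) for s in stories)
        msg = MIMEText(body)
        msg["Subject"] = "%d potential stories found" % len(stories)
        msg["From"] = "alerts@example.com"
        msg["To"] = to_addr
        with smtplib.SMTP("localhost") as server:  # assumes a local mail server
            server.send_message(msg)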

Burt already had a name for himself in journalism. I have not. I’m not looking to make my name; I’m looking to make things that help me find stories that might not otherwise be told.

Build a community

Around this time last year I started this blog and my Twitter account, not to broadcast what I know but to act as a semantic sink for all things data journalism, so I could find the people who could educate me by what they publish: my data miners. As much as I was working for them, they were working for me. They led me to the Hacks/Hackers community and ultimately to the ScraperWiki team.

Build a team

ScraperWiki is not my team (as much as I love them). The ScraperWiki community is my team. The Australian who builds planning alerts is on my team. The Icelander looking into foreclosures is on my team. I can see their code; I know what they build; I can ask them for help. My scrapers are my team (and I’ve built those!).

Just build it

Just scrape it. Data in the public interest is public data. Now I can write a scraper in a day, or add little things on. And that’s how I’m learning to code, but every piece of code I write has to have a journalistic purpose (whatever way you define that!).
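To show how small a day’s scraper can start, here is a minimal sketch in plain Python (not a ScraperWiki scraper): fetch a page, pull out its links, save them to SQLite. The target URL is a placeholder, and a real scraper would parse the specific fields a story needs.

    # Minimal illustrative scraper; the URL is a placeholder.
    import sqlite3
    import urllib.request
    from html.parser import HTMLParser

    class LinkParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

    page = urllib.request.urlopen("https://example.com").read()
    parser = LinkParser()
    parser.feed(page.decode("utf-8", "replace"))

    db = sqlite3.connect("scraped.db")
    db.execute("CREATE TABLE IF NOT EXISTS links (url TEXT PRIMARY KEY)")
    db.executemany("INSERT OR IGNORE INTO links VALUES (?)",
                   [(u,) for u in parser.links])
    db.commit()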

Listen to your users

Listening to the stream of information delivered to me by my data miners is what led me to take a leap of faith and leave CNN to join ScraperWiki. I would never have been able to judge, even from within a news organisation, that data journalism was worth pursuing. But I was able to glean this by tracking the metadata from my blog and my Twitter footprint.

Stay flexible

Use backend, barebones code. Make it open. Mould it to the purpose of your journalistic endeavour. Here’s what I want to make (which is limited by what I can make!).