
September 19 2011

10:53

Help Get Olympic Data off the Start Line

As part of Media2012 we’ll be running (no pun intended) a Hacks and Hackers Data Journalism workshop.

It’s part of the Abandon Normal Devices Festival. It’ll be on 2nd October from 11:00-17:00 at FACT (Foundation for Art and Creative Technology) Medialab, 88 Wood Street, Liverpool, L1 4DQ.

So if you’re interested in sports data and want to see times, points and medal tables get off the line then come on down.

To book email hello@andfestival.org.uk

Most importantly, beer and pizza will be provided!

So watch out London 2012, you’re being ScraperWikied!


April 28 2011

13:30

A reluctant goodbye to Guardian Local

ScraperWiki is sad to hear that Guardian Local is being wound down, just over a year after its public launch. We’ve had the good fortune to work with the talented Guardian Local journalists at three of our Hacks & Hackers events: in Cardiff, Leeds and Glasgow.

We would like to say a particular thank you to the project’s editor, Sarah Hartley, for her generous help. We wish Sarah, Hannah, John and Michael the very best in their new ventures, whatever they may be.

As you can see from the comments under the Guardian post announcing the sites’ closure, the beatbloggers, led by Sarah, have done amazing work for their respective communities. It’s testament to their hard work and energy that they’ve built up such a loyal following in a short space of time.

Michael MacLeod from Guardian Edinburgh at our Glasgow event.

December 23 2010

12:50

Student scraping in Liverpool: football figures and flying police

A final Hacks & Hackers report to end 2010! Happy Christmas from everyone at ScraperWiki!

Last month ScraperWiki put on its first ever student event, at Liverpool John Moores University in partnership with Open Labs, for students from both LJMU’s School of Journalism and the School of Computing & Mathematical Sciences, as well as external participants. This fabulous video comes courtesy of the Hatch. Alison Gow, digital executive editor at the Liverpool Daily Post and the Liverpool Echo, has kindly supplied us with the words (below the video).

http://vimeo.com/18030897

Report: Hacks and Hackers Hack Day – student edition

By Alison Gow

At the annual conference of the Society of Editors, held in Glasgow in November, there was some debate about journalist training and whether journalism students currently learning their craft on college courses were a) of sufficient quality and b) likely to find work.

Plenty of opinions were presented as facts and there seemed to be no recognition that today’s students might not actually want to work for mainstream media once they graduated – with their varied (and relevant) skill sets they may have very different (and far more entrepreneurial) career plans in mind.

Anyway, that was last month. Scroll forward to December 8 and a rather more optimistic picture of the future emerges. I got to spend the day with a group of Liverpool John Moores University student journalists, programmers and lecturers, local innovators and programming experts, and it seemed to me that the students were going to do just fine in whatever field they eventually chose.

This was Hacks Meet Hackers (Students) – the first event that ScraperWiki (Liverpool’s own scraping and data-mining phenomenon that has done so much to facilitate collaborative learning projects between journalists and coders) had held for students. I was one of four Trinity Mirror journalists lucky enough to be asked along too.

Brought into being through assistance from the excellent LJMU Open Labs team, backed by LJMU journalism lecturer Steve Harrison, #hhhlivS as it was hashtagged was a real eye-opener. It wasn’t the largest group to attend a ScraperWiki hackday I suspect, but I’m willing to bet it was one of the most productive; relevant, viable projects were crafted over the course of the day and I’d be surprised if they didn’t find their way onto the LJMU Journalism news website in the near future.

The projects brought to the presentation room at the end of the day were:

  • The Class Divide: Investigating the educational background of Britain’s MPs
  • Are Police Helicopters Effective in Merseyside?
  • Football League Attendances 1980-2010
  • Sick of School: The link between ill health and unpopular schools

The prize for Idea With The Most Potential went to the Police Helicopters project. This group used a sample page from the Merseyside Police helicopter movements report, which showed time of flight, geography, outcome and duration. They also determined that, of the 33% of crimes that were solved, only 0.03% involved the helicopter. Using the data scraped for helicopter flights, and comparing it to crimes and policing costs data, the group extrapolated that it cost £1,675 per hour to fly the helicopter (amounting to more than £100,000 a month), and by comparing that to average officer salaries projected this could fund the recruitment of 30 extra police officers. The team also suggested potential spin-off ideas around the data.
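The extrapolation the team describes can be sketched in a few lines. Only the £1,675 hourly rate comes from their presentation; the monthly flying hours and average officer salary below are illustrative assumptions, not figures from the team.

```python
# Hedged sketch of the helicopter cost extrapolation. Only the hourly
# rate comes from the presentation; the other figures are assumptions.
HOURLY_COST = 1675        # £ per flight hour (from the presentation)
HOURS_PER_MONTH = 60      # assumed monthly flying time
OFFICER_SALARY = 40000    # assumed average annual salary, £

monthly_cost = HOURLY_COST * HOURS_PER_MONTH
annual_cost = monthly_cost * 12
officers_funded = annual_cost // OFFICER_SALARY

print(monthly_cost)      # 100500 -- "more than £100,000 a month"
print(officers_funded)   # 30
```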

The Best Use of Data prize went to the Football League Figures team – an all-male bunch of journos and student journos aided by hacker Paul Freeman – who scraped data for every Football League club and brought it together into a database that could be used to show attendance trends. These included the dramatic drop in Liverpool FC attendances during the Thatcher years and the rises that coincided with exciting new signings, plunging attendances for Manchester City and subsequent spikes during takeovers, and the effects of promotion to and relegation from the Premier League. The team suggested such data could be used for any number of stories, and would prove compelling information for statistics-hungry fans.

The Most Topical prize went to the Class Divide group – LJMU students who worked with ScraperWiki’s Julian Todd to scrape data from the Telegraph’s politics web section and investigate the educational backgrounds of MPs. The group set out to discover whether parliament consisted mainly of privately educated members. They said the data showed that most Lib Dem MPs were state educated, and that there was no great skew between state and privately educated MPs overall, contrary to what might have been expected. They added that the data they had uncovered would prove particularly interesting once MPs voted on university tuition fees.

The Best Presentation prize and the Overall Winner of the hackday went to Sick of Schools by Scraping The Barrel – a team of TM journos and students, hacker Brett and student nurse Claire Sutton – who used Office for National Statistics data, Census figures and council information, and scraped data from school prospectuses and wards, to investigate illness data and low demand for school places in Sefton borough. By overlaying health data with demand for school places they were able to highlight various outcomes which they believed would be valuable for a range of readers, from parents seeking school places to potential house buyers.

Paul Freeman, described in one tweet as “the Johan Cruyff of football data scraping”, was presented with a ScraperWiki mug as the Hacker of the Day, for his sterling work on the Football League data.

Judges Andy Goodwin, of Open Labs, and Chris Frost, head of the Journalism department, praised everyone for their efforts, and Aine McGuire, of ScraperWiki, highlighted the great quality of the ideas and subsequent projects. It was a long day but it passed incredibly quickly – I was really impressed not only by the ideas that came out but by the collaborative efforts between the students on their projects.

From my experience of the first Hacks Meet Hackers Day (held, again with support from Open Labs, in Liverpool last summer) there was quite a competitive atmosphere not just between the teams but even within teams as members – usually the journalists – pitched their ideas as the ones to run with. Yesterday was markedly less so, with each group working first to determine whether the data supported their ideas, and adapting those projects depending on what the information produced, rather than having a complete end in sight before they started. Maybe that’s why the projects that emerged were so good.

The Liverpool digital community is full of extraordinary people doing important, innovative work (and who don’t always get the credit they deserve). I first bumped into Julian and Aidan as they prepared to give a talk at a Liver and Mash libraries event earlier this year – I’d never heard of ScraperWiki and I was bowled over by the possibilities they talked about (once I got my brain around how it worked). Since then the team has done so much to promote the cause of open data and data journalism, the opportunities it can create, and the worth and value it can have for audiences; ScraperWiki hackdays are attended by journalists from all media across the UK, eager to learn more about data-scraping and collaborative projects with hackers.

With the Hacks Meet Hackers Students day, these ideas are being brought into the classroom, and the outcome can only benefit the colleges, students and journalism in the future. It was a great day, and the prospects for the future are exciting.

Watch this space for more ScraperWiki events in 2011!

December 07 2010

10:06

Hacks & Hackers Belfast: ‘You don’t realize how similar coding and reporting are until you watch a hack and a technologist work together to create something’

In November, Scraperwiki went to Belfast and participant Lyra McKee, CEO, NewsRupt (creators of the news app Qluso) has kindly supplied us with this account!

The concept behind Hacks and Hackers, a global phenomenon, is simple: bring a bunch of hacks (journalists) and hackers (coders) together to build something really cool that other journalists and industry people can use. We were in nerd heaven.

The day kicked off with a talk from the lovely Francis Irving (@frabcus), Scraperwiki’s CEO. Francis talked about Scraperwiki’s main use – scraping data, stats and facts from large datasets – and the company’s background, from being built by coder Julian Todd to getting funded by 4iP.

After that, the gathered geeks split off into groups, all with the same goal: scrape data and find an explosive, exclusive story. First, second and third prizes would be awarded at the end of the day.

You don’t realize how similar coding and reporting are until you watch a hack and a technologist work together to create something. Both vocations have the same core purpose: creating something useful that others can use (or in the hack’s case, unearthing information that is useful to the public).

The headlines that emerged out of the day were amazing. ‘Mr No Vote’ won first prize. When citizen hacks Ivor Whitten, Matt Johnston and coder Robert Moore of e-learning company Learning Pool used Scraperwiki to scrape electoral data from local government websites, they found that over 60% of voters in every constituency in Northern Ireland (save one) abstained from voting in the last election, raising questions about just how democratically MPs and MLAs have been elected.
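The arithmetic behind the story is straightforward once electorate and turnout figures have been scraped for each constituency. This sketch uses invented numbers, not the actual electoral data the team scraped.

```python
# Sketch of the abstention calculation behind 'Mr No Vote'.
# The electorate and turnout figures below are invented.
def abstention_rate(electorate, votes_cast):
    """Percentage of the electorate who did not vote."""
    return 100.0 * (electorate - votes_cast) / electorate

print(abstention_rate(60000, 22800))  # 62.0 -- over 60% abstained
```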

What was really significant about the story was that the guys were able to uncover it within a matter of hours. One member of Team Qluso, an ex-investigative journalist, was astounded, calling Scraperwiki a “gamechanger” for the industry. It was an almost historic event, seeing technology transform a small but significant part of the industry: the process of finding and analyzing data (a process that, according to said gobsmacked Team Qluso member, used to take days, weeks, even months).

If you get a chance to chat with the Scraperwiki team, take it with both hands: these guys are building some cracking tools for hacks’n’hackers alike.

November 15 2010

13:02

Lichfield Hacks and Hackers: PFIs, plotting future care needs, what’s on in Lichfield and mapping flood warnings

The winners with judges Lizzie and Rita. Pic: Nick Brickett

By Philip John, Journal Local. This has been cross-posted on the Journal Local blog.

It may be a tiny city but Lichfield has shown that it has some great talent at the Hacks and Hackers Hack Day.

Sponsored by Lichfield District Council and Lichfield-based Journal Local, the day was held at the George Hotel and attended by a good selection of local developers and journalists – some coming from much further afield.

Once the introductions were done and we’d all contributed a few ideas the work got started and five teams quickly formed around those initial thoughts.

The first two teams decided to look into Private Finance Initiatives (PFIs) and Information Asset Registers (IARs). The first of these scraped information from 470 councils to show which of these published information about PFIs. The results showed that only 10% of councils actually put out any details of PFIs, highlighting a lack of openness in that area.

Also focused on PFIs was the ‘PFI wiki’ project which scraped the Partnerships UK database of PFIs and re-purposed it to allow deeper interrogation, such as by region and companies. It clearly paves the way for an OpenCharities style site for PFIs.

Future care needs was the focus of the third team who mapped care homes along with information on ownership, public vs private status and location. The next step, they said, is to add the number of beds and match that to the needs of the population based on demographic data, giving a clearer view of whether the facilities exist to cater for the future care needs in the area.

A Lichfield-related project was the focus of the fourth group who aimed to create a comprehensive guide to events going on in Lichfield District. Using about four or five scrapers, they produced a site that collated all the events listing sites serving Lichfield into one central site with a search facility. The group also spawned a new Hacks/Hackers group to continue their work.

Last but not least, the fifth group worked on flood warning information. By scraping the Environment Agency website they were able to display, on a map, the river level gauges and the flood warning levels, so that at a glance it’s possible to see the water level in relation to the flood warning limit.
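The comparison at the heart of that project can be sketched in a few lines, assuming the gauge readings have already been scraped into simple tuples. The station names, levels and thresholds below are invented, and the Environment Agency page layout is not reproduced here.

```python
# Flag gauges whose current level meets or exceeds the warning level.
# All readings below are illustrative, not scraped data.
def at_risk(gauges):
    return [name for name, level, warning in gauges if level >= warning]

readings = [
    ("Trent Bridge", 2.1, 3.0),
    ("Lichfield Brook", 1.8, 1.5),
]
print(at_risk(readings))  # ['Lichfield Brook']
```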

So after a long day Lizzie Thatcher and Rita Wilson from Lichfield District Council joined us to judge the projects. They came up with a clever matrix of key points by which to rate the projects, and chose the ‘what’s on’ and ‘flood warning’ projects as joint winners, each receiving £75 in Amazon vouchers.

The coveted ScraperWiki mug also went to the ‘what’s on’ project for their proper use of ScraperWiki to create good quality scrapers.

Pictures from the event by Nick Brickett:

 

http://www.flickr.com/apps/slideshow/show.swf?v=71649

October 19 2010

09:38

Scraperwiki/RBI launch first in-house Hacks & Hackers event – for B2Bs

Tickets are now available for a Scraperwiki hack day at Reed Business Information (RBI) on Monday 29th November in Quadrant House, Surrey, from 8am (registration) – 8.30pm.

B2B journalists, developers and designers are invited to attend the one-day ‘Hacks and Hackers’ event hosted and sponsored by RBI, B2B publisher of titles including FlightGlobal, Farmers Weekly and New Scientist.

The idea is that business journalists and bloggers (‘Hacks’) pair up with computer programmers and designers (‘Hackers’) to produce innovative data projects in the space of one day. Food and drink will be provided throughout the event. Prizes for the best projects will be awarded in the evening.

Any journalist from a B2B background, or developer/designer with an interest in business journalism is welcome to attend. We’re especially keen to welcome people who are interested in producing data visualisations.

“Data journalism is an important area of development for our editorial teams in RBI,” said Karl Schneider, RBI editorial development director:

“It’s a hot topic for all journalists, but it’s particularly relevant in the B2B sector. B2B journalism is focused on delivering information that its audience can act on, supporting important business decisions.

“Often a well-thought-out visualisation of data can be the most effective way of delivering critical information and helping users to understand key trends.

“We’re already having some successes with this kind of journalism, and we think we can do a lot more. So building up the skills of our editorial teams in this area is very important.”

The event is the first in-house hack day that Scraperwiki has organised as part of its UK and Ireland Hacks & Hackers tour.

50 places are available in total: half for RBI staff; half for external attendees. People wishing to attend should select the relevant ticket at this link.

Past hacks and hackers days have run in London, Liverpool, Birmingham and Manchester. For a flavour of the projects please see this blog post.

If you have any questions please contact Aine McGuire via Aine [at]scraperwiki.com.

October 16 2010

13:39

ScraperWiki: Hacks and Hackers day, Manchester.

If you’re not familiar with ScraperWiki, it’s “all the tools you need for Screen Scraping, Data Mining & visualisation”.

These guys are working really hard at convincing journos that data is their friend by staging a steady stream of events bringing journos and programmers together to see what happens.

So I landed at NWVM’s offices to what seemed like a mountain of laptops, fried food, coke and biscuits to be one of the judges of their latest hacks and hackers day in Manchester (#hhhmcr). I was expecting some interesting stuff. I wasn’t disappointed.

The winners

We had to pick three prizes from the six or so projects started that day, and here’s what we (Tom Dobson, Julian Tait and me) ended up with.

The three winners, in reverse order:

Quarternote: A website that would ‘scrape’ myspace for band information. The idea was that you could put a location and style of music in to the system and it would compile a line-up of bands.

A great idea (although more hacker than hack) and if I was a dragon I would consider investing. These guys also won the Scraperwiki ‘cup’ award for actually being brave enough to have a go at scraping data from Myspace. Apparently myspace content has less structure than custard! The collective gasps from the geeks in the room when they said that was what they wanted to do underlined that.

Second was Preston’s summer of spend. Local councils are supposed to make details of any invoice over 500 pounds available, and many have. But many don’t make the data very usable. Preston City Council is no exception. PDFs!

With a little help from Scraperwiki the data was scraped, tidied, put in a spreadsheet and then organised. It threw up some fun stuff – 1,000 pounds to The Bikini Beach Band! And some really interesting areas for exploration – like a single payment of over 80,000 to one person (why?) – and I’m sure we’ll see more from this as the data gets a good running through. A really good example of how a journo and a hacker can work together.
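The tidy-and-organise step might look something like this, assuming the supplier/amount pairs have already been extracted from the PDFs. The payees and amounts below are invented, not Preston’s actual data.

```python
import csv
import io

# Sort scraped invoice rows by amount so the biggest payments surface
# first, then write them out as CSV. All rows are illustrative.
rows = [
    ("The Bikini Beach Band", 1000.0),
    ("A Large Payee", 80500.0),
    ("Stationery Ltd", 650.0),
]
rows.sort(key=lambda r: r[1], reverse=True)

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["supplier", "amount_gbp"])
writer.writerows(rows)
print(buf.getvalue())
```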

The winner was one of a number of projects that took the tweets from the GMP 24hr tweet experiment – what one group titled ‘Genetically modified police’ tweeting :). Enrico Zini and Yuwei Lin built a searchable database of the GMP24 tweets (and a great write-up of the process) which allowed searching by location, keyword, all kinds of things. It was a great use of the data and the working prototype was impressive given the time they had.
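A searchable tweet store of that kind can be sketched with an in-memory SQLite table. The tweets below are invented examples, not real GMP24 messages, and the two-column schema is an assumption.

```python
import sqlite3

# Load tweets into SQLite and search them by keyword.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tweets (time TEXT, text TEXT)")
db.executemany("INSERT INTO tweets VALUES (?, ?)", [
    ("10:02", "Call 843: burglary reported in Didsbury"),
    ("10:05", "Call 844: vehicle theft in Salford"),
])

def search(keyword):
    cur = db.execute("SELECT time, text FROM tweets WHERE text LIKE ?",
                     ("%" + keyword + "%",))
    return cur.fetchall()

print(search("Salford"))  # [('10:05', 'Call 844: vehicle theft in Salford')]
```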

Credit should also go to Michael Brunton-Spall of the Guardian for turning the tweets into a usable dataset, which saved a lot of work for those groups using them as the raw data for their projects.

Other projects included mapping deprivation in Manchester and a legal website that, if it comes off, will really be one to watch. All brilliant stuff.

Hacks and hackers we need you

Given the increasing amount of raw data that organisations are pumping out, journalists will find themselves vital in making sure those organisations stay accountable. But as I said in an earlier post, good journalists don’t need to know how to do everything; they just need to know who to ask.

The day proved to me, and I think to lots of people there, that asking a hacker to help sort data out is really worth it.

I’m sure there will be more blogs etc about the day appearing over the next few days.

Thanks to everyone concerned for asking me along.

October 14 2010

16:05

Hacks and Hackers hack day Manchester

Any sufficiently complicated regular expression is indistinguishable from magic

A bit of a nod to Arthur C. Clarke there, but something that hits home every time I do any hacking around under the bonnet of the interwebs.

When it comes to this data journalism malarkey some might say (to steal another movie quote) a man’s got to know his limitations. But I firmly believe a good journalist, when stuck, knows who to ask. I’m very excited that more and more journos are realising that there are no end of tools and motivated people who can be part of the storytelling process.

So I was delighted to be asked to be one of the judges for ScraperWiki’s hacks and hackers hack day in Manchester tomorrow and see that in action.

The event is just one of a number of similar days around the UK. The successes in Birmingham and Liverpool, amongst others, mean that tomorrow should be fun.

If you’re going, see you there (later on). If not, I’ll tweet etc. as I can.

October 06 2010

14:51

Event: Hacks and Hackers Hack Day Lichfield (#hhhlich)

We have another event to announce, as part of Scraperwiki’s UK & Ireland tour. We’re going to Lichfield, Staffordshire! In partnership with Lichfield District Council, we’re holding a hacks and hackers hack day at Venture House on Monday 11th November.

“Lichfield District Council have been publishing open data for a while now, and it seems a good fit to put on a day where we can showcase the data we have published, as well as encourage people to do something with it,” said council webmaster Stuart Harrison.

“We’re not precious though, and if something is built using other public data, we’ll be just as happy!”

The details:

What? Scraperwiki, the award-winning new screen scraping and data mining tool funded by 4iP, and Lichfield District Council are putting on a one-day practical hack day* in Lichfield, Staffordshire, at which web developers and designers (hackers) will pair up with journalists and bloggers or anyone with an interest in media and communications (hacks) to produce a number of projects and stories based on public data. It’s all part of the ScraperWiki UK & Ireland Hacks and Hackers tour.

Who’s it for? We hope to attract ‘hacks’ and ‘hackers’ from all different types of backgrounds – across programming, media and communications.

What will I get out of it?
The aim is to show journalists how to use programming and design techniques to create online news stories and features; and vice versa, to show programmers how to find, develop, and polish stories and features. To see what happened at our past events in Liverpool and Birmingham visit the ScraperWiki blog.

How much? NOTHING! It’s free, thanks to our sponsors

What should I bring? We would encourage people to come along with ideas for local ‘datasets’ that are of interest. In addition we will create a list of suggested data sets at the introduction on the morning of the event but flexibility is key for this event. If you have a laptop, please bring this too.

So what exactly will happen on the day? Armed with their laptops and WIFI, journalists and developers will be put into teams of around four to develop their ideas, with the aim of finishing final projects that can be published and shared publicly. Each team will then present their project to the whole group. Overall winners will receive a prize at the end of the day.

*Not sure what a hack day is? Let’s go with the Wikipedia definition: it’s “an event where developers, designers and people with ideas gather to build ‘cool stuff’”…

July 21 2010

10:08

I’m a journalist – should I learn programming?

Many reporters are starting to move on from the world of HTML or CSS coding and getting to grips with more technical programming knowledge.

But web development isn’t for everyone, so how do you know if it will be right for you? Using some trusty know-how and specially selected questions, digital journalist Mark Luckie has tried to help reporters answer that very question.

His flowchart, shown below, is hosted on his 10,000 words blog.


July 08 2010

08:00

#Tip of the day from Journalism.co.uk – jargon-busting for new media

New media: The excellent Hacks/Hackers has compiled a glossary of new media/web/social media and coding terms. Really helpful as a jargon-busting guide to online journalism and publishing. Tipster: Laura Oliver. To submit a tip to Journalism.co.uk, use this link - we will pay a fiver for the best ones published.


July 02 2010

13:42

Announcing the Birmingham Hacks & Hackers day

If you are a journalist, blogger or developer interested in the possibilities of public data I’d be very happy if you came to a Hack Day I’m involved in, here in Birmingham on Friday July 23.

The idea is very simple: we get a bunch of public data, and either find stories in it, or ways to help others find stories.

You don’t need technical expertise because that’s why the hackers are there; and you don’t need journalistic expertise because that’s why the hacks are there.

What I’m particularly excited about in Birmingham is that we’ve got a real mix of people coming – from press and broadcast, and local bloggers, and hopefully a mix of people with backgrounds in various programming languages and even gaming.

And apart from all that there should be free beer and pizza. Which is the important thing.

So come.

The day is being organised by Scraperwiki and we’ve already got a whole bunch of interesting people signed up.

You can register for the day here.

May 04 2010

08:00

#Tip of the day from Journalism.co.uk – advice for hacks and hackers

A new Hacks and Hackers group in the US, made up of both developers and journalists, is offering some basic programming advice and tips. Check out its place to ask questions about journalism and technology at this link... Tipster: Judith Townend. To submit a tip to Journalism.co.uk, use this link - we will pay a fiver for the best ones published.


February 05 2010

09:05

ScraperWiki blog: Hacks and Hackers hack day report

As we reported earlier this week, journalists and programmers got together last Friday in London to produce some fantastically inspirational projects.

ScraperWiki (behind a new data tool soon to launch in beta) has now published its report of the day, explaining each of the projects. With a little more work, these projects could make excellent news stories.

Here are two of the ideas, for starters:

Conservative Safe Seats (the project that won overall; see video for presentation)

Developer Edmund van der Burg, freelance journalist Anne Marie Cumiskey, Charlie Duff from HRzone.co.uk, Ian McDonald of the BBC and Dafydd Vaughn munged a whole host of datasets together to produce an analysis of the new Conservative candidates in the 12 safest Tory seats in Britain. Their conclusions: British white and male, average age 53, Oxford-educated, rarely on Facebook or Twitter.

Who Pays Who (Enterprise Ireland)

Gavin Sheridan from TheStory.ie and Duncan Parkes of mySociety used ScraperWiki to combine a list of grants made by Enterprise Ireland (which Gavin had acquired via an FOI request) with the profile data listed on the Enterprise Ireland website. This will no doubt be a source for stories in the near future.

Full post at this link…


November 24 2009

13:56

Journalism, Technology Starting to Add Up

Back in early 2008, as I headed off to a conference at Georgia Tech, I wrote a post for Idealab headlined "Computation + Technology = ?"

Two recent developments suggest that we're starting to find answers to that question -- and more importantly, that there's a growing number of people trying to find these answers. Duke University has released an interesting report, and a group of journalists and technologists has begun meeting in Silicon Valley to address challenges that journalists and technologists might tackle together.

The February 2008 conference at Georgia Tech, entitled "Journalism 3G: The Future of Technology in the Field," introduced many of its 200+ attendees to the idea of computational journalism -- applying computer programming to the challenges facing journalism, journalists and a society that needs original reporting to provide information for citizens in a democracy. Two of the other attendees were the first Knight News Challenge "programmer-journalist" scholarship winners: computer programmers enrolled in the master's program at the Medill School of Journalism at Northwestern University.

When the John S. and James L. Knight Foundation awarded the scholarship grant to Medill in 2007, the idea of teaching journalism to technology professionals seemed odd to many people -- both journalists and technologists. But now there seem to be a lot of initiatives aimed at addressing the same set of issues.


Duke University, through its DeWitt Wallace Center for Media and Democracy, built on the ideas generated by the Georgia Tech conference in a couple of ways. First, the center created -- and has now filled -- a faculty position specializing in the field. The new Knight professor of the practice of journalism and public policy is an old friend, Sarah Cohen, previously database editor for The Washington Post, where she contributed to countless enterprise reporting projects, including a Pulitzer-winning investigation of child welfare agencies in the District of Columbia. Besides teaching courses, Cohen is expected to lead the development of open-source reporting tools designed to make it easier for journalists to discover and research stories.

Earlier this month, Duke released "Accountability Through Algorithm: Developing the Field of Computational Journalism," a report based on a workshop held in July. The report is full of interesting ideas for applying technology to journalists' challenges. Here are a few of them.

Information Extraction, Integration and Visualization

A new set of tools would help reporters find patterns in otherwise unstructured or unsearchable information. For instance, the Obama administration posted letters from dozens of interest groups providing advice on issues, but the letters were not searchable. A text-extraction tool would allow reporters to feed PDF documents into a Web service and return a version that could be indexed and searched. The software might also make it easy to tag documents with metadata such as people's names, places and dates. Another idea is to improve automatic transcription software for audio and video files, often available (but not transcribed) for government meetings and many court hearings.
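The metadata-tagging idea can be illustrated with a simple pattern match over extracted text. The regex and sample sentence below are illustrative assumptions, not part of the Duke report.

```python
import re

# Pull date strings out of extracted letter text so documents can be
# tagged and indexed. Sample text and pattern are illustrative.
MONTHS = ("January|February|March|April|May|June|July|August|"
          "September|October|November|December")
DATE_RE = re.compile(r"\b\d{1,2} (?:%s) \d{4}\b" % MONTHS)

text = "Letter from the Example Coalition, dated 14 March 2009."
print(DATE_RE.findall(text))  # ['14 March 2009']
```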

The report also suggests developing "lightweight" templates that enable journalists to create data visualizations based on XML or spreadsheet files, and tools that help them organize their findings in a timeline. As the report points out, reporters working on in-depth projects often create chronologies in lengthy spreadsheets or text documents. A better tool would let journalists "zoom in, tag events for publication, turn on and off players or events and otherwise use them effectively," the report says.

The Journalist's Dashboard

Here the Duke report suggests that journalists need "a tool with which to spot what's new and what's important in the flow of daily information." A dashboard could include:

  • A news alert system similar to Google News that scanned only the sources specified by a beat reporter, identifying the originating publisher and the number of other sites that linked to the item;
  • A tool helping journalists keep track of their sources, including news items about that person and citations from the reporter's own archived stories mentioning him or her;
  • A "trends and outliers" tool that might generate an alert any time a data source reveals a significant change in a piece of data -- say, a surge in monthly expenditures by a government agency, or a flurry of crime reports in a short period of time;
  • A timeline generator that would display incidents related to a particular story as well as coverage on blogs and news sites;
  • An annotator that would allow a reporter to see past stories, images and contextual information while writing -- for instance, by displaying background information about the person being written about. (This idea bears some similarity to the EasyWriter tool developed this spring by students in a Northwestern University journalism/technology class.)
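The "trends and outliers" item is essentially anomaly detection. A minimal sketch of how such an alert could work, using invented expenditure figures and a simple standard-deviation rule (real newsroom data would call for something more robust):

```python
from statistics import mean, stdev

def find_outliers(monthly, threshold=1.5):
    """Flag months whose value sits more than `threshold` standard
    deviations above the mean of the series."""
    values = list(monthly.values())
    mu, sigma = mean(values), stdev(values)
    return [m for m, v in monthly.items() if v > mu + threshold * sigma]

# Hypothetical agency expenditures, in thousands of dollars
spending = {"Jan": 100, "Feb": 102, "Mar": 98, "Apr": 101,
            "May": 99, "Jun": 250}
print(find_outliers(spending))  # ['Jun']
```

In practice a dashboard would run a check like this against each incoming data feed and push an alert to the beat reporter whenever a month is flagged.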

Reader-Reporter Interaction

Philip Bennett, formerly managing editor of the Washington Post and now a professor at Duke, is quoted in the report describing a new approach to investigative projects that engages and taps into reader interest. Instead of seeing long-term investigative projects ending with publication of a package of stories, the initial investigation could serve as just the midpoint in the reporting process. Stories could be presented in ways that enabled each reader to explore the story in layers, giving each a "differentiated news experience depending on her interests." Bennett suggests that a series like the Post's Pulitzer-winning investigation of Walter Reed Army Medical Center could have become a focal point for readers interested in veterans' issues. "If the paper could nurture a community of interest around the story, readers might use the site as a discussion place for the action that follows from the investigation," the report says.

Applying 'Sensemaking' Approaches From Other Fields

The Duke report points out that academic researchers are wrestling with many of the same challenges that journalists face and suggests that their solutions could be helpful. For instance, Georgia Tech researchers have built a tool called Jigsaw that creates visualizations to display connections between individuals and entities mentioned in different documents -- something every investigative reporter would lust for. And the Muninn Project, an interdisciplinary research project focusing on World War I records, is seeking to convert images of handwritten forms into machine-readable databases -- a problem faced by journalists in many states that allow political candidates to file handwritten campaign contribution reports.



Another new development worth taking note of: a new "Hacks and Hackers" Meetup group formed in Silicon Valley by Burt Herman, a reporter for the Associated Press who recently completed a Knight fellowship at Stanford University. The group -- billed as being "for hackers exploring technologies to filter and visualize information, and for journalists who use technology to find and tell stories" -- held its first meeting Nov. 19.

The first gathering attracted about 30 people, including people from Google and Google News, Yahoo, sfgate, SF Chronicle, Current TV, PARC (Palo Alto Research Center), and Topix.com, Herman reported. "It felt like the seeds of a movement, and the many lively conversations showed that everyone was able to find common ground," he wrote in an email to me.

Herman said his Knight fellowship -- during which he focused on innovation and entrepreneurship -- taught him that innovation requires bringing people from different disciplines together.

"I started the Hacks and Hackers meetup group to open a broader dialogue between technologists and journalists, so we can move past the endless hand-wringing about the future of news and get down to work building it," Herman said. "Technology and media come together here in Silicon Valley like nowhere else in the world, and there was no group yet focused on this. I'm hoping it will lead to better understanding and perhaps even spawn new ventures."



As some readers of this blog will remember, "Hacks and Hackers" is also the name that Aron Pilhofer and I came up with to describe a new organization and Web site for people working at the intersection of technology and journalism. At the Future of News and Civic Media Conference in June, Aron and I won a $2,000 prize to create an online community for people with these interests.

The Web community idea is still in the early stages of development, but Aron and I would welcome your ideas about how best to make it work. The original concept was to create a place where members can seek help solving problems and provide assistance to their peers by, for instance, sharing a tutorial for a project using Django or Ruby on Rails or Drupal. We know there are people -- in journalism and technology, in industry and academia, scattered through organizations such as the Online News Association, Investigative Reporters and Editors and the Society for News Design -- who can use each other's help and support. We like the idea of having some kind of reputation management system -- say, like Stack Overflow -- that would reward members based on the quality and quantity of their contributions to the community.
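A reputation system along those lines can start very simply: a weighted tally of contribution events. The event names and point values below are invented for illustration, not Stack Overflow's actual scoring:

```python
# A toy reputation tally in the spirit of Stack Overflow:
# different kinds of contribution earn different point values.
POINTS = {"tutorial_posted": 15, "answer_upvoted": 10,
          "question_upvoted": 5, "answer_accepted": 25}

def reputation(events):
    """Sum the point values for a member's contribution events;
    unrecognized events earn nothing."""
    return sum(POINTS.get(e, 0) for e in events)

member = ["tutorial_posted", "answer_upvoted", "answer_accepted"]
print(reputation(member))  # 50
```

The interesting design questions come later -- decay over time, penalties, moderation privileges unlocked at thresholds -- but a flat event-scoring table is enough to reward quality and quantity of contributions from day one.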

If you have ideas for the Hacks and Hackers site, please post them in the comments below or email me at richgor - at - northwestern.edu.
