
August 17 2012

16:07

Metrics, metrics everywhere: How do we measure the impact of journalism?

If democracy would be poorer without journalism, then journalism must have some effect. Can we measure those effects in some way? While most news organizations already watch the numbers that translate into money (such as audience size and pageviews), the profession is just beginning to consider metrics for the real value of its work.

That’s why the recent announcement of a Knight-Mozilla Fellowship at The New York Times on “finding the right metric for news” is an exciting moment. A major newsroom is publicly asking the question: How do we measure the impact of our work? Not the economic value, but the democratic value. The Times’ Aron Pilhofer writes:

The metrics newsrooms have traditionally used tended to be fairly imprecise: Did a law change? Did the bad guy go to jail? Were dangers revealed? Were lives saved? Or least significant of all, did it win an award?

But the math changes in the digital environment. We are awash in metrics, and we have the ability to engage with readers at scale in ways that would have been impossible (or impossibly expensive) in an analog world.

The problem now is figuring out which data to pay attention to and which to ignore.

Evaluating the impact of journalism is a maddeningly difficult task. To begin with, there’s no single definition of what journalism is. It’s also very hard to track what happens to a story once it is released into the wild, and even harder to know for sure if any particular change was really caused by that story. It may not even be possible to find a quantifiable something to count, because each story might be its own special case. But it’s almost certainly possible to do better than nothing.

The idea of tracking the effects of journalism is old, beginning in discussions of the newly professionalized press in the early 20th century and flowering in the “agenda-setting” research of the 1970s. What is new is the possibility of cheap, widespread, data-driven analysis down to the level of the individual user and story, and the idea of using this data for managing a newsroom. The challenge, as Pilhofer put it so well, is figuring out which data, and how a newsroom could use that data in a meaningful way.

What are we trying to measure and why?

Metrics are powerful tools for insight and decision-making. But they are not ends in themselves because they will never exactly represent what is important. That’s why the first step in choosing metrics is to articulate what you want to measure, regardless of whether or not there’s an easy way to measure it. Choosing metrics poorly, or misunderstanding their limitations, can make things worse. Metrics are just proxies for our real goals — sometimes quite poor proxies.

An analytics product such as Chartbeat produces reams of data: pageviews, unique users, and more. News organizations reliant on advertising or user subscriptions must pay attention to these numbers because they’re tied to revenue — but it’s less clear how they might be relevant editorially.

Consider pageviews. That single number is a combination of many causes and effects: promotional success, headline clickability, viral spread, audience demand for the information, and finally, the number of people who might be slightly better informed after viewing a story. Each of these components might be used to make better editorial choices — such as increasing promotion of an important story, choosing what to report on next, or evaluating whether a story really changed anything. But it can be hard to disentangle the factors. The number of times a story is viewed is a complex, mixed signal.

It’s also possible to try to get at impact through “engagement” metrics, perhaps derived from social media data such as the number of times a story is shared. Josh Stearns has a good summary of recent reports on measuring engagement. But though it’s certainly related, engagement isn’t the same as impact. Again, the question comes down to: Why would we want to see this number increase? What would it say about the ultimate effects of your journalism on the world?

As a profession, journalism rarely considers its impact directly. There’s a good recent exception: a series of public media “impact summits” held in 2010, which identified five key needs for journalistic impact measurement. The last of these needs nails the problem with almost all existing analytics tools:

While many Summit attendees are using commercial tools and services to track reach, engagement and relevance, the usefulness of these tools in this arena is limited by their focus on delivering audiences to advertisers. Public interest media makers want to know how users are applying news and information in their personal and civic lives, not just whether they’re purchasing something as a result of exposure to a product.

Or as Ethan Zuckerman puts it in his own smart post on metrics and civic impact, “measuring how many people read a story is something any web administrator should be able to do. Audience doesn’t necessarily equal impact.” Not only that, but it might not always be the case that a larger audience is better. For some stories, getting them in front of particular people at particular times might be more important.

Measuring audience knowledge

Pre-Internet, there was usually no way to know what happened to a story after it was published, and the question seems to have been mostly ignored for a very long time. Asking about impact gets us to the idea that the journalistic task might not be complete until a story changes something in the thoughts or actions of the user.

If journalism is supposed to inform, then one simple impact metric would ask: Does the audience know the things that are in this story? This is an answerable question. A survey during the 2010 U.S. mid-term elections showed that a large fraction of voters were misinformed about basic issues, such as expert consensus on climate change or the predicted costs of the recently passed healthcare bill. Though coverage of the study focused on the fact that Fox News viewers scored worse than others, that missed the point: No news source came out particularly well.

In one of the most limited, narrow senses of what journalism is supposed to do — inform voters about key election issues — American journalism failed in 2010. Or perhaps it actually did better than in 2008 — without comparable metrics, we’ll never know.

While newsrooms typically see themselves in the business of story creation, an organization committed to informing, not just publishing, would have to operate somewhat differently. Having an audience means having the ability to direct attention, and an editor might choose to continue to direct attention to something important even if it’s “old news”; if someone doesn’t know it, it’s still new news to them. Journalists will also have to understand how and when people change their beliefs, because information doesn’t necessarily change minds.

I’m not arguing that every news organization should get into the business of monitoring the state of public knowledge. This is only one of many possible ways to define impact; it might only make sense for certain stories, and to do it routinely we’d need good and cheap substitutes for large public surveys. But I find it instructive to work through what would be required. The point is to define journalistic success based on what the user does, not the publisher.

Other fields have impact metrics too

Measuring impact is hard. The ultimate effects on belief and action will mostly be invisible to the newsroom, and so tangled in the web of society that it will be impossible to say for sure that it was journalism that caused any particular effect. But neither is the situation hopeless, because we really can learn things from the numbers we can get. Several other fields have been grappling with the tricky problems of diverse, indirect, not-necessarily-quantifiable impact for quite some time.

Academics wish to know the effect of their publications, just as journalists do, and the academic publishing field has long had metrics such as citation count and journal impact factor. But the Internet has upset the traditional scheme of things, leading to attempts to formulate wider-ranging, web-inclusive measures of impact such as Altmetrics or the article-level metrics of the Public Library of Science. Both combine a variety of data, including social media.

Social science researchers are interested not only in the academic influence of their work, but also in its effects on policy and practice. They face many of the same difficulties as journalists do in evaluating their work: unobservable effects, long timelines, complicated causality. Helpfully, lots of smart people have been working on the problem of understanding when social research changes social reality. Recent work includes the payback framework, which looks at benefits from every stage in the lifecycle of research, from intangibles such as increasing the human store of knowledge, to concrete changes in what users do after they’ve been informed.

NGOs and philanthropic organizations of all types also use effectiveness metrics, from soup kitchens to international aid. A research project at Stanford University is looking at the use and diversity of metrics in this sector. We are also seeing new types of ventures designed to produce both social change and financial return, such as social impact bonds. The payout on a social impact bond is contractually tied to an impact metric, sometimes measured as a “social return on investment.”

Data beyond numbers

Counting the countable because the countable can be easily counted renders impact illegitimate.

- John Brewer, “The impact of impact”

Numbers are helpful because they allow standard comparisons and comparative experiments. (Did writing that explainer increase the demand for the spot stories? Did investigating how the zoning issue is tied to developer profits spark a social media conversation?) Numbers can also be compared at different times, which gives us a way to tell if we’re doing better or worse than before, and by how much. Dividing impact by cost gives measures of efficiency, which can lead to better use of journalistic resources.

But not everything can be counted. Some events are just too rare to provide reliable comparisons — how many times last month did your newsroom get a corrupt official fired? Some effects are maddeningly hard to pin down, such as “increased awareness” or “political pressure.” And very often, attributing cause is hopeless. Did a company change its tune because of an informed and vocal public, or did an internal report influence key decision makers?

Fortunately, not all data is numbers. Do you think that story contributed to better legislation? Write a note explaining why! Did you get a flood of positive comments on a particular article? Save them! Not every effect needs to be expressed in numbers, and a variety of fields are coming to the conclusion that narrative descriptions are equally valuable. This is still data, but it’s qualitative (stories) instead of quantitative (numbers). It includes comments, reactions, repercussions, later developments on the story, unique events, related interviews, and many other things that are potentially significant but not easily categorizable. The important thing is to collect this information reliably and systematically, or you won’t be able to make comparisons in the future. (My fellow geeks may here be interested in the various flavors of qualitative data analysis.)

Qualitative data is particularly important when you’re not quite sure what you should be looking for. With the right kind, you can start to look for the patterns that might tell you what you should be counting.

Metrics for better journalism

Can the use of metrics make journalism better? If we can find metrics that show us when “better” happens, then yes, almost by definition. But in truth we know almost nothing about how to do this.

The first challenge may be a shift in thinking, as measuring the effect of journalism is a radical idea. The dominant professional ethos has often been uncomfortable with the idea of having any effect at all, fearing “advocacy” or “activism.” While it’s sometimes relevant to ask about the political choices in an act of journalism, the idea of complete neutrality is a blatant contradiction if journalism is important to democracy. Then there is the assumption, long invisible, that news organizations have done their job when a story is published. That stops far short of the user, and confuses output with effect.

The practical challenges are equally daunting. Some data, like web analytics, is easy to collect but doesn’t necessarily coincide with what a news organization ultimately values. And some things can’t really be counted. But they can still be considered. Ideally, a newsroom would have an integrated database connecting each story to both quantitative and qualitative indicators of impact: notes on what happened after the story was published, plus automatically collected analytics, comments, inbound links, social media discussion, and other reactions. With that sort of extensive data set, we stand a chance of figuring out not only what the journalism did, but how best to evaluate it in the future. But nothing so elaborate is necessary to get started. Every newsroom has some sort of content analytics, and qualitative effects can be tracked with nothing more than notes in a spreadsheet.
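To make that concrete: as a purely illustrative sketch (assuming a newsroom rolled its own store with SQLite; every table and column name below is invented rather than drawn from any existing tool), the integrated database could start as small as this:

    # Illustrative sketch only: a tiny SQLite store linking each story to
    # quantitative metrics and qualitative impact notes. All names are invented.
    import sqlite3

    conn = sqlite3.connect("impact.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS stories (
        story_id  INTEGER PRIMARY KEY,
        headline  TEXT,
        published DATE
    );
    CREATE TABLE IF NOT EXISTS metrics (      -- automatically collected numbers
        story_id INTEGER REFERENCES stories(story_id),
        name     TEXT,   -- e.g. 'pageviews', 'shares', 'inbound_links'
        value    REAL,
        recorded DATE
    );
    CREATE TABLE IF NOT EXISTS impact_notes ( -- qualitative observations
        story_id INTEGER REFERENCES stories(story_id),
        note     TEXT,   -- e.g. 'council reopened the zoning hearing'
        recorded DATE
    );
    """)
    conn.commit()

The only design choice that matters here is that the quantitative rows and the qualitative notes share a story ID, so they can be pulled together later.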

Most importantly, we need to keep asking: Why are we doing this? Sometimes, as I pass someone on the street, I ask myself if the work I am doing will ever have any effect on their life — and if so, what? It’s impossible to evaluate impact if you don’t know what you want to accomplish.

July 19 2011

09:50

New York Magazine: New(s) business - 21 New Media Innovators

New York Magazine :: While the dark days of journalism have receded a bit — it was only three years ago that layoffs were a weekly occurrence, and serious people discussed the closure of the New York Times — the business is still very much in a state of chaotic flux. The so-called war between new and old media rages on among the pundits, with Facebook supplanting Google News as the new bogeyman.

But if you look past the hype, a bumper crop of new jobs and new ways of reporting have taken root, created by people who are willing to throw themselves into the breach and experiment. What follows is a list of 21 journalists and like-minded inventors who have created something exciting, interesting, and just plain cool.

Do you think there are people missing? Tweet me your thoughts!

Continue to read Chris Rovzar | Noreen Malone | Dan Amira | Adam Pasick | and Nitasha Tiku, nymag.com

December 21 2010

17:00

At #Niemanleaks, a new generation of tools to manage floods of new data

Whether it’s 250,000 State Department cables or the massive spending databases on Recovery.gov, the trend in data has definitely become “more.” That presents journalists with a new problem: How do you understand and explain data when it comes by the gigabyte? At the Nieman Foundation’s one-day conference on secrecy and journalism, presenters from the New York Times, Sunlight Foundation, and more offered solutions — or at least new ways of thinking about the problems.

Think like a scientist

With the massive amounts of primary documents now available, journalists have new opportunities to bring their readers into their investigations — which can lead to better journalism. John Bohannon, a contributing correspondent for Science Magazine, said his background as a scientist was great preparation for investigative reporting. “The best kind of investigative journalism is like the best kind of science,” he said. “You as the investigator don’t ask your readers to take your claims at face value: You give them the evidence you’ve gathered along the way and ask them to look at it with you.”

It’s not a radical idea, but it’s one being embraced in new ways. For Bohannon, it meant embedding with a unit in Afghanistan and methodically gathering first-hand data about civilian deaths — a more direct and reliable indicator than the less expensive and safer method of counting media-reported deaths. He also found his scientific approach was met with more open answers from a military known for tight information control. “Sometimes if you politely ask for information, large powerful organizations will actually give it to you,” he said.

The future will be distributed: BitTorrent, not Napster

Two of the projects discussed, Basetrack and DocumentCloud, invite broader participation in the news process, the former in sourcing and the latter in distribution.

Basetrack, a Knight News Challenge winner, goes beyond the normal embedding process to more actively involve the Marines of First Battalion, Eighth Marine Regiment in reporting their experiences as they deploy overseas. Teru Kuwayama, who leads the project and deployed with the battalion to Afghanistan, said ensuring that confidential information wasn’t released, putting lives in danger, was essential to building trust and openness with the project. So Basetrack built a “Denial of Information” tool that allowed easy, pre-publication redactions, with the caveat that the fact of those redactions — and the reasons given for them — would be made public. It’s a compromise that promises greater intimacy and a collaborative look at life at war while ensuring the safety of the Marines.

Fellow News Challenge winner DocumentCloud, on the other hand, distributes the primary documents dug up through traditional investigative journalism, such as historical confidential informant files or flawed electoral ballot designs. Aron Pilhofer, editor of interactive news at The New York Times, said he was unsure about whether journalists would actually use it when his team began working on the project — but since then dozens of organizations have embraced it, happy to take readers along for the ride of the investigative process.

These new ways of distributing reporting were just the beginning, Pilhofer said, part of a trend that will likely push today’s marquee whistleblower out of the limelight. “WikiLeaks was very much a funnel going in and very much a funnel going out,” he said. “Distributed is the future.” A new project, called OpenLeaks, will embrace a less centralized model, building technology to allow anonymous leaks without a central organization that can be taken out.

Big data’s day is here

The panel also tackled how to digest truly massive data sets. Bill Allison, editorial director of the Sunlight Foundation, detailed how his organization collected information on everything from earmarks to political fundraising parties. Allison said making this data actually meaningful required context, which could be as simple as mapping already available data or scoring government databases based on understandable criteria.
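As a purely hypothetical illustration of that second idea, a scorecard can be nothing more than a weighted checklist; the criteria and weights below are invented for the example, not Sunlight’s:

    # Toy example: score a government data set against a few plain-language
    # criteria. The criteria and weights are invented, not Sunlight's.
    CRITERIA = {
        "machine_readable": 3,  # CSV or XML rather than scanned PDFs
        "updated_recently": 2,  # refreshed within the last quarter
        "record_level":     2,  # granular records, not just summary totals
        "documented":       1,  # a data dictionary exists
    }

    def score(dataset_flags):
        """dataset_flags maps each criterion name to True or False."""
        earned = sum(weight for name, weight in CRITERIA.items()
                     if dataset_flags.get(name))
        return round(100 * earned / sum(CRITERIA.values()))

    print(score({"machine_readable": True, "updated_recently": False,
                 "record_level": True, "documented": True}))  # prints 75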

“We try to make the information easy to use,” he said. But beyond the audience of curious constituents, a much broader audience is reached as hundreds of journalists around the country use Sunlight’s tools to dig up local stories they might not otherwise have noticed — creating a ripple effect of transparency.

16:00

Tablet-only, mobile-first: News orgs native to new platforms coming soon

Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.

Here are 10 predictions from Vadim Lavrusik, community manager and social strategist at Mashable. Mashable, where these predictions first appeared, covers the heck out of the world of social media and has an honored place in our iPhone app.

In many ways, 2010 was finally the year of mobile for news media, and especially so if you consider the iPad a mobile device. Many news organizations like The Washington Post and CNN included heavy social media integrations into their apps, opening the devices beyond news consumption.

In 2011, the focus on mobile will continue to grow with the launch of mobile- and iPad-only news products, but the greater focus for news media in 2011 will be on re-imagining its approach to the open social web. The focus will shift from searchable news to social and share-able news, as social media referrals close the gap on search traffic for more news organizations. In the coming year, news media’s focus will be affected by the personalization of news consumption and social media’s influence on journalism.

Leaks and journalism: a new kind of media entity

In 2010, we saw the rise of WikiLeaks through its many controversial leaks. With each leak, the organization learned and evolved its process for distributing sensitive classified information. In 2011, we’ll see several governments attempt to prosecute WikiLeaks founder Julian Assange for his role in disseminating classified documents, with varying degrees of success. But even if WikiLeaks itself gets shut down, we’re going to see the rise of “leakification” in journalism, and more importantly we’ll see a number of new media entities, not just mirror sites, that will model themselves to serve whistleblowers — WikiLeaks copycats of sorts. Toward the end of this year, we already saw Openleaks, Brusselsleaks, and Tradeleaks. There will be many more, some of which will be focused on niche topics.

Just like with other media entities, there will be a new competitive market and some will distinguish themselves and rise above the rest. So how will success be measured? The scale of the leak, the organization’s ability to distribute it and its ability or inability to partner with media organizations. Perhaps some will distinguish themselves by creating better distribution platforms through their own sites by focusing on the technology and, of course, the analysis of the leaks. The entities will still rely on partnerships with established media to distribute and analyze the information, but it may very well change the relationship whistleblowers have had with media organizations until now.

More media mergers and acquisitions

At the tail end of 2010, we saw the acquisition of TechCrunch by AOL and the Newsweek merger with The Daily Beast. In some ways, these moves have been a validation of the value of new media companies and blogs that have built an audience and a business.

But as some established news companies’ traditional sources of revenue continue to decline, while new media companies grow, 2011 may bring more media mergers and acquisitions. The question isn’t if, but who? I think that just like this year, most will be surprises.

Tablet-only and mobile-first news companies

In 2010, as news consumption began to shift to mobile devices, we saw news organizations take mobile seriously. Aside from launching mobile apps across various mobile platforms, perhaps the most notable example is News Corp’s plan to launch The Daily, an iPad-only news organization set to launch in early 2011. Each new edition will cost $0.99 to download, though Apple will take 30%. But that’s not the only hurdle, as the publication relies on an iPad-owning audience. An estimated 15.7 million tablets will have been sold worldwide in 2010, with the iPad representing roughly 85% of that, and that number is expected to more than double in 2011. Despite the business gamble, this positions news organizations like The Daily for growth with little competition, aside from news organizations that repurpose their web content. We’ve also seen the launch of an iPad-only magazine with Virgin’s Project and of course the soon-to-launch News.me social news iPad application from Betaworks.

But it’s not just an iPad-only approach, and some would argue that the iPad isn’t actually mobile; it’s leisurely (yes, Mark Zuckerberg). In 2011, we’ll see more news media startups take a mobile-first approach to launching their companies. This sets them up to be competitive by distributing on a completely new platform, where users are more comfortable with making purchases. We’re going to see more news companies that reverse the typical model of website first and mobile second.

Location-based news consumption

In 2010, we saw the growth of location-based services like Foursquare, Gowalla and SCVNGR. Even Facebook entered the location game by launching its Places product, and Google introduced HotPot, a recommendation engine for places, and began testing it in Portland. The reality is that only 4% of online adults use such services on the go. My guess is that as the information users get on the go from such services becomes more valuable, these location-based platforms will attract more users.

Part of the missing piece is being able to easily get geo-tagged news content and information based on your GPS location. In 2011, with a continued shift toward mobile news consumption, we’re going to see news organizations implement location-based news features into their mobile apps. And of course, if they do not, a startup will enter the market to create a solution to this problem, or the likes of Foursquare will begin to pull in geo-tagged content associated with locations as users check in.
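Under the hood, that feature is largely a matter of filtering geo-tagged stories by distance from the reader. A rough sketch, using the standard haversine formula and invented story coordinates:

    # Rough sketch: find geo-tagged stories within a radius of the reader's
    # GPS position. The story list and coordinates are invented for illustration.
    from math import radians, sin, cos, asin, sqrt

    def km_between(lat1, lon1, lat2, lon2):
        # Haversine distance between two points, in kilometres
        dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
        return 2 * 6371 * asin(sqrt(a))

    stories = [
        {"title": "Bridge repairs begin", "lat": 40.7580, "lon": -73.9855},
        {"title": "School budget vote",   "lat": 40.6892, "lon": -74.0445},
    ]

    def nearby(user_lat, user_lon, radius_km=5):
        return [s for s in stories
                if km_between(user_lat, user_lon, s["lat"], s["lon"]) <= radius_km]

    print(nearby(40.7484, -73.9857))  # stories within 5 km of this reader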

Social vs. search

In 2010, we saw social media usage continue to surge globally. Facebook alone gets 25% of all U.S. pageviews and roughly 10% of Internet visits. Instead of focusing on search engine optimization (SEO), in 2011 we’ll see social media optimization become a priority at many news organizations, as they continue to see social close the gap on referrals to their sites.

Ken Doctor, author of Newsonomics and news industry analyst at Outsell, recently pointed out that social networks have become the fastest growing source of traffic referrals for many news sites. For many, social sites like Facebook and Twitter only account for 10% to 15% of their overall referrals, but are number one in growth. For news startups, the mix is even more heavily weighted toward social. And of course, the quality of these referrals is often better than that of readers who come from search: they generally yield more pageviews and represent a more loyal reader than the one-off visitors who stumble across the site from Google.

The death of the “foreign correspondent”

What we’ve known as the role of the foreign correspondent will largely cease to exist in 2011. As a result of business pressures and the roles the citizenry now play in using digital technology to share and distribute news abroad, the role of a foreign correspondent reporting from an overseas bureau “may no longer be central to how we learn about the world,” according to a recent study by the Reuters Institute for the Study of Journalism. The bright spot in the gloomy assessment is that there is opportunity in other parts of the world, such as Asia and Africa, where media is expanding as a result of “economic and policy stability,” according to the report. In 2011, we’ll see more news organizations relying heavily on stringers and, in many cases, social content uploaded by the citizenry.

The syndication standard and the ultimate curators

Syndication models will be disrupted in 2011. As Clay Shirky recently predicted, more news outlets will get out of the business of re-running the same story on their site that appeared elsewhere. Though this is generally true, the approach to syndication will vary based on the outlet. The reality is that the content market has become highly fragmented, and if content is king, then niche is certainly queen. Niche outlets, which were once curators of original content produced by established organizations, will focus more on producing original content, while established news brands, still under pressure to produce a massive amount of content despite reduced staff numbers, will become the ultimate curators. This means they will feature just as much content, but increasingly through syndication partners.

You already see this taking place on sites like CNN.com or NYTimes.com, both of whose technology sections feature headlines and syndicated content from niche technology publications. In this case, it won’t only be reader demand for original content that drives niche publications to produce more of it, but also their relationships with established organizations that strive to uphold the quality of their content and the credibility of their brand. Though original content will be rewarded, specialized, niche publications could benefit the most from the disruption.

Social storytelling becomes reality

In 2010, we saw social content woven into storytelling, in some cases to tell the whole story and in other cases to contextualize news events with curation tools such as Storify. We also saw the rise of social news readers, such as the Flipboard and Pulse mobile apps, among others.

In 2011, we’ll not only see social curation as part of storytelling, but we’ll see social and technology companies getting involved in the content creation and curation business, helping to find the signal in the noise of information.

We’ve already heard that YouTube is in talks to buy a video production company, but it wouldn’t be a surprise for the likes of Twitter or Facebook to play a more pivotal role in harnessing their data to present relevant news and content to their users. What if Facebook had a news landing page of the trending news content that users are discussing? Or if Twitter filtered its content to bring you the most relevant and curated tweets around news events?

News organizations get smarter with social media

In 2010, news organizations began to take social media more seriously and we saw many news organizations hire editors to oversee social media. USA Today recently appointed a social media editor, while The New York Times dropped the title and handed the reins to Aron Pilhofer’s interactive news team.

The Times’ move to restructure its social media strategy, by going from a centralized model to a decentralized one owned by multiple editors and content producers in the newsroom, shows us that news organizations are becoming more sophisticated and strategic with their approach to integrating social into the journalism process. In 2011, we’re going to see more news organizations decentralize their social media strategy from one person to multiple editors and journalists, which will create an integrated and more streamlined approach. It won’t just be one editor updating or managing a news organization’s process, but instead news organizations will work toward a model in which each journalist serves as his or her own community manager.

The rise of interactive TV

In 2010, many people were introduced to Internet TV for the first time, as buzz about the likes of Google TV, iTV, Boxee Box and others proliferated in headlines across the web. In 2011, the accessibility of Internet TV will transform television as we know it, not only in the way content is presented but also by disrupting the dominance traditional TV has had for years in capturing ad dollars.

Americans now spend as much time using the Internet as they do watching television, and the reality is that half are doing both at the same time. The problem of being able to have a conversation with others about a show you’re watching has existed for some time, and users have mostly reacted to the problem by hosting informal conversations via Facebook threads and Twitter hashtags. Companies like Twitter are recognizing the problem and finding ways to make the television experience interactive.

It’s not only the interaction, but the way we consume content. Internet TV will also create a transition for those used to consuming video content through TVs, bringing them to the web. That doesn’t mean that flat screens are going away; instead, they will simply become connected to the web and its many content offerings.

15:00

Tracking documents, numbers, and military social media: New tools for reporting in an age of swarming data

To conclude our series of videos from the Nieman Foundation’s secrecy and journalism conference, here’s a video of the day’s final session — the Labbiest of the bunch. Our own Megan Garber moderates a set of presentations on new digital approaches to dealing with new data and new sources.

The presenters: John Bohannon, contributing correspondent for Science Magazine; Teru Kuwayama, Knight News Challenge winner for Basetrack; Aron Pilhofer, editor of interactive news at The New York Times and Knight News Challenge winner for DocumentCloud; and Bill Allison, editorial director of the Sunlight Foundation. Below is an embed of the session’s liveblog.

December 09 2010

20:00

Aron Pilhofer and Jennifer Preston on the new shape of social in The New York Times’ newsroom

In some ways, the most successful social media editor is an obsolete social media editor. The better you do your job — integrate social media into your newsroom, make it a seamless part of your organization’s workflow — the less you’re required to actually, you know, do your job.

By that measure, Jennifer Preston’s work as The New York Times’ first social media editor has been a resounding success — as evidenced by the fact that she will also likely be the outlet’s last social media editor. Earlier this week, the paper announced that it’s scrapping its SM editor role, instead folding responsibility for the Times’ social media oversight into the paper’s Interactive News division, under the leadership of Interactive News Editor Aron Pilhofer.

Pilhofer’s team is composed of developers who are at the vanguard of the journo-hacker movement (we wrote a bit about their intriguing backgrounds this summer), and they’ve already had a big hand in developing some of the Times’ most notable forays into social media: creative commenting systems like Health Care Conversations (which uses what the team internally calls a “bento box” structure to visualize commentary); crowdsourcing efforts like Survival Strategies, which asked for readers’ approaches to coping with the recessed economy; and other efforts. The move into social, in fact, isn’t a move at all. As Pilhofer notes, “It’s really more of an expansion of something that we’re already deeply involved in.”

That expansion has been met with general approval from the Twittersphere and elsewhere: The idea of diffusing the responsibility for social media, shifting it from one person to a team and a newsroom, makes eminent sense. And Preston, for her part — who will continue in the Times’ long tradition of editors-returning-to-reporting, helping to cover, fittingly, the social media beat — likes what the shift to Interactive means for the Times’ future. “That’s where social media belongs, and that’s the direction we have been taking it,” she told me. “And I am thrilled — I am just so delighted — that social media is now going to sit in Aron Pilhofer’s group.”

Evolving toward obsolescence

The shift marks something of a victory for social media at the paper of record — and the news organization where some of the most exciting interplays between tradition and technology are being developed. Social media, at the Times as everywhere else, has undergone an evolution — something that’s been in the works long before Preston took the social media editor role. “The New York Times had a very robust presence in the social media space before I took on that role,” she notes; for the past year and a half, Preston has been not only overseeing the paper’s efforts when it comes to social media, but also playing a get-the-word out role within the Times’ newsroom itself.

One of her great successes in that respect has actually been a quiet one: She has, in a way, normalized the newsroom’s relationship with social media, taking it from a somewhat controversial presence — that newfangled tool of the Twitters — into something now generally accepted as part of the new way of doing journalism. “When I took on the role of social media editor last summer, my primary responsibility was to be an evangelist among our journalists,” Preston told me.

Increasingly, though, that role is becoming obsolete: “Our journalists get it,” she says. “Which has just been so exciting.”

Now, under Pilhofer, social media’s presence will be even more widely distributed throughout the newsroom — and, thus, even more finely integrated within it. “A lot of people, I think, think of engagement as just comments,” Preston notes. “And we have to do such a better job in that area, which everyone is aware of, and we’re really focused on making improvements there. But there’s so many other ways that you can engage people and use social to help drive that.”

In taking over Preston’s general social media responsibilities, Pilhofer told me, he’ll still be playing a training role, introducing Times journalists to new tools for social engagement — Twitter, Facebook, whatever’s next. “We are now at a point when we can start pushing these ideas, and these tools, and responsibilities, and getting this out into the newsroom to continue what Jennifer was doing so effectively for so long.” Pilhofer agrees with Preston, though: It won’t be about evangelism. In fact, “I’m always looking for other words besides ‘evangelizing,’” he notes. “I don’t like it because [social media] is not faith-based. There are very clear, very demonstrable, empirical data we can point to, if that’s what it comes to, to convince somebody that this is worth taking very, very seriously and building into your reporting process. We can show you that. You don’t have to take our word for it.”

And the job of trainer/advocate, going forward, will be distributed. “I think, to some degree, we’re going to be relying on folks who have — to continue the metaphor — ‘gotten the religion’ to continue that,” Pilhofer says. “And one of those people, obviously, is going to be Jen.”

“A seamless mix”

Another aspect of social media’s diffusion into the newsroom will be a broad definition of what “social media” means in the first place. “When this first came up, I was very clear that I wanted to think about all things social — not just a handful of websites that we now talk about,” Pilhofer says. Social media, he points out, “is the entire conversation: It is anywhere readers are talking about us, talking with us, talking with one another, talking about issues that are important — things we’re covering — whether that occurs on NYTimes.com or off. To me, that’s a continuum of things. It’s not just one thing.”

Included in that is an idea that’s especially exciting for future-of-newsies: social-media-as-reporting-tool. “Social media,” as Pilhofer construes the term, includes “a lot of really, really cool new tools that are coming online to help more effectively filter the flow of information — and home in on those interesting tidbits that could turn into great stories.” (Preston echoes that: “I think it’s a mistake just to think of Facebook as a platform just to push out — as a distribution tool. I think it’s just a real opportunity for news organizations to use it to seed communities around your content,” she notes.) And — Pilhofer again — “helping reporters do that, I think, will be very much a part of our responsibility.”

Pilhofer’s team is used to working with text reporters, web producers, business side-ers, and other members of the Times’ organization to accomplish its goals. Interactivity, increasingly, touches every corner of the building. And the entire group of journo-hackers will be involved in making the interpersonal connections within the newsroom that, hopefully, will translate to interpersonal connections in the world beyond its walls. “In effect,” Pilhofer notes, “the responsibilities are coming under the team, and into the team.” The community-building strategies and the technological strategies, currently somewhat separate, are going to be folded into Interactive News. The vision? “A seamless mix of technologists working with journalists to accomplish those kinds of goals. That’ll happen inside the group, but it’ll happen outside the group, as well.”

October 04 2010

07:41

Where should an aspiring data journalist start?

In writing last week’s Guardian Data Blog piece on How to be a data journalist I asked various people involved in data journalism where they would recommend starting. The answers are so useful that I thought I’d publish them in full here.

The Telegraph’s Conrad Quilty-Harper:

Start reading:

http://www.google.com/reader/bundle/user%2F06076274130681848419%2Fbundle%2Fdatavizfeeds

Keep adding to your knowledge and follow other data journalists/people who work with data on Twitter.

Look for sources of data:

The ONS stats release calendar (http://www.statistics.gov.uk/hub/release-calendar/index.html) is a good start. Also look at the government data stores (Data.gov, Data.gov.uk, Data.london.gov.uk, etc.).

Check out What do they know, Freebase, Wikileaks, Manyeyes, Google Fusion charts.

Find out where hidden data is and try to get hold of it: private companies looking for publicity, under-appreciated research departments, public bodies that release data but not in a granular form (e.g. the Met Office).

Test out cleaning/visualisation tools:

You want to be able to collect data, clean it, visualise it and map it.

Obviously you need basic Excel skills (pivot tables are how journalists efficiently get headline numbers from big spreadsheets).
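To make the pivot-table idea concrete, it translates roughly to this in Python with pandas; the file and column names here are invented for illustration, not part of Quilty-Harper’s advice:

    # Hypothetical example: the pandas equivalent of an Excel pivot table,
    # summing spending by department and year from a large CSV.
    import pandas as pd

    df = pd.read_csv("council_spending.csv")  # assumed columns: department, year, amount
    summary = df.pivot_table(index="department", columns="year",
                             values="amount", aggfunc="sum")
    print(summary)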

For publishing just use Google Spreadsheets graphs, or ManyEyes or Timetric. Google MyMaps coupled with http://batchgeo.com is a great beginner mapping combo.

Further on from that you want to try out Google Spreadsheets importURL service, Yahoo Pipes for cleaning data, Freebase Gridworks and Dabble DB.
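The importURL trick has a rough Python equivalent as well; a sketch assuming pandas (plus an HTML parser such as lxml) is installed, with a placeholder URL:

    # Sketch: pull the first HTML table on a web page into a dataframe,
    # roughly what the spreadsheet import functions do. The URL is a placeholder.
    import pandas as pd

    tables = pd.read_html("http://example.com/some-page-with-a-table")
    df = tables[0]                      # read_html returns every table it finds
    df.to_csv("cleaned.csv", index=False)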

For more advanced stuff, you want to figure out a query language and be able to work with relational databases, Google BigQuery, the Google Visualisation API (http://code.google.com/apis/charttools/), Google code playgrounds (http://code.google.com/apis/ajax/playground/?type=visualization#org_chart) and other JavaScript tools. The advanced mapping equivalents are ArcGIS or GeoConcept, allowing you to query geographical data and find stories.
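As a taste of the query-language step, here is a minimal, self-contained sketch using SQLite from Python; the table and figures are invented for illustration:

    # Minimal sketch of the query-language step: load a few rows into SQLite
    # and ask a question of them. Table, names and amounts are invented.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE grants (recipient TEXT, region TEXT, amount REAL)")
    conn.executemany("INSERT INTO grants VALUES (?, ?, ?)", [
        ("Acme Arts", "North", 12000.0),
        ("Beta Housing", "North", 45000.0),
        ("Civic Trust", "South", 8000.0),
    ])
    query = """SELECT region, SUM(amount)
               FROM grants GROUP BY region ORDER BY SUM(amount) DESC"""
    for region, total in conn.execute(query):
        print(region, total)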

You could also learn some Ruby for building your own scrapers, or Python for ScraperWiki.
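A first scraper can be very small; something along these lines (Python, using the requests and BeautifulSoup libraries, with a placeholder URL and page structure) is enough to get going:

    # Beginner scraper sketch: fetch a page and pull out its table rows.
    # The URL and the HTML structure it assumes are placeholders.
    import requests
    from bs4 import BeautifulSoup

    page = requests.get("http://example.com/planning-applications")
    soup = BeautifulSoup(page.text, "html.parser")

    rows = []
    for tr in soup.select("table tr")[1:]:  # skip the header row
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if cells:
            rows.append(cells)

    print(len(rows), "rows scraped")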

Get inspired:

Get the data behind some big data stories you admire, try and find a story, visualise it and blog about it. You’ll find that the whole process starts with the data, and your interpretation of it. That needs to be newsworthy/valuable.

Look to the past!

Edward Tufte’s work is very inspiring: http://www.edwardtufte.com/tufte/ His favourite data visualisation is from 1869! Or what about John Snow’s Cholera map? http://www.york.ac.uk/depts/maths/histstat/snow_map.htm

And for good luck here’s an assorted list of visualisation tutorials.

The Times’ Jonathan Richards:

I’d say a couple of blogs.

Others that spring to mind are:

If people want more specific advice, tell them to come to the next London Hack/Hackers and track me down!

The Guardian’s Charles Arthur:

Obvious thing: find a story that will be best told through numbers. (I’m thinking about quizzing my local council about the effects of stopping free swimming for children. Obvious way forward: get numbers for number of children swimming before, during and after free swimming offer.)

If someone already has the skills for data journalism (which I’d put at (1) understanding statistics and relevance (2) understanding how to manipulate data (3) understanding how to make the data visual) the key, I’d say, is always being able to spot a story that can be told through data – and only makes sense that way, and where being able to manipulate the data is key to extracting the story. It’s like interviewing the data. Good interviewers know how to get what they want out from the conversation. Ditto good data journalists and their data.
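Arthur’s swimming example boils down to a simple before/during/after comparison; a toy sketch with invented figures:

    # Toy example with invented figures: did children's swimming visits change
    # when the free-swimming offer began, and again when it ended?
    visits = {"before": 1840, "during": 3120, "after": 1460}

    print("change when offer began: {:+.0%}".format(visits["during"] / visits["before"] - 1))
    print("change when offer ended: {:+.0%}".format(visits["after"] / visits["during"] - 1))

If the drop when the offer ended is as sharp as the rise when it began, there may be a story in the numbers.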

The New York Times’ Aron Pilhofer:

I would start small, and start with something you already know and already do. And always, always, always remember that the goal here is journalism. There is a tendency to focus too much on the skills for the sake of skills, and not enough on how those skills help enable you to do better journalism. Be pragmatic about it, and resist the tendency to think you need to know everything about the techy stuff before you do anything — nothing could be further from the truth.

Less abstractly, I would start out learning some basic computer-assisted reporting skills and then moving from there as your interests/needs dictate. A lot of people see the programmer/journalism thing as distinct from computer-assisted reporting, but I don’t. I see it as a continuum. I see CAR as a “gateway drug” of sorts: Once you start working with small data sets using tools like Excel, Access, MySQL, etc., you’ll eventually hit limits of what you can do with macros and SQL.

Soon enough, you’ll want to be able to script certain things. You’ll want to get data from the web. You’ll want to do things you can only do using some kind of scripting language, and so it begins.

But again, the place to start isn’t thinking about all these technologies. The place to start is thinking about how these technologies can enable you to tell stories you would never otherwise be able to tell. And you should start small. Look for little things to start, and go from there.

July 22 2010

10:18

Are you on the j-list? The leading innovators in journalism and media in 2010

Recent industry lists ranking the great and good in journalism and the media fell a bit short of the mark for Journalism.co.uk. Where were the online innovators? Where were the journalists on the ground outside of the executives’ offices?

So we’ve compiled our own rundown listing those people we think are helping to build the future of journalism and the news media.

Some important points to note:

  • There are no rankings to this list – those included are from such varied areas of work it seemed pointless;
  • We will have missed some people out – let us know in the comments below who you are working with that should be included;
  • We’ve listed groups as well as individuals – with individuals we hope you’ll see them as representing a wider team of people, who have worked together on something great;
  • And it’s not limited to 50 or 100 – we’ll see where it takes us…

So here’s the first batch. There’s a Twitter list of those included so far at this link and more will be added in the coming weeks.

Click on the ‘more’ link after these five to see the full list.

Iain Overton

The Bureau of Investigative Journalism is both a return to supporting classic, investigative journalism and an experiment in collaborative working and new business models for heavyweight reporting. Overseen by managing editor Iain Overton, the bureau is working with news organisations across a range of media and investing efforts in data mining and new business models.

Will Perrin/TalkAboutLocal

Will Perrin and his team at Talk About Local are changing the local media landscape one website at a time. Through training workshops and community groups, TAL is helping citizens have a voice online – but also encouraging new growth in hyperlocal news. It all began with Kings Cross Environment, the local site that Perrin set up himself.

James Hatts, SE1

There’s a lot of hype about hyperlocal as a future model for local news – and in James Hatts’ case it’s justified. Hatts was still a student when London SE1, which covers London’s Bermondsey and Southwark areas, started. It’s now more than 10 years old and is a great example of quality news and information for the community with an innovative approach to making money to support that goal.

Marc Reeves

The former Birmingham Post editor makes our list because of his straight-talking, forward-thinking attitude to business journalism. Having recently helped launch a new edition of the successful online business news network TheBusinessDesk.com for the West Midlands, Reeves’ views on niche news and the role of editorial in the commercial life of a news organisation are not to be missed.

Stewart Kirkpatrick

The former editor of Scotsman.com, Kirkpatrick launched a new newspaper for Scotland in January this year. With 200,000 unique users in its first month, you wouldn’t bet against the Caledonian Mercury and Kirkpatrick’s innovative approach to creating a truly complementary print and online newspaper with a strong and independent identity.

Martin Moore

As director of the Media Standards Trust, Martin Moore has many responsibilities and aims – but near the top of that list is more transparency for public data online and for the metadata associated with news. His work on the hNews project with the Associated Press in particular is something to keep an eye on.

Charlie Beckett

As director of journalism and society think tank POLIS and a former broadcast journalist, Charlie Beckett is a leading exponent of networked journalism: the idea that journalists can work together across organisations, media and with non-journalists to produce news. His research and writings on this model for journalism show a new way of thinking about the role of the journalist and reader in the production and distribution of news.

Paul Egglestone

Egglestone is digital director at the School of Journalism, Media and Communication at the University of Central Lancashire. He’s been instrumental in the innovative Meld and Bespoke schemes, which run projects from multimedia training for freelance journalists to work aimed at improving local community relationships and living spaces through hyperlocal news, mapping and social media projects.

Pierre Haski

The former Liberation journalist and colleagues from the title are busy carving out a model for successful, heavyweight and independent journalism online with Rue89. The site is not afraid to innovate when it comes to revenue models and, crucially, not afraid to kill off parts of its network if they’re not working. A new print offshoot has just been launched, and with or without this new source of revenue, Haski expects the venture to move into profit next year.

Jason Mawer/Oxbury Media

Taking something traditional – the parish newsletter – and seeing the potential of community-interest publications when combined with cutting edge technology – Fwix – is Oxbury Media‘s game. The agency is focused on getting hyperlocal and community media networked, particularly in terms of advertising. Currently involved with more than 10,000 titles, Oxbury Media has the opportunity to create a hyperlocal powerhouse.

Andrew Sparrow

Senior political correspondent for Guardian.co.uk, Andrew Sparrow showed us how liveblogging was done during the 2010 UK election campaigns: on a typical day the blog got between 100,000 and 150,000 page views, rising to two million on election night. Sparrow’s ability to report, summarise and aggregate material for the site made it a must-read and has rewritten the rulebook for online political coverage.

Alison Gow

Alison is executive editor for digital at the Liverpool Daily Post and Liverpool Echo. Gow makes the list not only for her work with those titles but also for her openness to new ideas, technologies and experimentation with journalism on the web. Her personal blog Headlines and Deadlines shares her thoughts on these developments and offers important insights into the changing role of local media and its relationship with a community online and offline.

Ben Goldacre

The author of Bad Science and esteemed science writer is as influential for his loyal following – you should see the traffic spikes when he links to anything on Journalism.co.uk – as he is for his views on science journalism and transparency online. As a doctor and health professional, his views on journalism come from a different perspective and can offer a necessary antidote to the “media bubble”.

Jo Wadsworth

Web editor for the Brighton Argus, Jo Wadsworth is a digital journalist who remembers the importance of offline as well as online networking. Her work on building a team of community correspondents for the paper and her efforts to help with training and mentoring for non-journalist readers wanting to get involved with the website, amongst other things, show the scope and rewards that a local newspaper website can bring.

Alberto Nardelli/Tweetminster

Alberto Nardelli knows a thing or two about Twitter and social networks – and he’s willing to share it with media and non-media partners to create a better service for users of his site Tweetminster. His and the Tweetminster team’s work shows the power of tracking real-time, social media information, while doing the filtering dirty work for us. It’s a tool for journalists and an example of how new ideas in the digital media world can take hold.

Sarah Hartley/Guardian Local

It’s early days for the Guardian’s venture into hyperlocal ‘beatblogging’ and its architect Sarah Hartley, but the signs are positive. The three existing sites offer a model for how ‘big media’ can do local, making use of third-party websites and dedicated to the online and offline audiences for their patch.

David Cohn/Spot.Us

David Cohn is the founder of Spot.Us, a model for ‘crowdfunded’ investigative journalism. Cohn has carefully built the pitching and funding model, as well as relationships with news media to create partnerships for distributing the finished articles. Spot.Us has grown out of its San Francisco base with a new venture in Los Angeles and even a project built to its model in Australia.

Tom Steinberg/mySociety

Director and founder of non-profit, open source organisation mySociety, Tom Steinberg works to improve the public’s understanding of politics, government and democracy. From campaign literature site The Straight Choice to FOI request site WhatDoTheyKnow, Steinberg helps create tools for journalists and ways for them to play a part in making a better society.

Heather Brooke

From her Freedom of Information rights campaigning to her work on MPs’ expenses, no list of journalism innovators would be complete without Heather Brooke. She’s both a classic investigative journalist with the nose and determination to get a story and someone who knows the best tools to challenge the data and information restrictions that can affect her line of work.

Juan Senor/Innovation Media Consulting

Senor is a fantastic speaker on news and magazines, in particular the notions of design and newsroom structure. His work with Innovation Media Consulting is perhaps best seen through Portuguese microformat newspaper i, a visually stunning and innovative take on what a newspaper or news magazine should look like.

Paul Bradshaw

Founder of the Online Journalism Blog, Paul Bradshaw will soon be leaving his online journalism teaching post at Birmingham City University – but that doesn’t mean he’ll be resting on his laurels. Through his teaching, blogging, books and Help Me Investigate site, Paul’s research and insight into new opportunities for journalists, whether that’s tools, collaborations or entrepreneurship, are not to be missed.

Jack of Kent

A.k.a. David Allen Green. A shining example of specialist writing for the web and why bloggers shouldn’t all be tarred with the hobbyist “in their pyjamas” brush. Green’s dedication to his subject matter, his ability to distill often complex or jargon-riddled legal concepts into plain English and give the issues context should be a lesson to all specialist journalists.

James Fryer and Michelle Byrne/SoGlos.com

Online entertainment and arts magazine for Gloucestershire, SoGlos.com prides itself on high standards editorially and innovation commercially. The site has embraced a start-up mentality for the news business and is quick to react to new business opportunities sparked by its editorial quality. What’s more, the site is developing its model as a potential franchise for elsewhere in the UK, licensing for which would go back into supporting SoGlos.com.

Matt McAlister/Guardian’s Open Platform

Matt McAlister is head of the Guardian’s Developer Network and the driving force behind the Guardian’s Open Platform initiative, which allows third-party developers to build applications using the Guardian’s content and data. The platform has now launched commercially – a revenue stream for journalism in a truly digital age.

Aron Pilhofer

Aron Pilhofer and his team at the New York Times are pioneers in data journalism – both creating interactives and visualisations to accompany NYTimes content and opening up the title’s own data to third parties.

Adam Tinworth

The man involved with most, if not all, things with a social and digital media twist at Reed Business Information, Adam Tinworth is pushing innovation in multimedia journalism and distribution within a big publishing house. He documents his work to help other journalists learn from his experiences – whether that’s reviewing equipment or explaining a common problem – and his liveblogging abilities are something to behold!

Joanna Geary

A member of the Times’ web development team, Joanna Geary is part of one of the biggest experiments in UK journalism. But she’s also a journalist clearly thinking about the future of journalism and news as a business and a profession – whether that’s through her own use of new communication tools and technology or in setting up Ruby in the Pub, a meet-up for journalists and programmers.

May 21 2010

13:00

The programmer majored in English: A fascinating study of the NYT’s Interactive News unit

At the University of Texas’s International Symposium on Online Journalism last month, a series of academics presented papers on the future of news. There’s great stuff, including (Lab contributor) Seth Lewis’s analysis of the professional and participatory logic of the Knight News Challenge and (Lab contributor) C.W. Anderson’s argument for a more holistic approach to academic analysis of news structures.

One that we, at least, found particularly compelling: Cindy Royal’s study of the New York Times’ vaunted Interactive Newsroom Technologies unit. (Think of it as the academic, ethnographic version of “The Renegades at the New York Times,” last year’s New York magazine profile of the team.)

Royal, a Texas State University assistant professor who focuses on digital media and culture, spent a week with the team in an effort to “gain a systematic understanding of the role of technology in the ever-changing newsroom, driven by the opportunities and challenges introduced by the Internet.” The resulting paper examined the group of eleven guys (they’ve since added one gal) widely recognized to be the vanguard of the hacker-journalist movement — and put fascinating anecdotal data behind team leader Aron Pilhofer’s insistence that the group’s mandate is editorial as much as technological.

Though the full paper (PDF) is well worth a read, here’s the slide deck:

One of Royal’s more intriguing findings: Many members of the team don’t have traditional education in programming. “Undergraduate degrees were varied in Art & Design, Anthropology, English, History, Urban Planning, Rhetoric and of course, Journalism. Only two had done extensive educational preparation in a computer-oriented field, and another two had received technical-oriented minors in support of liberal arts degrees.” And their hacking skills? Largely self-taught. “Most had either taken up computing on their own at a very young age or had gravitated toward it due to necessity for a specific job.”

The core unifying quality Royal found among the staff wasn’t a specific programming skill or even a set of those skills. It was passion. Curiosity. Enjoyment of the work and openness to new processes and approaches. “More than half our team members didn’t know Ruby on Rails [one of the Times' core web framework technologies] before they started here,” one member notes. (Team member commentary throughout Royal’s paper is anonymous.) “It’s really more about the concepts inherent in the language,” says another.

However: The editorial part — “getting” the journalism — is also key. (“When I was hired, they definitely cared about how much I was interested in journalism and what my ideas were for projects.”) As is the collaboration part — the institutional realization of the open-source-centric approach the team takes toward its work. The department, Royal notes, “was founded to reduce bureaucracy and introduce flexibility in the process of creating each project, so the group could react more like a reporting team than a support organization.” That’s a goal that the Times is still actively pursuing — most recently, of course, with its decision to move Jill Abramson from her managing editor post to allow her to focus intensively, if temporarily, on digital operations. Abramson will likely be spending at least some of that time in the same way Royal did: studying and learning from the paper’s Interactive News unit.

“The culture of technology is different than that of journalism,” Royal notes. “They each carry different ideas about objectivity, transparency, sharing of information and performance. By merging these cultures, what emerges in terms of a hybrid dynamic? How do the actors, their backgrounds and training, their processes and the organizational structure affect the products they deliver?”

“The Journalist as Programmer” builds on the work of academics like Michael Schudson and Dan Berkowitz, taking an ethnographic (and, more broadly, sociological) approach to news systems — under the logic, as Royal writes, that “news products and ultimate change are not the result of one force or set of forces, but a complete system that encompasses the organization, individual actors and the culture that surrounds them.”

As she explained it to me: “I just wanted to learn about the processes, and who these people were. I knew that they had to be a unique department with unique skills and backgrounds. Because the average programmer really doesn’t have much interest in traditional journalism or storytelling. And the average journalist doesn’t know a lot about programming. So who are these hybrid people — and where did they come from, and how did they learn this stuff?”

After all, “programming and data is journalism,” Royal says. “And it can be practiced in such a way that it can create interaction, user engagement, and more information in terms of seeking the truth. Especially when you talk about Freedom of Information access to government data — if the public can have access to that in a way that makes sense to them, or in a way that’s easy for them to use, then that’s just really powerful.”

May 11 2010

19:26

Crowdsourcing goes global: The NYT’s “Moment in Time”

Visit the New York Times’ Lens blog today, and you’ll find an image slightly different from the high-quality photographs that normally populate the outlet: a spinning globe, highly stylized, its surface popping with piles of pictures.

“Here it is,” the site announces: “Earth, covered by stacks of thousands of virtual photographs, corresponding in location to where they were taken by Lens readers at one ‘Moment in Time.’”

The moment in question? Sunday before last, at 11 a.m. Eastern — the time when the paper asked its users to take photos in an image-gathering project that was equal parts collaborative art and crowdsourcing on steroids.

The invitation to participate:

Attention: everyone with a camera, amateur or pro. Please join us on Sunday, May 2, at 15:00 (U.T.C./G.M.T.), as thousands of photographers simultaneously record “A Moment in Time.” The idea is to create an international mosaic, an astonishingly varied gallery of images that are cemented together by the common element of time.

The feature’s editors — James Estrin (who conceived of it and oversaw its implementation), David Dunlap (who wrote much of the project’s witty instructions and textual updates), and Josh Haner (who, along with Aron Pilhofer, Jacqui Maher, and the Times’ vaunted interactive news team, helped facilitate the behind-the-scenes tech masterminding) — expected that the blog’s invitation would elicit a couple thousand photos. And that organizing and presenting them would take a couple of days.

But: they received, in the end, over 10,000 user-generated submissions, Estrin told me — from photographers amateur and professional, from around the world. (Actually, they received over 13,000 at first — then removed some from the pile when it became obvious, whether by lighting or timestamp or other indicators, that the photos weren’t, in fact, taken at the allotted time.) And the process of organizing all that content took well over a week — a fact about which Dunlap repeatedly (and wittily) apologized. In a post entitled “Patience,” Dunlap wrote, “We have to ask it of you once again. Our interactive ‘Moment in Time’ gallery isn’t ready yet.”

We were bold — O.K., maybe a bit foolhardy — to think we would only need two or three days to prepare a complex three-dimensional computer display showing more than 13,000 photographs from around the world; organized geographically and searchable by topic, with captions and photo credits as coherent and accurate as possible. It’s obviously taken us longer than that and will almost certainly take us a day or two more. (We’re getting out of the prediction business for now.) We simply hadn’t had the experience of dealing with such numbers before. The popular “Documenting the Decade” project, for example, drew only 2,769 submissions.

Please bear with us while we take the time we need to get it right…Be assured that we’ll post as soon as we can. And don’t think for a moment that we’ve been using this time to weed out pictures of cats, dogs, tulips and coffee cups. There’ll be plenty.

And plenty, indeed, there are. The pictures (sortable by fellow-user recommendation, but also by Community, Arts and Entertainment, Family, Money and the Economy, Nature and the Environment, Play, Religion, Social Issues, Work, and — my personal favorite — Other) are, in general, high-quality and compelling. There’s the predictable fare — meditative close-ups of flowers, pictures of cats — but there’s also more surprising and evocative stuff: a grandmother and grandson in Bangladesh, a couple lounging in bed (caption: “My boyfriend and I planned a big adventure for Sunday morning, but we both ended up sick”), an Amish horse-and-buggy (caption: “We live in a part of Pennsylvania where wifi and No-fi coexist pretty well”).

“A Moment in Time” (and, with that, I’ll try not to use the project’s name again in this post — so that you won’t, as I did, get something unfortunate stuck in your head as a result of repeated exposure) is aesthetically compelling and socially revealing. It also suggests the Times’ openness to exploring avenues of documentation and expression that don’t fall into the neat categories of traditional journalism.

“I was driving to work, and it just hit me: Okay, we’ll get thousands of people around the world to take a photograph at the same moment,” Estrin told me of the project’s inception. And the goals of the project mix the artistic and the journalistic to the point that it’s difficult to tell where the journalism ends and the aesthetic begins: first, to produce a valuable document, one that records — to an extent — a particular moment as it’s lived out across the world. Second, from the social media angle, to facilitate the sense of shared identity that comes with “doing things as a community around the world — doing the same things at the same time.” Ultimately, Estrin says, the project was about “the intentional profundity of the moment.”

Whether the feature represents journalism, or something more, or something less, the reaction it’s received from Times users offers a lesson for news organizations chasing after the holy grail-and-sometimes-white-whale that is reader engagement. If the project’s participatory outpouring is any indication, it has struck a nerve with Times users. In a good way. And the ‘why’ in that is instructive. The project involved an assignment with specific instructions; users weren’t merely being asked for something (a hazy invitation to contribute), they were asked to provide something specific, and easily attainable. And to provide something, moreover, that would be part of a project with a clearly defined, but also inspiring, purpose: to document the world, via its many corners, at a particular moment. That mix of depth and breadth, of pragmatism and idealism, can be a potent incitement to action — a fact evidenced by the thousands of images currently blanketing the globe over at the Lens blog.

February 05 2010

16:24

Live webcast from NYC: crowdsourcing and journalism

Via paidContent, we see that a live conference from the New York Times building is being webcast right now (not sure for how much longer), with a stellar line-up: Brian Stelter, media reporter and Media Decoder blogger, the New York Times (moderator); Aron Pilhofer, editor, interactive news technology, the New York Times; Andy Carvin, senior social media strategist, National Public Radio; Amanda Michel, editor, distributed reporting, ProPublica; Jay Rosen, professor, New York University; and Joaquin Alvarado, senior VP, digital innovation, American Public Media.

Live Webcast Happening Now: Crowdsourcing For Journalists, at NYT Building | paidContent.

