
August 17 2012

16:07

Metrics, metrics everywhere: How do we measure the impact of journalism?

If democracy would be poorer without journalism, then journalism must have some effect. Can we measure those effects in some way? While most news organizations already watch the numbers that translate into money (such as audience size and pageviews), the profession is just beginning to consider metrics for the real value of its work.

That’s why the recent announcement of a Knight-Mozilla Fellowship at The New York Times on “finding the right metric for news” is an exciting moment. A major newsroom is publicly asking the question: How do we measure the impact of our work? Not the economic value, but the democratic value. The Times’ Aaron Pilhofer writes:

The metrics newsrooms have traditionally used tended to be fairly imprecise: Did a law change? Did the bad guy go to jail? Were dangers revealed? Were lives saved? Or least significant of all, did it win an award?

But the math changes in the digital environment. We are awash in metrics, and we have the ability to engage with readers at scale in ways that would have been impossible (or impossibly expensive) in an analog world.

The problem now is figuring out which data to pay attention to and which to ignore.

Evaluating the impact of journalism is a maddeningly difficult task. To begin with, there’s no single definition of what journalism is. It’s also very hard to track what happens to a story once it is released into the wild, and even harder to know for sure if any particular change was really caused by that story. It may not even be possible to find a quantifiable something to count, because each story might be its own special case. But it’s almost certainly possible to do better than nothing.

The idea of tracking the effects of journalism is old, beginning in discussions of the newly professionalized press in the early 20th century and flowering in the “agenda-setting” research of the 1970s. What is new is the possibility of cheap, widespread, data-driven analysis down to the level of the individual user and story, and the idea of using this data for managing a newsroom. The challenge, as Pilhofer put it so well, is figuring out which data, and how a newsroom could use that data in a meaningful way.

What are we trying to measure and why?

Metrics are powerful tools for insight and decision-making. But they are not ends in themselves because they will never exactly represent what is important. That’s why the first step in choosing metrics is to articulate what you want to measure, regardless of whether or not there’s an easy way to measure it. Choosing metrics poorly, or misunderstanding their limitations, can make things worse. Metrics are just proxies for our real goals — sometimes quite poor proxies.

An analytics product such as Chartbeat produces reams of data: pageviews, unique users, and more. News organizations reliant on advertising or user subscriptions must pay attention to these numbers because they’re tied to revenue — but it’s less clear how they might be relevant editorially.

Consider pageviews. That single number is a combination of many causes and effects: promotional success, headline clickability, viral spread, audience demand for the information, and finally, the number of people who might be slightly better informed after viewing a story. Each of these components might be used to make better editorial choices — such as increasing promotion of an important story, choosing what to report on next, or evaluating whether a story really changed anything. But it can be hard to disentangle the factors. The number of times a story is viewed is a complex, mixed signal.

It’s also possible to try to get at impact through “engagement” metrics, perhaps derived from social media data such as the number of times a story is shared. Josh Stearns has a good summary of recent reports on measuring engagement. But though it’s certainly related, engagement isn’t the same as impact. Again, the question comes down to: Why would we want to see this number increase? What would it say about the ultimate effects of your journalism on the world?

As a profession, journalism rarely considers its impact directly. There’s a good recent exception: a series of public media “impact summits” held in 2010, which identified five key needs for journalistic impact measurement. The last of these needs nails the problem with almost all existing analytics tools:

While many Summit attendees are using commercial tools and services to track reach, engagement and relevance, the usefulness of these tools in this arena is limited by their focus on delivering audiences to advertisers. Public interest media makers want to know how users are applying news and information in their personal and civic lives, not just whether they’re purchasing something as a result of exposure to a product.

Or as Ethan Zuckerman puts it in his own smart post on metrics and civic impact, “measuring how many people read a story is something any web administrator should be able to do. Audience doesn’t necessarily equal impact.” Not only that, but it might not always be the case that a larger audience is better. For some stories, getting them in front of particular people at particular times might be more important.

Measuring audience knowledge

Pre-Internet, there was usually no way to know what happened to a story after it was published, and the question seems to have been mostly ignored for a very long time. Asking about impact gets us to the idea that the journalistic task might not be complete until a story changes something in the thoughts or actions of the user.

If journalism is supposed to inform, then one simple impact metric would ask: Does the audience know the things that are in this story? This is an answerable question. A survey during the 2010 U.S. mid-term elections showed that a large fraction of voters were misinformed about basic issues, such as expert consensus on climate change or the predicted costs of the recently passed healthcare bill. Though coverage of the study focused on the fact that Fox News viewers scored worse than others, that missed the point: No news source came out particularly well.

In one of the most limited, narrow senses of what journalism is supposed to do — inform voters about key election issues — American journalism failed in 2010. Or perhaps it actually did better than in 2008 — without comparable metrics, we’ll never know.

While newsrooms typically see themselves in the business of story creation, an organization committed to informing, not just publishing, would have to operate somewhat differently. Having an audience means having the ability to direct attention, and an editor might choose to continue to direct attention to something important even if it’s “old news”; if someone doesn’t know it, it’s still new news to them. Journalists will also have to understand how and when people change their beliefs, because information doesn’t necessarily change minds.

I’m not arguing that every news organization should get into the business of monitoring the state of public knowledge. This is only one of many possible ways to define impact; it might only make sense for certain stories, and to do it routinely we’d need good and cheap substitutes for large public surveys. But I find it instructive to work through what would be required. The point is to define journalistic success based on what the user does, not the publisher.

Other fields have impact metrics too

Measuring impact is hard. The ultimate effects on belief and action will mostly be invisible to the newsroom, and so tangled in the web of society that it will be impossible to say for sure that it was journalism that caused any particular effect. But neither is the situation hopeless, because we really can learn things from the numbers we can get. Several other fields have been grappling with the tricky problems of diverse, indirect, not-necessarily-quantifiable impact for quite some time.

Academics wish to know the effect of their publications, just as journalists do, and the academic publishing field has long had metrics such as citation count and journal impact factor. But the Internet has upset the traditional scheme of things, leading to attempts to formulate wider-ranging, web-inclusive measures of impact such as Altmetrics or the article-level metrics of the Public Library of Science. Both combine a variety of data, including social media.
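To make "combining a variety of data" concrete, here is a minimal sketch of a composite article-level score as a weighted sum of normalized signals. The signal names, weights, and baselines are invented for illustration; they are not the actual Altmetrics or PLoS formulas, which those projects document themselves.

```python
# Hypothetical composite impact score. Signal names and weights are
# illustrative only, not any real article-level metrics formula.
WEIGHTS = {"citations": 0.5, "shares": 0.3, "comments": 0.2}

def composite_score(signals, baselines):
    """Normalize each raw count against a baseline, then combine by weight."""
    score = 0.0
    for name, weight in WEIGHTS.items():
        raw = signals.get(name, 0)
        baseline = baselines[name]  # e.g. median count for comparable articles
        score += weight * (raw / baseline)
    return score

article = {"citations": 12, "shares": 340, "comments": 25}
medians = {"citations": 8, "shares": 200, "comments": 50}
print(round(composite_score(article, medians), 2))
```

A score above 1.0 simply means the article outperforms the chosen baselines on the weighted mix; the hard editorial question, as throughout this piece, is choosing the signals and weights.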

Social science researchers are interested not only in the academic influence of their work, but in its effects on policy and practice. They face many of the same difficulties as journalists do in evaluating their work: unobservable effects, long timelines, complicated causality. Helpfully, lots of smart people have been working on the problem of understanding when social research changes social reality. Recent work includes the payback framework, which looks at benefits from every stage in the lifecycle of research, from intangibles such as increasing the human store of knowledge, to concrete changes in what users do after they’ve been informed.

NGOs and philanthropic organizations of all types also use effectiveness metrics, from soup kitchens to international aid. A research project at Stanford University is looking at the use and diversity of metrics in this sector. We are also seeing new types of ventures designed to produce both social change and financial return, such as social impact bonds. The payout on a social impact bond is contractually tied to an impact metric, sometimes measured as a “social return on investment.”

Data beyond numbers

Counting the countable because the countable can be easily counted renders impact illegitimate.

- John Brewer, “The impact of impact”

Numbers are helpful because they allow standard comparisons and comparative experiments. (Did writing that explainer increase the demand for the spot stories? Did investigating how the zoning issue is tied to developer profits spark a social media conversation?) Numbers can also be compared at different times, which gives us a way to tell if we’re doing better or worse than before, and by how much. Dividing impact by cost gives measures of efficiency, which can lead to better use of journalistic resources.
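The impact-per-cost arithmetic is trivial but worth making explicit, because it lets stories of very different scales be compared on one axis. In this sketch, "impact" stands for whatever quantified indicator a newsroom has chosen, and every number is invented:

```python
def efficiency(impact, cost):
    """Impact per unit cost -- lets stories of different sizes be compared."""
    return impact / cost

# Hypothetical: a year-long investigation vs. a quick explainer.
investigation = efficiency(impact=900, cost=120_000)
explainer = efficiency(impact=40, cost=2_000)
print(explainer > investigation)  # the cheap piece can be the more efficient one
```

The caveat from the surrounding text applies in full: the ratio is only as meaningful as the impact measure in the numerator.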

But not everything can be counted. Some events are just too rare to provide reliable comparisons — how many times last month did your newsroom get a corrupt official fired? Some effects are maddeningly hard to pin down, such as “increased awareness” or “political pressure.” And very often, attributing cause is hopeless. Did a company change its tune because of an informed and vocal public, or did an internal report influence key decision makers?

Fortunately, not all data is numbers. Do you think that story contributed to better legislation? Write a note explaining why! Did you get a flood of positive comments on a particular article? Save them! Not every effect needs to be expressed in numbers, and a variety of fields are coming to the conclusion that narrative descriptions are equally valuable. This is still data, but it’s qualitative (stories) instead of quantitative (numbers). It includes comments, reactions, repercussions, later developments on the story, unique events, related interviews, and many other things that are potentially significant but not easily categorizable. The important thing is to collect this information reliably and systematically, or you won’t be able to make comparisons in the future. (My fellow geeks may here be interested in the various flavors of qualitative data analysis.)

Qualitative data is particularly important when you’re not quite sure what you should be looking for. With the right kind, you can start to look for the patterns that might tell you what you should be counting.
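For the geeks, one lightweight way to collect qualitative data "reliably and systematically" is to store each note with free-form tags and count which tags recur, surfacing the patterns that might be worth quantifying later. The records, stories, and tag names below are invented examples, not a standard scheme:

```python
from collections import Counter

# Each qualitative record: a free-text note plus tags applied by the journalist.
records = [
    {"story": "zoning-investigation",
     "note": "council scheduled a hearing",
     "tags": ["policy-response", "local"]},
    {"story": "zoning-investigation",
     "note": "flood of positive reader comments",
     "tags": ["audience-reaction", "local"]},
    {"story": "water-explainer",
     "note": "cited by an advocacy group",
     "tags": ["policy-response"]},
]

# Counting tags across stories suggests which effects recur often
# enough to be worth turning into a quantitative metric.
tag_counts = Counter(tag for r in records for tag in r["tags"])
print(dict(tag_counts))
```

Once a tag such as "policy-response" shows up consistently, it becomes a candidate for the kind of countable indicator the quantitative side needs.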

Metrics for better journalism

Can the use of metrics make journalism better? If we can find metrics that show us when “better” happens, then yes, almost by definition. But in truth we know almost nothing about how to do this.

The first challenge may be a shift in thinking, as measuring the effect of journalism is a radical idea. The dominant professional ethos has often been uncomfortable with the idea of having any effect at all, fearing “advocacy” or “activism.” While it’s sometimes relevant to ask about the political choices in an act of journalism, the idea of complete neutrality is a blatant contradiction if journalism is important to democracy. Then there is the assumption, long invisible, that news organizations have done their job when a story is published. That stops far short of the user, and confuses output with effect.

The practical challenges are equally daunting. Some data, like web analytics, is easy to collect but doesn’t necessarily coincide with what a news organization ultimately values. And some things can’t really be counted. But they can still be considered. Ideally, a newsroom would have an integrated database connecting each story to both quantitative and qualitative indicators of impact: notes on what happened after the story was published, plus automatically collected analytics, comments, inbound links, social media discussion, and other reactions. With that sort of extensive data set, we stand a chance of figuring out not only what the journalism did, but how best to evaluate it in the future. But nothing so elaborate is necessary to get started. Every newsroom has some sort of content analytics, and qualitative effects can be tracked with nothing more than notes in a spreadsheet.
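As a sketch of the "nothing so elaborate" starting point, a per-story record can carry both the automatically collected numbers and the hand-written notes side by side. The fields here are one possible shape, invented for illustration, not a standard schema:

```python
from dataclasses import dataclass, field

@dataclass
class StoryImpact:
    slug: str
    pageviews: int = 0      # quantitative: from the analytics system
    inbound_links: int = 0  # quantitative: from link tracking
    # qualitative: dated free-text notes on what happened after publication
    notes: list = field(default_factory=list)

    def log(self, note):
        self.notes.append(note)

story = StoryImpact(slug="zoning-developer-profits",
                    pageviews=48_210, inbound_links=17)
story.log("2012-08-20: county council requested the underlying documents")
print(story.slug, story.pageviews, len(story.notes))
```

A spreadsheet with the same columns would do just as well; the point is that quantitative and qualitative indicators live on the same story record, so future comparisons are possible.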

Most importantly, we need to keep asking: Why are we doing this? Sometimes, as I pass someone on the street, I ask myself if the work I am doing will ever have any effect on their life — and if so, what? It’s impossible to evaluate impact if you don’t know what you want to accomplish.

February 03 2012

14:00

Video Volunteers Makes an Impact in India with Incentives for Media Makers

As part of a 4-part series, Video Volunteers is sharing what we've done over the last year, our experiences, and what we've learned. Part 1, which you can read here, was a basic introduction to IndiaUnheard, our flagship rural feature service.

Part 2 outlines new ideas we implemented into our training programs in 2011. For instance, we set incentives for our community correspondents in India. This triggered a series of valuable positive changes for the communities concerned.


Incentives work

In October, we held an advanced training session for our strongest community correspondents which focused on activism and getting "impact." (To us, "impact" means that the community correspondent is able to resolve the problem the video addresses.) We told them we had decided to incentivize impact.

They would be paid 5,000 rupees (approximately $100) -- more than twice the regular stipend -- for an "impact video." To earn it, they would make a video, screen it locally to get the issue solved, and then make a second video documenting that process and proving the impact actually took place. The 5,000 rupees would be paid for that second video.

Some amazing impacts happened this year: In Orissa, illegal timber smugglers were stopped by local villagers. In Mumbai, a factory was forced to clean its pollution. In Assam, politicians released desperately needed water to villagers. Rather than be turned away, Dalit children got help in village child centers. Expectant mothers received folic acid which had previously been withheld. And, in one area, some 600 women for the first time were paid minimum wage.

These are just some of our stories. You can watch our impact videos here.

Recruitment is challenging

Our goal is to have 645 community correspondents, or one in every district of India. We had to think hard about how we could quickly scale up if we needed to.

Our first two rounds of recruitment for IndiaUnheard were through our existing network. We sent emails asking people to nominate someone from the villages they work in and then to help them fill out the online application. We got a few hundred applications that way and thought we could keep doing it like that. But when we tried for the third round, the number of eligible applications was low (though the overall applications were higher than in previous years). Maybe we had tapped out our existing network.

So how could we quickly scale up? Possibly through big non-profit institutions (like microfinance). We are reaching out to them now.

Choose the right geographies

For our first two rounds, our goal was to get one or two people in every state. Now that we've almost done that, we're going to focus on key regions we feel are "unheard."

Last month, we took about 20 new community correspondents from Jharkhand. We chose Jharkhand because it is part of the so-called Red Corridor where there is a Maoist insurgency taking place. In the future, we'll look at the North East where other separatist movements are taking place, and Kashmir. (Those two areas were out of our budget this year.)

My colleagues Kamini Menon and Stalin K. spent two weeks traveling around this area meeting the activists and doing the recruitment; this live recruitment is making recruitment easier and will also make retention higher because the 13 new correspondents, each representing one district in the same state, can support each other.

Partnerships are challenging

Two years ago, when our Community Video Units were our primary focus, we felt that we could scale this network through investments from NGOs (non-governmental organizations). We've realized that co-ownership is very difficult and can at times be a hindrance to innovation.

We now feel that we can scale better through partnerships with the mainstream media, rather than NGOs, and so for that reason, a huge focus this year has been on ensuring the content can work for both a local community and outside audience.

From our Community Video Units, we've learned a few other things: One is that a model where people are paid only when they perform is better than the Community Video Units model, in which the six or seven people who work together on a film are given a monthly wage.

Women produce more

Two observations we are thrilled to see: Women produce more, and retention is higher with the underprivileged. It suggests that journalism really is an appropriate livelihood for the poor. We started to see that with online recruitment, we had selected certain people whose incomes were clearly higher than they had told us on the phone. Live recruitment in extremely remote areas of Jharkhand will help get the correct balance.

The amount they can produce is low

We ask correspondents to produce two videos a month. They produce on average one or less. One reason is that being a journalist is difficult; it takes a lot of personal courage to confront officials and ask people private questions. They can spend a whole day on a bus getting to an official who then won't see them. They have to take care of their families, too.

I learned this year about the concept of "businesses in a box" and franchises, such as rural women selling solar lamps or soap sachets, and I discovered that we should make the process as simple and step-by-step as possible.

But journalism is simply harder than selling soap. We also ask them to produce tough stories that they have to research and which take time, unlike stringers, who are told to "go film this event and send us the footage." This means that our "cost per story" is higher than we would like. But we also aren't taking huge steps to increase their productivity right now, because we don't yet have enough buyers to support a huge level of production.

Choose the right people to train

The fact that we put such effort in selecting interesting people to train is a huge asset for us. Our new batch of correspondents includes people whose personal stories are, in some ways, the story. We have two boys from Kashmir who have seen the insurgency; a young man whose sister was the first dowry death in his state; women who have experienced sexual violence and have the courage to speak about it; and a good representation from the North East, including one young man who got the first footage of a particular insurgent camp because he's from that area.

In our training, we teach them that their power as a community correspondent will come through using their personal experiences and connections to the issues. This is what they have that no professional, no outsider, can ever replicate. They learn that they themselves must speak out, and speak personally, if they want their communities to do so, too.

Good training is not necessarily scalable. (That's another thing that we learned in 2011 -- that the training aspects of our work will always be expensive because education doesn't have a lot of economies of scale.) But it is the most valuable investment.

You can watch a video from our trainings here.

Stay tuned for Part 3 of this series, which will focus on our modes of online and offline distribution and our experience with earning income from partners and the mainstream media.

February 09 2011

14:00

3 Key Topics for the NetSquared Community: Part 3, Network Narrative

Over the last two weeks, we have posted parts 1 and 2 in a 3-part series, sharing some of our observations and planning concepts, and hoping to gather feedback and ideas from you. The first part in the series focused on Local and Global and the second highlighted opportunities to Expand our Impact. This week, we want to examine the ideas and framing for a Network Narrative - a topic we really think you can help with! As we share our early thinking about these areas of our work, we hope you’ll help to shape our thinking and direction by sharing your ideas, feedback and questions in the comments, or directly with us at net2@techsoup.org.

Creating a Compelling Narrative

There’s lots going on and lots to talk about - whether it’s Project ideas that emerge and change the world, or Local groups that create the first opportunity to share and collaborate in diverse regions around the world. So, how do we pull it all together into a compelling narrative? One for funders, one for techies, one for activists, one for organizations, and beyond? What’s the story that supports our work? And, from a strategic development perspective, maybe we need to further explore the difference between the overarching narrative and the various stories that support it and match the different groups within the network. Your story is the one we want to tell, and we would love to hear how you see the NetSquared programs helping you change the world!

We are so thankful to have members willing to make time to share, ask questions, and dream with us. And we are so thankful to community members like you who share your ideas here! We are looking forward to continuing this conversation and can’t wait to see what ideas you share.

Some questions to get you started:

  • What is the story you see of this sector and your work?
  • How can we capture a compelling narrative that empowers you to get involved?
  • How would you tell the NetSquared story - how are the community-driven programs helping you change the world?

June 11 2010

14:00

Bill Buzenberg on Center for Public Integrity’s aim to “catalyze impact,” fundraise in a competitive field

Nonprofit news organizations may be all the rage, but they’re not a new animal. Last week, the 20-year-old Center for Public Integrity announced a round of recent hires. Since January, CPI has brought on nine new journalists, including reporters, editors and a database expert. For a team of about 50, it’s a significant expansion.

New hires include John Solomon, long-time investigative reporter and the former executive editor of The Washington Times, as “reporter in residence,” Julie Vorman, former Reuters Washington editor as deputy managing editor, and Peter Stone of National Journal.

CPI is known for its investigative projects that appear in major print and broadcast outlets. A recent year-long project on campus sexual assault was picked up by outlets around the country, reaching what CPI said was an audience of 40 million. Last week CPI partnered with The New York Times in publishing Coast Guard logs suggesting authorities knew about the severity of the BP oil spill much sooner than announced. The logs were also published on the Center’s website and were widely used by newspapers across the country.

I spoke with Bill Buzenberg, CPI’s executive director, about the expansion and the organization more broadly. Buzenberg says CPI does not fall on one side of the “impact v. audience” question, but acknowledged that their latest strategic plan emphasizes the organization’s desire to “catalyze impact.” He thinks it’s an exciting time for nonprofit journalism, but sees challenges in an increasingly crowded fundraising field. Here’s a lightly edited version of our conversation.

Is this a new team you’re hiring for a specific project or a general expansion of your editorial capacity?

It’s a general expansion of our editorial capacity. We have a very strong push on: The top major newspapers are all using our content, even online at The Huffington Post. The work is being used more than ever. Lots of places want to partner with us. There is so much watchdog work to be done.

Some nonprofits, like MinnPost, are focused on drawing a regular audience to their website. Others are looking for other outlets to pick up their work and reach an audience that way. Could you talk about where Center for Public Integrity fits?

I think from the beginning the Center has had the same trajectory. In the beginning, actually, it did reports, held news conferences and handed out those reports, and they were reported on by other publications. That is still part of our operation. We very much do reporting work — sometimes it’s a year, sometimes it’s months, sometimes it’s a few days — and we make it available to other organizations very broadly. And it gets used very, very broadly.

One example: We did a project on campus assault, just recently. We worked on it for a year. We collected the data from 160 universities, we did an investigation, we did a lot of FOIAs, which we increasingly do here, we get the documents and the data. Then we did a number of reports. And we look for a specific partner on each platform: online, print, radio, and television. That’s what we’ve done. ABC did a story on it. NPR did a number of reports on it. Huffington Post carried a number of reports. And we made a specific plan to provide a toolkit for campus newspapers: 65 campus newspapers have used that report. We made it available in an ebook. The sum total of all that: we can now say that 40 million people have heard, watched, seen, or read some part of our campus assault project. It is on our website. And there’s a community interested in this work, that’s concerned about what’s going on with campus assault. So we have a resource on our website. And it’s in the other publications.

So we’re both. We want people to come read it and get our work here, and we love it when it’s published elsewhere and linked back to us. There is always going to be more on our site — more data, more documents, more photographs. We want traffic to our site, as well as have it used elsewhere.

We also run the International Consortium of Investigative Journalists. The consortium is 100 journalists in 50 countries. We are working on, right now, three major cross-border investigations. We’ve been working on global tobacco for quite a while and issuing reports. Those reports are running in publications all over the world where those reporters work or have connections. For example, in July we have a project coming out with the BBC. The BBC has planned two documentaries and several programs. They’re using all of the work that we’ve started. We’re all doing it at the same time. It’ll come out the third week in July and it’ll run all over the world. Not just the BBC World Service, but in countries where we’ve been working. So we work internationally. We work in Washington, increasingly covering federal agencies. And we work at the state level, where we’re able to do 50-state projects. So that’s our model. It’s unique in how it operates. We’ve spent 20 years building this up. We’re very much pushing to do more, do it better, and do it widely.

You mention audience — is that how you measure success? There’s this debate happening right now: Is it audience, or is it impact? How do you define success at CPI?

Increasingly, the real way we measure success is impact. That is a huge part of our strategic plan: We want to catalyze impact. That means we want hearings to follow. We want laws to change. We want actions to happen. We are not an advocacy organization. We don’t go out and say “here is what you should do” in any way, shape, or form. We’re an investigative journalism organization. We do the reporting, but we love to see actions happen because of our reporting. A few years ago, when we reported on all the lobbyist-paid travel, where the records were kept in the basement of the Capitol that no one had ever looked at — that took a year to do, with students. [Disclosure: I was one of those students.] But we listed every single trip taken by every single member of Congress for five years, and every staff member of every member of Congress. We showed every trip, every expense. The minute that was published, the travel started down. Then the new Congress came in and said, “oh, we have to close this loophole.” It was a loophole because it was public and transparent. We love that that’s an action that comes out of it.

But of course we like audience and we like engagement. So audience is a part of it. Engagement is increasingly a part of it. Are people writing comments, giving us ideas? How is the audience engaged? I was just up in Minnesota — the university there had just done a day-long session on campus assault, which came out of a public-radio interview they did with our reporter there. That’s an engagement in an issue at a local level that is very important.

[Buzenberg said that CPI's site attracts more than 1 million unique visitors per year, but declined to release exact traffic statistics.]

Nonprofit journalism is a hot topic right now, but there have been outlets like yours for a long time. I’m wondering, in terms of fundraising, does that give you a leg up right now, given that you’re established, or is it becoming difficult in a more crowded field?

I was in public radio for 27 years, both at National Public Radio and local. I was the head of news at national for seven years and then went to Minnesota Public Radio, now called American Public Media nationally. We raised a lot of money in both places. That’s nonprofit journalism with an important audience and it does great work.

Right now, I think, many funders have understood that the watchdog work, the investigative work, it’s expensive, it’s difficult, it’s risky. It’s the first thing often that gets cut when newspapers are declining, or magazines, or television, when they don’t have as many people out doing it. I think it’s been a period in which foundations and individuals have seen the importance of the kind of work that we do and we’ve gotten some strong support to continue to do this work. Yes, it’s competitive. It’s difficult.

We’re raising money in three ways. We do have foundation support. We’re talking with something like 86 foundations, many of whom do support us. We also are raising money from individuals — small donations with membership, much like public radio. Larger donations from people with resources. We do have a strong base of individual donors. And the third way is earned revenue, and we’re working on various scenarios of how we can earn that. We just did research for BBC. We sold our map on the global climate lobby to National Geographic. We’re selling ebooks. We do have various small revenue streams we want to grow. Those are three ways we raise the money to do this work. It’s important work and it’s not free. Public radio’s not free either. They get government resources — a small amount really. But at the Center we don’t take government money, direct corporate money, and we don’t take anonymous money. We make transparent, which is a very important thing, who is supporting us. It’s difficult. It’s not easy. With all the new centers popping up, there’s competition. There’s a lot going on, but I think many foundations, locally and nationally — and increasingly internationally, because we’ve gotten some good international support — have understood that this kind of work needs to be supported.

One thing I wanted to circle back to is your expansion. It seems like your recent expansion is into financial coverage. How did you come to that decision to expand in such a focused way?

It came when the financial crisis hit the fall of 2008. We felt like no one was really saying who had caused the subprime problems — who was behind that? So we did a project. We started with 350 million mortgages. The mortgages are public information. From that, we named the 7.5 million subprime mortgages and we picked the 25 top lenders. Who they were, who supported them, where they did their lending, at what interest rates. We put it into a report. It took us six months. It’s “Who is behind the financial meltdown?” It still gets traffic. We put it out as an ebook. It’s being used by attorneys general. It’s being used by all sorts of people. No one had done the definitive work. That’s a project I’m really proud of. From that we grew a business and finance area. We thought there was so much more.

We’re tracking financial regulation and financial regulation issues in a way other people aren’t doing. That’s what our three-person team is doing. Financial is one area — money and politics is obviously one area we work in at the state and the national level. I might add, when we did the global climate lobby before Copenhagen, we were working globally. The other area is environment. The stories we’re working on with the BBC are environment. We’re doing a big project on the 10 most toxic workplaces and the 10 most toxic communities in America. It’ll take us six months.

How big are you? How many people work at the Center?

Right now, with the additions, we’re about 40. We also have five fellows and six interns, so we’re close to 50 people if you count them. It’s a major investment, there’s no question about it. That’s how we’re able to focus on these new projects.

This is a little touchy, but it jumped out at me. When I looked at the press release for the expansion, I noticed that the eight new editorial hires are all men. I’m just curious about your struggles with diversity and bringing on women?

Well, first of all, the corrected version of the press release we sent out has Julie Vorman. We hired a deputy managing editor whose name should have been on there, and it’s not. It’s not all the hires at the Center — the six interns we hired, for example, are all women. We had 350 applicants for our internship program and we picked the best six. There are women at the Center. If I looked at the overall Center numbers, it is diverse, and it does have women. My COO and the head of development are in there, and on and on. There are many women here. It looked more male than it should have in the latest hires. It’s a fair question, but I think you have to look at the overall numbers of the Center, both for diversity and for women reporters.

[After our conversation, Buzenberg looked up a breakdown of all staff at the Center, finding 43 percent are women and 23 percent are minority. Their staff page, showing individual positions, is here.]

June 01 2010

16:00

MinnPost’s Joel Kramer on the pull between big audience and big impact

The New York Times’ David Carr took a look recently at the nonprofit news site MinnPost, which he called “one of the more promising experiments in the next version of regional news.” Here’s an excerpt:

“We want MinnPost to be able to stand on its own by 2012, and I have a very aggressive definition of sustainability, which is that we have enough revenues to survive without foundation money,” [MinnPost founder Joel Kramer] said. “A lot of the foundation money for journalism goes to large, investigative-oriented sites, and I don’t know that there will always be money for sites like ours where the emphasis is on regional coverage.”

That means that some ambitions have been deferred. The staff is small, some of the work comes from freelancers and, journalistically, MinnPost is a careful, really smart site, but it is built on high-quality analysis rather than deep reporting and investigative work. Mr. Kramer was hard-pressed to come up with a single large story the site broke that changed the course of events.

Kramer’s right that much of the attention nonprofit news outlets receive focuses on the big investigative operations, most prominently ProPublica. And if your goal is to replace what newspapers no longer do as much of, investigative reporting is an obvious focus for nonprofits and foundations. ProPublica’s Paul Steiger has said he measures his success by “impact” — a.k.a. stories that “changed the course of events” — more than audience.

I was interested in that tension between impact and audience, so I gave Kramer a call. “Having a loyal audience is central to our success and our survival, and, therefore, when we decide how to allocate resources, we have to focus on which things will build this loyal audience,” he told me. Here’s an edited version of the conversation I had with Kramer about the evolution of MinnPost.

I’m remembering when MinnPost launched back in 2007, that it was launched in response to newsroom cuts in Minnesota. Do you still see your site as serving that fill-in function of trying to produce additional news in the state? Or has your vision for what the site is doing changed?

The goal was never fill in. I would say that the goal is to serve a community of people who care about Minnesota, people who are engaged in creating the state’s future, opinion leaders, office holders, activists. It’s an important segment of the people who read newspapers. It’s not everybody. Our goal has always been to serve that audience with news, information, analysis, commentary, forum for discussion, for people who are actively involved in the community of the state. That has always been our goal. It’s never been to replace what mainstream media do, but to supplement it, aimed at the people who read the most and act on what they read the most. And that has not changed.

David Carr referred to journalism that “changed the course of events.” Do you see that sort of journalism as your responsibility as a news outlet?

I don’t think that is our principal responsibility. We take our principal responsibility as informing this community with what they want and need to know to play the roles they want to play in creating our community and creating its future.

We do ask our audience what it is that makes them read MinnPost and why they like it and why they keep coming back to it, and the most important thing is reporting and analysis from writers they trust and being on top of stories they really care about and explaining what the stories really mean. In other words, getting beyond the superficial reporting. For example, reporting on the motives of lawmakers — assessing the quality of their proposals and of their actions. Comparing what happens here to comparable situations elsewhere. Predicting what might happen next, based on the authority of the reporter. And introducing these readers to new ideas they didn’t know about, trends and people they should know about. These are the main things, the most important things we do.

Does the site look the way you would have predicted two years ago? Has it evolved based on feedback from your readers?

It has evolved. We learned, both from examining traffic data and from talking to readers, that frequency of appearance on the site by trusted writers is a critical element of success. I’m not going to say that is necessarily true for everybody — I’m just speaking from our experience. But for us, we learned that. Whereas before I started, I might have thought that writers would take a longer period of time on a story and then write less frequently, and maybe at greater length, that does not produce the kind of loyal following we were after. The critical element is appearing frequently on the site, in a way that makes it clear who the writer is.

I went back and looked at the clips from when MinnPost launched, and at the time it seemed the site was going to be more like a traditional newspaper translated online than what it is now, which is more like a blog that has taken on reporting elements. If you were to read your description from the launch and look at the site now, it looks different.

When we launched, some skeptics said that, you know, ‘Joel and his people don’t really understand new media, they don’t really understand the internet.’ And I would plead guilty to that. At the time I even said, I’m a journalist, I come out of a print background, although we did have a couple of editors with more of an Internet background than I did, and I agreed that I was trying to make something happen here that related to a value system I had built in previous media. But I said we were going to learn. So there’s no question: I’d be shocked if our site looked today like I was talking about 2.5 years ago. That’s a long time ago in the Internet world. So, yes, it’s clearly different — no question about it.

But the differences, in my opinion — and this is important to me — they’re not differences in what constitutes quality. Because you can have quality in short form and quality in long form. You can have quality in pieces that took six months and in pieces that were turned around in four hours. And from day one, we were committed to the idea that our writers did not have to be bound by some false definition of objectivity, in which the writer pretends that he or she has no views about anything. So those things were there from the beginning. But there has been a significant evolution in what works in the medium and what works to build an audience.

What about other models, like nonprofits that focus more on investigative reporting?

As is mentioned in the Times piece, we have the goal of becoming sustainable without foundations. It’s a very ambitious goal and I’m hoping we’ll achieve it by 2012, our fifth year. It’s certainly not a goal shared by all nonprofit journalism enterprises. A key to succeeding at that goal is having an audience that you can figure out a variety of ways to monetize. That could be advertising, it could be sponsorships, it could be donations. It could be the support of wealthy people in the community who love the idea and the audience that’s been created. Having a loyal audience is central to our success and our survival, and, therefore, when we decide how to allocate resources, we have to focus on which things will build this loyal audience. And it’s that focus that changes over time, because you get tremendous feedback, traffic feedback, anecdotal feedback, and you learn what it is that attracts your audience to you.

I think the differing goals of these nonprofits are interesting; some nonprofits are just not particularly concerned with the traffic levels on their website. What do you think of the differences?

There are all kinds of different missions, and I think it’s a great time in the ecosystem where all different things are being tried. If you’re not concerned with traffic, you need to have a set of supporters who are going to be there, not because of your audience, but because of some other factor — such as your impact through investigations on the quality of government in your community, or something like that. So there are ways you could sustain yourself that way without a focus on a regular audience. Another thing you can do, and some of my peer sites are doing this, is give your content away to other places. If you do that, then visits to your site are not important, and you might be able to build a model based on syndication, where publishing less frequently but giving it to prominent places could work for you. But our model is based on building a loyal audience on our site.

March 30 2010

18:56

A “reader affection” formula: Gawker creates a metric for branded traffic

Influence, engagement, impact: For goals that are, in journalism, kind of the whole point, they’re notoriously difficult to quantify. How do you measure, measure a year, and so on.

Turns out, though, that Gawker Media, over the past few years, has been attempting to do just that. Denton and crew, we learned in a much-retweeted post this morning, have been “quietly tending” a metric both more nebulous and more significant than pageviews, uniques, and the other, more traditional tools of impact assessment: They’ve been measuring branded traffic — or, as the post in question delightfully puts it, “recurring reader affection.” The metric comes from a simple sum: direct type-in visits plus visits from branded search queries, as reported in Google Analytics.
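The compound is simple enough to sketch in code. Here’s a minimal, hypothetical illustration in Python — the brand terms, field names, and sample visits are all invented for the example and don’t reflect Google Analytics’ actual data model:

```python
# Sketch of a "branded traffic" tally: direct type-in visits plus
# visits arriving via search queries that contain a brand term.
# All names and data below are illustrative assumptions.

BRAND_TERMS = {"gawker", "lifehacker", "gizmodo"}  # hypothetical brand keywords


def is_branded_query(query: str) -> bool:
    """Return True if a search query contains one of the site's brand terms."""
    words = query.lower().split()
    return any(term in words for term in BRAND_TERMS)


def branded_traffic(visits) -> int:
    """Count visits that came in direct or through a branded search."""
    count = 0
    for visit in visits:
        if visit["medium"] == "direct":
            count += 1
        elif visit["medium"] == "search" and is_branded_query(visit["query"]):
            count += 1
    return count


visits = [
    {"medium": "direct", "query": ""},                    # typed the URL in
    {"medium": "search", "query": "gawker nick denton"},  # branded search
    {"medium": "search", "query": "celebrity gossip"},    # unbranded search
    {"medium": "referral", "query": ""},                  # followed a link
]
print(branded_traffic(visits))  # 2 of the 4 visits count as "branded"
```

The interesting design choice is less the arithmetic than the split itself: every visit is forced into either the "sought us out" bucket or the "wandered in" bucket, which is exactly the occasional-versus-core distinction the post describes.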

In other words, Gawker Media is bifurcating its visitors in its evaluation of them, splitting them into two groups: the occasional audience, you might call it, and the core audience. And it’s banking on the latter. “New visitors are only really valuable if they become regulars,” Denton pointed out in a tweet this morning. (That lines up with Denton’s recent pushing of unique visits over pageviews as a performance metric.)

The goal — as it is for so many things in journalism these days — is to leverage the depth against the breadth. As the post puts it:

While distributing content across the web is essential for attracting the interest of Internet passersby, courting these wanderers, massaging them into occasional visitors, and finally gaining their affection as daily readers is far more important. This core audience — borne of a compounding of word of mouth, search referrals, article recommendations, and successive enjoyed visits that result in regular readership — drives our rich site cultures and premium advertising products.

I spoke with Erin Pettigrew, Gawker Media’s head of marketing and advertising operations — and the author of the post in question — over gChat to learn more about the outlet’s branded-traffic metric.

“The idea came from a few places,” she told me.

First, for so long we concerned ourselves with reach and becoming a significant enough web population such that advertisers would move us into their consideration set for marketing spend. Now that we have attained a certain level of reach and that spend consideration, we’re looking for additional ways to differentiate ourselves against other publisher populations. So branded traffic helps to illuminate our readership’s quality over its quantity, a nuanced benefit over many of the more broadly reaching sites on the web.

Secondly, there’s a myth, especially in advertising, that frequency of visitation is wasteful to ad spend. As far as premium content sites and brand marketers go, however, that myth is untrue. So, the ‘branded traffic’ measure is part of a larger case we’re making that advertising to a core audience (who visits repeatedly) is extremely effective.

Another aspect of that case, she adds, is challenging assumptions about reader engagement. “The wisdom has been that the higher the frequency of ad exposures to a single visitor, the less effective a marketing message becomes to that visitor. To the contrary, the highly engaged reader is actually far more receptive to the publisher’s marketing messaging than the occasional passerby.”

In other words, she says: “Branded traffic is to a free website what a subscriber base is to a paid content site. The psychology behind the intent to visit and engage with the publisher brand in those two instances is very similar.”

The approach’s big x-factor — whether branded traffic will get buy-in, in every sense, from marketers — remains to be determined. “It’s something we’re just beginning to explore,” Pettigrew says. But marketers, she points out, “have always considered front door takeovers or roadblocks as one of the most coveted advertising placements on a publisher website.” And they “intuitively understand that the publisher brand’s halo is brightest and strongest for a reader who comes through the front door seeking the publisher’s brand experience” — which is to say, they should realize the value of the core audience. “But we’ve yet to see a metric take hold across the industry that gets at a numerical understanding of this marketer intuition.”

March 11 2010

18:25

Witness Creates Sophisticated Evaluation Tools for Video Impact

Last month, Jessica Clark and I explored how various Public Media 2.0 projects are measuring their level of success in informing and engaging publics. We found that many public media organizations are struggling to measure impact -- and some are relying only on traditional indicators of reach, as opposed to other elements of impact such as relevance, inclusion, engagement or influence. Some projects, however, are taking a more holistic approach that is matched closely to their mission.

The international human rights group Witness, which provides training, support and visibility for local groups producing documentaries about human rights issues, has created a Performance Dashboard that tracks more than just the number of viewers. Using "at a glance" metrics, descriptive analysis and direct feedback from participants, the Performance Dashboard provides a concise overview of impact.

It combines traditional metrics -- such as sales and licensing numbers, email subscriptions, blog statistics -- with more nuanced data, including a timeline indicating progress of core partnerships. These reports are published twice per year on the Witness website, and they are made available to other organizations under a Creative Commons license.


Videos With a Purpose

Witness is able to efficiently track progress in large part because they begin each media project with clear advocacy goals. According to Sam Gregory, Witness's program director, all work "springs out of an advocacy strategy." He said Witness is focused on "making videos for a purpose as opposed to making videos about an issue."

Each video project starts with the completion of a Video Action Plan, which encourages partners to think purposefully about intended impact, avenues for action, and measures of success.

Some of these measures of success are particularly striking. For example, Witness worked with the Centre for Minority Rights Development (CEMIRIDE), a human rights organization, to create a film about the displaced Endorois community in Kenya. The film ended up being presented as evidence in a landmark case in which the African Commission on Human and Peoples' Rights ruled in favor of the Endorois community. Last month, the African Union, the highest legal authority in Africa, upheld the earlier decision and ordered the Kenyan government to provide the Endorois with compensation and reinstate their land.

While a direct causal link can be difficult to prove, clearly this film did its job. In a case such as this, the element of impact that is most important is influence, not reach. Gregory explained that even if only a few people saw the film, the film achieved its desired impact because they were the people with the power to decide the case.

New Focus: User-Generated Video

Witness hopes to broaden its impact with a new strategic vision that addresses the exponential growth in user-generated video. The organization is focusing on how user-generated video can be used by human rights advocates. (MediaShift reported on the organization's earlier experiments with viral video in 2006.) Witness currently trains about 500 people in human rights filmmaking across the globe per year, and recognizes the need to shift to a more scalable training approach. One of the ways that Witness will make this shift is by developing shared virtual spaces for fostering discussion on what works and what doesn't.

Yvette J. Alberdingk Thijm, Witness's executive director, explained the strategy in a blog post:

Right now and right here Witness, with your help, can exponentially expand its impact. But the demand for our services is far greater than our capacity. Witness's New Strategic Vision is designed to scale our impact. So beginning in 2010, in addition to continuing to train and support individual grassroots organizations, Witness will forge relationships among organizations and networks, creating a broader, more interconnected global human rights community. By doing this, we'll play a seminal role in forging coalitions that seek shared goals, with video emerging as the common language across all types of borders. In addition, we will scale our work by creating video toolkits and other web tools that facilitate knowledge sharing.

With the new focus on networked campaigns, in some ways, impact will become more difficult for Witness to track. What is the most effective way to measure impact when the media in question spans so many different modes, timeframes, countries and (sometimes overlapping) networks?

In the future, Witness will likely spend more energy tracking the connections that form within and among networks. The Witness team is currently working through the process of adding new categories to the current Performance Dashboard.

The dashboard offers a great model for other media projects. But it's also clear that projects without similar, specific advocacy goals will likely have a harder time making use of the tool. Outlets and creators with more neutral goals of spurring discussion or raising awareness may have to turn to some of the existing impact assessment toolkits -- or perhaps even develop their own.

Katie Donnelly is Associate Research Director at the Center for Social Media at American University where she blogs about the future of public media. With a background in media literacy education, Katie previously worked as a Research Associate at Temple University's Media Education Lab in Philadelphia. When she's not researching media, Katie spends her time working in the environmental field and blogging about food.


March 04 2010

19:11

A “reporting recipe” to dig up dirt like ProPublica

A core goal of nonprofit news organizations is to create impact. Foundations and donors expect evidence of journalism’s impact in a way that the local department store never did. Jack Shafer wrote a scathing critique of impact-driven nonprofit journalism not long ago, arguing that for-profit media is better insulated against donor whims because the audience is the client:

Nonprofit outlets almost always measure their success in terms of influence, not audience, because their customers are the donors who’ve donated cash to influence politics, promote justice, or otherwise build a better world.

(His view of the nonprofit drive to change the world is more jaded than mine. What for-profit newspaper writer got into the business not to change the world?)

Whatever your stance, the reality is here: Maximizing impact is a key part of nonprofits’ aims. Outlets like the Center for Public Integrity, The American Independent News Network (where I edited The Washington Independent), ProPublica, and others measure the reach and impact of their work to drum up support. They do this in a number of ways. The Center for Public Integrity publishes its work under a Creative Commons license to encourage other publications to reprint it and amplify the message. (They also participate in a young, and struggling, AP-nonprofit distribution program.) The Washington Independent tracks both media pickup and how its work resulted in real change. ProPublica partners with newspapers around the country in printing its stories — all of which is aimed at maximizing their journalism’s impact.

Today ProPublica is unveiling a new approach to increasing impact: a step-by-step reporting guide that shows how its reporters executed a major investigation, with the hope that state-based reporters and interested citizen journalists will continue their work.

Reporters Charles Ornstein and Tracy Weber have created a guide that reverse-engineers how they reported a year-and-a-half-long investigation into how states handle disciplinary action against nurses. The results of their work were alarming, and the consequences were swift: One day after the Los Angeles Times ran a story on how the state nursing board took years to pursue disciplinary action, allowing dangerous nurses to keep working, Gov. Arnold Schwarzenegger removed most members of the board. Newspapers in several other states have picked up on ProPublica’s work and run their own versions.

It took Ornstein and Weber over a year to research their series, but by making the state-based data available and building a guide on how to do the reporting, they say it should be much simpler and less time-consuming for another reporter to follow in their footsteps. The data alone should at least help “find the smoke” in federal reporting-requirement lapses quickly, so a reporter knows where to invest her time, Weber said. They’re both eager to talk to interested reporters, too. (You can contact them directly, or join in on a conference call that will be scheduled soon.)

“When you called all these different states, you realized you were talking to folks who had never talked to journalists before,” Weber told me. “It made me think it was so ripe for local reporters to take a look at this because, frankly, everyone is touched by a nurse.” The guide lays out seven broad steps for reporting out a regulatory board story, with details under each section, including relevant federal law. It also includes relevant links for certain states.

She added the guide’s methods could be applied to any regulatory board, not just those that govern nurses. “This gives them a map to say, ‘Okay, let’s go take a look at this’…They could maybe change the way these boards are overseen in their state if they find, for instance, they never disciplined anyone, which we found, and that just seems impossible.”

Ornstein told me he hopes this experiment, specifically pointing to the online database of disclosure data, creates real change — and impact. “If somebody has to pay $20 to get a copy of a disciplinary order against a nurse, if they’re looking for a home health nurse, is that something they’re really going to do? Is the state really helping them make a smart choice to protect them? I don’t think so. By pointing this out, we’re really doing a service.”

March 03 2010

15:00

Huffington Post outsources section to online fundraising organization

In October, The Huffington Post launched a new section with an unusual goal: turning an audience of passive readers into activists for good causes. The section’s underlying business model is novel, too: All of its content is outsourced to a for-profit company that has nonprofits for clients.

In exchange for that content, HuffPo shares the advertising and sponsorship revenue the section generates with the outside company, Causecast. And Causecast gets a platform to promote its services and the nonprofits it chooses to highlight, some of which are its partner organizations.

The arrangement emerges as news organizations struggle to make display advertising alone a viable business model. The HuffPo-Causecast deal, in conjunction with ads, could be an example of the kind of hybrid solution publishers are searching for. However, by blurring the line between advertising and content, it also raises questions about conflicts of interest and editorial responsibility.

A platform to encourage giving

I first noticed the section — Impact — a few months ago, with its hot-pink branding and tagline “in partnership with Causecast.” There’s no further explanation of the relationship between the two organizations on the page; you have to browse away to Causecast’s site to learn that it provides nonprofits with online and mobile fundraising tools. Causecast’s site uses social networking to encourage users to become fans of nonprofits and then donate to them, using a single login and donation platform. About 60 nonprofits, ranging from local homeless shelters to national organizations like Planned Parenthood, are listed as affiliates. Causecast offers nonprofits a menu of services, some of them free, like getting a fan page on Causecast’s site, and others for a price, including technical support for mobile device fundraising. Causecast declined to say how many nonprofits are paying clients.

When I talked to the Impact section’s editor, Jonathan Daniel Harris, I was surprised to learn that — despite having a bio and byline like other Huffington Post editors — he is not a HuffPo employee. He is paid by Causecast and works out of their Santa Monica offices. As part of the arrangement with the Huffington Post, Harris oversees two other writers, who are also Causecast employees, in producing the site’s content, which includes short original stories and aggregation from around the web. The stories and curated links are generally about a social cause or a person in need; the earthquake in Haiti, for example, dominated the section for weeks this winter. But other causes, like malaria or homelessness — many of the same problems Causecast’s partner nonprofits aim to solve — are also featured.

At the end of some of the original posts, which look like other Huffington Post content, readers get a chance to donate money to a nonprofit. Often, the nonprofit highlighted is a Causecast-affiliated organization, and the link will take the user to a Causecast-facilitated donation page. Causecast says it does not take a cut of any of the donations. Donations are filtered through Causecast’s nonprofit arm, and the money — about $200,000 so far — goes directly to the organizations.

When I asked Brian Sirgutz, Causecast’s president, if a Causecast client could pay for a link or a story on the Impact page, a spokeswoman for the organization responded in an email that they could not. I also asked if Causecast clients get any priority in the editorial process when determining what nonprofits to feature. I was told “no.”

Multilayered relationships

But that doesn’t mean Causecast isn’t writing about or linking to affiliated organizations. Here’s an example: On Jan. 31, Harris wrote a 76-word post titled, “Malaria Is The Cause of 2010, Declares Matthew Bishop and Malaria No More.” The quick post notes that the nonprofit group Malaria No More expects the World Cup in South Africa to draw attention to the disease. Underneath the post, a box features a link to donate money to Malaria No More, using Causecast’s donation tool. Harris doesn’t mention in the post that Malaria No More is a member organization of his employer, or that Causecast ran Malaria No More’s mobile fundraising campaign. Causecast lists the campaign as a case study for its text2give services.

Causecast has also linked to and promoted AARP’s project Create the Good. AARP contracted with Causecast to develop the concept and execute the site, which helps would-be volunteers find places in their community to pitch in. Create the Good was an early advertiser on the Impact section, noted by Arianna Huffington in her post announcing the new site. (Huffington didn’t note a relationship between Create the Good and Causecast in her post.) Including Huffington’s post, the Impact section has tagged seven posts with a “Create the Good” tag. None of the posts mention that Causecast was paid to create the site.

The Impact site has also run fundraising events. In the 12 days leading up to Christmas, the site ran a series of stories (about 1,000 words each) “highlighting Americans who have persevered to overcome incredible challenges and the nonprofits that helped change their lives.” I looked up some of the nonprofits readers were encouraged to support. Most are listed as partner organizations on Causecast’s website; some are not. Neither distinction was noted in the stories.

The same series also ran a disclaimer at the end of some of the profiles unlike anything I’ve seen in journalism: “Causecast Corporation and The Huffington Post make no representations or warranties as to the legitimacy of this person’s story, need for assistance, or the amount of any medical or other bills, if any, owed by this individual.” The Huffington Post and Causecast gave me statements noting they run the disclaimer when they ask readers to donate to an individual, rather than a vetted group with IRS nonprofit status.

I asked Harris about the editorial relationship between the two groups. He explained that the Huffington Post “pretty much gave up complete control of a section to another company.” But, he noted, he’s in regular touch with senior editors: “It’s not like we can do whatever we want.”

A joint arrangement

In an email response to questions, the Huffington Post explained that Causecast’s values are in alignment with its own and that the editorial process is similar to other sections on the site. “Impact editors receive this guidance jointly from senior editors at both HuffPost and Causecast. There is an ongoing back and forth between the HuffPost and Causecast teams.”

Sirgutz described the relationship as a service: Causecast takes care of a project that Huffington Post wants, but would not otherwise invest in. “This market is not exactly something where a big media company is going to say, ‘we want to spend resources and time and money to be able to develop this type of content or service for our readership,’ because it isn’t going to exactly blow off the charts on the profit margins or traffic,” he told me. “So, what we’re able to do was to bring our expertise, because this was our field, we were able to provide that service to the Huffington Post and come up with an arrangement where they don’t have to spend any money to cover this type of content or on providing the direct ability for their readership to take action.”

Both Harris and Sirgutz are hopeful about the future of the partnership with the Huffington Post, and for these kinds of partnerships more broadly. Both pointed to additional corporate sponsorships as an added revenue stream. AARP, for instance, sponsored the Impact site for six weeks, buying up all ads on the page. I asked Harris how he thought the project had gone and where he thought it was headed. “It’s been successful so far,” he said, “and if it can continue to grow and sponsors are interested in paying us, that is kind of proof of concept right there.”
