
August 06 2012

07:38

A case study in online journalism part 2: verification, SEO and collaboration (investigating the Olympic torch relay)

corporate Olympic torchbearers image

Having outlined some of the data journalism processes involved in the Olympic torch relay investigation, in part 2 I want to touch on how verification and ‘passive aggressive newsgathering’ played a role.

Verification: who’s who

Data in this story not only provided leads which needed verifying, but also helped verify leads from outside the data.

In one example, an anonymous tip-off suggested that both children of one particular executive were carrying the Olympic torch on different legs of the relay. A quick check against his name in the data suggested this was so: two girls with the same unusual surname were indeed carrying the torch. Neither mentioned the company or their father. But how could we confirm it?

The answer involved checking planning applications, Google Streetview, and a number of other sources, including newsletters from the private school that they both attended which identified the father.

In another example, I noticed that one torchbearer had mentioned running alongside two employees of Aggreko, who were paying for their torches. I searched for other employees, and found a cake shop which had created a celebratory cake for three of them. Having seen how some corporate sponsors used their places, I went on a hunch and looked up the board of directors, searching in the data first for the CEO Rupert Soames. His name turned up – with no nomination story. A search for other directors found that more than half the executive board were carrying torches – which turned out to be our story. The final step: a call to the company to get a reaction and confirmation.

The more we knew about how torch relay places had been used, the easier it was to verify other torchbearers. As a pattern emerged of many coming from the telecoms industry, that helped focus the search – but we had to be aware that having suspicions ‘confirmed’ didn’t mean the name itself was confirmed – simply that we were more likely to hit a match we could verify.
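That matching-with-caution step can be sketched in a few lines of Python. This is purely illustrative: the names, the similarity threshold, and the helper functions are invented here, not part of the actual investigation, which relied on manual checking against multiple sources.

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Rough similarity score between two names, from 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def candidate_matches(target: str, torchbearers: list[str],
                      threshold: float = 0.85) -> list[str]:
    """Return torchbearer names similar enough to be worth checking by hand.
    A hit is only a lead, never a confirmation."""
    return [n for n in torchbearers
            if name_similarity(target, n) >= threshold]

# Hypothetical data for illustration only
bearers = ["Jane Smithson", "J. Smithson", "John Smythe", "Amira Khan"]
matches = candidate_matches("Jane Smithson", bearers)
```

Note that “J. Smithson” scores below the threshold here: a fuzzy match only narrows the pool, which is exactly why the scepticism described below still matters.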

Scepticism was important: at various times names seemed to match with individuals but you had to ask ‘Would that person not use his title? Why would he be nominated? Would he be that age now?’

Images helped – sometimes people used the same image that had been used elsewhere (you could match this with Google Images ‘match image’ feature, then refine the search). At other times you could match with public photos of the person as they carried the torch.
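Google’s ‘match image’ feature is a black box, but the underlying idea – flagging near-duplicate images – can be sketched with a perceptual hash. The toy 4×4 ‘images’ below are invented grids of grayscale values; real work would use an image library and actual photographs, and Google’s own matching is far more sophisticated.

```python
def average_hash(pixels):
    """Perceptual hash of a small grayscale image (rows of 0-255 ints):
    each pixel becomes 1 if brighter than the image's mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# Toy 'images': img_b is a slightly brightened copy of img_a,
# img_c is an unrelated pattern (all values hypothetical).
img_a = [[10, 200, 10, 200]] * 4
img_b = [[20, 210, 20, 210]] * 4
img_c = [[200, 10, 200, 10], [10, 200, 10, 200]] * 2

dist_copy = hamming(average_hash(img_a), average_hash(img_b))
dist_other = hamming(average_hash(img_a), average_hash(img_c))
```

The brightened copy hashes identically, while the unrelated pattern differs in most bits – which is why the technique survives re-uploads and recompression.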

This post on identifying mystery torchbearers gives more detail.

Passive aggressive newsgathering

Alerts proved key to the investigation. Early on I signed up for daily alerts on any mention of the Olympic torch. 95% of stories were formulaic ‘local town/school/hero excited about torch’ reports, but occasionally key details would emerge in other pieces – particularly those from news organisations overseas.

Google Alerts for Olympic torch

It was from these that I learned exactly how many places Dow, Omega, Visa and others had, and how many were nominated. It was how I learned about torchbearers who were not even listed on the official site, about the ‘criteria’ that some organisations were supposed to adhere to, about public announcements of places which suggested a change from previous numbers, and more besides.

As I came across anything that looked interesting, I bookmarked and tagged it. Some items were useful immediately, but most only became useful later, when I came to write up the full story. Essentially, they were pieces of a jigsaw I was yet to put together. (For example, this report mentioned that 2,500 employees were nominated within Dow for just 10 places. How must those employees feel on finding that the company’s VP of Olympic operations took up one of the few places? He also fitted a broader pattern of sponsorship managers carrying the torch.)

I also subscribed to any mention of the torch relay in Parliament, and any mention in FOI requests.
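The triage described above – skimming hundreds of formulaic alerts for the rare interesting item – can be sketched as a simple keyword filter. The item titles and the list of formulaic words are invented for illustration; in practice this judgement was made by eye.

```python
# Words that tend to mark the formulaic 'local hero excited about
# torch' story (an assumed list, for illustration only).
FORMULAIC = ("excited", "proud", "honour", "hero", "school", "community")

def worth_reading(title: str) -> bool:
    """Keep alert items whose title avoids the formulaic vocabulary."""
    t = title.lower()
    return not any(word in t for word in FORMULAIC)

items = [
    "Local school hero excited to carry Olympic torch",
    "Sponsor discloses exact number of torch relay places",
    "Town proud of community torchbearer",
]
interesting = [t for t in items if worth_reading(t)]
```

A crude filter like this inverts the 95/5 ratio: instead of reading everything, you read only what fails to match the template.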

SEO – making yourself findable

One of the things I always emphasise to my students is the importance of publishing early and often on a subject, to maximise the opportunities for others in the field to find out – and get in touch. This story was no exception. From the earliest stages through to the last week of the relay, users stumbled across the site as they looked for information on the relay – and passed on their concerns and leads.

It was particularly important with a big public event like the Olympic torch relay, which generated a lot of interest among local people. In the first week of the investigation one photographer stumbled across the site because he was searching for the name of one of the torchbearers we had identified as coming from adidas. He passed on his photographs – but more importantly, made me aware that there might be photographs of other executives who had already carried the torch.

That led to the strongest image of the investigation – two executives exchanging a ‘torch kiss’ (shown at the top of this post) – which was in turn picked up by The Daily Mail.

Other leads kept coming. The tip-off about the executive’s daughters mentioned above; someone mentioning two more Aggreko directors – one of whom had never been published on the official site, while the other had been listed and then removed. Questions about a Polish torchbearer who was not listed on the official site or, indeed, anywhere on the web other than the BBC’s torch relay liveblog. Challenges to one story we linkblogged, which led to further background that helped flesh out the processes behind the nominations given to universities.

When we published the ‘mystery torchbearers’ with The Guardian some got in touch to tell us who they were. In one case, that contact led to an interview which closed the book: Geoff Holt, the first quadriplegic to sail single-handed across the Atlantic Ocean.

Collaboration

I could have done this story the old-fashioned way: kept it to myself, done all the digging alone, and published one big story at the end.

It wouldn’t have been half as good. It wouldn’t have had the impact, it wouldn’t have had the range, and it would have missed key ingredients.

Collaboration was at the heart of this process. As soon as I started to unearth the adidas torchbearers I got in touch with The Guardian’s James Ball. His report the week after added reactions from some of the companies involved, and other torchbearers we’d simultaneously spotted. But James also noticed that one of Coca Cola’s torchbearers was a woman “who among other roles sits on a committee of the US’s Food and Drug Administration”.

It was collaborating with contacts in Staffordshire which helped point me to the ‘torch kiss’ image. They in turn followed up the story behind it (a credit for Help Me Investigate was taken out of the piece – it seems old habits die hard), and The Daily Mail followed up on that to get some further reaction and response (and no, they didn’t credit the Stoke Sentinel either). In Bournemouth and Sussex local journalists took up the baton (sorry), and the Times Higher did their angle.

We passed on leads to Ventnor Blog, whose users helped dig into a curious torchbearer running through the area. And we published a list of torchbearers missing stories in The Guardian, where users helped identify them.

Collaborating with an international mailing list for investigative journalists, I generated datasets of local torchbearers in Hungary, Italy, India, the Middle East, Germany, and Romania. German daily newspaper Der Tagesspiegel got in touch and helped trace some of the Germans.

And of course, within the Help Me Investigate network people were identifying mystery torchbearers, getting responses from sponsors, visualising data, and chasing interviews. One contributor in particular – Carol Miers – came on board halfway through and contributed some of the key elements of the final longform report – in particular the interview that opens the book, which I’ll talk about in the final part tomorrow.

April 27 2012

21:30

Middle East coverage is full of lies

Foreign Policy :: It has not been a banner week for media coverage of the Arab world. Blame it on journalists unfamiliar with their subject matter, the demands of an ever-quicker news cycle, or simply salacious stories that were "too good to check" -- a number of stories that have made it into major media outlets recently are simply not true, or omit essential details of the tale.

HT: Mark Little, Storyful here:

Middle East Coverage is Full of Lies - Foreign Policy blog.foreignpolicy.com/posts/2012/04/…

— mark little (@marklittlenews) April 27, 2012

Continue to read David Kenner, blog.foreignpolicy.com

21:18

How to verify what’s true? - Storyful’s verification process

Storyful :: British writer Lynn Barber has scalped some formidable egos in her time. Laced with acerbic scrutiny, her interviews have become renowned in the British Isles. Her secret? Approach. She confessed in a 2010 interview to beginning interviews by adopting a natural dislike of her subject: the conversation is their opportunity to win her over.

At Storyful, we interrogate content shared on the social web in a style not dissimilar to Barber’s grilling of dignitaries of media and politics. We adopt a natural skepticism to every item of content we discover.

Verification is a cornerstone of our work and it has to be. Information and content often spreads across social media in ‘Chinese whispers’ fashion. Videos and images are spliced, diced and re-posted. Context and details change, agendas compete. Falsehoods and fabrications are deliberately issued.

Storyful's verification process, examples - Continue to read Malachy Browne, blog.storyful.com

Listen to the podcast produced by journalism.co.uk: at 5:38 Malachy Browne, Storyful’s news editor, gives some examples of how these verification processes have been applied in his daily news reporting.

Tags: verification
21:14

How the social media community can shape the news agenda [audio]

journalism.co.uk | Soundcloud :: In this week's Journalism.co.uk podcast Sarah Marshall looks at how the social media community can set the news agenda, participate in the story, and how news organisations are using the passion and enthusiasm of the audience to shape output.

This podcast hears from: Malika Bilal, digital producer and co-host of The Stream; Jason Mills, editor of web development at ITV News; and Luke Lewis, editor of NME.com.

journalism.co.uk podcasts - journalismnews, soundcloud.com

Tags: verification

April 20 2012

06:02

New Crowdsourcing, Curation and Liveblogging Training

Hi all! I’ve been traveling a lot for Digital First lately to spread the gospel of social media to my colleagues. So, if you’ve seen my presentations before, you’d know that I make very wordy Powerpoints so that people who weren’t there to see me prattle on about my favorite things can still follow what we went [...]

January 06 2012

15:30

This Week in Review: Lessons from Murdoch on Twitter, and paywalls’ role in 2011-12

Murdoch, Twitter, and identity: News Corp.’s Rupert Murdoch had a pretty horrible 2011, but he ended it with a curious decision, joining Twitter on New Year’s Eve. The account was quickly verified and introduced as real by Twitter chairman Jack Dorsey, dousing some of the skepticism about its legitimacy. His Twitter stream so far has consisted of a strange mix of News Corp. promotion and seemingly unfiltered personal opinions: He voiced his support for presidential candidate Rick Santorum (a former paid analyst for News Corp.’s Fox News) and ripped former Fox News host Glenn Beck.

But the biggest development in Murdoch’s Twitter immersion was about his wife, Wendi Deng, who appeared to join Twitter a day after he did and was also quickly verified as legitimate by Twitter. (The account even urged Murdoch to delete a tweet, which he did.) As it turned out, though, the account was not actually Deng, but a fake run by a British man. He said Twitter verified the account without contacting him.

This, understandably, raised a few questions about the reliability of identity online: If we couldn’t trust Twitter to tell us who on its service was who they said they were, the issue of online identity was about to become even more thorny. GigaOM’s Mathew Ingram chastised Twitter for its lack of transparency about the process, and The Washington Post’s Erik Wemple urged Twitter to get out of the verification business altogether: “The notion of a central authority — the Twitterburo, so to speak — sitting in judgment of authentic identities grinds against the identity of Twitter to begin with.” (Twitter has begun phasing out verification, limiting it to a case-by-case basis.)

Eric Deggans of the Tampa Bay Times argued that the whole episode proved that regardless of what Twitter chooses to do, “the Internet is always the ultimate verification system for much of what appears on it.” Kara Swisher of All Things Digital unearthed the problem in this particular case that led to the faulty verification: A punctuation mixup in communication with Deng’s assistant.

Columbia’s Emily Bell drew a valuable lesson from the Rupert-joins-Twitter episode: As they wade into the social web, news organizations, she argued, need to do some serious thinking about how much control they’re giving up to third-party groups who may not have journalism among their primary interests. Elsewhere in Twitter, NPR Twitter savant Andy Carvin and NYU prof Clay Shirky spent an hour on WBUR’s On Point discussing Twitter’s impact on the world.

Trend-spotting for 2011 and 2012: I caught the front end of year-in-review season in my last review before the holidays, after the Lab’s deluge of 2012 predictions. But 2011 reviews and 2012 previews kept rolling in over the past two weeks, giving us a pretty thoroughly drawn picture of the year that was and the year to come. We’ll start with 2011.

Nielsen released its list of the most-visited sites and most-used devices of the year, with familiar names — Google, Facebook, Apple, YouTube — at the top. And Pew tallied the most-talked-about subjects on social media: Osama bin Laden on Facebook and Egypt’s Hosni Mubarak on Twitter topped the lists, and Pew noted that many of the top topics were oriented around specific people and led by the traditional media.

The Next Web’s Anna Heim and Mashable’s Meghan Peters reviewed the year in digital media trends, touching on social sharing, personal branding, paywalls, and longform sharing, among other ideas. At PBS MediaShift, Jeff Hermes and Andy Sellars authored one of the most interesting and informative year-end media reviews, looking at an eventful year in media law. As media analyst Alan Mutter pointed out, though, 2011 wasn’t so great for newspapers: Their shares dropped 27 percent on the year.

One of the flashpoints in this discussion of 2011 was the role of paywalls in the development of news last year: Mashable’s Peters called it “the year the paywall worked,” and J-Source’s Belinda Alzner said the initial signs of success for paywalls are great news for the financial future of serious journalism. Mathew Ingram of GigaOM pushed back against those assertions, arguing that paywalls are only working in specific situations, and media prof Clay Shirky reflected on the ways paywalls are leading news orgs to focus on their most dedicated users, which may not necessarily be a bad thing. “The most promising experiment in user support means forgoing mass in favor of passion; this may be the year where we see how papers figure out how to reward the people most committed to their long-term survival,” he wrote.

Which leads us to 2012, and sets of media/tech predictions from the Guardian’s Dan Gillmor, j-prof Alfred Hermida, Mediaite’s Rachel Sklar, Poynter’s Jeff Sonderman, and Sulia’s Joshua Young. Sklar and Sonderman both asserted that news is going to move the needle online (especially on Facebook, according to Sonderman), and while Hermida said social media is going to start to just become part of the background, he argued that that’s a good thing — we’re going to start to find the really interesting uses for it, as Gillmor also said. J-prof Adam Glenn also chimed in at PBS MediaShift with his review of six trends in journalism education, including journo-programming and increased involvement in community news.

SOPA’s generation gap: The debate over Internet censorship and SOPA will continue unabated into the new year, and we’re continuing to see groups standing up for and against the bill, with the Online News Association and dozens of major Internet companies voicing their opposition. One web company that notoriously came out in favor of the bill, GoDaddy, faced the wrath of the rest of the web, with some 37,000 domains being pulled in two days. The web hosting company quickly pulled its support for SOPA, though it isn’t opposing the bill, either.

New York Times media critic David Carr also made the case against the bill, noting that it’s gaining support because many members of Congress are on the other side of a cultural/generational divide from those on the web. He quoted Kickstarter co-founder Yancey Strickler: “It’s people who grew up on the Web versus people who still don’t use it. In Washington, they simply don’t see the way that the Web has completely reconfigured society across classes, education and race. The Internet isn’t real to them yet.”

Forbes’ Paul Tassi wrote about the fact that many major traditional media companies have slyly promoted some forms of piracy over the past decade, and GigaOM’s Derrick Harris highlighted an idea to have those companies put some of their own money into piracy enforcement.

Tough times for the Times: It’s been a rough couple of weeks for The New York Times: Hundreds of staffers signed an open letter to Publisher Arthur Sulzberger Jr. expressing their frustration over various compensation and benefits issues. The Huffington Post’s Michael Calderone reported that the staffers’ union had also considered storming Sulzberger’s office or walking out, and Politico’s Dylan Byers noted that the signers covered a broad swath of the Times’ newsroom, cutting across generational lines.

The Atlantic’s Adam Clark Estes gave some of the details behind the union’s concerns about the inequity of the paper’s buyouts. But media consultant Terry Heaton didn’t have much sympathy: He said the union’s pleas represented an outmoded faith in the collective, and that Times staffers need to take more of an everyone-for-themselves approach.

The Times also announced it would sell its 16 regional newspapers for $143 million to Halifax Media Group, a deal that had been rumored for a week or two, and told Jim Romenesko it would drop most of its podcasts this year. To make matters worse, the paper mistakenly sent an email to more than 8 million followers telling them their print subscriptions had been canceled.

Reading roundup: Here’s what else you might have missed over the holidays:

— A few thoughtful postscripts in the debate over PolitiFact and fact-checking operations: Slate’s Dave Weigel and Forbes’ John McQuaid dissected PolitiFact’s defense, and Poynter’s Craig Silverman offered some ideas for improving fact-checking from a recent roundtable. And Greg Marx of the Columbia Journalism Review argued that fact-checkers are over-reaching beyond the bounds of the bold language they use.

— A couple of good pieces on tech and the culture of dissent from Wired: A Sean Captain feature on the efforts to meet the social information needs of the Occupy movement, and the second part of Quinn Norton’s series going inside Anonymous.

— For Wikipedia watchers, a good look at where the site is now and how it’s trying to survive and thrive from The American Prospect.

— Finally, a deep thought about journalism for this weekend: Researcher Nick Diakopoulos’ post reconceiving journalism in terms of information science.

Crystal ball photo by Melanie Cook used under a Creative Commons license.

July 29 2011

19:49

Better to be first or right? - A false choice and an excuse

The Buttry Diary :: An editor asks by email a question Steve Buttry hears often as journalists address the challenges of digital journalism: "Is it better to be first, or be right?" Three times recently, the editor said, his staff was beaten (not on breaking news), but the competition had major errors in its reports. "When we published, we got the stories right, though, again, not first," the editor said. "I regard this as a false choice," writes Steve Buttry.

[Steve Buttry:] I believe accuracy and verification become more important in digital journalism than in print journalism. The daily deadlines of print usually give you hours to nail down the facts before you have to publish. The constant deadlines of digital publishing mean that you publish when you have the facts verified.

Better to be first or right?

Continue to read Steve Buttry, stevebuttry.wordpress.com

July 09 2011

07:43

Check before you publish - Breaking "news" on social media still needs ... verification

Poynter :: A July 4 fireworks show in downtown Philadelphia was marred by a shooting that sent crowds scattering, according to hundreds of tweets sent that night. But when reporters at the Philadelphia Daily News followed up with police, wrote Philly.com staffer Daniel Victor, they were told there had been no such shooting.

News organizations are learning that they can use social media to get the first, and firsthand, accounts of breaking news. But they face the challenge of verifying that information and deciding whether to publish those accounts.

Continue to read Jeff Sonderman, www.poynter.org

July 08 2011

09:31

Has News International really registered TheSunOnSunday.com?

A number of news outlets – including the BBC, Guardian and Channel 4 News – mentioned yesterday in their coverage of the closure of the News Of The World that TheSunOnSunday.com had been registered just two days ago. (It was also mentioned by Hugh Grant on last night’s Question Time.)

It’s a convenient piece of information for a conspiracy theory – but a little bit of digging suggests it’s unlikely to have been registered by News International as part of some grand plan.

When I tweeted the claim yesterday two people immediately pointed out key bits of contextual information from the WHOIS records:

Firstly, it is unlikely that News International would use 123-reg to register a domain name. As @bigdaddymerk noted, News International "use http://bit.ly/cWSHia for their .coms and have their own IPS tag for .co.uk".

Murray Dick added that it would “be odd for big corporation to withhold info on whois record”

And – not that this is a big issue given recent events – according to @bigdaddymerk “in the case of the .co.uk registering as a UK individual would be whois abuse.”

You might argue that all of the above could be explained by News International covering their tracks – but if they were covering their tracks, it’s unlikely they’d do it like this.

Anyway, digging further into the timeline of the ‘Sunday Sun’ casts further doubt on any conspiracy connected to News Of The World.

For example, it was reported over a week ago that The Sun was moving to 7-day production (thanks to Roo Reynolds, again on Twitter).

Between that announcement and the registration of TheSunOnSunday.com, anyone with a habit of domain squatting could have grabbed the domain in the hope that it would become valuable in the future.

Either way, even if it has been registered by someone at News International, the timings just don’t add up to a News Of The World-related conspiracy.

So, as I wrote yesterday, a ‘Sunday Sun’ is not a rebranding of News Of The World. News International has just closed the world’s biggest-selling English-language newspaper – its most profitable tabloid – and made 200 people redundant.

June 25 2011

06:13

The problem of real-time verification - SwiftRiver: add context to content

Niemanlab :: One of the biggest challenges news organizations face is the real-time aspect of newsgathering: the massive problem that is making sense of the torrent of information that floods in when breaking-news events take place. How do you process, and then verify, hundreds and often thousands and sometimes millions of information pieces (SMS, photos, tweets and more discrete data points), even before those points transform into something that resembles useful information?

[Jon Gosier] .. it is all about adding context to content

SwiftRiver Dataflow 1 from Ushahidi on Vimeo.

A team at Ushahidi has been working on that problem since early 2010, developing a way to help users manage large quantities of data more efficiently — and to help verify and prioritize information as it flows in from the crowd.
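The “context to content” idea can be sketched as a toy priority score. This is an illustration, not SwiftRiver’s actual model: each crowd report is weighted by its source’s track record and by how much independent corroboration the claim has, and the weights (0.6/0.4, cap of 5 corroborating reports) are arbitrary.

```python
from collections import Counter

def priority(report, source_accuracy, corroborations):
    """Toy score: weight a crowd report by how often its source has been
    right before and by how many reports agree with its claim.
    Weights are illustrative, not SwiftRiver's actual model."""
    history = source_accuracy.get(report["source"], 0.5)   # unknown source -> neutral
    support = min(corroborations.get(report["claim"], 0), 5) / 5  # cap at 5 reports
    return round(0.6 * history + 0.4 * support, 2)

# Hypothetical incoming reports and source track records:
reports = [
    {"source": "@eyewitness1", "claim": "bridge closed"},
    {"source": "@newbie", "claim": "bridge closed"},
    {"source": "@rumour_mill", "claim": "aliens landed"},
]
accuracy = {"@eyewitness1": 0.9, "@rumour_mill": 0.2}
claims = Counter(r["claim"] for r in reports)

# Surface the most credible, best-corroborated reports first:
for r in sorted(reports, key=lambda r: -priority(r, accuracy, claims)):
    print(r["claim"], r["source"], priority(r, accuracy, claims))
```

The point of the sketch is the ranking, not the numbers: a human still verifies, but triage decides what gets looked at first.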

Continue to read Megan Garber, www.niemanlab.org

May 26 2011

18:00

Sarah Palin’s 2009 “death panel” claims: How the media handled them, and why that matters

Editor’s Note: This article was originally published on the journalism-and-policy blog Lippmann Would Roll. Written by Matthew L. Schafer, the piece is a distillation of an academic study by Schafer and Dr. Regina G. Lawrence, the Kevin P. Reilly Sr. chair of LSU’s Manship School of Mass Communication. They have kindly given us permission to republish the piece here.

It’s been almost two years now since Sarah Palin published to Facebook a post about “death panels.” In a study to be presented this week at the 61st Annual International Communications Association Conference, we analyzed over 700 stories placed in the top 50 newspapers around the country.

“The America I know and love is not one in which my parents or my baby with Down Syndrome will have to stand in front of Obama’s ‘death panel’ so his bureaucrats can decide…whether they are worthy of health care,” Palin wrote at the time.

Only three days later, PolitiFact, an arm of the St. Petersburg Times, published its appraisal of Palin’s comment, stating, “We agree with Palin that such a system would be evil. But it’s definitely not what President Barack Obama or any other Democrat has proposed.”

FactCheck.org, a project of the Annenberg Public Policy Center, also debunked the claim, and PolitiFact users later voted the death panel claim to the top spot of PolitiFact’s Lie of the Year ballot.

Despite this initial dismissal of the claim by non-partisan fact checkers, a cursory Google search turns up 1,410,000 results, showing just how powerful social media is in a fractured media climate.

Yet, the death panel claim — as we’re sure many will remember — lived not only online, but also in the newspapers, and on cable and network television. In the current study, which ran from August 8 (the day after Palin made the claim) to September 13 (the day of the last national poll about death panels) the top 50 newspapers in the country published over 700 articles about the claims, while the nightly network news ran about 20 stories on the topic.

At the time, many commentators both in and outside of the industry offered their views on the media’s performance in debunking the death panel claim. Some lauded the media for coming out and debunking the claim, while others questioned whether it was the media’s “job” to debunk the myth at all.

“The crackling, often angry debate over health-care reform has severely tested the media’s ability to untangle a story of immense complexity,” Howard Kurtz, who was then at the Washington Post, said. “In many ways, news organizations have risen to the occasion….”

Yet, Media Matters was less impressed, at times pointing out, for example, that “the New York Times portrayed the [death panel] issue as a he said/she said debate, noting that health care reform supporters ‘deny’ this charge and call the claim ‘a myth.’ But the Times did not note, as its own reporters and columnists have previously, that such claims are indeed a myth…”

So, who was right? Did the media debunk the claim? And, if so, did they sway public opinion in the process?

Strong debunking, but confused readers

Our data indicate that the mainstream news, particularly newspapers, debunked death panels early, fairly often, and in a variety of ways, though some were more direct than others. Nevertheless, a significant portion of the public accepted the claim as true or, perhaps, as “true enough.”

Initially, we viewed the data from 30,000 feet, and found that about 40 percent of the time journalists would call the death panel claim false in their own voice – especially surprising considering many journalists’ own conception that they act as neutral arbiters.

For example, on August 9, 2009, Ceci Connolly of the Washington Post said, “There are no such ‘death panels’ mentioned in any of the House bills.”

“[The death panel] charge, which has been widely disseminated, has no basis in any of the provisions of the legislative proposals under consideration,” The New York Times’ Helene Cooper wrote a few days after Connolly.

“The White House is letting Congress come up with the bill and that vacuum of information is getting filled by misinformation, such as those death panels,” Anne Thompson of NBC News said on August 11.

Nonetheless, in more than 60 percent of the cases it’s obvious that newspapers abstained from calling the death panels claim false. (We also looked at hundreds of editorials and letters to the editor, and it’s worth noting that almost 60 percent of those debunked the claim, while the rest abstained from debunking and just about 2 percent supported the claim.)

Additionally, among journalists who did debunk the claim, almost 75 percent of their articles contained no explanation of why they were labeling the claim false. Indeed, it was very much a “You either believe me, or you don’t” situation without contextual support.

As shown below, whether or not journalists debunked the claim, they often approached the controversy by quoting one side of the debate, quoting the other, and then letting the reader dissect the validity of each side’s stance. Thus, in 30 percent of cases where journalists reported in their own words that the claim was false, they nonetheless included each side’s arguments as to why it was right. This often just confuses the reader.

This chart shows that whether journalists abstained from debunking the death panels claim or not, they still proceeded to give equal time to each side’s supporters.

Most important is the light that this study sheds on the age-old debate over the practical limitations surrounding objectivity. Indeed, questions are continually raised about whether journalists can be objective. Most recently, this led to a controversy at TechCrunch where founder Michael Arrington was left defending his disclosure policy.

“But the really important thing to remember, as a reader, is that there is no objectivity in journalism,” Arrington wrote to critics. “The guys that say they’re objective are just pretending.”

This view, however, is not entirely true. Indeed, in the study of death panels, we found two trends that could each fit under the broad banner of objectivity.

Objectivity: procedural and substantive

First, there is procedural objectivity — mentioned above — where journalists do their due diligence and quote competitors. Second, there is substantive objectivity where journalists actually go beyond reflexively reporting what key political actors say to engage in verifying the accuracy of those claims for their readers or viewers.

Of course, every journalist is — to some extent — influenced by their experiences, predilections, and political preferences, but these traits do not necessarily interfere with objectively reporting verifiable fact. Indeed, it seems that journalists could practice either form of objectivity without being biased. Nonetheless, questions and worries still abound.

“The fear seems to be that going deeper—checking out the facts behind the posturing and trying to sort out who’s right and who’s wrong—is somehow not ‘objective,’ not ‘straight down the middle,’” Rem Rieder of the American Journalism Review wrote in 2007.

Perhaps because of this, journalists in our sample attempted to practice both types of objectivity at the same time: one which, arguably, serves the public interest by presenting the facts of the matter, and one which allows the journalist a sliver of plausible deniability, because he follows the insular journalistic norm of presenting both sides of the debate.

As such, we question New York University educator and critic Jay Rosen, who has argued that “neutrality and objectivity carry no instructions for how to react” to the rise of false but popular claims. We contend that the story is more complicated: Mainstream journalists’ figurative instruction manual contains contradictory “rules” for arbitrating the legitimacy of claims.

These contradictory rules are no doubt supported by public opinion polls taken during the August and September healthcare debates. Indeed, one poll released August 20 reported that 30 percent believed that proposed health care legislation would “create death panels.” Belief in this extreme type of government rationing of health care remained impressively high (41 percent) into mid-September.

More troubling, one survey found that the percentage calling the claim true (39 percent) among those who said they were paying very close attention to the health care debate was significantly higher than among those reporting they were following the debate fairly closely (23 percent) or not too closely (18 percent).

Yet, of course, our data does not allow us to say that these numbers are a direct result of the mainstream media’s death panel coverage. Nonetheless, because mainstream media content still powers so many websites’ and news organizations’ content, perhaps this coverage did have an impact on public opinions to some indeterminable degree.

Conclusion

One way of looking at the resilience of the death panels claim is as evidence that the mainstream media’s role in contemporary political discourse has been attenuated. But another way of looking at the controversy suggests that the mainstream media themselves bore some responsibility for the claim’s persistence.

Palin’s Facebook post, which popularized the “death panel” catchphrase, said nothing about any specific legislative provision. News outlets and fact-checkers could examine the language of currently debated bills to debunk the claim — and many did, as our data demonstrate. Nevertheless, it appears the nebulous “death panel bomb” reached its target in part because the mainstream media so often repeated it.

Thus, the dilemma for reporters playing by the rules of procedural objectivity is that repeating a claim reinforces a sense of its validity — or at least, enshrines its place as an important topic of public debate. Moreover, there is no clear evidence that journalism can correct misinformation once it has been widely publicized. Indeed, it didn’t seem to correct the death panels misinformation in our study.

Yet, there is promise in substantive objectivity. Indeed, today more than ever journalists are having to act as curators. The only way that they can effectively do so is by critically examining the surplusage of social media messages, and debunking or refusing to reinforce those messages that are verifiably false. Indeed, as more politicians use the Internet to circumvent traditional media, this type of critical curation will become increasingly important.

This is — or should be — journalists’ new focus. Journalists should verify information. Moreover, they should do so without including quotations from those taking a stance that is demonstrably false. Otherwise they create a factual jigsaw puzzle that the reader must piece together: on the one hand, the journalist is calling the claim false, and on the other, he is giving column inches to someone who believes it’s true.

Putting aside the raucous debates about objectivity for a moment, it is clear that journalists in many circumstances can research and relay to their readers information about verifiable fact. If we don’t see a greater degree of this substantive objectivity, the public is left largely at the mercy of the savviest online communicator. Indeed, if journalists refuse to critically curate new media, they leave both the public and themselves worse off.

Image of Sarah Palin by Tom Prete used under a Creative Commons license.

May 18 2011

18:40

BBC Social Media Summit: Crowdsourcing a Research Agenda

The BBC College of Journalism is staging a Social Media Summit (hashtag #BBCSMS) in London this week, which will bring together industry leaders, practitioners and academics from around the world, with a view to collaboratively mapping the future of social journalism.

Social media is having a transformative impact on professional journalism. And the speed of the real-time revolution raises significant challenges and opportunities for journalists and their publishers. But it also necessitates a rigorous, industry-relevant academic research agenda.

The issues confronting journalism in the social media space include fundamental shifts in the practice of verification, the merger of private lives and professional practice, and the new journalistic role of community engagement.

These themes will be central to the summit, which will culminate in an open forum on Friday, featuring contributions from senior editors at The Guardian, Al Jazeera, NPR and The Washington Post.

Peter Horrocks, BBC head of Global News, said in February 2010 that social media practice for journalists was no longer discretionary. He was right. But this means that the professional training of journalists in social media theory and practice is also essential.

And, fundamental to teaching and training journalists in this new form of "social journalism," should be cutting-edge and industry-relevant academic research in the field of journalism studies.

A collaborative social media research agenda

One of the objectives of the BBC Social Media Summit, which has attracted industry leaders and academics from around the world, will be identifying key areas for research in the field which can assist journalists and media organizations as they adapt to the challenges and opportunities of the social media age.

The process of charting a course for research into journalism and social media at the summit will be collaborative, with researchers in the field (me included) seeking to coordinate an approach that draws on industry expertise and responds to needs identified by the journalists and editors in attendance. We believe journalism research should be informed by journalistic practice and have a professionally relevant purpose. We're also committed to feeding back our research findings -- in an accessible and easily digestible way -- to the broader community for input, in keeping with social media ethos and our belief in practically applicable research.

The Twitterization of Journalism

I'm writing a Ph.D. on the Twitterization of journalism, or the transformative impact of social media on the field. My research has so far highlighted the effect of engagement with sources and Jay Rosen's "people formerly known as the audience," and the ways in which professional practice is being reshaped through real-time reporting, increased transparency, and the conflation of private and professional lives in the space.

As I've identified in the course of this research project (some elements of which I've previously explored at MediaShift) there are many rich and important research questions emerging in the field -- almost at the speed of tweets!

Key Research Themes and Questions

Here are some of my contributions to framing a social media research agenda for journalism grouped under key themes I've identified in the process of academic and journalistic research in the field -- a process which has included social media crowdsourcing of responses.

VERIFICATION

• How is social media changing the practices and processes of verification?
• What new methods of verification are emerging? How effective are they?

• What is the impact of changing verification practices -- including crowdsourcing verification -- on accuracy in reporting and journalistic credibility?

CLASH OF THE PROFESSIONAL AND PRIVATE

• What is the impact (personally and professionally) of the merger of journalists' personal and private lives and their professional and public lives on social media sites?
• How do so-called audiences react to the blurring of personal and professional lives by journalists through their social media practice? What impact does it have on their views of journalists who use social media "socially?" Are they more or less likely to collaborate with such journalists?

ENGAGEMENT

• How do journalists' interactions with the "people formerly known as audience" impact their research, reporting, and commentary of issues (including framing, source selection, objectivity and verification)?
• What rules of engagement do journalists bring to social media interaction? With what success/effect?

CONFLICT AND COMPLAINTS

• What are journalists' experiences with being confronted with criticism about their work from colleagues, competitors and audiences on social media sites?
• What views have media organizations formed about the role of individual journalists in complaints handling via social media? What processes and guidelines are being, or need to be, developed?

INDUSTRIAL/LOGISTICAL ISSUES

• What are the impacts on journalists' workload, productivity and well-being of 24/7 real-time social media practice and engagement?
• What systems and procedures are media employers putting in place to address the issues of workload, time management and risks associated with social media?

NETWORKING, PROFESSIONAL DEVELOPMENT & GLOBALIZATION

• Explore the role and impact of cross-cultural and transnational communication via social media on journalists and their subjects
• Explore mentoring, networking, and employment patterns among professional journalists through social media

ROUNDS & BEATS

• Develop case studies of best-practice approaches to social media strategies in reporting rounds such as health, education, courts, emergencies, politics
• Explore the role of social media in public journalism projects

JOURNALISM EDUCATION

• How should social media be incorporated into university and professional training courses?
• Measure outcomes/impacts of training

TECHNOLOGY

• Explore cross-disciplinary approaches to problem-solving, involving computer scientists, journalists/journalism researchers (et al.) in development of industry-applicable resources and programs applicable to aiding reporting via social media, measuring social media impacts, verification, etc.
• Platform-specific research, e.g., How is Facebook changing journalism?

LEGAL/REGULATORY ISSUES

• How are courts and governments around the world responding to the challenges posed to publishing laws presented by real-time "masses media?"
• What are the implications for media freedom/freedom of expression of attempts to regulate the social web?

Share your ideas, help frame the research agenda

So, that's my contribution to framing the research discussion at the summit. But what ideas would you like to throw into the mix? And what research approaches would you suggest, with what estimated value? We are particularly interested in hearing from journalism professors and researchers in the field.

There are three ways you can get involved. 1) You can contribute your ideas directly by participating in the summit in London this week; or 2) you can contribute your ideas and express interest by commenting on this post; or 3) you can participate remotely in the open conference session on Friday, May 20, by contributing to the Twitter discussion curated under the #BBCSMS hashtag.

We look forward to hearing your ideas and working together to chart the future of journalism research in the field of social media.

Julie Posetti is an award-winning journalist and journalism academic who lectures in radio and television reporting at the University of Canberra, Australia. She's been a national political correspondent, a regional news editor, a TV documentary reporter and presenter on radio and television with the Australian national broadcaster, the ABC. Her academic research centers on journalism and social media, on talk radio, public broadcasting, political reporting and broadcast coverage of Muslims post-9/11. She blogs at J-Scribe and you can follow her on Twitter.


April 01 2011

20:07

The Charlie Sheen Twitter intern hoax – how it could be avoided

Hoax email Charlie Sheen

image from JonnyCampbell

Various parts of the media were hoaxed this week by Belfast student Jonny Campbell’s claim to have won a Twitter internship with Charlie Sheen. The hoax was well planned, and to be fair to the journalists, they did chase up documentation to confirm it. Where they made mistakes provides a good lesson in online verification.

Where did the journalists go wrong? They asked for the emails confirming the internship but accepted a screengrab, which turned out to be photoshopped.

They then asked for further emails from earlier in the process, and he sent those (which were genuine) on.

They should have asked the source to forward the original email.

Of course, he could have faked that pretty easily as well (I’m not going to say how here), so you would need to check the IP address of the email against that of the company it was supposed to be from.

An IP address is basically the network address of a computer (server). It may belong to the ISP you are using, or to the company which employs you and provides your computer and internet access.

This post explains how to find IP addresses in an email using email clients including Gmail, Yahoo! Mail and Outlook – and then how to track the IP address to a particular location.

This website will find out the IP address for a particular website – the IP address for Internships.com is 204.74.99.100, for example. So you’re looking for a match (assuming the same server is used for mail). You could also check other emails from that company to other people, or ideally to yourself (Watch out for fake websites as well, of course).
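The header check described above can be partly automated. A sketch in Python: pull candidate IPs out of an email’s raw Received: headers, ready to compare against the company’s known server addresses. The header text below is a made-up example, not a real message.

```python
import re
import ipaddress

def received_ips(raw_headers):
    """Extract candidate originating IPs from an email's Received: headers.
    Headers are prepended as mail travels, so the last-listed Received
    header is the one closest to the true sender."""
    ips = []
    for line in raw_headers.splitlines():
        if line.lower().startswith("received:"):
            for candidate in re.findall(r"\[?(\d{1,3}(?:\.\d{1,3}){3})\]?", line):
                try:
                    ipaddress.ip_address(candidate)  # discard malformed matches
                    ips.append(candidate)
                except ValueError:
                    pass
    return ips

# Illustrative headers only:
headers = """\
Received: from mail.example.org (mail.example.org [204.74.99.100])
Received: from unknown (HELO laptop) ([192.168.1.50])
"""
print(received_ips(headers))  # compare against the company's known server IPs
```

A mismatch is not proof of forgery on its own (mail can legitimately route through third-party servers), but it is exactly the kind of lead worth chasing before publishing.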

And of course, finally, it’s always worth looking at the content the hoaxer has provided and clues that they may have left in it – as Jonny did (see image, left).

For more on verifying online information see Content, context and code: verifying information online, which I’ll continue to update with examples.


December 22 2009

08:59

Internet news as a market for news lemons

This article frames the problem of news dissemination as a ‘market for lemons’ problem, analogous to the issue raised by George Akerlof in 1970. Framing news as a mechanism for vetting common knowledge, as opposed to entertainment, allows one to see that instant common knowledge is unattainable given the byzantine and uncertain way in which humans communicate and live. Given this framing, a potential solution is posited which allows traditional newspaper companies to focus on the role of validating news rather than simply creating or capturing it. The most valuable service that traditional news organizations can provide is validation of truth and quality assurance.

“It is hard to get the news from poems, but every day men die miserably for lack of what can be found there.” (William Carlos Williams)

Introduction

Gauging quality of entertainment is fairly simple and self-evident. Consumers know instantly whether a product is entertaining and consumers continue to pay attention if they find the material to be entertaining.

News providers tend to serve both an individual’s desire for entertainment and for information in one product bundle. Although it is very easy for consumers to test the quality of the entertainment component of news, it is much more difficult to gauge its information quality.

Consumers face the intangible dilemma of assessing whether news is accurate or true, which poses a problem of asymmetric information for consumers.

Despite the availability of virtually infinite potential news sources and automated search engines, the search costs of getting the truth are too high. Human beings are bombarded with information throughout the day and despite the ease of search engine technology only 28% of the internet is actually available for search (Barabasi, 2002). The internet is growing in content exponentially and current computing cannot search the majority of the internet.

The threat of news becoming a market for lemons is an important issue worth exploring as news serves to provide a gatekeeping and watchdog function in democracies.

Although it might appear that the advent of increased competition for news via independent and unbiased bloggers on the internet would improve news quality this may not be true in practice. Without a way to assess the accuracy and quality of the information the market of news on the internet tends toward a market for news lemons.

Shleifer’s research on the market for news shows that competition is not enough to ensure accurate news and that, ironically, competition results in “lower prices, but common slanting toward reader biases” (Shleifer and Mullainathan, 2005).

Shleifer posits that ‘a reader with access to all news sources could get an unbiased perspective’ and that ‘reader heterogeneity is more important for accuracy in media’ (2005).

That said, the issue of consumers’ search costs has not been explored in practical terms, as no reader has time to read all news sources to form a perfect model of unbiased information.

The problem of assessing the validity of news quality is in essence the ‘market for lemons’ problem raised by Akerlof (Akerlof, 1970). The market for lemons phenomenon relates to ‘quality and uncertainty’ and news is clearly a business in which “‘trust’ is important” and, as Akerlof points out, “Informal unwritten guarantees are preconditions for trade and production” and “where these guarantees are indefinite, business will suffer” (1970).

The aim of this paper is to raise the issue of the market for internet news lemons, as the quality of free information served piping hot on the internet is ‘indefinite’. When the quality of a good is unknown, consumers are only willing to pay a price that assumes it is unreliable, and this drives sellers with a good product out of the market, as the consumer is unable to distinguish high-quality from low-quality goods.

Akerlof showed the detrimental effect of markets for lemons using the case of used cars in the 1970s, where people with good used cars could not obtain the price their cars were worth and so would not sell them, leaving the market full of lemons in a self-fulfilling prophecy of sorts. Similarly, any market for a good whose quality is uncertain tends toward a market for lemons.
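Akerlof’s downward spiral can be made concrete with a toy simulation (the quality values are arbitrary): buyers offer the average quality of what is on sale, sellers with above-average cars withdraw, and the cycle repeats until only the worst lemon remains.

```python
# Toy Akerlof "market for lemons": buyers pay the average expected
# quality; sellers whose cars are worth more than that exit, dragging
# the average down each round until the market collapses.
qualities = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]

market = list(qualities)
for round_no in range(10):
    offer = sum(market) / len(market)            # buyers' price: average quality
    stayers = [q for q in market if q <= offer]  # above-average cars withdraw
    if stayers == market:                        # no one left with reason to exit
        break
    market = stayers
    print(f"round {round_no + 1}: offer {offer:.0f}, sellers left {len(market)}")

print("equilibrium market:", market)  # only the single worst car remains
```

The same unraveling applies to news: if readers cannot tell accurate reporting from cheap fabrication, they will only “pay” (with money or attention) as if everything were fabrication, and accurate reporting exits the market.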

This phenomenon has been at play in the mortgage securities market and is no different for news as a product.

Towards a definition of news and newspaper quality

News, as a system, provides the following affordances to humans:

  • connects people with information,
  • provides branding of perceived truth,
  • helps support democracy and its ideals, and
  • fulfills an entertainment component via narrative integrity.

Narrative integrity itself has recently been criticized by Taleb, as it encourages readers to build unrealistic assessments of risk in financial and other aspects of daily life (Taleb, 2005). Newspapers in general tend either to exaggerate or to under-represent the risks faced by individuals, and are not sound guides from a risk-management point of view.

Quality, for a news product, is a perception of validity and truth among the peer groups with which consumers communicate. Most consumers of news want to know what is going on and what is big. News thus fulfils the roles of gatekeeping, watchdog and anti-corruption work, and, in general, a sharing of true facts of interest to human communities in relation to purported values and themes.

The existence of a strong free press has been associated with lowered corruption across nations (Brunetti & Weder, 2003). A study of government ownership of the news media, which exists in 97% of countries, found that, per public choice theory, “government ownership undermines political and economic freedom” (Djankov, McLiesh, Nenova & Shleifer, 2003).

Scoping News

For the scope of this work the emphasis will be on the non-entertainment quality aspects of news as a product. This is consistent with Shleifer’s definition that the ‘quality of [news] information is its accuracy. The more accurate the news, the more valuable is its source to the consumer. Pressure from audiences and rivals force news outlets to seek and deliver more accurate information, just as market forces motivate auto-makers to produce better cars’ (Mullainathan & Shleifer, 2005).

Hamilton’s book on the economics of news highlights that news is subject to rapid commoditization, is an information good, and is shaped by network effects (Hamilton, 2003). Per Hamilton’s point, speed of delivery, accuracy and relevancy are the desirable characteristics of news as a product (2003).

If we step back and look at this, news is really a mechanism for generating ‘common knowledge’ within a Byzantine environment where quality and truth are uncertain.

Taking this perspective, one can see that the work in artificial intelligence and philosophy conducted by Halpern and Moses is relevant in this context (Halpern & Moses, 1984). Halpern and other students of common knowledge find that, in practice, it is impossible to guarantee reliable and true common knowledge in real time. The closest one can get is ‘almost common knowledge’ (Halpern et al., 1994).

Given the complex nature of the problem of common knowledge in a distributed, uncertain environment, Halpern et al point out that the modelling of time is critical in achieving eventual common knowledge. One way to look at this: given that a consumer wants common knowledge, they should wait until a news story has had sufficient time to be vetted. The expectation of instant and true knowledge is a pipe dream, as Eugene O’Neill might say.
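To make the waiting argument concrete, here is a deliberately naive back-of-the-envelope model (my construction, not Halpern’s): if each independent source gets a story wrong with some fixed probability, the chance that a story still accepted after k independent confirmations is false shrinks geometrically with k, i.e. with vetting time.

```python
def residual_error(per_source_error=0.2, confirmations=1):
    """Probability that an accepted story is still wrong after
    `confirmations` independent sources have agreed on it.
    Toy model: sources err independently with identical probability."""
    return per_source_error ** confirmations

# Each extra round of vetting multiplies the residual error down.
for k in (1, 2, 3, 5):
    print(f"{k} confirmation(s): residual error {residual_error(confirmations=k):.4%}")
```

The independence assumption is of course the weak point: sources that copy each other’s reporting confirm nothing, which is exactly why vetting takes real time.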

One side effect of the current market equilibrium for news is the segmentation of the market for news into the following groups of people:

  1. people who don’t read the news,
  2. people who read the news to interpret facts to suit agendas (politicians, lobbyists, etc.), and
  3. people who read what they want to believe and are aware of it.

I believe this segmentation exists due to high search costs for the truth.

I personally don’t read the news much at all. If I am interested in a topic I research the field, get input from experts, and make my own inferences. I of course do not engage much in casual conversation. For the majority of citizens who do, news is an invaluable source to relate with others and share experiences of ‘true events’ and common knowledge.

Noted anthropologist Roy Wagner has pointed out the pervasive problem of information which humans grapple with:

“Persuasion, from the days of Aristotle onwards, never works as it is intended to and has its greatest effect on the persuaders … To the extent that the vast, worldwide communications industry, the media, the internet or Web, the ubiquitous ‘sensory’ modes and guidance-circuitries use ‘information’ or ‘communication’ as code words for what is really going on, we live in a world that is actually created by a failure of persuasion.

“This means that we live in a world of information-stealth – the half truths of our lies and the lies of half truth – or what the CIA, or at least its critics, would call disinformation. I wouldn’t be kidding you, now would I? Disinformation has a far more ambiguous or ambivalent effect than persuasion ever could have and is both more informative and communicative than its buzz-word surrogates. It works on a ‘leakage’ principle, partial truths leaked out in the telling of deliberate lies and deliberate lies leaked in the telling of partial truths. It is motivated by goals and objectives that have nothing directly to do with either belief or conviction on one hand or doubt and cynicism on the other; it offers deniability with both hands. ‘It is either half true,’ as the Viennese aphorist Karl Kraus said of the aphorism, or ‘one and a half times true’.

“We are unconvinced (e.g. apathetic) on one hand, and overconvinced on the other, and the middle ground is the most contested of all … Disinformation rules the world, and it does so through ‘deniability’. We know for a fact that every single trade, occupation, and especially profession has its secrets, known to its initiates and unknown to others.” (Wagner, 2000)

The last point applies to journalists as well.

Potential solutions: a new business model for news

To date, innovation in news has focused either on transforming traditional media outlets into high-tech companies, which is unlikely to succeed, or on adopting the market-niche strategy of hyperlocal news.

The model of niche differentiation/specialization has potential, but is complicated by changing interests and tastes. How does one know which hyperlocal news is of interest? With limited time and highly contested attention spans, hyperlocal news is a difficult proposition to maintain. That said, with non-profit and community support it can work as a niche solution.

The solution we propose here is targeted to larger well established news players and is a novel approach to the problem.

Traditional print sources like the Washington Post have a platform and a reputation for checking and ensuring high-quality information. The expertise of existing print media companies can be refocused on the validation and authentication of breaking news stories, since on the internet there is no authority for the validity of news.

One innovative solution to the market for news lemons might be for traditional newspapers to create reputation-based blogging spaces where stories are tested and validated before publication. This is consistent with the work of Yamagishi, who studied the market-for-lemons problem in online trading and found an online reputation system to be a useful solution (Yamagishi, 2002).

Yamagishi noted that online trading results in “information asymmetry” which “drives the … market into a lemons market” (2002). This is analogous to the problem of news consumption. Yamagishi’s analysis segments reputation into two forms, positive and negative, and finds that the openness of internet trading precludes negative reputation and “promotes positive reputation as an effective means for curtailing the lemons problem” (2002).

An important reason negative reputation is not effective on the internet is that it is too easy to switch and create new identities. Thus methods of “inclusion” which validate positive reputation are critical to combating the lemons problem (2002).
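The identity-switching point can be illustrated with a minimal sketch (a hypothetical model of mine, not Yamagishi’s experimental design): negative marks are wiped the moment a cheat re-registers, while positive reputation can only be rebuilt slowly, which is what makes it costly to abandon.

```python
class Identity:
    """Toy reputation ledger for one online identity."""
    def __init__(self):
        self.positive = 0   # endorsements earned over time
        self.negative = 0   # complaints recorded against the identity

    def record(self, good_transaction: bool):
        if good_transaction:
            self.positive += 1
        else:
            self.negative += 1

# A cheat accumulates complaints...
cheat = Identity()
for _ in range(10):
    cheat.record(False)

# ...then evades a negative-reputation system by simply re-registering:
cheat = Identity()
assert cheat.negative == 0  # negative history wiped by the identity switch

# Positive reputation resists the same trick: a fresh identity starts at
# zero and must rebuild its standing transaction by transaction.
veteran = Identity()
for _ in range(50):
    veteran.record(True)
newcomer = Identity()
print(veteran.positive, newcomer.positive)  # 50 vs 0
```

This is why a newspaper’s accumulated brand reputation is, in Yamagishi’s terms, a stock of positive reputation that cannot be faked by re-registration.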

Per Yamagishi’s suggestion, existing newspapers with positive brand reputations have value as providers of positive reputation in an open market of internet news.

An enterprise devoted to assuring the quality of the news could be a new hybrid form of existence for traditional newspapers, one in which the goals of the news system are preserved.

The price differential paid to news companies would be based on the quality of their checking, not on the slant or sensational quality of the news.

Under this model, papers would specialize in news domains where they have expertise and offer objective validation of stories. For true objectivity, the influence of advertising profit would need to be removed. Perhaps advertising revenue would accrue to the content providers who supplied the stories, with advertisements underwriting the authors, while the news intermediaries who select stories based on quality and validation would be paid only for quality assurance.

A successful example of dealing with ‘cyber lemons’ was that of an ‘online intermediary’ used by China’s largest online consumer-to-consumer trading site, which built a “credit evaluation system to serve as a quality-intermediary and reputation” (Pan, 2005).

In short, several eBays for news, specializing in different news domains, would serve to mitigate the lemons problem.

The newspaper industry must face the disintermediation of its power to dictate the news agenda. The notion that a few, supported by commercial advertising, would decide what was newsworthy was paternalistic; with the disintermediation of this role, the responsibility for deciding what to pay attention to falls on society. That issue is best tackled through education and the fostering of civic and democratic ideals in youth.

Dhruv Sharma is an independent scholar in the fields of organizational behavior, risk management, artificial intelligence, and systems engineering. A graduate of the McIntire School at the University of Virginia, he holds a Masters in Systems Engineering and a Masters in Organizational Development from Marymount University.

Special thanks to George Akerlof for email discussion of the idea and for guidance on areas to research and focus on.

This article is dedicated to Emma Brown, a great writer and journalist, and George Akerlof, the great economist.

Citations:

  • Akerlof, G. A. (1970) The Market for “Lemons”: Quality Uncertainty and the Market Mechanism. The Quarterly Journal of Economics, Vol. 84, No. 3, pp. 488–500
  • Barabási, A. L. (2002) Linked: The New Science of Networks. Perseus, Cambridge
  • Brunetti, A. & Weder, B. (2003) A free press is bad news for corruption. Journal of Public Economics, 87, 1801–1824
  • Hamilton, J. T. (2003) All the News That’s Fit to Sell. Princeton, NJ: Princeton University Press
  • Halpern, J. Y. & Moses, Y. (1990) Knowledge and common knowledge in a distributed environment. Journal of the ACM, 37(3), 549–587. A preliminary version appeared in Proc. 3rd ACM Symposium on Principles of Distributed Computing, 1984
  • Halpern, J. Y., Fagin, R., Moses, Y. & Vardi, M. Y. (1994) Common knowledge revisited. Proceedings of the 6th Conference on Theoretical Aspects of Rationality and Knowledge. Retrieved from http://www.cs.cornell.edu/home/halpern/papers/ck_revisited.pdf
  • Djankov, S., McLiesh, C., Nenova, T. & Shleifer, A. (2003) Who Owns the Media? Journal of Law and Economics, Vol. XLVI
  • Mullainathan, S. & Shleifer, A. (2005) The Market for News. The American Economic Review
  • Taleb, N. N. (2005) “The Opiates of the Middle Classes”. Retrieved from http://www.edge.org/3rd_culture/taleb05/taleb05_index.html
  • Yamagishi, T. & Matsuda, M. (2002) Improving the Lemons Market with a Reputation System: An Experimental Study of Internet Auctioning. Hokkaido University. Retrieved from http://joi.ito.com/archives/papers/Yamagishi_ASQ1.pdf
  • Wagner, R. (2000) “Our Very Own Cargo Cult”. Oceania