
June 26 2013

16:48

What’s New in Digital Scholarship: A generation gap in online news, and does The Daily Show discourage tolerance?

Editor’s note: There’s a lot of interesting academic research going on in digital media — but who has time to sift through all those journals and papers?

Our friends at Journalist’s Resource, that’s who. JR is a project of the Shorenstein Center on the Press, Politics and Public Policy at the Harvard Kennedy School, and they spend their time examining the new academic literature in media, social science, and other fields, summarizing the high points and giving you a point of entry. Roughly once a month, JR managing editor John Wihbey will sum up for us what’s new and fresh.

We’re at the halfway mark in our year-long odyssey tracking all things digital media and academic. Below are studies that continue to advance understanding of various hot topics: drone journalism; surveillance and the public; Twitter in conflict zones; Big Data and its limits; crowdsourced information platforms; remix culture; and much more. We also suggest some further “beach reads” at the bottom. Enjoy the deep dive.

“Reuters Institute Digital News Report 2013: Tracking the Future of News”: Paper from University of Oxford Reuters Institute for the Study of Journalism, edited by Nic Newman and David A. L. Levy.

This new report provides tremendous comparative perspective on how different countries and news ecosystems are developing in both symmetrical and divergent ways (see the Lab’s write-up of the national differences and similarities it highlights). But it also provides some interesting hard numbers relating to the U.S. media landscape; it surveys the news habits of a sample of more than 2,000 Americans.

Key U.S. data points include:

  • The number of Americans reporting accessing news by tablet in the past week rose from 11 percent in 2012 to 16 percent in 2013, and 28 percent said they accessed news on a smartphone in the last week.
  • 75 percent of Americans reported accessing news online in the past week, while 72 percent said they got news through television and 47 percent reported having read a print publication.
  • TV (43 percent) and online (39 percent) were Americans’ preferred platforms for accessing news.
  • A yawning divide separates the preferences of those ages 18 to 24 from those over 55: among the younger cohort, 64 percent say the web is their main source for news, versus only 25 percent of the older group; as for TV, 54 percent of older Americans report it as their main source, versus only 20 percent of those 18 to 24.
  • 12 percent of American respondents overall reported paying for digital news in 2013, compared to 9 percent in 2012.

“The Rise and Fall of a Citizen Reporter”: Study from Wellesley College, for the WebScience 2013 conference. By Panagiotis Metaxas and Eni Mustafaraj.

This study looks at a network of anonymous Twitter citizen reporters around Monterrey, Mexico, covering the drug wars. It provides new insights into conflict-zone journalism and information ecosystems in the age of digital media, as well as the limits of raw data. The researchers, both computer scientists, analyze a dataset focused on the hashtag #MTYfollow, consisting of “258,734 tweets written by 29,671 unique Twitter accounts, covering 286 days in the time interval November 2010-August 2011.” They drill down on @trackmty, the largest of the accounts involved, run under the pseudonym Melissa Lotzer.

The scholars reconstruct a sequence in which a wild Twitter “game” breaks out — obviously, with life-and-death stakes — involving accusations about cartel informants (“hawks,” or “halcones”) and citizen watchdogs (“eagles,” or “aguilas”), with counter-accusations flying that certain citizen reporters were actually working for the Zetas drug cartel; indeed, @trackmty ends up being accused of working for the cartels. Online trolls attack her on Twitter and in blogs.

“The original Melissa @trackmty is slow to react,” the study notes, “and when she does, she tries to point to her past accomplishments, in particular the creation of [a group of other media accounts] and the interviews she has given to several reporters from the US and Spain (REF). But the frequency of her tweeting decreases, along with the community’s retweets. Finally, at the end of June, she stops tweeting altogether.” It turns out that the real @trackmty had been exposed — “her real identity, her photograph, friends and home address.”

Little of this drama was obvious from the data. Ultimately, the researchers were able to interview the real @trackmty and members of the #MTYfollow community. The big lesson, they realize, is the “limits of Big Data analysis.” The data visualizations showing influence patterns and spikes in tweet frequency revealed all kinds of interesting dynamics. But they were insufficient to support valuable inferences about the community affected: “In analyzing the tweets around a popular hashtag used by users who worry about their personal safety in a Mexican city we found that one must go back and forth between collecting and analyzing many times while formulating the proper research questions to ask. Further, one must have a method of establishing the ground truth, which is particularly tricky in a community of — mostly — anonymous users.”

“Undermining the Corrective Effects of Media-Based Political Fact Checking? The Role of Contextual Cues and Naïve Theory”: Study from Ohio State University, published in the Journal of Communication. By R. Kelly Garrett, Erik C. Nisbet, and Emily K. Lynch.

As the political fact-checking movement — the FactChecks and Politifacts, along with their various lesser-known cousins — has arisen, so too has a more hard-headed social science effort to get to the root causes of persistent lies and rumors, a situation made all the worse on the web. Of course, journalists hope truth can have a “corrective” effect, but the literature in this area suggests that blasting more facts at people often doesn’t work — hence, the “information deficit fallacy.” Thus, a cottage psych-media research industry has grown up, exploring “motivated reasoning,” “biased assimilation,” “confirmation bias,” “cultural cognition,” and other such concepts.

This study tries to advance understanding of how peripheral cues such as accompanying graphics and biographical information can affect how citizens receive and accept corrective information. In experiments, the researchers ask subjects to respond to claims about the proposed Islamic cultural center near Ground Zero and the disposition of its imam. It turns out that contextual information — what the imam has said, what he looks like and anything that challenges dominant cultural norms — often erodes the positive intentions of the fact-checking message.

The authors conclude that the “most straightforward method of maximizing the corrective effect of a fact-checking article is to avoid including information that activates stereotypes or generalizations…which make related cognitions more accessible and misperceptions more plausible.” The findings have a grim quality: “The unfortunate conclusion that we draw from this work is that contextual information so often included in fact-checking messages by professional news outlets in order to provide depth and avoid bias can undermine a message’s corrective effects. We suggest that this occurs when the factually accurate information (which has only peripheral bearing on the misperception) brings to mind” mental shortcuts that contain generalizations or stereotypes about people or things — so-called “naïve theories.”

“Crowdsourcing CCTV surveillance on the Internet”: Paper from the University of Westminster, published in Information, Communication & Society. By Daniel Trottier.

A timely look at the implications of a society more deeply pervaded by surveillance technologies, this paper analyzes various web-based efforts in Britain that involve the identification of suspicious persons or activity. (The controversies around Reddit and the Boston Marathon bombing suspects come to mind here.) The researcher examines Facewatch, CrimeStoppers UK, Internet Eyes, and Shoreditch Digital Bridge, all of which attached commercial elements to crowdsourcing projects in which participants monitored feeds from surveillance cameras of public spaces. He points out that these “developments contribute to a normalization of participatory surveillance for entertainment, socialization, and commerce,” and that the “risks of compromised privacy, false accusations and social sorting are offloaded onto citizen-watchers and citizen-suspects.” Further, the study highlights the perils inherent in the “‘gamification’ of surveillance-based labour.”

“New Perspectives from the Sky: Unmanned aerial vehicles and journalism”: Paper from the University of Texas at Arlington, published in Digital Journalism. By Mark Tremayne and Andrew Clark.

The use of unmanned aerial vehicles (UAVs, or “drones”) in journalism is an area of growing interest, and this exploration provides some context and research-based perspective. Drones in the service of the media have already been used for everything from snapping pictures of Paris Hilton and surveying tornado-damaged areas in Alabama to filming secret government facilities in Australia and protester clashes in Poland. In all, the researchers found “eight instances of drone technology being put to use for journalistic purposes from late 2010 through early 2012.”

This practice will inevitably raise issues about the extent to which it goes too far. “It is not hard to imagine how the news media, using drones to gather information, could be subject to privacy lawsuits,” the authors write. “What the news media can do to potentially ward off the threat of lawsuits is to ensure that drones are used in an ethical manner consistent with appropriate news practices. News directors and editors and professional associations can establish codes of conduct for the use of such devices in much the same way they already do with the use of hidden cameras and other technology.”

“Connecting with the user-generated Web: how group identification impacts online information sharing and evaluation”: Study from University of California, Santa Barbara, published in Information, Communication & Society. By Andrew J. Flanagin, Kristin Page Hocevar, and Siriphan Nancy Samahito.

Whether it’s Wikipedia, Yelp, TripAdvisor, or some other giant pool of user-generated “wisdom,” these platforms convene large, disaggregated audiences that form loose memberships based around apparent common interests. But what makes certain communities bond and stick together, keeping online information environments fresh, passionate, and lively (and possibly accurate)?

The researchers involved in this study perform some experiments with undergraduates to see how adding small bits of personal information — the university, major, gender, or other piece of information — to informational posts changed perceptions by viewers. Perhaps predictably, the results show that “potential contributors had more positive attitudes (manifested in the form of increased motivation) about contribution to an online information pool when they experienced shared group identification with others.”

For editors and online community designers and organizers, the takeaway is that information pools “may actually form and sustain themselves best as communities comprising similar people with similar views.” Not exactly an antidote to “filter bubble” fears, but it’s worth knowing if you’re an admin for an online army.

“Selective Exposure, Tolerance, and Satirical News”: Study from University of Texas at Austin and University of Wyoming, published in the International Journal of Public Opinion Research. By Natalie J. Stroud and Ashley Muddiman.

While not the first study to focus on the rise of satirical news — after all, a 2005 study in Political Communication on “The Daily Show with Jon Stewart” now has 230 subsequent academic citations, according to Google Scholar — this new study looks at satirical news viewed specifically in a web context.

It suggests the dark side of snark, at least in terms of promoting open-mindedness and deliberative democracy. The conclusion is blunt: “The evidence from this study suggests that satirical news does not encourage democratic virtues like exposure to diverse perspectives and tolerance. On the contrary, the results show that, if anything, comedic news makes people more likely to engage in partisan selective exposure. Further, those viewing comedic news became less, not more, tolerant of those with political views unlike their own.” Knowing Colbert and Stewart, the study’s authors can expect an invitation soon to atone for this study.

“The hidden demography of new media ethics”: Study from Rutgers and USC, published in Information, Communication & Society. By Mark Latonero and Aram Sinnreich.

The study leverages 2006 and 2010 survey data, both domestic and international, to take an analytical look at how notions of intellectual property and ethical Web culture are evolving, particularly as they relate to ideas such as remixing, mashups and repurposing of content. The researchers find a complex tapestry of behavioral norms, some of them correlated with certain age, gender, race or national traits. New technologies are “giving rise to new configurable cultural practices that fall into the expanding gray area between traditional patterns of production and consumption. The data suggest that these practices have the potential to grow in prevalence in the United States across every age group, and have the potential to become common throughout the dozens of industrialized nations sampled in this study.”

Further, rules of the road have formed organically, as technology has outstripped legal strictures: “Most significantly, despite (or because of) the inadequacy of present-day copyright laws to address issues of ownership, attribution, and cultural validity in regard to emerging digital practices, everyday people are developing their own ethical frameworks to distinguish between legitimate and illegitimate uses of reappropriated work in their cultural environments.”

Beach reads:

Here are some further academic paper honorable mentions this month — all from the culture and society desk:

Photo by Anna Creech used under a Creative Commons license.

April 23 2012

18:23

Are you a young dude interested in news? All else equal, this study says you’re a top paywall target

Here’s a biggie: How do you get someone to pay for online news? A new study out of the University of Texas develops a theoretical model to begin answering that question.

The goal of the study, by Iris Chyi and Angela M. Lee, is to clarify the interrelationship among news preference, use, and intent to pay. What emerges, among other things, is a profile of the kind of people most likely to pay for online news: Young males who are — wait for it — interested in the news.

That last part is key, because while younger people are more likely to pay for news online, the study finds, they’re also less likely to be interested in news in the first place.

Another paradox: People say they prefer reading print products, yet online use is growing. In other words, consumers don’t always use what they prefer, and they’re not always willing to spend money on what they use.

That’s an idea that Chyi has been exploring since the 1990s. She sometimes refers to it as “ramen noodle theory,” which we’ve written about before: People might prefer steak over ramen — but when it comes time to reach for their wallets, they opt for ramen more often. Because it’s free and abundant, the “ramen” is perceived as inferior — which reinforces consumers’ preference for “steak.” This could help explain why Chyi found “very weak correlations” between use and intent to pay in her latest study. This is from its abstract:

While media scholars tend to take “media use” as an indicator of popularity or diffusion, media use alone does not fully capture the complexity of online news consumption. For instance, given free online news offerings in most cases, consumers do not always use what they prefer, and most are not willing to pay for what they use. This study identifies three distinct factors — preference, use, and paying intent — each helps explain a specific facet of online news consumption.

Given how many variables play a role in a consumer’s decision to buy online news, the takeaway is a bit more complicated, and that’s kind of the point: Fully understanding online news consumption is about more than just looking at how often people are going online. News organizations must also dig into what consumers want and what they’re willing to buy, then figure out how (and why) these factors overlap.

“The overall picture when we are looking at intention to pay for online news is that we have to consider as many as five predictors,” Chyi told me. “I think that sort of explains why most newspapers have found it’s so difficult to monetize their online content.” From the study:

Specifically, age is a key factor influencing every aspect of online news consumption. Gender, in comparison, only affects paying intent. Paying intent for online news is influenced by five factors (age, gender, news interest, preference, and online news use), with age and news interest being the strongest predictors.
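
To make that structure concrete, here is a minimal sketch of the kind of model the excerpt describes: paying intent regressed on the five predictors. The post doesn’t give the study’s estimation method, so the logistic specification, the variable scales, and the randomly generated placeholder data below are all illustrative assumptions, not Chyi and Lee’s actual model:

    # Illustrative only: a five-predictor model of paying intent.
    # The data are random placeholders, not the study's survey responses,
    # and the logistic specification is an assumption for demonstration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 767  # mirrors the survey's sample size, purely for flavor

    X = np.column_stack([
        rng.integers(18, 80, n),  # age
        rng.integers(0, 2, n),    # gender (0/1)
        rng.uniform(1, 5, n),     # news interest (hypothetical 1-5 scale)
        rng.uniform(1, 5, n),     # platform preference (hypothetical scale)
        rng.uniform(0, 7, n),     # online news use (days per week)
    ])
    y = rng.integers(0, 2, n)     # paying intent (placeholder 0/1 labels)

    model = LogisticRegression(max_iter=1000).fit(X, y)
    print(dict(zip(["age", "gender", "interest", "preference", "use"],
                   model.coef_[0].round(3))))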

The study was presented Saturday at the International Symposium on Online Journalism. Chyi’s study was based on an online survey of 767 adult respondents in August 2010. While that seems like a really long time ago in Internet years — the first iPad was only five months old — Chyi says her research is current enough to offer a useful picture of a longer-term shift she’s tracked for more than a decade. Though print is declining by just about every metric, Chyi is convinced that there is an “over-optimistic bias toward online news.” At the same time, she acknowledges it’s “very late and very difficult to change the perception that the future is online or online-only.”

It’s not that she’s anti-technology, she says: “I believe that new platforms will be really important for people to access news, but in terms of how to monetize it, I think it’s getting more and more difficult,” says Chyi, who also sees a conflation between print decline and online growth. “Very often we mix the two together and say, ‘Because print is declining, the future must be online.’ But I don’t think that’s the case.”

Whatever the case may be, the theoretical model her study produced might offer the beginnings of a structural map for those who need to find a way to convince audiences that their news products are worth paying for.

May 26 2011

18:00

Sarah Palin’s 2009 “death panel” claims: How the media handled them, and why that matters

Editor’s Note: This article was originally published on the journalism-and-policy blog Lippmann Would Roll. Written by Matthew L. Schafer, the piece is a distillation of an academic study by Schafer and Dr. Regina G. Lawrence, the Kevin P. Reilly Sr. chair of LSU’s Manship School of Mass Communication. They have kindly given us permission to republish the piece here.

It’s been almost two years now since Sarah Palin published a Facebook post about “death panels.” In a study to be presented this week at the 61st Annual International Communication Association Conference, we analyzed over 700 stories published in the top 50 newspapers around the country.

“The America I know and love is not one in which my parents or my baby with Down Syndrome will have to stand in front of Obama’s ‘death panel’ so his bureaucrats can decide…whether they are worthy of health care,” Palin wrote at the time.

Only three days later, PolitiFact, an arm of the St. Petersburg Times, published its appraisal of Palin’s comment, stating, “We agree with Palin that such a system would be evil. But it’s definitely not what President Barack Obama or any other Democrat has proposed.”

FactCheck.org, a project of the Annenberg Public Policy Center, would also debunk the claim, and PolitiFact users would later vote the death panel claim to the top spot of PolitiFact’s Lie of the Year ballot.

Despite this initial dismissal of the claim by non-partisan fact checkers, a cursory search of Google turns up 1,410,000 results, showing just how powerful social media is in a fractured media climate.

Yet the death panel claim — as we’re sure many will remember — lived not only online, but also in the newspapers and on cable and network television. In the current study, which covered August 8 (the day after Palin made the claim) to September 13 (the day of the last national poll about death panels), the top 50 newspapers in the country published over 700 articles about the claims, while the nightly network news ran about 20 stories on the topic.

At the time, many commentators both in and outside of the industry offered their views on the media’s performance in debunking the death panel claim. Some lauded the media for coming out and debunking the claim, while others questioned whether it was the media’s “job” to debunk the myth at all.

“The crackling, often angry debate over health-care reform has severely tested the media’s ability to untangle a story of immense complexity,” Howard Kurtz, who was then at the Washington Post, said. “In many ways, news organizations have risen to the occasion….”

Yet, Media Matters was less impressed, at times pointing out, for example, that “the New York Times portrayed the [death panel] issue as a he said/she said debate, noting that health care reform supporters ‘deny’ this charge and call the claim ‘a myth.’ But the Times did not note, as its own reporters and columnists have previously, that such claims are indeed a myth…”

So, who was right? Did the media debunk the claim? And, if so, did they sway public opinion in the process?

Strong debunking, but confused readers

Our data indicate that the mainstream news, particularly newspapers, debunked death panels early, fairly often, and in a variety of ways, though some were more direct than others. Nevertheless, a significant portion of the public accepted the claim as true or, perhaps, as “true enough.”

Initially, we viewed the data from 30,000 feet and found that about 40 percent of the time, journalists would call the death panel claim false in their own voice — especially surprising given many journalists’ conception of themselves as neutral arbiters.

For example, on August 9, 2009, Ceci Connolly of the Washington Post said, “There are no such ‘death panels’ mentioned in any of the House bills.”

“[The death panel] charge, which has been widely disseminated, has no basis in any of the provisions of the legislative proposals under consideration,” The New York Times’ Helene Cooper wrote a few days after Connolly.

“The White House is letting Congress come up with the bill and that vacuum of information is getting filled by misinformation, such as those death panels,” Anne Thompson of NBC News said on August 11.

Nonetheless, in more than 60 percent of cases, newspapers abstained from calling the death panel claim false. (We also looked at hundreds of editorials and letters to the editor; it’s worth noting that almost 60 percent of those debunked the claim, while the rest abstained from debunking and just about 2 percent supported the claim.)

Additionally, among the articles that did debunk the claim, almost 75 percent offered no explanation of why the claim was being labeled false. It was very much a “you either believe me, or you don’t” situation, without contextual support.

As shown below, whether or not journalists debunked the claim, they often approached the controversy by quoting one side of the debate, then the other, and letting the reader dissect the validity of each side’s stance. Thus, in 30 percent of the cases where journalists reported in their own words that the claim was false, they nonetheless included both sides’ arguments for why they were right. This often just confuses the reader.

This chart shows that whether journalists abstained from debunking the death panels claim or not, they still proceeded to give equal time to each side’s supporters.

Most important is the light that this study sheds on the age-old debate over the practical limitations surrounding objectivity. Indeed, questions are continually raised about whether journalists can be objective. Most recently, this led to a controversy at TechCrunch where founder Michael Arrington was left defending his disclosure policy.

“But the really important thing to remember, as a reader, is that there is no objectivity in journalism,” Arrington wrote to critics. “The guys that say they’re objective are just pretending.”

This view, however, is not entirely true. Indeed, in the study of death panels, we found two trends that could each fit under the broad banner of objectivity.

Objectivity: procedural and substantive

First, there is procedural objectivity — mentioned above — where journalists do their due diligence and quote the competing sides. Second, there is substantive objectivity, where journalists actually go beyond reflexively reporting what key political actors say to engage in verifying the accuracy of those claims for their readers or viewers.

Of course, every journalist is — to some extent — influenced by their experiences, predilections, and political preferences, but these traits do not necessarily interfere with objectively reporting verifiable fact. Indeed, it seems that journalists could practice either form of objectivity without being biased. Nonetheless, questions and worries still abound.

“The fear seems to be that going deeper — checking out the facts behind the posturing and trying to sort out who’s right and who’s wrong — is somehow not ‘objective,’ not ‘straight down the middle,’” Rem Rieder of the American Journalism Review wrote in 2007.

Perhaps because of this, journalists in our sample attempted to practice both types of objectivity at the same time: one which, arguably, serves the public interest by presenting the facts of the matter, and one which allows the journalist a sliver of plausible deniability, because he follows the insular journalistic norm of presenting both sides of the debate.

As such, we question New York University educator and critic Jay Rosen, who has argued that “neutrality and objectivity carry no instructions for how to react” to the rise of false but popular claims. We contend that the story is more complicated: Mainstream journalists’ figurative instruction manual contains contradictory “rules” for arbitrating the legitimacy of claims.

The consequences of these contradictory rules show up in public opinion polls taken during the August and September health care debates. One poll released August 20 reported that 30 percent believed that proposed health care legislation would “create death panels.” Belief in this extreme form of government rationing of health care remained strikingly high (41 percent) into mid-September.

More troubling, one survey found that the percentage calling the claim true (39 percent) among those who said they were paying very close attention to the health care debate was significantly higher than among those reporting they were following the debate fairly closely (23 percent) or not too closely (18 percent).

Of course, our data do not allow us to say that these numbers are a direct result of the mainstream media’s death panel coverage. Nonetheless, because mainstream media content still powers so many websites and news organizations, perhaps this coverage did have an impact on public opinion to some indeterminable degree.

Conclusion

One way of looking at the resilience of the death panels claim is as evidence that the mainstream media’s role in contemporary political discourse has been attenuated. But another way of looking at the controversy suggests that the mainstream media themselves bore some responsibility for the claim’s persistence.

Palin’s Facebook post, which popularized the “death panel” catchphrase, said nothing about any specific legislative provision. News outlets and fact-checkers could examine the language of the bills then under debate to debunk the claim — and many did, as our data demonstrate. Nevertheless, it appears the nebulous “death panel bomb” reached its target in part because the mainstream media so often repeated it.

Thus, the dilemma for reporters playing by the rules of procedural objectivity is that repeating a claim reinforces a sense of its validity — or at least, enshrines its place as an important topic of public debate. Moreover, there is no clear evidence that journalism can correct misinformation once it has been widely publicized. Indeed, it didn’t seem to correct the death panels misinformation in our study.

Yet there is promise in substantive objectivity. Today more than ever, journalists are having to act as curators. The only way they can do so effectively is by critically examining the surplus of social media messages, and debunking or refusing to reinforce those that are verifiably false. As more politicians use the Internet to circumvent traditional media, this type of critical curation will become increasingly important.

This is — or should be — journalists’ new focus. Journalists should verify information. Moreover, they should do so without including quotations from those taking a stance that is demonstrably false. Such quotations create a factual jigsaw puzzle that the reader must untangle: on the one hand, the journalist is calling the claim false; on the other, he is giving column inches to someone who insists it’s true.

Putting aside the raucous debates about objectivity for a moment, it is clear that journalists in many circumstances can research and relay to their readers information about verifiable fact. If we don’t see a greater degree of this substantive objectivity, the public is left largely at the mercy of the savviest online communicator. Indeed, if journalists refuse to critically curate new media, they are leaving both the public and themselves in a worse off position.

Image of Sarah Palin by Tom Prete used under a Creative Commons license.

May 19 2011

15:00

The Conversation, the startup Australian news site, wants to bring academic expertise to breaking news

What would happen if you had close to 1,000 academics available to contribute to the breaking news cycle? Would it change the course, and the discourse, of news?

Andrew Jaspan thinks it will.

Jaspan, formerly an editor at The Age, the Melbourne-based newspaper, founded The Conversation, an Australian nonprofit news site, in order to combat problems that are just as present there as in other news environments: shrinking newsrooms and a sound-bite driven broadcast culture.

But The Conversation’s approach is a novel one: While the site uses professional journalists as its editors, it uses academics to provide the content for the site. The goal, says the site’s charter, is to provide “a fact-based and editorially-independent forum” that will “unlock the knowledge and expertise of researchers and academics to provide the public with clarity and insight into society’s biggest problems” and “give experts a greater voice in shaping scientific, cultural and intellectual agendas by providing a trusted platform that values and promotes new thinking and evidence-based research.”

As Jaspan explained: “Our model is not so much to use the university as a source of news, though we do report research findings as news. What we really try to do is use academics and researchers to analyze live news events, like the killing of Osama Bin Laden through to the Fukushima earthquake or whatever [other] complex news stories…. We are using people who are experts to give greater depth to the understanding of complex and live issues.”

The Conversation offers a number of surprises to those looking for a more in-depth approach to issues in the news:

  • Academics are writing about the “now,” within the news cycle, in areas related to their expertise
  • Taking experts to the people, instead of selectively filtering their expertise. Want the big voice on climate change? Then read what he or she has to say directly — rather than through a few sample quotes in a story
  • Readability. The site is set — mechanically, within its content management system — to make the stories easy (enough) to read. Using the Flesch-Kincaid readability index (set to the reading level of a 16-year-old for maximum readability), the CMS can actually tell academics when they’ve veered into jargon…and an editor can help steer them back (see the sketch after this list)
  • Real-time news updates filed twice a day — once in the morning and once in the afternoon
  • Coverage of business and the economy, environment and energy, health and medicine, politics and society, and science and technology
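
The Conversation hasn’t published the internals of that CMS feature, but the Flesch-Kincaid grade-level formula itself is standard. Below is a minimal sketch of how such a jargon alarm might work; the syllable counter is a crude heuristic, and the grade ceiling of 11 (roughly a 16-year-old reader) is an assumption based on the description above, not The Conversation’s actual code:

    import re

    def count_syllables(word):
        # Crude heuristic: count runs of vowels, minus one silent final 'e'.
        word = word.lower()
        count = len(re.findall(r"[aeiouy]+", word))
        if word.endswith("e") and count > 1:
            count -= 1
        return max(count, 1)

    def flesch_kincaid_grade(text):
        # Standard Flesch-Kincaid grade-level formula:
        # 0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()] or [text]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (0.39 * len(words) / len(sentences)
                + 11.8 * syllables / len(words)
                - 15.59)

    GRADE_CEILING = 11.0  # assumed proxy for a 16-year-old's reading level

    draft = ("The anthropogenic perturbation of biogeochemical cycles "
             "necessitates multidisciplinary investigation.")
    if flesch_kincaid_grade(draft) > GRADE_CEILING:
        print("Too dense; flag the passage for an editor.")

Any real implementation would need a much better syllable counter; the point is only that a grade-level score can be computed mechanically on every save and surfaced to the writer.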

And, as the site’s tagline promises, “academic rigour, journalistic flair.”

As an academic myself, I was a bit skeptical of the idea. After all, some of the most bombastic and opinionated folks reside in academia — so I wasn’t exactly sure how Jaspan’s site would deliver on a promise to provide more in-depth coverage without the rhetorical flourishes that often seem to come with American academic publishing. And what about the political implications? Academics, after all, as a group, tend to be more liberal than the population at large.

Jaspan had three counterpoints to my concern:

First, “every author has to fill out a profile, so the reader knows who the person is and their education. And there is the additional requirement of a disclosure of any potential conflicts which might color their judgment.” Second, in response to the political question — after noting that my academics-are-liberal assertion might be a bit loaded — he replied that what The Conversation is ultimately doing is putting people in touch with “academics who are usually better informed than the general public because of their depth of knowledge and their sense of the complexity of the issue.”

Third, and most important, Jaspan sees The Conversation, true to its name, as leading to public debate. “One of the key things we want to do with a public-facing media channel is to make sure we have a range of views on something like the execution of Osama Bin Ladin, and that we have different interpretations of what happened and whether or not the means in which it was done were judicial.” The main goal, though: “We want to surprise our readers. We don’t want to give them the usual explanations; we want alternative insights and viewpoints — and that will lead to lively conversation with readers.”

Jaspan’s backers come from both the nonprofit and for-profit realms. The Conversation is backed by Ernst & Young, among other corporate supporters. And from academia, he has drawn on some of the top Australian research universities, in addition to Australia’s Department of Education. To find the academics, Jaspan and his staff did a “census” of academics based on their areas of expertise. Then, by word of mouth, they asked participating academics to recommend colleagues who would make good contributors to the site.

But, again, the skeptical academic in me had another question: Why on earth would a busy academic take time away from publishing (ahem) to write for The Conversation?

Part of the answer has to do with Australia’s current approach to university promotion. Research and teaching form part of the core methods of evaluation, but a third arm of assessment is an academic’s quality of public engagement and social impact. According to Jaspan, Australian universities are putting a new stress on the third.

And since The Conversation gives each writer a dashboard to measure his or her own metrics, the academic can then use those data for his or her professional promotion and evaluation, actually measuring his or her social impact in a quantifiable way for university administrators — based, say, on retweets or traffic for a particular story. The academics don’t get paid for their work. Instead, though, they might pick up speaking engagements or consulting gigs.

There’s also the instant-gratification factor. While traditional academic publishing generally makes academics wait a year (or more) to see something in print, Jaspan said that some academics relish being able to turn something around in two hours.

Currently, The Conversation is still in beta form, with Jaspan looking to add more audience engagement and commenting features, as well as richer multimedia. Jaspan estimates that the site is getting about 120,000 to 150,000 visitors each month — with those metrics rising by “10 percent a week.”

But Jaspan isn’t seeing, or hoping for, an audience purely composed of academic eggheads. “This is not a site for academics,” he notes. “This is not a site for the university sector. This is a site for everyday public discourse.”

May 04 2011

13:30

MIT management professor Tom Malone on collective intelligence and the “genetic” structure of groups

Do groups have genetic structures? If so, can they be modified?

Those are two central questions for Thomas Malone, a professor of management and an expert in organizational structure and group intelligence at MIT’s Sloan School of Management. In a talk this week at IBM’s Center for Social Software, Malone explained the insights he’s gained through his research and as the director of the MIT Center for Collective Intelligence, which he launched in 2006 in part to determine how collective intelligence might be harnessed to tackle problems — climate change, poverty, crime — that are generally too complex to be solved by any one expert or group. In his talk, Malone discussed the “genetic” makeup of collective intelligence, teasing out the design differences between, as he put it, “individuals, collectively, and a collective of individuals.”

The smart group

First is the question of whether general cognitive ability — what we think of, when it comes to individuals, as “intelligence” — actually exists for groups. (Spoiler: it does.) Malone and his colleagues, fellow MIT researchers Sandy Pentland and Nada Hashmi, Carnegie Mellon’s Anita Williams Woolley, and Union College’s Christopher Chabris, assembled 192 groups — groups of two to five people each, with 699 subjects in all — and assigned to them various cognitive tasks: planning a shopping trip for a shared house, sharing typing assignments in Google Docs, tackling Raven’s Matrices as a group, brainstorming different uses for a brick. (For you social science nerds, the team chose those assignments based on Joe McGrath’s taxonomy of group tasks.) Against the results of those assignments, the researchers compared the results of the participants’ individual intelligence tests, as well as the varying qualities of the group, from the easily quantifiable (participants’ gender) to the less so (participants’ general happiness).

And what they found is telling. “The average intelligence of the people in the group and the maximum intelligence of the people in the group doesn’t predict group intelligence,” Malone said. Which is to say: “Just getting a lot of smart people in a group does not necessarily make a smart group.” Furthermore, the researchers found, group intelligence is also only moderately correlated with qualities you’d think would be pretty crucial when it comes to group dynamics — things like group cohesion, satisfaction, “psychological safety,” and motivation. It’s not just that a happy group or a close-knit group or an enthusiastic group doesn’t necessarily equal a smart group; it’s also that those psychological elements have only some effect on groups’ ability to solve problems together.

So how do you engineer groups that can problem-solve effectively? First of all, seed them with, basically, caring people. Group intelligence is correlated, Malone and his colleagues found, with the average social sensitivity — the openness, and receptiveness, to others — of a group’s constituents. The emotional intelligence of group members, in other words, serves the cognitive intelligence of the group overall. And this means that — wait for it — groups with more women tend to be smarter than groups with more men. (As Malone put it: “More females, more intelligence.”) That’s largely mediated by the researchers’ social sensitivity findings: Women tend to be more socially sensitive than men — per Science! — which means that, overall, more women = more emotional intelligence = more group intelligence.

Which, yay. And it’s easy to see a connection between these findings and the work of journalists — who, whether through crowdsourcing or commentary, are trying to figure out the most productive ways to amplify, and generally benefit from, the wisdom of crowds. News outfits are experimenting not just with inviting group participation in their work, but also with, intriguingly, defining the groups whose participation they invite — the starred commenters, the “brain trust” of readers, etc. Those experiments are based, in turn, on a basic insight: that the “who” of groups matters as much as the “how.” Attention to the makeup of groups on a more granular, person-to-person level may extend the benefits even further.

The group genome

But where Professor Malone’s ideas get especially interesting from the Lab’s perspective is in another aspect of his work: the notion that groups have, in their structural elements, a kind of dynamic DNA. Malone and his colleagues — in this case, Robert Laubacher and Chrysanthos Dellarocas — are essentially trying to map the genome of human collectivity, the underlying structure that determines groups’ outcomes. The researchers break the “genes” of groups down to interactions among four basic (and familiar) categories: what, who, why, and how. Or, put another way: what the project is, who’s working to enact it, why they’re working to enact it, and what methods they’re using to enact it. (So the “genetic structure” of the Linux community, for example, breaks down to the relationship among the what of creating new tools and shaping existing ones; the who of the crowd combined with Linus Torvalds and his lieutenants; the why of love, glory, and, to an extent, financial gain; and the how of both collaboration and hierarchical ordering. The interplay among all those factors determines the community’s outward expression and outcomes.)
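
To make the taxonomy concrete, here is a toy sketch that renders the Linux example above as a small data structure. The field names and the split into two “genes” are one illustrative reading, not the researchers’ actual formalism:

    from dataclasses import dataclass

    @dataclass
    class GroupGene:
        """One "gene" in the what/who/why/how scheme (illustrative only)."""
        what: str  # the task or project
        who: str   # who works to enact it
        why: str   # the incentives driving participation
        how: str   # the coordination method

    # The Linux community, per the example above, as a combination of genes.
    # Splitting it into exactly two genes is a hypothetical reading.
    linux_genome = [
        GroupGene(what="create new tools", who="the crowd",
                  why="love, glory, some financial gain", how="collaboration"),
        GroupGene(what="shape existing tools", who="Torvalds and his lieutenants",
                  why="love and glory", how="hierarchical ordering"),
    ]

    for gene in linux_genome:
        print(f"{gene.what}: {gene.who}, for {gene.why}, via {gene.how}")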

That all seems simple and obvious — because it is — but what makes the approach so interesting and valuable from the future-of-news perspective is, among other things, its disaggregation of project and method and intention. Groups form for all kinds of reasons, but we generally pay little attention to the discrete factors that lead them to form and flourish. Just as understanding humans’ genetic code can lead us to a molecular understanding of ourselves as individuals, mapping the genome of groups may help us understand ourselves as we behave within a broader collective.

And that knowledge, just as with the human genome, might help us gain an ability to manipulate group structures. When it comes to individuals, intelligence is measurable — and, thus, it has a predictive element: A smart kid will most likely become a smart adult, with all the attendant implications. Individual intelligence is fairly constant, and, in that, almost impossible to change. Group intelligence, though, Malone’s findings suggest, can be manipulated — and so, if you understand what makes groups smart, you can adjust their factors to make them even smarter. The age-old question in sociology is whether groups are somehow different, and greater, than the sum of their parts. And the answer, based on Malone’s and other findings, seems to be “yes.” The trick now is figuring out why that’s so, and how the mechanics of the collective may be put to productive use. Measuring group intelligence, in other words, is the first step in increasing group intelligence.

Malone and his colleagues have identified 16 “genes” so far, as expressed in groups like Wikipedia contributors, YouTube uploaders, and eBay auctioneers. “We don’t believe this is the end, by any means, but we think it’s a start,” he said — a way to rethink, and perhaps even revolutionize, the design of groups. Organizational design theory in the 20th century, he noted, generally focused on traditional, hierarchical corporations. But as digital tools give way to new kinds of collectives, “it seems to me,” the professor said, that “it’s time to update organizational design theory for these new organizations.”

Image via ynse used under a Creative Commons license.

October 27 2010

14:00

Metrics, impact, and business plans: Things to watch for as the Knight News Challenge enters a new cycle

In recent years, it’s been something of a parlor game in future-of-journalism circles to speculate about the $25 million Knight News Challenge: Who’s going to win this year? What are the judges looking for, exactly? And, whoa, how on earth did that finalist get passed up? (On that last question, see CoPress in 2009; e.g., read the comments on this post.)

The buzz and chatter are mostly just idle guesswork, and of course it’s all to be expected when serious money (think: $5 million for MIT, $1 million for EveryBlock) is on the line. (Indeed, there’s an extra $1 million on the table this year, thanks to Google’s donation to journalism innovation announced yesterday.)

So, that’s why this year, the fifth installment of the Knight News Challenge, already feels a little different. In years past, the Knight Foundation has approached the News Challenge with a “hey, we’re not the experts — you tell us what’s innovative” kind of attitude, purposefully leaving the door open to just about any submission, assuming that it met certain basic requirements of geographic community focus, open-source software, and so on. With the exception of some tweaking along the way, the general focus of the News Challenge remained the same: to stimulate innovation in the name of making communities better informed. Simple enough.

But this year, even though the KNC’s general pitch remains the same, applicants will make their submissions in one of four categories: Mobile, Authenticity, Sustainability, or Community. Only the Community category requires a place-based geographical focus, which marks a significant break from previous cycles, in which all projects had to be tested in a local community. Overall, the categorization scheme lends some direction — even a certain narrowing — to the contest, and it suggests that Knight has learned a few things over the past four years that it’s going to apply in this final go-round, to get a more focused pool of contenders.

And that’s where this post comes in, on the question of lessons learned. At the risk of contributing more baseless speculation to this parlor game, I’d like to share some insights I gained during the past year as I examined the News Challenge — and the Knight Foundation more generally — for my doctoral dissertation at the University of Texas. (I’m now a journalism professor at the University of Minnesota.)

For starters, you can read the full text of my dissertation (“Journalism Innovation and the Ethic of Participation: A Case Study of the Knight Foundation and its News Challenge”) by going here. If you’re looking for the highlights, skip to page 182 and read the last chapter (Participation and the Professions). Quick tip: This is generally a good way to go when trying to interpret academic articles — look for that “discussion and conclusion” section toward the end.

I described some of my key findings in an earlier Lab post. But with regard to the changes in the KNC for 2011, here are several observations from my time studying the Knight Foundation that might fill in some of the context:

Knight cares intensely about evaluation

This is increasingly true of all nonprofit foundations, really — not just the Knight Foundation. But it was striking to see the extent to which the foundation is working to assess the impact and effectiveness of its funding efforts, through an ongoing “deep review” of its mission and goals. A major part of this review: an examination of the Knight News Challenge after its first three cycles (2007-09). This included a massive content analysis of nearly all proposal documents — resulting in a data set that I analyzed as part of my project (see Chapter 6 of my dissertation) — and interviews, conducted by outside consultants, with many KNC grantees. At one level, there’s the basic assessment of seeing if grantees’ outcomes matched their goals. At another, there is the big question of reach and influence. For nonprofits funding myriad online sites, as Knight does, at least part of that means reviewing web metrics: traffic, unique visitors, etc. All foundations want metrics to justify their investment — and now more than ever.

So, what does this emphasis on evaluation mean for News Challenge applicants this year? Well, it suggests that in a world where user behaviors are easier to track and analyze than ever before, and thus funders of all stripes (for-profit and nonprofit alike) are hungry for good numbers, having a plan for web metrics — for reaching quantifiable and identifiable targets — is probably going to be more important than in previous cycles.

Is this the News Challenge on SEO steroids? Not exactly, but you get the idea. And this gets to the second point, which is…

Is citizen journalism out? Are business models (and the like) in?

There was an interesting quote in recent coverage of KNC changes that got some attention. It was from Jennifer 8. Lee, a Knight consultant and contest reviewer:

We’re not totally into the citizen journalism thing anymore. It has been given its chance to do its thing and kind of didn’t do its thing that well.

Now, Lee was quick to clarify that she was speaking only for herself, and that the KNC is open to citizen media approaches — just not the kind of generic and repetitive pitches that have populated the pool of applicants recently (think: Flip cams for urban youth):

The contest welcomes content or citizen journalism projects. Innovative content or community reporting models can and do get funded…Since innovation is a core value of the contest, traditional content and citizen journalism projects lacking in innovation were generally not looked upon favorably by contest reviewers.

But, nonetheless, this statement is telling because it gets at a key focus of my dissertation: how Knight has dealt with participation in journalism. In my study of the first three years of the News Challenge, I found that the foundation and its KNC winners championed citizen participation in the news process as something that should happen, not merely something that could happen because of new technologies. Participation was portrayed as an ethic of good journalism in the digital age, a foundational piece of journalism innovation.

So, does that square with the notion that “we’re not so into citizen journalism anymore”? Perhaps there’s a better way to think about this: Knight has already funded lots of citizen media projects, and the evidence — based on my interviews with KNC winners and overall analysis — suggests that many of these sites struggled to build and maintain a base of users. On the one hand, that’s perfectly understandable: Some of these projects were meant to be short-term in duration; Knight knew many of them would fail, because that’s the nature of innovation; and, hey, in the attention economy, it’s tough for any content provider these days, right? Yet, on the other hand, this struggle to get attention — from citizen contributors and audiences alike — was a formidable challenge for many of the early KNC projects, and many of those early projects were citizen media sites. As a result, citizen journalism comes off looking like a failure, even if the motivation behind it was well intentioned and is still well regarded in Knight circles.

The lesson here: Going forward, with this ramped-up emphasis on evaluation and impact, and with apparent concerns about citizen journalism’s sustainability, it would seem that Knight wants to see applicants with a clearer path to success, especially in web metrics. Or, perhaps there’s another way to read this: In a media ecosystem awash in sites pushing content — read our blogs! watch our videos! — with less thought about how that content gets subsidized on a regular basis, Knight wants a better business plan. It wants a sustainable model. After all, there’s a reason it hired a director of business consulting.

David Sasaki, of the 2007 KNC winner Rising Voices, might have captured this problem best in this prescient blog post from 2008:

The Knight Foundation is single-handedly making citizen media both more serious and more respected by giving financial support to some of the field’s most innovative thinkers. But is this a sustainable model for the transformation of media? What happens when the News Challenge’s five-year funding period concludes? All of the News Challenge grantee projects are impressive, innovative, and important, but not a single one is turning a profit, nor do they seem poised to any time soon.

What happens to the “news” in News Challenge?

This is a truly intriguing and as-yet-unanswered question going into this final cycle. The five-year funding period Sasaki described is coming to an end. What comes next?

On the one hand, the News Challenge has proved a successful template for Knight’s growing network of prize-philanthropy challenge contests, and it represents the foundation’s most visible link to its historic roots as a “journalism foundation” with close ties to the industry and its concerns. But, as I pointed out previously, Knight is undergoing a shift in emphasis from “news” to “information” as a way of broadening the boundaries of journalism to accomplish innovation with outside help from other fields and philanthropic funders. The most obvious manifestation of this is the Knight Community Information Challenge, which involves partnering with place-based foundations to meet the “information needs” of local communities.

What becomes, then, of the News Challenge? Is there a renewal of some kind — and if so, does it keep the “journalism” tag? Or does the Community Information Challenge suffice in this space? Only time will tell, but the important thing here is to recognize that Knight has an increasingly nuanced view of journalism — one that sidesteps the “baggage” of professional exclusivity and proactively seeks ideas from other fields (say, the tech sector).

David Cohn, whose Spot.Us is one of the best-known KNC success stories, put it recently, in describing startups like Kommons:

As I’ve said before, we may not call it ‘journalism’ in the future, but if it still meets the news and information needs of a community, more power to it.

That, right there, nicely summarizes the feeling of the Knight Foundation: that it cares much more about the ends (i.e., informed communities) than the means (i.e., journalists and traditional news). How that translates into future challenges (or not) is left to be seen.

October 18 2010

15:23

Survey on Innovative Use of Technologies

Dear all,

Nice to meet you all! I've become aware of NetSquared during my research on activism and organizations using innovative communication tools, and Claire encouraged me to link my survey here for all of you to see (thank you!), so...


September 01 2010

14:00

All the web’s a stage: Scholar Joshua Braun on what we show and what we choose to hide in journalism

Joshua Braun is a media scholar currently pursuing his Ph.D. in communications at Cornell. His work centers on the intriguing intersection of television and the web: He’s currently studying the adoption of blogging software by network news sites, and the shifts that adoption is bringing about in the relationship between one-way communication and something more conversational. At this spring’s IOJC conference in Austin, Braun presented a paper (pdf) discussing the results of his research — a work that considered, among other questions:

As journalistic institutions engage more and more fully in interactive online spaces, how are these tensions changing journalism itself? How do the technical systems and moderation strategies put in place shape the contours of the news, and how do these journalistic institutions make sense of these systems and strategies as part of their public mission? What is the role of audiences and publics in this new social and technical space? And how do journalistic institutions balance their claim to be “town criers” and voices for the public with the fact that their authority and continued legal standing depend at times on moderating, and even silencing the voices of individuals?

The whole paper is worth reading. (You can also watch Braun’s IOJC talk here.) But one aspect of it that’s especially fascinating, for our purposes, is Braun’s examination of TV-network news blogs in the context of the sociology of dramaturgy (in particular, the work of Erving Goffman).

News organizations are each a mix of public and private — preparing information for a public audience, but generally doing so in a private way. As with a theater production, there’s a performance going on for the audience but a big crew backstage. Blogging represents a potential shift in this dynamic by exposing people and processes that would otherwise be kept hidden behind a byline or a 90-second news piece.

And the blogging interplay — between presentation and communication, between product and process, and, perhaps most interestingly, between process and performance — is relevant to any news organization trying to navigate familiar journalistic waters with new vessels. I spoke with Braun about that dynamic and the lessons it might have to offer; below is an edited transcript of the conversation.

Megan Garber: I’m intrigued by the idea of theater dynamics you mention in the paper — in particular, the distinction between backstage and front-stage spaces for news performances. Can you explain that in a bit more detail?

Joshua Braun: This is Steve Hilgartner’s idea. He took this idea of stage management from classic sociology, which has normally been an interpersonal theory, and decided it worked for organizations. He looked at the National Academy, and noticed the way in which they keep all their deliberations effectively secret and then release a document at the end that gives the consensus opinion of the scientific community. And there are two aspects of that. One is that it’s intended to protect the integrity of the process. So when you’re a big policy-advisory body like the National Research Council, you have senators who will call you and tell you they don’t want you working on something; you’ll have lobbyists who’ll want to influence your results; you’ll have, basically, a lot of political pressure. So there’s this aspect in which this system of enclosure — in the Goffman/Hilgartner metaphor — this keeping of things backstage, really is meant to protect the integrity of the process.

But it also has the other effect, which is that it also gives the illusion of the scientific community speaking with a single voice. So basically, all the messy process of sausages being made — and all the controversial issues that, by definition, the National Research Council is dealing with — you don’t see reflected in the reports. Or you see it in very official language. So it gives them a tremendous amount of authority, this illusion of the scientific community speaking with one voice, and they cultivate that. I was actually a graduate fellow at the National Academies, and they definitely want that — they recognize that the authority of the documents rests on that.

And many organizations that deal with information and knowledge production, including journalism, operate in this way, frequently. The publication of the finished news item and the enclosure of the reporting process — there’s a very real sense that that protects the authority of the process. So if you’re investigating a popular politician, you need that. And at the same time, it protects the brand and the legal standing and the authority of the organization, and bolsters that. Those things are very reliant on this process of enclosure, oftentimes.

And so what you see in the new media spaces, and these network experiments with blogging, is that sort of process. They’ve taken a medium that they themselves talked about in terms of accountability and transparency and openness and extended it to this traditional stage management process. They continue to control what remains backstage and what goes front-stage. And there are good justifications for doing that. But they’ve also extended that to the process of comment moderation. You’ll get pointed to a description of why comments are moderated the way they are — but you’ll never see exactly why a comment is spammed or not. That’s not unique to the news, either. But it’s an interesting preservation of the way the media’s worked for a long time.

And this has been described by other scholars, as well. So Alfred Hermida has a really neat piece on blogging at the BBC where he talks about much the same thing. He uses different terms — he talks about “gatekeeping,” as opposed to this notion of stage management — but it’s a pretty robust finding across a lot of institutions.

And I don’t want to portray it as something unique to journalism. This process of self-presentation and this performance of authority is widespread — and maybe necessary to journalism. I think the jury’s out on that.

MG: Definitely. Which brings up the question of how authority is expressed across different media. Does broadcast, for example, being what it is, have a different mandate than other types of journalism?

JB: Right. One of the remarkable things about broadcast news is the amount of stage management that you see in the traditional product. So if you look at an organization like ABC News, for instance — before their recent mass layoffs — they have several dozen correspondents: 77 or so people. But they have 1,500 total staff. And when you’re producing for a visual medium, you’re very selective about what appears on front-stage — this mise-en-scène of network news: what appears on camera and what ends up on the cutting-room floor, and so on. The vast majority of their newsgathering operation — the desk assistants and the bookers and the people who do all the pre-interviewing and the off-air correspondents — are people who never appear on-air. No network is its anchor.

So there’s that aspect, in which a large portion of the news ecosystem isn’t visible to the public — and there’s an argument to be made that having a small set of news personalities with whom audiences can identify is good for the product — and there are a lot of organizations where the vast majority of people involved in things don’t really speak. So that was one of the interesting aspects of looking at the blogging efforts of network news: Once that somewhat natural distinction between on-air and off-air talent and support staff disappears, who becomes visible online?

And you do have a lot of producers, a lot of bookers and other types of professionals who appear on the blogs, which is a really fascinating thing. The blogs are an extension of the stage management thing, but also a challenge to that model.

Image from daveynin used under a Creative Commons License.

February 04 2010

17:00

Is online news just ramen noodles? What media economics research can teach us about valuing paid content

The New York Times’ announcement that it would be charging for some access to its website, starting in 2011, rekindled yet another round of debate about paywalls for online news. Beyond the practical question (will it work?) or the theoretical one (what does this mean for the Times’ notion of the “public”?), there remains another question to be untangled here — perhaps one more relevant to the smaller papers that might be thinking of following the Times’ example:

What is the underlying economic value of online news, anyway?

Media economist Iris Chyi [see disclosure below] has a few ideas about this problem. An assistant professor in the School of Journalism at the University of Texas, she has been researching the paid-vs.-free, print-vs.-online conundrum since the late ’90s. Her research has consistently found that even as online news use continues to grow, preference for it lags behind preference for traditional media. In other words: Even as audiences shift from TV and print to the web, they still like the traditional formats better for getting news, all other things being equal.

Now, this seemingly makes no sense: How could a format as clunky, messy, and old-school as print “beat” a faster, richer, and more interactive medium on likability?

Chyi believes she found the answer in the economic principle of “inferior goods.” The idea is simple: When income increases, consumers buy more “normal goods” (think: steak) and fewer “inferior goods” (think: ramen noodles). When income falls, the opposite occurs (again, all other things being equal). Inferiority, in this case, is less a statement of actual quality than of consumer perception and demand: if we get richer, our desire for steak goes up and our desire for ramen goes down.
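In economic terms, the normal/inferior distinction is usually pinned down by income elasticity of demand: the percentage change in quantity demanded divided by the percentage change in income, positive for normal goods and negative for inferior ones. Here is a minimal sketch with made-up quantities and incomes (nothing in it comes from Chyi’s data):

```python
def income_elasticity(q1, q2, i1, i2):
    """Arc income elasticity: % change in quantity / % change in income,
    with both changes measured against the midpoint."""
    pct_q = (q2 - q1) / ((q1 + q2) / 2)
    pct_i = (i2 - i1) / ((i1 + i2) / 2)
    return pct_q / pct_i

# Hypothetical numbers: income rises from $40k to $50k a year.
steak = income_elasticity(q1=2, q2=3, i1=40_000, i2=50_000)   # bought more
ramen = income_elasticity(q1=10, q2=7, i1=40_000, i2=50_000)  # bought less

for name, e in [("steak", steak), ("ramen", ramen)]:
    kind = "normal good" if e > 0 else "inferior good"
    print(f"{name}: elasticity {e:+.2f} -> {kind}")
```

By this measure, the claim is simply that online news behaves like the ramen row: demand falls as income rises.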

What does this mean for journalism? “Users perceive online news in similar ways — online news fulfills certain needs but is not perceived as desirable as print newspapers,” Chyi said.

She and co-author Mengchieh Jacie Yang make this point through an analysis of data on news consumption gathered from a random sample of U.S. adults; their findings are published in the latest issue of Journalism & Mass Communication Quarterly, the flagship peer-reviewed journal for AEJMC. (See the related news release, overall highlights, and the full-text PDF). Chyi and Yang summarize their key findings as follows:

This analysis, based on data collected by the Pew Research Center in 2004, identified a negative relationship between income and online news consumption: When income increases, online news use decreases; when income decreases, online news use increases, other things (demographics, news interest, and/or other news media use) being equal — suggesting that online news is an inferior good among users. In contrast, the print newspaper is a normal good.

Such findings, at first glance, may surprise media scholars as well as online news professionals. After all, in communication research, no news products have been labeled as inferior goods before. In addition, major U.S. media companies have invested heavily in their online ventures, offering an array of interactive features and multimedia content — most of which are unattainable by print newspapers. It is therefore difficult to understand why online news could be an inferior good. Yet, from an economic perspective, “goods are what are thought of as goods.” Any product’s economic nature is determined by consumer perception and response. Based on this particular data set, which consists of survey responses collected from a national sample of online news users by a major polling institution in 2004, online news is an inferior good among users.
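The test behind that conclusion is essentially a regression of online news use on income with demographic controls. Here is a hedged sketch of that kind of analysis, run on synthetic data rather than the actual Pew 2004 file; the variable names, the fabricated negative income effect, and the model specification are illustrative assumptions, not Chyi and Yang’s actual code or codebook:

```python
# Illustrative only: synthetic data built to show what an "inferior good"
# signature looks like in a regression, NOT a reproduction of the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1_000
df = pd.DataFrame({
    "income": rng.normal(50_000, 15_000, n),  # hypothetical household income
    "age": rng.integers(18, 80, n),           # demographic control
    "news_interest": rng.uniform(0, 1, n),    # control for news interest
})
# Fabricate a small negative income effect so the example exhibits the
# pattern the paper reports (income up -> online news use down).
df["online_news_days"] = (
    3 + 2 * df["news_interest"] - 2e-5 * df["income"] + rng.normal(0, 1, n)
).clip(0, 7)  # days per week of online news use

model = smf.ols(
    "online_news_days ~ income + age + news_interest", data=df
).fit()
print(model.params["income"])  # negative coefficient = inferior-good signature
```

A negative, significant coefficient on income with the controls held fixed is what marks online news as an inferior good; a positive one, as the authors find for print, marks a normal good.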

Clearly, the use of 2004 data is a limiting factor here (although the authors explain why more recent Pew surveys couldn’t be used for this kind of question). Yet, if we accept these findings, we’re left to unravel two mysteries: Why is online news perceived as an inferior good in the first place? And what should that mean for the future of web journalism?

On the first question, there are at least several possibilities, as Chyi suggests. Maybe the computer screen just isn’t an enjoyable reading device. (And how might that compare with smartphones and e-readers?) Or maybe online newspapers still have content/design problems — think of all the ads for teeth whitening and tummy tightening, not to mention the general lack of contextual cues afforded by print. Or maybe it’s simply because online news is free — and, as behavioral economics research has indicated, consumers sometimes perceive higher-priced products as more enjoyable. In any case, as Chyi puts it: “More research, as opposed to guesswork or wishful thinking, on the perception of news products is essential.”

Then there’s the second question: What does this suggest about the future of online news? Perhaps nothing too dire: people still pay for ramen noodles when it suits them — when the price, convenience, or alternatives make them the preferred choice. This isn’t to say that consumers will invariably pay for online news, but rather that they might if the perception calculation is right.

The key here is to recognize that consumers are rapidly adopting online news not necessarily because they prefer the medium to print, but because online news is “good enough” — cheap, convenient, flexible, and sufficient to satiate our information cravings. (This takes us into territory related to disruptive innovations and fidelity vs. convenience — interesting stuff, but something for a later post.) But the danger is in taking a “platform-neutral” approach if that leads one to assume that content value remains constant between print and online — that, basically, you can charge for content either way. Chyi suggests that is like trying to market ramen noodles as steak: Newspapers do so at their peril.

So, what does all of this say about the Times and its paywall? Perhaps not much because, after all, “the Times is the Times.” Yet the notion of online news as an inferior good highlights a few salient points: (1) news usage doesn’t always correlate with preference, counterintuitive as that is; (2) publishers hoping to charge for niche content need to understand where their offering sits on the normal-to-inferior spectrum, and how that should shape pricing and marketing strategies; and (3) there’s a critical need for R&D to help us grasp why consumers perceive online news as inferior, and how that perception might vary across demographics and types of news content.

In the meantime, enjoy your ramen noodles.

[Disclosure: Chyi and I have collaborated on several research projects through her Media Economics Research Group in the School of Journalism at the University of Texas — including a recent peer-reviewed article on newspapers' effectiveness in penetrating the local online market (PDF). Also, she's currently a member of my dissertation committee.]

Photo of ramen by Broderick used under a Creative Commons license.
