Tumblelog by Soup.io

January 05 2012

19:30

Hacking consensus: How we can build better arguments online

In a recent New York Times column, Paul Krugman argued that we should impose a tax on financial transactions, citing the need to reduce budget deficits, the dubious value of much financial trading, and the literature on economic growth. So should we? Assuming for a moment that you’re not deeply versed in financial economics, on what basis can you evaluate this argument? You can ask yourself whether you trust Krugman. Perhaps you can call to mind other articles you’ve seen that mentioned the need to cut the deficit or questioned the value of Wall Street trading. But without independent knowledge — and with no external links — evaluating the strength of Krugman’s argument is quite difficult.

It doesn’t have to be. The Internet makes it possible for readers to research what they read more easily than ever before, provided they have both the time and the ability to filter reliable sources from unreliable ones. But why not make it even easier for them? By re-imagining the way arguments are presented, journalism can provide content that is dramatically more useful than the standard op-ed, or even than the various “debate” formats employed at places like the Times or The Economist.

To do so, publishers should experiment in three directions: acknowledging the structure of the argument in the presentation of the content; aggregating evidence for and against each claim; and providing a credible assessment of each claim’s reliability. If all this sounds elaborate, bear in mind that each of these steps is already being taken by a variety of entrepreneurial organizations and individuals.

Defining an argument

We’re all familiar with arguments, both in media and in everyday life. But it’s worth briefly reviewing what an argument actually is, as doing so can inform how we might better structure arguments online. “The basic purpose of offering an argument is to give a reason (or more than one) to support a claim that is subject to doubt, and thereby remove that doubt,” writes Douglas Walton in his book Fundamentals of Critical Argumentation. “An argument is made up of statements called premises and a conclusion. The premises give a reason (or reasons) to support the conclusion.”

So an argument can be broken up into discrete claims, unified by a structure that ties them together. But our typical conceptions of online content ignore all that. Why not design content to more easily assess each claim in an argument individually? UI designer Bret Victor is working on doing just that through a series of experiments he collectively calls “Explorable Explanations.”

Writes Victor:

A typical reading tool, such as a book or website, displays the author’s argument, and nothing else. The reader’s line of thought remains internal and invisible, vague and speculative. We form questions, but can’t answer them. We consider alternatives, but can’t explore them. We question assumptions, but can’t verify them. And so, in the end, we blindly trust, or blindly don’t, and we miss the deep understanding that comes from dialogue and exploration.

The alternative is what he calls a “reactive document” that imposes some structure onto content so that the reader can “play with the premise and assumptions of various claims, and see the consequences update immediately.”

Although Victor’s first prototype, Ten Brighter Ideas, is a list of recommendations rather than a formal argument, it gives a feel of how such a document could work. But the specific look, feel and design of his example aren’t important. The point is simply that breaking up the contents of an argument beyond the level of just a post or column makes it possible for authors, editors or the community to deeply analyze each claim individually, while not losing sight of its place in the argument’s structure.

Show me the evidence (and the conversation)

Victor’s prototype suggests a more interesting way to structure and display arguments by breaking them up into individual claims, but it doesn’t tell us anything about what sort of content should be displayed alongside each claim. To start with, each claim could be accompanied by relevant links that help the reader make sense of that claim, either by providing evidence, counterpoints, context, or even just a sense of who does and does not agree.


At multiple points in his column, Krugman references “the evidence,” presumably referring to parts of the economics literature that support his argument. But what is the evidence? Why can’t it be cited alongside the column? And, while we’re at it, why not link to countervailing evidence as well? For an idea of how this might work, it’s helpful to look at a crowd-sourced fact-checking experiment run by the nonprofit NewsTrust. The “TruthSquad” pilot has ended, but the content is still online. One thing that NewsTrust recognized was that rather than just being useful for comment or opinion, the crowd can be a powerful tool for sourcing claims. For each fact that TruthSquad assessed, readers were invited to submit relevant links and mark them as For, Against, or Neutral.
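The For/Against/Neutral mechanic is simple enough to sketch. Here is a minimal illustrative model of TruthSquad-style link sourcing (the class and names are my own invention, not NewsTrust's actual code):

```python
from collections import Counter

# Hypothetical model of TruthSquad-style crowd sourcing: readers attach
# links to a claim and mark each one "for", "against", or "neutral".
class Claim:
    def __init__(self, text):
        self.text = text
        self.links = []  # list of (url, stance) pairs

    def add_link(self, url, stance):
        if stance not in ("for", "against", "neutral"):
            raise ValueError("stance must be 'for', 'against', or 'neutral'")
        self.links.append((url, stance))

    def tally(self):
        # Summarize the crowd's sourcing as stance counts.
        return Counter(stance for _, stance in self.links)

claim = Claim("A financial transactions tax would reduce the deficit.")
claim.add_link("https://example.org/study-a", "for")
claim.add_link("https://example.org/critique-b", "against")
claim.add_link("https://example.org/background", "neutral")
claim.add_link("https://example.org/study-c", "for")
print(claim.tally())  # Counter({'for': 2, 'against': 1, 'neutral': 1})
```

Even a structure this simple gives the reader something the plain op-ed lacks: a per-claim view of where the evidence leans.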

The links that the crowd identified in the NewsTrust experiment went beyond direct evidence, and that’s fine. It’s also interesting for the reader to see what other writers are saying, who agrees, who disagrees, etc. The point is that a curated or crowd-sourced collection of links directly relevant to a specific claim can help a reader interested in learning more to save time. And allowing space for links both for and against an assertion is much more interesting than just having the author include a single link in support of his or her claim.

Community efforts to aggregate relevant links along the lines of the TruthSquad could easily be supplemented both by editor-curators (which NewsTrust relied on) and by algorithms which, if not yet good enough to do the job on their own, can at least lessen the effort required by readers and editors. The nonprofit ProPublica is also experimenting with a more limited but promising effort to source claims in their stories. (To get a sense of the usefulness of good evidence aggregation on a really thorny topic, try this post collecting studies of the stimulus bill’s impact on the economy.)

Truth, reliability, and acceptance

While curating relevant links allows the reader to get a sense of the debate around a claim and makes it easier for him or her to learn more, making sense of evidence still takes considerable time. What if a brief assessment of the claim’s truth, reliability or acceptance were included as well? This piece is arguably the hardest of those I have described. In particular, it would require editors to abandon the view from nowhere to publish a judgment about complicated statements well beyond traditional fact-checking. And yet doing so would provide huge value to the reader and could be accomplished in a number of ways.

Imagine that as you read Krugman’s column, each claim he makes is highlighted in a shade between green and red to communicate its truth or reliability. This sort of user interface is part of the idea behind “Truth Goggles,” a master’s project by Dan Shultz, an MIT Media Lab student and Mozilla-Knight Fellow. Shultz proposes to use an algorithm to check articles against a database of claims that have previously been fact-checked by PolitiFact. Shultz’s layer would highlight a claim and offer an assessment (perhaps by shading the text) based on the work of the fact-checkers.
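The green-to-red shading amounts to mapping a truth score onto a color. A toy sketch follows; the 0-to-1 score scale and the linear blend are my assumptions for illustration, not Truth Goggles’ actual scheme:

```python
def score_to_color(score: float) -> str:
    """Map a claim's truth score in [0, 1] to a hex color.

    0.0 -> pure red (unreliable), 1.0 -> pure green (reliable);
    intermediate scores blend linearly between the two.
    """
    score = max(0.0, min(1.0, score))  # clamp out-of-range input
    red = int(255 * (1 - score))
    green = int(255 * score)
    return f"#{red:02x}{green:02x}00"

print(score_to_color(0.0))  # "#ff0000" -- flagged as false
print(score_to_color(1.0))  # "#00ff00" -- checks out
print(score_to_color(0.5))  # "#7f7f00" -- murky; read with care
```

The point of the color channel is speed: the assessment arrives with the sentence, before the reader has decided whether to trust it.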

The beauty of using color is the speed and ease with which the reader is able to absorb an assessment of what he or she is reading. The verdict on the statement’s truthfulness is seamlessly integrated into the original content. As Shultz describes the central problem:

The basic premise is that we, as readers, are inherently lazy… It’s hard to blame us. Just look at the amount of information flying around every which way. Who has time to think carefully about everything?

Still, the number of statements that PolitiFact has checked is relatively small, and what I’m describing requires the evaluation of messy empirical claims that stretch the limits of traditional fact-checking. So how might a publication arrive at such an assessment? In any number of ways. For starters, there’s good, old-fashioned editorial judgment. Journalists can provide assessments, so long as they resist the view from nowhere. (Since we’re rethinking the opinion pages here, why not task the editorial board with such a role?)

Publications could also rely on other experts. Rather than asking six experts to contribute to a “Room for Debate”-style forum, why not ask one to write a lead argument and the others not merely to “respond,” but to directly assess the lead author’s claims? Universities may be uniquely positioned to help in this, as some are already experimenting with polling their own experts on questions of public interest. Or what if a Quora-like commenting mechanism were included for each claim, as Dave Winer has suggested, so that readers could offer assessments, with the best ones rising to the top?

Ultimately, how to assess a claim is a process question, and a difficult one. But numerous relevant experiments exist in other formats. One new effort, Hypothes.is, is aiming to add a layer of peer review to the web, reliant in part on experts. While the project is in its early stages, its founder Dan Whaley is thinking hard about many of these same questions.

Better arguments

What I’ve described so far may seem elaborate or resource-intensive. Few publications these days have the staff and the time to experiment in these directions. But my contention is that the kind of content I am describing would be of dramatically higher value to the reader than the content currently available. And while Victor’s UI points towards a more aggressive restructuring of content, much could be done with existing tools. By breaking up an argument into discrete claims, curating evidence and relevant links, and providing credible assessments of those claims, publishers would equip readers to form opinions on merit and evidence rather than merely on trust, intuition, or bias. Aggregation sites like The Atlantic Wire may be especially well-positioned to experiment in this direction.

I have avoided a number of issues in this explanation. Notably, I have neglected to discuss counter-arguments (which I believe could be easily integrated) and haven’t discussed the tension between empirical claims and value claims (I have assumed a focus on the former). And I’ve ignored the tricky psychology surrounding bias and belief formation. Furthermore, some might cite the recent PolitiFact Lie of the Year controversy as evidence that this sort of journalism is too difficult. In my mind, that incident further illustrates the need for credible, honest referees.


Returning once more to Krugman’s argument, imagine the color of the text signaling whether his claims about financial transactions and economic growth are widely accepted. Or mousing over his point about reducing deficits to quickly see links providing background on the issue. What if it turned out that all of Krugman’s premises were assessed as compelling, but his conclusion was not? It would then be obvious that something was missing. Perhaps more interestingly, what if his conclusion was rated compelling but his claims were weak? Might he be trying to convince you of his case using popular arguments that don’t hold up, rather than the actual merits of the case? All of this would finally be apparent in such a setup.

In rethinking how we structure and assess arguments online, I’ve undoubtedly raised more questions than I’ve answered. But hopefully I’ve convinced you that better presentation of arguments online is at least possible. Not only that, but numerous hackers, designers, and journalists — and many who blur the lines between those roles — are embarking on experiments to challenge how we think about content, argument, truth, and credibility. It is in their work that the answers will be found.

Image by rhys_kiwi used under a Creative Commons license.

July 27 2011

10:50

Just in time for 2012 elections: NewsTrust dives into the fact-check business with expanded Truthsquad

Niemanlab :: Just in time for the 2012 elections, the cottage industry of media fact-checking is ramping up. The latest addition is Truthsquad, which began last year as a pilot project of NewsTrust. Truthsquad will differentiate itself from its peers by bringing in the crowd, combining the talents of professional journalists with the eagerness (if not competitiveness) of the public to separate fact from less-than-fact. As the Truthsquad homepage puts it, they’re “developing a pro-am network to fact-check political claims during the 2012 elections.”

Continue reading Justin Ellis at www.niemanlab.org

February 09 2011

15:00

NewsTrust Baltimore takes a local approach to media literacy and showcasing new journalism

NewsTrust sees its mission as helping readers find “good journalism” by giving people the tools to separate good from bad. But when it comes to journalism, good and bad aren’t exactly universal truths anymore. Is a story good if it adheres to facts but lacks strong writing? Is a story bad if it’s on a blog, regardless of how it’s reported? And what if it’s told through an ideological or political lens different from your own?

While NewsTrust has previously employed its tools for vetting journalism on a national level, their newest test, NewsTrust Baltimore, takes things to a smaller scale — namely one where readers’ connection to news is based on geography (will a new school be built? is the police department cutting staff? did the legislature cut taxes?) and necessity.

That familiarity, with both the news and outlets reporting it, could make for a better experiment in media criticism as well as media literacy. Who better to judge the Baltimore Sun or WYPR than the people who live in the area?

When I spoke to Fabrice Florin, executive director of NewsTrust, he said their two-month Baltimore project makes sense because of the upheaval in local journalism, as staffs and resources have shrunk at traditional media outlets in the region. At the same time new blogs and alternative media, some created by former journalists, are cropping up.

“The news ecosystem in Baltimore is fascinating. It’s extremely diverse right now,” he told me.

That’s reflected in NewsTrust Baltimore’s partners, ranging from The Sun and WYPR to Baltimore Brew, Urbanite Magazine and local Patch sites. Stories from these sites are aggregated on NewsTrust Baltimore, where local reviewers can rate them. (NewsTrust also has a widget, used by a number of partners, that allows readers to review stories without leaving their site.) The rating tools let reviewers decide whether stories are factual, fair, well sourced, and well written, and whether they provide context. Beyond that, the site also asks whether reviewers trust the publication and would recommend the story. A glance at a recent review shows promising signs:

This would seem to be an in-depth investigative piece, presenting multiple points of view. I can cite no part of it that I know for certain is erroneous or slanted. Therefore I must cautiously assume it to be an informative article. However, I have past experiences with this publication that cause me to bring a skeptical eye to it.

Florin said the value to news organizations is a balanced, structured system to offer feedback. (Contrast that review with what you might see in the comment section at the end of stories.) “Going through the review process the participant is forced to explicitly give criticism,” he said. “The rating system is based on journalist qualities, and when they click on rating buttons they’re giving actionable feedback to the journalist.”
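The structured feedback Florin describes can be sketched as a small aggregation step: each reviewer scores a story on a few journalistic qualities, and those scores are averaged into per-criterion and overall ratings. The criterion names follow the article; the 1-to-5 scale and the plain averaging are my own illustrative choices, not NewsTrust's actual algorithm:

```python
# Sketch of a NewsTrust-style structured review aggregator.
CRITERIA = ("factual", "fair", "well_sourced", "well_written", "context")

def aggregate_reviews(reviews):
    """Average each criterion across reviewers, plus an overall score."""
    per_criterion = {
        c: sum(r[c] for r in reviews) / len(reviews) for c in CRITERIA
    }
    overall = sum(per_criterion.values()) / len(CRITERIA)
    return per_criterion, overall

reviews = [
    {"factual": 4, "fair": 3, "well_sourced": 5, "well_written": 4, "context": 3},
    {"factual": 5, "fair": 4, "well_sourced": 4, "well_written": 3, "context": 4},
]
scores, overall = aggregate_reviews(reviews)
print(scores["factual"])  # 4.5
print(overall)            # 3.9
```

Because each criterion is scored separately, a journalist can see not just that a story rated poorly but on which quality, which is what makes the feedback actionable.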

Of course, a smart series of buttons does not automatically make one a media critic. Florin said they offer helpful explainers on what “fair” or “factual” mean for a story. Additionally, they’re putting together a library of guides on media literacy and the basics of thinking like a journalist — although that too can be contentious turf.

The goal is a better-informed citizenry, not readers who think like journalists. That means trying to foster a broader news appetite, exposing readers to the wider variety of media outside of traditional news sources. “There are people who are doing good journalism on the fringes and not necessarily getting the recognition they deserve,” Florin said. “This is where NewsTrust shines.”

They’ve created a formidable regional news aggregator, one that is headed up by editors and a community manager, which makes NewsTrust Baltimore something of a news hub for the region. “In a different world, if we were a for-profit, we could offer a very credible news consumer destination,” Florin said. “We’re really proud of our feed. We really do aggregate the best journalism in Baltimore.” In that sense, NewsTrust Baltimore is more than just an experiment in media literacy or a response to shrinking news sources. By presenting a menu of local news sources, NewsTrust Baltimore is encouraging people to sample broadly and rate their server.

Food analogies aside, NewsTrust could potentially set up franchises (sorry) around the country, with NewsTrust sites for communities with an abundance of new and traditional news sources. Though the Baltimore project is expected to run two months, the NewsTrust team did apply for a Knight News Challenge grant to continue their work and develop funding models to make the project sustainable.

“We’re careful, we don’t want to disrupt the ecosystem. We want to add value to it,” Florin said. “We don’t want to replace the people who are there.”

December 01 2010

17:50

'Report an Error' Button Should Be Standard on News Sites

The web is a two-way medium. But when it comes to reporting errors on news sites, too often, it might as well be broadcast or print.

It's time to change that. That's why, yesterday, we announced the launch of the Report an Error Alliance -- an ad hoc coalition of news organizations and individuals who believe that every news page on the web ought to have a clearly labeled button for reporting errors.

Today's articles come with their own array of buttons for sharing -- and print and email and so on. We believe that opening a channel for readers to report errors is at least as important as any of those functions.

We aim to make the "report an error" button a new web standard. Toward that end, we're releasing a set of icons that anyone can use for this purpose. It's up to each publisher what to do with them -- link them to a form or an email address, use a dedicated error-reporting service like MediaBugs, or choose any other option that suits your needs. What's important is that the button be handy, right by the story, not buried deep in a sea of footer links or three layers down a page hierarchy.

We've got a handful of forward-thinking web news outfits signed on already -- including the Toronto Star, TBD.com, Salon.com, Poynter.org, and NewsTrust.net. We hope to see this roster grow. We also encourage individuals to add their names to our alliance as an indication of your support for this new standard.

Kathy English, public editor at the Toronto Star, which already has its own "report an error" button, said, "I'm pleased that the Star is a founding member of this important initiative to help assure greater accuracy in digital journalism. The Star has long encouraged readers to report errors for correction, in print and online, where the 'Report an Error' function in effect turns every reader into a fact checker. This is a strong step forward in establishing industry best practices for online accuracy and corrections."

Not a Magic Solution

Report an Error is intended to be a focused effort toward a simple goal. Too many news sites still make it hard for you to tell them they made a mistake. Such reports get buried in voice-mail boxes and lost in flame-infested comment threads. Yet journalists still need to hear them, and readers deserve to know that they've been heard.

Implementing a "report an error" button isn't by itself a magic solution to the problem of accuracy and the erosion of confidence in the media. But it's a good start at repairing the growing rift between the press and the public. It's like putting a badge on everything you publish that says, "If you see a problem, we really want to know about it!"

So visit our Report an Error site, join the Alliance yourself, and grab some of our icons to use on your news pages and posts.

The Report an Error Alliance project is a collaboration between Craig Silverman of Regret The Error (and managing editor of MediaShift and Idea Lab) and myself. Though it grows out of my work on MediaBugs, it's a separate effort, intended to distill the simplest, easiest, and most important step in this area that every news website can take.

November 16 2010

19:40

Crowdsourced Fact-Checking? What We Learned from Truthsquad

In June, U.S. Senator Orrin Hatch made the statement that "87 million Americans will be forced out of their coverage" by President Obama's health care plan.

It was quite a claim. But was it true?

That's a common, and important, question -- and it can often be hard to quickly nail down the real facts in the information-overloaded world we live in. Professional fact-checking organizations like PolitiFact and FactCheck.org have taken up the charge to verify the claims of politicians, pundits and newsmakers, and they provide a great service to the public. But I believe there's also a role for the average person in the fact-checking process. By actively researching and verifying what we hear in the news, we can become more informed citizens, and more discriminating news consumers. These are essential skills for all of us to develop.

With that in mind, we at NewsTrust, a non-profit social news network, have been working on Truthsquad, a community fact-checking service that helps people verify the news online, with professional guidance.

Our first pilot for Truthsquad took place in August 2010, with the help of our partners at the Poynter Institute, our advisors at FactCheck.org and our funders at Omidyar Network. That pilot was well received by our community, partners and advisors, as noted in our first report, and by third-party observers such as GigaOm. We've since hosted a variety of weekly Truthsquads, and are starting a second pilot with MediaBugs.org and RegretTheError.com to identify and correct errors in the news media. (Disclosure: MediaShift managing editor Craig Silverman runs RegretTheError.com.)

Our first test project was by our standards a success; more importantly, it revealed several important lessons about the best ways to manage crowdsourced fact-checking, and about why people participate in this activity. Here are our key takeaways from this first pilot, which I'll elaborate on below:

  • A game-like experience makes fact-checking more engaging.
  • A professional-amateur (pro-am) collaboration delivers reliable results and a civil conversation.
  • Crowd contributions are limited, requiring editorial oversight and better rewards.
  • Community fact-checking fills a gap between traditional journalism and social media.

What is Truthsquad?

Truthsquad.com features controversial quotes from politicians or pundits and asks the community whether they think they are true or false. Community members are welcome to make a first guess, then check our answers and research links to see if they are correct. They can change their answer anytime, as they come across new facts.

To help participants find the right answer, we invite them to review and/or post links to factual evidence that supports or opposes each statement. A professional journalist leads each "truthsquad" to guide participants in this interactive quest. This "squad leader" then writes an in-depth verdict based on our collaborative research. That verdict is emailed to all participants, with a request for comments. (It can be revised as needed.)

Finding #1: Game-Like Experience Makes Fact-Checking Engaging

We noted a significant increase in community participation for Truthsquad compared to other NewsTrust services we tested this year. Some data from our first pilot:

  • This pilot generated twice as much participation as other 2010 pilots.
  • Users gave ten times more answers per quote than reviews per story on our site.
  • Over half of the participants read linked stories, and a third answered a Truthsquad quote.
  • One in six participants reviewed the stories linked as factual evidence.

We think this high level of engagement is partly due to the game-like quality of our user experience, which starts by inviting people to guess whether a statement is true or false -- an easy task that anyone can do in under a minute.

After their first guess, people are more likely to participate as researchers, because their curiosity has been piqued and they want to know the answer. As a result, participants often take the time to review linked stories and post more evidence on their own. Without realizing it, they are becoming fact-checkers.

Finding #2: Pro-Am Collaboration Delivers Reliable Results

We decided early on that professionals needed to guide this collaborative investigation. We wanted to avoid some of the pitfalls of pure crowdsourcing initiatives, which can turn into mob scenes -- particularly around politically charged issues. At the start of this experiment, we asked experienced journalists at FactCheck.org and the Poynter Institute to coach us and our community and help write and edit some of our first verdicts.

We think the pro-am approach paid off in a number of ways:

  • Amateurs learned valuable fact-checking skills by interacting with professionals.
  • A few community members posted links that were critical to reaching our verdicts.
  • Answers from our community generally matched final verdicts from our editors.
  • We came to the same conclusions as FactCheck.org in side-by-side blind tests.
  • Comments from participants were generally civil and focused on facts.


The results of our first pilot led our advisor Brooks Jackson, director at FactCheck.org, to comment, "So far I would say the experiment is off to a solid start. The verdicts of the Truthsquad editors seem to me to be reasonable and based on good research."

This collaboration between journalists and citizens made us all more productive. The professionals shared helpful information-gathering tips, and the citizens extended that expertise on a larger scale, with multiple checks and balances between our community and our editors. Our editors spearheaded this investigation, but the community made important contributions through comments and links to factual evidence (some of which were invaluable). On a couple occasions, we even revised our verdicts based on new evidence from our community. This focus on facts also helped set the tone for our conversations, which were generally civil and informative.

Finding #3: Crowd Contributions Are Limited, Requiring Better Rewards

Despite high levels of participation, we didn't get as many useful links and reviews from our community as we had hoped. Our editorial team did much of the hard work to research factual evidence. (Two-thirds of story reviews and most links were posted by our staff.) Each quote represented up to two days of work from our editors, from start to finish. So this project turned out to be more labor-intensive than we thought, and a daily fact-checking service will require a dedicated editorial team to guarantee reliable results.

Managing our community and responding thoughtfully to their posts also takes additional time, and is an important part of this process. In future releases, we would like to provide more coaching and educational services, as well as better rewards for our contributors.

Training citizens to separate fact from spin is perhaps the greatest benefit of our initiative, but keeping them engaged will require ingenuity and tender loving care on our part.

"It seems based on this pilot that citizens can learn fact-checking skills quite easily," said Kelly McBride of the Poynter Institute. "The challenge is to motivate them to do this occasionally."

To address this issue, future versions of Truthsquad could reward members who take the time to fact-check the news in order to get them to do it more often. We would like to give them extra points for reading, reviewing or posting stories, as well as special badges, redeemable credits and/or prizes. We can also feature high scores on leaderboards, and give monthly awards to the most deserving contributors.

Finding #4: Community Fact-Checking Fills a Need

Every survey we have done in recent years has consistently shown fact-checking as a top priority for our community, and this was confirmed by the results of this pilot.

Here are some key observations from our recent survey about NewsTrust's 2010 pilots:

  • A majority of survey respondents (61 percent) found Truthsquad useful or very useful.
  • Half of survey respondents wanted a new quote every day -- or several quotes a day.
  • Half of survey respondents said they could fact-check quotes several times per week.
  • One in seven survey respondents were prepared to donate for this service.


We think the generally favorable response to Truthsquad is due to two factors: a growing demand for fact-checking services, combined with a desire to contribute to this civic process. Fact-checking is still the best way to verify the accuracy of what people hear in the news, and it is perceived as an effective remedy to expose politicians or pundits who propagate misinformation.

At the same time, the explosion of social media makes people more likely to participate in investigations like these. They want this civil watchdog network, and expect to have a voice in it.

Next steps

Based on the lessons from this experiment, we would like to offer Truthsquad on an ongoing basis, with a goal to fact-check one quote a day, year-round -- as well as to feature the work of other trusted research organizations on Truthsquad.com.

We also want to let members post their own quotes for fact-checking and reward them for their contributions through both a game-like interface and more educational benefits. We have an opportunity to track the expertise of participants based on their answers, which could allow us to measure their progress with core news literacy skills, as well as their overall understanding of important public issues and the overall impact of our service.

Over time, we hope to provide more training and certification services, to build lasting research skills that could help students and adults alike play a more active role in our growing information economy. If this appeals to you, we invite you to sign up here and join our experiment.

As for that Orrin Hatch quote? In the end, 163 participants helped us fact-check his statement about health care. Our final verdict was that the Senator's claim was false. That finding was based on factual evidence provided by one of our NewsTrust members, who dug up the right set of regulations, and pointed out they had been misstated by Hatch. Our editor's verdict was confirmed by similar findings from FactCheck.org, and also matched our community's consensus: 138 participants answered that this statement was false (versus 11 who thought it was true).

More importantly, we as a community learned how to separate fact from fiction -- and became more engaged as citizens.

Fabrice Florin is executive director and founder of NewsTrust, where he manages creative and business development for this next-generation social news network. NewsTrust helps people find and share good journalism online, so they can make more informed decisions as citizens. With a 30-year track record in new media and technology, Fabrice has developed a wide range of leading-edge entertainment, education and software products. Fabrice's previous wireless venture, Handtap, was a leading provider of multimedia content for mobile phones. Fabrice was recently elected an Ashoka Fellow for his work as a social entrepreneur in journalism.


November 09 2010

15:00

Loose ties vs. strong: Pinyadda’s platform finds that shared interests trump friendships in “social news”

There isn’t a silver bullet for monetizing digital news, but if there were, it would likely involve centralization: the creation of a single space where the frenzied aspects of our online lives — information sharing, social networking, exploration, recommendation — live together in one conveniently streamlined platform. A Boston-based startup called Pinyadda wants to be that space: to make news a pivotal element of social interaction, and vice versa. Think Facebook. Meets Twitter. Meets Foursquare. Meets Tumblr. Meets Digg.

Owned by Streetwise Media, which also owns the Boston startup hub BostInnovation, Pinyadda launched last year with plans to be a central, social spot for gathering, customizing, and sharing news and information. The idea, at first, was to be an “ideal system of news” that would serve users in three ways:

1. it should gather information from the sites and blogs they read regularly;

2. it should mimic the experience of receiving links and comments from the people in their personal networks; and

3. it should be continually searching for information about subjects they were interested in. This pool of content could then be ranked and presented to users in a consistent, easily browsed stream.
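The three-part pooling-and-ranking idea above can be sketched in a few lines. This is an illustrative toy, not Pinyadda's actual implementation; the source labels and weights are assumptions:

```python
# Hypothetical weights for the three kinds of signals: sites the user
# subscribes to, links shared by people in their network, and items
# matched to their stated interests.
WEIGHTS = {"subscribed": 1.0, "friend": 1.5, "interest": 1.2}

def build_stream(items, limit=10):
    """Merge items from all three sources into one ranked stream,
    highest weighted score first. Each item is a dict with a
    "base_score" (relevance) and a "source" label from WEIGHTS."""
    ranked = sorted(
        items,
        key=lambda it: it["base_score"] * WEIGHTS[it["source"]],
        reverse=True,
    )
    return ranked[:limit]
```

The point of the sketch is simply that once all three feeds are scored on a common scale, presenting them as "a consistent, easily browsed stream" is just a merge and sort.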

Again, centralization. And a particular kind of centralization: a socialized version. Information doesn’t simply want to be free, the thinking went; it also wants to be social. The initial idea for Pinyadda was that leveraging the social side of the news — making it easy to share with friends; facilitating conversations with them — would also be a way to leverage the value of news. Which ties into the conventional wisdom about the distributive power of social news. In her recent NYRB review of The Social Network, Zadie Smith articulates that wisdom when it comes to Facebook’s Open Graph — a feature, she wrote, that “allows you to see everything your friends are reading, watching, eating, so that you might read and watch and eat as they do.”

What Pinyadda’s designers have discovered, though, is that “social” news doesn’t necessarily mean “shared with friends.” Instead, Pinyadda has found that extra-familiar relationships fuel news consumption and sharing in its network: Social news isn’t about the people you know so much as the people with whom you share interests.

Pinyadda’s business model rested on the idea that the social approach to news, and the personalization it relied on, would let the platform create a new value-capture mechanism for news. As its product design and development lead, Austin Gardner-Smith, told me, the platform itself, with its built-in social networks and its capacity for recommendation and conversation, bolsters the value of news content with the experiential good of community: a “central point of consumption” tends to lend the content being consumed worth by proximity.

The idea, in other words, was to take a holistic approach to monetization. Pinyadda aimed to use the platform’s built-in capacity for personalization, via behavioral tracking, or, put less nefariously, paying attention to its individual users, to sell targeted ads against its content. “Post-intent” advertising is interest-based advertising, and thus, the thinking goes, more effective and less annoying. That thinking still holds; in fact, the insight that common interests, rather than familiarity, fuel news consumption could ratify it. As Dan Kennedy put it, writing about the startup after it presented at a Hacks/Hackers meetup this summer: “Pinyadda may be groping its way toward a just-right space between Digg (too dumb) and NewsTrust (too hard).” The question will be whether news consumers, so many of them already juggling relationships with Facebook, Twitter, Tumblr, Posterous, and other such sites, can make room for one more, and the extent to which the relationships fostered in those networks, connections that are fundamentally personal, are the types that drive the social side of news.

March 25 2010

21:05

Featured Project Update: NewsTrust

NewsTrust is an N2Y2 Featured Project, described by its founders as a "free online social news network to help people find and share good journalism". Three years later, NewsTrust is still helping people make more informed decisions through journalism, offering an integrated online service that includes a news filter, media literacy tools, and a civic engagement network.

Recently, NewsTrust has been expanding its platform, and over the next six months it plans to run pilot programs for several new tools.
