April 20 2011

16:00

Chasing pageviews with values: How the Christian Science Monitor has adjusted to a web-first, SEO’d world

Editor’s note: At the International Symposium on Online Journalism earlier this month, one of my favorite papers was presented by Drury’s Jonathan Groves and Carrie Brown-Smith of the University of Memphis. They’ve been spending a lot of time in the newsroom of the Christian Science Monitor to observe its transition from a daily print newspaper to a web-first newsroom with a weekly print edition. That transition required shifts in operations, in culture, and in the kind of journalism the Monitor produces.

Their full paper (pdf) is worth a read for its analysis of how those changes were made and what was gained and lost. But I’ve asked them to write a summary of their findings for the Lab. As they write, it’s up to you to judge how much this counts as a tragedy or a success for journalism.

We’ve seen a flood of innovations over the past few years in journalism on the web: from technology and the delivery of news to new forms of storytelling and reporting. But making those innovations happen has been neither fast nor easy. How do you manage meaningful change that sticks? That question drives our research.

Since October 2009, we have immersed ourselves in the Christian Science Monitor as it took the “web-first” mantra beyond platitudes and abandoned its daily print edition.

It was a difficult, wrenching process for many journalists used to the rhythm of the daily newspaper and concerned about the fate of the Monitor’s serious take on the news of the day. But the lessons learned along the way are valuable for any legacy news organization.

Like many newspapers, the Monitor faced a critical moment in 2008. Its national circulation had plummeted from 220,000 in 1970 to 52,000. Revenue was dwindling. And its owner, the First Church of Christ, Scientist, told newsroom managers the paper’s $12 million annual subsidy would be slashed to $4 million in five years. Such moments are fear-inducing and disruptive. They are also opportunities for meaningful change.

Monitor editor John Yemma and publisher Jonathan Wells developed a plan: Remove the shackles of the daily print edition, increase pageviews, and aggressively pursue online advertising. The paper also maintained a weekly print edition that allowed it to continue doing some longer-form journalism.

They set a clear five-year newsroom target: Drive pageviews from 3 million per month to 25 million. And they reached it.

Key to the Monitor’s transformation was having strong change agents who were able to challenge deeply embedded cultural assumptions and push the newsroom toward thinking about things differently — even if it sometimes meant ruffling some feathers. Leading the way were Yemma, managing editor Marshall Ingwerson, and particularly online editor Jimmy Orr, whose non-traditional background in the worlds of politics and blogging gave him a fresh perspective on the news ecosystem.

In news organizations we and others have examined, journalists are often skeptical of change efforts, especially when those efforts alter the way news is gathered and disseminated. As one staffer we interviewed in December 2009 said of the web: “Hopefully, we can be in it, but not of it.” Monitor employees had strong ideas about the paper’s values. Here are excerpts from our interviews with three staffers:

The Monitor story before was a very particular kind of story. You always looked for a larger analytical story on any given news point. You just didn’t do the news story, you know. You always did something larger than that, and you always looked for, to be, you know, to be more analytical about it…

We talk about being solution-based journalism. We don’t go into the fray; we try to push the discussion in a new way that is productive…

…seeking solutions to problems, staying away from sensationalism, analysis and thoughtful kind of assessment of what’s going on rather than jumping to snap conclusions and going for, not so much a focus on breaking news, but more on understanding the reasons, the causes behind the news of the day — I mean, that’s what we aspire to…

Over the course of our study, Orr challenged staffers’ ideas about Monitor journalism, and many recoiled. He pushed for more blogs on the site. He encouraged pursuing items about Tiger Woods and other topics that many staffers felt didn’t fit with the original Monitor mission: “To injure no man, but to bless all mankind.”

The newsroom adopted a four-pronged strategy:

  1. Increase the frequency of updating, writing several posts on a subject rather than one long story.
  2. Use search engine optimization to find key phrases that would improve a post’s ranking in Google.
  3. Monitor Google Trends for hot topics and sometimes assign stories on that basis, allowing the paper to “ride the Google wave,” as one editor put it.
  4. Use social media including Twitter, Facebook, and Tumblr to reach new audiences.

In this process, the organization embraced emergent strategy, an idea some referred to at the recent International Symposium on Online Journalism as “failing fast.” The Monitor took an iterative approach to innovation, trying new ideas and dropping those that didn’t work. Over the course of the study period, the newsroom tried many forms of web content, including blogs, live webcasts, and podcasts. And managers weren’t afraid to halt those items that weren’t garnering traffic. Podcasts, a weekly Yemma webcast, and video didn’t generate the return they’d hoped for, so each was stopped or scaled back.

The strategy helped push web numbers to new heights. By July 2010, the site had reached its 25 million pageview goal. And though many staffers expressed concerns about the changes, success reduced tension. Several noted the greater traffic infused the newsroom with a new sense of relevance. “This revival has been a real morale booster for yours truly,” said one staffer who had been with the paper for more than 20 years. “For a long time, I felt like I was on a losing team. Not losing in the sense of — we had a strong product. But it didn’t have much reach.”

A key factor in the success was a new content management system designed for web publishing. It democratized the process of web production and made it easier for anyone to develop and post new content.

But work remains to be done. Though pageviews have climbed, ad revenues have not grown in corresponding fashion, and the church subsidy will continue to diminish. And the hard work continues, as one editor noted in January:

So I have to do it six, seven times (a day), you know — to think of stories that bring what I would consider our Monitor values to a topic that is not where we normally would have been, and we’re doing it because the public is interested in this topic. So, what do we have to say about it that’s interesting, or clearer, or sheds some new perspective on what’s going on here? And it’s hard. You know, we weren’t accustomed to having to be that instantaneously responsive, and we don’t have the luxury of saying, “Well, you know that story is really not for us.” And when we’ve got pageview targets that we’re all assessed to hit every month, you’ve gotta come up with something on what people want to read about.

Whether the Monitor’s transition can be categorized as a tragedy or a success for journalism remains difficult to gauge. “Riding the Google wave” is difficult for the serious, in-depth international news the Monitor has long been known for. But even the greatest journalism has little impact on the world when its readership is small and diminishing. And today, the Monitor is increasingly injecting itself into the national conversation.

March 02 2011

15:00

Dennis Mortensen: Are news orgs worrying too much about search and not enough about the front page?

Editor’s Note: This is a guest post from Dennis R. Mortensen, former director of data insights at Yahoo and founder of Visual Revenue, a New York firm that sells its predictive-analytics services to news organizations.

Dennis saw my talk at the Canadian Journalism Foundation in January and wanted to comment on the role of a news site’s front page in its success in assembling an audience. He argues that paying too much attention to SEO on current articles could backfire on news orgs.

In Josh’s talk in Toronto, he hypothesized that:

[The front page is] still an enormously powerful engine of traffic. I would say actually that for most American newspapers…it’s probably 70 percent in a lot of cases.

Josh is saying you should view the front page as a traffic channel unto itself, just as you’d think of Facebook or Google — something I wholeheartedly agree with. If you choose to view your front page as a traffic channel, you’ll also sign up for a different kind of data analysis — analysis that mixes external referrers with internal referrers. In other words, a link from the Drudge Report is no different than a link from your own front page, in the sense that they both should be viewed as channels to optimize.

I argue that the front page is the most important engine of traffic for news media publishers today. I would also argue that this whole notion of news media publishers being held hostage by Google — and the slightly suboptimal idea of optimizing your articles for search to begin with — is somewhat misguided. It certainly seems wrong when we look at the data.

In this analysis, it’s important to distinguish between two core segments: current article views and archived article views. To begin, I’ve chosen definitions that are very strict and unfavorable to my conclusion. A current article is defined as an item of content that is directly exposed on the front page or a section front page right now. Any content not currently exposed on a front page or section front page is deemed an archived article.

We looked at a sample of about 10 billion article views, across a sample of Visual Revenue clients, and found that on any given day, 64 percent of views are on current articles, and 36 percent of views are on archived articles.

So on a typical day, for most if not all news media publishers, the largest portion of article views comes from the current article segment — stories published today or perhaps yesterday and still being actively promoted. I find this analysis fascinating and almost empowering, for the simple reason that most current news events are initially non-searchable. If a revolution breaks out in Egypt, I won’t know until I’m told about it. Non-searchable translates into a need for the stories to be discoverable by the reader in a different way, such as on a front page, through RSS, or in their social stream — all channels the publisher either owns or can influence.

There is no doubt that search, as a channel, owns the archive. One can discuss the data of why that is and why it is or isn’t optimal — I’ll leave that for a later discussion. But today, let’s focus on the current article segment, by far the bigger of the two. Where do those views come from? Looking at the same dataset from our clients, we get a very clear indication of where one’s focus should lie:

Sources of current article views:

78 percent come from front pages
7 percent come from search
6 percent come from social media
5 percent come from news aggregators
3 percent come from news originators
1 percent come from RSS & email

(We’re defining “news originators” as sites where at least two-thirds of the stories excerpted and promoted on their front page are original pieces generated by the news organization — which includes most traditional media. “News aggregators” are sites where less than two-thirds are original, such as Google News, Techmeme, or Drudge.)

I doubt this front-page dominance is much of a surprise to most editors — but for some reason, it seems like we aren’t taking the appropriate action today. We have 78 percent of all views on current articles coming from the front page — that’s 49 percent of all your article views on any given day — and what do we do to optimize it? And why is it that so many news organizations think immediately of search when we write a new story, when search has minimal initial impact? Even worse, writing an SEO-optimized article excerpt title for your front page probably deoptimizes your results on current articles.
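For readers who want to check that arithmetic, here is a minimal back-of-the-envelope sketch in Python using only the two shares reported above (64 percent of all article views landing on current articles, 78 percent of current-article views arriving from front pages); the variable names are purely illustrative, not anything drawn from Visual Revenue’s dataset.

    # Back-of-the-envelope check of the front-page share of all article views,
    # using only the two figures quoted above. Names are illustrative.
    current_share = 0.64       # share of all article views that land on "current" articles
    front_page_share = 0.78    # share of current-article views referred by front pages

    front_page_of_all = current_share * front_page_share
    print(f"Front pages drive roughly {front_page_of_all:.0%} of all article views")
    # prints: Front pages drive roughly 50% of all article views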

The front page is indeed still an enormously powerful engine of traffic. We now know that about half of your article views can be attributed to the primary front page or the section front pages — and with it a huge chunk of any news organization’s online revenue. The question, then, is what kind of processes and optimization methodologies have you applied to take advantage of this fact?

February 28 2011

15:00

“Like,” “share,” and “recommend”: How the warring verbs of social media will influence the news’ future

It appears that Facebook has settled on a central metaphor for the behavior of its 600 million users.

See an interesting article? Want your friends to see it too? Facebook’s offered up two primary verbs to bring action to that formless desire: “Share” and “Like.”

But the writing’s been on the wall for “Share” for some time. Facebook seemed to abandon development on “Share” in the fall. And on Sunday, Mashable reported that the remaining functionality of “Share” is being moved over to the much more popular “Like” button. (Clicking “Like” on a webpage will now post a thumbnail and excerpt of it on your Facebook wall, just as “Share” used to do. The old “Like” behavior made the links less prominent. It’s actually a pretty big deal that will likely lead to stories spreading more readily through Facebook.)

But I’m less interested in the details of the implementation than the verbs: sharing (tonally neutral, but explicitly social) has clearly lost to liking (with its ring of a personal endorsement).

There’s actually a third verb, “Recommend.” Unlike “Share,” it’s not its own separate action within FacebookWorld; it’s just “Like” renamed, with a less forceful endorsement. But it lives deep in the shadow of “Like” everywhere — except on traditional news sites, which have tended to stay far away from “Like.” I just did a quick scan of some of the web’s most popular news sites to see what metaphor they use to integrate with Facebook on their story pages.

“Share”: Los Angeles Times, ProPublica, Talking Points Memo, Reuters, ESPN, The Guardian.

“Recommend”: MSNBC, CNN, New York Times, New Yorker, Washington Post, Globe and Mail, Le Monde, El Pais, Newsweek, Telegraph, CBC.

“Like”: Gawker, Politico, Slate, Wired, Time, Wall Street Journal.

Both “Like” and “Share”: Huffington Post, Chicago Tribune.

Now, that’s an unscientific sampling. And, among those who use “Share,” some might have preferred the different functionality (although that difference has now disappeared). But looking at those names, it seems to me that many more traditional news organizations are uncomfortable with the “Like” metaphor that has become the lingua franca of online sharing. The “Likers” are more likely to be Internet-era creations; news orgs that existed 30 years ago tend toward the more neutral choices. (With a few exceptions.)

And that’s understandable: Newsroom culture has long been allergic to explicitly connecting the production of journalism and the expression of a reader’s endorsement. (Just the facts, ma’am!) And “Like” is awkward. When I click a button next to a story, does that mean I like the fact that “Tunisian Prime Minister Resigns,” or that I like the story “Tunisian Prime Minister Resigns”? But there’s no doubting the appeal of “Like,” which feels like a vote when “Share” mostly feels like work.

Facebook hasn’t announced that “Share” buttons will stop working any time soon, and there’s always “Recommend” sitting there as a milquetoast alternative for the emotion-squeamish. (Although technically “Recommend” presents most of the same problems as “Like” — it can still be read as a fuzzy endorsement.) But there’s a bigger issue here, as news organizations — many of them traditional bringers of bad news — have to adjust to an online ecosystem that privileges emotion, particularly positive emotion.

Emotion = distribution

I can tell you, anecdotally, that for our Twitter feed, @niemanlab, one of the best predictors of how much a tweet will get retweeted is the degree to which it expresses positive emotion. If we tweet with wonderment and excitement (“Wow, this new WordPress levitation plugin is amazing!”), it’ll get more clicks and more retweets than if we play it straight (“New WordPress plugin allows user levitation”).

For harder data, check out some work done by Anatoliy Gruzd and colleagues at Dalhousie University, presented at a conference last month. Their study looked at a sample of 46,000 tweets during the Vancouver Winter Olympics and judged them on whether they expressed a positive, negative, or neutral emotion. They found that positive tweets were retweeted an average of 6.6 times, versus 2.6 times for negative tweets and 2.2 times for neutral ones. That’s two and a half times as many acts of sharing for positive tweets. (Slide deck here.)

Facebook’s own internal data, looking at major news sites’ presence within Facebook, found that “provocative” or “passionate” stories generated two to three times the engagement of other stories.

Or take the Penn study by Jonah Berger and Katherine L. Milkman of The New York Times’ most emailed list. It found that “positive content is more viral than negative content,” but noted that it’s actually as much about arousal (speaking emotionally, not sexually) as anything. Content that you can imagine someone emailing with either “Awesome!” or “WTF?” in the subject line gets spread.

Social media as the new SEO

Here’s the thing: The way that news gets reported and presented is influenced by economic incentives. When publishers realized that Google search was a big driver of traffic, you saw punny headlines swapped for clots of “keyword-dense” verbiage and silly repetitive tag clouds — all trying to capture a little bit more attention from Google’s algorithm and, with it, a little more ad revenue.

But I believe we’ll soon be at a point where social media is a more important driver of traffic than search for many news organizations. (It certainly already is for us.) And those social media visitors are already, I’d argue, more useful than search visitors because they’re less likely to be one-time fly-by readers. As people continue to spend outrageous amounts of time on Facebook (49 billion minutes in December), as Twitter continues to grow, as new tools come along, we’ll see more and more people get comfortable with the idea that their primary filter for news will be what gets shared by their friends or networks.

And that means a phrase like social media optimization will mean more than just slapping sharing buttons on your stories and telling your reporters to check in on Twitter twice a day. It’ll also mean changing, in subtle ways, the kinds of content being produced to encourage sharing. I’m not saying that’s a good thing or a bad thing — just that it’s the natural outcome of the economic incentives at play.

Does that just mean more listicles? Maybe. But I’d argue that, on the whole, figuring out how to make people want to share your work with their friends generates a healthier set of incentives than figuring out how to manipulate Google’s algorithm. Providing pleasure — pleasure that someone wants to share — is not an inappropriate goal. And when you broaden out beyond “positive emotions” to the idea of driving arousal or stimulation — positive or negative — the idea starts to fall a little more neatly into what news organizations consider their job to be.

Let’s be clear: I’m not saying that news orgs should become engines of happy stories or only focus on the most outrageous or enticing news. Their mission can’t be channeled exclusively in that direction. I don’t know what it will look like for a quality news organization to focus on making more sharable journalism; it’ll be up to the very smart people who work at them to figure out how to do that while defending their brand identities. But I do know that the role of social media is going to keep increasing, and with it will come increased economic pressures to maximize for it. They may not “Like” or “Recommend” it, but I suspect it’s a fate they’ll all, er, “Share.”

December 21 2010

16:00

Tablet-only, mobile-first: News orgs native to new platforms coming soon

Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring.

Here are 10 predictions from Vadim Lavrusik, community manager and social strategist at Mashable. Mashable, where these predictions first appeared, covers the heck out of the world of social media and has an honored place in our iPhone app.

In many ways, 2010 was finally the year of mobile for news media, especially if you consider the iPad a mobile device. Many news organizations, like The Washington Post and CNN, built heavy social media integrations into their apps, opening the devices up to more than just news consumption.

In 2011, the focus on mobile will continue to grow with the launch of mobile- and iPad-only news products, but the greater focus for news media in 2011 will be on re-imagining its approach to the open social web. The focus will shift from searchable news to social and share-able news, as social media referrals close the gap on search traffic for more news organizations. In the coming year, news media’s focus will be affected by the personalization of news consumption and social media’s influence on journalism.

Leaks and journalism: a new kind of media entity

In 2010, we saw the rise of WikiLeaks through its many controversial leaks. With each leak, the organization learned and evolved its process for distributing sensitive classified information. In 2011, we’ll see several governments try to prosecute WikiLeaks founder Julian Assange for his role in disseminating classified documents, with varying degrees of success. But even if WikiLeaks itself gets shut down, we’re going to see the rise of “leakification” in journalism, and more importantly we’ll see a number of new media entities, not just mirror sites, that will model themselves to serve whistleblowers — WikiLeaks copycats of sorts. Toward the end of this year, we already saw Openleaks, Brusselsleaks, and Tradeleaks. There will be many more, some of which will be focused on niche topics.

As with other media entities, there will be a new competitive market, and some will distinguish themselves and rise above the rest. So how will success be measured? The scale of the leak, the organization’s ability to distribute it, and its ability to partner with media organizations. Perhaps some will distinguish themselves by creating better distribution platforms through their own sites, focusing on the technology and, of course, the analysis of the leaks. These entities will still rely on partnerships with established media to distribute and analyze the information, but it may very well change the relationship whistleblowers have had with media organizations until now.

More media mergers and acquisitions

At the tail end of 2010, we saw the acquisition of TechCrunch by AOL and the Newsweek merger with The Daily Beast. In some ways, these moves have been a validation of the value of new media companies and blogs that have built an audience and a business.

But as some established news companies’ traditional sources of revenue continue to decline, while new media companies grow, 2011 may bring more media mergers and acquisitions. The question isn’t if, but who? I think that just like this year, most will be surprises.

Tablet-only and mobile-first news companies

In 2010, as news consumption began to shift to mobile devices, we saw news organizations take mobile seriously. Aside from launching apps across various mobile platforms, perhaps the most notable example is News Corp’s The Daily, an iPad-only news publication set to launch in early 2011. Each new edition will cost $0.99 to download, though Apple will take 30%. But that’s not the only hurdle, as the publication relies on an iPad-owning audience. An estimated 15.7 million tablets will have been sold worldwide in 2010, with the iPad representing roughly 85% of that, and that number is expected to more than double in 2011. Despite the business gamble, this positions publications like The Daily for growth, with little competition besides news organizations that repurpose their web content. We’ve also seen the launch of an iPad-only magazine with Virgin’s Project and, of course, the soon-to-launch News.me social news iPad application from Betaworks.

But it’s not just an iPad-only approach, and some would argue that the iPad isn’t actually mobile; it’s leisurely (yes, Mark Zuckerberg). In 2011, we’ll see more news media startups take a mobile-first approach to launching their companies. This sets them up to be competitive by distributing on a completely new platform, where users are more comfortable with making purchases. We’re going to see more news companies that reverse the typical model of website first and mobile second.

Location-based news consumption

In 2010, we saw the growth of location-based services like Foursquare, Gowalla, and SCVNGR. Even Facebook entered the location game by launching its Places product, and Google introduced HotPot, a recommendation engine for places, and began testing it in Portland. The reality is that only 4% of online adults use such services on the go. My guess is that as the on-the-go information users get from such services becomes more valuable, these location-based platforms will attract more users.

Part of the missing piece is being able to easily get geo-tagged news content and information based on your GPS location. In 2011, with a continued shift toward mobile news consumption, we’re going to see news organizations implement location-based news features in their mobile apps. And if they do not, a startup will enter the market to create a solution to this problem, or the likes of Foursquare or another company will begin to pull in geo-tagged content associated with locations as users check in.

Social vs. search

In 2010, we saw social media usage continue to surge globally. Facebook alone gets 25% of all U.S. pageviews and roughly 10% of Internet visits. Instead of focusing on search engine optimization (SEO), in 2011 we’ll see social media optimization become a priority at many news organizations, as they continue to see social close the gap on referrals to their sites.

Ken Doctor, author of Newsonomics and news industry analyst at Outsell, recently pointed out that social networks have become the fastest growing source of traffic referrals for many news sites. For many, social sites like Facebook and Twitter account for only 10% to 15% of their overall referrals, but are number one in growth. For news startups, the results skew even more heavily toward social. And of course, the quality of these referrals is often better than that of readers who come from search. They generally yield more pageviews and represent a more loyal reader than the one-off visitors who stumble across the site from Google.

The death of the “foreign correspondent”

What we’ve known as the role of the foreign correspondent will largely cease to exist in 2011. As a result of business pressures and the roles the citizenry now play in using digital technology to share and distribute news abroad, the role of a foreign correspondent reporting from an overseas bureau “may no longer be central to how we learn about the world,” according to a recent study by the Reuters Institute for the Study of Journalism. The light in the gloomy assessment is that there is opportunity in other parts of the world, such as Asia and Africa, where media is expanding as a result of “economic and policy stability,” according to the report. In 2011, we’ll see more news organizations relying heavily on stringers and, in many cases, social content uploaded by the citizenry.

The syndication standard and the ultimate curators

Syndication models will be disrupted in 2011. As Clay Shirky recently predicted, more news outlets will get out of the business of re-running the same story on their site that appeared elsewhere. Though this is generally true, the approach to syndication will vary by outlet. The reality is that the content market has become highly fragmented, and if content is king, then niche is certainly queen. Niche outlets, which were once curators of original content produced by established organizations, will focus more on producing original content, while established news brands, still under pressure to produce a massive amount of content despite reduced staff numbers, will become the ultimate curators. This means they will feature just as much content, but more of it will come through syndication partners.

You already see this taking place on sites like CNN.com or NYTimes.com, both of whose technology sections feature headlines and syndicated content from niche technology publications. In this case, it won’t only be reader demand for original content that drives niche publications to produce more of it, but also their relationships with established organizations that strive to uphold the quality of their content and the credibility of their brands. Though original content will be rewarded, specialized niche publications could benefit the most from the disruption.

Social storytelling becomes reality

In 2010, we saw social content get woven into storytelling, in some cases to tell the whole story and in other cases to contextualize news events with curation tools such as Storify. We also saw the rise of social news readers, such as the Flipboard and Pulse mobile apps, among others.

In 2011, we’ll not only see social curation as part of storytelling, but we’ll see social and technology companies getting involved in the content creation and curation business, helping to find the signal in the noise of information.

We’ve already heard that YouTube is in talks to buy a video production company, but it wouldn’t be a surprise for the likes of Twitter or Facebook to play a more pivotal role in harnessing their data to present relevant news and content to their users. What if Facebook had a news landing page of the trending news content that users are discussing? Or if Twitter filtered its content to bring you the most relevant and curated tweets around news events?

News organizations get smarter with social media

In 2010, news organizations began to take social media more seriously, and we saw many of them hire editors to oversee social media. USA Today recently appointed a social media editor, while The New York Times dropped the title and handed the reins to Aron Pilhofer’s interactive news team.

The Times’ move to restructure its social media strategy, by going from a centralized model to a decentralized one owned by multiple editors and content producers in the newsroom, shows us that news organizations are becoming more sophisticated and strategic with their approach to integrating social into the journalism process. In 2011, we’re going to see more news organizations decentralize their social media strategy from one person to multiple editors and journalists, which will create an integrated and more streamlined approach. It won’t just be one editor updating or managing a news organization’s process, but instead news organizations will work toward a model in which each journalist serves as his or her own community manager.

The rise of interactive TV

In 2010, many people were introduced to Internet TV for the first time, as buzz about the likes of Google TV, iTV, Boxee Box, and others filled headlines across the web. In 2011, access to Internet TV will transform television as we know it, not only in the way content is presented but also by disrupting the dominance traditional TV has had for years in capturing ad dollars.

Americans now spend as much time using the Internet as they do watching television, and the reality is that half are doing both at the same time. The problem of how to have a conversation with others about a show you’re watching has existed for some time, and users have mostly responded by hosting informal conversations via Facebook threads and Twitter hashtags. Companies like Twitter are recognizing the problem and finding ways to make the television experience interactive.

It’s not only the interaction, but the way we consume content. Internet TV will also create a transition for those used to consuming video content through TVs and bring them to the web. That doesn’t mean that flat screens are going away; instead, they will simply become connected to the web and its many content offerings.

December 17 2010

18:00

Jason Fry: A blow to content farms, Facebook’s continued growth, and the continued pull of the open web

Editor’s Note: We’re wrapping up 2010 by asking some of the smartest people in journalism what the new year will bring. Today, our predictor is Jason Fry, a familiar byline at the Lab. Jason also prognosticated earlier this week about the potential success of the NYT paywall.

Hyperlocal will remain stubbornly small scale. Large-scale efforts at cracking hyperlocal will seed more news organizations with content, but that content will remain mostly aggregation and data and still feel robotic and cold. Meanwhile, small-scale hyperlocal efforts will continue to win reader loyalty, but struggle to monetize those audiences. By the end of 2011, the most promising developments in hyperlocal will come from social media. Promising efforts to identify and leverage localized news and conversation in social media will be the buzz of late 2011, and we’ll be excited to think that social media is proving an excellent stepping stone to greater involvement in our physical communities.

Google will deal the content farms a big blow by tweaking its algorithms to drive down their search rankings. But the company will be opaque to the point of catatonia about exactly what it did and why it did it, reflecting its reluctance to be drawn into qualitative judgments about content. There will be talk of lawsuits by the spurned content farms and no small amount of jawing about Google’s power, lack of transparency, and whether or not it’s being evil. But even those worried about Google’s actions will admit that search is a much better experience now that results are less cluttered with horribly written crap.

Tablets will carve out a number of interesting niches, from favored input device of various specialists to device you like to curl up with on the couch. But these will be niches: The open web will remain as robust as ever, and be the killer app of the tablet just like it is everywhere else. News organizations’ walled-garden apps will win some converts, and apps in general will continue to point to promising new directions in digital design, but there will be no massive inflows of app revenue to news organizations. This will be seen as a failure by folks who got too excited in the first place.

Facebook will further cement its dominance by beginning to focus on ways to extract and preserve moments that matter to us from the ceaseless flow of the news feed, building on its photo-archiving role to also become a personal archive of beloved status updates, exchanges, links, and other material. The most promising startups and efforts from established social media companies will center around creating quiet water that draws from the river of news without leaving us overwhelmed by the current.

November 24 2010

15:30

A/B testing for headlines: Now available for WordPress

Audience data is the new currency in journalism. I don’t just mean the traditional Costco buy-in-bulk kind — “our readers are 52 percent male, 46 percent over $75,000 household income, 14 percent under age 35,” and so on. I mean data that looks at how individual readers interact with individual pieces of content. And beyond that shift there’s also the move from observational data — watching what your audience does — to experimental data, testing various ways of presenting or structuring content to see what works and what doesn’t.

My desire for more experimental data is one reason why I’m very happy to point you to a new resource for sites built on WordPress (like this one): a new Headline Split Tester plugin, built by Brent Halliburton and Peter Bessman, two Baltimore developers.

Not sure if you want a straight, newsy headline or something with a little more pizzazz? Something keyword-dense and SEO friendly or something more feature-y? This plugin lets you write two headlines for each post and have them presented at random to readers. The plugin records how often each version of the headline has been clicked and, once it has enough data, swaps full-time to the most effective one.
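For those curious about the mechanics, here is a minimal sketch in Python of the split-testing logic that description implies: serve one of two headlines at random, count impressions and clicks per variant, and lock in the better performer once enough data has accumulated. This is an illustrative sketch under those assumptions, not the plugin’s actual PHP code, and the impression threshold is made up for the example.

    import random

    class HeadlineSplitTest:
        """Illustrative headline A/B test: show variants at random, record clicks,
        and settle on the higher click-through-rate headline once enough data exists."""

        def __init__(self, headline_a, headline_b, min_impressions=500):
            self.stats = {headline_a: {"shown": 0, "clicked": 0},
                          headline_b: {"shown": 0, "clicked": 0}}
            self.min_impressions = min_impressions  # assumed threshold before locking in
            self.winner = None

        def headline_to_show(self):
            if self.winner:                      # test finished: always show the winner
                return self.winner
            choice = random.choice(list(self.stats))
            self.stats[choice]["shown"] += 1
            return choice

        def record_click(self, headline):
            self.stats[headline]["clicked"] += 1
            total_shown = sum(v["shown"] for v in self.stats.values())
            if self.winner is None and total_shown >= self.min_impressions:
                # pick the variant with the higher click-through rate
                self.winner = max(self.stats,
                                  key=lambda h: self.stats[h]["clicked"] / max(self.stats[h]["shown"], 1))

A production version would also care about per-variant sample sizes and statistical significance before declaring a winner.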

If you’re in the kind of operation that has regular debates over headline strategy, here’s a great way to test it. (Although note that this is measuring clicks on articles within your site — it doesn’t tell you anything about the SEO effectiveness of a headline. You’d have to wait for Google data for that.)

We have lots of debates over the appropriate role of audience metrics in journalism. But personally, I’d rather have those debates armed with as much data as possible. If you want your site to be filled with puns and plays on words instead of SEO-friendly nouns, fine — but it’s worth knowing how much of a traffic impact that decision has when you make it.

I’m happy to say we apparently played a small role in its creation: Halliburton writes that he was inspired by an old Lab post that described how The Huffington Post uses A/B split testing on some of its headlines:

Readers are randomly shown one of two headlines for the same story. After five minutes, which is enough time for such a high-traffic site, the version with the most clicks becomes the wood that everyone sees.

Give it a try — and if you’re a PHP coder, try to make it better, as patches are welcome. (Another, more ambitious A/B testing project for WordPress, ShrimpTest, is also in development and in preview release.)

Halliburton (who runs Cogmap and Deconstruct Media) and Bessman (who’s an engineer at marketing firm R2integrated) built the plugin in as 2010 a way as possible: at last weekend’s Baltimore Hackathon, where the plugin won a prize for best prototype. Have a good idea, bang out code in a weekend, share it with a potential audience of millions using the same platform: that’s the promise of open source and collaboration in a nutshell.

September 23 2010

17:30

What impact is SEO having on journalists? Reports from the field

Last week, I wrote that SEO and audience metrics, when used well, can actually make journalism stronger. But I got pushback from journalists who complained that I was parroting back management views rather than the on-the-ground experiences of the reporters who have to deal with SEO-crazy bosses. So the natural next move was to gather more evidence.

It seems that whether SEO makes your journalistic life miserable depends on how smart your news organization is about using SEO — and how well your news organization makes you feel invested in the process of combining SEO with quality content production. Organizations that understand the power of SEO and social news to drive traffic — rather than chase traffic — will keep their reporters in the loop and make them happy. Even if an organization has a good SEO strategy, it still needs to be communicated effectively to the newsroom, so journalists don’t feel like they’ve been turned from trained professionals into slaves to Google Trends.

Some journalists I communicated with (who shall remain nameless so they can keep their jobs) say SEO is pushing them to the brink. The demand that every story generate traffic creates, in their minds, horrible pressure to produce work that will be measured only by how much it is read. The more they can seed their work with SEO terms and then promote their work on social media platforms, the better their metrics and the happier their bosses. But that formula can make for some very exhausted journalists.

I emailed with one unhappy Washington Post reporter. She described a scene not unlike the one described by Jeremy W. Peters in The New York Times, with the majority of morning editors’ meetings focused on looking at web numbers and “usually, making coverage decisions based on that.” The reporter complained:

If my blog has an awesome day, I get complimented on it. If I spend weeks digging into a really juicy story, I don’t hear anything from anyone (well, unless it gets picked up by Gawker or it gets epic levels of hits). So what should I spend my time doing?

There have been several instances when reporters (or content producers) have been told by [SEO people] to drop everything they are doing and file some pithy blog post about the hot topic of the moment, which usually fades by the time we can get a story up.

She was also frustrated that the new SEO people in the newsroom seemed to have “unilateral power” and that The Washington Post’s commitment to SEO was changing the way she was writing.

“We are told to put SEO-powered words in our headlines, which I understand, even though it takes the life out of our heads,” she continued. “But every now and then, we have also been told to get these SEO-charged words into our first sentence or lede of a story. Wonder why a lede suddenly sounds robotic? This is why.”

But then I spoke with another Post reporter, who said “SEO has not really affected my work that much.” And while he has opinions about SEO, the reporter said that he does not “have a first-hand account of how SEO has hurt or helped me.” Have his editors not cracked the whip? Or has SEO become a natural part of his work?

Whatever it is, these two reporters aren’t thinking about SEO in the same way — which would seem to indicate that Post management could probably stand to improve the way it communicates with its staff about what it wants. Mixed communication about innovation is a common issue at times of change, and the Post should not be singled out — its newsroom just happens to be one I reached out to.

Different newsrooms, different perceptions

I then turned to the crazy and wild world of online-only publications, thinking that SEO might be unusually disruptive for journalists working under orders to boost traffic and engage with audiences. I found this hypothesis to be untrue, even at traffic monster The Huffington Post and aspiring traffic monster The Daily Beast. For another perspective, I also spoke to the kind folks at GigaOM, the tech news blog, where technology is nothing to fear, and who actually went on the record. First, HuffPost, where my source emailed:

SEO guides the content we decide to write, but only to a certain degree…at HuffPost in particular, I know this is true because of how effectively we utilize organic SEO. Almost all of our posts are written, or should be written, with SEO in mind…Sometimes SEO can determine an entire post. We have people in the office that are pretty hot on the top Google searches, and sometimes an entry will be created to utilize the traffic we can get.

To be clear, organic SEO is SEO that works through algorithms and natural searches, and HuffPost is really good at playing this game. But to play it, one has to be aware that the algorithms are always changing. Does SEO do anything to their reporting or writing or actual gathering of news at HuffPost? My source:

I would say that SEO rarely impacts the actual reporting that we do. For a website as large and as SEO focused as HuffPost, instances of certain words within the article won’t actually help the search value of a post.

What about at The Daily Beast, which aspires to HuffPost levels of hyper-readership? Are work practices changing there? The source I spoke to didn’t find SEO or audience metrics onerous, simply seeing them as a small part of his job. And as with my source at HuffPost, he saw this not as an assault on journalism but as the new reality — and one that didn’t actually affect the content of stories.

As my source told me, SEO gets used in pretty predictable ways, such as adding tags to stories and putting the SEO term in the URL slug — which isn’t the same as the headline. But this journalist says the pressure of SEO has no significant impact on what he chooses to report or on how he writes.

The value of metrics

The folks at GigaOM were positive about how SEO was helping to change the way they do things — for the better. Probably because they understand the argument I made last week — that paying attention to the audience brings you in better touch with what the audience cares about. And they’re smart about how to use SEO to their advantage. Liz Gannes, a senior writer for GigaOM, listed off in an email her understanding of how SEO was affecting her work:

Metrics only get you so far…if there’s no spark of idea driving a story and voice behind it, it’s bound to be boring.

Until very recently, the most a journalist could hope for after pushing a perfectly polished piece out into the universe was feedback from (at best) a few dedicated readers or haters…Metrics give us a peek into what happens to our stories after we hit publish.

Metrics have also helped Gannes think about ways that other reporters, editors, and bloggers can do better journalism:

— Recognize new and emerging topics
— Figure out the peak times of audience interest so a story will find its audience
— Help readers who have found old articles see a more recent one through links
— Understand when an article is clear and readable, or when it’s become too complicated
— Identify good sources of referral traffic
— Know when to get involved in comments and social media, potentially as inspiration for further articles.

Is SEO foe or friend for the journalist? Constant harping from editors, driven by fear of the future and the need to monetize the web, can make it feel to some journalists that SEO is destroying news judgment and their craft. But the best solution to this complaint is to figure out how to use SEO more effectively to make journalism better — and make the lives of journalists easier.

Metrics can be good for journalism and for journalists; it just takes putting aside the fear of the now to think about future strategies for building good content that will keep readers coming back. As I’ve said before, good content and high readership levels are not mutually exclusive: good stories will be found, and SEO can help.

September 14 2010

17:30

Why SEO and audience tracking won’t kill journalism as we know it

[I'm happy to introduce Nikki Usher, a new contributor here at the Lab. Nikki is a Ph.D. candidate at USC Annenberg and, before academia, was a reporter for The Philadelphia Inquirer and elsewhere. Here she tackles the question of using metrics in journalism; later today, we'll have a different take on the same topic from C.W. Anderson. —Josh]

Last week, The New York Times featured the scary tale of how some newspapers, including The Washington Post and The Wall Street Journal, are (shockingly!) changing their coverage after using online metrics to figure out what their audience wants to read. And Gene Weingarten, in an amusing takedown of search engine optimization, insinuated earlier in the summer that just by putting Lady Gaga in his column, he’d get more hits.

Jeremy W. Peters had another Times piece about much the same concern: young journalists doing “anything that will impress Google algorithms and draw readers their way” and the scary “big board” that Gawker keeps in its newsroom tracking the 10 most popular blog posts, along with pageviews per hour.

This concern that audience tracking, writing for Google, and SEO will somehow destroy the ability of news organizations to keep news judgment apart from audience demands is misplaced. Instead, being more attentive to audience demands may actually be the best thing that news organizations can do to remain relevant and vital sources of news.

With monetization tied to clicks, and real-time Omniture data a feature of more and more newsrooms, it’s easy to worry that audiences will dictate news coverage. But how about the opposite argument: that journalists, for too long, have been writing about what they think their readers ought to know, and not enough about what their audiences want to know.

Journalism has always depended on having an audience to consume its work and has spent much of the past century trying to figure out exactly what that audience wants to know. Now, journalists have better tools than ever to figure out who their audiences are, learn what they want, and in real time, track their behaviors in order to be more responsive to their needs. This isn’t a bad thing — it turns journalism away from the elitism of writing for itself and back to writing what people are actually looking for.

But what about the concerns that journalists are going to spend all their time writing about pets, or Lady Gaga? The truth is that many of the newsrooms I’ve spoken with are smarter than that. They aren’t abandoning journalism principles; they see metrics as a way to ensure their journalism will be read.

SEO at the Christian Science Monitor

In my academic work, I’ve been following the evolution of The Christian Science Monitor as it has moved from a print daily to a website with a print weekly. Over the course of this evolution, I’ve watched the newsroom grow increasingly sophisticated about audience tracking. When I asked John Yemma about his views on SEO, he had this to say in an email about its impact on the newsroom:

Search engines remain a powerful and preferred tool for online readers. We have no choice but to become adept at SEO if it helps us reach readers where they are. This is nothing new in the news business. In the pre-Web days, newspapers periodically redesigned and reformatted. Editors frequently admonished reporters to write shorter, to use simple and direct language, to “think art” when they were on an assignment — all in the interest of reaching readers.

SEO, at its essence, is about editors thinking the way readers think when they are searching for news. At the Monitor, as at almost every other publication, we work diligently to emphasize key words. But that is only one tool in the toolkit. We try to respond quickly when a subject we know well (international news, for instance) is trending. This gives us an opportunity to offer related links that invite readers to dive deeper into our content. If SEO is about acquisition, related links are about retention. In the past year, we have tripled our online traffic with this strategy.

Does that mean we just write plain-vanilla headlines or merely follow Google/Trends? No. A clever headline can still be a powerful draw, especially on our home-page or in social media. And we still report stories that we know are important even if readers don’t agree. But we are much more attuned these days to what readers will respond to. If our journalism is not read, our work is not effective.

Trend tracking at TheStreet.com

At TheStreet.com, the organization has hired a full-time “SEO guy,” John DeFeo, to monitor trends on Omniture, watch search terms, and optimize TheStreet’s content after it is written so it can be found via search.

The result: Traffic has improved. When I was in TheStreet’s newsroom conducting field research, I did see DeFeo make a suggestion that someone bang out a quick story on a children’s Tylenol recall after seeing it trend on Yahoo. But should we see that as being overly responsive to audience demands? Or should we see it instead as a chance for TheStreet to provide its unique comment on what such a recall might mean for Johnson & Johnson stockholders — and at the same time know that the story will have a chance at reaching an audience because it is trending?

Glenn Hall, editor at TheStreet, defends SEO journalism as being in line with the core principles of journalism itself. In an interview, Hall said:

Good journalism is not mutually exclusive with SEO. We have proven over and over again that our best journalism tends to get the best page views. SEO is a tool to make sure the best stories get noticed…SEO increases visibility where users are looking. People consume content differently than they used to through a newspaper.

Hall explains to his staff that SEO is in line with the best practices of journalism. He believes that simple declarative sentences, clear and to the point, make good sense for both journalism and SEO. And, as he notes, SEO doesn’t have the final say on a story’s success or failure: “It doesn’t matter how good the SEO is if the content isn’t good.”

The new news is social

Nick Bilton, the Times tech blogger, writes in his new book, I Live in the Future & Here’s How It Works, about the “consumnivore” — an information-hungry consumer who wants the latest news now. But for this new consumer, news isn’t just a quest for information. It’s also a social experience, shared with people via Twitter, Facebook, email, or other social media. In other words, if you aren’t looking for news, the news will find you. Good journalism will still be found, even without the high-energy SEO pumping of a daily newsroom — largely, I think, because of the new power of news as a social experience.

This isn’t a myth. At the Pulitzer celebration at The New York Times on April 12, 2010, New York Times Magazine editor Gerald Marzorati noted the following in his celebratory speech about the Pulitzer for Investigative Reporting, shared with ProPublica, for a story about a New Orleans hospital during Katrina: “[Long form journalism is] our most viewed and most emailed…It does matter to readers. It stops the reader. It slows the reader down.”

Was Memorial Medical Center, the hospital in the story, a hot search term? Probably not. Were 13,000 words likely to produce the quick hits of information that the consumnivore hungers for? No. But the story still reached a substantial audience, person to person. And as it was read by more and more people, it likely climbed up Google’s rankings for those people who were searching for articles about Katrina.

So, if used properly, SEO and audience tracking make newsrooms more accountable to their readers without dictating bad content decisions — and it can help newsrooms focus on reader needs. What is a story if it is never read? SEO won’t kill journalism; it will only enhance how we find and use news.

August 27 2010

14:30

This Week in Review: ‘Mosques’ and SEO, Google’s search and social troubles, and a stateless WikiLeaks

[Every Friday, Mark Coddington sums up the week’s top stories about the future of news and the debates that grew up around them. —Josh]

Maintaining accuracy in an SEO-driven world: Apparently the future-of-news world isn’t immune to the inevitable dog days of August, because this week was one of the slowest in this corner of the web in the past year. There were still some interesting discussions simmering, so let’s take a look, starting with the political controversy du jour: The proposed construction of a Muslim community center in downtown Manhattan near the site of the Sept. 11, 2001, attacks on the World Trade Center. I’m not going to delve into the politics of the issue, or even the complaints that this story is symptomatic of a shallow news media more concerned about drummed-up controversy than substantive issues. Instead, I want to focus on the decisions that news organizations have been making about what to call the project.

It has predominantly been called the “ground zero mosque,” though beginning about two weeks ago, some attention began being trained on news organizations — led most vocally by The New York Times and The Associated Press, which changed its internal label for the story — that wouldn’t use that phrase out of a concern for accuracy. The Village Voice used some Google searches to find that while there’s been an uptick in news sources’ use of the project’s proper names (Park51 and the Cordoba Center), “ground zero mosque” is still far and away the most common designation.

What’s most interesting about this discussion are the ideas about why a factually inaccurate term has taken such deep root in coverage of the issue, despite efforts to refute it: The Village Voice pointed a finger at cable news, which has devoted the most time to the story, while the Online Journalism Review’s Brian McDermott pinpointed news consumption patterns driven by “warp-speed skimming” and smartphone headlines, which make easy labels more natural for readers and editors. “Watery qualifiers like ‘near’ or ’so-called’ don’t stick in our brains as much, nor do they help a website climb the SEO ladder.”

Poynter ethicist Kelly McBride zeroed in on that idea of search-engine optimization, noting that the AP is being punished for its stand against the term “ground zero mosque” by not appearing high in the all-important news searches for that phrase. In order to stay relevant to search engines, news organizations have to keep using an inaccurate term once it’s taken hold, she concluded. In response, McBride suggested pre-emptively using factchecking resources to nip misconceptions in the bud. “Now that Google makes it impossible to move beyond our distortions — even when we know better — we should be prepared,” she said.

Google’s search and social take shots: Google takes more than a few potshots every week on any number of subjects, but this week, several of them related to intriguing future-of-news issues we’ve been talking about regularly here at the Lab, so I thought I’d highlight them a bit. Ex-Salon editor Scott Rosenberg took Google News to task for its placement of an Associated Content article at the top of search results on last week’s Dr. Laura Schlessinger controversy. Associated Content is the giant “content farm” bought earlier this year by Yahoo, and its Dr. Laura article appears to be a particularly mediocre piece, constructed cynically and solely to top Google’s ranking for “Dr. Laura n-word.”

Rosenberg takes the incident as a sign that the reliability of Google News’ search results has begun to be eclipsed by content producers’ guile: “When Google tells me that this drivel is the most relevant result, I can’t help thinking, the game’s up.” The Lab’s Jim Barnett also questioned Google CEO Eric Schmidt’s recent articulation of the company’s idea of automating online serendipity, wondering how a “serendipity algorithm” might shape or limit our worldviews in ways Google prefers.

Google’s social-media efforts also took a few more hits, with Slate’s Farhad Manjoo conducting a postmortem on Google Wave, homing in on its ill-defined purpose and unnecessary complexity. Google should have positioned Wave as an advanced tool for sophisticated users, Manjoo argued, but the company instead clumsily billed it as the possible widespread successor to email and instant messaging. Meanwhile, Adam Rifkin of GigaOM criticized the company’s acquisition of the social app company Slide (and its social-media attempts in general), advising Google to buy companies whose products fit well into its current offerings, rather than chasing after the social-gaming industry — which he said “feels like it’s about to collapse on itself.”

WikiLeaks, stateless news and transparency: The saga of the open-source leaking website WikiLeaks took a very brief, bizarre turn this weekend, when reports emerged early Saturday that founder Julian Assange was wanted by Swedish authorities for rape, then later that day prosecutors announced he was no longer a suspect. The New York Times provided some great background on Assange’s cat-and-mouse games with various world governments, including the United States, which is reportedly considering charging him under the Espionage Act for WikiLeaks’ release last month of 92,000 documents regarding the war in Afghanistan.

No one really had any idea what to make of this episode, and few were bold enough to make any strong speculations publicly. Two bloggers explored the (possible) inner workings of the situation, with Nicholas Mead using it to argue that catching Assange isn’t exactly going to stop WikiLeaks — as NYU professor Jay Rosen noted last month, WikiLeaks is the first truly stateless news organization, something only permitted by the structure of the web.

That slippery, stateless nature extends to WikiLeaks’ funding, which The Wall Street Journal focused on this week in a fine feature. Unlike the vast majority of news organizations, there is virtually no transparency to WikiLeaks’ funding, though the Journal did piece together a few bits of information: The site has raised $1 million this year, much of its financial network is tied to Germany’s Wau Holland Foundation, and two unnamed American nonprofits serve as fronts for the site.

Hyperlocal news and notes: A few hyperlocal news-related ideas and developments worth passing along: Sarah Hartley, who works on The Guardian’s hyperlocal news efforts, wrote a thoughtful post attempting to define “hyperlocal” in 10 characteristics. Hyperlocal, she argues, is no longer defined by a tight geographical area, but by an attitude. She follows with a list of defining aspects, such as obsessiveness, fact/opinion blending, linking and community participation. It’s a great list, though it seems Hartley may be describing the overarching blogging ethos more so than hyperlocal news per se. (Steve Yelvington, for one, says the term is meaningless.)

Brad Flora at PBS MediaShift provided a helpful list of blogs for hyperlocal newsies to follow. (Disclosure: The Lab is one of them.) And two online media giants made concrete steps in long-expected moves toward hyperlocal news: Microsoft’s Bing launched its first hyperlocal product with a restaurant guide in Portland, and Yahoo began recruiting writers for a local news site in the San Francisco area.

Reading roundup: Despite the slow news week, there’s no shortage of thoughtful pieces on stray subjects that are worth your time. Here’s a quick rundown:

— Spot.Us founder David Cohn wrote an illuminating post comparing journalists’ (particularly young ones’) current search for a way forward in journalism to the ancient Israelites’ 40 years of wandering in the desert. TBD’s Steve Buttry, a self-described “old guy,” responded that it may not take a generation to find the next iteration of journalism but said his generation has been responsible for holding innovation back: “We might make it out of the desert, but I think our generation has blown our chance to lead the way.”

— A couple of interesting looks at developing stories online: Terry Heaton posited that one reason for declining trust in news organizations is their focus on their own editorial voice to the detriment of the public’s understanding (something audiences see in stark relief when comparing coverage of developing news), and Poynter’s Steve Myers used the Steven Slater story to examine how news spreads online.

— At The Atlantic, Tim Carmody wrote a fantastic overview of the pre-web history of reading.

— In an argument that mirrors the discussions about the values of the new news ecosystem, former ESPN.com writer Dan Shanoff made a case for optimism about the current diffused, democratized state of sports media.

— Another glass-half-full post: Mike Mandel broke down journalism job statistics and was encouraged by what he found.

— Finally, for all the students headed back to class right now, the Online Journalism Review’s Robert Niles has some of the best journalism-related advice you’ll read all year.

June 07 2010

15:00

When web users cross the Gladwell 10,000-hour standard

Derek Powazek has a piece that takes the Malcolm Gladwell Outliers thesis — that it takes 10,000 hours of practice to master anything — and applies it to the explosion of content brought about by the Internet:

Ladies and gentlemen, we have the internet — the biggest no-experience-required open mic night ever created. It connects us all, whether we’ve put in 10,000 hours or ten.

It’s only because of extremely fortuitous timing that the world was spared my 16-year-old Beatles impersonation. I put in those hours before everything was digital and duplicated for free, forever. Make no mistake, if MySpace had been around when I was 16, my furtive recordings would still be haunting me.

Maybe it’s only because of fortuitous timing that we even expect anyone to be good at anything now. We were spared hearing The Beatles when they were new. There’s no record of Shakespeare’s embarrassing early attempts. No MP3s of Bach’s school choir. Maybe if we were more used to seeing people suck before they get good at something, we wouldn’t expect perfection from day one.

Derek’s right. (Even though I’m a bit suspicious of the random roundness of Gladwell’s 10,000-hour number. Lots of bands play a lot of gigs without becoming the Beatles; lots of programmers spend lots of time on computers without becoming Bill Gates.) The ease with which the Internet exposes less-than-professional work forces us to reset our expectations about what makes something worth public display. That’s a problem for some old-school journalists, who think the entire universe should be filtered through a copy desk before seeing the light of day.

But what if there’s a different implication for online news? Here’s Derek again:

Suppose Gladwell is right and it really does take 10,000 hours to master something. Let’s set the bar lower. Let’s say that it takes half that time to be merely good at it. And just to be generous, let’s say half again just to not suck at something. That would mean it takes 2,500 hours of practice to just not be awful.

Now ask yourself, what have you done for 2,500 hours? That’s 104 days. 14 weeks of constant practice. Just under four months of nonstop repetition.

Very few of us have spent that much time doing anything besides sleeping or watching TV.

Well, I can think of one area where lots of people are crossing 10,000 hours of time invested: using the Internet.

And unlike watching TV — where the rewards for your couch labor amount to mastery of your Tivo and better control of your remote — after 10,000 hours online, you’re a vastly smarter Internet user than you were at the start. You’ve stopped using Internet Explorer. You’ve abandoned the embarrassing email address. Your Google-fu is finely honed. Maybe you’ve messed around with RSS. Maybe you’ve got a smartphone and know how to swim between apps. In other words, the return on time investment isn’t just important for creators of technology; it’s also important to its users, who move past early awkwardness to feeling more like natives.

One recent study estimated Internet users spend 17 hours a week online; another found that for teens the number is 31 hours. At that rate, teens would get to 10,000 hours in a little over six years.
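
The arithmetic behind that estimate is easy to check. Here’s a back-of-the-envelope sketch (Python; the hours-per-week figures are just the survey numbers cited above, and the helper name is mine):

```python
# Rough check of the time-to-10,000-hours arithmetic above.
HOURS_TO_MASTERY = 10_000  # Gladwell's threshold

def years_to_mastery(hours_per_week: float) -> float:
    """Years needed to accumulate 10,000 hours at a given weekly pace."""
    return HOURS_TO_MASTERY / hours_per_week / 52

print(f"Average user (17 hrs/week): {years_to_mastery(17):.1f} years")  # ~11.3
print(f"Teen user (31 hrs/week):    {years_to_mastery(31):.1f} years")  # ~6.2
```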

What will this mean for news? I won’t pretend to know. But I think anyone creating content online will have to think about how their products should shift as their audience gains mastery of the medium. Just as publishers are slowly moving away from dial-up-safe designs to adjust to a broadband reality, sites will have to reckon with a savvier pool of users.

Part of that would include now-basic moves like search-engine optimization and social media, since Internet veterans are less likely to simply default to a news organization’s homepage as a point of entry. Will full-text RSS become more important as more users adopt RSS or RSS-like feeds? What new navigation regimes will evolve to meet the needs of users aware of all their other options online? How will advertising evolve in a world where more people are using ad-blocking or Flash-blocking software, things previously the domain of nerds like me?

Who knows? But it’s worth remembering how much your audience is a moving target — one that is learning and practicing and getting better at this Internet thing all the time.

June 03 2010

14:00

Is 70 percent of what we read online really by our friends?

Last month, we tweeted a remarkable stat:

Of everything under 40 year olds read online, about 70% was created by someone they know http://j.mp/bb0jgN

Our source was this article citing a recent panel discussion at an SEO conference in New York. Here’s how the stat was presented, in a piece in the newsletter Publishing Trends, as a product of Forrester Research:

In one of several panels on social media and search, Patricia Neuray of Business.com cited the Forrester research finding that 70% of the content read online by under-40-year-olds was written by someone they know.

(Someone who livetweeted the panel seemed to also attribute it to Forrester, although with a cryptic hint of IBM.)

It’s obviously a remarkable statistic if true, but I wanted to get a little more detail — like how the study defined “someone they know” and “content read online.” Are they talking websites, or are they including things like email? Does “someone they know” mean someone they know in real life, or does an Internet friend count? I engaged in some vigorous Googling, but couldn’t find the original study. Then I emailed Forrester to see if they could produce it. A spokesperson got back to me:

That statistic does not come from a Forrester study. We heard about it and investigated it as well to find out that the original author of the article that used that statistic was in error. I just rechecked his article – he removed Forrester as the source but did not cite another source other than a speaker from IBM at this conference: http://www.publishingtrends.com/2010/04/making-search-convert-search-engine-strategies-2010/

And indeed, now the reference in the original article is thus:

In one of several panels on social media and search, Leslie Reiser of IBM cited the recent finding that “70% of the content read online by under-40-year-olds was written by someone they know.”

I contacted Reiser last week to see if she has a cite for it; my very quick Googling didn’t turn up an obvious IBM reference for the number, either, but that doesn’t mean much. I’ll let you know if I hear back from her. In any event, since by tweeting it we played a part in spreading the number, I thought we should note that the original source is still a bit up in the air.

February 02 2010

17:00

CNET and Gizmodo are sharing content, and they don’t seem worried about a “duplicate penalty”

CNET and Gizmodo have been sharing content for the last couple months. I confirmed that a partnership exists, but requests for additional information from either party were not fruitful.

Frankly, the most intriguing aspect of this partnership is already in plain view: The sites are posting the same articles. Take a look at this Gizmodo story, then click over to the CNET version. Headlines change and there are subtle formatting differences, but the body copy is essentially the same.

Why is this relevant? If you’ve spent any time in the SEO world, you’ve probably heard of the semi-mythical duplication rule. As far as I can tell, CNET and Gizmodo are in duplication’s gray area.

The duplication penalty, or lack thereof

The cautionary tale of duplication generally goes like this: Google wants its search results to give precedence to the most popular/legitimate/relevant pages, and it’s tough to pull that off if the same articles appear on different domains. So Google uses filters to push copycats to the margins. Some people call this the “duplicate penalty,” but that’s a misnomer. Google isn’t slapping hands.

Here’s how Google describes its policy on cross-site duplication:

If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.

The noindex meta tag doesn’t appear in the source code of these sample stories from Gizmodo and CNET. However, CNET’s version does link back to Gizmodo’s original. Gizmodo returns the favor when it’s hosting CNET content. (Both show up in a Google search for one of the article’s sentences: Gizmodo’s version is No. 1, CNET’s No. 4.)
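
If you want to run that kind of check yourself, it amounts to fetching the syndicated copy and scanning its source for the two signals Google mentions. Here’s a rough sketch of the idea (Python, standard library only; the sample markup and URLs below are made up, not the actual CNET or Gizmodo pages):

```python
import re
import urllib.request

def fetch(url: str) -> str:
    """Download a page's HTML (not used in the canned demo below)."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def has_noindex(html: str) -> bool:
    # Loose check: any <meta ...> tag whose attributes mention "noindex".
    return bool(re.search(r"<meta[^>]*noindex", html, re.IGNORECASE))

def links_back_to(html: str, original_url: str) -> bool:
    return original_url in html

# Made-up stand-in for a syndicated copy that links back but is not noindexed:
sample_copy = (
    '<head><meta name="viewport" content="width=device-width"></head>'
    '<body><a href="https://gizmodo.com/example-original">original story</a></body>'
)
print(has_noindex(sample_copy))                                            # False
print(links_back_to(sample_copy, "https://gizmodo.com/example-original"))  # True
```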

Since the noindex tag represents the outer limits of my search engine understanding, I dropped a line to Brent Payne, the Tribune Co.’s head of SEO, to get his take on this type of content share.

Payne said variation between two similar articles could help both pieces rank well in search engines. But achieving that variation requires each story to have its own inbound links, as well as different title tags, different headlines, and altered HTML and body text. CNET and Gizmodo customize titles and headlines. The body copy doesn’t really change.

Payne noted that Google supports a “canonical” feature that signals the original version of a story (or the version that’s supposed to get the most attention). The canonical tool doesn’t appear to be in use by CNET or Gizmodo.
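
That canonical signal is also easy to spot when it’s there — it’s just a link element in the page’s head. A minimal sketch of the same kind of check (again Python; the sample markup and URL are invented):

```python
import re

def declares_canonical(html: str, original_url: str) -> bool:
    """True if the page's <link rel="canonical"> points at original_url.
    (Assumes rel appears before href, the most common ordering.)"""
    match = re.search(
        r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    return bool(match) and match.group(1).rstrip("/") == original_url.rstrip("/")

# Invented example of a copy that declares the original as canonical:
sample = '<head><link rel="canonical" href="https://gizmodo.com/example-original"></head>'
print(declares_canonical(sample, "https://gizmodo.com/example-original"))  # True
```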

Payne also brought up an interesting point about Google News, which doesn’t share its big brother’s hang-ups about duplicate content. He said a duplicate article that cites the original — something CNET and Gizmodo both do — could give the original “extra weight” in Google News.

Why are they sharing?

All of this inside-SEO stuff is interesting, but it doesn’t really answer the big question: What’s the upside to duplication?

I’ve got a couple guesses:

Marketing: It used to be you’d visit a publisher’s site to see its latest content, but readers now discover material in a variety of ways — Twitter, Facebook, Digg, RSS, etc. One analytics firm estimates on-site engagement dropped 50 percent between 2007 and 2009. Smart publishers are already addressing this by pushing content beyond their own sites. Toward that end, Gawker Media (owner of Gizmodo) could be using the CNET partnership to “find the next million people.”

Money (obviously): An influx of content can theoretically generate page views, unique visitors, and better user-session times. Good metrics lead to better ad rates and more revenue.

Again, these are just guesses. I’m sure we’ve got SEO and marketing wizards in the audience, so please post a comment if you see clearer explanations.

November 20 2009

04:01

BBC News website adds SEO-friendly headlines


The BBC has made changes to its news website to make its headlines more SEO-friendly.

The headlines appearing on index pages are short and concise as usual, but clicking through to the story reveals a longer headline with search keywords.

For example, the index headline on the story about Google’s Chrome operating system is “Google previews operating system,” which lacks search keywords.

But click through to the story page and the headline becomes “Google previews Chrome open source operating system.”

And the report on the Fort Hood killings has an index headline of “Killings prompt US Army inquiry”, while the story page has the more descriptive “Gates orders Army inquiry after Fort Hood killings.”

The changes appear to have only just been introduced, as older stories still carry the same headline on both the index and the story page.

The short headlines were a result of length restrictions imposed because the content was also distributed on other platforms, including Ceefax.

That meant journalists tended to have limited room for keywords. The new, longer BBC headlines apply standard search engine optimization practice.

BBC News website editor Steve Herrmann explains the changes in a post on the BBC Editors blog.

The front page headlines will remain limited to between 31 and 33 characters and will continue to appear on Ceefax and Digital Text, as they do now, along with the top four paragraphs of each story.

The space constraints on those platforms mean that on the website the headlines have always been short – which, it has to be said, also has its merits, making them easy to scan and fit into lists. They will also continue to appear on mobiles.

The new longer headlines will be up to 55 characters (with spaces) and will aim to include any key words which we might expect a search engine user to type in when searching for news about that particular topic.

So the lead story on the EU presidency has an index and story page headline of “Belgian PM named as EU president.”

But the title tag of the page has an SEO-friendly headline: “Belgian PM Van Rompuy is named as EU president.”
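
Those length rules are concrete enough to sanity-check against the headlines quoted above. A small sketch (Python; the limits are the ones Herrmann states, and the helper is mine):

```python
INDEX_MAX = 33  # short index headline limit (31-33 characters, a Ceefax-era constraint)
LONG_MAX = 55   # longer story-page headline limit, including spaces

def check(index_headline: str, long_headline: str) -> None:
    """Print each headline's length against the stated limits."""
    print(f"index: {len(index_headline)} chars (limit {INDEX_MAX}): {index_headline!r}")
    print(f"long:  {len(long_headline)} chars (limit {LONG_MAX}): {long_headline!r}")

check("Belgian PM named as EU president",                # 32 chars
      "Belgian PM Van Rompuy is named as EU president")  # 46 chars
```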

The BBC News website has had journalists working on search engine optimisation for some time.

I wonder if the list of most popular stories now reflects what people are searching for on Google.
