September 14 2010

12:18

Americans spending more time consuming news, research suggests

A report carried out every two years by the Pew Research Center suggests Americans are spending more time consuming news now than 10 years ago.

The research, released this week, found that rather than replacing traditional media with digital platforms, consumers spend an additional 13 minutes a day getting news online on top of an average of 57 minutes getting news from traditional media such as television, radio and newspapers. In 2000 the survey recorded a total of 59 minutes a day spent consuming news, and respondents reported no time spent getting news online until 2004.

According to the report, this is one of the highest totals measured since the mid-1990s, and that figure does not take into account time spent getting news from mobile phones or other digital devices. Only eight per cent of respondents get news on their mobile.

The news consumption survey recorded responses from more than 3,000 adults between 8 and 28 June. Other findings include a rise in ‘news-grazers’, who consume news on a less regular basis, from 40 per cent of respondents in 2006 to 57 per cent in 2010. The survey also found increased use of search engines for news gathering, up to 33 per cent from 19 per cent in 2008.

See the report in full here.



July 27 2010

08:30

#Tip of the day from Journalism.co.uk – improve website page speeds

Advice from DailyBlogTips on how to improve page load speeds on your site and gain higher rankings in search engines. Tipster: Rachel McAthy. To submit a tip to Journalism.co.uk, use this link - we will pay a fiver for the best ones published.
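The DailyBlogTips post itself isn't reproduced here, but the usual techniques in this area are server-side compression and browser caching of static assets. As a rough sketch only (assuming an Apache server with mod_deflate and mod_expires available, not something taken from the linked advice), an .htaccess file might include:

    # Compress text-based responses before sending them to the browser
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>

    # Let browsers cache static assets so repeat visits load faster
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/png "access plus 1 month"
        ExpiresByType text/css "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
    </IfModule>

Fewer and smaller requests generally mean faster pages, which is what search engines reward.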


July 21 2010

19:22

How to Gauge Success Using New Metrics

Last week, I met with two people from a non-profit in Phoenix that looks at progressive policies to balance economic development with the environment. Land use and livable communities are two of their key talking points, so it seems logical that they should be aware of a service that encourages and enables people to use light rail to get around the inner city, right? For those unaware, that describes our Knight Foundation-funded project, CityCircles.

As we discussed CityCircles during the meeting, the inevitable question arose: How much traffic are you getting?

The answer, in all honesty, is not much at the moment.

But "hits" -- or page views, or unique visits, or whatever traditional web metric you choose to use here -- is not what we're looking for at CityCircles.

Our project is less about "how many" people are using the service and more about "how" people are using it: How they are interacting with it, with each other, and with the light rail community at-large as a result of our existence. I bring this up because it will inevitably be part of any early discussion you may have about your own startup.

The Battle For the Top of Search Results

Your answer will obviously be critical to how the project is perceived. We do our best to follow how web usage is developing as new startups go live. One particularly interesting development is the rise of mile-wide content creators like Demand Media, Associated Content and other related sites. (See the ongoing feature on these kinds of content farms being published this week at MediaShift.)

In general, these companies pay writers of a general skill level to write about almost every topic under the sun for an extremely, ahem, modest fee. They are essentially choosing quantity over quality as their business model. (However, that is in the eye of the beholder, as any piece of content is capable of being high-quality to a particular user if it's exactly what they're looking for at exactly the right time. It just tends to be something that won't win any major journalism accolades.)

A really great story on this topic -- with a really great volley of thoughtful comments -- came out earlier this month on The Wrap.

There's a lot there to contemplate, but what I prefer to ponder is a post written by FoundingDulcinea's Mark Moran in December 2009.

He argues -- successfully, in my mind -- that sites like Associated Content and others will, over time, kill search engines' usefulness (if the search companies don't address this issue). The deluge of content from thousands of writers on multiple topics will come to dominate the top search rankings, diminishing the utility each user gets from a search. As some have noted, certain searches already require you to wade through these posts to reach the deeper results on the topic you are interested in; for the sites pushed down the rankings, that equates to being invisible in search, because few users click past the first page or two of results.

Metrics to Consider

Why do I bring this up for potential startups?

The impact these sorts of sites can have may force you to rethink your own metrics. If page views work for you (and you should think beyond that), then that's great. Just remember to follow developments that affect search engines, because that is where validation for your project will come from as you talk to potential stakeholders.

If you'd like to consider other options, here are a few metrics we are tracking under our grant:

  • Number of registered users
  • Number of posts
  • Number of comments on those posts
  • Number of community improvement projects completed
  • Most frequently visited landing pages for light rail stops
  • Length of time spent on our mobile website (train schedules)
  • Interviews with users at our light rail events (anecdotal stories about light rail use)
  • Number and type of merchants participating in our light rail events

Food for thought -- especially at a time when startups have to score millions of page views to attract a whiff of advertising money, if that's your business model. Our model is based on a deeper experience of use, not just information consumption.

Start thinking ahead to answer that inevitable question.

April 16 2010

14:20

How Blocking Search Engines Can Increase Ad Click-Throughs

Search engines, RSS feeds and content aggregators make a reader's life easier by providing new ways to scan for articles and to discover news. One result of this is that readers may no longer feel the need to regularly visit their local paper's website in order to stay informed about the goings-on around town.

Following this logic, publishers work hard to make their content as searchable as possible, to make it accessible outside of a newspaper website. Conventional wisdom dictates that websites should be optimized for search engines.

But what if your content is very specific in nature? Suppose that you have a respected brand, and that people in your community look to you to provide information that is relevant to them. When newspapers give their readers alternative ways to access their information, they are gambling that the a la carte traffic coming back from the search engine will more than make up for the loss of the direct traffic they previously received.

The theory goes that the easier your content is to find, the more traffic your site will receive. But a recent experiment by a few newspapers in Northern California suggests there's value in keeping some content away from search engines and aggregators.

Papers Prevent Search Engines, Aggregators

During the first quarter of 2008, three small newspapers in Northern California with website pay walls edited their robots.txt files to disallow search engines and aggregators from indexing any content on their websites. I am vice president of digital media for the newspapers in question. I run web strategy, sales and operations for dailyrepublic.com, davisenterprise.com, and mtdemocrat.com. We made the change when local advertisers started buying Google AdWords instead of ads on our website. Realtors, for example, buy the keywords "Fairfield Real Estate News" and advertise on our content through Google, which is not good for us.
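The papers' actual robots.txt directives aren't quoted in this post, but a blanket block of this kind typically amounts to a single rule along these lines (a sketch, not the papers' actual file):

    # Ask all compliant crawlers (search engines and most aggregators) to stay out entirely
    User-agent: *
    Disallow: /

Note that robots.txt is advisory: it keeps well-behaved crawlers from indexing the site, but it is not an access control mechanism.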

As a result, management at the papers decided to cut off search engines and aggregators. You can view some of the results here:

[Figure: charts showing growth in website traffic across the four key metrics studied]

As the charts above illustrate, website traffic has grown steadily in each of the four key metrics we studied. What was most surprising, however, was the impact this change had on our ad-serving effectiveness. The click-through rate for ads rose from a modest 0.29 percent in 2008 to an average of 2.87 percent today on paid access pages. (You can also view some related data here. It compares paid and free websites of similar size.)
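For readers unfamiliar with the metric, click-through rate is simply the share of ad impressions that result in a click:

    CTR (%) = (ad clicks / ad impressions) x 100
    e.g. 29 clicks on 10,000 impressions is 0.29%; 287 clicks on 10,000 impressions is 2.87%

(The example counts are illustrative; only the percentages come from our data.)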

It appears that for these papers, traffic volume alone does not determine click-through rates. What I'm suggesting is a positive correlation between increased reader frequency and the click-through rate. Frequency is key to generating advertising response. Simply put: newspapers that give their readers too many ways to read their content may be inadvertently destroying the advertising effectiveness that sustains their business.

I am not trying to convince you that every website should block search engines, or that newspapers should all try pay walls. But I implore the news community to consider that it is plausible for a news organization to thrive without search engine traffic.

It's a concept that stirs up emotional responses from many in the news industry -- but it deserves more logical contemplation.


November 05 2009

08:45

Guardian makes its comments accessible, SEO friendly and mobile friendly all in one go!

The Guardian has changed its user-generated comment system – moving from a client-side system to a server-side one. (This story was first published here, where you can read a bit more of the background.)

With the old system, once you loaded a story, some javascript would go off and look up readers’ comments and display them. This wasn’t terribly accessible: if you couldn’t or didn’t run javascript, you couldn’t see the comments.

It was also bad for SEO, as search engines couldn’t run the javascript (so couldn’t see the comments). And if your mobile didn’t run javascript (like mine), you couldn’t read the comments either.

With the new system, the comments are just part of the web page, like all the rest of the text.
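In rough terms (this is an illustrative sketch, not the Guardian's actual code, and the endpoint name is hypothetical), the old approach only produced comments for clients that could run the script:

    // Old client-side approach: comments are fetched and injected after the page loads,
    // so crawlers and non-javascript browsers never see them.
    fetch('/article/123/comments.json')
      .then(response => response.json())
      .then((comments: { author: string; body: string }[]) => {
        const container = document.getElementById('comments');
        if (container) {
          container.innerHTML = comments
            .map(c => '<p><strong>' + c.author + '</strong>: ' + c.body + '</p>')
            .join('');
        }
      });

    // New server-side approach: the comments are rendered into the HTML before it is sent,
    // so they arrive as ordinary page text that any browser, mobile or crawler can read.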

This is a great change by the Guardian, and not before time. Google has already started to index the text of comments, as this search for some text I left as a comment once shows.

If you notice any problems, they’ve asked you to point them out.
