
July 25 2011

08:07

The style challenge

Spot the odd one out. Image by Cliff Muller

Time was when a journalist could learn one or two writing styles and stick with them. They might command enormous respect for being the best at what they did. But sometimes, when that journalist moved to another employer, their style became incongruous. And they couldn’t change.

This is the style challenge, and it’s one that has become increasingly demanding for journalists in an online age.

Because not only must they be able to adapt their style for different types of reporting; not only must they be able to adapt for different brands; not only must they be able to adapt their style within different brands across multiple media; but they must also be able to adapt their style within a single medium, across multiple platforms: Twitter, Facebook, blogs, Flickr, YouTube, or anywhere else that their audiences gather.

Immersion and language

Style is a fundamental skill in journalism. It is difficult to teach, because it relies on an individual immersing themselves in media, and doing so in a way that goes beyond each message to the medium itself. This is why journalism tutors urge their students so strongly to read as many newspapers as they can; to watch the news and listen to it, obsessively. Without immersion it is difficult to speak any language.

Now, some people do immerse themselves and have a handle on current affairs. That’s useful, but not the point.

Some do it and gain an understanding of institutions and audiences (that one is left-leaning; this one is conservative with a small c, etc.).

This is also useful, but also not the point.

The point is about how each institution addresses each audience, and when.

Journalists and editors often have an intuitive understanding of this difference in print or broadcast, yet over the last decade they have repeatedly demonstrated an inability to apply the same principles when publishing online.

And so we’ve had shovelware: organisations republishing print articles online without any changes. We’ve had opinion columns published as blogs because ‘blogs are all about opinion’. And we’ve had journalists treating Twitter as just another newswire to throw out headlines.

This is like a first attempt at radio broadcasting that opens with “Hey all you out there”, as if the presenter were a Balearic DJ. Good journalists should know better.

Style serves communication

Among many other things, a good journalism or media degree should teach not just the practical skills of journalism but also an intellectual understanding of communication and, by extension, style.

Because style is, at its base, about communication. It is about register: understanding what tone to adopt based on who you are talking to, what you are talking about, the relationship you seek to engender, and the history behind that.

As communication channels and tools proliferate, we probably need to pay more attention to that.

Journalists are being asked to adapt their skills from print to video; from formal articles to informal blog posts; from Facebook Page updates to tweets.

They are having to learn new styles of liveblogging, audio slideshows, mapping and apps; to operate within the formal restrictions of XML or SEO.

For freelance journalists, commissioning briefs increasingly ask for that flexibility even within the same piece of work, offering an extra payment for an online version, a structured version, a podcast, and so on.

These requests are often quite basic – requiring a list of links for an online version, for example – but as content management systems become more sophisticated, those conditions will become more stringent: supplying an XML file with data on a product being reviewed, for example, or a version optimised for search.

What complicates things further is that, for many of these platforms, we are inventing the language as we speak it.

For those new to the platform, it can be intimidating. But for those who invest time in gaining experience, it is an enormous opportunity.

Because those who master the style of a blog, or Facebook, or Twitter, or addressing a particular group on Flickr, or a YouTube community, put themselves in an incredible position, building networks that a small magazine publisher would die for.

That’s why style is so important – now more than ever, and in the future more than now.


June 27 2011

14:38

My online journalism book is now out

The Online Journalism Handbook, written with Liisa Rohumaa, has now been published. You can get it here.

I’ve been blogging throughout the process of writing the book – particularly the chapters on data journalism, blogging and UGC – and you can still find those blog posts under the tag ‘Online Journalism Book‘.

Other chapters cover interactivity, audio slideshows and podcasting, video, law, some of the history that helps in understanding online journalism, and writing for the web (including SEO and SMO).

Meanwhile, I’ve created a blog, Facebook page and Twitter account (@OJhandbook) to provide updates, corrections and additions to the book.

If you spot anything in the book that needs updating or correcting, let me know. Likewise, let me know what you think of the book and anything you’d like to see added in future.

11:03

What I learned from the Facebook Page experiment – and what happens next

Last week my experiment in running a blog entirely through a Facebook Page quietly came to the end of its allotted four weeks. It’s been a useful exercise, and I’m going to adapt the experiment slightly. Here’s what I’ve learned:

It suits emotive material

The most popular posts during that month were simple links that dealt with controversy – Isle of Wight council talking about withdrawing accreditation if a blogger refused to pre-moderate comments; and the wider issue of being denied access to public documents or meetings on the basis of blogging.

This isn’t a shock – research into Facebook tends to draw similar conclusions about the value of ‘social’ content.

That said, it’s hard to draw firm conclusions because the Insights data only gives numbers on posts after June 9 (when I posted a book chapter as a series of Notes), and the network effects will have changed as the page accumulated Likes.

It requires more effort than most blogs

With most blogging it’s quite easy to ‘just do it’ and then figure out the bells and whistles later. With a Facebook Page I think a bit of preparation goes a long way – especially to avoid problems later on.

Firstly, there’s the choice whether to start one from scratch or convert an existing Facebook account into a Page.

Secondly, there’s the page name itself: at first you can edit this, but after 100 Likes you can’t. That leaves my ‘Paul Bradshaw’s Online Journalism Blog on FB for 1 month’ looking a bit silly five weeks later. (It would be nice if Facebook warned you that this was about to happen.)

Thirdly, if you want to write more than 420 characters, you’ll need to use Notes (ideally while logged on as the Page itself, which will result in the Note being auto-posted to the wall). And if you want to link phrases without littering the note with ugly URLs, you’ll need to use HTML code.
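A minimal example of the kind of HTML a Note will accept for linking a phrase (the URL and link text here are just placeholders):

<a href="http://example.com/full-story">the full story</a>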

Next, there’s integration with other online presences. Here are the apps I used:

  1. RSS Graffiti (for auto-posting RSS feeds from elsewhere)
  2. Slideshare (adds a new tab for your presentations on that site)
  3. Cueler YouTube (pulls new updates from your YouTube account)
  4. Tweets to Pages (pulls from your Twitter account into a new tab)

There’s also Smart Twitter for Pages which publishes page updates to Twitter; or you can use Facebook’s own Twitter page to link pages to Twitter.

Finally, I was thankful that I had used a Feedburner account for the Online Journalism Blog RSS feed. That allowed me to change the settings so that subscribers to the blog would still receive updates from the Facebook page (which also has an RSS feed) – and change it back afterwards.

It’s not suited for anything you might intend to find later

Although Vadim Lavrusik pointed out that you can find the Facebook page through Google or Facebook’s own search, individual posts are rather more difficult to track down.

The lack of tags and categories also makes it difficult to retrieve updates and notes – and highlights the problems for search engine optimisation.

This created a curious tension: on the one hand, short term traffic to individual posts was probably higher than I would normally get on the blog outside Facebook. On the other, there was little opportunity for long term traffic: there was no footprint of inbound links for Google to follow.

This may not be a problem for local, hard news organisations which have a rapid turnover of content, no need to rank in Google News, and little value in the archives.

But there are too many drawbacks for most to move (as Rockville Central’s blog recently did) completely to Facebook. It simply leaves you too isolated, too ephemeral, and too vulnerable to changes in Facebook’s policies.

Part of a network strategy

So in short, while it’s great for short-term traffic, it’s bad for traffic long-term. It’s better for ongoing work and linking than for more finished articles. It shouldn’t be viewed in isolation from the rest of the web, but rather as one more prong in a distributed strategy: just as I tweet some things, tumblelog others, and simply share or bookmark others, Facebook Pages fit in somewhere amidst all of that.

Now I just need to keep on working out exactly how.


June 06 2011

01:26

SEO science decoded: The Periodic Table of SEO

Search engine optimization — SEO — seems like alchemy to the uninitiated, but there’s a science to it. Search engines reward pages with the right combination of ranking factors. To help demystify what seems a murky and inexact science to many, Search Engine Land has created a wonderful “Periodic Table Of SEO Ranking Factors.” Click on the charts below for the condensed and full PDF versions.


June 05 2011

13:55

Infographic: what publishers should know - the Periodic Table of SEO ranking factors

Searchengineland :: Search engines reward pages with the right combination of ranking factors, or “signals.” SEO is about ensuring your content generates the right type of signals. Search Engine Land's chart below summarizes the major factors to focus on for search engine ranking success. They also offer a free full guide (download) that explains the factors in more depth.

Search Engine Land Periodic Table of SEO Ranking Factors

Continue to read searchengineland.com

Illustration by columnfivemedia.com

June 02 2011

16:04

Donald Mahoney: Internet journalism after content farms or "What time does the Super Bowl start?"

Some Blind Alleys :: On February 6, 2011, the Huffington Post published what has become one of the most infamous and emblematic stories of the internet journalism age. The story concerned the starting time of Super Bowl XLV. Its headline was: “What Time Does The Super Bowl Start?” and it still owns the top Google ranking for the query.

Consider this overly obvious, if oddly formal intro sentence: “Super Bowl 2011 takes place on Sunday, Feb. 6, 2011, at 6:30 p.m. Eastern Time and 3:30 p.m. Pacific Time.”

The sentence defies the perception that news in the digital era must be as direct and succinct as possible.

"What time does the Super Bowl start? Or, internet journalism after content farms" - continue to read Donald Mahoney, someblindalleys.com

March 31 2011

11:47

Are community moderators etc. journalists? Another ice cream question

Looking down? Photo by Bhaskar Pyakurel/Photoctor http://www.flickr.com/people/dev07/

Here we go again. Fleet Street Blues reports on a user comment which “seems to make quite a lot of sense”. It reads as follows:

“Five years or so ago, there was a certain kind of old-school journalist who, converted to the cause, as it were, banged on at length about the importance of hacks having a web presence of the highest order to demonstrate the new skills. It turns out, however, that the new skills are a piece of piss (particularly with current web technology), and promoting a yarn via Google, Facebook, Twitter etc is, in reality, an administrative task rather than a journalistic one. If you want to employ a proper journalist rather than a cheap web monkey, the SEO stuff really is secondary. (Of course, there is the fact that many employers actively want web monkeys rather journalists because they are so much cheaper, but that’s a whole other debate.)”

What is wrong with this picture?

Well, even before we get to the conclusion, the premise is flawed.

The headline is indicative: “The difference between promoting a yarn… and writing it”

This is the usual ‘drawing a line’ waste of time (“Is ice cream strawberry?”) that seeks to establish some kind of higher ground that journalists can occupy, rather than actually asking what we want to do in our journalism.

If you want to have a web presence to demonstrate new skills for your career, fine. If you want to use those skills to produce good journalism while you’re at it, however, then you’ll probably do a great deal better.

The point of community management/SEO/social media optimisation etc. from a journalist’s point of view is that it should seek to involve readers as early as possible, and so improve the editorial product while it is produced. It also means that, once published, any errors or additions are likely to be contributed by users.

It’s the difference between seeing users as passive audiences, or as active collaborators.

If you see them as audiences then, yes: SEO/SMO/community management is an administrative job akin to being a papergirl or delivery man.

If you see them as collaborators and users, however, then no: SEO/SMO/community management is not something you can comfortably leave entirely to someone else.

via Mary Hamilton.

March 08 2011

17:09

Using Wire Content for Topic Pages without an SEO ding?

Is there a way to pull in AP or Reuters wire content in a way that won't get dinged by Google's SEO rules? We'd like to pull in AP content, for example, and then place related links, multimedia, etc. around it, making it a more valuable page, but this seems to invariably run into the duplicate content rule. If there's enough value-added content in there as well, does that negate the problem? Any recommended resources for learning more?

Tags: seo google ap


February 16 2011

13:10

Does Twitter improve your site’s search engine results?

A Tweet's Effect On Rankings - An Unexpected Case Study

Yes. Or at least according to a couple of blog posts in the SEO blogosphere.

Back in December Search Engine Land’s Danny Sullivan asked what “social signals” Google and Bing count in their algorithms. Previously, the answer would have been none, as far as Twitter is concerned, because like most social media (including blog comments, forum posts and social networks) any links posted on Twitter carry a ‘nofollow’ attribute, instructing search engines to ignore them.
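In HTML terms, a nofollowed link looks like this (the URL and anchor text are placeholders):

<a href="http://example.com/story" rel="nofollow">an interesting story</a>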

But now that Twitter has signed deals with the big search engines, they get the “firehose” of data from Twitter direct – without nofollow attributes. Bing tells Sullivan:

“We take into consideration how often a link has been tweeted or retweeted, as well as the authority of the Twitter users that shared the link.”

Google tells him:

“We use the data only in limited situations, not for all of general websearch.”

The post contains more information about how both search engines use the “social authority” of a user (followers, followed, etc.) to further rank links.

A case study

Yesterday, the issue gained a fascinating case study from SEOmoz (image at top), when one of their articles suddenly appeared on the first page of Google search engine results for the term “Beginner’s Guide” following a tweet from Smashing Magazine and hundreds of retweets.

More interestingly, the article remained on the first or second page of results for weeks afterwards.

SEOmoz’s takeaways from the experience include:

  • “It appears likely that Google (and Bing) are using the concept they described in the interview on SELand of “Author Authority” to help weight the value of tweets (as we’ve seen that bot-repeated tweeting in similar quantities doesn’t have this affect)
  • “There seems to be some long-term, nascent value carried by tweets in addition to the short-term effects. If this is consistently observed, expect a lot more SEO activity around engaging and incenting tweeting to key URLs.
  • “It’s still unknown whether and how much the text of a tweet impacts the SERPs [Search Engine Results Pages] in a way similar to anchor text. That will be an excellent next test for us to observe.”

Why is this important? Because up till now search engines (where users actively seek) and social media (where content is brought to your attention) have been the two major sources of traffic for most news websites.

SEO and social media optimisation (or social media marketing: SMM) have traditionally been separate: this might suggest an increasingly integrated approach.

January 18 2011

17:00

New Yorker web editor: The site is “guided by what’s on paper”

In a 2006 post at Design Observer, Michael Bierut praised what he termed the “slow design” of The New Yorker: “the patient, cautious, deliberate evolution of a nearly unchanging editorial format over decades.”

It’s an apt description. As Jon Michaud, the magazine’s archive director, told me, “There have been slight design changes over the years — the pages are now a little smaller than they used to be. We put the bylines at the top of articles, no longer at the bottom. We introduced photographs in the ’90s.”

But “for the most part, the magazine has evolved slowly over the decades.”

The New Yorker’s self-conscious connection to its own past is undoubtedly one of its key selling points. But what about the more future-oriented component of the publication: the digital magazine that lives on the web? When you redesign your site — as The New Yorker did late last year, in its first online revamp since 2007 — how do you balance “a nearly unchanging editorial format” with the needs of transition to (an at least partly) digital existence?

One way: Even online, preserve an ethos of print. “Designers who’ve worked on the print magazine week after week were intimately involved in the web design,” Blake Eskin, The New Yorker’s web editor, explained in an email. So “there are all sorts of ways, articulated and unarticulated, in which the look and feel of newyorker.com is guided by what’s on paper.”

Indeed, the new update is — as The New Yorker always has been — spare in its use of text, minimal throughout, and squeaky clean. It even makes more use of Irvin, the iconic 1925 typeface designed by (and named for) the magazine’s original art director, Rea Irvin. Illustrations and other art have also been more integrated into newyorker.com, and can be found at almost every turn — clever, and reliably unpredictable.

Then again, not everything on the new site is print-derivative. The magazine’s vintage sensibility notwithstanding, it was actually The New Yorker’s iPad app that inspired many of the site’s visual design choices, Eskin told me — like the greater use of images, both thumbnail and full-screen. “Before, our website, much like the printed magazine, had been more sparing in its use of art, and the iPad helped pave the way for using more images,” he says. “We tried to optimize both digital formats for readability. Which is why the default font size is bigger — one benefit of removing the sidebar on the left edge of most pages.”

SEO was a factor, as well. “The removal of the sidebar made a more open page, but it should also help search engines to notice our stories,” Eskin notes. (Headlines, with the help of Typekit, are also searchable.) Likewise, “as we’ve added more writing that isn’t from the magazine, and more audio and video and slide shows, we outgrew navigation that largely followed the structure of the print magazine.”

The most telling change, though, is as much about philosophy as it is about design. On the re-launched site, “we put less of the magazine online than we used to,” Eskin says. It’s a choice that will likely become more common as The New Yorker’s fellow outlets make key decisions about paid content. “Especially now that ‘Information wants to be free’ is no longer an article of faith — we wanted to tell our paying subscribers that they can access everything,” he says. “And to tell our non-paying visitors that there’s a lot that they’re missing.”

December 20 2010

11:42

How specialist publishers can compete with national news organisations for SEO

A guest post from news:rewired speaker and SEO and content strategy consultant Malcolm Coles

When national news organisations like the Mail or the BBC take an interest in your specialism, they can siphon off all your search traffic. All of a sudden, you go from being the number one result on Google for a given search term to being buried under a mass of news stories from the mainstream media.

In my last post here I talked about how you should make sure your site is put together in the right way for search engines – that’s a key first step in getting SEO traffic. In the SEO session on the day, I talked about three tactics that celebrity gossip site Holy Moly uses on a day-to-day basis to help it compete against all of the other sites offering news about celebs.

Know what people are searching for
The first key tactic is to understand how people are searching. There are lots of ways to work out what people are searching for right now – although Yahoo announced the death of one of those, Yahoo Buzz, on the day of news:rewired itself!

So by using Google Autocomplete to work out that people were searching for “Leslie Nielsen quotes” the day of the actor’s death, Holy Moly used that as the angle for its story. It was the number one result for three hours for that search, and got 5,000 viewers – all from just a few seconds’ digging about.

Work out when and how people search
Another key tactic is to work out when people search. If you know an event is coming, you can use your own analytics data and tools like Google Insights for Search to understand how people search in the run-up to an event, during it, and afterwards. Holy Moly looks at this data to work out when and what to write about reality TV shows, for instance.

This graph shows search engine traffic to the site for searches on Karen Gillan’s underwear. The searches coincide almost exactly with when Doctor Who was on TV. What’s more, the individual spikes are Saturdays and Sundays – when Doctor Who is broadcast. Data like this means Holy Moly understands it’s worth writing about Karen Gillan in the run-up to weekends when Doctor Who is on, and not at any other time. Which is probably why, with the Doctor Who Xmas special on its way next week, their latest Karen Gillan underwear story has just gone live…

Look after your searched for pages
Once you’ve got a page that gets you lots of search traffic, make sure you look after it. With news sites, it’s easy for stories to move deeper and deeper into date-based archives until eventually search engines forget all about them.

So if you have a page that gets lots of search traffic, keep linking to it from new relevant news stories. That way you can keep reminding search engines that it exists and that it’s important.

Another problem can be search engines showing the “wrong” page. For instance, Holy Moly has an old page about Karen’s underwear that appears top of the search results. There are similar issues with Holy Moly’s stories from previous years’ reality TV shows turning up when people search, as opposed to the current year’s pages (or Big Brother pages showing up when people search for Celebrity Big Brother).

When this happens, you should go to the “wrong” page and insert a link to what you think the “right” page is – and also make sure you find ways to link to the newer “right” page from other stories you write. The more you do this, the more you signal to the search engines that the newer page is a better result when people are searching.
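As a sketch, that fix can be as simple as one extra link in the old page’s copy, pointing at the newer page (both URLs here are hypothetical):

<a href="http://www.example.com/2010/karen-gillan-latest">Read this year's story</a>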

If you need any advice about SEO and your own site, feel free to contact me on Twitter or at my content strategy and SEO blog.

December 19 2010

18:00

Games, systems and context in journalism at News Rewired

I went to News Rewired on Thursday, along with dozens of other journalists and folk concerned in various ways with news production. Some threads that ran through the day for me were discussions of how we publish our data (and allow others to do the same), how we link our stories together with each other and the rest of the web, and how we can help our readers to explore context around our stories.

One session focused heavily on SEO for specialist organisations, but included a few sharp lessons for all news organisations. Frank Gosch spoke about the importance of ensuring your site’s RSS feeds are up to date and allow other people to easily subscribe to and even republish your content. Instead of clinging tight to content, it’s good for your search rankings to let other people spread it around.

James Lowery echoed this theme, suggesting that publishers, like governments, should look at providing and publishing their data in re-usable, open formats like XML. It’s easy for data journalists to get hung up on how local councils, for instance, are publishing their data in PDFs, but to miss how our own news organisations are putting out our stories, visualisations and even datasets in formats that limit or even prevent re-use and mashup.

Following on from that, in the session on linked data and the semantic web, Martin Belam spoke about the Guardian’s API, which can be queried to return stories on particular subjects and which is starting to use unique identifiers – MusicBrainz IDs and ISBNs, for instance – to allow lists of stories to be pulled out not simply by text string but using a meaningful identification system. He added that publishers have to license content in a meaningful way, so that it can be reused widely without running into legal issues.

Silver Oliver said that semantically tagged data, linked data, creates opportunities for pulling in contextual information for our stories from all sorts of other sources. And conversely, if we semantically tag our stories and make it possible for other people to re-use them, we’ll start to see our content popping up in unexpected ways and places.

And in the long term, he suggested, we’ll start to see people following stories completely independently of platform, medium or brand. Tracking a linked data tag (if that’s the right word) and following what’s new, what’s interesting, and what will work on whatever device I happen to have in my hand right now and whatever connection I’m currently on – images, video, audio, text, interactives; wifi, 3G, EDGE, offline. Regardless of who made it.

And this is part of the ongoing move towards creating a web that understands not only objects but also relationships, a world of meaningful nouns and verbs rather than text strings and many-to-many tables. It’s impossible to predict what will come from these developments, but – as an example – it’s not hard to imagine being able to take a photo of a front page on a newsstand and use it to search online for the story it refers to. And the results of that search might have nothing to do with the newspaper brand.

That’s the down side to all this. News consumption – already massively decentralised thanks to the social web – is likely to drift even further away from the cosy silos of news brands (with the honourable exception of paywalled gardens, perhaps). What can individual journalists and news organisations offer that the cloud can’t?

One exciting answer lies in the last session of the day, which looked at journalism and games. I wrote some time ago about ways news organisations were harnessing games, and could do in the future – and the opportunities are now starting to take shape. With constant calls for news organisations to add context to stories, it’s easy to miss the possibility that – as Philip Trippenbach said at News Rewired – you can’t explain a system with a story:

Stories can be a great way of transmitting understanding about things that have happened. The trouble is that they are actually a very bad way of transmitting understanding about how things work.

Many of the issues we cover – climate change, government cuts, the deficit – at macro level are systems that could be interestingly and interactively explored with games. (Like this climate change game here, for instance.) Other stories can be articulated and broadened through games in a way that allows for real empathy between the reader/player and the subject because they are experiential rather than intellectual. (Like Escape from Woomera.)

Games allow players to explore systems, scenarios and entire universes in detail, prodding their limits and discovering their flaws and hidden logic. They can be intriguing, tricky, challenging, educational and complex, like the best stories, but they’re also fun to experience, unlike so much news content that has a tendency to feel like work.

(By the by, this is true not just of computer and console games but also of live, tabletop, board and social games of all sorts – there are rich veins of community journalism that could be developed in these areas too, as the Rochester Democrat and Chronicle is hoping to prove for a second time.)

So the big things to take away from News Rewired, for me?

  • The systems within which we do journalism are changing, and the semantic web will most likely bring another seismic change in news consumption and production.
  • It’s going to be increasingly important for us to produce content that both takes advantage of these new technologies and allows others to use these technologies to take advantage of it.
  • And by tapping into the interactive possibilities of the internet through games, we can help our readers explore complex systems that don’t lend themselves to simple stories.

Oh, and some very decent whisky.

Cross-posted at Metamedia.

December 15 2010

10:18

Case Study – Two political blog articles which went viral

One of the areas which interests me is how independent publishers can cut through to build an audience, or drive a story into the wider public arena. This is a cross-post from the Wardman Wire.

Two articles from the last month by the Heresiarch and Anna Raccoon form an interesting study in articles by political bloggers which gained widespread attention. Both of these pieces went viral via Twitter, rather than Facebook or any other social network.

Firstly, a piece which caught the moment when the conviction of “Twitter Terrorist” Paul Chambers was confirmed. This piece achieved almost 1,000 retweets.

This is the headline and abstract:

Heresy Corner: With the Conviction of Paul Chambers, it is now illegal to be English.

There is something deeply and shockingly offensive about the conviction of Paul Chambers for his Twitter joke, almost unbelievably reaffirmed today at the Crown Court in Doncaster. It goes beyond the normal anger anyone would feel at a blatant injustice, at a piece of prosecutorial and judicial overkill that sees the might of the state pitted against a harmless, unthreatening individual for no good reason.

Secondly, a piece from Anna Raccoon last week, about the case of Steven Neary, who seems to have been caught up in a bureaucratic whirlpool because of his autism:

The Orwellian Present – Never Mind the Future.

Steven Neary, Deprivation of Liberty Safeguards, Welfare Deputyships and The Court of Protection

These retweet numbers are 50-100 times what a reasonably well-received article will achieve. As a comparison, the last 6 articles on the Heresy Corner homepage this morning are showing 3, 5, 4, 9, 40 and 2 retweets.

My observations:

1 – Both are non-party-aligned writers embedded in the political blog niche, but they also cover political questions from a position of non-political knowledge, with a degree of authority/respect which has come from their own work over two years or more.

2 – In these instances, both are amateur or professional subject specialists in the areas they cover here, and have an established readership who are able to give a boost to a piece in the social media nexus. As a comparison, in the world of Internet Consultancy much time (and money) is spent trying to build initial traction for articles and websites to give them a boost into wider internet prominence.

3 – The importance of “connectors”. Anna Raccoon’s piece received a significant boost from Charon QC, who provides an important hub-site in the legal niche – which of course is one place where a real difference can be made to Steven Neary’s situation.

4 – The “edge of the political blogosphere” has become very important – both for specialist sites writing about political questions, and political blogs who “do more than politics”.

5 – These are two different types of article. The Heresy Corner summarised the online reaction to the “I’ll blow your airport sky high” Twitter Joke Trial case at the right time to catch the Zeitgeist, while Anna Raccoon’s is a campaigning piece trying to direct attention to a particular case, in an area of society she has written about on perhaps a dozen occasions.

6 – Several legal commentators (eg Jack of Kent in addition to Charon) have pointed out (correctly) that for a campaigning piece to convert attention into action, there needs to be more complete information about both sides of the story. A spotlight can be directed onto a perceived abuse, but there needs to be objective investigation afterwards.

That is a good distinction; but the rub is that officialdom can prevent both sides of the story being available to the public, and often reacts only to media spotlights – not to problems it has not yet been embarrassed about.

7 – Neither of these bloggers is deeply embedded in the Facebook ecosystem, which is a distinct difference from some other mainly political sites, which report Facebook as a major source of traffic (example). I’ll write more on this another time, because I think it is important.

8 – During November, when the Paul Chambers piece was published, Heresy Corner jumped from 134 in the Wikio blog ranks to number 15 (illustrated). This was after changes which introduced a “Twitter” factor into the Wikio rankings. I’d suggest that this level of volatility may illustrate that they’ve overdone it.

Wrapping Up

The missing link for independent publishers is the ability to translate incisive observation or reporting into an effective influence.

I’ll return to that subject soon.

Can I ask a favour of the brave souls who’ve reached the end of this article? I need a couple of dozen Facebook “Likes” for my own site’s new Facebook page to gain access to all features. You can “Like” me at the bottom of the right-hand sidebar here.

December 05 2010

20:19

Facebook, cartoon avatars, paedophiles and SEO as a public service

A few days ago status updates like this were doing the rounds on Facebook:

“Change your facebook profile picture to a cartoon from your childhood and invite your friends to do the same. Until Monday (December 6), there should be no human faces on facebook, but a stash of memories. This is for eliminating violence against children.”

Of course it is. Or maybe not. Today, the rumour changed poles:

“This cartoon thing has been set up by paedos using A registered charities name to entice kids. apparently on the 6th dec you will be kicked off fb if u have cartoon pics. The more folk that… put up cartoon pics the harder it is fo…r the police to catch these sickos!!”

There doesn’t appear to be any truth in the latter rumour. Internet hoax library Snopes has a similar hoax listed, and this seems to be a variant of it.

SEO as a public service

Hoax updates do the rounds on social networks and text messages on a semi-regular basis. Remember the one about children being kidnapped in supermarket toilets? Or how about police banning English flags in pubs for fear of offending people?

In both cases the mainstream media was slow to react to the rumours. A Google search – which would be a typical reaction of anyone receiving such a message – would bring up nothing to counter those rumours. (Notably, perhaps because of its public and real-time nature, Twitter seems better at quashing hoaxes).

Search engine optimisation (SEO) is much derided for a perception that it leads news organisations to write for machines, or to aim for the lowest common denominator. But SEO has a very valuable role in serving the public: if searches on a particular rumour shoot up, or mentions of it increase on social networks, it’s worth verifying the rumour and getting the facts up quickly.

This is another reason why journalists should be on social networks, and why publishers should be monitoring them more broadly. Whether your motivations are civic or commercial, it makes sense both ways.

PS: If you need any tips on methods and tools, see my Delicious bookmarks for verification.

(h/t to Conrad Quilty-Harper)

November 30 2010

20:35

Content or design? Using analytics to identify your problem

editorial analytics

As an industry, online publishing has gone through a series of obsessions: from ‘Content is King’ to information architecture (IA), and from SEO (search engine optimisation) to SMO (social media optimisation).

Most people’s view of online publishing is skewed towards one of these areas. For journalists, it’s likely to be SEO; for designers or developers, it’s probably user experience (UX). As a result, we’re highly influenced by fashion when things aren’t going smoothly, and we tend to ignore potential solutions outside of our area.

Content agency Contentini are blogging about the way they use analytics to look at websites and identify which of the various elements above might be worth focusing on. It’s a wonderful summary of the problems that afflict sites, and an equally wonderful prompt for jolting yourself out of the wrong ways of solving them.

The post is worth reading in full, and probably pinning to a wall. But here are the bullet points:

  • If you have a high bounce rate and people spend little time on your site, it might be an information architecture problem.
  • If people start things but don’t finish them on your site, it’s probably a UX problem.
  • If people aren’t sharing your content, it may be a content issue. (Image above. This part of their framework could do with fleshing out)
  • If you’re getting less than a third of your traffic from search engines, you need to look at SEO.

Solutions in the post itself. Anything you’d add to them?

November 24 2010

15:30

A/B testing for headlines: Now available for WordPress

Audience data is the new currency in journalism. I don’t just mean the traditional Costco buy-in-bulk kind — “our readers are 52 percent male, 46 percent over $75,000 household income, 14 percent under age 35,” and so on. I mean data that looks at how individual readers interact with individual pieces of content. And beyond that shift there’s also the move from observational data — watching what your audience does — to experimental data, testing various ways of presenting or structuring content to see what works and what doesn’t.

My desire for more experimental data is one reason why I’m very happy to point you to a new resource for sites built on WordPress (like this one): a new Headline Split Tester plugin, built by Brent Halliburton and Peter Bessman, two Baltimore developers.

Not sure if you want a straight, newsy headline or something with a little more pizzazz? Something keyword-dense and SEO-friendly or something more feature-y? This plugin lets you write two headlines for each post and have them presented at random to readers. The plugin records how often each version of the headline has been clicked and, once it has enough data, swaps full-time to the most effective one.

If you’re in the kind of operation that has regular debates over headline strategy, here’s a great way to test it. (Although note that this is measuring clicks on articles within your site — it doesn’t tell you anything about the SEO effectiveness of a headline. You’d have to wait for Google data for that.)

We have lots of debates over the appropriate role of audience metrics in journalism. But personally, I’d rather have those debates armed with as much data as possible. If you want your site to be filled with puns and plays on words instead of SEO-friendly nouns, fine — but it’s worth knowing how much of a traffic impact that decision has when you make it.

I’m happy to say we apparently played a small role in its creation: Halliburton writes that he was inspired by an old Lab post that described how The Huffington Post uses A/B split testing on some of its headlines:

Readers are randomly shown one of two headlines for the same story. After five minutes, which is enough time for such a high-traffic site, the version with the most clicks becomes the wood that everyone sees.

Give it a try — and if you’re a PHP coder, try to make it better, as patches are welcome. (Another, more ambitious A/B testing project for WordPress, ShrimpTest, is also in development and in preview release.)

Halliburton (who runs Cogmap and Deconstruct Media) and Bessman (who’s an engineer at marketing firm R2integrated) built the plugin in as 2010 a way as possible: at last weekend’s Baltimore Hackathon, where the plugin won a prize for best prototype. Have a good idea, bang out code in a weekend, share it with a potential audience of millions using the same platform: that’s the promise of open source and collaboration in a nutshell.

November 17 2010

14:54

Using Yahoo! Clues to target your headlines by demographic

Yahoo! Search Clues - Emma Watson hair

Tony Hirst points my attention (again) to Yahoo! Clues, a tool that, like Google’s Insights For Search, allows you to see what search terms are most popular. However, unlike Insights, Yahoo! Clues gives much deeper demographic information about who is searching for particular terms.

Tony’s interest is in how libraries might use it. I’m obviously interested in the publishing side – and search engine optimisation (SEO). And here’s where the tool is really interesting.

Until now SEO has generally taken a broad-brush approach. You use tools like Insights to get an idea – based on the subject of your journalism – of what terms people are using, related terms, and rising terms. But what if your publication is specifically aimed at women – or men? Or under-25s? Or over-40s? Or the wealthy?

With Yahoo! Clues, if the search term is popular enough you can drill down to those groups with a bit more accuracy (US-only at the moment, though). Taking “Emma Watson haircut”, for example, you can see that a girls’ magazine and one aimed at boys may take different SEO approaches based on what they find from Yahoo! Clues.

Apart from anything else, it demonstrates just how immature the discipline of web writing and SEO still is. As more and more user data becomes available, and is processed at faster speeds, we should see this area develop considerably in the next decade.

Yahoo! Search Clues - Emma Watson haircut - oops/katie leung

November 16 2010

19:30

Google News experiments with metatags for publishers to give “credit where credit is due”

One of the biggest challenges Google News faces is one that seems navel-gazingly philosophical, but is in fact completely practical: how to determine authorship. In the glut of information on the web, much of it is, if not completely duplicative, then at least derivative of a primary source. Google is trying to build a way to bake an article’s originality into its no-humans-used algorithm.

Today, it’s rolling out an experiment that hopes to tackle the “original authorship” problem: two new metatags, syndication-source and original-source, intended to build authorship attribution, via URLs, into the back end of news on the web. Though the tags will work in slightly different ways, Googlers Eric Weigle and Abe Epton note in a blog post, “for both the aim is to allow publishers to take credit for their work and give credit to other journalists.”

Metatags are just one of the many tools Google uses to determine which articles most deserve news consumers’ attention. They work, essentially, by including data about articles within webpages, data that help inform Google’s search algorithms. Google itself already relies on such tagging to help its main search engine read and contextualize the web. (Remember Rupert Murdoch’s so-far-unrealized threats to opt out of Google searches? He would have done it with a noindex tag.)
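A noindex directive, for reference, is a single line of standard robots metadata in a page’s head (a generic example, not specific to Google News):

<meta name="robots" content="noindex">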

The tags are simple lines of HTML:

<meta name="syndication-source" content="http://www.example.com/wire_story_1.html">

<meta name="original-source" content="http://www.example.com/scoop_article_2.html">

And they’ll work, Weigle and Epton explain, like this:

syndication-source indicates the preferred URL for a syndicated article. If two versions of an article are exactly the same, or only very slightly modified, we’re asking publishers to use syndication-source to point us to the one they would like Google News to use. For example, if Publisher X syndicates stories to Publisher Y, both should put the following metatag on those articles:
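<meta name="syndication-source" content="http://www.example.com/wire_story_1.html">

(That is the syndication-source example from above; in practice, the content value would be Publisher X’s own story URL.)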

original-source indicates the URL of the first article to report on a story. We encourage publishers to use this metatag to give credit to the source that broke the story. We recognize that this can sometimes be tough to determine. But the intent of this tag is to reward hard work and journalistic enterprise.

(This latter, original-source, is similar to Google’s canonical tag — but original-source will be specific to Google News rather than all of Google’s crawlers.)
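For comparison, the canonical hint is expressed as a link element rather than a metatag; the URL here is a placeholder:

<link rel="canonical" href="http://www.example.com/original_article.html">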

Google News is asking publishers to use the new tags under the broad logic that “credit where credit is due” will benefit everyone: users, publishers, and Google. A karma-via-code kind of thing. So, yep: Google News, in its latest attempt to work directly with news publishers, is trusting competing news organizations to credit each other. And it’s also, interestingly, relying on publishers to take an active role in developing its own news search algorithms. In some sense, this is an experiment in crowdsourcing — with news publishers being the crowd.

At the moment, there are no ready-made tools for publishers to use these tags in their webpages — although one presumes, if they get any traction at all, there’ll be plugins for many of the various content management systems in use at news organizations.

The tags, for any would-be Google Gamers out there, won’t affect articles’ ranking in Google News — at least not yet. (Sorry, folks!) What it will do, however, is provide Google with some valuable data — not just about how its new tags work, but also about how willing news publishers prove to be when it comes to the still-touchy process of credit-giving. That’s a question Google News has been trying to tackle for some time. “We think it is a promising method for detecting originality among a diverse set of news articles,” the tags’ explanation page notes, “but we won’t know for sure until we’ve seen a lot of data. By releasing this tag, we’re asking publishers to participate in an experiment that we hope will improve Google News and, ultimately, online journalism.”

October 27 2010

14:00

Metrics, impact, and business plans: Things to watch for as the Knight News Challenge enters a new cycle

In recent years, it’s been something of a parlor game in future-of-journalism circles to speculate about the $25 million Knight News Challenge: Who’s going to win this year? What are the judges looking for, exactly? And, whoa, how on earth did that finalist get passed up? (On that last question, see CoPress in 2009; e.g., read the comments on this post.)

The buzz and chatter are mostly just idle guesswork, and of course it’s all to be expected when serious money (think: $5 million for MIT, $1 million for EveryBlock) is on the line. (Indeed, there’s an extra $1 million on the table this year, thanks to Google’s donation to journalism innovation announced yesterday.)

So, that’s why this year, the fifth installment of the Knight News Challenge, already feels a little different. In years past, the Knight Foundation has approached the News Challenge with a “hey, we’re not the experts — you tell us what’s innovative” kind of attitude, purposefully leaving the door open to just about any submission, assuming that it met certain basic requirements of geographic community focus, open-source software, and so on. With the exception of some tweaking along the way, the general focus of the News Challenge remained the same: to stimulate innovation in the name of making communities better informed. Simple enough.

But this year, even though the KNC’s general pitch remains the same, applicants will make their submissions in one of four categories: Mobile, Authenticity, Sustainability, or Community. Only the Community category requires a place-based geographical focus, which marks a significant break from previous cycles where all projects had to be tested in a local community. Overall, the categorization scheme lends some direction — even a certain narrowing — to the contest, and it suggests that Knight has learned a few things over the past four years that it’s going to apply in this final go-round, to get a more focused pool of contenders.

And that’s where this post comes in, on the question of lessons learned. At the risk of contributing more baseless speculation to this parlor game, I’d like to share some insights I gained during the past year as I examined the News Challenge — and the Knight Foundation more generally — for my doctoral dissertation at the University of Texas. (I’m now a journalism professor at the University of Minnesota.)

For starters, you can read the full text of my dissertation (“Journalism Innovation and the Ethic of Participation: A Case Study of the Knight Foundation and its News Challenge“) by going here, or by reading the embedded Scribd file below. If you’re looking for the highlights, skip to page 182 and read the last chapter (Participation and the Professions). Quick tip: This is generally a good way to go when trying to interpret academic articles — look for that “discussion and conclusion” section toward the end.

I described some of my key findings in an earlier Lab post. But with regard to the changes in the KNC for 2011, here are several observations from my time studying the Knight Foundation that might fill in some of the context:

Knight cares intensely about evaluation

This is increasingly true of all nonprofit foundations, really — not just the Knight Foundation. But it was striking to see the extent to which the foundation is working to assess the impact and effectiveness of its funding efforts, through an ongoing “deep review” of its mission and goals. A major part of this review: an examination of the Knight News Challenge after its first three cycles (2007-09). This included a massive content analysis of nearly all proposal documents — resulting in a data set that I analyzed as part of my project (see Chapter 6 of my dissertation) — and interviews, conducted by outside consultants, with many KNC grantees. At one level, there’s the basic assessment of seeing if grantees’ outcomes matched their goals. At another, there is the big question of reach and influence. For nonprofits funding myriad online sites, as Knight does, at least part of that means reviewing web metrics: traffic, unique visitors, etc. All foundations want metrics to justify their investment — and now more than ever.

So, what does this emphasis on evaluation mean for News Challenge applicants this year? Well, it suggests that in a world where user behaviors are easier to track and analyze than ever before, and thus funders of all stripes (for-profit and nonprofit alike) are hungry for good numbers, having a plan for web metrics — for reaching quantifiable and identifiable targets — is probably going to be more important than in previous cycles.

Is this the News Challenge on SEO steroids? Not exactly, but you get the idea. And this gets to the second point, which is…

Is citizen journalism out? Are business models (and the like) in?

There was an interesting quote in recent coverage of KNC changes that got some attention. It was from Jennifer 8. Lee, a Knight consultant and contest reviewer:

We’re not totally into the citizen journalism thing anymore. It has been given its chance to do its thing and kind of didn’t do its thing that well.

Now, Lee was quick to clarify that she was speaking only for herself, and that the KNC is open to citizen media approaches — just not the kind of generic and repetitive pitches that have populated the pool of applicants recently (think: Flip cams for urban youth):

The contest welcomes content or citizen journalism projects. Innovative content or community reporting models can and do get funded…Since innovation is a core value of the contest, traditional content and citizen journalism projects lacking in innovation were generally not looked upon favorably by contest reviewers.

But, nonetheless, this statement is telling because it gets at a key focus of my dissertation: how Knight has dealt with participation in journalism. In my study of the first three years of the News Challenge, I found that the foundation and its KNC winners championed citizen participation in the news process as something that should happen, not merely something that could happen because of new technologies. Participation was portrayed as an ethic of good journalism in the digital age, a foundational piece of journalism innovation.

So, does that square with the notion that we’re not so into citizen journalism anymore? Perhaps there’s a better way to think about this: Knight has already funded lots of citizen media projects, and the evidence — based on my interviews with KNC winners and overall analysis — suggests that many of these sites struggled to build and maintain a base of users. On the one hand, that’s perfectly understandable: Some of these projects were meant to be short-term in duration; Knight knew many of them would fail, because that’s the nature of innovation; and, hey, in the attention economy, it’s tough for any content provider these days, right? Yet, on the other hand, this struggle to get attention — from citizen contributors and audiences alike — was a formidable challenge for many of the early KNC projects, and, well, it just so happened that many of those early projects were citizen media sites. As a result, citizen journalism comes off looking like a failure, even if the motivation behind it was well intentioned and still well regarded in Knight circles.

The lesson here: Going forward, with this ramped-up emphasis on evaluation and impact, and with apparent concerns about citizen journalism’s sustainability, it would seem that Knight wants to see applicants with a clearer path to success, especially in web metrics. Or, perhaps there’s another way to read this: In a media ecosystem awash in sites pushing content — read our blogs! watch our videos! — with less thought about how that content gets subsidized on a regular basis, Knight wants a better business plan. It wants a sustainable model. After all, there’s a reason it hired a director of business consulting.

David Sasaki, of the 2007 KNC winner Rising Voices, might have captured this problem best in this prescient blog post from 2008:

The Knight Foundation is single-handedly making citizen media both more serious and more respected by giving financial support to some of the field’s most innovative thinkers. But is this a sustainable model for the transformation of media? What happens when the News Challenge’s five-year funding period concludes? All of the News Challenge grantee projects are impressive, innovative, and important, but not a single one is turning a profit, nor do they seem poised to any time soon.

What happens to the “news” in News Challenge?

This is a truly intriguing and as-yet-unanswered question going into this final cycle. The five-year funding period Sasaki described is coming to an end. What comes next?

On the one hand, the News Challenge has proved a successful template for Knight’s growing network of prize-philanthropy challenge contests, and it represents the foundation’s most visible link to its historic roots as a “journalism foundation” with close ties to the industry and its concerns. But, as I pointed out previously, Knight is undergoing a shift in emphasis from “news” to “information” as a way of broadening the boundaries of journalism to accomplish innovation with outside help from other fields and philanthropic funders. The most obvious manifestation of this is the Knight Community Information Challenge, which involves partnering with place-based foundations to meet the “information needs” of local communities.

What becomes, then, of the News Challenge? Is there a renewal of some kind — and if so, does it keep the “journalism” tag? Or does the Community Information Challenge suffice in this space? Only time will tell, but the important thing here is to recognize that Knight has an increasingly nuanced view of journalism — one that sidesteps the “baggage” of professional exclusivity and proactively seeks ideas from other fields (say, the tech sector).

David Cohn, whose Spot.Us is one of the best-known KNC success stories, put it recently, in describing startups like Kommons:

As I’ve said before, we may not call it ‘journalism’ in the future, but if it still meets the news and information needs of a community, more power to it.

That, right there, nicely summarizes the feeling of the Knight Foundation: that it cares much more about the ends (i.e., informed communities) than the means (i.e., journalists and traditional news). How that translates into future challenges (or not) is left to be seen.
