
August 15 2012

11:12

News sites should be Islands in the stream

Islands in the stream
That is what we are
No one in-between
How can we be wrong

Dolly Parton! Well, actually the Bee Gees (well, if we are being really pedantic, Hemingway). What the hell is that about, Andy?

Well, Mary Hamilton (a must-follow, @newsmary on Twitter) highlighted a post by entrepreneur, writer and geek Anil Dash imploring us to stop publishing web pages and start publishing streams:

Start moving your content management system towards a future where it outputs content to simple APIs, which are consumed by stream-based apps that are either HTML5 in the browser and/or native clients on mobile devices. Insert your advertising into those streams using the same formats and considerations that you use for your own content. Trust your readers to know how to scroll down and skim across a simple stream, since that’s what they’re already doing all day on the web. Give them the chance to customize those streams to include (or exclude!) just the content they want.

I found it a little bit of a mish-mash really. In principle, lots to agree with, but the practice was less clear. It makes sense if you’re into developing the ‘native clients’ but harder to quantify if you’re a content creator.
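For what it’s worth, the plumbing Dash describes is easy enough to sketch. Here’s a minimal, hypothetical example (the endpoint URL and field names are invented, not anyone’s real API) of a ‘stream app’ that pulls items from a CMS’s JSON feed and lets the reader include or exclude topics:

    import json
    from urllib.request import urlopen

    STREAM_API = "https://example-news.org/api/stream.json"  # hypothetical endpoint

    def fetch_stream(url=STREAM_API):
        # Assumed shape: a JSON list of {"title", "url", "published", "tags"} objects.
        with urlopen(url) as response:
            return json.load(response)

    def personal_stream(items, include=None, exclude=None):
        # Newest first, filtered to the topics the reader asked for (or against).
        for item in sorted(items, key=lambda i: i["published"], reverse=True):
            tags = set(item.get("tags", []))
            if include and not tags & set(include):
                continue
            if exclude and tags & set(exclude):
                continue
            yield item

    if __name__ == "__main__":
        for item in personal_stream(fetch_stream(), include=["politics"], exclude=["sport"]):
            print(item["published"], item["title"], item["url"])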

More interesting was the Twitter discussion it generated between Mary and her Guardian colleague Jonathan Haynes (the equally essential @jonathanhaynes), which I hitched my wagon to. Haynes didn’t agree with the premise of the post and that generated an interesting discussion.

I’ve created a Storify below, but it got me thinking about some general points which are a little ‘devil’s advocate’:

  • What is this stream anyway? – Is it the capacity to filter, or is it the depth and breadth of content you have to filter? I would say it’s the latter. Facebook and Twitter are streams because of the sheer weight of numbers and diversity of users.
  • Why be the stream when you can be part of it – Part of what Anil posted about was making stuff available to use in streams. I can’t disagree with that, but it strays into the idea of feeding the content ecosystem that, in blunt terms, is often played as parasitic. For all the advocacy of allowing user control, the one thing news orgs are still loath to do is move people outside the site. Is looking at new ways to recreate the stream experience within a site simply a way of admitting that you aren’t really part of the stream?
  • Are you confusing your consumption habits with your users’ – Whilst the stream might be useful for information pros like journos, is it really what consumers want for their news? The stream suits the rolling nature of journalism. Not in the broadcast sense, just in the sense of ‘what’s new’. Do your audience consume like you do?
  • Are you removing the value proposition of a journalist? – By putting the control of the stream in the hands of the user, are you doing yourself out of a job? I know what the reply to that will be: “No, because the content of the stream will be done by us and we will curate the stream”. Well, in that sense it’s not a stream, is it? It’s a list of what you already do. Where’s that serendipity, or the compulsion to give people what they need (to live, thrive and survive) rather than what they want?
  • Confusing presentation with creation – That last point suggests a broader one. You can’t simply repackage content to ride the wave when your core business is different. It’s like calling a column a blog – we hate that, don’t we? So why call a slightly different way of presenting the chronology of content a stream?

That’s before we have even got to the resource issue. News orgs can’t handle the social media flow as it is.

So, Islands in the stream? Well, thinking about the points above, especially the first one, what’s wrong with being something different? What’s wrong with being a page in a world of updates? What’s wrong with being a place where people can step out of the stream and stay a while to dry off and get a bit of orientation?

[View the story "What should news sites be - pages or streams" on Storify]

What should news sites be – pages or streams

Entrepreneur, writer and geek Anil Dash has posted a request that people stop publishing pages and start creating streams.

Storified by Andy Dickinson · Wed, Aug 15 2012 04:17:12

Stop Publishing Web Pages: Most users on the web spend most of their time in apps. The most popular of those apps, like Facebook, Twitter, Gmail, Tumblr and others,…
Start moving your content management system towards a future where it outputs content to simple APIs, which are consumed by stream-based apps that are either HTML5 in the browser and/or native clients on mobile devices. Insert your advertising into those streams using the same formats and considerations that you use for your own content. Trust your readers to know how to scroll down and skim across a simple stream, since that’s what they’re already doing all day on the web. Give them the chance to customize those streams to include (or exclude!) just the content they want.
An interesting post which generated some interesting discussion when Guardian journo Mary Hamilton posted it to Twitter.
@newsmary I *hate* that piece. Am I the only person left who likes the web, and webpages, and tolerates apps whilst sincerely hating them? – Greg Callus
@Greg_Callus No, I don’t think you are. But I do think there’s room for other presentations as well as single static URL. – Mary Hamilton
@newsmary There is, I just hate the Appify movement & ‘streams’. And there’s a reason Guardian Network Front isn’t RSS feed of our content. – Greg Callus
@newsmary Where’s the evidence readers ‘like’ streams & apps? Rather than utility sacrificed for convenience b/c that’s what mobile could do – Greg Callus
@Greg_Callus Where’s the evidence they don’t? Don’t think people are using Facebook/Tumblr etc while disliking the approach that much. – Mary Hamilton
@newsmary Drop/plateau in Facebook numbers since move from Profile to Timeline? Not universal but thnk his claim they ‘like streams’ not met – Greg Callus
@Greg_Callus But significant rise since the introduction of the news feed, which is a stream. – Mary Hamilton
@newsmary Touche! Thing is I love Twitter as a stream. Where chronological key, it works (like comments). Where content needs hierarchy, not – Greg Callus
@Greg_Callus Yeah, there are def some big issues with streams wrt hierarchy – but also with pages too. It’s not a solved problem. – Mary Hamilton
It wasn’t the only chat. Mary’s tweet had already attracted the attention of her Guardian colleague Jonathan Haynes, who took issue with the basic premise.
@newsmary no! Much more important is: Stop thinking you’re the medium when you’re the content provider! – Jonathan Haynes
@JonathanHaynes Different issues, surely? You can be a content provider with a stream. – Mary Hamilton
@newsmary what’s a stream Mary, what’s a stream? it’s a load of content – Jonathan Haynes
@JonathanHaynes Compared to a flat page, it’s a different way of organising that content. That’s not a difficult distinction… – Mary Hamilton
@newsmary it’s the same content! *head desk* – Jonathan Haynes
@JonathanHaynes And the point of the piece I linked is that news orgs should present it differently. Struggling to see your point. – Mary Hamilton
@JonathanHaynes Compared to a flat page, it’s a different way of organising that content. That’s not a difficult distinction… – Mary Hamilton
@newsmary present it how? it’s presented in every way already – Jonathan Haynes
@alexhern @newsmary *head desk* – Jonathan Haynes
I wondered whether, given the content-hungry nature of the stream, media orgs had the resources or know-how to take Dash’s advice.
@newsmary @jonathanhaynes also the issue here that stream implies a constant flow. A mechanism of displaying constantly changing content. – Andy Dickinson
@newsmary @jonathanhaynes not sure that most orgs can promise that without USB and sm. something most have no talent or resource for. – Andy Dickinson
@digidickinson @newsmary indeed – Jonathan Haynes
Mary didn’t think that was the issue. It was more about what you did with what you had and how people used it.
@digidickinson @JonathanHaynes Not certain that’s true – using a single blog as the example. More talking about customisation & user flow? – Mary Hamilton
@newsmary @digidickinson how does a blog show importance? it’s just a stream. – Jonathan Haynes
@JonathanHaynes Sticky posts, design highlights. Not a new problem. – Mary Hamilton
But that still didn’t answer the core question for me – where does the content needed to create a stream come from?
@JonathanHaynes @newsmary that’s about relevance – is timeliness relevance or curation. Can see a case for chronology but still needs ‘stuff’ – Andy Dickinson
@JonathanHaynes @newsmary stuff that is new to appear ‘chronologically’ – Andy Dickinson
Jonathan was still struggling with the idea of the stream.
@newsmary @digidickinson then how is that a stream? – Jonathan Haynes
@JonathanHaynes @digidickinson How is a blog a stream if it has sticky posts? *headdesk* – Mary Hamilton
I could kind of see Jonathan’s point.
@newsmary @jonathanhaynes slightly different issue there. One to watch as you are talking about subverting (damming it with sticky posts) – Andy Dickinson
@newsmary @jonathanhaynes that changes the consistency of presentation for publishers sake, without the users permission. Breaks the premise – Andy Dickinson
@newsmary @jonathanhaynes like twitter being able to keep one tweet at top of your feed when it suited – Andy Dickinson
But Dan Bentley pointed out that there are a number of sites that seem to do ‘the stream’ well. 
@digidickinson @newsmary @jonathanhaynes you can stream content and still tell people what’s important http://itv.co/NDpTxd – Daniel Bentley
Latest News – ITV News: Tia accused faces Old Bailey No application for Hazell bail by Jon Clements – Crime Correspondent Lord Carlile QC (representing Stuart Ha…
@DJBentley @digidickinson @JonathanHaynes Good example, that. Cheers. – Mary Hamilton
But sites like ITV rely heavily on UGC and that’s a big issue. It still comes down to where you get the content from and if the org is resourced to do that.
@DJBentley @newsmary @jonathanhaynes true but the itv example better illustrates the point I made about where the content comes from – Andy Dickinson
@DJBentley @newsmary @jonathanhaynes it’s curating content but it’s still content and it has to come from somewhere at regular intervals. – Andy Dickinson
@DJBentley @newsmary @jonathanhaynes that’s not an impossibility but it is a core challenge for orgs – always has been online esp. with sm – Andy Dickinson
@JonathanHaynes @djbentley @newsmary think that highlights core issue here – presentation separate to mechanism to create content to present – Andy Dickinson
Another example 
@DJBentley @digidickinson @newsmary @jonathanhaynes Breaking News does similar with their verticals (sorry to butt in) http://breakingnews.com/ – Tom McArthur
Breaking news, latest news, and current events – breakingnews.com: The latest breaking news around the world from hundreds of sources, all in one place.
@TomMcArthur I like @breakingnews style for streams a lot – suits it perfectly. – Mary Hamilton
But Jonathan is not a fan of the ITV approach.
@digidickinson @DJBentley @newsmary ITV site is a car crash though. and how a minority want news presented isn’t necessarily representative – Jonathan Haynes
And has an example of his own to highlight that the page is not quite dead…
@digidickinson @TomMcArthur @newsmary @DJBentley most successful UK newspaper website is Mail Online. sticks rigidly to articles. – Jonathan Haynes
Home | Mail Online: MailOnline – all the latest news, sport, showbiz, science and health stories from around the world from the Daily Mail and Mail on Sunday…
@JonathanHaynes @digidickinson @TomMcArthur @newsmary is the Mail Online a good news source? – Daniel Bentley
Another example pops up later on as an aside to the conversations.
The Reddit Edit
@newsmary @TomMcArthur The news site of the future looks a lot more like that or http://bit.ly/NDsuHw than 240 hyperlinks and 60 pictures – Daniel Bentley
@DJBentley @TomMcArthur Yes, I agree. – Mary Hamilton
And Mary takes the chance to voice her view of the term ‘newspaper site’.
@JonathanHaynes @digidickinson @DJBentley “Newspaper website” is an oxymoron that cannot die quickly enough for my liking. – Mary Hamilton
@newsmary @jonathanhaynes @djbentley agree with sentiment but sadly it is still a very apt description of the general process and mentality – Andy Dickinson
@newsmary @digidickinson @DJBentley touché. sorry, news site. – Jonathan Haynes
In the continuing conversations Jonathan is concerned that this might be a bit of the thrill of the new…
@DJBentley @digidickinson @TomMcArthur @newsmary consumption and creation are different. and early adopters are not the norm. – Jonathan Haynes
@JonathanHaynes @DJBentley @digidickinson Thing is, stream consumption isn’t a minority or early adopter thing any more. – Mary Hamilton
@newsmary @jonathanhaynes @djbentley true but danger is going for mode of presentation without considering the mechanics. – Andy Dickinson
@newsmary @jonathanhaynes @djbentley number of individuals needed to make a stream vs number needed to present it. – Andy Dickinson
So Jonathan asks about a concrete example.
@newsmary @digidickinson @DJBentley so how would that look for "the Guardian" streams works as multiple source and crows editing – Jonathan Haynes
@newsmary @digidickinson @DJBentley crowd, not crows. what I get from Twitter I want, but I also want websites to show me hierarchy. – Jonathan Haynes
@newsmary @digidickinson @DJBentley and content is discrete elements. should be available in all forms but need to be ‘page’ to do so – Jonathan Haynes
@JonathanHaynes @digidickinson @DJBentley Let me subscribe to tags; filter my stream on my own interest & curated importance? – Mary Hamilton
@newsmary @DJBentley @digidickinson you want to subscribe to tags?! might as well have an RSS feed! ;) – Jonathan Haynes
Dan highlighted a problem which, I guess, he would see the stream as helping to solve.
@JonathanHaynes @newsmary @digidickinson I don’t feel current news site frontpages do a particularly good job at hierarchy. Too much stuff. – Daniel Bentley
@JonathanHaynes @newsmary @digidickinson Google News or the new digg http://bit.ly/NDuNuc do a better job and that’s mostly algorithm. – Daniel Bentley
Google News: As the courtroom emptied after Barry Bonds’ obstruction-of-justice conviction Wednesday afternoon, the slugger stood off to one side, h…
Digg: The best news, videos and pictures on the web as voted on by the Digg community. Breaking news on Technology, Politics, Entertainment, an…
@DJBentley @newsmary @digidickinson too much stuff? and yet you want an endless stream?? – Jonathan Haynes
But for Dan the stream has a purpose.
@JonathanHaynes @newsmary @digidickinson the stream tells me what’s new, the traditional frontpage doesn’t know what it’s doing. – Daniel Bentley
@JonathanHaynes @newsmary @digidickinson Am I what’s new? Am I what’s important? Am I everything that has been written in the last 24hrs? – Daniel Bentley
@DJBentley @newsmary @digidickinson no, you’re the carefully edited combination of all of the below! – Jonathan Haynes
@JonathanHaynes @newsmary @digidickinson carefully edited? How is 240 links on Guardian and 797 (!) on Mail Online carefully edited? – Daniel Bentley
@DJBentley @newsmary @digidickinson *sigh* – Jonathan Haynes
Frustrating as it may be, it’s a real problem, which Mary sums up with:
@DJBentley @JonathanHaynes @digidickinson Part of problem with hierarchy on fronts is trying to be all things to all visitors. – Mary Hamilton
But, to be honest, I can’t see how the stream would be any better other than to put the responsibility back on to the user. But I’ve more to add in a blog post….
News sites should be Islands in the stream | andydickinson.net

 


February 28 2012

14:00

How Social Media, E-Books, Self-Publishing Change Writers Conferences

At first, you came to the San Francisco Writers Conference to learn the craft of writing, to hear famous writers describe how they became famous, to learn the secrets of how to create a winning book proposal, to become enlightened by publishers about what they want and, most of all, to pitch literary agents, those elusive creatures who seem always to be heading the other direction.

Today, it's a different story. Today's conference is about all the traditional basics, but also about topics from blogging and tweeting to e-books and self-publishing. I asked four longtime participants in the 2012 San Francisco Writers Conference earlier this month to describe how this and other writers conferences have morphed to include technical content relevant to today's writers.

You can listen to their takes below.

I started with San Francisco Writers Conference co-organizer Laurie McLean, who told me that the core teachings are still there, but two entirely new tracks have been added to handle tech topics relevant to writers today, and the previously unmentionable option, self-publishing.

Laurie McLean

For more than 20 years Laurie ran a public relations agency in California's Silicon Valley. Then she became an agent at Larsen Pomada Literary Agents representing adult genre fiction and children's middle grade and young adult books. As Agent Savant, she works with authors to create their author brand, then develop a digital marketing plan to help them promote that brand online via social media, blogs, websites and more. Laurie is dean of the new San Francisco Writers University and on the management team of the San Francisco Writers Conference. In 2012, Laurie started two e-publishing companies: Joyride Books (for out-of-print vintage romance novels) and Ambush Books (for out-of-print children's books).

Listen to McLean on adding two new tracks to the conference offerings here.

Listen to McLean on still sticking with the basics here.

Kevin Smokler

In 2007, Kevin Smokler founded, with Chris Anderson (editor in chief of Wired Magazine), BookTour.com, the world's largest online directory of author and literary events. Kevin now serves as the company's CEO, regularly speaking at industry conferences and book festivals throughout North America on the future of publishing, books, reading and legacy media in the 21st century. His regular topics include print and digital publishing, legacy media, social media and the web for writers, and business skills for artists and creatives. In April of 2008, Amazon purchased a minority stake in BookTour.com.

From Smokler's vantage, despite all the changes, there are some things that are still, and always will be, basic to publishing -- namely, the need for a quality book and connecting that book to readers.

Listen to Kevin Smokler talk about that here.

Patrick von Wiegandt

Patrick von Wiegandt is a musician and sound engineer in charge of making each session at the San Francisco Writers Conference available in audio formats for sale immediately at the conference and online after the event.

He's seen big changes "backstage," as in the transition from tape to CD to MP3, but because he also hears all the sessions, he has some interesting insights about how the content of the conference has changed since the Internet came to be important to writers.

Listen here to Patrick von Wiegandt talk about the changes he's seen.

Joel Friedlander


Joel Friedlander is a self-published author and a book designer who blogs about book design, self-publishing and the indie publishing life at TheBookDesigner.com. He's also the proprietor of Marin Bookworks, where he helps publishers and authors who decide to publish get to market on time and on budget with books that are both properly constructed and beautiful to read.

One of the biggest changes Friedlander sees is the massive shift in how books are being publicized (authors now being asked to do promotions themselves) and how writers conferences are adapting to reflect that change.

Hear Friedlander talk about that change and others he's seeing here.

Carla King is an author, a publishing consultant, and founder of the Self-Publishing Boot Camp program providing books, lectures and workshops for prospective self-publishers. She has self-published non-fiction travel and how-to books since 1994 and has worked in multimedia since 1996. Her series of dispatches from motorcycle misadventures around the world are available as print books, e-books and as diaries on her website. The newest version of her e-book, The Self-Publishing Boot Camp Guide for Authors, was released in August 2011 and is available on Smashwords, Amazon Kindle, and for the B&N Nook.


July 28 2011

16:24

The post-post-CMS CMS: Loosely coupled monoliths & read-only APIs

Creative commons photo of an 'old school' newspaper layout by limonada on Flickr

As I sat down with my colleagues on Tuesday for a little hack day on our favourite open-source content management system, we had a familiar conversation — one that is probably familiar to all people who hack on CMSs — which is, What is the future of content management?

It’s a conversation that has been unfolding and evolving for many, many years, but seems to be gaining a lot of steam again in 2011.

The essence of some of the more recent conversation is nicely summarized over on Stijn’s blog.

One of the questions presented is: will tomorrow’s CMS be a monolithic app, or a ‘confederation’ of purpose-built micro-applications — for example, like the app that Talking Points Memo demonstrates for managing their front page.

In the blog post on the TPM media lab about the ‘Twilight of the CMS,’ they describe their solution to the problem of the monolithic CMS — “a simple and flexible API that digests manifold requests from the different applications.”

As I ponder these ideas in the context of my work for online-only publishers like TheTyee.ca, I struggle with a few competing notions…

Read-only API, or Read/Write API

In the case of TheTyee.ca, the monolithic CMS writes data to a data store (Elastic Search) that directly provides the (soon to actually be public) public API. From there, various applications can request the data from the API and receive a structured JSON representation of that data back as a response. Basically, once these clients have the data, they can do what they want.

This is great. It’s a read-only scenario. The CMS is still the ‘authority’ on the state and structure of the data (along with a lot of other information), but there is an identical copy of that data in the store that provides the API.
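As a rough sketch of that read-only arrangement, with names and a document shape that are illustrative rather than The Tyee’s actual schema: the CMS pushes a copy of each published story into the store, and the API layer only ever reads from it.

    import json

    STORE = {}  # stands in for the search/data store (Elastic Search, in The Tyee's case)

    def cms_publish(story):
        # Called by the CMS on publish; the CMS stays the authority, the store holds a copy.
        STORE[story["id"]] = dict(story)

    def api_get(story_id):
        # The public API: read-only, returns a JSON representation or None if unknown.
        story = STORE.get(story_id)
        return json.dumps(story) if story else None

    cms_publish({"id": "42", "title": "Hack day notes", "teaser": "What is the future of the CMS?"})
    print(api_get("42"))  # clients can do whatever they like with this JSON
    print(api_get("99"))  # but nothing a client does can write back into the store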

Now, let’s think about that TPM front page management app: it clearly needs read/write access to their API, because it can change not just layout, but editorial content like the story title, deck, teaser, and so on.

So, if the API is read/write, the questions I have are:

  • The schema for the documents (the stories, etc.) must be validated somewhere, right? So… does that logic live in each purpose-built app, or as a layer on top of the data store? And does that then violate a bit of the ‘Don’t Repeat Yourself’ design pattern? (One possible answer is sketched after this list.)

  • Do these content-centric writes to the API make their way back to the CMS or editorial workflow system? And, if they don’t, does that not introduce some confusion about mis-matched titles, decks, teasers, and so on? For example, say I change the title of a story on the front page, but now I see a typo in the body of the story and want to fix that, so I go into the other CMS and search for … whoops! … what was the title of that story again?

  • How does this new ‘front page app’ or the read/write API handle typically CMS-y things like competing or conflicting write requests? Or version control? Or audit trails of who made which edits? If one, or the other, or both, actually handle these concerns, is this not a duplication of logic that’s already in the CMS?
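On the first of those questions, one way to avoid repeating yourself is to hang the schema check off the data store itself, so the CMS and any purpose-built app pass through the same gate. A minimal sketch, with an invented story schema:

    REQUIRED_FIELDS = {"id": str, "title": str, "deck": str, "teaser": str}  # invented schema

    def validate_story(doc):
        # Raise if the document doesn't match the schema; this logic lives in one place only.
        for field, expected_type in REQUIRED_FIELDS.items():
            if not isinstance(doc.get(field), expected_type):
                raise ValueError(f"'{field}' missing or not a {expected_type.__name__}")

    def store_write(store, doc):
        # Every writer (the CMS, the front-page app) funnels through the same check.
        validate_story(doc)
        store[doc["id"]] = dict(doc)

    store = {}
    store_write(store, {"id": "1", "title": "Front page test", "deck": "A deck", "teaser": "A teaser"})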

Perhaps I’m not thinking about this right, but my gut is saying that the idea of a read/write API — a scenario where you have both a CMS (Movable Type in TPM’s case) and a ‘front page management’ app — starts to get a bit tricky when you think about all the roles that the CMS plays in the day-to-day of a site like TPM.

It gets even more tricky when you think about all the delivery mediums that have their own ‘front page’ — tablet experiences, scaled down mobile experiences, feeds, e-newsletters, and so on.

Presentation management or editorial management

The other thing that is immediately striking about the TPM demo is the bizarrely print-centric feel to the experience — I’m immediately transported back to my (very brief) days working at The Varsity where the editors and designers would literally paste up the newspaper’s pages on big boards.

For a publication like the TPM — an entirely online ‘paper’ — it seems like an odd, slightly ‘retro,’ approach in an age that is defined by content that defies containers. One must ask: where does it end? Should there be a purpose-built app for each section’s front page, e.g., Sports, Arts, Life, etc.? For each regional section? For each-and-every article?

Isn’t this just vanity at some level? Endless bit-twiddling to make things look ‘just right’? Kinda’ like those mornings when I just can’t decide whether to wear a black shirt or a white shirt and stand in front of the mirror trying them on for what seems like eternity?

So, coming back to my point: in a time when many believe (like a religion!) that content and presentation should be separated — not as an exercise, but because that content is delivered to literally hundreds of different end-user experiences (phones, tablets, readers, etc.) — do we really want to be building tools that focus on the presentation for just one of those experiences? If so, where does it end?

For the most part, the modern-day CMS has been designed to alleviate these myriad challenges by providing a way for non-technical people to input structured data, and the tools for developers to output that structured data in a variety of ways, formats, and mediums.

Watching the TPM video gives me some ideas about how to improve the experience for an editor to quickly edit headlines, decks, teasers, photos of the morning’s stories — and even to indicate their relative priority in terms of newsworthiness — but I would want to stop there, at the editorial, and let the presentation layer be handled according to the medium, device, or experience the content is being delivered to.

Loosely coupled monoliths & read-only APIs

Many moons ago, I proposed that Weinberger’s Small Pieces Loosely Joined idea held true for content management also. The proposal was simple: instead of investing in one monolithic CMS — a CMS that did everything from manage content to advertising delivery to comments to search to who-knows-what (a trend in CMS projects at the time) — an organization could choose the current ‘best in class’ solution for each need and connect them together through loose coupling. Then, if a better solution came out for, say, comments, the old system could be replaced with the newer system without having to re-build the whole enchilada.

(Of course, the flip side often is that loose coupling can feel like bubble gum and string when you have to work with it every day.)

So, while my own experience is that loose coupling is great, and that purpose-specific applications are usually better than apps that try to do everything, I would personally want to draw the line somewhere. For me, that line is between distinct ‘areas of responsibility,’ like editorial, advertising, design, community, search, and so on.

In this scenario, each area would have the authority over its own data, and the logic for how that data is structured and validated, and so on. If that data was written to a central data store that provided an API — something simple, flexible, and RESTful — the other apps in a ‘confederation’ could read from it, choose what to do with it, how to present it, and so on, but the final ‘say’ on that data would be from the app that is responsible for creating it.

For me, this is a sensible way to allow these apps to work in concert without having the logic about the data living in multiple places, i.e., the API, and the clients that can write to it (which makes sense if you’re Twitter with hundreds of external clients, but not if you’re one organization building exclusively internal client apps).
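A toy version of that ‘confederation’, entirely illustrative, might give each area of responsibility sole write access to its own slice of the central store, while any app can read:

    AUTHORITY = {"story": "editorial-app", "ad": "advertising-app", "comment": "community-app"}

    central_store = {}

    def write(doc_type, doc_id, doc, written_by):
        # Only the app responsible for a document type has the final 'say' on it.
        if AUTHORITY.get(doc_type) != written_by:
            raise PermissionError(f"{written_by} has no authority over '{doc_type}' documents")
        central_store[(doc_type, doc_id)] = dict(doc)

    def read(doc_type, doc_id):
        # Anything in the confederation can read and present the data as it sees fit.
        return central_store.get((doc_type, doc_id))

    write("story", "1", {"title": "Loose coupling"}, written_by="editorial-app")
    print(read("story", "1"))
    # write("story", "1", {"title": "X"}, written_by="front-page-app")  # would raise PermissionError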

Would love to hear otherwise, or experiences of how others are handling this or thinking about the challenge.

July 14 2011

14:40

Dear Publishers: Why ponder 'digital editions' when you could be building digital experiences?

If you’ve ever sat in a room where I’ve given a presentation on digital publishing, or run into me at some kind of publishing-related event, you will already know that I am no fan of so-called ‘digital editions’. In my experience, they are often expensive and questionable investments that rarely lead to any significant revenue (unless you’re Playboy).

As each year passes, I think to myself “finally, it must be the end of digital editions.” Alas, that has not proved to be the case.

There is such a burst of new technologies available today for creating divine digital reading experiences that I am just floored that certain well-known digital edition vendors are still (very successfully) peddling crap.

I know they are still successful, because I continue to be asked by publishers “Do you think we should invest in a digital edition? If so, who should we work with? Or what tool should we use?”

Publishers are in an obvious predicament. Readers are asking for digital products as more of their lives move to new devices. The advantages of having a digital product to offer those readers are clear:

  • Keep the subscriber instead of losing them
  • Deliver a product that (hopefully) costs less to produce and deliver, while still charging a reasonable subscription price
  • Provide a ‘free trial issue’ with very low fulfillment costs
  • Make international subscriptions more affordable and easier to fulfill
  • Access to back issues
  • Last but not least, keeping the cost of renewals low because “digital in, digital renewal”

The challenge is that the range of devices that readers are using is multiplying and the digital edition vendors have not caught up. Not only have they not caught up on the devices, but — more importantly — they are not even close to catching up on the reading experience that people expect today.

Anyway, this is turning into more of a rant than a useful blog post, so I’ll try to get it back on track here…

If you’re a publisher and you’re thinking about investing in a digital edition, start with these questions:

  1. Digital edition vs. adaptable reading experience: Do you want to provide readers with an experience that is device-appropriate, or one that is simply a replica of your print layout? (The reasoning “We can re-use photos licensed for print in a new medium if we don’t change the layout” should not drive your answer to this question.) The term ‘mobile’ applies to just about everything these days — laptops (increasingly smaller), tablets (various sizes), and smart phones (various screen resolutions) — and it’s unlikely that a print layout is going to be an enjoyable reading experience across them all.

  2. InDesign-centric vs. Web-centric: Do you want to work from the Web version of your stories — the version that includes links to other sites, links to related articles, and so on — or the version that comes out of the print workflow? Remember, if you have a Web site there’s a good chance that you’ve already invested in getting your print-centric content into a digital content management system — thus, you’ve already done the work of the print-to-digital conversion for most devices.

  3. “Somebody do it for us” vs. Do-it-yourself: Finally, do you want to have the control to present your publication the way you want, with the features that your readers want, and own all the data? Or do you want someone to just “get it done” and “keep it working?” Remember, it’s pretty likely that you already have a circulation plan that can be adapted to these digital subscribers, and it’s likely that you’re already taking payments online and managing access to your Web site. Integrating those processes and systems into your digital products is not as hard as you think.

There have been some incredible advances over the last year in the capabilities of the modern Web browser. It is the idea of ‘the Web in your pocket’ that draws many people to these new digital devices. And these new devices all run ‘the Web.’ Each of the major platforms — iOS, Android, BlackBerry, Windows Phone, Palm WebOS, Kindle — has versions of their devices that support current Web technologies like HTML5, CSS, and JavaScript. This opens up a whole new field of opportunity for you — as a publisher — to think about delivering amazing digital experiences, not just a ‘digital edition.’

If you don’t know where to start exploring what’s possible, here are a few pointers:

Hopefully that will give you a place to start. Still have questions? Drop me a line. Experimenting with a framework, technology, or approach not listed above? Leave a comment.

June 13 2011

23:20

A beautiful book sprint for Beautiful Trouble: Tips on collaboratively writing a book.

I’m just heading back to Toronto after what I would consider to be an incredibly successful “book sprint” for the Beautiful Trouble project.

What’s a book sprint?

Basically, we brought together a group of fourteen (incredibly talented and generous) contributors — both physically in NYC and remotely — for a weekend of focused writing. People came from far-and-wide: from as close as Brooklyn, Cleveland and Pennsylvania, and as far as Berlin, Denmark, and Paris.

At the end of the weekend, the group had started work on more than 70 articles and written more than 30,000 words. I don’t think it would be an exaggeration to say: we crushed it.

If you’re working on a book or documentation project with many contributors who are working on discrete pieces of content, here are a few tips on how to run your own book sprint.

First, from the venerable Allen “Gunner” Gunn channeling the venerable Adam Hyde:

  • Focus: The key to the overall success of the book sprint is focus. Staying focused on the tangible outcomes, and the steps that need to be facilitated to get there, helps to ensure that actual work gets done.

  • Deliverables: Be clear about what you’re asking people to write. To accomplish this, we wrote example content and created templates (with word counts, etc.) for each type of content. We asked participants to work from those templates and examples. We used Google Docs for all of this: the templates, examples, and assignments.

  • Output: Put the attention of the gathering on output — generating the raw number of words necessary to gather some momentum. Not everything is going to be great, but that is what editors are for.

  • Distractions: Probably the best advice that Gunner gave us is “make sure the work sprint doesn’t turn into a brainstorm sprint.” We really took this to heart and had people focused on writing for about 70-80% of the weekend. The brainstorming we did do was not about the book’s content.

Logistically, we made sure to:

  • Have one person that is facilitating, not writing: This was Andrew and his role was to coach people, spot edit, and to put wind in our sails. Basically, he said “if you’re blocked, come talk to me.” He and Duncan also used a bullhorn to berate us with calls to work harder (or to take group yoga breaks).

  • Make remote participants visible (and vice-versa): To bring the energy of the in-person sprint to our remote participants, I used Ustream.tv to broadcast a continuous window into what was happening in the room in NYC. To bring remote participants into the conversation, I used Skype (with somewhat limited success) and the live stream chat tool. For the next sprint, I’ll probably use an audio conferencing system instead of Skype for our larger group check-ins to ensure that both the people in the room, and the remote participants, can talk to each other.

  • Make editors available: Our tireless editor Dave Oswald Mitchell did a fantastic job working through articles from Paris, but — ultimately — just having two editors for the weekend was a bottleneck. Ideally, we would have had more editors available at the in-person event to work with contributors. Having contributors read each other’s work was helpful, but needs to be facilitated to ensure that peer-based work is useful and not counter-productive.

  • Make progress visible: Allen had suggested that we announce when finished pieces were coming off the pipeline as a way to keep spirits up, especially for the remote participants. However, I took that a step further and created a regularly updated “Book Sprint Leader Board” (screenshot at the top of this post) to add a little fun competitive energy into the mix. Next time, I’ll probably take that a bit further and have it list the number of pieces in progress and completed by author, along with their total word count and something like ‘velocity.’ I’m probably getting carried away — but hey!

Format-wise: We started with a really long day on Saturday — started early and finished late — and limited the post-event socializing to ensure that contributors had lots of energy on day two.

On day two, we started early and finished early. We did a large group brainstorm that day on where we wanted to take the Web compendium of the Beautiful Trouble book — perfect timing, as the folks in the room had been immersed in the content all weekend. Then we encouraged people to not start anything new, and — instead — to focus on finishing up any articles that they had already started.

At the end of day Sunday, we went out for celebratory drinks, food, and — for the exceptionally brave — a screening of Super 8.

There’s our recipe for a successful book sprint. Your mileage may vary.

June 11 2011

12:19

Storytelling - The Atavist: multimedia enriched digital magazine experience

Read Write Web :: The structure for The Atavist is similar to a magazine at first glance: create an assignment for a freelance journalist, who goes out and writes a story. However, The Atavist is much more involved in the creation of a story than a traditional magazine publisher. While the writer goes out and gets the core story, The Atavist gathers other media around it and creates a multimedia package for their apps.

As Co-founder Evan Ratliff explained, "we actually have control over the [publishing] environment. We can build our own way of seeing the story." The result is like a combination of documentary and magazine article.

Continue reading Richard MacManus at www.readwriteweb.com

February 15 2011

18:00

1,900 copies: How a top-selling Kindle Single is generating new audiences for ProPublica

Listing the eight big trends journalism will see over the next year, Josh highlighted the increasing role that the singles model will play in the news. He was talking about the disaggregation of the author and the publisher — “a way for an individual writer to kind of go around getting the approval of a glossy magazine editor or getting a newspaper editor’s approval to get something to an audience.” But the idea has another intriguing twist, as well: individual news organizations using the singles model to circumvent traditional constraints on publishing.

One outlet that’s making a go of that approach is ProPublica, which, at the launch of Amazon’s Kindle Singles platform late last month, published a story as a Kindle Single: staff writer Sebastian Rotella’s nearly 13,000-word-long exposé, “Pakistan and the Mumbai Attacks: The Untold Story.”

The piece, also available for free on the web, is a work of long-form investigative journalism, telling the story of the complex stew of relationships and circumstances that led to the 2008 terror attack on Mumbai. It’s long for a web piece, short for a book — right in the sweet spot Kindle Singles are trying to hit.

Selling it through Amazon for 99 cents was an experiment, Richard Tofel, ProPublica’s general manager, told me. From the looks of things, though, it’s been a successful one: The story’s been a regular in the top 10 of Kindle Singles bestsellers (it’s been as high as #2, as far as I’ve seen; it’s #6 at this writing). It’s also currently #1 in books about both terrorism and international security — and #1, for that matter, across all books (e- and otherwise) in Amazon’s International Security category. That, with almost no marketing effort (besides the placement of the story on Amazon’s site itself) on the part of ProPublica.

And if you’re wondering what being a top-10 Kindle Single gets you in terms of actual sales: In the first two weeks of its availability in the Singles Store, the piece sold more than 1,900 copies, Tofel says.

The 1,900 sales number is certainly not a lot compared to other metrics (pageviews and the like). And given the story’s 99-cent pricing (the minimum amount for a Single) and the 35 percent royalty rate, the direct financial gain isn’t much, either. “The money will be nice, but even if you multiply the eventual sales of this by ten — and multiply that by 20 — it still doesn’t turn into enough money to float our boat,” Tofel notes.

Then again, pageviews — while they’re good at measuring a story’s popularity and decent at measuring its impact — don’t accrue to much direct financial gain, either, even on a site that accepts advertising. The Singles model, instead, allows ProPublica to take a new twist on the old “diversify your assets” maxim: It’s one more revenue stream for the outfit. And, given the broad brand exposure that being listed on Amazon’s site allows, the Singles model could allow separate (and sometimes contradictory) goals to be achieved on the same publishing platform: editorial impact and financial gain.

Besides, the value proposition here lies more in the cultural proposition that the Kindle Single and its counterparts represent: the editorial normalization of long-form. The web isn’t bringing about the long-predicted “death of long-form”; on the contrary, it seems, the digital world is heralding a renaissance in long-form reportage. “The economics of book and magazine publishing for the last 100 years have had the effect of saying that you cannot write narrative nonfiction at longer than 10,000 words,” Tofel says — and, for that matter, shorter than book length.

Sheri Fink’s story on the chaos at a hospital devastated by Hurricane Katrina — the New York Times Magazine piece that won ProPublica its Pulitzer — was about the same length as Rotella’s story, around 13,000 words. And “that’s pretty much the outer edge of the range for a magazine piece,” Tofel notes. On the other side of the equation, you have books, where short of a certain length, Tofel notes, “it’s hard to charge enough for a book to make money.” Essentially, magazines have had a maximum length for stories, while books have had a minimum. The end result: “There’s this void,” Tofel notes. “And the void is dictated not by narrative, but by economics.” Despite the web’s ability to remove the physical constraints from the editorial process, until now, there hasn’t been a platform that’s been well suited to the length. Journalism hasn’t had its equivalent of the novella.

“One of the things that people were saying a few years ago is that long-form is dead,” Tofel notes. In reality, though, “long-form was never alive as a mass medium.”

Selling 1,900 copies at 99 cents doesn’t make for a mass medium either, exactly, but the hope is that the Singles model might allow for a kind of renaissance of pamphlet, with benefits accruing to reported pieces. The platform allows users to get used (re-used) to the idea of journalism as a long-form, immersive proposition. And for an outfit whose specialty is deep-dive, attention-requiring narrative, that’s valuable. “Anything that promotes ways for people to effectively consume long-form journalism in the modern world is good for us,” Tofel notes. If the big question is whether there’s an audience for longform, he says, “this, to us, looks like an interesting way to find an audience.”

November 16 2010

19:30

Google News experiments with metatags for publishers to give “credit where credit is due”

One of the biggest challenges Google News faces is one that seems navel-gazingly philosophical, but is in fact completely practical: how to determine authorship. In the glut of information on the web, much of it is, if not completely duplicative, then at least derivative of a primary source. Google is trying to build a way to bake an article’s originality into its no-humans-used algorithm.

Today, it’s rolling out an experiment that hopes to tackle the “original authorship” problem: two new metatags, syndication-source and original-source, intended to attribute authorship, via URLs, into the back end of news on the web. Though the tags will work in slightly different ways, Googlers Eric Weigle and Abe Epton note in a blog post, “for both the aim is to allow publishers to take credit for their work and give credit to other journalists.”

Metatags are just one of the many tools Google uses to determine which articles most deserve news consumers’ attention. They work, essentially, by including data about articles within webpages, data that help inform Google’s search algorithms. Google itself already relies on such tagging to help its main search engine read and contextualize the web. (Remember Rupert Murdoch’s so-far-unrealized threats to opt out of Google searches? He would have done it with a noindex tag.)

The tags are simple lines of HTML:

<meta name="syndication-source" content="http://www.example.com/wire_story_1.html">

<meta name="original-source" content="http://www.example.com/scoop_article_2.html">

And they’ll work, Weigle and Epton explain, like this:

syndication-source indicates the preferred URL for a syndicated article. If two versions of an article are exactly the same, or only very slightly modified, we’re asking publishers to use syndication-source to point us to the one they would like Google News to use. For example, if Publisher X syndicates stories to Publisher Y, both should put the following metatag on those articles:

original-source indicates the URL of the first article to report on a story. We encourage publishers to use this metatag to give credit to the source that broke the story. We recognize that this can sometimes be tough to determine. But the intent of this tag is to reward hard work and journalistic enterprise.

(This latter, original-source, is similar to Google’s canonical tag — but original-source will be specific to Google News rather than all of Google’s crawlers.)
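By way of illustration, a publisher’s templates could emit these tags from whatever the CMS already knows about a story. The tag names and format are Google’s, as quoted above; the helper function and article fields below are hypothetical:

    def source_metatags(syndicated_from=None, first_reported_at=None):
        # Build the Google News source metatags for an article page, skipping any that don't apply.
        tags = []
        if syndicated_from:    # preferred URL for a syndicated copy of this article
            tags.append(f'<meta name="syndication-source" content="{syndicated_from}">')
        if first_reported_at:  # URL of the outlet that broke the story
            tags.append(f'<meta name="original-source" content="{first_reported_at}">')
        return "\n".join(tags)

    print(source_metatags(
        syndicated_from="http://www.example.com/wire_story_1.html",
        first_reported_at="http://www.example.com/scoop_article_2.html",
    ))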

Google News is asking publishers to use the new tags under the broad logic that “credit where credit is due” will benefit everyone: users, publishers, and Google. A karma-via-code kind of thing. So, yep: Google News, in its latest attempt to work directly with news publishers, is trusting competing news organizations to credit each other. And it’s also, interestingly, relying on publishers to take an active role in developing its own news search algorithms. In some sense, this is an experiment in crowdsourcing — with news publishers being the crowd.

At the moment, there are no ready-made tools for publishers to use these tags in their webpages — although one presumes, if they get any traction at all, there’ll be plugins for many of the various content management systems in use at news organizations.

The tags, for any would-be Google Gamers out there, won’t affect articles’ ranking in Google News — at least not yet. (Sorry, folks!) What it will do, however, is provide Google with some valuable data — not just about how its new tags work, but also about how willing news publishers prove to be when it comes to the still-touchy process of credit-giving. That’s a question Google News has been trying to tackle for some time. “We think it is a promising method for detecting originality among a diverse set of news articles,” the tags’ explanation page notes, “but we won’t know for sure until we’ve seen a lot of data. By releasing this tag, we’re asking publishers to participate in an experiment that we hope will improve Google News and, ultimately, online journalism.”

October 07 2010

12:22

#WEFHamburg: Values at the heart of a news organisation’s journalism, structure and business

The panel was called “How to break away from ‘he said yesterday’ journalism?”, but the discussion moved on to what values should be at the heart of a news organisation’s journalism, structure and business.

Some valuable advice came from Francisco Amarai, director of design studio and media consultancy Cases i Associats and formerly artistic director and executive editor of Correio Braziliense.

Successful newspapers see the news through the eyes of their readers, he said. And through print and online design and editorial choices, newspapers can rethink the relationship that they have with their readers.

According to Amarai, newspapers that are successful:

  • have well-defined values;
  • know their readers;
  • are newsy;
  • have talented staff in their newsrooms, who can offer their own points of view as well as news;
  • and have time.

In discussing time, he referred to the restructuring of O Estado de Sao Paulo in March this year. The paper decided to lengthen its editing time, starting checks, editing and layouts earlier in the day. Since the change in working patterns, circulation has increased by eight per cent in six months and page views have grown by 110 per cent over the past 12 months.

Fellow panellist Abdel-Moneim Said, chair of the Al Ahram Group in Egypt, said newspapers need to see themselves as part of a media house, not just a publishing house.

“We’re not journalists, we’re part of a larger family called media, which means to inform people in a variety of ways,” he said, adding that “different moods [of people] will call for different ways of getting information” and different means of deriving revenue.



September 17 2010

19:10

5 Mistakes That Make Local Blogs Fail

So you're thinking about starting a local blog. Maybe you're a reporter tired of office politics and lowest-common-denominator assignments. Maybe you're a neighborhood gadfly who wants to create a new place for locals to gather. Maybe you're a realtor who wants to generate new leads.

Whichever it is, your local blog, like most new things, will probably fail.

It will fail to support you. 

It will fail to win an audience. 

It will fail to have real impact in your community.

I meet a lot of local bloggers and people thinking about starting local blogs who ask me for tips or for feedback. After several of these conversations, it seemed useful to pull the advice together in one place, modeled after a great piece Paul Graham of YCombinator wrote back in 2006. He found 18 mistakes that kill startups. I think the mistakes that kill local blogs can be condensed down to five.

Let's break them down.

Five Mistakes

#1. You're doing it alone.

The first reason your local blog will fail is because you don't have the right people working on it. Notice I said "people." No, you will not succeed working on this alone.

As a solo local blog founder, you alone will be responsible for creating the content, editing it, distributing it, selling ads around it, promoting it, collecting payment, accounting for the money collected and spent, and then covering all your legal bases. That's an incredible amount of work. More importantly, any time spent on any one of these tasks is time NOT spent on the others. If you go it alone, your business will be single-threaded. Everything will have to run through you before it can happen and you can't always be available. In a single-threaded business, if the one agent needs to take a break, everything else grinds to a halt. 

As Graham puts it: "When you have multiple founders, esprit de corps binds them together in a way that seems to violate conservation laws. Each thinks "I can't let my friends down." This is one of the most powerful forces in human nature, and it's missing when there's just one founder." If it's really just you, then your team is weak and your blog will fail.

#2. You don't know your market.

The next reason your blog will fail is because you didn't do your homework. In the case of the local reporter who's been covering her beat for a few years, yes, she knows her subject matter inside and out, but that's just the tip of the iceberg of necessary knowledge for building a business around it. For example, does she know:

a. How many people are actively looking for coverage of her beat?

b. The average incomes of those people?

c. How many of them have Internet access?

d. How much time they spend online?

e. What businesses or organizations would like to reach those people?

f. How much money they spend annually in doing so?

I could go on. My experience has been that very, very few local bloggers have answered any of these questions or have any intention of answering them in the course of working on their blog. And these are not tricky, obscure questions. These are questions that any business founder would need to answer in order to be taken seriously or stand a chance at success. If you don't know these things, then you didn't do your homework and your blog will fail.

#3. Your content is weak

The third reason your blog will fail is because your content stinks. It stinks because it lacks a point of view and it fails to address a real, general human problem.

Whether you're a trained journalist, a neighborhood gadfly, or a realtor, your content probably lacks a point of view. As a newspaper reporter, you were trained to be objective. As a gadfly, you have relationships around the community that you have to protect and worry about. As a realtor, you will never say anything bad about the community you cover and therefore will be a bore.

Your blog has to have a point of view and a voice because people only engage with things they can wrap their heads around and get familiar with. Your local blog will only succeed if it wins an audience. You win an audience by building relationships between your stories and readers. No one relates well to something they don't know and understand. Your blog has to have strong, easily remembered stances on local issues people actually care about or it will fail. Groupon is a company that sells deals, not local news per se, but they have a phenomenal grasp of the voice and point of view of their content. Read their style guide here.

Which brings us to the other reason your content is weak. It's weak because no one wants to read it. And no one wants to read it because it doesn't address any real, general human problem. For all the bluster about hyper-local coverage and blogging over the last five years, my experience running a city-specific social news site, where people vote for the stories they're actually interested in, makes it pretty clear that most people don't give a fig about what's happening day in and day out in their local elected bodies. That stuff matters a great deal to other elected officials, people who do business with elected officials, and the political/news nerds in your community, but that's it.

If your local blog is focused on covering local government, it should be a subscriber-only, paid newsletter that goes out to just those people. It should only be a public blog if there's mass interest in the subject matter, which there just isn't for a lot of the stories showing up on hyper-local blogs. If your content lacks a point of view and is centered around things that the general public isn't interested in, it will fail.

#4. You haven't thought through your business model

Let's assume you figured all this stuff out. Now how are you going to make money? Ads, you say? Okay, great. Have you answered these questions?

 -What kind of ads? Banners? Text links?  Sponsored posts? Real-time ads?  

 -Who's going to sell them?

 -How are they going to sell them?

 -What are you going to charge? 

 -Who are you going to sell them to? 

 -What's the value proposition of buying your ads over someone else's?

 -How many ads do you need to sell to cover your costs?

 -What the heck are your costs?

Until you answer these questions and more like them, your blog will make no money and it will fail.

#5. You have no distribution strategy

Finally, your local blog is going to fail because you can't distribute it to enough people. If your local blog is ad-supported, then your ads are your product and your content is a marketing tool created to bring people to look at your ads. In order for you to sell ads, you need to have people coming to look at them. You need eyeballs on your blog. How will you get them? 

Twitter and Facebook are good but not great answers here. Both can drive significant traffic but require a lot of work on your end. Also, their purposes are at odds with yours. Facebook and Twitter are your competitors. They sell ads to the same people you probably want to sell ads to. They would be perfectly happy if you didn't start a blog at all and just started a Twitter/Facebook account and posted your content there. If you are a local blogger, Facebook and Twitter, not your local paper, are your biggest threats. Why should someone visit your blog when they can read your headlines alongside other neighborhood headlines over there? They are useful but can't be your main tools.

Search could be a win for you, but have you devised a search engine optimization strategy?

Partnering with established sites could produce regular traffic and great visibility, but have you had formal conversations with other publishers about that? These things don't just happen.   Unless you have a formal, structured plan for how people are going to find you and see your ads on a regular basis, your local blog will fail.

Conclusion

In the end, the main mistake is looking at it wrong. You are not starting a blog, you are launching a small business. You are no different from the guy opening a bar up the road. You are both starting small, local businesses. You need to know something about blogging and social media, yes, but what you really need to bone up on is what it takes to run a small business. Instead of going to the local blogger meetups in your city, you should go to the local small business owner and entrepreneur meetups. Instead of following the latest social media news, you need to read up on the latest advertising, marketing, and search strategies showing results for actual media entrepreneurs in the field. This is the main mistake local bloggers make that dooms their efforts.

But if you can avoid this mistake and the five listed above, you'll have a chance to start something that will sustain you and have a real impact on your community. That's a special thing.

There are opportunities out there for local blogs; they just need to be considered and approached with the right frame of mind.

Thanks to @tracysamantha, @kiyoshimartinez, and @annatarkov for reading drafts of this.

April 20 2010

20:44

Printcasting Plans Mobile Expansion With FeedBrewer

About two years ago, I wrote up an idea for how to leverage standardized web content to create locally-targeted publications with less time, money and software than ever before. The technology and content would be digital, but the output would be optimized for physical distribution as printable PDF magazines. That concept became Printcasting and it earned us a Knight News Challenge grant.

We're still extremely busy with Printcasting and are working on multiple tracks over the next six weeks before our grant ends. We're finishing up version 2.0 of the Printcasting system on Drupal 6 and preparing to open-source everything, including the Drupal 5 version that powers the existing site. And we're also helping partners, such as Temple University's Philadelphia Neighborhoods in Philadelphia, which just printed 500 copies of its Printcasts and distributed them to the urban neighborhoods it serves. (Read more about what they're up to here). Here's a picture of PhiladelphiaNeighborhoods co-directors Linn Washington and Christopher Harper proudly displaying their first print editions.

PhiladelphiaNeighborhoods.com Printcasts

But we're also planning ahead for what comes after Printcasting. So today, I'm very excited to announce the formation of a new for-profit company and future product called FeedBrewer.

FeedBrewer

We're starting FeedBrewer out with a small bootstrap team, with me as President/CEO and Product Manager, Printcasting designer Don Hajicek as the COO, and Drupal developer Andy Lasda as CTO. Learn more about FeedBrewer and its mission on our site.

In addition to maintaining the free Printcasting.com service, which has been acquired from The Bakersfield Californian by FeedBrewer Inc. in exchange for an equity stake, FeedBrewer will expand Printcasting's democratized-publishing approach to apply to more than just print. We'll be adding additional outputs for smartphones, starting with the iPhone and Blackberry, and tablet computers, including Apple's new iPad.


The FeedBrewer Approach

FeedBrewer is a publishing approach that works with almost any standards-based online publishing system. It can best be described as Publish Once, Distribute Everywhere.

What exactly does that mean? Here's what we say on the FeedBrewer.com home page.

"FeedBrewer is a one-stop shop for designing, publishing and distributing your content on multiple platforms -- including e-readers, mobile devices, e-mail and printable PDF magazines. You can even use it to redesign parts of your existing website. You don't need to change how you publish content now to use FeedBrewer. Simply provide the RSS feed from your blogging tool or content management system, choose a design scheme, and we'll do the rest."

In other words, provide an RSS feed, check off some boxes for the outputs you want, and FeedBrewer will let anyone become a multi-platform publisher in just five minutes.
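To make that concrete, the input is nothing exotic: a minimal RSS 2.0 feed of the kind most blogging tools already emit should be enough. The feed below is a placeholder sketch for illustration, not output from any real site:

<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Neighborhood News</title>
    <link>http://www.example.com/</link>
    <description>Placeholder feed for illustration</description>
    <item>
      <title>Example story headline</title>
      <link>http://www.example.com/example-story.html</link>
      <pubDate>Tue, 20 Apr 2010 12:00:00 GMT</pubDate>
      <description>A short summary of the story.</description>
    </item>
  </channel>
</rss>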

Rethinking Print as Mobile Content

Sounds a lot like Printcasting, doesn't it? It should, because we're simply expanding the concept of print publishing to portable publishing. In our new thinking, printable content is subsumed under the mobile meme. That may sound like a stretch to some, but it makes sense if you think of print as the original mobile / portable format.

In addition, Printcasts already work on mobile devices that display PDFs, such as the iPhone and iPad. They're purely digital products that exist solely in The Cloud until someone decides to send them to a printer or view them on a mobile device.

To prove this point, here's a picture of a Printcast on an iPad, which I brought up by going to Printcasting.com, clicking into a microsite, and clicking a "Download PDF" button. You have the same experience whether you look at the publication on a tablet like this, or by reading it on paper.

FeedBrewer will use many of the same Drupal modules we created for Printcasting for feed aggregation and designed output. We will simply build additional FeedBrewer modules that can plug into a basic Printcasting installation that will enable output for different mobile devices.

The fact that we can do this speaks to the highly structured nature of the new Printcasting 2.0 system on Drupal 6 which, once open-sourced, can be used by anyone in this way. We know that we will be one of many different parties using the open-source Printcasting tools, and as the maintainer of those modules we look forward to seeing what other developers can do with them.

Our Business Model

Since FeedBrewer will be for-profit and no longer grant-funded, its business model will rely on paid services. Starting June 1, we will begin building customized installations of Printcasting and, eventually, FeedBrewer for premium customers. (Interested parties can send us a note via our contact page). But please note that we do plan to continue to maintain free services on Printcasting.com, and eventually FeedBrewer.com. At a future date, we will begin to offer paid upsells for a monthly fee.


This new "software as a service" approach is a departure from our experimental business model for Printcasting, which relied on taking a cut of self-serve advertising revenue. While we will continue to experiment with new advertising revenue models, we see more near-term potential in providing value-added services to publishers who are trying to publish in an increasing number of channels with limited or shrinking resources. They will be able to monetize their publications using their existing ad networks, which is what Printcasting partners have been asking us to do from the beginning.

On the financial front, we are also beginning to reach out to investors. Anyone interested in being a financial partner in FeedBrewer's future can contact us at news@feedbrewer.net or through our contact page.

Looking Ahead, and a Big "Thank You"

What's next? We will begin building out the FeedBrewer tools in June and hope to begin alpha testing this summer. You can enter your e-mail address into this form to be notified as soon as our alpha is ready. And you can stay up to date by subscribing to our blog and Twitter feed.

I would also like to send out a huge Thank You to the John S. and James L. Knight Foundation, whose initial funding of Printcasting made future things possible -- including our new company, but also many other projects that will use the Printcasting code in the future. We recognize the role that philanthropy played in our development, and while we will operate as a for-profit company we feel our future mission is still very much in line with the goals of the Knight News Challenge. Our objective always has been, and will continue to be, to preserve the news and information function of local communities. Mobile is an increasingly important part of that.

We're also thrilled that we'll still be able to work with The Bakersfield Californian, where I started Printcasting in my previous role as Senior Manager of Digital Products. In addition to being a shareholder in FeedBrewer, the Californian is also signing on as our first paying customer. In my six years as a Californian employee, I've been privileged to play a critical role in its evolution from a single-product, print-centric newspaper to a multi-platform, cross-media information company. My hope is that through FeedBrewer, we can help them and others in the next big transition to portable "anywhere" content.

March 12 2010

16:10

AFP: Online pay model will be ‘critical second revenue stream’ says Sulzberger

New York Times publisher Arthur Sulzberger says that charging for the paper’s online content will provide a “critical second revenue stream”.

Speaking at the Bloomberg BusinessWeek 2010 Media Summit, Sulzberger also reassured readers that the print edition of the paper will continue for many years to come:

It’s a critical part of today, it will be a critical part I think for many years to come (…) The iPad is also going to be a critical part just the way the Kindle’s a critical part.

At the end of the day we can’t define ourselves by our method of distribution (…) What we care about at the end of day is our journalism, our quality journalism.

Full story at this link…

