
June 28 2013

15:17

“When Editors Design: Controlling Presentation In Structured Content”

Good piece in Smashing Magazine by Lullabot’s Jeff Eaton on how to build a CMS that privileges structured content while also being useful to editors. (This is some of the same turf we covered with Karen McGrane back in January.) Among the highlights:

— Rather than building a manual layout engine, create cues for story priority and let the layout be determined by sorting rules (see the sketch after this list).

When we started talking to the editorial team at a major news website, we learned that they wanted to control where articles appeared on the home page — and all of the website’s topical landing pages as well. When we dug deeper and presented simple prototypes, however, we discovered that they meant something different. What the editors really needed were ways to prioritize and organize content on the home page. On their old website, direct manipulation of each page’s layout was the only tool they had, and they were afraid to lose it.

— Use a mixture of in-article shortcodes and custom fields to balance out the requirement for exact asset placement vs. mere association.

— Don’t ruin your core templates to deal with a few oddball pages that don’t fit; let them live off to the side, taxonomically.
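To make the first point concrete, here is a minimal sketch of priority cues plus sorting rules, in Python. The field names and weights are invented for illustration; they are not from Eaton's article or any particular CMS.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Story:
    title: str
    priority: int          # editor-assigned cue: 0 = normal, 1 = featured, 2 = lead
    published: datetime

def home_page_order(stories):
    """Order stories by editorial priority first, then recency.

    Editors never touch layout directly; they only set the cue,
    and the sorting rule decides where each story lands.
    """
    return sorted(stories, key=lambda s: (-s.priority, -s.published.timestamp()))

stories = [
    Story("Council passes budget", 0, datetime(2013, 6, 27, 9, 0, tzinfo=timezone.utc)),
    Story("Mayor resigns", 2, datetime(2013, 6, 27, 8, 0, tzinfo=timezone.utc)),
    Story("Weekend arts roundup", 1, datetime(2013, 6, 26, 17, 0, tzinfo=timezone.utc)),
]

for story in home_page_order(stories):
    print(story.priority, story.title)
```

The same cue can drive the home page and every topical landing page that reuses the rule, which is the trade Eaton describes: editors keep control of priority without hand-placing items.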

Some smart thinking in here. (Jeff Eaton is also host of the Insert Content Here content strategy podcast.)

May 29 2013

14:00

Scott Lewis: Learning from social platforms to build a better news site

About four years ago, I nervously sat at a roundtable between Madeleine Albright and Alberto Ibargüen, CEO of the Knight Foundation. Next to Ibargüen was Marissa Mayer, then an executive at Google. She was co-chair of a commission Knight put together to study the information needs of communities at the height of what seemed like a crisis for news and civics.

During the discussion, Mayer described her vision of a hyper-personalized news stream. News publishers, she said, needed to learn from what social media and YouTube were doing. Here’s how a writeup of the gathering later paraphrased her remarks about a new type of news publishing:

Users could get a constant stream of content based on their interests, on what is good for them or on the popular ethos. They could also introduce serendipity. These streams could be available by subscription. They could also involve hyper-personalized, well-targeted advertising that would be engaging.

While Mayer spoke, Ibargüen leaned over to me. He quietly said I should do that on the Voice of San Diego’s website. He would help if I gave it a go.

And that’s when I got the same old feeling I’ve gotten for years: dread. Once again, I would have to reveal how truly far behind on technology we were. We were almost imposters. Counterparts and leaders in our industry across the nation had called Voice of San Diego digital pioneers. Yet we knew next to nothing about technology and had put a paltry amount of resources into it.

Four years later, Mayer runs Yahoo and now Tumblr. I’d like to think she is heading furiously toward her vision of a hyper-personalized news and content experience. I’d like to think I finally am, too. I just couldn’t afford Tumblr. Or anything.

A mission to educate

Because of how Voice of San Diego started and how we’ve grown, we’ve never built up the kind of capital to make a major investment in technology. If we added resources, it was always writers. Then, the focus was on sustainability, and diversifying the money coming in to make the organization stronger and, frankly, to make payroll.

In fact, resource strain has defined us, and in some ways has been an asset. To do cool things, we needed partners. We created innovative relationships that became national standards. Our paucity obligated us to focus. A focused reporting staff distinguished Voice of San Diego for its investigative work.

Thrift, however, also pushed us to use an affordable content management system to run our website. It was Blox, the main product of the well-run, customer service-oriented TownNews.com in Moline, Ill.

I love TownNews.com. Without TownNews.com, we would not have achieved anything we did. The team there truly made the barrier to entry low and we turned the opportunity it provided us into a local institution. But we were only one of a couple of web-native clients for TownNews.com, which mainly services many hundreds of newspapers. Those newspaper publishers are still focused on one primary mission for their websites: Display daily posts and sell advertising next to them.

That’s not Voice of San Diego’s mission. Our mission is to help people get information. It is an educational mission. That’s why we have the nonprofit status we do.

If your job is to help people get educated, you can’t just display stories. Imagine a university that simply invited students into a room with huge posters and pictures and expected them to find everything they needed. Everywhere I look, news sites remain committed to simply displaying their stories and images. At the same time, social sites keep working on how to serve users.

And we’re watching social media eat news sites’ lunch. We’re gawking at an act of bullying taking place right before our eyes. When newspapers write about Mayer’s dream of well-targeted, engaging advertising and her visions for Tumblr, do they realize that’s money newspapers are not going to get?

Falling short

We’ve fallen many years behind social media platforms in serving users. Some news publishers have ceded the ground completely. They let Facebook run their social layer or rely on YouTube for their video sharing.

I’ve been watching this develop for years. Two years ago, I was positively despondent. I went so far as to dream that Facebook itself would create a content management system for news publishers. I’d be the first to sign up.

How far are we from an actual Facebook- or Tumblr-based news organization? Are you a news publisher? Ask yourself what your CMS does that Tumblr doesn’t. Mayer’s vision of a hyper-personalized news stream isn’t just something she thinks should happen. It is something that will happen. Are news organizations going to be a part of it?

If so, we have to stop working solely to display our content well and start working to serve our users well. Those are not mutually exclusive, but they are different.

Let me rephrase: If we think our community is going to pay for our services (as many, including Voice of San Diego, The New York Times, and Andrew Sullivan do), then we absolutely have to learn how to serve users.

It doesn’t mean that we compete with social media platforms. That ship has sailed. But social is as much about a way of doing things as it is a technology. Social platforms, for instance, have taught us a few things that users now expect. Here are three:

  • You should expect to be notified if something you “follow” is updated.
  • Anyone should be allowed to submit content. It should be easy to do and its success is dependent on the community.
  • You should be able to relentlessly tailor your feed of information, bringing it closer and closer to what Mayer might call a “hyper-personalized” experience.

So you can see why I was despondent. I was nowhere near being able to be part of this. The best I could hope for was to continue displaying content. Then maybe I could master social media, somehow weaving it all together to serve our users and build a loyal, grateful community.

Making the switch

This is where I was last year when I met Kelly Abbott, who runs Realtidbits, a company that provides the commenting and social layer for sites like ESPN, Cleveland.com, the Irish Times and even Lady Gaga. Abbott went from not knowing about us to being one of our most loyal readers and donating members. And then he decided he wanted to help more.

He recommended we switch content management systems. The thought made me nauseous. Anyone who knows CMS transitions knows why. But Abbott persisted. He had the same vision I did and he wanted to tackle it. Voice of San Diego was lucky enough to be a part of a great discussion in this country about the future of local news. We had an obligation to bring our technology up to speed.

Abbott created what he called an “engineer-free zone” for me. We would first solve basic website frustrations I had about mobile, search engine optimization, and commenting. But then we would dream. What would I create if I could?

I wanted to switch from an effort to display content well to one focused on serving users. Sure, our stories, photographs, and images needed to look good but my mission was to get people educated and to raise money to make the service stronger. A local foundation, Price Charities, came aboard to help us with the initiative. Then, we brought along another partner: Idea Melt, a company working to help publishers “imagine and thread beautiful, holistic, and engaging social experiences for your community.” And we chose to switch to WordPress.

Finally, last week we launched. Our stories and images look better. Our search engine indexing is much improved, and our mobile experience is better thanks to a new responsive design. We also added three new features.

  • Notifications: Users can now follow storylines, or “narratives,” on the site. If there’s a new update, they don’t need to search for a section heading; they see a notification.
  • Peer-to-peer and reader-to-author following: They can also follow individual writers, or even their peers.
  • The Plaza: Here, users can submit text, photos, links or video and their peers can vote on it to buoy it above other submissions. Yes, it’s a lot like Reddit.
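As a rough illustration of the Reddit-like ranking The Plaza implies, here is a sketch of a vote score with time decay (closer to the classic Hacker News formula than Reddit's). It is purely hypothetical; the article doesn't describe Voice of San Diego's actual algorithm.

```python
import math
from datetime import datetime, timezone

def plaza_score(votes: int, submitted: datetime, gravity: float = 1.8) -> float:
    """Votes push a submission up, age pulls it back down.

    Illustrative ranking only; the real site's algorithm is not
    described in the article.
    """
    age_hours = (datetime.now(timezone.utc) - submitted).total_seconds() / 3600
    return votes / math.pow(age_hours + 2, gravity)

# Newer submissions need fewer votes to outrank older ones.
now = datetime.now(timezone.utc)
print(plaza_score(votes=10, submitted=now))
```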

All of these features need work and we’re moving furiously on a massive to-do list. But I look at everything with different eyes now. Soon, we’ll begin building our membership system into the site. Our 1,600 members will be able to check their status, learn about events they might want to attend, and get special alerts.

What we have is a new future. We can spend it constantly evolving to serve the community more in line with our mission and our business model.

We’re a long way from the vision Mayer described. But at least we started walking.

Scott Lewis is the CEO of Voice of San Diego. You can reach him at scott.lewis@voiceofsandiego.org or on Twitter at @vosdscott.

December 30 2011

18:30

Clara Jeffery: What nonprofit news orgs are betting on for 2012

Editor’s Note: We’re wrapping up 2011 by asking some of the smartest people in journalism what the new year will bring.

Next up is Clara Jeffery, co-editor of Mother Jones.

Predictions are a chump’s game. So this is more like a window into what the editors of a small nonprofit news organization are betting on.

There is no spoon

Forget distinctions between blog posts and stories because readers don’t care. What they care about is a source — be it news org or author — that they trust and enjoy.

Data viz

We at Mother Jones had a breakout hit with our income inequality charts. 5 million readers, 240K Facebook likes, 14K tweets, and counting. Charts were pasted up on the walls of the Wisconsin state capitol during the union fight; #OWS protestors blew them up and put them on signs, and distributed them in leaflets. Partly, it was the right message at the right time. But it was also that a very complicated story was boiled down into 11 charts and that the sources for the charts’ information were provided.

More broadly, in 2011, chart fever swept media orgs — hey, USA Today, you were right all along! In 2012, I am sure we’re not the only ones who are investing in ways to make data more frequent, and more interactive.

Blur the lines between writer/producer/coder

If you want to do visual storytelling, you need people who can marry words with images, animation, video. We’re not only hiring people who have advanced data app and video skills, but we’re also training our entire editorial staff to experiment with video, make charts, and use tools like Document Cloud and Storify to enrich the reader experience. To that end, anything that makes it easier to integrate disparate forms of media — whether it’s HTML5 or Storify — is a friend to journalists.

Collaboration 2.0

There are a number of cool content collaborations out there — MoJo is in the Climate Desk collaboration with The Atlantic, Grist, Slate, Wired, CIR, and Need to Know, for example. But in retooling that project for 2012 (coming soon!), we really started thinking about collaborating with tech or content tool companies like Prezi and Storify. And why shouldn’t news orgs on the same CMS potentially collaborate on new features, sharing development time? So, for example, we, TNR, Texas Monthly, the New York Observer, and Fast Company (I think) are all on Drupal. Is there something we all want? Could we pool dev time and build a better mousetrap? We actually built a “create-your-own-cover” tool that, in keeping with the open-source ethos of Drupal (and because I’m friends with editor Jake Silverstein), we handed over to Texas Monthly; they improved on it. The biggest barrier to collaboration is bandwidth within each constituent group. But ultimately it makes sense to try to learn collectively.

Where am I?

As people increasingly get news from their social stream, the implications for news brands are profound. If nobody comes through the homepage, then every page is a homepage. Figuring out when (and if) you can convert flybys into repeat customers is a huge priority — especially for companies that have subscription or donation as part of their revenue stream. If everyone is clamoring for this, then somebody is going to invent the things we need — better traffic analysis tools, but also A/B testers like Optimizely.

It also means that being a part of curation communities — be they Reddit or Longform/Longreads — is as important as having a vibrant social media presence yourself. As is the eye candy of charts, data viz, etc. Lure them in with that, and often they’ll stay for the long feature that accompanies it.
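On the A/B testing point above: tools like Optimizely come down to deterministic bucketing, so the same visitor always sees the same variant. A hedged sketch (the experiment and visitor names are made up):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str, variants=("control", "treatment")) -> str:
    """Deterministically map a visitor to a variant.

    Hashing the visitor id together with the experiment name keeps the
    split stable across visits without storing any state server-side.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

print(assign_variant("reader-42", "donate-button-copy"))
```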

User generated content 2.0

Social media and Storify are making users into content producers in ways that earlier attempts at distributed reporting couldn’t. Especially on fast-breaking stories, they are invaluable partners in the creation process, incorporated into and filtered through verified reporting. For MoJo, for example, the social media implications surrounding our Occupy coverage were profound. We were reporting ourselves, as well as getting reports from hundreds of people on the ground. Some became trusted sources, sort of deputized reporters to augment our own. And we found ourselves serving an invaluable role as fact-checkers on the rumors that swirled around any one incident.

It was heady and often exhausting. But it won us a lot of loyal readers. We could do all that in real time on Twitter and use Storify to curate the best of what we and others were reporting on our site, beaming that back to Twitter. (And Al Jazeera’s The Stream, for example, is taking that kind of social media integration to a whole new level. Of course, it helps to be bankrolled by the Al Thanis.)

Mobile, mobile, mobile

To me, especially within the magazine world, there’s been an overemphasis on “apps,” most of which thus far aren’t so great and are often walled off from social media. But anything that improves — and monetizes — the mobile experience is a win. And any major element of what you’re offering that doesn’t work across the major devices is a sunk cost. Sorry, Flash.

Investigative reporting renaissance

Despite all the hand-wringing of a few years ago, it turns out that people do read longform on the web, on tablets and readers, and even on their phone. They love charts and graphs and animation and explainers. They want to know your sources and even look at primary documents. And they want it all tied up with voice and style. There’s no better time to be an investigative journalist.

July 28 2011

16:24

The post-post-CMS CMS: Loosely coupled monoliths & read-only APIs

Creative commons photo by limonada on Flickr

As I sat down with my colleagues on Tuesday for a little hack day on our favourite open-source content management system, we had a familiar conversation — one that is probably familiar to all people who hack on CMSs — which is, What is the future of content management?

It’s a conversation that has been unfolding and evolving for many, many years, but seems to be gaining a lot of steam again in 2011.

The essence of some of the more recent conversation is nicely summarized over on Stijn’s blog.

One of the questions presented is: will tomorrow’s CMS be a monolithic app, or a ‘confederation’ of purpose-built micro-applications — like the app that Talking Points Memo demonstrates for managing their front page, for example.

In the blog post on the TPM media lab about the ‘Twilight of the CMS,’ they describe their solution to the problem of the monolithic CMS — “a simple and flexible API that digests manifold requests from the different applications.”

As I ponder these ideas in the context of my work for online-only publishers like TheTyee.ca, I struggle with a few competing notions…

Read-only API, or Read/Write API

In the case of TheTyee.ca, the monolithic CMS writes data to a data store (Elastic Search) that directly provides the (soon to actually be public) public API. From there, various applications can request the data from the API and receive a structured JSON representation of that data back as a response. Basically, once these clients have the data, they can do what they want.

This is great. It’s a read-only scenario. The CMS is still the ‘authority’ on the state and structure of the data (along with a lot of other information), but there is an identical copy of that data in the store that provides the API.
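From a client's point of view, the read-only arrangement is as simple as it sounds. The sketch below uses a placeholder endpoint and field names, not TheTyee.ca's real API:

```python
import json
from urllib.request import urlopen

API_BASE = "https://api.example.org"   # placeholder; not the real Tyee endpoint

def fetch_story(story_id: str) -> dict:
    """Read a structured JSON representation of a story from the public API.

    The client can render, remix, or cache this however it likes; it never
    writes anything back, so the CMS stays the authority on the data.
    """
    with urlopen(f"{API_BASE}/stories/{story_id}") as response:
        return json.load(response)

if __name__ == "__main__":
    # Only works against a real endpoint; the id here is illustrative.
    story = fetch_story("2011-07-28-example")
    print(story.get("title"), story.get("teaser"))
```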

Now, let’s think about that TPM front page management app: it clearly needs read/write access to their API, because it can change not just layout, but editorial content like the story title, deck, teaser, and so on.

So, if the API is read/write, the questions I have are:

  • The schema for the documents (the stories, etc.) must be validated somewhere, right? So… does that logic live in each purpose-built app, or as a layer on top of the data store? And does that then violate a bit of the ‘Don’t Repeat Yourself’ design pattern? (See the sketch after this list for the ‘layer on top of the data store’ option.)

  • Do these content-centric writes to the API make their way back to the CMS or editorial workflow system? And, if they don’t, does that not introduce some confusion about mis-matched titles, decks, teasers, and so on? For example, say I change the title of a story on the front page, but now I see a typo in the body of the story and want to fix that, so I go into the other CMS and search for … whoops! … what was the title of that story again?

  • How does this new ‘front page app,’ or the read/write API, handle typically CMS-y things like competing or conflicting write requests? Or version control? Or audit trails of who made which edits? If one, or the other, or both, actually handle these concerns, is this not a duplication of logic that’s already in the CMS?
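Here is the ‘layer on top of the data store’ option from the first question, sketched as a single validation gate that both the CMS and a front-page app would call before writing. The schema itself is invented for illustration:

```python
REQUIRED_FIELDS = {"title", "deck", "teaser", "body"}

class SchemaError(ValueError):
    pass

def validate_story(doc: dict) -> dict:
    """Single validation layer sitting in front of the shared data store.

    Every client that wants to write goes through the same gate, so the
    schema rules live in exactly one place (the DRY-friendly option).
    """
    missing = REQUIRED_FIELDS - doc.keys()
    if missing:
        raise SchemaError(f"story is missing fields: {sorted(missing)}")
    if not doc["title"].strip():
        raise SchemaError("title may not be empty")
    return doc

validate_story({"title": "Example", "deck": "...", "teaser": "...", "body": "..."})
```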

Perhaps I’m not thinking about this right, but my gut is saying that the idea of a read/write API — a scenario where you have both a CMS (Movable Type in TPM’s case) and a ‘front page management’ app — starts to get a bit tricky when you think about all the roles that the CMS plays in the day-to-day of a site like TPM.

It gets even more tricky when you think about all the delivery mediums that have their own ‘front page’ — tablet experiences, scaled down mobile experiences, feeds, e-newsletters, and so on.

Presentation management or editorial management

The other thing that is immediately striking about the TPM demo is the bizarrely print-centric feel to the experience — I’m immediately transported back to my (very brief) days working at The Varsity where the editors and designers would literally paste up the newspaper’s pages on big boards.

For a publication like the TPM — an entirely online ‘paper’ — it seems like an odd, slightly ‘retro,’ approach in an age that is defined by content that defies containers. One must ask: where does it end? Should there be a purpose-built app for each section’s front page, e.g., Sports, Arts, Life, etc.? For each regional section? For each-and-every article?

Isn’t this just vanity at some level? Endless bit-twiddling to make things look ‘just right’? Kinda’ like those mornings when I just can’t decide whether to wear a black shirt or a white shirt and stand in front of the mirror trying them on for what seems like eternity.

So, coming back to my point: in a time when many believe (like a religion!) that content and presentation should be separated — not as an exercise, but because that content is delivered to literally hundreds of different end-user experiences (phones, tablets, readers, etc.) — do we really want to be building tools that focus on the presentation for just one of those experiences? If so, where does it end?

For the most part, the modern-day CMS has been designed to alleviate these myriad challenges by providing a way for non-technical people to input structured data, and the tools for developers to output that structured data in a variety of ways, formats, and mediums.

Watching the TPM video gives me some ideas about how to improve the experience for an editor to quickly edit headlines, decks, teasers, photos of the morning’s stories — and even to indicate their relative priority in terms of newsworthiness — but I would want to stop there, at the editorial, and let the presentation layer be handled according to the medium, device, or experience the content is being delivered to.

Loosely coupled monoliths & read-only APIs

Many moons ago, I proposed that Weinberger’s Small Pieces Loosely Joined idea held true for content management also. The proposal was simple: instead of investing in one monolithic CMS — a CMS that did everything from manage content to advertising delivery to comments to search to who-knows-what (a trend in CMS projects at the time) — an organization could choose the current ‘best in class’ solution for each need and connect them together through loose coupling. Then, if a better solution came out for, say, comments, the old system could be replaced with the newer system without having to re-build the whole enchilada.

(Of course, the flip side often is that loose coupling can feel like bubble gum and string when you have to work with it every day.)

So, while my own experience is that loose coupling is great, and that purpose-specific applications are usually better than apps that try to do everything, I would personally want to draw the line somewhere. For me, that line is between distinct ‘areas of responsibility,’ like editorial, advertising, design, community, search, and so on.

In this scenario, each area would have the authority over its own data, and the logic for how that data is structured and validated, and so on. If that data was written to a central data store that provided an API — something simple, flexible, and RESTful — the other apps in a ‘confederation’ could read from it, choose what to do with it, how to present it, and so on, but the final ‘say’ on that data would be from the app that is responsible for creating it.
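One toy way to encode that ‘final say’ rule (entirely illustrative, not anyone's real architecture): the store knows which app owns each area of responsibility, accepts writes only from the owner, and leaves reads open to the whole confederation.

```python
class ConfederatedStore:
    """Toy central store: every area of responsibility has one owning app.

    Reads are open to all clients; writes are only accepted from the app
    that is authoritative for that area, so the data logic never forks.
    """

    def __init__(self, owners: dict[str, str]):
        self.owners = owners          # e.g. {"editorial": "cms", "community": "comments-app"}
        self.data: dict[str, dict] = {area: {} for area in owners}

    def read(self, area: str, key: str):
        return self.data[area].get(key)

    def write(self, area: str, key: str, value: dict, *, client: str) -> None:
        if self.owners[area] != client:
            raise PermissionError(f"{client!r} may not write to {area!r}")
        self.data[area][key] = value

store = ConfederatedStore({"editorial": "cms", "community": "comments-app"})
store.write("editorial", "story-1", {"title": "Hello"}, client="cms")
print(store.read("editorial", "story-1"))
```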

For me, this is a sensible way to allow these apps to work in concert without having the logic about the data living in multiple places, i.e., the API, and the clients that can write to it (which makes sense if you’re Twitter with hundreds of external clients, but not if you’re one organization building exclusively internal client apps).

Would love to hear otherwise, or experiences of how others are handling this or thinking about the challenge.

May 26 2011

14:00

Topolsky and Bankoff on Engadget, SB Nation, and the new tech site that’s bringing them together

There can be a very real “through the looking glass” feel to working on a site that covers technology, especially when you start contemplating the technology of publishing. At least, that’s the situation Joshua Topolsky and his group of Engadget expats are finding themselves in as they ramp up to the fall unveiling of a new technology site that will live under the SB Nation flag.

“What we’re building and what we write about are the same thing in many ways,” Topolsky told me. “And for us that provides an incredibly unique point of observation.”

It says a lot about Topolsky, as well as his fellow Engadget-ites Nilay Patel, Ross Miller, Joanna Stern, Chris Ziegler, and Paul Miller, that while they could have spent the intervening time developing their new site in a bunker, they’ve instead decided to get out front and do what they do best, which is covering tech. They’ve been doing that on This is my next, their placeholder blog.

In migrating away from Engadget — and, in that, from the AOL/Huffington Post empire — the attraction to SB Nation, as Topolsky has written, came from the company’s publishing philosophy as much as its evolving publishing technology. As purveyors, chroniclers, and users of technology, Topolsky and his team are now in a unique position to develop a phenomenal tech site. It’s a scenario with Willy Wonk-ian overtones: They’ve been set loose in a candy store.

And yet, Topolsky told me, their aspirations are more modest than fantastical. If anything, they’re not looking to re-invent the blog or news site as we know them. They just want something that’s more adaptive both to the way stories are written and published, and to how audiences actually read them.

“We’re not trying to be Twitter or Facebook, as in this new thing people are using,” he said. “We want to be something that is just the evolved version of what we have been doing.”

The point, he said, is this: Reading on the web is an ever-changing thing, and publishers need to develop or embrace the technology that can respond to its evolution.

Topolsky isn’t releasing much information about the new site at this point, but in terms of his team’s coverage of the tech industry, he told me, they won’t be straying far from their Engadget roots. In many ways, what their Project X represents is an experiment in publishing and engagement technology, which fits in well with SB Nation’s M.O. One of the things they’re likely to be using on the site, for example, is SB Nation’s story streams, which provide constantly updated information on stories while also offering different access points to readers.

Though the site will also need to be able to accommodate things like multimedia (Topolsky said it might use something similar to The Engadget Show for that), that dynamic approach to narrative will work well for covering the latest updates on Google’s Android OS, say, or the tribulations of a phone producer like BlackBerry. “You write the news as seems appropriate and connect it automatically to a larger story, encompassing the narrative,” he said.

But what’s just as important as the tech, Topolsky pointed out, is an understanding between the editorial people and the developers, so when you need a new module or feature on the site both sides understand why — and how — it could work. In some of the more frustrating moments at Engadget, Topolsky said, he found himself having to plead his case to AOL developers in order to get site changes made.

That likely won’t be the case at SB Nation, which, as we’ve written about before, is more than willing to experiment with the blog format. It also helps that they’ve secured a healthy dose of new funding. When I spoke with SB Nation CEO Jim Bankoff, he noted that publishing companies are only as successful as the technology and people that comprise them.

“The foundation of our company is the marriage of editorial talent and technology — sometimes I say people and platform,” he said. “We really believe that to be a new media-first company you have to be based on people who understand how to craft stories online.”

But other than trying to build inventive publishing systems out of the box, what makes the difference for SB Nation is its habit of addressing regular feedback from readers, Bankoff said. The developers at SB Nation, he noted, constantly update the sites based on comments from readers and contributors. If something’s in the service of making a better product, they’ll try it, he said.

Though the audiences for sports news and tech news have their own vagaries, there are some elements — cast of players, data points, and healthy competition — that they have in common. And those will go a long way towards helping to adapt and grow SB Nation’s publishing platform, Bankoff said. “Just like sports, there is an arc to every tech story — and we’re going to be able to really convey the various milestones across any big news item.”

May 20 2011

14:30

This Week in Review: What Twitter does to us, Google News gets more local, and making links routine

Every Friday, Mark Coddington sums up the week’s top stories about the future of news.

Twitter on the brain: Last week, New York Times executive editor Bill Keller got a rise out of a lot of folks online with one of the shortest of his 21 career tweets: “#TwitterMakesYouStupid. Discuss.” Keller revealed the purpose of his social experiment this week in a column arguing, in so many words, that Twitter may be dulling your humanity, and probably making you stupid, too. Here’s the money quote: “But my inner worrywart wonders whether the new technologies overtaking us may be eroding characteristics that are essentially human: our ability to reflect, our pursuit of meaning, genuine empathy, a sense of community connected by something deeper than snark or political affinity.”

This, as you might imagine, did not go over particularly well online. There were a couple strains of reaction: Business Insider’s Henry Blodget and All Twitter’s Lauren Dugan argued that Twitter may indeed be changing us, but for the good, by helping make previously impossible connections.

Alexia Tsotsis of TechCrunch and Mike Masnick of Techdirt countered Keller by saying that while Twitter isn’t built for deep conversations, it is quite good at providing an entry point for such discussion: “What you see publicly posted on Twitter and Facebook is just the tip of the conversation iceberg,” Tsotsis said. GigaOM’s Mathew Ingram, meanwhile, defended Twitter’s true social nature, and sociologist Zeynep Tufekci gave a fantastic breakdown of what Twitter does and doesn’t do culturally and socially.

Two of the most eloquent responses were provided by Nick Bilton, one of Keller’s own employees, and by Gizmodo’s Mat Honan. Bilton pointed out that our brains have shown a remarkable ability to adapt quickly to new technologies without sacrificing old capacities. (Be sure to check out Keller’s response afterward.)

Honan made a similar argument: Keller, he said, is confusing the medium with the message, and Twitter, like any technology, is what you make it. “If you choose to do superficial things there, you will have superficial experiences. If you use it to communicate with others on a deeper level, you can have more meaningful experiences that make you smarter, build lasting relationships, and generally enhance your life,” Honan wrote.

Google gets more local with news: Google News unveiled a few interesting changes in the past week, starting with the launch of “News near you.” Google has sorted news by location for a while now, but this feature will allow smartphone users to automatically get local news wherever they are. ReadWriteWeb’s Dan Rowinski explained why newspapers should be worried about Google moving further onto their local-news turf, and GigaOM’s Mathew Ingram criticized newspapers for not coming up with something like this themselves.

Poynter’s Jeff Sonderman, on the other hand, said Google’s feature is still in need of some human curation to go with its algorithmic aggregation. That’s an area in which local newspapers can still dominate, he said, but it’ll require some technological catchup, as well as a willingness to get over fears about linking to competitors.

Another change, not publicized by Google News but spotted by the folks at Search Engine Land, was the addition of an option to allow users to filter out blogs and press releases from their results. This raised the question, what exactly does Google consider a blog? Google told Search Engine Land it relies on a variety of factors to make that decision, especially self-identification. Mathew Ingram ripped this classification, and urged Google to put everything that contains news together in Google News and let readers sort it out. (Former Lab writer Zach Seward wrote about the problems with Google News’ blog label back in 2009.)

Fitting linking into news’ workflow: A discussion about linking has been simmering on Twitter on and off over the past few weeks, and it began to come together into something useful this week. This round of the conversation started with a post by web thinker and scholar Doc Searls, who wondered why news organizations don’t link out more often. In the comments, the Chicago Tribune’s Brian Boyer suggested that one reason is that many newspapers’ CMS’s and workflows are print-centric, making linking logistically difficult.

CUNY j-prof C.W. Anderson responded that the workflow issue isn’t much of an excuse, saying, as he put it on Twitter: “At this point ‘linking’ has been around for twenty years. The fact that this is STILL a workflow issue is almost worse than not caring.” This kicked off a sprawling debate on Twitter, aptly chronicled via Storify by Mathew Ingram and Alex Byers. Ingram also wrote a post responding to a few of the themes of resistance to links, particularly the notion that information on the web is inferior to information gained by old-fashioned reporting.

British journalist Kevin Anderson took on the workflow issue in particular, noting how outdated many newspaper CMS’s are and challenging them to catch up technologically: “It’s an industrial workflow operating in a digital age. It’s really only down to ‘that’s the way we’ve always done it’ thinking that allows such a patently inefficient process to persist.” Publish2’s Scott Karp gave an idea for a solution to the CMS mess.

AOL’s continued makeover: Another week, another slew of personnel moves at AOL. PaidContent’s David Kaplan reported that AOL is hiring “a bunch” of new (paid) editors and shuffling some current employees around after its layoff of hundreds this spring. Overall, Kaplan wrote, this is part of the continued effort to put the Huffington Post’s stamp on AOL’s editorial products.

One of the AOL entities most affected by the shifts is Seed, which had been a freelance network, but will now fall under AOL’s advertising area as a business-to-business product. Saul Hansell, who was hired in 2009 to run Seed, is moving to HuffPo to edit its new “Big News” features. In a blog post, Hansell talked about what this means for HuffPo and for Seed.

Meanwhile, the company is also rolling out AOL Industry, a set of B2B sites covering energy, defense, and government. But wait, that’s not all: AOL’s Patch is launching 33 new sites in states targeting the 2012 election. The hyperlocal news site Street Fight also reported that Patch is urging its editors to post more often, and a group of independent local news sites is banding together to tell the world that they are not Patch, nor anything like it.

Reading roundup: As always, plenty of other stuff to get to this week.

— We mentioned a Pew report’s reference to the Drudge Report’s influence in last week’s review, and this week The New York Times’ David Carr marveled at Drudge’s continued success without many new-media bells and whistles. Poynter’s Julie Moos looked at Drudge’s traffic over the years, while the Washington Post disputed Pew’s numbers. ZDNet’s David Gewirtz had five lessons Drudge can teach the rest of the media world.

— A few paid-content items: A Nielsen survey on what people are willing to pay for various mobile services, Poynter’s Rick Edmonds on The New York Times’ events marketing for its pay plan, and the Lab’s Justin Ellis on paid-content lessons from small newspapers.

— A couple of tablet-related items: Next Issue Media, a joint effort of five publishers to sell magazines on tablets, released its first set of magazines on the Google Android-powered Samsung Galaxy. And here at the Lab, Ken Doctor expounded on the iPad as the “missing link” in news’ digital evolution.

— Columbia University announced it will launch a local news site this summer focusing on accountability journalism, and the Lab’s Megan Garber gave some more details about what Columbia’s doing with it.

— The Columbia Journalism Review’s Lauren Kirchner had an interesting conversation with Slate’s David Plotz about Slate’s aggregation efforts, and in response, Reuters’ Felix Salmon made the case for valuing aggregation skills in journalists.

— This weekend’s think piece is a musing by Maria Bustillos at The Awl on Wikipedia, Marshall McLuhan, communal knowledge-making, and the fate of the expert. Enjoy.

May 19 2011

14:19

Meet the new CMS/Same as the old CMS

“Workflow and how that is coded into the CMS is a huge issue for newspapers.” — Suw Charman-Anderson

For the past few months, I’ve been hearing a consistent message from some folks working in so-called ‘legacy’ news organizations: “our corporate content-management systems and our corporate culture are the main barrier to innovation.”

But I’m starting to wonder if that fenced-in technical reality is leading to fenced-in thinking from those that are in the best position to push for change?

I’ve been around long enough to see some news organizations change their content-management system three or four times. You know what? It didn’t change the way they think about news at all.

From Atex to Movable Type, from NewsGate to WordPress, from Interwoven to Bricolage: I’ve experienced them all over the last fifteen years. If there’s anything consistent to any CMS, it’s that they all suck. It’s just a matter of which one sucks less in your specific situation.

IMHO, the problem isn’t that news is stuck in the wrong CMS, it’s that new thinking is stuck in something much harder to get out of: the belief that change isn’t possible (or already happening).

Change is happening, it’s just not evenly distributed yet. ;)

There are ‘traditional’ media organizations with wonderful, journalist-friendly enterprise content-management systems out there — just have a look inside of MSNBC or Le Monde. Does it change their thinking dramatically? I’m not convinced.

Innovation can come from anywhere; you just have to be willing to look for it.

Tags: cms mojo

March 13 2011

17:55

VoIP Drupal Kicks Off at Drupalcon

Last week I wrote about another project that's come to a boil at the Center for Future Civic Media: VoIP Drupal.

Here is a brief video of Leo Burd lecturing at DrupalCon 2011 on the release of VoIP Drupal, a plugin that allows full interaction between the Drupal CMS and phones.



VoIP Drupal is a project of the MIT Center for Future Civic Media, with key contributions from Civic Actions.

March 08 2011

15:00

Matt Waite: To build a digital future for news, developers must be able to hack at the core of old systems

Editor’s Note: Matt Waite was until recently news technologist at the St. Petersburg Times, where — among many other projects — he was the primary developer behind PolitiFact, which won a Pulitzer Prize. He’s also been a leader for the movement to combine news and code in new and interesting ways.

Matt is now teaching journalism at the University of Nebraska and working with news orgs under the shingle Hot Type Consulting. Here, he talks about his disappointment with the pace and breadth of the evolution of coding and news apps in contemporary journalism.

Pay attention to the noise, and you start to hear signal. There’s an awakening going on — quiet and slow, but it’s there. There are voices talking about data and apps and journalism becoming more than just writers writing and editors editing. There are labs starting and partnerships forming. There was a whole conference late last month — NICAR in Raleigh — that more than ever was a creative collision of words and nerds.

It’s tempting to say that a real critical mass is afoot, marrying journalists and technologists and finally getting us to this “Future of Journalism” thing we keep hearing about. I’ve recently had a job change that’s given me some time to reflect on this movement of journalism+programming.

In a word, I’m disappointed.

Not in what’s been done. There’s some amazing work going on inside newsrooms and out, work that every news publisher and manager should be looking at with jealous, thieving eyes. Things like the Los Angeles Times crime app. It’s amazing. The Chicago Tribune elections app. ProPublica’s Docs app. The list goes on and on.

I’m disappointed in what hasn’t been done. Where we, from inside news organizations, haven’t gone. Where we haven’t been allowed to go.

To understand my disappointment, you have to understand, at a very low level, how news gets published and the minds of the people who are actually responsible for the newspaper arriving on your doorstep.

Evolution, but only on the edges

To most journalists, once copy gets through the editors, through the copy desk, and onto a page, there comes a point where magic happens and poof — the paper appears on the doorstep. But if you’ve seen it, you know it’s not magic: It’s a byzantine series of steps, through exceedingly expensive software and equipment, run in a sequence every night in a manner that can be timed with a stopwatch. Any glitch, hiccup, delay, or bump in the process is a four-alarm emergency, because at the other end of this dance is an army of trucks waiting for bundles of paper. In short, it’s got to work exactly the same way every night or piles of cash get burned by people standing around waiting.

Experimentation with the process isn’t just uncomfortable — it’s dangerous and expensive and threatens the very production of the product. In other words, it doesn’t happen unless it’s absolutely necessary and can demonstrably cut costs.

Knowing that, it’s entirely understandable why many of the people who manage newspapers — who have gone their whole professional lives with this rhythmic production model consciously and subconsciously in their minds — would view the world through that prism. Most newspapers rely on gigantic, expensive, monolithic content management systems that function very much like the production systems that print the paper every day. Inputs go in, magic happens, a website comes out. It works the same way every day or there’s hell to pay.

And around that rhythmic mode of operation, we’ve created comfortable workflows that feed it. And because it’s comfortable, there’s an amazing amount of inertia around all of it. Change is scary. The consequences down the line could be bad. We should go slow.

Now, I’m not going to tell you that experimentation is forbidden in the web space, because it’s not. But that experimentation takes place almost entirely outside the main content management system. Story here, news app there. A blog? A separate software stack. Photo galleries? Made elsewhere, embedded into a CMS page (maybe). Graphics? Same. Got something more, like a whole high school sports stats and scores system? Separate site completely, but stories stay in the CMS. You don’t get them.

In short, experiment all you want, so long as you never touch the core product.

And that is the source of my disappointment. All this talk about a digital future, about moving journalism onto the web, about innovation and saving journalism is just talk until developers are allowed to hack at the very core of the whole product. To argue otherwise is to argue that the story form, largely unchanged from print, is perfect and to change it is unnecessary. Hogwash.

The evolution of the story form

Now, I’m not saying “Trash the story form! Down with it all!” The story form has been honed over millennia. We’ve been telling stories since we invented language. A story is a very efficient means to get information from one human to another. But to believe that a story has to be a headline, byline, body copy, a publication date, maybe some tags, and maybe a photo — because that’s what some vendor’s one-size-fits-all content management system tells us is all we get — is ludicrous. It’s a dangerous blind spot just waiting to be exploited by competitors.

I believe that all stories are not the same, and that each type of story we do as journalists has opportunities to augment the work with data, structure, and context. There are opportunities to alter how a story fits into place and time. To change the atomic structure of what we do as journalists.

Imagine a crime story that had each location in the crime story stored, providing readers with maps that show not just where the crime happened, but crime rates in those areas over time and recent similar crimes, automatically generated for every crime story that gets written. A crime story that automatically grabs the arrest report or jail record for the accused and pulls it up, automatically following that arrestee and updating the mugshot with their jail status, court status, or adjudication without the reporter having to do anything. Then step back to a page that shows all crime stories and all crime data in your neighborhood or your city. The complete integration of oceans of crime data to the work of journalists, both going on every day without any real connection to each other. Rely on the journalists to tell the story, rely on the data to connect it all together in ways that users will find compelling, interesting, and educational.
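As a rough data-model sketch of what that could mean (every field and relation here is hypothetical, just a way to picture the example):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Location:
    address: str
    latitude: float
    longitude: float

@dataclass
class Arrestee:
    name: str
    booking_id: str            # key used to re-poll the jail/court feed
    case_status: str = "open"  # updated automatically, not by the reporter

@dataclass
class CrimeStory:
    headline: str
    body: str
    published: date
    locations: list[Location] = field(default_factory=list)
    arrestees: list[Arrestee] = field(default_factory=list)

    def map_points(self):
        """Feed for an auto-generated map: where it happened; nearby crime
        rates and similar incidents would be joined in from the data side."""
        return [(loc.latitude, loc.longitude) for loc in self.locations]
```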

Now take that same concept and apply it to politics. Or sports. Or restaurant reviews. Any section of the paper. Obits, wedding announcements, you name it.

Can your CMS do that? Of course it can’t. The amount of customization, the amount of experimentation, the amount of journalism that would have to go on to make that work is impossible for a vendor selling a product to do. But it’s precisely the kind of experimentation we need to be doing.

Building from the ground up

The prevailing notion in newsrooms, whether stated explicitly or just subconsciously believed, is this print-production mindset. Stories, for the most part, function as they do in print — a snapshot in time, alone by itself, unalterable after it’s stamped onto a medium and pushed into the world.

What I’ve never seen is the complete counter-argument to that mindset. The alpha to its omega. Here’s what I think that looks like:

Instead of a single monolithic system, where a baseball game story is the same as a triple murder story, general interest news websites should be a confederation of custom content management systems that handle stories of a specific type. Each system has its own features, pulling data, links, tweets and anything else that can shed light on the topic. Humans + computers. Automated aggregates where they make sense, human judgment where it’s needed. The home page is merely a master aggregation of this confederation.
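A hedged sketch of the "home page as master aggregation" idea: each story-type system exposes its own feed, and the front page simply merges whatever fields they all agree to share. The feeds below are stand-ins, not real systems.

```python
from datetime import datetime, timezone
from itertools import chain

def crime_feed():
    yield {"type": "crime", "headline": "Burglary ring broken up",
           "published": datetime(2011, 3, 7, 9, 0, tzinfo=timezone.utc)}

def sports_feed():
    yield {"type": "sports", "headline": "Rays rally in the ninth",
           "published": datetime(2011, 3, 7, 22, 30, tzinfo=timezone.utc)}

def front_page(*feeds, limit=20):
    """Merge the confederation's feeds into one reverse-chronological front page.

    Each purpose-built system keeps its own schema and features; the home
    page only needs the handful of fields they all agree to expose.
    """
    items = chain.from_iterable(feed() for feed in feeds)
    return sorted(items, key=lambda i: i["published"], reverse=True)[:limit]

for item in front_page(crime_feed, sports_feed):
    print(item["published"].date(), item["type"], item["headline"])
```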

Each area of the site can evolve on its own, given changes in available data, technology, or staff. It’s the complete destruction and rebuilding of every piece of the workflow. Everyone’s job would change when it came to producing the news.

Crazy, you say? Probably. My developer friends and readers with IT backgrounds are spitting their coffee out right now. But is it any more crazy than continuing to use a print-production approach on the web? I don’t think it is. It is the equal and opposite reaction: little innovation at the core vs. a complete custom rebuilding of it. Frankly, I believe neither is sustainable, but only one continues at mass scale. And I believe it’s the wrong one.

While I was at the St. Petersburg Times, we took this approach of rebuilding the core from scratch with PolitiFact. We built it from the ground up, augmenting the story form with database relationships to people, topics, and rulings (among others). We added transparency by making the listing of sources a required part of an item. We took the atomic parts of a fact-check story and we built a new molecule with them. And with that molecule, we built a national audience for a regional newspaper and won a Pulitzer Prize.
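In data terms, the "new molecule" might look something like the sketch below. The model is invented; the ruling labels approximate PolitiFact's public Truth-O-Meter categories, and the required-sources check mirrors the transparency rule Waite describes.

```python
from dataclasses import dataclass, field

# Labels roughly follow PolitiFact's public Truth-O-Meter scale; treat as illustrative.
RULINGS = {"True", "Mostly True", "Half True", "Mostly False", "False", "Pants on Fire"}

@dataclass
class FactCheck:
    claim: str
    speaker: str                      # would be a relationship to a person record
    topics: list[str]                 # likewise, relationships to topic records
    ruling: str
    sources: list[str] = field(default_factory=list)

    def __post_init__(self):
        if self.ruling not in RULINGS:
            raise ValueError(f"unknown ruling: {self.ruling}")
        if not self.sources:
            # Transparency rule: an item cannot be published without its source list.
            raise ValueError("a fact-check needs at least one listed source")
```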

Not bad for a bunch of print journalists experimenting with the story form on the web.

I would be lying if I said that I wasn’t disappointed that PolitiFact’s success didn’t unleash a torrent of programmers and journalists and journalist/programmers hacking away on new story forms. It hasn’t and I am.

But I’m not about to blame programmers in the newsroom. Many that I talk to are excited to experiment in any way they can with journalism and the web. The enemy is what we cling to. And it’s time to let go.

March 04 2011

19:22

Drupal Now Accessible Via Mobile Phone


MIT's Center for Future Civic Media has built a variety of breakthrough civic systems that work over the phone. Examples range from Leo Burd's What's Up platform to the Call4Action class and its cool student projects.

We at C4 love these projects, but working with phones has always been a bear. A lot of programming is necessary, and in many cases people start with the phone and end up building custom infrastructure that begins to resemble an actual content management system. Projects like Ushahidi or our earlier txtMob are really just simple CMSs with a few custom features for texting inputs.

So Leo Burd has been working on making the Drupal CMS friendlier for the billions of people around the world who only have access to basic telephony rather than smartphones and the web. Leo is launching the first release of the VoIP (voice-over-IP) Drupal platform at DrupalCon next week.

Here's what Leo wrote about this exciting project:

VoIP Drupal is an innovative framework that brings the power of voice and Internet-telephony to Drupal sites.

VoIP Drupal can be used to build hybrid applications that combine regular touchtone phones, web, SMS, Twitter, IM and other communication tools in a variety of ways, including:

* Click-to-call functions
* Voice- and SMS-based Get Out The Vote campaigns
* 2-1-1 and 3-1-1 lines
* Phone-based community surveys
* PTA reminders
* Story recording / playback
* Group voicemail
* Geo-based callblasts aimed at specific streets or locations
* And much more!

In technical terms, the goal of VoIP Drupal is to provide a common API and scripting system that interoperate with popular Internet-telephony servers (Asterisk, FreeSwitch, Tropo, Twilio, etc.), dramatically reducing the learning and development costs associated with building communication systems that combine voice and text technologies.

The following VoIP servers are currently supported:

* Tropo, through the voiptropo.module (available soon)
* Twilio, through the voiptwilio.module

This project is under continuous development. If you would like to get involved in the project or ask questions, discussion is taking place on the VoIP Drupal Group. You can find more information in the VoIP Drupal Handbook.

The VoIP Drupal platform was originally conceived and implemented by C4, with major contributions from Civic Actions.
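To make the "common API over multiple telephony servers" idea concrete, here is a hedged sketch in Python. VoIP Drupal itself is a Drupal (PHP) module, so none of the class or method names below are its real API; they are invented purely to illustrate the abstraction.

```python
from abc import ABC, abstractmethod

# Illustrative abstraction only: one scripting interface, multiple backends.
class VoipBackend(ABC):
    """What every telephony provider adapter (Twilio, Tropo, ...) would implement."""
    @abstractmethod
    def say(self, call_id: str, text: str) -> None: ...
    @abstractmethod
    def send_sms(self, to: str, text: str) -> None: ...

class FakeBackend(VoipBackend):
    # Stand-in that just logs; a real adapter would call a provider's REST API.
    def say(self, call_id, text):
        print(f"[call {call_id}] TTS: {text}")
    def send_sms(self, to, text):
        print(f"[sms -> {to}] {text}")

def run_survey(backend: VoipBackend, call_id: str, question: str, answer: str):
    """A tiny 'script' that works against any backend implementing the interface."""
    backend.say(call_id, question)
    backend.send_sms("+15551234567", f"Survey answer recorded: {answer}")

run_survey(FakeBackend(), "abc123", "Press 1 if the streetlight is out.", "1")
```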



December 22 2010

22:47

New CMS.. any good?

There's some buzz out there about this new CMS: http://www.gethifi.com/

Has anybody heard anything about it? Can the more technically inclined glean anything interesting about it?

Tags: cms hifi

December 16 2010

10:14

WordPress vs Drupal: A great discussion on CMS is unfolding

Two days ago, I wrote a post on the ICT-KM blog on choosing an open source CMS and the simple, practical way I went about evaluating different CMS solutions.

What has emerged in the post's comments is a discussion (and even a bit of debate) among IT managers and information and knowledge managers.

Join the discussion on WordPress vs Drupal: Choosing an Open Source CMS!

November 10 2010

18:30

Talking Points Memo’s first developer talks startup life, jumping to ProPublica and data journalism

What’s it like being the only in-house techie at a news startup? Talking Points Memo’s first developer Al Shaw says “it’s kind of like being a reporter… you have to be a generalist,” doing everything from ad-side work to election-night interactives.

Shaw was the primary technical force behind most of the bells and whistles that cropped up at TPM over the past two years, including a redesign that lets producers switch up the layout of the homepage, and an array of slick interactives like the real-time election results tracker that made TPM look a lot less like a scrappy startup and more like an establishment outlet on Election Night earlier this month. (Shaw is quick to explain he had some help on the election map from Erik Hinton, TPM’s technical fellow.) He’s also been good about blogging about his technical endeavors in ways that could be useful to his peers at other news organizations.

Shaw announced last month he is leaving TPM to start a new gig at ProPublica, where he’ll keep working on data-driven journalism. On one of his few days off between jobs, I talked with him about what it’s like working for a news startup, what he hopes to accomplish at ProPublica, and where he thinks data journalism is headed. Below is a lightly edited transcript. (Disclosure: I used to work at TPM, before Al started there.)

Laura K. McGann: How did you approach your job at TPM? What did you see as your mission there?

Al Shaw: When I started, I came on as an intern right before the ’08 election. At that point, they didn’t have anyone in house who really knew much about programming or design or software. I came on and I saw an opportunity there because TPM is such a breaking-news site, and their whole goal is to do stuff really fast, that they needed someone to do that, but on the technology side, too.

I had a big role in how we covered the 2008 election. We became able to shift the homepage, rearrange stuff. Being able to really elevate what you can do in blogging software. That was kind of the first foray. Then I started redesigning some of the other sections. But the biggest impact I had was redesigning the homepage. That was about a year ago. I had the same goal of being able to empower the editors and nontechnical types to have a bigger palette of what they can do on the site. I created this kind of meta-CMS on top of the CMS that allowed them to rearrange where columns were and make different sections bigger and smaller without having to get into the code. That really changed the way the homepage works.

There is still Movable Type at the core, but there’s a lot of stuff built up around the sides. When we started to build bigger apps, like the Poll Tracker and election apps, we kind of moved off Movable Type all together and started building in Ruby on Rails and Sinatra. They’re hosted on Amazon EC2, which is a cloud provider.
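To give a rough sense of what a "meta-CMS" layout layer like the one Shaw describes can look like, here is an illustrative sketch. The document structure and module names are invented for this example and are not TPM's actual system.

```python
# Illustrative only: editors edit a small layout document instead of templates.
# The rendering layer reads it and decides column order and module sizes.
layout = {
    "columns": [
        {"slot": "lead",   "width": "wide",   "modules": ["top-story", "livewire"]},
        {"slot": "middle", "width": "normal", "modules": ["features", "polltracker"]},
        {"slot": "rail",   "width": "narrow", "modules": ["blogs", "most-read"]},
    ]
}

def render_home(layout_doc):
    """Walk the layout document in order; real code would render templates here."""
    for column in layout_doc["columns"]:
        print(f"== {column['slot']} ({column['width']}) ==")
        for module in column["modules"]:
            print(f"  render module: {module}")

render_home(layout)
```

Because the layout is just a document, nontechnical editors can reorder columns or resize sections without touching template code.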

LKM: What have you built that you’re the most proud of?

AS: Probably the Poll Tracker. It was my first project in Rails. It just had enormous success; it now has 14,000 polls in it. Daily Kos and Andrew Sullivan were using it regularly to embed examples of races they wanted to follow, and it has become a central part of TPM and the biggest poll aggregator on the web. I worked with an amazing Flash developer, Michiko Swiggs, who did the visual parts of the graph in Flash. I think a lot of it was really new in the way you could manipulate the graph: if you wanted to take out certain pollsters, certain candidates, or certain methods, like telephone or Internet, you could see the way the trend lines move. You can embed those custom versions.
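A hedged sketch of that kind of filtering, assuming a simple list of poll records; the fields and the moving-average smoothing are illustrative choices, not Poll Tracker's actual method.

```python
from statistics import mean

# Illustrative poll records; a real aggregator would load thousands of these.
polls = [
    {"candidate": "Smith", "pollster": "Acme Research", "method": "telephone", "value": 48, "day": 1},
    {"candidate": "Smith", "pollster": "Webpoll Inc.",  "method": "internet",  "value": 44, "day": 2},
    {"candidate": "Smith", "pollster": "Acme Research", "method": "telephone", "value": 50, "day": 3},
    {"candidate": "Smith", "pollster": "Acme Research", "method": "telephone", "value": 47, "day": 4},
]

def trend(polls, candidate, exclude_pollsters=(), methods=None, window=3):
    """Filter polls the way a reader might (drop a pollster, keep only phone polls),
    then return a simple moving average as the trend line."""
    kept = [
        p for p in sorted(polls, key=lambda p: p["day"])
        if p["candidate"] == candidate
        and p["pollster"] not in exclude_pollsters
        and (methods is None or p["method"] in methods)
    ]
    values = [p["value"] for p in kept]
    return [mean(values[max(0, i - window + 1): i + 1]) for i in range(len(values))]

print(trend(polls, "Smith", exclude_pollsters=("Webpoll Inc.",), methods={"telephone"}))
```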

I think the election tool was a huge success too, both technologically and on the design and journalism side. We got linked to from Daring Fireball. We also got linked to from ReadWriteWeb and a lot of the more newsy sites. Andrew Sullivan said it was the best place to watch the elections. Because we took that leap and said we're not going to use Flash, we got a lot of attention from the technology community. And we got a lot of attention from the more political community because of how usable and engaging the site was. It was kind of a double whammy.

LKM: What was your experience working with reporters in the newsroom? TPM is turning ten years old, but it’s still got more of a startup feel than a traditional newspaper.

AS: It’s definitely a startup. I would fade in and out of the newsroom. Sometimes I’d be working on infrastructure projects that dealt with the greater site design or something with the ad side, or something beyond the day-to-day news. But then I’d work with the reporters and editors quite a bit when there was a special project that involved breaking news.

So for example, for the Colbert-Stewart rallies we put up a special Twitter wire: our reporters would go out to the rallies and send in tweets, the tweets would get piped into a special wire, and they'd go right onto the homepage. I worked with editors on how that wire should feel, how it should work, and how reporters should interact with it. I remember one concern was: what if someone accidentally tweets something and it ends up on the homepage? How do we delete that? I came up with a system of command hashtags, so a reporter could send in a tweet with a special code on it that would delete a certain tweet, and no one else would know about it except the reporter.
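A minimal sketch of that kind of command-hashtag convention; the tag format, the trusted-author check, and the deletion logic are invented for illustration, not TPM's actual syntax.

```python
import re

# Illustrative convention: a tweet containing "#del:<tweet_id>" from a trusted
# reporter removes that tweet from the wire instead of being published.
COMMAND = re.compile(r"#del:(\d+)")
wire = {101: "Crowd estimate revised upward", 102: "Oops, wrong rally"}
TRUSTED_AUTHORS = frozenset({"reporter_a", "reporter_b"})

def handle_incoming(author, text):
    match = COMMAND.search(text)
    if match and author in TRUSTED_AUTHORS:
        wire.pop(int(match.group(1)), None)   # delete quietly, publish nothing
        return None
    next_id = max(wire, default=100) + 1
    wire[next_id] = text                      # normal tweet: publish to the wire
    return next_id

handle_incoming("reporter_a", "#del:102")
print(wire)   # {101: 'Crowd estimate revised upward'}
```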

A lot of the job was figuring out what reporters and editors wanted to do and figuring out how to enable that with the technology we had and with the resources we had.

LKM: I remember an instance in my old newsroom where we had a tweet go up on the front page of another site and the frantic emails trying to get it taken down.

AS: Twitter is such an interesting medium because it’s so immediate, but it’s also permanent. We’re having a lot of fun with it, but we’re still learning how best to do it. We did this thing called multi-wire during the midterms, which was a combination of tweets and blog posts in one stream. There was a lot of experimentation with: When do we tweet as compared to a blog post? Should we restrict it to certain hours? That was a really interesting experiment.

LKM: What emerging trends do you see going on in data-driven or interactive journalism?

AS: It's really good that a lot of sites are starting to experiment more with data-driven journalism, especially as web frameworks and cheap cloud hosting become more prevalent; once you learn Rails or Django, it's really easy to get a site up that's built around data you can collect. I do see two kind of disturbing trends that are also happening. One is the rise of infographics. They may not be as useful as they are pretty. You see that a lot, just all over the place now. The other problem is the complete opposite of that, where you'll get just a table of data filling up your whole screen. The solution is somewhere in between, where you have a better way of getting into the data.

It’s really great that there’s kind of a community forming around people that are both journalists and programmers. There’s this great group called Hacks/Hackers that brings those two cohorts together and lets them learn from each other.

LKM: How about at ProPublica? You mentioned you aren’t sure entirely what you’re going to do, but broadly, what do you hope to accomplish there?

AS: I'm most excited about working more closely with journalists on data sets, finding the best ways of presenting those and turning them into applications. That was one thing I was able to do with Poll Tracker, but it didn't seem like TPM had as big a commitment to individual stories that could have side applications; Poll Tracker was more of a long-running project. ProPublica is really into delving deeply into one subject and finding data that can be turned into an application, so the story isn't just a block of text; there's another way of getting at it.

One of the other things they’re working on is more tools for crowdsourcing and cultivating sources. I know that they want to start building an app or a series of apps around that. And they’re doing some cool stuff with Amazon Mechanical Turk for kind of normalizing and collecting data. I’m sure there’s going to be a lot more fun stuff to do like that.

August 05 2010

14:04

Open Source CMS: A Net2Camb Event Wrap-up from Will Hall

Besides my role with NetSquared globally, I also organize a monthly NetSquared event locally, in Cambridge, UK. The July Net2Camb event was led by Will Hall, a PHP web developer and open source enthusiast. He discussed the options, benefits, and risks associated with using open source content management systems for SMEs, charities and NGOs.

Will has kindly written a wrap-up of the event to share with you, and included his presentation slides for your reference:


July 28 2010

11:36

BBC News redesign architect gets technical about changes

If you are more interested in the cogs and wheels behind the BBC News site’s redesign than the end product, a post by their chief technical architect John O’Donovan this week should be of interest.

The BBC has one of the oldest and largest websites on the internet and one of the goals of the update to the News site was to also update some of the core systems that manage content for all our interactive services.

O’Donovan first outlines the reasoning behind keeping with a Content Production System (CPS), rather than moving over to Content Management System (CMS), before giving a detailed look at the latest model – version 6 – that they have opted for.

The CPS has been constantly evolving and we should say that, when looking at the requirements for the new news site and other services, we did consider whether we should take a trip to the Content Management System (CMS) Showroom and see what shiny new wheels we could get.

However there is an interesting thing about the CPS – most of our users (of which there are over 1,200) think it does a pretty good job [checks inbox for complaints]. Now I’m not saying they have a picture of it next to their kids on the mantelpiece at home, but compared to my experience with many organisations and their CMS, that is something to value highly.

The main improvements afforded by the new version, according to O’Donovan, include a more structured approach, an improved technical quality of content produced and an ability to use semantic data to define content and improve layouts.

See his full post here…



June 17 2010

09:27

E&P: Media companies in three countries now using controversial Atex system

Editor & Publisher this morning reports that a total of six media organisations, across three countries, have transferred to the controversial Atex content management system in recent weeks.

The CMS is now being used by the Calgary Herald, part of Canwest in Canada; military news source Stars and Stripes in Washington; Erdee Media Groep in The Netherlands; The Sun in Arizona; The Providence in Rhode Island and The Star-Ledger in New Jersey.

Just this week Journalism.co.uk reported that the NUJ had raised strong concerns over Johnston Press’s move to Atex, which it claimed “undermines the editorial independence of editors”.

Full story at this link…




June 16 2010

11:50

Johnston Press Atex system is bad news, but the death of the sub-editor is inevitable

It’s not just journalists that threaten to go on strike to maintain the standards of their work – but surely no other occupation’s products can be judged so subjectively. One managing director’s “quality journalism” is a reporter’s incitement to take up arms and storm the parent company’s HQ.

According to the National Union of Journalists, it’s this urge that saw Johnston Press journalists vote for group-wide industrial action last month (they were thwarted by a High Court challenge; a re-ballot is underway). JP journalists are enraged that a new publishing strategy, based on an online/print content management system (CMS) called Atex, will make reporters responsible for subbing and editing their own newspaper stories using pre-made templates. Several companies including Archant are either using or considering using the same system.

The NUJ has a point: with fewer staff and fewer checks and balances, more errors will get through – this aberration of a front page in the JP-owned Bedfordshire Times & Citizen recently is a classic example.

Yesterday I questioned exactly why the union was opposing Atex; included in the union’s grievances were baffling and unexplained “health and safety” concerns. The union later told Journalism.co.uk that they meant that it adds to staff stress levels.

But, I went on in conversations both online and privately, isn’t this part of a wider problem? The NUJ has a fundamental belief that sub-editors should sub stories and reporters should write them. Like the pre-Wapping in-house printers who jealously guarded their very specific, outdated roles, the union’s ideal outcome is to maintain the status quo and protect jobs.

The reality isn’t quite that simple. Atex, as more than one person said, is far from the innovative answer that newspapers need. One person with knowledge of how Atex works, who works for a company that is planning to implement it and asked not to be named, put it to me like this:

We’re still in transition in my newsroom at the moment – we haven’t switched to using it for the web yet. However, if the system goes ahead as planned we will not be able to insert in-line links into stories, nor will we be able to embed content from anywhere else online. It’s possible to build link boxes that sit next to web stories, but it’s time consuming compared to in-line links – and if our current CMS is anything to go by, in the press of a busy newsroom, it won’t get done.

That sounds like a retrograde step. Far from holding back innovation, it sounds like JP journalists are right to oppose the move. This is from a company whose former chairman of nine years, Roger Parry, last week criticised the very board that he chaired for not investing enough in digital media (via Press Gazette). Exactly who else is there to blame?

But it gets worse:

For those of us who possess data skills and want to make mashups, visualisations and so on, this is a massive inhibition – even if we find the time to innovate or create something really special for our papers, we’ll have no outlet for it. It also means we can’t source video or images for our stories in innovative ways – no YouTube embeds or Flickr slideshows – cutting us off from huge resources that could save time, energy and money while enhancing our web offering.

It’s astonishing that we’re even considering such a backwards step to a presumably costly proprietary system when so many cheaper, more flexible, open source solutions exist for the web.

Regional reporters, web editors and even overall editors will read that and find this frustration of digital ideas by technical and budgetary limitations very familiar. The last point rings loudest of all: cheap, dynamic blogging solutions like WordPress and TypePad provide everything newsrooms need to create a respectable news site. Publishing executives seem to find it hard to believe that something free to use can be any good, but just look at what’s coming in the beta of WordPress 3.0 (via @CasJam on Mashable).

So the union’s misgivings in this case appear to be well placed. The drop in quality from Johnston’s cost-cutting is there for all to see in horrendous subbing errors, thinner editions and entire towns going without proper coverage.

Unfortunately, journalists have to accept that no amount of striking is going to bring back the staff that have gone and that times have changed. Carolyn McCall’s parting shot as CEO of Guardian Media Group was to repeat her prediction (via FT.com) that advertising revenues will never return to pre-recession levels – and don’t forget Claire Enders’ laugh-a-minute performance at the House of Commons media select committee, in which she predicted the death of half the country’s 1,300 local and regional titles in the next five years.

Regional publishers may not all have a solution that combines online editorial innovation with a digital business model right now. But to get to that point, reporters will have to cooperate and accept that their roles have changed forever – “sub-editor” may be a term journalists joining the industry in five years will never hear.
