
April 02 2013

14:58

Tuesday Q&A: Storify’s Burt Herman on entrepreneurial journalism, advertising, and finding the right business model

When you run a startup that leans on journalism, the hunt for a stable business model is top of mind. Burt Herman, cofounder of Storify, said he feels an urgency to find ways to monetize the service, which helps individuals and publishers collect and curate social media into stories. That’s in part because Storify is now three years old, but also because Herman has more than a decade of experience as a journalist working for the Associated Press — meaning he’s seen the disruption of the media business up close.

Last week, his company took its first step toward a business model: Storify announced Storify VIP, a paid version of the service that offers a new tier of features and customization. The VIP program is designed with big publishers — who have an army of journalists and money to spend — in mind. The BBC has already signed up.

I spoke with Herman about the decision to create a premium version of Storify, how the company might explore advertising, and where he sees entrepreneurial journalism going this year. Here’s a lightly edited transcript of our conversation.

Justin Ellis: When you were looking at ways to monetize, were there other models or options you looked at before deciding on the premium tier?
Burt Herman: We are looking at all potential business models. There basically are two models we see as ones we could use. There’s some kind of subscription or a freemium/pro/VIP plan where we ask some of our users what they would like and offer these premium features. We’re quite fortunate in that we have users who are large publishers and brands and PR agencies, political organizations, NGOs, and all kinds of people like that. They’re interested in these features and have come to us asking for some of these things. That’s a clear way we can now give them something better that they want, and also make sure this is something sustainable.

On the other side, there’s definitely an advertising model we’ve talked about. And it’s still something we kind of have out there for the future. The idea there is to come up with a native form of ad that goes in a Storify story — that is a social ad, like other things in Storify stories. It could be a promoted quote, or a promoted video, or a promoted photo from a brand that is trying to get a certain message out there.

That’s still something we’re talking about. But that requires a larger scale, and being able to sell a specific new form of advertising. But if we do that, we’d also want to do it in a way that works together with our users, and share revenue back with the people creating the stories. That’s really the most valuable thing, and we’re really lucky we’ve gathered this community of amazing people who, every day, find the best of what’s out there.

Ellis: One of the questions with advertising would be who controls what ads are served — if companies or brands go through Storify, or if they go through publishers directly.
Herman: Yeah, and we could do it both ways. The thing we look at is YouTube — how they have embeds all over the web, and sometimes have advertising in those as well. We would obviously want to work with our users on that — who their advertisers are, whether it conflicts with other ads on the page, and other issues.

We do think there is room for this new form of advertising. We’ve talked about different ways of doing this: It could be more like we promote content to the user creating a story, and whether they want to put that in the story is their own decision. But it’s very clear that’s promoted in some way — that someone is paying to get in front of the eyes of our valuable user base. That is something we have experimented with a little bit, and it is quite an interesting model to look at — not advertising to the masses but advertising to this more elite user base.

Ellis: You’ve said you have more than 600,000 people using Storify now. How did you think about what types of features you would bring up to the premium level? Ideally, you want to create added value in the service but not take away from the things other non-paying users want.
Herman: Well, a lot of these things are things people have asked for, like customization. We’ve offered some things and seen what people do with them, and had some people use it for different events, including The New York Times, Yahoo, the BBC. They’re already doing these things, so we’re responding to what they say.

We didn’t intend to be a live-blogging platform, but people have been using us in that way, which is great. So we want to serve that need too. That’s something that can be quite expensive, to service live updates on embeds that are being viewed hundreds of thousands or billions of times around the web. That’s a pretty technically intensive thing, so just to make it sustainable for us, that’s why we’re putting that in the premium tier of features.

Ellis: What’s been most surprising to you about the ways people have found to use Storify? That idea of using Storify for live-blogging seems like a MacGyvering in a way.
Herman: We did think about live stories, in a way, from the start. I worked at the AP for 12 years and that’s what I did all the time — take stories, update them whenever news comes in, move things around, take out quotes, add new quotes. That’s always what we’ve done.

But it’s the story in place that gets changed, which I’d still be interested in seeing people think about more. Newspapers do that, but they just don’t show you that they’re doing it. Or the next day, they’ll just post a new story, because they’re still in this daily cycle. But what if the story itself was just in one place and kept changing over time as developments occur? I think that’s the idea we had originally.

I thought, initially, journalists will use this and see, “Oh, the Supreme Court is hearing the gay marriage case,” and just see what people are saying in general and mine the best — look for who’s reacting, and kind of pull things in. The thing I did not expect to see, which people have used Storify for, is to say, “Hey, we’re just starting this story, send us what you think about it and use this hashtag on Instagram, on Twitter, respond to us on Facebook, we’ll take the best things you do and put them in a story and publish it.” It’s much more of an engaging way of creating a story — where it’s not just gathering reaction, but tell us what we should put in the story, we’re going to include what our audience is doing.

The New York Times has done some really interesting things with Instagram — like during storms, the big winter storm in February, or Fashion Week in New York, asking their readers, “Hey, send photos on Instagram, tag them #NYTfashionweek and we’ll put the best ones on The New York Times.” I think it’s really cool to see journalists getting this idea that yes, this is not just a one-way thing anymore — we don’t just decide what we write and call the people we want and put it out there. Now it’s really working collaboratively with the audience to create something bigger.

Ellis: As a journalist, what’s it been like for you to watch news organizations embrace new ways to create stories?

Herman: When I talk about this, I say it’s really like what journalists have always done. We’ve always taken bits of information, whether it’s a press release, or a federal budget, or your notes, or your audio, and pieced it together to tell a story. Now we just have so many more sources potentially to mine for our stories. So many more voices of people that you can include, that you might not have otherwise heard from. I think this is something more news organizations are realizing, and I think it’s a great way to be relevant with your audience again — “Hey, we hear you, we are listening to what you say.”

How can you not want to do this? As a journalist, I was always wanting to know what are people talking about, what are the stories that I’m missing that are out there. Now you can see what people are talking about, at least a segment of people, using social media. That’s a large group of people, and growing all the time. I just think: How could you not embrace that and look at that if you’re a journalist who wants to get the stories that are out there?

Ellis: Storify also gives tweets and other social media a little more permanence. If I’m following a hashtag on the vice presidential debate, I could theoretically go back and read through it, but it’s happening so fast. You guys capture that.
Herman: We picked the name Storify because it was this word used at the AP when editors would tell you to write a story about something, to “storify” it. It really is a word that means “to make a story.” But also, sometimes people see it more here in Silicon Valley and they think “Storify, oh, you’re like a storage company.” Which, in some ways that is true too. That is a lot of what people actually use Storify for in a way we didn’t foresee: simply being able to stop time and save some stuff from this never-ending deluge of tweets and photos and videos from all these social networks. Just being able to pause, take those out, and organize them in one place is kind of valuable.

There’s not a simple way to do that and just make it look nice, or to keep it for yourself or a smaller group. That’s another reason why we’re planning to launch things now like this private story feature. We noticed people simply saving stuff without adding any text in a story, or just saving drafts and never publishing stories because they wanted to keep it somewhere and refer to something, or show it to somebody.

We’re just inundated by all this media now. Everybody has the power to create things and publish easily, instantly, all around the world. It’s great, but it’s getting harder and harder to figure out where the valuable stuff is in all of that.

Ellis: What trends do you see in Storify usage? In terms of people gearing up for big events or big stories?
Herman: We are very aligned with what you would think of as peaks on Twitter or social media, of people talking about things. Definitely the election, the Supreme Court hearing the same sex marriage cases. Certain topics are very resonant on social media and obviously for us too, those are peak things, and that seems to be when people think to use us.

We hope that people also think to use us in other cases when it isn’t just mining what’s out there when it’s a huge event — a smaller, local scale, or asking the audience to help find stories. We’re seeing more of that. That’s also why we wanted to move in this direction we’re launching, to work more closely with people and be more embedded in their organizations too, so it’s not just the social media editor who says, “Hey, there’s this Supreme Court thing — can you get a reaction thing on the blog?”

Ellis: The premium service represents a focus on establishing a business model. For some startups, finding a business model is a “further down the road” idea. How pressing has it been for you to monetize Storify?

Herman: I think there’s been a shift out here in Silicon Valley in terms of thinking about startups and business models. They just had the recent class of Y Combinator, and The Wall Street Journal wrote a post saying none of the companies are doing social media, they’re businesses, which have a built-in business model where you pay somebody for something.

I think it’s definitely kind of shifted here; people are wanting to see the business model in what you’re doing. Unless you have massive, massive scale, you have to have a business model. We are lucky that the users we have, more than 600,000 people, are amazing, high-level users. That’s why, as we look at that, we say, “Okay, let’s figure out how we can make this more sustainable and work with them and hopefully help give them some of the things they want. But also make sure we can survive into the future.”

People seem to understand that now. People have grown a little skeptical of companies that don’t seem to have a business model and you wonder when they’re going to do something. So far, the reaction has been hugely positive — I think people understand why we’re doing this.

Ellis: Do you think there needs to be more support for startups that are in this kind of journalism or journalism-adjacent area like Storify? I’m thinking about something like Matter, which is sort of a combination of the Knight News Challenge and Y Combinator.
Herman: I was just at Matter earlier this week talking to the companies there. They’re doing it in a smart way. They are saying yes, it should be related to media, but you can do something that has broader relevance. It can be for-profit, it doesn’t have to be nonprofit just because it’s sort of connected with public radio. I think if you make it too narrow — just for journalism — then you might have a problem in terms of thinking really big. When you’re doing a startup, you should be thinking as big as possible. I guess it would be difficult to limit things — it’s better not to impose that on startups from the start.

We do need things related to media, but I think people will go there. It is still a huge business — billions of dollars are spent on advertising on the web, and even in print still. Startups will go there. I think there are a lot of incubators, Matter and other people, who are focusing on that.

I guess I’m worried that when you support things and force them to be nonprofit or open source, which some of the Knight News Challenge grants did earlier, that it limits the potential of some of these organizations. I love Spot.us, and Dave Cohn is a great guy, and I always think of it as he had the idea for Kickstarter before it existed. But it was limited because it had to be open source and nonprofit and only in a local area. There were all these constrictions on how he was supposed to operate. He had some success, but what could have been if he wasn’t limited in that way? I just think any of these new things should not limit people and Matter is definitely not doing that.

Ellis: Now that you’ve reached this point with Storify, is there something you know now you wish you knew when you were launching?
Herman: I guess I would say it’s different than being a journalist. Things take much longer than you would think, even though people say startups are very fast-paced — oftentimes technology is slow and has debugging issues. Getting a process for people to work together is not an easy thing because you’re not really sure how to do things, because you’re inventing them for the first time. Be patient and realize that this is a longer journey and not a sprint. You get fooled sometimes reading these supposed overnight success stories. But when you look into them, oftentimes it’s somebody who’s been going back for years trying to work their way through different products and pivots, and finally figuring out something that people notice. Really, if it was an overnight success, it was built over years.

July 30 2012

20:41

Rafat Ali on building a media company on top of public data

Ten years ago, Rafat Ali wanted to build a company that could chronicle the transformation of media and technology. Now he hopes to do it again, this time in the world of travel. His new project Skift sounds at first a lot like paidContent: a mix of original reporting and aggregation tailored for a savvy, niche, information-hungry audience.

But this time around, Ali is placing his bet less on a stable of journalists and more on a team of product designers, developers, and, yes, journalists. Skift wants to be a media company in the same way Politico or Bloomberg is a media company: an information provider with a news wrapper. Skift bills itself as a “travel intelligence media company,” not a standalone news site.

Ali told me the way Skift will grow its audience and its fortunes will be through information services, not just news. “What we’re trying to do with Skift is scale quickly on content,” he said. “The more fortunate part for us is everybody covers travel in bits and pieces, so for us it’s a matter of bringing it all together.”

But Skift is more than just a curator of travel news. Ali and cofounder Jason Clampet want to collect a vast data library to build tools that would be useful not just to the travel industry but to anyone hopping in a car, train, or bus to get away. Ali has funded the site himself up to launch and has raised $500,000 from investors like former Wall Street Journal publisher Gordon Crovitz and former MySpace president Jason Hirschhorn. As designed, Skift — the Swedish word for “shift” — would be a media company built on a stable of products, not just content. “We’re starting with what we’re grandly sort of calling the world’s largest data warehouse of publicly available travel industry data,” he said.

That means things like visit and occupancy information that tourism boards report to the government, departures and delay information from airports, and flight data supplied by airlines to agencies like the FAA. It’s the type of information typically hidden away in Excel spreadsheets on seldom-visited agency websites. “We’re gonna try and collect it, clean it, normalize it, put it in a dashboard that humans can understand, and then build services on top,” he said. He said they will also create APIs for the travel data they harness on the site.

Skift has a staff of four, including Ali, and they’ll be announcing the hire of a product development person soon. Ali stresses that as Skift grows they will hire more writers — but the writers will be focused on original reporting, not the things aggregation and curation can pick up more easily. Ali said curation is still an undervalued asset that can prove useful to content creators as well as their audiences. The day-to-day news of airlines’ fuel prices or the ebb and flow of tourism can be aggregated from elsewhere. Ali wants the site to chase the big stories, the airline bankruptcies and innovations in travel tech. “We’ll not get there in the next year, but we’ll get there in due time,” he said.

Ali has been around the media game for a while, having sold paidContent to The Guardian in 2008 and left the company in 2010. He’s gained an insight into how a media business can stay viable today. Focusing on a niche audience is one method of doing that, especially if that audience is highly engaged and willing to spend. Business travelers and travel industry executives are just such an attractive bunch. “We look at business travelers, professional travelers a bit like tech fanboys, where they like to consider themselves like experts in what they’re doing,” he said.

Ali said it’s not enough to simply provide people with news — it has to be valuable or actionable information. It also helps if you can package multiple resources together. In the travel industry, businesses are divided into areas like editorial (travel guides), transactional (booking flights and hotels), and organizational (plan your trip, track your flight). But there’s a fair amount of randomness that goes along with that. You may look up art museums through Frommers, find your flight through Hipmunk, and use GateGuru to navigate airports. “With Skift, on the business traveler side, we’re trying to take the randomness out of the equation and make a more directed way of delivering information,” he said.

Also in the long-term plans for Skift: a membership or subscription service. Ali believes the possibility of better data tools for travel is a step in that direction. But another option would be to create events, something paidContent has had success with. Ali said they plan to hold one major event, a travel analog to All Things D’s D conference, which would appeal to travel industry executives, travelers, and technologists. “We’re trying to build a crossover media brand, a new kind of media company where the underpinning will be data, and then media as the layer on top of it,” he said.

Image of Rafat Ali by Brian Solis used under a Creative Commons license.

July 25 2012

15:12

Who should see what when? Three principles for personalized news

I really don’t know how a news editor should choose what stories to put in front of people, because I don’t think it’s possible to cram the entire world into headlines. The publisher of a major international newspaper once told me that he delivers “the five or six things I absolutely have to know this morning.” But there was always a fundamental problem with that idea, which the Internet has made starkly obvious: There is far more that matters than any one of us can follow. In most cases, the limiting factor in journalism is not what was reported but the attention we can pay to it.

Yet we still need news. Something’s got to give. So what if we abandon the idea that everyone sees the same stories? That was a pre-Internet technological limitation, and maybe we’ve let what was possible become what is right. I want to recognize that each person not only has unique interests, but is uniquely affected by larger events, and has a unique capacity to act.

If not every person sees the same news at the same time, then the question becomes: Who should see what when? It’s a hard question. It’s a question of editorial choice, of filter design, of what kind of civic discussion we will have. It’s the basic question we face as we embrace the network’s ability to deliver individually tailored information. I propose three simple answers. You should see a story if:

  1. You specifically go looking for it.
  2. It affects you or any of your communities.
  3. There is something you might be able to do about it.

Interest, effects, agency. These are three ways that a story might intersect with you, and they are reasons you might need to see it.
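The three principles can be read as a filter predicate: a story reaches a reader if any one of them applies. A minimal sketch in Python — the `Story` and `User` shapes are hypothetical, invented purely to illustrate the logic, not any real system’s data model:

```python
from dataclasses import dataclass, field

@dataclass
class Story:
    topic: str
    affected_communities: set = field(default_factory=set)  # publics the story touches
    possible_actions: set = field(default_factory=set)      # capabilities that could act on it

@dataclass
class User:
    searches: set = field(default_factory=set)      # interest: topics the user sought out
    communities: set = field(default_factory=set)   # effects: publics the user belongs to
    capabilities: set = field(default_factory=set)  # agency: ways the user can act

def should_see(user: User, story: Story) -> bool:
    """A story passes the filter if any of the three principles applies."""
    interest = story.topic in user.searches
    effects = bool(story.affected_communities & user.communities)
    agency = bool(story.possible_actions & user.capabilities)
    return interest or effects or agency
```

Note the disjunction: a story you never searched for still reaches you if it affects one of your communities, or if you are in a position to act on it.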

But turn them around and they say: if a story doesn’t interest me, doesn’t affect me, and there’s nothing I could do anyway, then I don’t need to see it. What about broadening our horizons? What about a shared view of unfolding history? The idea that we will each have an individualized view on the world can be somewhat unsettling, but insisting on a single news agenda has its own disadvantages. Before getting into detailed design principles for personalized news, I want to look at how bad the information overload problem actually is, and how we came to believe in mass media.

Too much that matters

A solid daily newspaper might run a couple hundred items per day, just barely readable from cover to cover. Meanwhile, The Associated Press produces about 15,000 original text stories every day (and syndicates many times that number) — far more than one person can consume. But the giants of journalism are dwarfed by the collaborative authorship of the Internet. There are currently 72 hours of video uploaded to YouTube every minute — the site now houses more video than was produced during the entire 20th century. There are 400 million tweets per day, meaning that if only one tweet in a million were worthwhile you could still spend your entire day on Twitter. There are several times more web pages than people in the world.
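The tweet figure implies a concrete time budget. A quick back-of-the-envelope sketch — the two minutes per tweet is my assumption, not the author’s:

```python
# 400 million tweets per day, of which one in a million is worthwhile.
tweets_per_day = 400_000_000
worthwhile = tweets_per_day // 1_000_000   # -> 400 worthwhile tweets per day
minutes_per_tweet = 2                      # assumed: reading plus any linked material
hours_needed = worthwhile * minutes_per_tweet / 60
# 400 tweets at 2 minutes each is about 13.3 hours: effectively a whole waking day.
```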

All of this available information is a tiny fraction of everything that could be reported. It’s impossible to estimate what fraction of stories go “unreported,” because there is no way to count stories before they’re written; stories do not exist in nature. Yet from the point of view of the consumer, there is still far, far too much available. Ethan Zuckerman has argued that the limiting factor in foreign reporting is not journalistic resources, but the attention of the consumer. I suspect this applies to most other kinds of journalism as well; raise your hand if you’ve been carefully following what your city council is up to.

Compared to the news, there is simply very little attention available.

For the single-issue activist, the goal is attention at any cost. But editors have a different mission: They must choose from all issues. There is a huge number of potentially important stories, but only a tiny fraction can be “headlines.” Most stories must languish in obscurity, because you or I cannot hope to read a thousandth of the journalism produced each day. But even the flood of global journalism is a tragically narrow view on the world, compared to everything on the Internet.

How, then, should an editor choose what tiny part of the world to show us? Sometimes there is an event so massive, so universal, it demands attention. Natural disasters and revolutions come to mind. For all other stories, I don’t think there is an answer. We can’t even agree on what problems are important. No single set of headlines can faithfully represent all that matters in the world.

There is more than one public

The Internet is not like broadcast technology — print, radio, TV. But the routines and assumptions of journalism were formed under the technical constraints of the mass media era. I wonder if we have mistaken what was possible for what is desirable.

The first technical limitation I want to consider was this: Everyone had to see the same thing. This surely reinforced the seductive idea that there is only one “public.” It’s an especially seductive idea for those who have the ability to choose the message. But there’s something here for the rest of us too. There’s the idea that if you pay attention to the broadcast or read the daily paper, you’re informed. You know all there is to know — or at least everything that’s important, and everything everyone else knows. Whatever else it may be, this is a comforting idea.

Media theorists also love the idea of a unified public. Marshall McLuhan was enamored with the idea of the global village where the tribal drums of mass media informed all of us at the same time. Jürgen Habermas articulated the idea of the public sphere as the place where people could collectively discuss what mattered to them, but he doesn’t like the Internet, calling it “millions of fragmented chat rooms.”

But the idea of a unified public never really made sense. Who is “us”? A town? A political party? The “business community”? The whole world? It depends on the publication and the story, and a few 20th-century figures recognized this. In The Public and Its Problems, written in 1927, John Dewey provided an amazing definition of “a public”: a group of people united by an issue that affects them. In fact, for Dewey a public doesn’t really exist until something affects the group interest, such as a proposed law that might seriously affect the residents of a town.

We can update this definition a little bit and say that each person can belong to many different publics simultaneously. You can simultaneously be a student, Moroccan, gay, a mother, conservative, and an astronomer. These many identities won’t necessarily align with political boundaries, but each can be activated if threatened by external events. Such affiliations are fluid and overlapping, and in many cases, we can actually visualize the communities built around them.

The news isn’t just what’s new

There was another serious technical limitation of 20th-century media: There was no way to go back to what was reported before. You could look at yesterday’s paper if you hadn’t thrown it out, or even go to the library and look up last year on microfilm. Similarly, there were radio and television archives. But it was so hard to rewind that most people never did.

Each story was meant to be viewed only once, on the day of its publication or broadcast. The news media were not, and could not be, reference media. The emphasis was therefore on what was new, and journalists still speak of “advancing the story” and the “top” versus “context” or “background” material. This makes sense for a story you can never go back to, about a topic that you can’t look up. But somehow this limitation of the medium became enshrined, and journalism came to believe that only new events deserved attention, and that consuming small, daily, incremental updates is the best way to stay informed about the world.

It’s not. Piecemeal updates don’t work for complex stories. Wikipedia rapidly filled the explanatory gap, and the journalism profession is now rediscovering the explainer and figuring out how to give people the context they need to understand the news.

I want to go one step further and ask what happens if journalism frees itself from (only) giving people stories about “what just happened.” Whole worlds open up: We can talk about long-term issues, or keep something on the front page as long as it is still relevant, or decide not to deliver that hot story until the user is at a point where they might want to know. Journalism could be a reference guide to the present, not just a stream of real-time events.

Design principles for personalized news

If we let go of the idea of a single set of headlines for everyone based around current events, we get personalized news feeds which can address timescales longer than the breaking news cycle. Not everyone can afford to hire a personal editor, so we’ll need a combination of human curators, social media, and sophisticated filtering algorithms to make personalized feeds possible for everyone.

Yet the people working on news personalization systems have mostly been technologists who have viewed story selection as a sort of clickthrough-optimization problem. If we believe that news has a civic role — that it is something at least somewhat distinct from entertainment and has purposes other than making money — then we need more principled answers to the question of who should see what when. Here again are my three:

Interest. Anyone who wants to know should be able to know. From a product point of view, this translates into good search and subscription features. Search is particularly important because it makes it easy to satisfy your curiosity, closing the gap between wondering and knowing. But search has proven difficult for news organizations because it inverts the editorial process of story selection and timing, putting control entirely in the hands of users — who may not be looking for the latest breaking tidbit. Journalism is still about the present, but we can’t assume that every reader has been following every story, or that the “present” means “what just happened” as opposed to “what has been happening for the last decade.” But for users who do decide they want to keep up to date on a particular topic, the ability to “follow” a single story would be very helpful.

Effects. I should know about things that will affect me. Local news organizations always did this, by covering what was of interest to their particular geographic community. But each of us is a member of many different communities now, mostly defined by identity or interest and not geography. Each way of seeing communities gives us a different way of understanding who might be affected by something happening in the world. Making sure that the affected people know is also a prerequisite for creating “publics,” in Dewey’s sense of a group of people who act together in their common interest. Journalism could use the targeting techniques pioneered by marketers to find these publics, and determine who might care about each story.

Agency. Ultimately, I believe journalism must facilitate change. Otherwise, what’s the point? This translates to the idea of agency, the idea that someone can be empowered by knowing. But not every person can affect every thing, because people differ in position, capability, and authority. So my third principle is this: Anyone who might be able to act on a story should see it. This applies regardless of whether or not that person is directly affected, which makes it the most social and empathetic of these principles. For example, a politician needs to know about the effects of a factory being built in a city they do not live in, and if disaster recovery efforts can benefit from random donations then everyone has agency and everyone should know. Further, the right time for me to see a story is not necessarily when the story happens, but when I might be able to act.

These are not the only reasons anyone should ever see a story. Beyond these principles, there is a whole world of cultural awareness and expanded horizons, the vast other. There are ways to bring more diversity into our filters, but the criteria are much less clear because this is fundamentally an aesthetic choice; there is no right path through culture. At least we can say that a personalized news feed designed according to the above principles will keep each of us informed about the parts of the world that might affect us, or where we might have a chance to affect others.
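The three principles above — interest, effects, and agency — suggest a concrete shape for a personalized feed. The sketch below is purely illustrative: the data model, field names, and one-point-per-principle scoring are my own assumptions, not a description of any real news product.

```python
# A minimal personalized-news filter built on three principles:
# interest (what I want to know), effects (what affects me), and
# agency (what I can act on). All names and weights are illustrative.

from dataclasses import dataclass


@dataclass
class Story:
    title: str
    topics: set       # subjects the story covers
    communities: set  # groups of people the story affects
    actions: set      # roles that could act on the story


@dataclass
class Reader:
    interests: set    # topics the reader follows
    communities: set  # communities the reader belongs to
    roles: set        # capacities in which the reader can act


def relevance(story: Story, reader: Reader) -> int:
    """Score a story: one point per principle it satisfies for this reader."""
    score = 0
    if story.topics & reader.interests:         # interest
        score += 1
    if story.communities & reader.communities:  # effects
        score += 1
    if story.actions & reader.roles:            # agency
        score += 1
    return score


def build_feed(stories, reader, threshold=1):
    """Keep any story that satisfies at least one principle, best first."""
    scored = [(relevance(s, reader), s) for s in stories]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [s for score, s in scored if score >= threshold]
```

A feed built this way drops stories that touch none of a reader's interests, communities, or capacities to act, while still admitting a story the reader never asked about if it affects a community they belong to — which is exactly the "effects" principle at work.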

Photo of zebras in Tanzania by Angela Sevin used under a Creative Commons license.

April 22 2012

20:03

MustKnow News: The top 10 news stories each hour

The Next Web :: “Unlike other news apps that bring plenty of content from a large number of RSS feeds or a user’s Facebook or Twitter streams, MustKnow focuses on bringing users only the top 10 most important stories,” continues Shany. “This helps users deal with information overload and allows them to stay up-to-date in no time.”

Continue to read Paul Sawers, thenextweb.com

19:55

Curate.me mines the web to fill your inbox with personalized news

The Next Web :: Curate.me, formerly known as XYDO Brief, is making its public debut today after a 6-month invitation-only beta period which attracted some 20,000 users. Essentially, the service delivers personalized news to your email inbox based on your interests and data mined from your favorite social networks and news sources.

Continue to read Robin Wauters, thenextweb.com

Tags: Curation
19:45

'Just the News': A curated online news service

The Next Web :: Don’t you sometimes just wish you could drown out all the background noise on the World Wide Web and get back to basics? If your answer is yes, then read on. Just the News is an experiment by Max Temkin, a Chicago-based designer, to capture the big news of the day “without any gossip, sports, or distracting bullshit.” In short, it cuts through the cacophonous crackle of the white-noise Web and brings you a selection of the top news stories of the day.

Clipped from: justthene.ws

Continue to read Paul Sawers, thenextweb.com

Tags: Curation

April 20 2012

06:02

New Crowdsourcing, Curation and Liveblogging Training

Hi all! I’ve been traveling a lot for Digital First lately to spread the gospel of social media to my colleagues. So, if you’ve seen my presentations before, you’d know that I make very wordy Powerpoints so that people who weren’t there to see me prattle on about my favorite things can still follow what we went [...]

04:45

Content curators are the new superheroes of the web

FastCompany :: Yesterday, 250 million photos were uploaded to Facebook, 864,000 hours of video were uploaded to YouTube, and 294 billion emails were sent. And that's not counting all the check-ins, friend requests, Yelp reviews, Amazon posts, and pins on Pinterest. The volume of information being created is growing faster than your software can sort it out. As a result, you're often unable to tell the difference between a fake LinkedIn friend request and a picture from your best friend from college of his new baby.

[Steven Rosenbaum:] ... curators are emerging as a critical filter that helps niche content consumers find "signal" in noise.

Continue to read Steven Rosenbaum, www.fastcompany.com

February 20 2012

04:54

Resources for learning about social media

I have been collecting posts, articles, tutorials and general how-to materials that relate to how journalists use social media. I started about two weeks ago, as I prepare for a workshop in Singapore.

They are curated here: Social Media and Journalists.

The collection is housed at Scoop.it, a curation site that goes a step beyond social bookmarking sites such as Delicious and Diigo, and which privileges text and tagging — rather than visuals (like Pinterest). For this particular project, I’m finding it very useful.

One example of its utility is that I can offer up a link to a subset of the complete collection by using my own tags: see all posts tagged with “Instagram.” This kind of selection is always useful in teaching and training. Unfortunately, you cannot combine tags (e.g., Instagram + howto) to narrow the search results.

I could have chosen Tumblr for this project, but I’m liking the way Scoop.it works. One of its best features is that when you “scoop” a link using the Scoop.it bookmarklet, the Scoop.it interface opens in a one-third-screen vertical overlay (shown in the first screen capture above). This allows me to scroll up and down in the source material, which makes it easy to write my annotations and choose my tags. I don’t have to flip between browser tabs.

The toolbar shown above appears at the bottom of every posted item. It’s fast and easy to edit your posts and to change or add tags. It’s also easy for others to share your posts on a variety of social networks.

A big drawback is that I can’t download or otherwise preserve my collection. If Scoop.it goes bust, I will lose all my work. There is an RSS feed, but the links go only to the Scoop.it posts; there is no link to the source material in the RSS feed. Bummer.
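Given that worry about losing the collection, one partial workaround is to archive the RSS feed yourself on a schedule, so at least the titles, annotations, and Scoop.it post links survive. A minimal sketch using only the Python standard library; the feed structure shown is the generic RSS `item` layout, and the field names are illustrative:

```python
# Archive an RSS feed's items to a local JSON file as a crude backup.
# Note: per the limitation described above, the <link> elements would
# point at the Scoop.it posts, not the original source material.

import json
import xml.etree.ElementTree as ET


def archive_feed(rss_xml, out_path):
    """Parse RSS XML and write its items to a JSON backup file."""
    root = ET.fromstring(rss_xml)
    items = [
        {
            "title": item.findtext("title"),
            "link": item.findtext("link"),  # the Scoop.it post URL
            "description": item.findtext("description"),
        }
        for item in root.iter("item")
    ]
    with open(out_path, "w") as f:
        json.dump(items, f, indent=2)
    return items
```

Run against the collection's feed URL periodically (fetching is left out here), this at least preserves the curated annotations even if the hosting service disappears.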

Scoop.it isn’t brand-new — the site launched in November 2011.

February 08 2012

20:44

Sky and BBC leave the field wide open to Twitter competitors

At first glance, Sky’s decision that its journalists should not retweet information that has “not been through the Sky News editorial process” and the BBC’s policy to prioritise filing “written copy into our newsroom as quickly as possible” seem logical.

For Sky it is about maintaining editorial control over all content produced by its staff. For the BBC, it seems to be about making sure that the newsroom, and by extension the wider organisation, takes priority over the individual.

But there are also blind spots in these strategies that they may come to regret.

Our content?

The Sky policy articulates an assumption about ‘content’ that’s worth picking apart.

We accept as journalists that what we produce is our responsibility. When it comes to retweeting, however, it’s not entirely clear what we are doing. Is that news production, in the same way that quoting a source is? Is it newsgathering, in the same way that you might repeat a lead to someone to find out their reaction? Or is it merely distribution?

The answer, as I’ve written before, is that retweeting can be, and often is, all three.

Writing about a similar policy at the Oregonian late last year, Steve Buttry made the point that retweets are not endorsements. Jeff Jarvis argued that they were “quotes”.

I don’t think it’s as simple as that (as I explain below), but I do think it’s illustrative: if Sky News were to prevent journalists from using any quote on air or online where they could not verify its factual basis, then nothing would get broadcast. Live interviews would be impossible.

The Sky policy, then, seems to treat retweets as pure distribution, and – crucially – to treat the tweet in isolation. Not as a quote, but as a story, consisting entirely of someone else’s content, which has not been through Sky editorial processes but which is branded or endorsed as Sky journalism.

There’s a lot to admire in the pride in their journalism that this shows – indeed, I would like to see the same rigour applied to the countless quotes that are printed and broadcast by all media without being compared with any evidence.
But do users really see retweets in the same way? And if they do, will they always do so?

Curation vs creation

There’s a second issue here which is more about hard commercial success. Research suggests that successful users of Twitter tend to combine curation with creation. Preventing journalists from retweeting leaves them – and their employers – without a vital tool in their storytelling and distribution.

The tension surrounding retweeting can be illustrated in the difference between two broadcast journalists who use Twitter particularly effectively: Sky’s own Neal Mann, and NPR’s Andy Carvin. Andy retweets habitually as a way of seeking further information. Neal, as he explained in this Q&A with one of my classes, feels that he has a responsibility not to retweet information he cannot verify (from 2 mins in).

Both approaches have their advantages and disadvantages. But both combine curation with creation.

Network effects

A third issue that strikes me is how these policies fit uncomfortably alongside the networked ways that news is experienced now.

The BBC policy, for example, appears at first glance to prevent journalists from diving right into the story as it develops online. Although social media editor Chris Hamilton notes that they have “a technology that allows our journalists to transmit text simultaneously to our newsroom systems and to their own Twitter accounts”, this is coupled with the argument that:

“Our first priority remains ensuring that important information reaches BBC colleagues, and thus all our audiences, as quickly as possible – and certainly not after it reaches Twitter.”

I’m not entirely convinced of this line, because there are a number of competing priorities that I want to understand more clearly.

Firstly, it implies that BBC colleagues are not watching each other on Twitter. If not, why not? Sky pioneered the use of Twitter as an internal newswire, and the man responsible, Julian March, is now doing something similar at ITV.

Then there’s that focus on “all our audiences” in opposition to those early adopter Twitter types. If news is “breaking news, an exclusive or any kind of urgent update”, being first on Twitter can give you strategic advantages that waiting for the six o’clock – or even typing a report that’s over 140 characters – won’t, for example:

  • Building a buzz (driving people to watch, listen to or search for the fuller story)
  • Establishing authority on Google (which ranks first reports over later ones)
  • Establishing traditional authority by being known as the first to break the story
  • Making it easier for people on the scene to get in touch (if someone’s just experienced a newsworthy event or heard about it from someone who was, how likely is it that they search Twitter to see who else was there? You want to be the journalist they find and contact)

Everything at the same time

There’s another side to this, which is evidence of news organisations taking a strategic decision that, in a world of information overload, they should stop trying to be the first (an increasingly hard task), and instead seek to be more authoritative. To be able to say, confidently, “Every atom we distribute is confirmed”, or “We held back to do this spectacularly as a team”.

There’s value in that, and a lot to be admired. I’m not saying that these policies are inherently wrong. I don’t know the full thinking that went into them, or the subtleties of their implementation (as Rory Cellan-Jones illustrates in his example, which contrasts with what can actually happen). I don’t think there is a right and a wrong way to ‘do Twitter’. Every decision is a trade-off, because so many factors are in play. I just wanted to explore some of those factors here.

As soon as you digitise information you remove the physical limitations that necessitated the traditional distinctions between the editorial processes of newsgathering, production, editing and distribution.

A single tweet can be doing all at the same time. Social media policies need to recognise this, and journalists need to be trained to understand the subtleties too.

February 05 2012

04:53

Social Media, Citizen Journalism, Media Curators - Google Docs

This is a document I wrote in advance of the World Journalism Education Conference held in South Africa in 2010. I was what they called an "expert" for a "syndicate" focusing on Social Media, Citizen Journalism, Media Curators. To help the members of the syndicate have a common ground for our discussions over three days, I wrote this document and distributed the link to all.

December 31 2011

21:00

Filter bubbles burst, blind spots shrunk, curation over SEO: Rachel Sklar’s predictions for 2012

Editor’s Note: We’re wrapping up 2011 by asking some of the smartest people in journalism what the new year will bring.

Bringing us home is Rachel Sklar, a media and cultural critic who is the co-founder of Change The Ratio, an adviser to early-stage startups, and a heavy-to-compulsive-tweeter.

More tattoo parlors

Earlier this week, I was blown away by this: Zooey Deschanel and Joseph Gordon-Levitt singing a charming version of “What Are You Doing New Year’s Eve?”. It wasn’t just that it was an adorable clip of two adorable people singing an adorable duet, nor that it was in sly homage to their adorable movie, nor that it was guaranteed to go viral. I sent it to a dude I know who is very smart in the realm of online video, and the future thereof.

“This,” I wrote, “is the future.” He wrote back: “Every time Zooey Deschanel picks up a ukelele, a hipster angel gets his wings tattoo.” True, but that wasn’t even it. “That’s not even it,” I said. “Her genius is that she knows that, and figured out that she should own a piece of the tattoo parlor.”

Hello Giggles is that tattoo parlor.

Do you know Hello Giggles? It’s pretty brilliant, and simple, as many brilliant things are. It’s Deschanel’s website that she founded with Hollywood-and-Internet It girls Sophia Rossi and Molly McAlleer. It’s adorable and slick, like nail polish in hipster colors. Super-fun content, unabashedly girlish, the cool BFF that you love to hang out with. That was where the video broke.

Sure, it was on YouTube — capable of being picked up by HuffPo, BuzzFeed, and all the other usual suspects — but still, it was on Hello Giggles first. Their Twitter feed pointed to it excitedly earlier in the day and then — bingo, link. (And then — site, crash.) Deschanel and her friends looked around and smartly realized that if they could be the content, they could be the platform, too. Tavi Gevinson — more niche but one who could fairly be called the Zooey of the fashion world — did the same thing with Rookie Mag. Louis CK did it last month. If you know your stuff is going to be picked up, why not pick it up yourself? Owning the tattoo parlor. We’re going to see more of that in 2012.

Up with people!

It’s happened: People matter more than brands. Not all brands — people will always love to obsess over The New York Times — but for the most part, it’s individual people who earn and wield the trust of the consumer. (However that Twitter lawsuit pans out, the world will never be Team Phonedog.) So brands will align themselves more closely, and blurrily, with people. (Watch Tina Fey: She’ll probably do something interesting in this vein, that no one else could get away with, but for which she will open the door.)

Speaking of brands vs. people, it will be interesting to watch what happens to TechCrunch over the next year.

And speaking of Tina Fey, a quick coda about Amy Poehler: On “Parks & Recreation,” Leslie Knope is running for office. Outside in the real world, the 2012 election contest will be under way. There’s no way that show will not be a hotbed of trenchant political commentary this season. (BTW, Poehler was a Hillary supporter back in the day. So hopefully that will mean more goddesses in Pawnee.) Point being, people.

News is the killer app

David Carr loves to say this. And it’s true. News moves the needle, every day. Of course, what counts as “news” can be wildly expansive (latest Iowa poll vs. Iran’s latest in the Straits of Hormuz vs. something crazy Glenn Beck said vs. the new Zooey Deschanel vid) (News You Can Hormuz! Sorry). But technology has made everyone a potential real-time newsbreaker, distributor, and TV station, and that is pretty incredible. What we saw from Egypt this year — incredible. The #Occupy livestream during the Zuccotti raid — riveting at 2am as the viewer numbers climbed (and the cablers blithely let their canned programming play on).

This is different from a serendipitous civilian twitpic. This is technology letting people change the game, gatekeepers be damned.

Curation is also the killer app

…that said, though, it’s gotten pretty damn noisy out there. And if 2010 and 2011 were years of opening our hearts to a blossoming Internet, 2012 is going to be the year of letting smart people do it for us. Audiences are done with SEO-baiting and bait-and-switch headlines; we’re going to get more choosy with our clicks. And with our eyeball-access. So you’d better be trustworthy, because I don’t let just anyone curate for me. Because while news will always be the killer app, who it’s delivered by will matter just as much.

This is different from “reported by The New York Times.” This is “do I trust Anthony De Rosa to be my filter?” That’s why for those of us who live on the Internet, Ben Smith going to BuzzFeed made perfect, brilliant, forward-looking sense.

Or, to quote media maven Jason Hirschhorn: “Welcome to the age of the “CJ”. The Content Jockey. Payola-free and programmed with care.” He tweeted that, quoting from his email newsletter. #PlatformAgnostic

Unbubbling and unblinding

One of the arguments for old-school newspapers is discovery — next to the article you’re reading might be a completely different article that you never would have seen, on a subject you didn’t know you were interested in. Online, we’re starting to see the opposite: While we’re opting to follow curators who deliver to us the news we wish to receive, our most trusted sites are automatically giving us what they think we want to see — or, taken dystopically, what they want us to see. Eli Pariser dubbed this “the Filter Bubble.” Things are only getting more customized, tailored, targeted, and algorithm-ized, but in 2012 we will see clear pushback on that.

As for curators, an example: The AP’s top Oscar tweets of 2011 — all men. Compiled by Jake Coyle, who follows 191 people on Twitter. Whose tweets did he choose to see? Whose tweets did he ignore? Who was completely in his blind spot? This is just one example, but lemme tell you, I got lots and lots and lots and lots and lots. And lots.

When we began 2011, that blind spot was a frustrating ongoing reality. As we end it, something has shifted — the pushback isn’t only frustrated, it’s mocking. Because the rise of social has surfaced incredible demographic activity and information. Turns out, lots of under-repped constituencies are moving lots of needles. And honestly, those who leave out women, minorities, and other under-noted groups really no longer have an excuse for it — and in so doing, look like tools. (See how the Daily Dot owned that, and moved to make immediate amends.) I watch this stuff closely, and I really do see that trend pushing forward in 2012. (Even if just to keep me from sending you angry emails. WHICH I WILL.)

There isn’t just a single story. In 2012, the audience will expect — nay, demand — to see more of them.

#NoFilter

No relation, but — #NoFilter has become a tag of note this year, thanks to Instagram. What is real? What is fake? What is Kardashian? I think 2012 will demand that we say so up front.

“‘Modern Family’ is the funniest show on TV”

I said that the other day. Then I realized I’d never watched Modern Family on TV. I downloaded the first season to my iPad and I have watched it on the elliptical, on planes, in bed, waiting in the security line at the airport, on the subway and walking home from work. This goes double for most other things that I expect to be able to get, see, upload, download, send, save, share or otherwise interact with using the various pieces of technology at my disposal. Our smartphones are now our universal remotes. If you’re not offering your product on-demand in 2012, you’re losing customers in 2012.

Michelle. Sheryl. Mindy. Kristen. Kirsten. Hillary. Zooey.

Mmm-hmmm, I’m not saying anything. I’m just gonna sit back and watch.

December 21 2011

19:00

Vadim Lavrusik: Curation and amplification will become much more sophisticated in 2012

Editor’s Note: We’re wrapping up 2011 by asking some of the smartest people in journalism what the new year will bring.

Next up is Vadim Lavrusik, Journalist Program Manager at Facebook.

Ladies and gentlemen, we can rebuild it. We have the technology. We have the capability to build a sustainable journalism model. Better than it was before. Better, stronger, faster.

Okay, putting “Six Million Dollar Man” theme aside, I do believe every word of that. And here’s a small sliver of the way I think the process can be improved: curating information in a way that both puts it in proper context for consumers and amplifies the reporting of the citizenry.

For the last year, much of the focus has been on curating content from the social web and effectively contextualizing disparate pieces of information to form singular stories. This has been especially notable during breaking news events, with citizens who are participating in or observing those events contributing content about them through social media. In 2012, there will be even more emphasis not only on curating that content, but also on amplifying it through increasingly effective distribution mechanisms.

Because anyone can publish content today and report information from a breaking news event, the role journalists can play in amplifying — and verifying — that content becomes ever more important. Contributed reporting from the citizenry hasn’t replaced the work of journalists. In fact, it has made the work of journalists even more important, as there is much more verification and “making sense” of that content that needs to be done. And journalists’ role as amplifiers of information is becoming more crucial.

What does that mean? It means journalists using their skills to verify the accuracy of claims being made on social media and elsewhere, and then effectively distributing that verified information to a larger audience through their publications’ community of readers and fact-checkers on the social web.

Curation itself will continue to evolve and become more sophisticated. As the year has gone on, breaking news itself has taken on new forms beyond the typical chronological curation of a live event. In the new year, we’ll also see new curated story formats. And we’ll see new tools that allow those formats to take life.

But the mentality of content curation needs to evolve, as well. It’s still very much focused on how to find and curate the content around a news event or story, but much like the old model of content production, there is still little emphasis on making sure that the content is effectively distributed, across platforms and communities. The cycle no longer stops after a piece is written or a story is curated from the social web. The story is ever evolving, and the post-production is just as important.

Though there are plenty of journalists doing a great job at recognizing that — and though news organizations themselves are increasingly putting emphasis on content amplification — the creation of content, rather than the distribution of it, remains the primary focus of news outlets.

The coming year will see a more balanced approach. Whether it’s a written story or one curated from the citizenry using social media tools, we will see a growing emphasis placed on content amplification through distribution, and an increasing effort to ensure that the most accurate and verified information is reaching the audience that needs it. Information will, in this environment, inevitably reach the citizenry; at stake is the quality of the information that does the reaching. If content is king, distribution is queen.

Image by Hans Poldoja used under a Creative Commons license.

July 27 2011

20:21

Another curation tool? Bundlr: real-time updates, collaboration features and the ability to embed pages

The Next Web | TNW :: Over the past year or so, a number of curation tools have emerged that make it easy to group diverse collections of online content around themes. With a number of tools already available, is there room for another? The people behind new kid on the block Bundlr clearly believe so. Like its rivals, Bundlr offers a bookmarklet which allows you to quickly ‘clip’ any web page into a themed bundle which you can then share online.

Continue to read Marty Bryant, thenextweb.com

May 02 2011

21:18

The Bin Laden story and real-time engagement

Please allow me to think aloud on the past 15 hours.

We all acknowledge that the news of Osama bin Laden’s death broke on social media. We’ve all got stories about Twitter’s impact, roundups of Twitter reactions, tweet timelines and Storification galore – but did anyone in the heat of the developing news last night start engaging readers on [...]

April 29 2011

18:00

MSNBC.com’s Breaking News traces info to its source

There are some news events whose coverage is planned far — far, far — in advance of the events themselves. Those are exceptions, though: Most of the time, news is unscripted and unpredictable — breaking, appropriately enough, through the fabric of daily routine.

The tornado-laden storms that ravaged the South this week were some of those breaking news events. So were last month’s earthquake, and resulting tsunami, in northern Japan. For occurrences like those, as we know all too well, up-to-the-minute news can be especially hard to come by, since we don’t (yet) have a mechanism that connects sudden eyewitnesses of news events to the people who suddenly want to learn about those events.

MSNBC.com, though, is trying to provide that kind of connection through its Breaking News trifecta — a branded web domain, Twitter feed, and Facebook page, complemented by three mobile apps — that aims to be a curation engine for breaking news. The idea is to collect the biggest stories of the moment, in near real-time, from hundreds of news sources worldwide — including, often, eyewitness accounts shared on social media from people at the scene of breaking-news events. Google News-style, the feature aggregates stories from around the web and links out, when possible, to their full versions. River-of-news-style, it strips stories of their narrative and context, honing them down to their headlines.

“Our goal is to empower the moment of discovery,” says Cory Bergman, who helps run the project for MSNBC.com. Breaking News’ editors — “a small, caffeinated team of news junkies embedded inside msnbc.com’s newsroom” — scan wire services, Twitter, RSS feeds, live video feeds, YouTube, and even email alerts to hunt for breaking news, (ideally) immediately after it breaks. Their goal in that, Bergman told me, is “to discover the first instance of a breaking update in real time.” That instance could come from an Alabama paper or a person in Sendai, or anything (or anyone) in between. And it can come via users themselves, who, increasingly, are helping Breaking News editors flag noteworthy events that are bubbling up among social and local media. And “as we find stories and eyewitness accounts,” Bergman says, “we link to the originator directly, and credit them, and send them a short burst of traffic.”

To do all that, the team relies on a range of social networks, but focuses most squarely on Twitter. “Twitter is the real-time news distribution platform, quite frankly,” as Bergman puts it. And, as such, “the Twitter lexicon is infused in how we work and how we think.”

The main point, though, is…the pointing. Breaking News provides the raw material of the news — the quick-hit updates, in real-time — but it’s not trying to provide, in any strict sense, stories. Though the way that breaking news events are presented and contextualized is, clearly, crucial, Bergman notes, “in many ways, we’re a breaking-news discovery service that’s completely agnostic of source.” Which means: “We don’t want to own the news; we just want to point to it.”

That attitude isn’t unusual — do what you do best, link to the rest, etc. — but it’s noteworthy considering that it’s coming from a major media organization. (Breaking News is something of a skunkworks project at MSNBC.com: In 2010, Charlie Tillinghast, MSNBC.com’s general manager and publisher, appointed Bergman, along with Tom Brew and Ben Tesch, to do something interesting with breakingnews.com, which MSNBC.com had just acquired. The curation project is the result of that charge.) Every once in a while, the Breaking News team “might skew toward NBC” in the stories it promotes, Bergman acknowledges. Overall, though, “we are very, very passionate about keeping an even line and linking the first iteration — the first instance of breaking news — regardless of where it comes from.”

And the team is hoping that user contributions will help scale that idea. Last month, Breaking News launched a new set of features that allows users themselves to curate information — providing, essentially, the same tools that the editors themselves use to discover and share breaking news. Those tools allow users to scan live updates from dozens of news organizations; to comb video and images — from (currently) YouTube, Twitpic, yFrog, and Plixi — for eyewitness accounts of events; or simply to share links to breaking news stories, tweets, images, and videos that the users have already come across. “If we can grow a community of spotters,” Bergman notes, that alone “would have an amazing impact on the breaking news that we discover.”

The idea — and the hope — is to provide a link between breaking news and the news more broadly. “Hard and fast breaking news is currently an underserved market,” Tillinghast noted last year; the Breaking News project is hoping to change that.

April 13 2011

16:00

Gateway and takeaway: Why Quickish wants to cut the clutter and help readers get to the good stuff

It can be tough for a verbose writer to embrace the short form.

This is important, because in doing an email interview with Dan Shanoff about Quickish — his new site that offers (near)-instant analysis and news on sports — it quickly became clear the man is a lover of words. Shanoff burned through more than 3,000 words about Quickish, which finds its focus through short, deliberate analysis and lots of links. (Full transcript here.)

But most Quickish posts are at tweet length or not much longer — and that restraint makes it as much a conduit for news as it is a case study in why short- and long-form writing aren’t mutually exclusive. What both share, and what Quickish trades in, is “the takeaway,” as in the essential point of a story/event/game/trend, or the answer to the question all readers ask: “Why am I reading this?” It’s that need for understanding, combined with the accelerated pace of media, that Shanoff sees as the underpinning of news consumption and the guiding principle of Quickish.

“The best reporters and pundits know that the real traction isn’t the commodified tidbit of breaking news — this person was traded, this person threw a key interception, this person said something provocative — but the entirely valuable (and hard-to-copy) piece of insight that helps us understand a story better,” Shanoff told me. “This new competition — not for the scoop, but for the fast take — forces everyone to raise their level of instant analysis to cut through the clutter. That the noise level might be raised by everyone rushing to say something is ok — as long as you have reliable filters (like Quickish hopes to be) set up to cancel out the crap.”

If we were to build a periodic table for new media, the elements that make Quickish work would be speed, accessibility, and brevity — all in the service of making sense of a news story. Quickish is what happens when you try to bring a coherent focus to those events that everyone is tweeting about — it’s March Madness, the Oscars, the Super Bowl, or election night, but all the time. Quickish embraces the alternate-channel ethos that has developed around how we experience events and is built around it. A reader can get what everyone is talking about, but with the added bonus of context and insight, and can follow it wholesale or dip in as needed.

“Once you recognize the ascendancy of short-form content — and, by the way, that doesn’t preclude longer-form content (at all!) — the next thing you build on top of that is a system to help people keep up with all that great content, to cut through the increasingly endless clutter that keeps you from seeing the really good stuff,” he wrote.

“And so depending on what the biggest topics are, to the widest possible audience, Quickish editors are looking for the most interesting short-form analysis or conversation about that topic — it doesn’t have to be part of a full-blown column; it could be a killer ‘money quote’ of a short blog post or a Tweet or a message board post or video; we’re source-agnostic. It could be from a ‘national’ outlet or a local/topical reporter or blogger with particular expertise,” he wrote.

This would be a good time to mention that much of what Shanoff is talking about is not new in journalism; we’ve come to talk about aggregation a lot in terms of the future of news (apologies to Mr. Keller). There is no doubt that what Quickish provides falls neatly into the category of aggregation. It’s Techmeme or Mediagazer, but for sports. Shanoff, though, is not a big fan of the “A” word.

“Why I wince at ‘aggregation’ is that it doesn’t necessarily distinguish between ‘dumb’ aggregation of automated, algorithm-based systems (that inevitably fail some critical test of judgment) and the ‘smart’ selective, qualified recommendation that comes from an editor (whether that editor is Quickish or a newspaper/magazine editor or someone smart you follow on Twitter or a blogger or anyone else who actively applies judgment whether something is worthwhile or not). Everything on Quickish has been recommended with intention; to me, that’s much more active — and valuable — than a system built on more passively ‘aggregating.’”

If Shanoff has his way, the site will be powered primarily by recommendations from readers. News sites large and small typically have some call-out for tips, but Quickish seems to have tip-based updates baked in, thanks to its Twitter-like nature. Credit for stories or takes gets a nod similar to retweets or hat tips, something Shanoff said results from Quickish relying on Twitter as a source, but also from wanting a more transparent interaction with readers. It also tracks with another basic idea behind Quickish: the link as the most powerful asset connected to a story or post.

This also tends to build strong connections with readers who can feel a buy-in by contributing to a site. What you end up with — hopefully — is a recommendation-go-round, where stories and links get tipped to your site from readers, readers direct their friends to the site, and the process repeats in perpetuity.

“It is a long-standing tenet of online journalism that you want to encourage readers to make just ‘one more click’ within your site after the page they land on. With Quickish, we are thrilled if that ‘one more click’ is to some great piece of longform journalism that we have recommended. Because if you appreciate that experience as a reader, you are much more likely to give us another try tomorrow or when the next big news happens; isn’t that much more valuable than gaming them into sticking around? Here is a fascinating and powerful stat we have never made public: Quickish readers actively click through to one of our recommended links on nearly half of all total visits. Every other visit results in the reader clicking on a Quickish recommendation,” he wrote.

Considering all of this, Shanoff said the site’s minimal, stripped-down design is closely attuned to the content it provides and the expectations of the audience. Shanoff recognizes that readers come to news from various destinations and on different devices, and that feeds an immediate expectation: it’s the “Why am I reading this?” question all over again. Shanoff said that for many publishers the focus is less on utility and more on squeezing the most value out of visits to a site. His take: “Don’t be greedy.”

There are two ways to try to engage people. You can try to force them — blitz or confuse or harangue them, in many cases — into clicking onward. Is that increase from 1.5 page views per visit to 2.0 really worth it if the reaction from the reader is, “Wow, that really wasted my time”? How is that kind of publisher cynicism a way to create a meaningful relationship with a reader?

The other way is to make the experience so simple, so self-evidently useful, so valuable, so easy that the reader might only give you (in Quickish’s case) that one page per visit for now, but they will come back every day… or a couple times a day… or tell their friends… or trust your recommendations… and ultimately have a deeper relationship with you when you introduce new products and features.

April 01 2011

18:00

“Trimming the Times”: The Atlantic Wire’s new feature wants you to make the most of your 20 clicks

Add another entry into the growing group of New York Times meter-beating strategies: The Atlantic Wire is now providing a daily summary of the best content in the paper. “Now that the New York Times pay wall is live, you only get 20 free clicks a month,” Adam Martin notes in his introduction to the new feature. “For those worried about hitting their limit, we’re taking a look through the paper each morning to find the stories that can make your clicks count.”

“Trimming the Times” isn’t — per its framing, at least — about gaming the Times’ meter, per se; it’s about helping readers navigate stories within an ecosystem that, to some extent, punishes aimless exploration. The inaugural edition of the feature is 400 words and change, with short graphs pointing to stories on the NYT’s front page and its various sections: Global, U.S., Business, Technology, Health, Sports, Opinion, and Arts.

So, essentially, The Atlantic Wire is curating and then summarizing the information the Times has curated and then summarized. Meta! (Double-meta!)

Here’s how Trimming sums up today’s front page:

Leading today’s paper: A report on the fallout of all those defections on the Libyan government, news that the U.S. will likely not arm the rebels there, and a House of Representatives committee is shocked, shocked! at the high salaries of officials at government-backed lenders Fannie Mae and Freddie Mac. Our top pick for today: Mad Men is saved!

First, it should be said, Trimming’s wise selection of the salvation of Mad Men as the day’s top story demonstrates the supremacy of its editorial judgment. It also, however, suggests Trimming’s emphasis on curation (service!) over pure aggregation (menace?) — think Today’s Papers, with only one Paper. Introductions of stories with notes like “a worthwhile report,” “you’ll want to read how,” “we’re fascinated by,” and the like add an editorial sensibility to the thing that makes the feature less like a simple repurposing of the NYT and more like a tribute to it.

Trimming’s a neat idea — a reminder of the editorial and business opportunities that a paywall at one outlet can represent for the others. It also suggests the cross-platform layering of editorial content that’s increasingly defining the news space: Single stories spread across outlets, their look and their length — though not necessarily their core information — changing in the stretch. And the value of this kind of meta-curated feature is, it’s worth noting, largely independent of the Times’ paywall. Though the meter thing makes for a nice hook — and though it adds a service-y and slightly cheeky element to Trimming as a feature — a cogent summary of the best content in the nation’s paper of record is a useful thing as a general rule, meter or no.

Still, it’ll be interesting to see whether monthly meter patterns affect monthly traffic for The Atlantic Wire’s newest feature. Today being the first of the month, users’ article-view meters, as of early this morning, have been rolled back to zero. Human nature being what it is, the budgeting mindset — the voice of reason that reminds you to use your clicks wisely — probably won’t kick in until at least Article 10 or 15 (or, more realistically, 19). But Trimming wants to be the superego to your web-wandering id. “It’s the first of the month so your clicks have reset,” Martin acknowledges. But: “You still must budget.”

Image by James Bowe used under a Creative Commons license.

March 29 2011

18:58

Mobilizing online communities in the face of disaster: Tips from NetSquared Local organizers

On March 12, one day after the tragic earthquake and tsunami devastated Japan, Hiroyasu Ichikawa (“Ichi”), the NetSquared Local organizer from Tokyo, sent an e-mail to our NetSquared Local organizer listserv asking for best practices for mobilizing online communities in the time of a disaster. In the weeks that followed, Ichi’s e-mail provoked a series of responses from all over the world. In this post, we hope to share many of the tools, resources, and tactics that came up, in hopes of encouraging others around the world to get involved with digital relief efforts.

In response to Ichi, Paula Brantner from the Washington, D.C. local group suggested taking advantage of Crisis Commons, an international project that sprang into action after the recent Haiti earthquake. Crisis Commons is specifically designed to crowdsource the technology needed to support communications in the event of a disaster; it helps find volunteers and collects the hands-on actions designed to support the cause.

Amy Sample Ward from the New York group followed Paula’s e-mail with further suggestions on how and where to aggregate information. One of the online spaces she mentioned was the Google Crisis Response page, where you can find the latest information about the crisis as well as make simple donations to the organizations supporting the efforts in Japan. She also provided a link to the Wikipedia page devoted to the 2011 Tōhoku earthquake and tsunami. This resource is an important point of reference for anyone following the latest events related to the tragedy; it has been visited and edited by many people and therefore appears high in search results.


Shufang Tsai from the Taiwan group shared information from one of her community members about an experience with the Chilean earthquake of 2010. The ideas that came from that experience included setting up a situation map using Ushahidi on the crowdmap.com site and asking volunteers to search through media news reports and gather them in an easily accessible Google Doc, from which the information could then be added to the Ushahidi map. Other suggestions from the community member in Japan included using Tweak the Tweet to collect information from Twitter and Facebook. He also highlighted the importance of keeping volunteers’ data saved somewhere (e.g., a Google Doc).
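Tweak the Tweet works by asking people to embed machine-readable hashtags in ordinary tweets so that volunteers (or scripts) can extract structured data from the stream. A minimal sketch of that idea in Python — the tag set and format here are simplified assumptions for illustration, not the project’s official syntax:

```python
# Illustrative parser for a Tweak-the-Tweet-style message, in which
# authors embed machine-readable hashtags (#need, #loc, etc.).
# The tag vocabulary below is a simplified assumption, not the
# project's official specification.
import re

KNOWN_TAGS = {"need", "have", "loc", "contact", "name"}

def parse_tweet(text):
    """Split a tweet into {tag: value} fields keyed by known hashtags."""
    fields = {}
    # Capture each hashtag plus the free text running up to the next one.
    for match in re.finditer(r"#(\w+)\s+([^#]*)", text):
        tag, value = match.group(1).lower(), match.group(2).strip()
        if tag in KNOWN_TAGS:
            fields[tag] = value
    return fields

tweet = "#need water and blankets #loc Sendai station #contact @reliefdesk"
print(parse_tweet(tweet))
# {'need': 'water and blankets', 'loc': 'Sendai station', 'contact': '@reliefdesk'}
```

Structured records like these are exactly what can then be dropped onto an Ushahidi map or collected in a shared spreadsheet.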


Sarah Schacht from the Seattle group put Ichi in touch with representatives from Crisis Commons and suggested he list himself at the Honshu Quake Activities wiki at Crisis Commons. Sarah also forwarded his information to the Web of Change to attract tech volunteers.


Jonathan Eyler-Werve from the Chicago group added another wiki link to the conversation: an example of how a wiki has been used to aggregate information about the Libyan uprising.

Shufang then summed up the online response information and sent links to (among others):

  • Open source disaster management system Sahana (in Japanese only)

and to various online sources that work with maps, such as:

  • Esri disaster response

The next day (March 13), Ichi sent us the results of this Facebook group’s work (in Japanese only), as well as a link to the articles he has been writing (also in Japanese). He also highlighted the importance of learning lessons from all of these social media crisis responses and planning a long-term strategy for digital curation in case of disaster.

In response to Ichi, JD Lasica from the San Francisco group shared links to interviews with Andy Carvin, who had been instrumental in setting up the Hurricane Information Center and the subsequent Crisis Camp for Haiti.

Rachel Weidinger from TechSoup Global sent the group links to resources and recovery guides available on the techsoup.org site, including the Disaster Planning and Recovery Toolkit.

JD Godchaux from NiJel, a community mapping platform, seconded Shufang’s suggestion to work with Crisis Mappers and encouraged Ichi to join the CrisisMappers list. The project was launched locally on March 11 by a Japanese member of the OpenStreetMap (OSM) community. The crisis map is being supported by on-site volunteers (mainly in Tokyo) along with a group of students (mainly Japanese) in Boston led by The Fletcher School. JD also mentioned another instance of Ushahidi being used to track radiation levels from the Fukushima Daiichi plant.

The last comment in the thread came from Ichi, who shared a link to the social media dashboard he set up on Netvibes to keep up with current events. Netvibes is a free website that lets users build a customized start page composed of “modules” drawing a wide variety of information from dozens of other sites. It is a great tool for fetching, storing, and managing various web sources, and it makes the process transparent and easy for everyone to access.

The entire conversation happened within 72 hours of the Japanese earthquake, and it wasn’t stopped when the radiation threat emerged, nor was it paused by the power outages caused by the disaster. As the Japanese tragedy shows, the role of social media in times of disaster remains a subject of ongoing conversation. It highlights the importance of connecting with like-minded people to pool efforts and delegate responsibilities in times of crisis. We hope this post will help others who would like to contribute to relief efforts for the Japanese tragedy and for the other disasters that will inevitably happen in the future.

Do you have any other tips or tools for Ichi or anyone else who is interested in using the web to provide digital disaster relief? If so, please share your suggestions in the comments below!

March 15 2011

14:00

News portal, super aggregator, and mega-curator: PBS builds a new site from scratch with PBSNews.org

PBS finds itself with what could be the definition of a “good” problem. (Well, not that defunding problem, but another one.) Here’s the scenario: Under the PBS umbrella you’ll find news shows like PBS Newshour, Frontline, and Nightly Business Report, among others, all producing content that lives primarily on air and on individual websites. While video clips and stories are pulled into PBS.org, that site’s primary function is not to be a news source like, say, its cousin NPR.org.

With all that news and information swirling around PBS, though, it makes sense to have a sort of super aggregator, something to pull together the threads from various shows around news or topics. Think about it: What if, on a broad story like the economic crisis, you could pull together a NewsHour interview with Treasury Secretary Timothy Geithner on changes to borrowing policies for US banks along with a Frontline clip from “Breaking the Bank” on the merger of Bank of America and Merrill Lynch? Of course, what we’re talking about is not simply aggregation, but also curation — and actually, considering the hours of shows PBS has at its disposal, mega-curation.

Consider all of this and you’ll know where the team behind the PBS News Blog is coming from. It’s PBS’ effort to launch a new site that is both a news portal for readers and a new channel for PBS programming. The new site, which should launch soon, will be called PBSNews.org: The News Navigator.

When I spoke with Tom Davidson, PBS.org’s senior director and publisher for news and public affairs, he told me the new project will essentially start from scratch, partly because a central news division has never been part of PBS, but also because PBS wants to take advantage of the opportunity to build a smarter news site. “Historically PBS has tended to not create content itself — it was founded as a programming service” that would pool member stations’ financial resources “to allow other independent producers to make that content,” Davidson said.

Over the years, PBS has built out a universe of news and current events programming — and in recent years, that’s been matched by further investment in digital tools and websites starting with PBS.org, Davidson said. Again, they’ve created a good problem.

Instead of offering another site for breaking news, the News Navigator team wants to build a site that moves past daily headlines and offers more comprehensive coverage on news or topics — the kind that can come to bear when you have a satellite staff of journalists, producers, and documentarians working on pieces. That staff will rotate around a central hub, the News Navigator staff (which is growing as we speak), which will include producers, data specialists, writers, and editors.

So what could the News Navigator look like? Davidson said the mission will be to present “the knowledge that defines what’s going on on a story behind the headline.”

More specifically, they want to balance context and timeliness in news by building something similar to topic pages that would provide news, raw data sets, timelines, video, and other background from across PBS programs. These deep dives, as they call them, will include areas like Afghanistan, same-sex marriage, health care, and Congress.

The point in all this context-focused curation isn’t to out-NYT the NYT, but rather to add value by finding new angles on big stories. “I will try lots of crazy things,” Davidson said. “But I’m not going to try and take on CNN.com, CBS.com or NYTimes.com. We lost that battle 15 years ago. Let’s not fight that battle now.”

PBS is also creating issue clashes — an adaptation of a familiar feature of many PBS news shows, the two-analyst, head-to-head debate, adapted for online. Think Shields and Brooks, only on the web — and with the audience empowered not only to vote on the winner, but also to add their own arguments.

Of course, there are hurdles in building out a new news site, particularly one that will need to pull news and videos from across a multitude of other sites, each operating on a different framework and content management system. It’s not as easy as connecting tube A to slot B. Instead of trying to put all its programs under one system, PBS decided to build the equivalent of a massive card catalog, naming it Merlin. Merlin is essentially a database of PBS content tagged with metadata, allowing sites — whether from programs or member stations — to pull up material they would like to use. (Merlin was a contributing factor in PBS.org’s recent redesign and iPad offering.)
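Conceptually, a card catalog like Merlin boils down to content objects plus metadata tags that any front end can query. A minimal Python sketch of that pattern — all names, fields, and example items here are hypothetical illustrations, not PBS’s actual schema:

```python
# Hypothetical sketch of the "card catalog" idea behind Merlin:
# content items carry metadata tags, and any site can query the
# catalog for items matching the tags it cares about.
from dataclasses import dataclass, field

@dataclass
class ContentItem:
    title: str
    program: str       # e.g. "NewsHour", "Frontline"
    media_type: str    # "video", "article", ...
    tags: set = field(default_factory=set)

class Catalog:
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)

    def find(self, *tags):
        """Return every item whose tags include all of the requested tags."""
        wanted = set(tags)
        return [i for i in self.items if wanted <= i.tags]

catalog = Catalog()
catalog.add(ContentItem("Geithner on bank borrowing", "NewsHour", "video",
                        {"economy", "banking"}))
catalog.add(ContentItem("Breaking the Bank", "Frontline", "video",
                        {"economy", "banking", "merger"}))
catalog.add(ContentItem("Counting clip", "Sesame Street", "video",
                        {"education"}))

# A topic page on the financial crisis pulls everything tagged "economy":
for item in catalog.find("economy"):
    print(item.program, "-", item.title)
```

The payoff of this design is that the catalog, not each individual program site, decides how content is described — so a topic page, a station site, or a video player can all draw from the same pool without knowing anything about each other’s CMS.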

Jason Seiken, senior vice president of Interactive, Product Development and Innovation for PBS, told me that Merlin came from the need for something that could act as a publisher and distributor of content that would benefit both programs and stations. Once stories or videos are tagged, they can be pulled up on PBS.org, the News Navigator, or WGBH, as an example. “Merlin is in essence a distribution channel,” Seiken said. “It turns PBS.org into a distribution network for local stations.”

Along with Merlin, PBS rolled out a standalone video player and management system called COVE. (While it may seem like online video is ubiquitous, in the past there was no quick, easy, or unified way for stations and programs to share video on their sites, Seiken said.) COVE allows sites to pull together video from across PBS in the same player, meaning a piece from KQED could be coupled with a feature from Need to Know or Sesame Street.

After PBSNews.org makes its debut, Davidson said it will still be in something of a rolling beta. He sees the site as a startup whose features PBS will constantly adjust. The challenge for PBSNews.org, Davidson said, will be growing an audience for it while also finding its place within the PBS family. Its job won’t be to recreate what others have done, but instead to complement and synthesize it. “We don’t see ourselves competing with NewsHour on reporting the news of the day,” Davidson said. Instead, “we see ourselves first and foremost as translators for the consumers.”
