
January 17 2012

18:00

NewsRight’s potential: New content packages, niche audiences, and revenue

When NewsRight — the Associated Press spinoff formerly known as News Licensing Group (and originally announced by the AP as an unnamed “rights clearinghouse”) — began to lift the veil a couple of weeks ago, most of the attention and analysis focused on “preserving the value” of news content for content owners and originators. In the first round of reports and commentary on the launch, various bloggers and analysts quickly made comparisons to Righthaven, the infamous and all-but-defunct Las Vegas outfit that pursued bloggers and aggregators for alleged copyright violations.

But most of that criticism misses an important point: Would NewsRight’s investors, all legacy news enterprises, really invest $30 million in a questionable model just to enforce copyrights? Or are they investing in a startup that has the capacity to create revenues from new, innovative ways of generating, packaging, and distributing news content?

While some of the reactions point to the former, I believe the opportunity (and NewsRight’s real intention) lies in the latter: NewsRight has the potential to create revenue for any content creator large or small, and to enable a variety of new business models around content that simply can’t fly today because there hasn’t been a clearinghouse system like it.

(As background, here at Nieman Lab in 2010, I first described the potential benefits of a news clearinghouse months before AP announced the concept. Then after AP made public their plans, I described a variety of new business models it could enable, if done right.)

First, let’s have a look at some of the critics:

  • TechDirt, disputing whether NewsRight would actually “add value,” asked: “AP finally launches NewsRight…and it’s Righthaven Lite?”
  • InfoWars, posting a video talk with Denver radio talk host David Sirota, inquired: “Traditional media to bully bloggers with NewsRight?” In the interview Sirota said, “What I worry about is that it ends up being used as a financial weapon against those voices out there who are citing that information in order to challenge it, scrutinize it, and question it.”
  • GigaOm’s Mathew Ingram pointed out that while NewsRight itself says it will stay out of pursuing copyright infractions via litigation, “one of the driving forces behind the agency is the sense on the part of AP and other members that their content is being stolen by news-filtering services…and news aggregators.” Ingram concludes: “What happens when an organization like The Huffington Post says no thank you? That’s when it will become obvious how much of NewsRight’s business model is based on carrots, and how much of it is about waving a big stick.”
  • Nieman Lab’s own coverage by Andrew Phelps also focused on the tracking and enforcement aspects of NewsRight’s core technology.

NewsRight’s launch PR didn’t do much to dispel these concerns. CEO David Westin said himself in a video: “NewsRight’s designed…to make sure that the traditional reporting organizations that are investing in original journalism are reaping some of the benefits that are being lost right now.” And the company’s press release, quoting Westin, went no further than the following in hinting that there were new business opportunities enabled by NewsRight: “[I]f reliable information is to continue to flourish, the companies investing in creating content need efficient ways to license it as broadly as possible.”

Those traditional news organizations (29 of them, including New York Times Co., Washington Post Co., Associated Press, MediaNews Group, Hearst, and McClatchy) are the investors who scraped together $30 million to launch NewsRight. The Associated Press also contributed technology and personnel to the effort.

Given those roots — along with the initial PR, Westin’s own background as a lawyer, and the fact that NewsRight’s underlying AP-derived technology, News Registry, was explicitly developed to help track content piracy — it’s not hard to see where all the skepticism comes from.

But ultimately, if NewsRight is to be successful, it will have to create a new marketplace. It’s going to have to do more than try to get paid for the status quo — that is, to collect fees from aggregators and others who are currently repackaging the content of its 29 owners. It can do that, but in addition, like any business, it will have to develop new products that new customers will pay for; it will have to bring thousands of content sources into its network; and it will have to enable and encourage thousands of repackagers to use that content in many new ways. And it will have to focus on those new opportunities rather than on righting wrongs perceived by its investors.

I spoke last week with David Westin about where NewsRight was starting out and where it might ultimately go. While he repeated the company mantra about returning value to the originators of journalistic content — “NewsRight is designed with one mission: to recapture some of the value of original journalism that’s being lost in the internet and mobile world” — it’s clear that his vision for NewsRight goes well beyond that. Here’s some of what we covered:

NewsRight’s initial target is “closed-web” news aggregators. Media monitoring services like EIN News, Meltwater News, and Vocus provide customized news feeds to enterprise clients like corporations and government entities, typically at $100 per month or more. Essentially, they’re the digital equivalent of the old clipping services. Currently, these services must scrape individual news sites, and technically, they should deliver only snippets with links back to the original sources (although whether they limit themselves to that is not easy to monitor). What NewsRight offers the monitoring services is one-stop shopping that includes (a) fulfillment: an accurate content feed (obviating the need to scrape, and eliminating uncertainty by always delivering the latest, most complete version of a story); (b) rights clearance; and (c) usage metrics. The monitoring services will have the option to improve their offerings by supplying full text (or they can stick with first paragraphs); the content owners share the resulting royalties.

While NewsRight currently must individually negotiate content deals, it’s working toward a largely automated content-exchange system. Clearly, as NewsRight grows, there will have to be an automated system with self-service windows. “I hope that’s right, because that means we will have been successful,” Westin said when I suggested that would have to happen. The deals with private aggregators being worked on now all require one-off negotiations for each deal, both with the aggregators and with the content suppliers. That’s marginally possible when there are 800 or so content contributors to the network, but to be a meaningful player in the information marketplace, the company will need to grow to encompass thousands of content creators, thousands of repackagers, republishers, or aggregators of content, and many millions of pieces of content (including text, images and video) — requiring a sizable infrastructure and high level of automation.

Any legitimate news content creator can join NewsRight for free for the duration of 2012. “Anyone who generates original reporting, original content, can benefit from this. We’re open to anyone who’s doing original work,” Westin says. That includes not only newspapers and other traditional news organizations — it can include hyperlocal sites and news blogs. Basically, that free membership will bring you back information on how and where your content is being used. NewsRight’s system is currently tracking several billion impressions for its investor-members and is capable of tracking billions more for those who want to use the service. (All this is rather opaque on the website right now, but if you’re interested, just click on the “Contact us to learn more” link on their homepage, and they’ll get back to you.)

Down the road, NewsRight is looking for ways to create new content packaging opportunities. Westin: “There is a large number of possible businesses [that we can enable]. We don’t have any of them up and running yet; it’ll be a better story when we’ve got the first one up. But I do envision a number of people who might say, ‘I wanted to create this product, dipping into a large number of news resources on a specific subject, but it’s simply been too cumbersome and difficult to do’…We should be able to facilitate that.” What he envisions is something that reduces the friction and the transaction costs in setting up a news feed, app, or site on a niche topic and allows a multiplicity of such sites to flourish — “new products based around the content that don’t exist now.” That includes personalized news streams — products for one, but of which many can be sold: “As we continue to expand News Registry and the codes attached to content, it makes it possible to slice and dice the news content with essentially zero marginal cost.”

While the initial offerings to private aggregators carry a price tag set by NewsRight, in the ultimate networked and largely automated point-to-point distribution arrangement — individual asset syndication — NewsRight will likely stay out of pricing. The “paytags,” or the payment information embedded in the Registry tags, will be able to carry information on a variety of usage and payment terms — not only what the price is, but nuanced provisions like time constraints (e.g. this can’t be used until 24 hours after first published), geographic constraints (to limit usage by regional competitors), variable pricing (hot news costs more than old news), and pricing based on the size of the repackager’s audience. Content owners would likely have control over these options, but there’s also the potential for a dynamic pricing model — something similar to Google’s auction mechanism for AdWords — in order to optimize both revenue and usage.
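As a thought experiment, the kinds of terms described here — embargo windows, geographic exclusions, and time-decayed pricing — could be encoded in a small data structure attached to each content item. Everything below is hypothetical: the class, field names, and numbers are mine, not NewsRight’s actual paytag format.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import FrozenSet, Optional

@dataclass
class PayTag:
    """Hypothetical usage terms a paytag might encode (illustrative fields)."""
    base_price: float                  # USD asking price for one reuse
    embargo_hours: int = 0             # e.g. no reuse until 24 hours after publish
    excluded_regions: FrozenSet[str] = frozenset()  # block regional competitors
    stale_after_hours: int = 48        # hot news costs more than old news
    stale_discount: float = 0.5        # price multiplier once a story goes stale

    def price_for(self, published: datetime, now: datetime,
                  region: str) -> Optional[float]:
        """Asking price for this reuse, or None if the use is not licensed."""
        age = now - published
        if age < timedelta(hours=self.embargo_hours):
            return None                # time constraint: still embargoed
        if region in self.excluded_regions:
            return None                # geographic constraint
        if age > timedelta(hours=self.stale_after_hours):
            return self.base_price * self.stale_discount
        return self.base_price
```

Under terms like these, a repackager in an excluded region, or one asking too early, simply gets no license offer; everyone else gets a price that decays as the story ages.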

The NewsRight network could make it possible to monetize topical niche content that’s too difficult to syndicate today. There are a lot of bloggers, hyperlocals, and other niche sites today that earn zero or minimal revenue and are operated as labors of love. The potential for NewsRight is to find new markets for the content of these sites. And general publishers like newspapers might find it profitable to jump into specialized niches for which there’s no local audience, but which might generate revenue via redistribution through NewsRight to various content aggregators.

Could that grand vision come to fruition? As I’ve pointed out before, a very similar system has worked very nicely for ASCAP and BMI, the music licensing organizations, which not only collect royalties for musicians but enable a variety of music distribution channels. (This is on the performance and broadcast side of the music biz, not the rather broken recorded music side.) Both AP CEO Tom Curley in launching NewsRight and Westin in discussing it refer to ASCAP and other clearinghouses as models — not just for compensating content creators but for enabling new outlets and new forms of content. NewsRight’s model is purely business-to-business — it doesn’t involve end users. So the traction it needs will come when it can point not just to compensation streams from private aggregation services, but to new products and new businesses made possible by its system.

October 22 2010

15:00

AP’s “ASCAP for news” — new ecosystem, new revenue streams, new enterprise opportunities

In a speech on Monday, Associated Press CEO Tom Curley announced that the AP would soon set up “an independent rights clearinghouse for news publishers to manage the distribution and use of their content beyond their own Web properties.” (Speech text in PDF link)

The entity, to be designed with input from multiple stakeholders including AP and the Newspaper Association of America, will be established sometime in 2011. It will be a business-to-business clearinghouse, not involving transactions with consumers. Through the clearinghouse, originators of news content (ranging from local bloggers on up; this is not limited to AP members) will be able to distribute their content for digital publication by others, and receive back royalties or revenue shares, according to protocols yet to be determined. The clearinghouse will aim to facilitate a rapid, realtime means of negotiating rights for such content sharing, resulting in a large increase in the potential market for any particular piece of content.

As an illustration, a newspaper (or a broadcaster, or a local blogger) could release a piece of content — a story, a photo, a video — with tags indicating what it is about, who owns it, how and where it may be used, and how the content originator is to be paid. The content, distributed through any available channel, is picked up by another publisher, aggregator, or personalized news service and used in accordance with the attached rights and payments protocols. The clearinghouse monitors usage and payment obligations throughout the network of participating content originators and publishers, and settles transactions among them.
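The tagged content unit described above can be pictured as a simple record in which subject, ownership, permitted uses, and payment terms travel with the item. The sketch below is purely illustrative — none of these field names come from AP’s actual News Registry schema.

```python
# A hypothetical tagged content unit as it might travel through the network.
story = {
    "id": "example-gazette-2010-10-18-001",
    "type": "text",                       # could equally be "photo" or "video"
    "subjects": ["city council", "budget"],
    "owner": "Example Gazette",
    "allowed_uses": ["full_text", "snippet"],
    "payment": {"model": "ad_revenue_share", "owner_share": 0.60},
}

def may_republish(item: dict, use: str) -> bool:
    """A downstream publisher or aggregator checks the attached rights."""
    return use in item["allowed_uses"]
```

A clearinghouse would then monitor each permitted use across the network and settle the attached payment obligation between republisher and originator.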

The plan Curley described is very similar to what I proposed in a post here in July, in which I asked, “What if news content owners and creators adopted a variation on the long-established ASCAP-BMI performance rights organization system as a model by which they could collect payment for some of their content when it is distributed outside the boundaries of their own publications and websites?”

Curley framed the opportunity in very similar language: “With the new rights clearinghouse initiative, we are hoping to give news publishers more tools to pursue an audience and capture value beyond the boundaries of their own digital publications.”

Although Curley’s speech did not mention the analogy with ASCAP (the American Society of Composers, Authors and Publishers, which has since 1914 protected rights and collected royalties for songwriters and composers when their works are performed or broadcast), the AP’s own story on the announcement said the clearinghouse would be “loosely modeled” after ASCAP.

When I spoke about the project on Wednesday with Srinandan Kasi, the AP’s general counsel, he said that AP had studied clearinghouses in other industries, particularly in order to understand what considerations drove their choice of governance structure. (For those inclined to derail into griping about ASCAP’s perceived shortcomings, the analogy is not to ASCAP specifically; the point is that a clearinghouse for content will speed up and expand content distribution options, and create a new and efficient content marketplace — not that it will be exactly like ASCAP.)

In announcing the project at a meeting of the Southern Newspaper Publishers Association, Curley said it builds on several years of development by AP, beginning with the creation of a digital cooperative in 2007, through which 1,500 newspapers and broadcasters funnel content that AP parses, tags, and returns for use on local websites. In 2009, AP set up News Registry, a system that uses the tags to track where and how that content is being used on the Web and is now used by about 1,000 newspapers.

That tracking functionality, and the possibility of pursuing copyright claims for unauthorized reuse of nothing more than a headline, garnered plenty of criticism for the AP last year. (Even just a few weeks ago, Curley repeated claims of widespread “content scraping,” promising enforcement action against unnamed sites engaged in the practice.)

New revenue opportunities for content creators

The real opportunity around News Registry was never in tracking and enforcement, but in helping to find new avenues for content distribution. As an analogy, in the retail world, the equivalent to content piracy is shoplifting and other forms of inventory loss from stores — a real problem, but generally not more than 1.5 percent of revenue and not worth more than that to prevent or reduce. A retailer’s first priority is increasing sales, by building new distribution channels, and the same should be true in the news publishing business, where content misuse is a minor revenue leak compared to the opportunities for broader distribution. AP’s clearinghouse is a big step in that direction.

The digital approach of most news publishers until now has been to seek to control their own distribution channels — their own websites, their own mobile apps, or individually negotiated syndication channels where they retain control. While successful in a few cases, that approach has generally limited access and revenue — for example, the average visitor to U.S. newspaper websites still spends only about one minute per day there, which is less than one percent of total online time.

In a more open system, content with appropriate tags for rights protection and payment provisions could travel the web (and the mobile world) in search of readers via multiple secondary channels, without the need for slow offline negotiation in every instance. The potential for piracy is still there, but the system can establish a network of publishers, aggregators and others who subscribe to the rights protocols for mutual benefit.

The clearinghouse concept grew out of research by Water Street Partners, a Washington D.C. joint venture consulting group engaged by NAA. According to Curley, “Water Street’s work concluded there was a business to be built on the AP’s News Registry work.” (Disclosure: As part of their work, Water Street’s Julian Bene interviewed me about my Lab ASCAP post.)

Kasi told me that Water Street “talked to a lot of people to independently check on various aspects of the things that were under development here or have been developed here or were being considered for development here, and to ratify the path on which we were going.”

For now, Kasi is in charge of the project. He serves as the AP’s chief legal counsel, but is also one of its chief strategists and will likely play a major role in shaping the clearinghouse. He defers answers to many questions about the planned entity’s design. For example, it might be nonprofit, or it might be for-profit: “There are models of clearinghouses that are similar constructs in other industries that have a variety of different structures — profit, nonprofit, non-stock companies, and so on. So there are different models and we’re in the process of analyzing each one of them to understand what drove a particular choice so we are better informed for our effort here.”

Kasi is careful to say that AP is not determining the ultimate governance arrangement, the operational details, or anything in between. Since AP is a content supplier itself, he said, “we thought that journalism would be better served by having an independent entity to provide some of these services rather than the AP.” Serving the greater good of journalism as a necessary ingredient in a democratic society is something Kasi referred to more than once — perhaps an indication that the thinking is leaning in the direction of a nonprofit setup.

While a story by AP reporters April Castro and Michael Liedtke, which was posted on AP’s corporate site, asserted that the clearinghouse would take a 20-percent cut of transactions, Kasi would not confirm that, saying that he was “not privy to the source” of that figure, and that “the number will be something that I think the market will determine.” Kasi also clarified that the clearinghouse will handle content across all platforms from web to mobile, contrary to a few reports that suggested it would focus on mobile only.

The clearinghouse will allow for experimentation on revenue models. It could “clear” payments based on a number of different models, with the method determined by the content originator, who might receive payment based on a share of advertising revenue, user payment revenue, or a royalty payment set by the publisher. Kasi said that ideally, the clearinghouse would provide the flexibility to allow market forces to determine which model would work best. Kasi agreed that a dynamic, real-time, variable pricing or bidding system, as suggested in my earlier post, is possible, but said he’d be concerned that “information may be in some instances essential to democracy, and you don’t want that to be subject to a bidding system that some people may be deprived because they can’t bid into that.” What he expects is a hybrid system that can support multiple pricing methods over time, but not necessarily “on the first day of operations.”
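The multiple clearing models described here — a share of advertising revenue, a share of user payments, or a publisher-set royalty — can be sketched as a toy settlement function. The 15 percent clearinghouse fee below is an arbitrary placeholder (Kasi declined to confirm any figure), and the model names are my own, not an actual protocol.

```python
# Toy sketch of clearinghouse settlement under originator-chosen models.
CLEARINGHOUSE_FEE = 0.15  # placeholder only; the real cut is undetermined

def settle(model: str, **terms) -> tuple:
    """Return (owner_payout, clearinghouse_cut) for one reuse of a story."""
    if model == "ad_share":          # share of the republisher's ad revenue
        gross = terms["ad_revenue"] * terms["owner_share"]
    elif model == "user_payment":    # share of reader payments collected
        gross = terms["user_revenue"] * terms["owner_share"]
    elif model == "flat_royalty":    # royalty amount set by the publisher
        gross = terms["royalty"]
    else:
        raise ValueError(f"unknown model: {model}")
    cut = round(gross * CLEARINGHOUSE_FEE, 2)
    return round(gross - cut, 2), cut
```

The point of the flexibility is exactly what Kasi describes: different originators can pick different models, and market forces sort out which ones survive.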

New ecosystem, new opportunities

I can see the clearinghouse spawning a wide range of new business opportunities, and Kasi (who calls it a “new ecosystem”) agrees: “The idea really is for the clearinghouse to bring that efficiency and the toolkit to everybody, regardless of scale, so that we can actually create some vibrant new packaging for example.” Among the possibilities I anticipate:

  • Larger, more robust aggregators of content streams like Daylife. This is also an opportunity for the AP itself, which is one of the reasons it wants the clearinghouse to be a separate organization. Channeling content flows through wholesaler portals of this kind helps ensure proper tracking of rights and payment obligations.
  • New “remixers” — aggregators and niche publishers who take advantage of the ability to publish full content units (stories, pictures, video, graphics) created by others but republished in new contexts, in new markets and to new audiences.
  • New “hyperpersonalized news streams,” created by semantic content-matching engines and presented in multiple formats on the web, as browser add-ons, and as apps. Some of these will be highly specialized enterprise solutions with subscription revenue models; others will target consumer interests such as sports, weather, cooking, recreation, style, entertainment, travel, pets, sci/tech, etc.
  • Personally or socially curated news channels could multiply and flourish by being able to supply full versions of news content rather than snippets.
  • Many new content-creation opportunities for publishers. The remixers and hyperpersonalized news applications can be seen as akin to the explosion in cable channels since the 1960s, which resulted in a huge increase in video production and consumption (certainly unanticipated, considering the main worry at the time was from movie theater owners who figured “pay TV” would steal their audience). Far more local info can be fed into the content pools available to remixers and hyperpersonalized apps, because as consumers spend more time with these content providers, they will look at more specialized niche content just as they do on cable. For example, newspaper publishers could add more video versions of their stories; they could publish more local statistical information; they could get more traction out of the backstories in their archives; they could create more content on local businesses and artists (perhaps sourcing this from freelancers who share in the ensuing revenue); they could cover events over a wider region; they could provide more specialized coverage of businesses in their area, etc.
  • Clearinghouses — there can be multiple clearinghouses, not just one, and they could become major businesses in their own right.
  • Clearinghouse optimization — the equivalent of SEO services: publishers could engage them to help maximize clearinghouse revenue by fine-tuning the rights and pricing parameters, just as there are specialists in Google and Facebook ad marketing for retailers.
  • Payment processing services — (assuming an eventual expansion beyond B2B and into business-to-consumer transactions) — this is a niche that most clearinghouses would outsource rather than do themselves, because of the complexities of interfacing with bank and credit card back-ends and later on with currency exchange issues.
  • Usage metrics — new kinds of distribution will require new kinds of metrics; an opportunity for existing as well as new metrics services.
  • Other service businesses would emerge or grow; for example: businesses that semantically tag content including audio and video as well as text and photographs so it can be fed into the system; advertising networks that focus on supplying local as well as national ads to the remixers and content streams, including real-time priced ads.
  • And the big unknowns: additional opportunities that are created as all of the above are impacted by the very rapid growth of mobile in all its forms, by location-aware services, by social couponing in all its forms, by the addition of item-level RFID tags to virtually all retail inventories (now beginning), the proliferation of QR codes (already saturating Asia), and the emergence of viable mobile payment systems using point-of-sale proximity sensors or bump technology — all of which could be ingredients in turbocharging a direct commerce layer on digital platforms.

Put all this together and there is no end to the content and commerce opportunities that are enabled when content can travel freely in search of consumers, with revenue flowbacks at multiple levels.

(A final disclosure: I am working with faculty members at the Missouri School of Journalism on opportunities to research, flesh out and develop some of the new opportunities around the clearinghouse concept.)

July 06 2010

14:00

The ASCAP example: How news organizations could liberate content, skip negotiations, and still get paid

Jason Fry suggested in a post here last week that current paywall thinking might be just a temporary stop along the way to adoption of “paytags — bits of code that accompany individual articles or features, and that allow them to be paid for.” But how? As Fry recognizes, “between wallet friction and the penny gap, the mechanics of paytags make paywalls and single-site meters look like comparatively simple problems to solve.”

I suggested a possible framework for a solution during a couple of sessions at the conference “From Blueprint to Building: Making the Market for Digital Information,” which took place at the University of Missouri’s Reynolds Journalism Institute June 23-25. Basically, my “what-if” consisted of two questions:

  1. What if news content owners and creators adopted a variation on the long-established ASCAP-BMI performance rights organization system as a model by which they could collect payment for some of their content when it is distributed outside the boundaries of their own publications and websites?
  2. And, taking it a step further, what if they used a variant of Google’s simple, clever, and incredibly successful text advertising auction system to establish sales-optimizing pricing for such content?

News publishers have been tying themselves in knots for the last few years deciding whether or not to charge readers for content, and if so, how much and in what fashion — micropayments, subscriptions, metered, freemium and other ideas have all been proposed and are being tested or developed for testing.

As well, publishers have complained about the perceived misuse of their content by aggregators of all stripes and sizes, from Google News down to neighborhood bloggers. They’ve expressed frustration (“We’re mad as hell and we are not going to take it anymore,” Associated Press chair Dean Singleton said last year), and vowed to go after the bandits.

But at the same time, many publishers recognize that it’s to their advantage to have their content distributed beyond the bounds of their own sites, especially if they can get paid for it. When radio was developed in the 1920s, musicians and music publishers recognized they would benefit from wider distribution of their music through the new medium, but they needed a way to collect royalties without each artist having to negotiate individually with each broadcaster.

A model from music

That problem was solved by using a non-profit clearinghouse, ASCAP (American Society of Composers, Authors and Publishers), which had been formed in 1914 to protect rights and collect royalties on live performances. Today the performance-rights market in the U.S. is shared between ASCAP, BMI (Broadcast Music Incorporated, founded by broadcasters rather than artists) and the much smaller SESAC (formerly the Society of European Stage Authors & Composers). Using digital fingerprinting techniques, these organizations collect royalties on behalf of artists whose works are performed in public venues such as restaurants and shopping centers as well as on radio and television stations and streaming services such as Pandora.

Publishers have put a lot of effort into trying to confine news content to tightly-controlled channels such as their own destination websites, designated syndication channels, apps, and APIs in order to control monetization via advertising and direct user payments. But when content moves outside those bounds, as it can very easily, publishers have no way to regulate it or collect fees — so they cry foul and look for ways to stop the piracy or extract payments from the miscreants.

Among the content-protection schemes, AP is rolling out News Registry, which it touts as a way of at least tracking the distribution of content across the web, whether authorized or not, and Attributor offers “anti-piracy” services by “enforcement experts” to track down unauthorized use of content. But for now, content misuse identified by these systems will require individual action to remove it or force payment. In the long run, that’s not a viable way to collect royalties.

Suppose, instead, that news publishers allowed their content to be distributed anywhere online (just as music can be played by any radio station) as long as it were licensed by a clearinghouse, similar to ASCAP and BMI, that would track usage, set prices, and channel payments back to the content creator/owner?

To do this, perhaps the paytags Fry suggested are needed, or perhaps publishers can learn from the music industry and use the equivalent of the digital fingerprints that allow ASCAP’s MediaGuide to track radio play. (The basic technology for this is around: AP’s News Registry uses hNews microtags as well as embedded pixels (“clear GIFs”); Attributor’s methodology is closer to the digital fingerprinting technique.)

How it could work

The system for broadcast and performance music payments is a three-way exchange consisting of (a) artists and composers, (b) broadcasters and performance venues, and (c) performance rights organizations (ASCAP and BMI).

In the news ecosystem the equivalents would be (a) content creators and owners, (b) end users including both individual consumers and “remixers” (aggregators, other publishers, bloggers, etc.); and (c) one or more content clearinghouses providing services analogous to those of ASCAP and BMI.

The difference between a news payments clearinghouse and the music industry model would be in scale, speed and complexity. In the news ecosystem, just as in the music world, there are potentially many thousands of content creators — but there are millions of potential end users, compared to a manageable number of radio stations and public performance venues paying music licensing fees. And there are far more news stories than musical units; they’re distributed faster and are much shorter-lived than songs. In the radio and public performance sphere, music content still travels hierarchically; that was true in the news business 20 years ago, but today news travels in a networked fashion.

To handle the exchange of rights and content in this vastly more complex environment, a real-time variable pricing model could be developed, benefiting both the buyers and sellers of content. Sellers benefit because with variable pricing or price discrimination, sales and revenue are maximized, since content goods are sold across the price spectrum to various buyers at the price each is willing to pay — think of the way airline seats are sold. Buyers benefit because they can establish the maximum price they are willing to pay. They may not be able to buy at that price, but they are not subject to the take-it-or-leave-it of fixed pricing.

When it comes to news content, a variable pricing strategy was suggested last year by Albert Sun, then a University of Pennsylvania student and now a graphics designer at The Wall Street Journal. (Sun also wrote a senior thesis on the idea called “A Mixed Bundling Pricing Model for News Websites.”) The graphs on his post do a good job showing how a price-discrimination strategy can maximize revenue; it was also the subject of one of my posts here at the Lab.

A well-known real-time variable pricing arrangement is the Google AdSense auction system, which establishes a price for every search ad sold by Google. Most of these ads are shown to users at no cost to the advertisers; they pay only when the user clicks on the ad. The price is determined individually for each click, via an algorithm that takes into account the maximum price the advertiser is willing to pay; the prices other advertisers on the same search page are willing to pay; and the relative “Quality Score” (a combination of clickthrough rate, relevancy and landing page quality) assigned to each advertiser by Google. It works extraordinarily well, not only for advertisers but for Google, which reaps more than $20 billion in annual revenue from it.
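The mechanism described above is, in simplified form, a generalized second-price auction: bidders are ranked by bid times quality score, and each pays just enough to keep their slot against the bidder below. The sketch below uses invented bids and quality scores and omits many real-world details of Google's system; it is only meant to show the shape of the logic.

```python
# A simplified generalized second-price (GSP) auction with quality
# scores, in the spirit of the ad auction described above.

def rank_and_price(bidders):
    """bidders: list of (name, max_bid, quality_score).
    Rank by bid * quality; each winner pays the minimum needed to
    outrank the next bidder down, plus a one-cent increment."""
    ranked = sorted(bidders, key=lambda b: b[1] * b[2], reverse=True)
    results = []
    for i, (name, bid, qs) in enumerate(ranked):
        if i + 1 < len(ranked):
            _, next_bid, next_qs = ranked[i + 1]
            price = (next_bid * next_qs) / qs + 0.01
        else:
            price = 0.01  # nominal reserve price for the last slot
        results.append((name, round(price, 2)))
    return results

ads = [("A", 2.00, 0.9), ("B", 1.50, 1.0), ("C", 1.00, 0.8)]
print(rank_and_price(ads))  # [('A', 1.68), ('B', 0.81), ('C', 0.01)]
```

Note that A bids the most but a lower quality score raises its price, while no one pays their full maximum — the property that makes truthful-ish bidding attractive.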

Smart economist needed

What’s needed in the news ecosystem is something similar, though quite a bit more complex. Like the Google auction, the buyer’s side would be simple: buyers (whether individuals or remixers such as aggregators) establish a maximum price they are willing to pay for a particular content product — this could be an individual story, video, or audio report, or it could be a content package, like a subscription to a topical channel. This maximum price is determined by an array of factors that will be different for every buyer, but may include timeliness, authoritativeness, relevance to the buyer’s interests, etc., and may also be affected by social recommendations or the buyer’s news consumption habits. But for the purposes of the algorithm, all of these factors are distilled in the buyer’s mind into a maximum price point.

The seller is the content creator or owner who has agreed to share content through the system, including having remixers publish and resell it. Sellers retain ownership rights, and share revenue with the remixer when a transaction takes place. The price that may be acceptable to a content owner/seller will vary (a) by the owner’s reputation or authority (this is analogous to Google’s assignment of a reputation score to advertisers), and (b) by time — since generally, the value of news content will drop quickly within hours or days of its original publication.
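The two seller-side variables named above — reputation and time since publication — could be modeled in many ways; one simple sketch is an exponential decay scaled by a reputation factor. All constants here are invented for illustration, not drawn from any actual pricing system.

```python
# A toy model of a seller's minimum acceptable price, declining with
# time from publication and scaled by a 0-1 reputation/authority factor.

def seller_floor(base_price, reputation, hours_since_publication,
                 half_life_hours=24.0):
    """Exponential decay: the floor halves every `half_life_hours`."""
    decay = 0.5 ** (hours_since_publication / half_life_hours)
    return base_price * reputation * decay

# A story from a high-reputation outlet, fresh vs. two days old:
print(round(seller_floor(1.00, 0.9, 0), 3))   # 0.9
print(round(seller_floor(1.00, 0.9, 48), 3))  # 0.225
```

The half-life would itself vary by content type — a scoop on a breaking story decays in hours, an evergreen enterprise piece much more slowly.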

The pricing algorithm, then, needs to take into account both the buyer’s maximum price point and the seller’s minimum acceptable price based on time and reputation; and at least two more things: (a) the uniqueness of the content — is it one of several content items on the same topic (multiple reports on an event from different sources), or is it a unique report not available elsewhere (a scoop, or an enterprise story) — and (b) the demand for the particular piece of content — is it popular, is it trending up, or has it run its course?
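One naive way to combine the four inputs just listed — buyer maximum, seller minimum, uniqueness, and demand — is sketched below. The weighting scheme is entirely hypothetical; this is exactly the "black box" that, as I argue below, a clever economist would need to design properly.

```python
# A hedged sketch: clear a price between the seller's floor and the
# buyer's ceiling, pulled upward for unique, in-demand content.

def clearing_price(buyer_max, seller_min, uniqueness, demand):
    """uniqueness, demand: factors in [0, 1]. If the buyer's ceiling
    is below the seller's floor, no sale occurs (returns None).
    Hot, unique content clears near the buyer's ceiling; commodity
    content clears near the seller's floor."""
    if buyer_max < seller_min:
        return None  # no transaction at any price
    weight = (uniqueness + demand) / 2.0
    return seller_min + weight * (buyer_max - seller_min)

print(round(clearing_price(0.50, 0.10, uniqueness=1.0, demand=0.8), 2))  # 0.46 (a scoop)
print(round(clearing_price(0.50, 0.10, uniqueness=0.2, demand=0.2), 2))  # 0.18 (commodity news)
print(clearing_price(0.05, 0.10, uniqueness=0.5, demand=0.5))            # None (no sale)
```

A real mechanism would of course be an auction across many buyers and sellers at once rather than a pairwise formula, but the pairwise version shows how the same story can legitimately clear at very different prices.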

The outcome of this auction algorithm would be that different prices would be paid by different buyers of the same content — in other words, sales would occur at many points along the demand curve as illustrated in Sun’s post, maximizing revenue. But it’s also likely that the system would establish a price of zero in many cases, which is an outcome that participating publishers would have to accept. And of course, many remixers would choose to offer content free and step into the auction themselves as buyers of publication rights rather than as resellers.

In my mind, the actual pricing algorithm is still a black box, to be invented by a clever economist. For the moment, it’s enough to say that it would be an efficient, real-time, variable pricing mechanism, maintained by a clearinghouse analogous to ASCAP and BMI, allowing content to reach end users through a network, rather than only through the content creator’s own website and licensees. Like ASCAP and BMI, it bypasses the endless complexities of having every content creator negotiate rights and pricing with every remixer. The end result would be a system in which content flows freely to end users, the value of content is maximized, and revenue flows efficiently to content owners, with a share to remixers.

Clearly, such a system would need a lot of transparency, with all the parties (readers, publishers, remixers) able to see what’s going on. For example, if multiple news sources have stories on the same event, they might be offered to a reader at a range of prices, including options priced above the reader’s maximum acceptable price.

Protecting existing streams

Just as ASCAP and BMI play no role when musicians sell content in uncomplicated market settings the musicians can control — for example, concert tickets, CD sales, posters, or other direct sales — this system would not affect pricing within the confines of the content owner’s own site or its direct licensees. But by enabling networked distribution and sales well beyond those confines, it has the potential to vastly increase the content owner’s revenue. And, the system need not start out with complex, full-blown real-time variable pricing machinery — it could begin with simpler pricing options (as Google did) and move gradually toward something more sophisticated.

Now, all of this depends, of course, on whether the various tentative and isolated experiments in content pricing bear fruit. I’m personally still a skeptic on whether they’ll work well outside of the most dominant and authoritative news sources. I think The New York Times will be successful, just as The Wall Street Journal and Financial Times have been. But I doubt whether paywalls at small regional newspapers motivated by a desire to “protect print” will even marginally slow down the inevitable transition of readers from print to digital consumption of news.

A better long-term strategy than “protect print” would be to move to a digital ecosystem in which any publisher’s content, traveling through a network of aggregators and remixers, can reach any reader, viewer or listener anywhere, with prices set efficiently and on the fly, and with the ensuing revenue shared back to the content owner. The system I’ve outlined would do that. By opening up new potential markets for content, it would encourage publishers to develop higher-value content, and more of it. The news audience would increase, along with ad revenue, because content would travel to where the readers, listeners or viewers are. Aggregators and other remixers would be incentivized to join the clearinghouse network. Today, few aggregators would agree to compensate content owners for the use of snippets. But many of them would welcome an opportunity to legitimately use complete stories, graphics and videos, in exchange for royalties shared with the content creators and owners.

Granted, this system would not plug every leak. If you email the full text of a story to a friend, technically that might violate a copyright — just like sharing a music file does — but the clearinghouse would not have the means to collect a fee (although the paytag, if attached, might at least track that usage). There will be plenty of sketchy sites out there bypassing the system, just as there are sketchy bars that have entertainment but avoid buying an ASCAP license.

But a system based on a broadly-agreed pricing convention is more likely to gain acceptance than one based on piracy detection and rights enforcement. Like ASCAP’s, the system would require a neutral, probably nonprofit, clearinghouse.

How could such an entity be established, and how would it gain traction among publishers, remixers and consumers? Well, here’s how ASCAP got started: It was founded in 1914 by Victor Herbert, the composer, who was well-connected in the world of musicians, composers, music publishers and performance venues, and who had previously pushed for the adoption of the 1909 Copyright Act. Herbert enlisted influential friends like Irving Berlin and John Philip Sousa.

Today, just as a few outspoken voices like Rupert Murdoch are moving the industry toward paywalls, perhaps a few equally influential voices can champion this next step, a pricing method and payments clearinghouse to enable publishers to reap the value of content liberated to travel where the audience is.

Acknowledgments/disclosures: The organizer of the conference where I had the brainstorm leading to this idea, Bill Densmore, has spent many years thinking about the challenges and opportunities related to networked distribution, payment systems, and user management for authoritative news content. A company he founded, Clickshare, holds patents on related technology, and for the last two years he has worked at the University of Missouri on the Information Valet Project, a plan to create a shared-user network that would “allow online users to easily share, sell and buy content through multiple websites with one ID, password, account and bill.” Densmore is also one of my partners in a company called CircLabs, which grew out of the Information Valet Project. The ideas presented in this post incorporate some of Densmore’s ideas, but also differ in important ways including the nature of the pricing mechanism and whether there’s a need for a single ID.

Photo by Ian Hayhurst used under a Creative Commons license.
