
July 28 2011

16:24

The post-post-CMS CMS: Loosely coupled monoliths & read-only APIs

Creative Commons photo of an 'old school' newspaper layout by limonada on Flickr

As I sat down with my colleagues on Tuesday for a little hack day on our favourite open-source content management system, we had a familiar conversation — one that is probably familiar to all people who hack on CMSs — which is, What is the future of content management?

It’s a conversation that has been unfolding and evolving for many, many years, but seems to be gaining a lot of steam in 2011.

The essence of some of the more recent conversation is nicely summarized over on Stijn’s blog.

One of the questions presented is: will tomorrow’s CMS be a monolithic app, or a ‘confederation’ of purpose-built micro-applications — for example, like the app that Talking Points Memo demonstrates for managing their front page.

In the blog post on the TPM media lab about the ‘Twilight of the CMS,’ they describe their solution to the problem of the monolithic CMS — “a simple and flexible API that digests manifold requests from the different applications.”

As I ponder these ideas in the context of my work for online-only publishers like TheTyee.ca, I struggle with a few competing notions…

Read-only API, or Read/Write API

In the case of TheTyee.ca, the monolithic CMS writes data to a data store (Elastic Search) that directly provides the (soon to actually be public) public API. From there, various applications can request the data from the API and receive a structured JSON representation of that data back as a response. Basically, once these clients have the data, they can do what they want.

This is great. It’s a read-only scenario. The CMS is still the ‘authority’ on the state and structure of the data (along with a lot of other information), but there is an identical copy of that data in the store that provides the API.
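To make the read-only scenario concrete, here is what a client’s side of that exchange might look like. The document below is a hypothetical story record; the field names are illustrative, not TheTyee.ca’s actual schema:

```python
import json

# A hypothetical JSON response from the read-only API; the field
# names (title, deck, teaser) are illustrative, not the real schema.
response_body = """
{
  "title": "Example Story",
  "deck": "A one-line summary under the headline",
  "teaser": "A short excerpt shown on index pages...",
  "published": "2011-07-28"
}
"""

story = json.loads(response_body)

# Once the client has its copy of the data, it can do what it wants:
headline = story["title"].upper()
print(headline)  # -> EXAMPLE STORY
```

The point is that the client only consumes: whatever it does with its copy never flows back, so the CMS remains the sole authority on the data.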

Now, let’s think about that TPM front page management app: it clearly needs read/write access to their API, because it can change not just layout, but editorial content like the story title, deck, teaser, and so on.

So, if the API is read/write, the questions I have are:

  • The schema for the documents (the stories, etc.) must be validated somewhere, right? So… does that logic live in each purpose-built app, or as a layer on top of the data store? And does that then violate a bit of the ‘Don’t Repeat Yourself’ principle?

  • Do these content-centric writes to the API make their way back to the CMS or editorial workflow system? And, if they don’t, does that not introduce some confusion about mis-matched titles, decks, teasers, and so on? For example, say I change the title of a story on the front page, but then I see a typo in the body of the story and want to fix that, so I go into the other CMS and search for … whoops! … what was the title of that story again?

  • How does this new ‘front page app’ or the read/write API handle typically CMS-y things like competing or conflicting write requests? Or version control? Or audit trails of who made which edits? If one, or the other, or both, actually handle these concerns, is this not a duplication of logic that’s already in the CMS?
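On the first question above, one way to avoid repeating the schema logic in every purpose-built app is a single validation layer in front of the data store that all writing clients share. A minimal sketch, with an invented schema:

```python
# A minimal validation layer that could sit in front of the data store,
# so every writing client shares one schema definition (DRY).
# The schema below is invented for illustration.
STORY_SCHEMA = {
    "title": str,
    "deck": str,
    "teaser": str,
}

def validate_story(doc):
    """Return a list of problems; an empty list means the doc is valid."""
    problems = []
    for field, expected_type in STORY_SCHEMA.items():
        if field not in doc:
            problems.append("missing field: %s" % field)
        elif not isinstance(doc[field], expected_type):
            problems.append("wrong type for: %s" % field)
    return problems

print(validate_story({"title": "Hello", "deck": "A deck", "teaser": "..."}))  # -> []
print(validate_story({"title": 42}))  # wrong title type, missing deck and teaser
```

Because the store rejects invalid documents itself, a front-page app, a mobile app, and the CMS can all write through the same gate without each re-implementing the rules.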

Perhaps I’m not thinking about this right, but my gut is saying that the idea of a read/write API — a scenario where you have both a CMS (Movable Type in TPM’s case) and a ‘front page management’ app — starts to get a bit tricky when you think about all the roles that the CMS plays in the day-to-day of a site like TPM.

It gets even more tricky when you think about all the delivery mediums that have their own ‘front page’ — tablet experiences, scaled down mobile experiences, feeds, e-newsletters, and so on.

Presentation management or editorial management

The other thing that is immediately striking about the TPM demo is the bizarrely print-centric feel to the experience — I’m immediately transported back to my (very brief) days working at The Varsity where the editors and designers would literally paste up the newspaper’s pages on big boards.

For a publication like the TPM — an entirely online ‘paper’ — it seems like an odd, slightly ‘retro,’ approach in an age that is defined by content that defies containers. One must ask: where does it end? Should there be a purpose-built app for each section’s front page, e.g., Sports, Arts, Life, etc.? For each regional section? For each-and-every article?

Isn’t this just vanity at some level? Endless bit-twiddling to make things look ‘just right’? Kinda like those mornings when I just can’t decide whether to wear a black shirt or a white shirt and stand in front of the mirror trying them on for what seems like eternity?

So, coming back to my point: in a time when many believe (like a religion!) that content and presentation should be separated — not as an exercise, but because that content is delivered to literally hundreds of different end-user experiences (phones, tablets, readers, etc.) — do we really want to be building tools that focus on the presentation for just one of those experiences? If so, where does it end?

For the most part, the modern-day CMS has been designed to alleviate these myriad challenges by providing a way for non-technical people to input structured data, and the tools for developers to output that structured data in a variety of ways, formats, and mediums.

Watching the TPM video gives me some ideas about how to improve the experience for an editor to quickly edit headlines, decks, teasers, photos of the morning’s stories — and even to indicate their relative priority in terms of newsworthiness — but I would want to stop there, at the editorial, and let the presentation layer be handled according to the medium, device, or experience the content is being delivered to.

Loosely coupled monoliths & read-only APIs

Many moons ago, I proposed that Weinberger’s Small Pieces Loosely Joined idea held true for content management also. The proposal was simple: instead of investing in one monolithic CMS — a CMS that did everything from content management to advertising delivery to comments to search to who-knows-what (a trend in CMS projects at the time) — an organization could choose the current ‘best in class’ solution for each need and connect them together through loose coupling. Then, if a better solution came out for, say, comments, the old system could be replaced with the newer system without having to re-build the whole enchilada.

(Of course, the flip side often is that loose coupling can feel like bubble gum and string when you have to work with it every day.)

So, while my own experience is that loose coupling is great, and that purpose-specific applications are usually better than apps that try to do everything, I would personally want to draw the line somewhere. For me, that line is between distinct ‘areas of responsibility,’ like editorial, advertising, design, community, search, and so on.

In this scenario, each area would have the authority over its own data, and the logic for how that data is structured and validated, and so on. If that data was written to a central data store that provided an API — something simple, flexible, and RESTful — the other apps in a ‘confederation’ could read from it, choose what to do with it, how to present it, and so on, but the final ‘say’ on that data would be from the app that is responsible for creating it.

For me, this is a sensible way to allow these apps to work in concert without having the logic about the data living in multiple places, i.e., the API, and the clients that can write to it (which makes sense if you’re Twitter with hundreds of external clients, but not if you’re one organization building exclusively internal client apps).
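That ‘final say’ rule can be sketched as a toy data store that accepts writes for each document type only from its owning app; every other app gets read-only access. The app and document-type names here are made up:

```python
# Toy data store where each document type has exactly one owning app
# with write authority; all other apps are read-only. Names are made up.
OWNERS = {"story": "editorial-cms", "ad": "ad-server"}

store = {}

def write(app, doc_type, doc_id, doc):
    """Accept a write only from the app responsible for this doc type."""
    if OWNERS.get(doc_type) != app:
        raise PermissionError("%s has no write authority over %s" % (app, doc_type))
    store[(doc_type, doc_id)] = doc

def read(doc_type, doc_id):
    """Any app in the confederation may read."""
    return store[(doc_type, doc_id)]

write("editorial-cms", "story", 1, {"title": "Hello"})
print(read("story", 1)["title"])  # -> Hello

try:
    write("front-page-app", "story", 1, {"title": "Rewritten"})
except PermissionError as e:
    print("rejected:", e)
```

A hypothetical front-page app could still read and re-arrange stories however it liked; it just couldn’t quietly rewrite the editorial record out from under the CMS.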


Would love to hear otherwise, or experiences of how others are handling this or thinking about the challenge.

June 19 2011

05:05

Not The Guardian - Web-first workflow with Google Docs, WordPress and InDesign integration? ... For free

Mediabistro :: The Bangor Daily News announced this week that it completed its full transition to open source blogging software, WordPress. And get this: The workflow integrates seamlessly with InDesign, meaning the paper now has one content management system for both its web and print operations. And if you’re ambitious enough, you can do it too — they’ve open-sourced all the code!

Continue to read Lauren Rabaino, www.mediabistro.com

Docs to WordPress to InDesign, video by William P.D., www.screenr.com

May 20 2011

14:30

This Week in Review: What Twitter does to us, Google News gets more local, and making links routine

Every Friday, Mark Coddington sums up the week’s top stories about the future of news.

Twitter on the brain: Last week, New York Times executive editor Bill Keller got a rise out of a lot of folks online with one of the shortest of his 21 career tweets: “#TwitterMakesYouStupid. Discuss.” Keller revealed the purpose of his social experiment this week in a column arguing, in so many words, that Twitter may be dulling your humanity, and probably making you stupid, too. Here’s the money quote: “But my inner worrywart wonders whether the new technologies overtaking us may be eroding characteristics that are essentially human: our ability to reflect, our pursuit of meaning, genuine empathy, a sense of community connected by something deeper than snark or political affinity.”

This, as you might imagine, did not go over particularly well online. There were a couple strains of reaction: Business Insider’s Henry Blodget and All Twitter’s Lauren Dugan argued that Twitter may indeed be changing us, but for the good, by helping make previously impossible connections.

Alexia Tsotsis of TechCrunch and Mike Masnick of Techdirt countered Keller by saying that while Twitter isn’t built for deep conversations, it is quite good at providing an entry point for such discussion: “What you see publicly posted on Twitter and Facebook is just the tip of the conversation iceberg,” Tsotsis said. GigaOM’s Mathew Ingram, meanwhile, defended Twitter’s true social nature, and sociologist Zeynep Tufekci gave a fantastic breakdown of what Twitter does and doesn’t do culturally and socially.

Two of the most eloquent responses were provided by Nick Bilton, one of Keller’s own employees, and by Gizmodo’s Mat Honan. Bilton pointed out that our brains have shown a remarkable ability to adapt quickly to new technologies without sacrificing old capacities. (Be sure to check out Keller’s response afterward.)

Honan made a similar argument: Keller, he said, is confusing the medium with the message, and Twitter, like any technology, is what you make it. “If you choose to do superficial things there, you will have superficial experiences. If you use it to communicate with others on a deeper level, you can have more meaningful experiences that make you smarter, build lasting relationships, and generally enhance your life,” Honan wrote.

Google gets more local with news: Google News unveiled a few interesting changes in the past week, starting with the launch of “News near you.” Google has sorted news by location for a while now, but this feature will allow smartphone users to automatically get local news wherever they are. ReadWriteWeb’s Dan Rowinski explained why newspapers should be worried about Google moving further onto their local-news turf, and GigaOM’s Mathew Ingram criticized newspapers for not coming up with something like this themselves.

Poynter’s Jeff Sonderman, on the other hand, said Google’s feature is still in need of some human curation to go with its algorithmic aggregation. That’s an area in which local newspapers can still dominate, he said, but it’ll require some technological catchup, as well as a willingness to get over fears about linking to competitors.

Another change, not publicized by Google News but spotted by the folks at Search Engine Land, was the addition of an option to allow users to filter out blogs and press releases from their results. This raised the question, what exactly does Google consider a blog? Google told Search Engine Land it relies on a variety of factors to make that decision, especially self-identification. Mathew Ingram ripped this classification, and urged Google to put everything that contains news together in Google News and let readers sort it out. (Former Lab writer Zach Seward wrote about the problems with Google News’ blog label back in 2009.)

Fitting linking into news’ workflow: A discussion about linking has been simmering on Twitter on and off over the past few weeks, and it began to come together into something useful this week. This round of the conversation started with a post by web thinker and scholar Doc Searls, who wondered why news organizations don’t link out more often. In the comments, the Chicago Tribune’s Brian Boyer suggested that one reason is that many newspapers’ CMS’s and workflows are print-centric, making linking logistically difficult.

CUNY j-prof C.W. Anderson responded that the workflow issue isn’t much of an excuse, saying, as he put it on Twitter: “At this point ‘linking’ has been around for twenty years. The fact that this is STILL a workflow issue is almost worse than not caring.” This kicked off a sprawling debate on Twitter, aptly chronicled via Storify by Mathew Ingram and Alex Byers. Ingram also wrote a post responding to a few of the themes of resistance of links, particularly the notion that information on the web is inferior to information gained by old-fashioned reporting.

British journalist Kevin Anderson took on the workflow issue in particular, noting how outdated many newspaper CMS’s are and challenging them to catch up technologically: “It’s an industrial workflow operating in a digital age. It’s really only down to ‘that’s the way we’ve always done it’ thinking that allows such a patently inefficient process to persist.” Publish2’s Scott Karp gave an idea for a solution to the CMS mess.

AOL’s continued makeover: Another week, another slew of personnel moves at AOL. PaidContent’s David Kaplan reported that AOL is hiring “a bunch” of new (paid) editors and shuffling some current employees around after its layoff of hundreds this spring. Overall, Kaplan wrote, this is part of the continued effort to put the Huffington Post’s stamp on AOL’s editorial products.

One of the AOL entities most affected by the shifts is Seed, which had been a freelance network, but will now fall under AOL’s advertising area as a business-to-business product. Saul Hansell, who was hired in 2009 to run Seed, is moving to HuffPo to edit its new “Big News” features. In a blog post, Hansell talked about what this means for HuffPo and for Seed.

Meanwhile, the company is also rolling out AOL Industry, a set of B2B sites covering energy, defense, and government. But wait, that’s not all: AOL’s Patch is launching 33 new sites in states targeting the 2012 election. The hyperlocal news site Street Fight also reported that Patch is urging its editors to post more often, and a group of independent local news sites is banding together to tell the world that they are not Patch, nor anything like it.

Reading roundup: As always, plenty of other stuff to get to this week.

— We mentioned a Pew report’s reference to the Drudge Report’s influence in last week’s review, and this week The New York Times’ David Carr marveled at Drudge’s continued success without many new-media bells and whistles. Poynter’s Julie Moos looked at Drudge’s traffic over the years, while the Washington Post disputed Pew’s numbers. ZDNet’s David Gewirtz had five lessons Drudge can teach the rest of the media world.

— A few paid-content items: A Nielsen survey on what people are willing to pay for various mobile services, Poynter’s Rick Edmonds on The New York Times’ events marketing for its pay plan, and the Lab’s Justin Ellis on paid-content lessons from small newspapers.

— A couple of tablet-related items: Next Issue Media, a joint effort of five publishers to sell magazines on tablets, released its first set of magazines on Google Android-powered Samsung Galaxy. And here at the Lab, Ken Doctor expounded on the iPad as the “missing link” in news’ digital evolution.

— Columbia University announced it will launch a local news site this summer focusing on accountability journalism, and the Lab’s Megan Garber gave some more details about what Columbia’s doing with it.

— The Columbia Journalism Review’s Lauren Kirchner had an interesting conversation with Slate’s David Plotz about Slate’s aggregation efforts, and in response, Reuters’ Felix Salmon made the case for valuing aggregation skills in journalists.

— This weekend’s think piece is a musing by Maria Bustillos at The Awl on Wikipedia, Marshall McLuhan, communal knowledge-making, and the fate of the expert. Enjoy.

May 11 2011

06:17

Carnival of Journalism: Life hacks and how to rock your journalism information workflow

Greetings Carnies!
For this installment of the Carnival of Journalism we’re going to go ultra practical:

What are your life hacks, workflows, tips, tools, apps, websites, skills and techniques that allow you to work smarter and more effectively?

As a recovering RSS-aholic (my Google Reader account peaked around 2,100 feeds about a year and a half ago; I’ve pared it down to 931 currently and am looking to cut that by half this summer), I’ve always marveled at people like Robert Scoble, who seems to be everywhere and tracking everything. Part of this is because he’s an information hound, part because he’s a social media addict, and part because it’s his job to be out there in the conversation with the tech industry. Tim Ferriss interviewed him four years ago about his 600+ feeds and how he digs through them for good information.

In my effort to cull my RSS feeds, I’ve relied much more on social networks for curation, but in that transition I realized I was doing it wrong, again. This winter, while meeting with a group of news nerds to talk about their workflows, I learned that most of them read only a very small portion of their Twitter alerts. At that time, I was close to reading around 70–80% (obviously that fluctuated, but on the average day I’d hit that number or higher); almost everyone else in the room was in the 5–15% range.

So during 2011, I’ve tried to focus on finding more tools and techniques to help boost productivity and save time, while not compromising the quality of the information or work produced. Everyone has different ideas on what makes their workflow work, and while a site like Lifehacker.com does a fantastic job, journalists especially manage and filter a lot of information every day, so it would be fascinating to share some of our best practices with the JCarn community.

So for instance, what tools, plugins, apps and websites do you use to get the most out of the day?
For example, here are a few that I’ve tried at various times:

What work techniques and strategies have you learned over the years that help boost your productivity and effectiveness?
More examples of things I’ve tried to get you thinking:

Other ideas?

Our deadline for publishing will be Friday, June 10th. I hope we can all help each other become better, more productive and informed journalists.

March 25 2011

23:02

Photogene and iPad 2: Great tools for photojournalists

Sitting in a lawn chair outside the Spokane Apple Store last week, I pondered the absurdity of my week-long quest to buy an iPad 2. Arriving at 5 a.m. netted me the sixth spot in line and an eventual 16-gig wifi slate of glass and aluminum.

Did I really need another digital device to supplement all the other Apple products that grace my home and workspace? No, of course not. But using the iPad 2 this past week has made me giddy with excitement as I discover one new feature or application after another. It’s interesting, when I demonstrate to people who have never seen or touched one, how utterly amazed they are. Suffice to say this multimedia device is smokin’ hot. There are enough glowing reviews on the Web that I don’t need to pontificate much more.

A great tool for photojournalists

The one thing I really wanted to do with my iPad 2 was edit and send photos from the field back to the newspaper. I couldn’t find much info from other photojournalists about what applications would help me replace Photo Mechanic and Photoshop on my laptop. Nor could I find anyone who was using the iPad to send their photos via FTP (File Transfer Protocol) back to their newspapers. I can happily announce that during my first photo assignment today I did just that.

My first stop last week was the Apple iPad App Store, where I found an amazing little program called Photogene. It allows me to crop, tone, caption and send my photos, all from a three-dollar application. The best part is that it has a built-in FTP client, so I can send my photos directly into our Merlin archive system.

Here was my workflow today:

  • Shot a photo of a woman in a job-training program working in the kitchen of a restaurant.
  • Ordered lunch, sat down at a table and plugged the Apple camera connection cable into the iPad and the USB port on my Nikon D3s. The iPad immediately displayed all the .jpg files in its photo browser. Touching a photo marks it, so you don’t have to bring in every image on your card. I hit “Import Selected” and the files were quickly downloaded from the camera.
  • I opened Photogene and selected the photo I wanted to edit. The workflow is super simple: I cropped my photo, then toned the image. Toning is done using sliders for exposure, color temperature, saturation, etc. There are a ton of other adjustments, from noise reduction to selective color channels. It even has a digital noise filter and a curves adjustment tool.
  • On to the metadata tab: I clicked “IPTC” and added caption info, then filled out the other metadata fields that are needed to archive the photo for later.
  • Finally, I hit the export button and chose “FTP” from the menu (you can also send directly to Facebook, Twitter, Flickr or email). I already have all the info, such as the IP address and password, stored, so I just added the file name (make sure there are no spaces) and uploaded the photo using my AT&T MiFi. A minute later it was ready for an editor to move to the desk.

Some observations

Will the iPad 2 replace a laptop? Probably not. I think the iPad is perfect if you need to move a couple of photos from your car during a breaking news event. It’s not ideal for slogging through 300 photos from a high school basketball game.

You need to buy the Camera Connection Kit from Apple ($30.00), which includes an SD card reader and an Apple connector-to-mini-USB cord. I wish there was a CF card reader, but the cable works as advertised.

Typing a caption is easy, but it is all on one line that gets obscured as you type past the field boundary. A bigger caption field for photojournalists is a must have.

Get the PhotoSync application ($1.99). It lets you transfer photos to and from your iPhone, computer and iPad wirelessly. It also lets you bypass the iTunes software, which is not really intended for photos.

I also bought the pro upgrade for eight dollars. It adds a few more things that professionals need such as applying star ratings, adding personal watermarks to exported images, saving your FTP settings, adjusting RGB curves individually, and controlling JPEG export settings.

If any other photojournalists are using an iPad to edit photos please share your experiences in comments below!


December 16 2010

20:47

What's your newsroom's bus factor?

I've been using this phrase (and committing the underlying sin) a lot lately. Some background:

In software development, a software project's bus factor is an irreverent measurement of the concentration of information in a single person, or very few people. The bus factor is the total number of key developers who would need to be incapacitated (say, by getting hit by a bus) to send the project into such disarray that it would not be able to proceed.

Or, in my case, maybe you decide to take an off-the-grid honeymoon for a couple weeks.

In your newsroom, how many projects will stop running if one person quits, goes on vacation or is in some other way "hit by a bus"? What have you done to raise your bus factor?
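As a back-of-the-envelope exercise, you can estimate this by listing who holds the essential knowledge for each project; any project that depends on a single name already has a bus factor of one. A small sketch with invented projects and people:

```python
# Back-of-the-envelope bus factor: for each project, how many people
# would have to be "hit by a bus" before it stalls? Data is invented.
knowledge = {
    "election-map": {"alice"},
    "crime-database": {"alice", "bob"},
    "cms-templates": {"bob", "carol", "dave"},
}

def bus_factor(people_per_project):
    """The smallest number of people whose loss stalls at least one project."""
    return min(len(people) for people in people_per_project.values())

print(bus_factor(knowledge))  # -> 1 (only alice knows the election map)

# Projects that would stop the moment one person leaves:
at_risk = [name for name, who in knowledge.items() if len(who) == 1]
print(at_risk)  # -> ['election-map']
```

Raising the number is then a matter of documentation and pairing until no project's set has only one name in it.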

September 27 2010

02:52

Contents of a journalist’s backpack

Neerav Bhatt describes himself as a professional blogger, photographer, geek and qualified librarian. Okay, so he never says “journalist,” but if you read his post that accompanied the photo below, I think you’ll forgive my headline.

He’s got some interesting choices (which he explains in his post) and gives some very practical advice too. I have a similar Asus netbook, and I concur that you can’t beat it for long battery life! The screen resolution is fantastic too.

Thanks to @jayrosen_nyu for linking to Neerav’s post!

August 16 2010

19:19

A "Joel Test" for Coders in the Newsroom?

A while back, someone, somewhere (probably on Twitter) pointed me to the Joel Test for software shops.

After reading it a few times, I got to thinking we could really use something like this for newsrooms that have, or need, programmers. But what to include? Here are a few initial thoughts:

  1. Do you use version control?
  2. Do coders come into regular contact with reporters and editors?
  3. Can reporters and editors easily and accurately describe what coders do?
  4. Do you have a development environment and access to servers?
  5. Do coders have the software, hardware and admin access necessary to build things?
  6. Do you test? (unit tests? benchmarks? usability?)
  7. Do you debrief?

What else would you add?

(A lot of this comes from Brian Boyer's excellent list of best practices, which you should also read.)

May 04 2010

16:02

Looking at jQuery for visual journalism

With all this talk about the so-called death of Adobe Flash, the future of HTML5, etc., I thought I should take a closer look at jQuery. This post is intended to give you an overview and help you decide whether you too should take a closer look.

My first thought is that if you have weak skills in CSS (or no CSS skills at all), you can’t even think about using jQuery. You would need to improve your CSS skills before you tackled jQuery.

With that out of the way (sorry if that ruined your day), let’s note that:

  • jQuery is JavaScript.
  • jQuery is free and not a commercial product.
  • The home and source of jQuery is jQuery.com. You can download it there.

As an introduction, I really liked this: jQuery Tutorials for Designers. It shows you what jQuery makes possible on today’s Web pages, and even if you don’t want to look at the code, you can open each of the 10 examples and click and see what it does. So in about 15 minutes, you will have a better idea about jQuery’s usefulness.

This example is my favorite: Image Replacement. It’s simple, and it’s really easy to apply this to all kinds of visual journalism situations that an online designer might encounter.
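To give a flavor of what that image-replacement tutorial covers, here’s a minimal sketch of the rollover pattern in jQuery. The `img.rollover` class and the `_over` filename convention are my own illustrative assumptions, not taken from the tutorial itself:

```javascript
// Derive a rollover filename from a source path: "btn.png" -> "btn_over.png".
function hoverSrc(src) {
  return src.replace(/(\.[a-z]+)$/i, "_over$1");
}

// Swap each rollover image on hover; runs only where jQuery and a DOM exist.
function initRollovers($) {
  $("img.rollover")
    .on("mouseenter", function () { this.src = hoverSrc(this.src); })
    .on("mouseleave", function () { this.src = this.src.replace("_over", ""); });
}

if (typeof jQuery !== "undefined") {
  jQuery(function () { initRollovers(jQuery); });
}
```

The same few lines adapt easily to the visual-journalism uses mentioned above, e.g. swapping a locator map for a detail map when a reader hovers over it.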

Many of the other examples are things I wouldn’t bother to do on Web pages, even though they look cool. I was reminded of how a lot of people are saying Flash is unnecessary because you can do all the menu effects and flyovers with JavaScript instead. These examples prove that. Of course, my view of Flash is not to use it for eye candy (like most of these jQuery examples), but instead to use it for complex explanatory journalism, like this.

For a very nice slideshow built with jQuery, see this tutorial: Create a Slick and Accessible Slideshow Using jQuery.

There’s also a nifty jQuery plug-in for making a slideshow: Coda-Slider (thanks to Lauren Rabaino for that link!).

Here’s another good tutorial for a slideshow: Automatic Image Slider w/ CSS & jQuery.

For the geeks among you, read why you should link to Google’s copy of jQuery instead of using a version on your own Web host.

And finally, the ever-helpful Chrys Wu (@MacDivaONA) recommended these free video tutorials for learning jQuery.

May 03 2010

22:26

Where/How do you store code snippets?

Anyone have any favorite ways of storing bits of code that you might want to incorporate at a later time?

I'm wondering what those in this group do (what your process is) when you're working on something that isn't quite ready for the formality of version control.

Scratch pad in text editor? Private repository? Whiteboard in the cloud? Commented-out code until cleanup?

Example scenario:
You're working on a project and decide you might eventually need some code that will provide cool Feature A. You create a chunk of Feature A. It's not quite ready for prime time and unnecessary for the project launch, but it could lead to something good. Where/how would you store the Feature A code?

A little further along, you come up with Feature B, which could be used in several other projects. Maybe. You put time and effort into Feature B, though, so you want to save the code, but again, not in a project. Where/how would you store it?

TIA.


March 19 2010

20:12

Resurrecting Unstructured Data to Help Small Newspapers

Unstructured data is typically said to account for up to 80 percent of the information stored on business computer systems. While this is a widely accepted notion, I'm inclined to agree with Seth Grimes that the 80 percent figure is inflated, depending on the type of business. Still, if we could structure even a fraction of that data, it would create significant value for small newspapers.

The type of data that has my attention is free-form text. Small newspapers in particular have computers full of text files containing information about their communities. Often, these files lie dormant, left on the hard drive of a dusty computer somewhere in the back of the newsroom, inaccessible to the public. Compounding this problem is the fact that newspapers realize no additional value from content they paid journalists to produce. The information is gathered, and then much of it sits somewhere, unused and untouched. Only parts of it end up being published.

To understand the potential of resurrecting unstructured data, one must first understand the workflow of a traditional small newspaper.

Newspaper Workflow

It surprised me several years ago when I learned that most community newspapers use a very low-tech workflow for managing their data. A typical newspaper might organize its content in hierarchical folders, as in the example below. Files are grouped by month, then named with the day of publication:

[Image: newsroom.png — a folder tree grouped by month, with files named by day of publication]

The workflow is simple and effective, and it has served its purpose for many years. Once a file's publication date has passed, it is ignored forever. At best, a selection of these files is copied and pasted into a content management system for publication online, but this seldom happens until after the newspaper's print edition has been completed. At that point, the newspaper has little incentive to process the files further, as attention must be focused on the next day's edition.

This reality helps illustrate the potential for the CMS Upload Utility, my Knight News Challenge project. It's an inexpensive way to move text files into a web-accessible database. Once inside a database, possibilities abound for how value can be created from this data. In my next post, I'll share several sample use cases to help explain how the application works.
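To make the idea concrete, here is a small sketch of the kind of first step such a utility might take: turning a path from the month-folder naming convention described above into a structured record ready to load into a database. The path pattern and field names here are my own illustrative assumptions, not the actual schema of the CMS Upload Utility:

```javascript
// Map month-folder names to month numbers.
const MONTHS = ["January", "February", "March", "April", "May", "June",
                "July", "August", "September", "October", "November", "December"];

// Parse a path like "2010/March/19.txt" plus its text into a structured record.
// Returns null for files that don't follow the naming convention.
function recordFromPath(path, text) {
  const m = path.match(/(\d{4})\/([A-Za-z]+)\/(\d{1,2})\.txt$/);
  if (!m) return null;
  const month = MONTHS.indexOf(m[2]) + 1;
  if (month === 0) return null; // unrecognized month folder
  return {
    publishedOn: `${m[1]}-${String(month).padStart(2, "0")}-${m[3].padStart(2, "0")}`,
    body: text,
    sourcePath: path,
  };
}
```

Once every file has passed through a function like this, the dormant text becomes queryable by date, which is exactly the kind of structure the dusty back-of-the-newsroom archive lacks.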

For now, though, think about all of that unstructured data, and how we can make better use of it.


January 21 2010

18:14

Mastering Multimedia useful tips roundup


Many of my old posts with tips on video storytelling and audio slideshows get linked on blogs used by college professors who teach digital media classes. Most of these posts are buried amongst my pontifications about the changes facing the newspaper industry. So for anyone interested, here is a roundup of my best multimedia suggestions and useful-tip posts in one place…

How to make your audio slideshows better

Great audio starts in the field

How best to approach a video story

Sequencing: The foundation of video storytelling

How to make your video editing easier

Get creative with your video camera

Opening your video: How not to lose viewers

Random Final Cut tip: Lower thirds titles

What we can learn from TV news shooters
