
January 06 2011

19:06

LocalWiki: Laying the Groundwork

A few of you have been wondering what we've been up to since our Kickstarter pledge drive ended, so we want to give you a quick update on our Knight-funded project, LocalWiki. For those of you who are more technically inclined, we also hope to provide some insight into these early stages of our process.

To follow our updates in the future, please sign up with your email address at http://localwiki.org, follow us on Twitter at http://twitter.com/localwiki, or follow our blog directly. Or if you're a huge geek, join us on IRC in Freenode's #localwiki.

Right now, we are ramping up development of the wiki software that will provide the platform for all of our pilot projects. Starting in October, my partner Mike Ivanov and I have been working out of our awesome coworking office in San Francisco (shout out to NextSpace) and laying the groundwork for this new platform.


Making software that lasts

This may not make much sense unless you're a techie, but here are some details about what's going on:

Our initial focus at this stage is to build a set of reusable Django apps that will provide the core functionality of extensible, easy-to-use wiki software: making it straightforward to edit a page, to track and work with revisions of pages and other objects, and to let people compare those revisions to see what's changed. We will then use these components to build the first functional iteration of our wiki software. The benefits of this approach are that it helps us focus on each aspect separately, will help developers in the Django community understand and contribute to our code, and makes it possible for other projects and organizations to use only the parts they find useful. Software only survives if many people actively use it, and we want to ensure our software a long and happy life.
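To make that concrete, here is a minimal, framework-free sketch of the track-revisions-and-compare piece (plain Python with difflib; the class and method names are mine for illustration, not actual LocalWiki code, and the real components will of course be Django apps backed by the database):

```python
import difflib

class PageHistory:
    """Toy revision store: keeps every saved version of a page's text."""

    def __init__(self):
        self.revisions = []

    def save(self, text):
        """Record a new revision and return its revision number."""
        self.revisions.append(text)
        return len(self.revisions) - 1

    def diff(self, old, new):
        """Return a unified diff between two revision numbers, so
        people can see what's changed between versions."""
        return list(difflib.unified_diff(
            self.revisions[old].splitlines(),
            self.revisions[new].splitlines(),
            lineterm=""))

history = PageHistory()
history.save("Davis has a toad tunnel.")
history.save("Davis has a famous toad tunnel.")
print(history.diff(0, 1)[-1])  # the '+' line: how the text reads in the newer revision
```

The real apps will handle arbitrary objects rather than bare strings, but the edit/track/compare cycle above is the core loop each reusable app covers one slice of.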

Next Few Months, Roughly Speaking

Until February: Core software. We will create the central components of the wiki software and put them together into something that will enable folks to start creating awesome content. We unfortunately have to work out some legal issues around licensing before we can easily accept outside code contributions.

As soon as our licensing issues are resolved, we'll send out an update with information about how to get involved with the development process. We hope the licensing issues will be resolved in the next couple of weeks. Even then, it may be difficult for outside developers to get involved because core bits and pieces will be moving and changing at a rapid rate.

February-April: Focus on features. We will push heavily to involve more outside developers to help make our software awesome and get some initial user feedback. If you are a developer interested in helping, this will be the best time for you to get involved because we will have somewhat solidified our development processes and underlying, core software. We will also need help with and feedback about the software from a higher level (e.g. feature requests).

April and beyond: Pilot communities, educational materials, community outreach. With the wiki platform largely built, we can start new pilot projects and educating potential users about building successful local projects. At this stage we will need all the help we can get from you to select pilots, write helpful guides, submit bug reports, and develop a model for communities to follow.


December 30 2010

18:35

Lessons Learned from ReportingOn

In 2008, I was awarded a Knight News Challenge grant to build ReportingOn, a back channel for beat reporters to share ideas, information, and sources. The goal of the project was to provide journalists of all stripes with a place to talk about content -- not craft, or process, or skillset.

I taught myself enough Django -- and sought out advice from friends and co-workers with little regard for their interest or priorities -- to launch the first iteration of the site in October 2008. In July 2009, with fresh design and development from the team at Lion Burger, ReportingOn 2.0 launched.

And almost immediately, I stepped away from it, buried in the responsibilities of my day job, family, and other projects. To grow and evolve, and really, to race ahead of the internal and external communication tools already available to reporters, ReportingOn needed far more time, attention, and dedication than I could give it.

Yesterday, I shut down ReportingOn.

In its last state, it only cost a few bucks a month to maintain, but it has more value at this point as a story, or a lesson, or a piece of software than it has as a working site.

To head off a couple questions at the pass:


  1. No, you can't export your questions or answers or profile data. None of you have touched the site in about a year, so I don't think you're that interested in exporting anything. But if you're some sort of web packrat that insists, I have the database, and I can certainly provide you with your content.

  2. Yes, the source code for the application is still available, and you're more than welcome to take a stab at building something interesting with it. If you do, please feel free to let me know.


And a few recommendations for developers of software "for journalists":

  • Reporters don't want to talk about unpublished stories in public.

  • Unless they're looking for sources.

  • There are some great places on the Internet to find sources.

  • When they do talk about unpublished stories among themselves, they do it in familiar, well-lit places, like email or the telephone. Not in your application.

  • Actually, keep this in mind: Unless what you're building meets a very journalism-specific need, you're probably grinding your gears to build something "for journalists" when they just need a great communication tool, independent of any particular niche or category of users.


As for the problem ReportingOn set out to solve, it's still out there.

Connecting the dots among far-flung newsrooms working on stories about the same issue is something that might happen internally in a large media company, or organically in the wilds of Twitter, but rarely in any structured way that makes it easy to discover new colleagues, peers, and mentors. Sure, there are email lists, especially for professional associations (think: SEJ) that act as back channels for a beat, but not enough, and not focused on content.

(Prove me wrong, kids. Prove me wrong.)

As for me, I'm working on another (even) small(er) Knight-funded side project a few minutes at a time these days. Watch for news about that one in the coming weeks.

July 22 2010

19:08

Some other online innovators for some other list

Journalism.co.uk have a list of this year’s “leading innovators in journalism and media”. I have some additions. You may too.

Nick Booth

I brought Nick in to work with me on Help Me Investigate, a project for which he doesn’t get nearly enough credit. It’s his understanding of and connections with local communities that lie behind most of the successful investigations on the site. In addition, Nick helped spread the idea of the social media surgery, where social media savvy citizens help others find their online voice. The idea has spread as far as Australia and Africa.

Matt Buck and Alex Hughes

Matt and Alex have been busily reinventing news cartoons for a digital age with a number of projects, including Drawnalism (event drawing), animated illustrations, and socially networked characters such as Tobias Grubbe.

Pete Cashmore

Mashable.

Tony Hirst

Tony has been blogging about mashups for longer than most at OUseful.info, providing essential help for journalists getting to grips with Yahoo! Pipes, Google spreadsheets, scraping, and – this week – Google App Inventor.

Adrian Holovaty and Simon Willison

I’m unfairly bunching these two together because they were responsible – with others – for the Django web framework, which has been the basis for some very important data journalism projects including The Guardian’s experiment in crowdsourcing analysis of MPs’ redacted expenses, and Holovaty’s Everyblock.

Philip John

Philip is behind the Lichfield Blog and, equally importantly, Journal Local, a platform for hyperlocal publishers that comes with a raft of useful plugins pre-installed. He also runs the West Midlands Future of News Group.

Christian Payne

Documentally has been innovating and experimenting with mobile journalism for years in the UK, with a relaxed-but-excitable on-screen/on-audio presence that suits the medium perfectly. And he really, really knows his kit.

Meg Pickard

Meg is an anthropologist by training, a perfect background for community management, especially when combined with blogging experience that pre-dates most in the UK. The practices she has established on the community management front at The Guardian’s online operations are an exemplar for any news organisation – and she takes lovely photos too.

Chris Taggart

Chris has been working so hard on open data in 2010 I expect steam to pour from the soles of his shoes every time I see him. His ambition to free up local government data is laudable and, until recently, unfashionable. And he deserves all the support and recognition he gets.

Rick Waghorn

One of the first regional newspaper reporters to take the payoff and try to go it alone online – first with his Norwich City website, then the MyFootballWriter network, and more recently with the Addiply self-serve ad platform. Rick is still adapting and innovating in 2010 with some promising plans in the pipeline.

I freely admit that these are based on my personal perspective and knowledge. And yes, lists are pointless, and linkbait.

June 02 2010

20:42

Why Journalists Should Learn Computer Programming

Yes, journalists should learn how to program. No, not every journalist should learn it right now -- just those who want to stay in the industry for another ten years. More seriously, programming skills and knowledge enable us traditional journalists to tell better and more engaging stories.

Programming means going beyond learning some HTML. I mean real computer programming.

As a journalist, I'm fully aware of the reasons why we don't learn programming -- and I'm guilty of using many of them. I initially thought there were good reasons not to take it up:

  • Learning to program is time-consuming. One look at the thick books full of arcane code and you remember why you became a journalist and not a mathematician or an engineer. Even if you are mathematically inclined, it's tough to find the time to learn all that stuff.
  • Your colleagues tell you you don't need it -- including the professional developers on staff. After all, it took them years of study and practice to become really good developers and web designers, just like it takes years for a journalist to become experienced and knowledgeable. (And, if you start trying to code, the pros on staff are the ones who'll have to clean up any mess you make.)
  • Learning the basics takes time, as does keeping your skills up to date. The tools change all the time. Should you still bother to learn ActionScript (Flash), or just go for HTML5? Are you sure you want to study PHP and not Python?
  • Why learn programming when there are so many free, ready-made tools online: Quizzes, polls, blogs, mind maps, forums, chat tools, etc. You can even use things like Yahoo Pipes to build data mashups without needing any code.
  • When Megan Taylor wrote for MediaShift about the programmer-journalist, she asked around for the perfect skillset. One response nearly convinced me to never think about programming ever again: "Brian Boyer, a graduate of Medill's journalism for programmers master's track and now News Applications Editor at the Chicago Tribune, responded with this list: XHTML / CSS / JavaScript / jQuery / Python / Django / xml / regex / Postgres / PostGIS / QGIS."

Those are some of the reasons why I thought I could avoid learning programming. But I was so wrong.

Why Journalists Should Program

You've heard the reasons not to start coding. Now here's a list of reasons why you should:

  • Every year, the digital universe around us becomes deeper and more complex. Companies, governments, organizations and individuals are constantly putting more data online: Text, videos, audio files, animations, statistics, news reports, chatter on social networks... Can professional communicators such as journalists really do their job without learning how the digital world works?
  • Data are going mobile and are increasingly geo-located. As a result, they tell the stories of particular neighborhoods and streets and can be used to tell stories that matter in the lives of your community members.
  • People have less time, and that makes it harder to grab their attention. It's essential to look for new narrative structures. Programming enables you to get interactive and tell non-linear stories.


  • You don't have to build everything from scratch. Let's take JavaScript, which is used for creating dynamic websites. Tools such as jQuery, a cross-browser JavaScript library, enable people to create interactivity with less effort. Web application frameworks such as Ruby on Rails and Django support the development of dynamic sites and applications. So it can be easier than you thought.

A Way of Looking At the World

Maybe you're not yet convinced. Even though jQuery makes your life easier, you still need a decent knowledge of JavaScript, CSS and HTML. Django won't help you if you never practiced Python. All of this takes time, and maybe you'll never find enough of it to get good at all this stuff.

Still, we must try. The good news is that it doesn't matter if you become proficient at the latest language. What is important, however, is that you're able to comprehend the underpinnings of programming and interactivity -- to be able to look at the world with a coder's point of view.

I'm still just a beginner, but I feel that this perspective provides you with an acute awareness of data. You start looking for data structures, for ways to manipulate data (in a good sense) to make them work for your community.

When covering a story, you'll think in terms of data and interactivity from the very start and see how they can become part of the narrative. You'll see data everywhere -- from the kind that floats in the air thanks to augmented reality, to the more mundane version contained in endless streams of status updates. Rather than being intimidated by the enormous amount of data, you'll see opportunities -- new ways to bring news and information to the community.

You probably won't have time to actually do a lot of the programming and data structuring yourself. But now you're equipped to have a valuable and impactful conversation with your geek colleagues. A conversation that gets better results than ever before.

So, even though it's probably a bit late for me to attend the new joint Master of Science degree program in Computer Science and Journalism at Columbia University, I can still learn How to Think Like a Computer Scientist using the free MIT OpenCourseWare, take part in the Journalists/Coders Ning network, and find help at Help.HacksHackers.com.

And so can you.

******

Are you a journalist who has taken up programming? A programmer with advice for journalists? Please share your experiences and insights in the comments.

Roland Legrand is in charge of Internet and new media at Mediafin, the publisher of leading Belgian business newspapers De Tijd and L'Echo. He studied applied economics and philosophy. After a brief teaching experience, he became a financial journalist working for the Belgian wire service Belga and subsequently for Mediafin. He works in Brussels, and lives in Antwerp with his wife Liesbeth.


January 25 2010

01:10

Using Geocoders with GeoDjango

For a “15-minute project“, Simon Willison’s geocoders library is pretty handy if you’re doing geocoding with Python. It offers a common interface to the geocoding services provided by Google, Yahoo and other sources. When we were looking at replacing the home-grown geocoding system that Andrei Scheinkman built for Represent, Simon’s project seemed a natural choice.

It was an easy drop-in, but there was one thing about it that was just slightly off. A successful geocoding result looks like this:

(u'New York, NY, USA', (40.756053999999999, -73.986951000000005))

Notice the coordinate pair is latitude, longitude. For folks using GeoDjango alongside Simon’s library, the way you build a Point object from coordinates is to pass the longitude first, like so:

>>> from django.contrib.gis.geos import Point
>>> p = Point((5, 23)) # 2D point, passed in as a tuple

So on Friday I forked Simon’s project and reversed the ordering of the coordinates in a successful result. That way you can pass that portion of the result directly to a Point constructor:

>>> from django.contrib.gis.geos import Point
>>> from geocoders.google import geocoder
>>> geocode = geocoder('GOOGLE-API-KEY')
>>> results = geocode('new york')
>>> results
(u'New York, NY, USA', (-73.986951000000005, 40.756053999999999))
>>> pnt = Point(results[1])

Not a huge deal, but in keeping with the spirit of the library, I think.
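If you would rather keep the upstream library unmodified, the same swap can live at the call site instead. A sketch (the helper name is hypothetical, and plain tuples stand in for the GeoDjango types so it runs anywhere):

```python
def point_args(result):
    """Take an unmodified geocoder result, (place_name, (lat, lng)),
    and return (lng, lat), the argument order GeoDjango's Point expects."""
    name, (lat, lng) = result
    return (lng, lat)

# The unforked library's result for 'new york', as shown above:
result = (u'New York, NY, USA', (40.756053999999999, -73.986951000000005))
coords = point_args(result)  # (-73.986951000000005, 40.756053999999999)
# ...and Point(coords) would then build the point correctly.
```

The fork keeps call sites cleaner; the wrapper keeps you on the upstream release. Either way, the fix is just making the tuple order match what the constructor wants.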

November 25 2009

19:46

Keeping It Simple(r)

I haven’t mentioned Fumblerooski in a while, but rest assured that work continues, especially during college football season. I’ve added more coaching information (still a long way to go on that, though) and will be unveiling player rankings soon. But the biggest thing I’ve done lately has nothing to do with new features. Instead, as I’ve become a better coder in general, I’ve seen how bloat can really hinder a project. So I spent time last week reorganizing the Fumblerooski code to take advantage of some of Django’s strengths.

This all started back at the NICAR conference in Indianapolis where several of us led a mini-bootcamp on using Django. At one point, as we talked about how projects are organized, I showed off the models for Fumblerooski. They went on forever. Looking back, it wasn’t the message that I wanted to get across – I think several people gasped.

Fumblerooski still is far more tightly coupled together than I’d like – the scrapers can’t really be separated out as an independent app, which would be the right thing to do. But it’s getting closer. Same for the rankings app. Coaches could be the next one, or maybe players. The scrapers, even though they don’t constitute an actual app, are better organized. The point is that now the code is probably easier for someone else to follow, but it’s also easier for me to locate specific functions. I spend less time hunting and more time actually doing things.
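One common way to work toward that separation (a sketch with made-up names, not Fumblerooski's actual code) is to keep the scrapers ignorant of the ORM: they return plain dicts, and a thin loader in the main project is the only code that touches the models.

```python
# scrapers/games.py -- knows nothing about Django or the models.
def parse_game(row):
    """Turn one raw scraped row into a plain dict."""
    home, away, score = row
    return {"home": home, "away": away, "score": score}

# fumblerooski/loaders.py -- the only place that touches the ORM.
def load_games(rows, save):
    """Parse each row and hand the dict to a save callable
    (something like Game.objects.create in a real Django project)."""
    return [save(**parse_game(row)) for row in rows]

# With save=dict we can exercise the whole pipeline without a database:
loaded = load_games([("Nebraska", "Kansas", "31-17")], save=dict)
```

Once the scraper side deals only in plain data, it can move into its own app (or even its own package) without dragging the models along with it.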

How does this actually work? Python’s ability to import libraries into others means that Django apps can share common code (and, if you’re working in the same database, data) inside a single project just by typing an import statement:

from fumblerooski.rankings.models import RushingSummary

And I get access to the rushing rankings wherever I need to use them. Because this is so trivial, it sometimes led me to think that where I put things didn’t matter. But it does, it really does, for your sake and the sake of anyone who attempts to look at your code.

Tags: Python django

November 06 2009

14:54

Welcome to Davis, Calif.: Six lessons from the world’s best local wiki

Ah, Davis: home of 60,000 people, 30,000 students, 188 sunny days a year, a 16 percent bike commute mode share and the busiest local wiki in the world.

If I were Omaha World-Herald Publisher Terry Kroeger, I’d be booking my post-holiday flight immediately.

As Gina reported here last week, Omaha’s employee-owned metro daily just bought WikiCity, an Omaha-based Web startup that wants to provide mini-Wikipedias for every city in the country. Creating a cheap platform for evergreen, user-generated local Web content has been tried, um, once or twice before. But with some notable exceptions, corporations have turned out to be really, really bad at this.

Philip Neustrom hasn’t.

Today, the quirky 500-page wiki Neustrom launched with fellow UC Davis math student Mike Ivanov in 2004 has 14,000 pages and drew 13,000 edits by 3,300 users last month, averaging 10,000 unique visitors daily. More importantly, it’s the best way in town to find a lost cat, compare apartment rental prices or get a list of every business open past 10 p.m. Operating budget, not counting its founders’ part-time volunteer labor: about $2,000 a year.

What’s the secret? Neustrom, who now wrangles code for the Citizen Engagement Lab in the Bay Area, was nice enough to tell us.

Wikis need content to breed content. Or, as evergreen-content guru Matt Thompson put it last week, a wiki written primarily by robots will appeal primarily to robots.

“Starting anything is hard,” said Neustrom, now 25. “The issue is predominantly an issue of outreach, of coordinating people and making sure people understand that they can’t just put something up there and add 50 pages and walk away, and then come back in a month and hope that it’s taken off.”

Instead, Neustrom and Ivanov convinced some of their friends to spend four summer months writing snippets about things that only exist in Davis, like drunken biking through late-night fog, oversized playground equipment and the smell from the cow farm on the edge of town.

“We were just trying to do something that we liked,” Neustrom said. “We certainly weren’t trying to do anything that was very useful.”

Business information is the holy grail. Pages about your local toad tunnel are dandy, Neustrom said, and quirky content kept the site from feeling generic to early users. But the feature that made DavisWiki take off was what the traditional media calls “consumer reporting.”

“After we’d sort of seeded it with 500 pages or something like that, we opened it up to the public,” Neustrom said. “First, it was pretty slow going. Nothing really happened.”

Then, sometime in late 2005, pages on things like lunch specials and Davis’s nicest bathrooms started filling up. Local business coverage has been “a big driving force” ever since, Neustrom said. Today, he said, retail businesses in town often keep their own information on DavisWiki up to date.

A wiki’s strengths kick in after one year. The web craves news like kids crave sugar. Blogs and tweets are gobbled fast and burn quick. But wikis are the whole grains of the web: One year after news breaks, someone will want to find and link to it again — and a wiki is likely to be the only place it’s still hanging around.

“All of the existing online resources for sort of cataloging anything about the town were sort of time-based,” Neustrom said. “After about a year and a half, these things would sort of disappear, even if they’d been around for 100 years, like the local newspaper…So we became the resource of record.”

Start with a subculture, then build out to a general audience. DavisWiki has always aspired to cover its whole town, but it’s always served students best.

That’s all right, Neustrom thinks. If he’d tried to please everybody who showed up, no one would have come back.

“When building something like this, you can’t just aim for this wide spectrum at first,” Neustrom said. Some companies try to launch wikis by writing programs that “crawl through a database, that spit out statistics and create 13 million pages and put that out there and hope that it’s going to stick. You can’t do that. It’s just not going to work.”

Neustrom, who spent 2004 sharing a house with musicians, found his base among the artsy, but he thinks any subculture would do. “You could have, like, a physics grad student start a community for their town, and it’s a bunch of physics nerds,” he said. “And that could spiral out and out.”

Keep your content open source, no matter what. Don’t do it for marketing reasons or out of the kindness of your heart. Do it because it’s the only way to guarantee to your users that if you fold, all their hard work won’t die with you.

Good wikis inspire rabid devotion — if they don’t, they never become good wikis. Neustrom and Ivanov keep their budget online and think of the project as a user co-op. Their users did, too. “There are people on there who literally spend four hours a day looking at DavisWiki,” Neustrom said. “People had free [computer lab] pages every quarter, so they would use their excess printing to print out 400 fliers and staple them to every room on campus.”

People don’t do that for sites they think are “neat,” Neustrom said. They do it for sites they own.

Don’t get hung up on mimicking Wikipedia. Sure, it may be the most useful object ever created by human beings. But as Marshall Poe showed in his terrific biography of Wikipedia’s youth, its rules — universal editorship, neutral point of view, no original research — were forged out of year-long flamewars among the early Wikipedians. Neustrom and his friends didn’t think NPOV was suited to an inherently Davis-centric site, so they ditched it.

Wikipedia’s widely used software, MediaWiki, isn’t perfect either. DavisWiki uses a modified Sycamore platform but it, too, has flaws.

“People want to be able to search for all elementary schools within a certain radius of a certain point, or all of the restaurants that serve vegan food,” Neustrom said. “MediaWiki suffers the same issue [as Sycamore] — it was written before the advent of modern web framework.”

Neustrom is yearning for a modern wiki platform. That’s why he’s been messing around with Django this year. It’s also why he’s incorporating Wikispot, the nonprofit he set up to reproduce DavisWiki for other towns and topics, as a 501(c)(3).

Looking for a tax write-off, Terry?

Photo by Arlen used under a Creative Commons license.
