Tumblelog by Soup.io

May 22 2013

17:54

Who’s reusing the news?

Derek Willis, interactive news developer for The New York Times, wrote a blog post about a different way to use analytics. Willis says he’s interested in tracking and mapping who is citing and quoting the work of major news outlets (like The New York Times).

The idea behind linkypedia is that links on Wikipedia aren’t just references, they help describe how digital collections are used on the Web, and encourage the spread of knowledge: “if organizations can see how their web content is being used in Wikipedia, they will be encouraged and emboldened to do more.” When I first saw it, I immediately thought about how New York Times content was being cited on Wikipedia. Because it’s an open source project, I was able to find out, and it turned out (at least back then) that many Civil War-era stories that had been digitized were linked to from the site. I had no idea, and wondered how many of my colleagues knew. Then I wondered what else we didn’t know about how our content is being used outside the friendly confines of nytimes.com.

That’s the thread that leads from Linkypedia to TweetRewrite, my “analytics” hack that takes a nytimes.com URL and surfaces tweets about the story that aren’t simply automatic retweets; it filters out posts containing the exact headline to find what people actually say about it. It’s a pretty simple Ruby app that uses Sinatra, the Twitter and Bitly gems, and a library I wrote to pull details about a story from the Times Newswire API.
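The core filtering idea is simple enough to sketch. This hypothetical Ruby helper (not TweetRewrite's actual code) drops retweets and posts that merely echo the headline, keeping the tweets where people add their own words:

```ruby
# Hypothetical sketch of TweetRewrite's filtering idea, not the app's
# actual code: drop retweets and posts that merely echo the story's
# headline, keeping tweets where people add their own commentary.
def commentary_tweets(tweets, headline)
  tweets.reject do |text|
    text.start_with?("RT ") ||                   # automatic retweet
      text.downcase.include?(headline.downcase)  # headline echoed verbatim
  end
end

tweets = [
  "RT @nytimes: Senate Passes Budget Bill",
  "Senate Passes Budget Bill http://nyti.ms/example",
  "Worth reading for the context alone http://nyti.ms/example"
]
puts commentary_tweets(tweets, "Senate Passes Budget Bill")
```

In the real app the tweets come through the Twitter gem and the headline from the Times Newswire API; the comparison above stands in for that pipeline.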

April 09 2013

11:00

OpenNews Revamps Code Sprints; Sheetsee.js Wins First Grant

Back at the Hacks/Hackers Media Party in Buenos Aires, I announced the creation of Code Sprints -- funding opportunities to build open-sourced tools for journalism. We used Code Sprints to fund a collaboration between WNYC in New York and KPCC in Southern California to build a parser for election night XML data that ended up being used on well over 100 sites -- a great collaboration to kick off the Code Sprint concept.

Originally, Code Sprints were designed to work like the XML parser project: driven in concept and execution by newsrooms. While that proved great for working with WNYC, we heard from a lot of independent developers working on great tools that fit the intent of Code Sprints, but not the wording of the contract. And we heard from a lot of newsrooms that wanted to use code, but not drive development, so we rethought how Code Sprints work. Today we're excited to announce refactored Code Sprints for 2013.

Now, instead of a single way to execute a Code Sprint, there are three ways to help make Code Sprints happen:

  • As an independent developer (or team) with a great idea that you think could work well in the newsroom.
  • As a newsroom with a great idea that wants help making it a reality.
  • As a newsroom looking to beta-test code that comes out of Code Sprints.

Each of these options means we can work with amazing code, news organizations, and developers and collaborate together to create lots of great open-source tools for journalism.

Code Sprint grant winner: Sheetsee.js

I always think real-world examples are better than theoreticals, so I'm also excited to announce the first grant of our revamped Code Sprints will go to Jessica Lord to develop her great Sheetsee.js library for the newsroom. Sheetsee has been on the OpenNews radar for a while -- we profiled the project in Source a number of months back, and we're thrilled to help fund its continued development.

Sheetsee was originally designed for use in the Macon, Ga., government as part of Lord's Code for America fellowship, but the intent of the project -- simple data visualizations using a spreadsheet for the backend -- has always had implications far beyond the OpenGov space. We're excited today to pair Lord with Chicago Public Media (WBEZ) to collaborate on turning Sheetsee into a kick-ass and dead-simple data journalism tool.
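The "spreadsheet as backend" idea is easy to sketch. Sheetsee.js itself is JavaScript; as a language-neutral illustration, this Ruby snippet treats a sheet published as CSV the way Sheetsee treats a published Google Spreadsheet, turning each row into a record keyed by the header row:

```ruby
require "csv"

# Illustrative sketch (not Sheetsee.js code): a spreadsheet published
# as CSV becomes the data backend, with each row parsed into a hash
# keyed by the header row.
def rows_from_sheet(csv_text)
  CSV.parse(csv_text, headers: true).map(&:to_h)
end

sheet = <<~CSV
  ward,complaints
  1,42
  2,17
CSV

rows_from_sheet(sheet).each do |row|
  puts "Ward #{row["ward"]}: #{row["complaints"]} complaints"
end
```

From there, the rows can feed any table, map, or chart -- the reporter only ever edits the spreadsheet.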

For WBEZ's Matt Green, Sheetsee fit the bill for a lightweight tool that could help get the reporters "around the often steep learning curve with data publishing tools." Helping to guide Lord's development to meet those needs ensures that Sheetsee becomes a tool that works at WBEZ and at other news organizations as well.

We're excited to fund Sheetsee, to work with a developer as talented as Lord, to collaborate with a news organization like WBEZ, and to relaunch Code Sprints for 2013. Onward!

Dan Sinker heads up the Knight-Mozilla News Technology Partnership for Mozilla. From 2008 to 2011 he taught in the journalism department at Columbia College Chicago, where he focused on entrepreneurial journalism and the mobile web. He is the author of the popular @MayorEmanuel Twitter account, the creator of the election tracker the Chicago Mayoral Scorecard and the mobile storytelling project CellStories, and was the founding editor of the influential underground culture magazine Punk Planet until its closure in 2007. He is the editor of We Owe You Nothing: Punk Planet, the Collected Interviews, and was a 2007-08 Knight Fellow at Stanford University.

A version of this post originally appeared on Dan Sinker's Tumblr here.

April 02 2013

10:39

How Public Lab Turned Kickstarter Crowdfunders Into a Community

Public Lab is structured like many open-source communities, with a non-profit hosting and coordinating the efforts of a broader, distributed community of contributors and members. However, we are in the unique position that our community creates innovative open-source hardware projects -- tools to measure and quantify pollution -- and unlike software, it takes some materials and money to actually make these tools. As we've grown over the past two years, from just a few dozen members to thousands today, crowdfunding has played a key role in scaling our effort and reaching new people.

DIY Spectrometry Kit Kickstarter

Kickstarter: economies of DIY scale

Consider a project like our DIY Spectrometry Kit, which was conceived of just after the Deepwater Horizon oil spill to attempt to identify petroleum contamination. In the summer of 2012, just a few dozen people had ever built one of our designs, let alone uploaded and shared their work. As the device's design matured to the point that anyone could easily build a basic version for less than $40, we set out to reach a much larger audience while identifying new design ideas, use cases, and contributors, through a Kickstarter project. Our theory was that many more people would get involved if we offered a simple set of parts in a box, with clear instructions for assembly and use.
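To give a flavor of what building a basic version involves on the software side: a DIY spectrometer ultimately maps pixel columns in the webcam image to wavelengths. A hypothetical sketch (the pixel positions are invented; the reference wavelengths are the familiar mercury lines of a fluorescent lamp):

```ruby
# Hypothetical calibration sketch, not Public Lab's actual code: given
# two known spectral lines and the pixel columns where they appear in
# the webcam image, a linear fit maps any pixel column to nanometers.
def wavelength_mapper(px1, nm1, px2, nm2)
  slope = (nm2 - nm1).to_f / (px2 - px1)
  ->(px) { nm1 + slope * (px - px1) }
end

# Assumed calibration points: mercury lines at 435.8 nm and 546.1 nm,
# landing at pixel columns 120 and 340 (invented values).
to_nm = wavelength_mapper(120, 435.8, 340, 546.1)
puts to_nm.call(230)
```

Reading pixel brightness along that calibrated row of the image is then enough to plot a rough spectrum.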

By October 2012, more than 1,600 people had backed the project, raising over $110,000 -- and by the end of December, more than half of them had received a spectrometer kit. Many were up and running shortly after the holidays, and we began to see regular submissions of open spectral data at http://spectralworkbench.org, as well as new faces and strong opinions on Public Lab's spectrometry mailing list.

Kickstarter doesn't always work this way: Often, projects turn into startups, and the first generation of backers simply becomes the first batch of customers. But as a community whose mission is to involve people in the process of creating new environmental technologies, we had to make sure people didn't think of us as a company but as a community. Though we branded the devices a bit and made them look "nice," we made sure previous contributors were listed in the documentation, which explicitly welcomed newcomers into our community and encouraged them to get plugged into our mailing list and website.


As a small non-profit, this approach is not only in the spirit of our work, but essential to our community's ability to scale up. To create a "customer support" contact rather than a community mailing list would be to make ourselves the exclusive contact point and "authority" for a project which was developed through open collaboration. For the kind of change we are trying to make, everyone has to be willing to learn, but also to teach -- to support fellow contributors and to work together to improve our shared designs.

Keeping it DIY

One aspect of the crowdfunding model that we have been careful about is the production methods themselves. While it's certainly vastly different to procure parts for 1,000 spectrometers, compared to one person assembling a single device, we all agreed that the device should be easy to assemble without buying a Public Lab kit -- from off-the-shelf parts, at a reasonable cost. Thus the parts we chose were all easily obtainable -- from the aluminum conduit box enclosure, to the commercially available USB webcams and the DVD diffraction grating which makes spectrometry possible.


While switching to a purpose-made "holographic grating" would have made for a slightly more consistent and easy-to-assemble kit (not to mention the relative ease of packing it vs. chopping up hundreds of DVDs with a paper cutter...), it would have meant that anyone attempting to build their own would have to specially order such grating material -- something many folks around the world cannot do. Some of these decisions also made for a slightly less optimal device -- but our priority was to ensure that the design was replicable, cheap, and easy. Advanced users can take several steps to dramatically improve the device, so the sky is the limit!

The platform effect

One clear advantage of distributing kits, besides the bulk prices we're able to get, is that almost 2,000 people now have a nearly identical device -- so they can learn from one another with greater ease, not to mention develop applications and methodologies which thousands of others can reproduce with their matching devices. We call this the "platform effect" -- where this "good enough" basic design has been standardized to the point that people can build technologies and techniques on top of it. In many ways, we're looking to the success of the Arduino project, which created not only a common software library, but a standardized circuit layout and headers to support a whole ecology of software and hardware additions which are now used by -- and produced by -- countless people and organizations.

Spectral Challenge screenshot

As we continue to grow, we are exploring innovative ways to use crowdfunding to get people to collaboratively use the spectrometers they now have in hand to tackle real-world problems. Recently, we have launched the Spectral Challenge, a kind of "X Prize for DIY science", but it's crowdfunded -- meaning that those who support the goals of the Challenge can participate in the competition directly, or by contributing to the prize pool. Additionally, Public Lab will continue to leverage more traditional means of crowdfunding as our community develops new projects to measure plant health and produce thermal images -- and we'll have to continue to ensure that any kits we sell clearly welcome new contributors into the community.

The lessons we've learned from our first two kit-focused Kickstarters will help us with everything from the box design to the way we design data-sharing software. The dream, of course, is that in years to come, as we pass the 10,000- and 100,000-member marks, we continue to be a community which -- through peer-to-peer support -- helps one another identify and measure pollution without breaking the bank.

The creator of GrassrootsMapping.org, Jeff Warren designs mapping tools, visual programming environments, and flies balloons and kites as a fellow in the Center for Future Civic Media, and as a student at the MIT Media Lab's Design Ecology group, where he created the vector-mapping framework Cartagen. He co-founded Vestal Design, a graphic/interaction design firm in 2004, and directed the Cut&Paste Labs project, a year-long series of workshops on open source tools and web design in 2006-7 with Lima designer Diego Rotalde. He is a co-founder of Portland-based Paydici.com.

August 20 2012

17:39

Stanford tool unlocks patterns in email dumps

Thanks to a chance encounter, the researcher behind an email archive analysis tool developed at Stanford's computer science department is finding ways to help investigative journalists dive into massive public record dumps.

May 03 2012

13:53

How We Got Here: The Road to Public Lab's Map Project

Last week, Public Laboratory announced that public domain maps are now starting to show up on Google Earth and Google Maps. But how did the maps get there? Here's the timeline of a Public Laboratory map project.

Making a map

Public Laboratory projects take a community-based approach to making maps, one that differs depending on where you are and why you want to create a map. People map areas for a number of reasons: to monitor an area of environmental concern, to capture a dynamic event as it happens, or because adequate aerial image data simply isn't available. Preparing for fieldwork starts with the Public Lab map tools page, where you can discover what type of equipment to use and how to use it safely. Research notes on tasks such as setting up a dual camera rig or stabilizing the camera with a picavet can help with specific problems, and there are also hundreds of people in the online Public Lab community of mapmakers sharing tips and experiences on the site.

Upon return

After the mapping flight, the map making begins with backing up the images and sorting through the set to make a subset for map production. Depending on the time in the air, there will be hundreds and sometimes thousands of individual images, so you can home in on the ones that cover the area of interest. Assuming the flight was at a steady altitude, select the sharpest images that are vertically oriented. If you have many images of the same area, pick the best one, but keep enough overlapping frames that adjacent images share plenty of coverage for the next step.

Public Laboratory's MapMill.

Images can be sorted locally or online. Public Laboratory created an online tool where a group can do collaborative selection. MapMill.org is a web-based image sorting and ranking tool where multiple users can sort through a large dataset simultaneously.

Map production

With a smaller set of the best images on hand, the images can be dynamically placed on the map in a process known as georectification. After all the images have been added to the map, the project is exported. The MapKnitter export tool does all of the geographic information systems crunching behind the scenes with the Geospatial Data Abstraction Library (GDAL, gdal.org) and produces a GeoTIFF map file. GeoTIFF is a public domain metadata standard that embeds geographic information into a TIFF image file. At this point, the map is in an interchange format that can be easily distributed.
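The GDAL step can be approximated by hand: `gdal_translate` stamps a plain TIFF with a spatial reference and corner coordinates to produce a GeoTIFF. This sketch just assembles that command line (the file names and coordinates are illustrative, not MapKnitter's actual output):

```ruby
# Build a gdal_translate invocation that assigns a spatial reference
# (-a_srs) and upper-left / lower-right bounds (-a_ullr) to a plain
# TIFF, yielding a GeoTIFF. File names and coordinates are illustrative.
def geotiff_command(input, output, ulx:, uly:, lrx:, lry:, srs: "EPSG:4326")
  ["gdal_translate", "-of", "GTiff",
   "-a_srs", srs,
   "-a_ullr", ulx.to_s, uly.to_s, lrx.to_s, lry.to_s,
   input, output].join(" ")
end

puts geotiff_command("mosaic.tif", "mosaic_geo.tif",
                     ulx: -122.29, uly: 37.81, lrx: -122.26, lry: 37.79)
```

Running the printed command (with GDAL installed) produces a file any GIS tool, including Google Earth, can place on the globe.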

MapKnitter, Public Laboratory's web-based aerial image map production tool.

Public Laboratory Map Archive

Public Lab hosts its own map data archive for storing and sharing finished map projects. Each map in the archive has a "map details page" that hosts details such as: title, date, place, location, resolution, field map maker, field notes, cartographer, ground images, oblique images from the flight, and comments from website users. The map participants choose whether to publish the map as Public Domain, Creative Commons Attribution ShareAlike, Creative Commons Attribution, or Creative Commons Attribution Non-Commercial.

Public Laboratory's Occupy Oakland general strike map, November 2, 2011, in Google Earth.

Maps are viewable on the archive itself, and you can subscribe to it as an RSS feed. However, it's also a place for distribution of the data. As we announced last week, Google Earth has started licensing our public domain maps. Google Earth plans to continue to publish public domain maps from the Public Lab Archive a few times a year.

It's quite exciting to see these Public Lab maps go online with a ubiquitous data provider such as Google. We look forward to more people participating in this activity, and to more publishing of public domain data.

Google published some of the maps to Google Maps as well as Google Earth, which makes them widely accessible in the web browser and in mobile applications that use Google Maps.

April 26 2012

14:00

LedeHub to Foster Open, Collaborative Journalism

I'm honored to be selected as one of the inaugural AP-Google Journalism and Technology Scholarship fellows for the 2012-13 academic year, and I'm excited to begin work on my project, LedeHub.

I believe in journalism's ability to better the world around us. To fully realize the potential of journalism in the digital age, we need to transform news into a dialogue between readers and reporters. LedeHub does just that, fostering collaborative, continuous and open journalism while incorporating elements of crowdsourcing to allow citizens, reporters and news organizations to come together in unprecedented ways.

LedeHub in Action

Here's a potential case study: "Alice" isn't a journalist, but she loves data and can spot the potential for a story amid the rows and columns of a CSV file. She comes across some interesting census data illustrating the rise of poverty in traditionally wealthy Chicagoland suburbs, but isn't quite sure how to use it, so she points her browser to www.ledehub.com. She creates a new story repository called "census-chicago-12," tags it under "Government Data," and commits the numbers.

Two days later, "Bob" -- a student journalist with a knack for data reporting -- is browsing the site and comes across Alice's repository. He forks it and commits a couple paragraphs of analysis. Alice sees Bob's changes and likes where he's headed, so she merges it back into her repository, and the two continue to collaborate. Alice works on data visualization, and Bob continues to do traditional reporting, voicing the story of middle-class families who can no longer afford to send their children to college.

A few days later, a news outlet like the Chicago Tribune sees "census-chicago-12" and flags it as a promising repository -- then pulls it, edits and fact-checks it, and publishes the story, giving Alice and Bob their first bylines.

As you can see, LedeHub re-imagines the current reporting and writing workflow while underscoring the living nature of articles. By representing stories as "repositories" -- with the ability to edit, update, commit and revert changes over time -- the dynamic nature of news is effectively captured.

Fostering Open-Source Journalism

GitHub and Google Code are social coding platforms that have done wonders for the open-source community. I'd like to see similar openness in the journalism industry.

My proposal for LedeHub is to adapt the tenets of Git -- a distributed version control system -- and appropriate its functionality as it applies to the processes of journalism. I will implement a web application layer on top of this core functionality to build a tool for social reporting, writing and coding in the open. This affords multiple use cases for LedeHub, as illustrated in the case study I described above -- users can start new stories, or search for and contribute to stories already started. I'd like to mirror the basic structure of GitHub, but re-appropriate the front end to cater to the news industry and be more reporter-focused, not code-driven. That said, here's a screenshot of the upcoming LedeHub repository on GitHub (to give you a general idea of what the LedeHub dashboard might look like):


Each story repository may contain text, data, images or code. The GitHub actions of committing (adding changes), forking (diverging story repositories to allow for deeper collaboration and account for potential overlap) and cloning will remain analogous in LedeHub. Repositories will be categorized according to news "topics" or "areas" like education or politics. Users -- from citizens to reporters or coders -- will have the ability to "watch" story repositories they are interested in and receive updates when changes are made. Users can also comment on individual "commits" for a story, offering their input or suggestions for improvement. GitHub offers a "company" option that allows multiple users to be added to an organization -- a feature I would like to mimic for news outlets, along with Google Code's "issues" feature.
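A toy model makes the repository metaphor concrete. This is illustrative Ruby, not LedeHub code: commits append changes, a fork copies history, and a merge pulls in whatever commits the fork added, mirroring the Alice-and-Bob scenario above:

```ruby
# Toy model of stories-as-repositories (illustrative, not LedeHub code):
# commit appends a change, fork copies the history so far, and merge
# pulls in the commits the fork has added since.
class StoryRepo
  attr_reader :name, :commits

  def initialize(name, commits = [])
    @name = name
    @commits = commits
  end

  def commit(change)
    @commits << change
    self
  end

  def fork(new_name)
    StoryRepo.new(new_name, @commits.dup)
  end

  def merge(other)
    (other.commits - @commits).each { |c| @commits << c }
    self
  end
end

alice = StoryRepo.new("census-chicago-12").commit("raw census CSV")
bob   = alice.fork("bob/census-chicago-12").commit("two paragraphs of analysis")
alice.merge(bob)
puts alice.commits
```

Real Git tracks content, diffs, and ancestry rather than a flat list, but the fork/commit/merge flow LedeHub borrows is exactly this shape.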

Next Steps

I recognize that the scope of my project is ambitious, and my current plan is to segment implementation into iterations -- to build an initial prototype to test within one publication and expand from there.

Journalism needs to become more open, like the web. Information should be shared. The collaboration between the New York Times and the Guardian over WikiLeaks data was inspiring: two "competing" organizations sharing confidential information for publication. With my project, LedeHub, I hope to foster similar transparency and collaboration.

So, that's the proposal. There's still a lot to figure out. For example, what's the best way to motivate users to collaborate? What types of data can be committed? What copyright issues need to be considered? Should there be compensation involved? Fact-checking? Sound off. I'd love to hear your thoughts.


Katie Zhu is a junior at Northwestern University, studying journalism and computer science, and is particularly interested in human-computer interaction, data visualization and interaction design. She has previously interned at GOOD in Los Angeles, where she helped build GOOD's mobile website. She continues development work part-time throughout the school year, and enjoys designing and building products at the intersection of news and technology. She was selected as a finalist in the Knight-Mozilla News Technology Partnership in 2011.


February 10 2012

18:00

Still shaping the way people think about news innovation? A few reflections on the new KNC 2.0

As someone who probably has spent more time thinking about the Knight News Challenge than anyone outside of Knight Foundation headquarters — doing a dissertation on the subject will do that to you! — I can’t help but follow its evolution, even after my major research ended in 2010. And evolve it has: from an initial focus on citizen journalism and bloggy kinds of initiatives (all the rage circa 2007, right?) to a later emphasis on business models, visualizations, and data-focused projects (like this one) — among a whole host of other projects including news games, SMS tools for the developing world, crowdsourcing applications, and more.

Now, after five years and $27 million in its first incarnation, Knight News Challenge 2.0 has been announced for 2012, emphasizing speed and agility (three contests a year, eight-week turnarounds on entries) and a new topical focus (the first round is focused on leveraging existing networks). While more information will be coming ahead of the February 27 launch, here are three questions to chew on now.

Does the Knight News Challenge still dominate this space?

The short answer is yes (and I’m not just saying that because, full disclosure, the Knight Foundation is a financial supporter of the Lab). As I’ve argued before, in the news innovation scene, at this crossroads of journalism and technology communities, the KNC has served an agenda-setting kind of function — perhaps not telling news hipsters what to think regarding the future of journalism, but rather telling them what to think about. So while folks might disagree on the Next Big Thing for News, there’s little question that the KNC has helped to shape the substance and culture of the debate and the parameters in which it occurs.

Some evidence for this comes from the contest itself: Whatever theme/trend got funded one year would trigger a wave of repetitive proposals the next. (As Knight said yesterday: “Our concern is that once we describe what we think we might see, we receive proposals crafted to meet our preconception.”)

And yet the longer answer to this question is slightly more nuanced. When the KNC began in 2006, with the first winners named in 2007, it truly was the only game in town — a forum for showing “what news innovation looks like” unlike any other. Nowadays, a flourishing ecosystem of websites (ahem, like this one), aggregators (like MediaGazer), and social media platforms is making the storyline of journalism’s reboot all the more apparent. It’s easier than ever to track who’s trying what, which experiments are working, and so on — and seemingly in real time, as opposed to a once-a-year unveiling. Hence the Knight Foundation’s move to three quick-fire contests a year, “as we try to bring our work closer to Internet speed.”

How should we define the “news” in News Challenge?

One of the striking things I found in my research (discussed in a previous Lab post) was that Knight, in its overall emphasis, has pivoted away from focusing mostly on journalism professionalism (questions like “how do we train/educate better journalists?”) and moved toward a broader concern for “information.” This entails far less regard for who’s doing the creating, filtering, or distributing — rather, it’s more about ensuring that people are informed at the local community level. This shift from journalism to information, reflected in the Knight Foundation’s own transformation and its efforts to shape the field, can be seen, perhaps, like worrying less about doctors (the means) and more about public health (the ends) — even if this pursuit of health outcomes sometimes sidesteps doctors and traditional medicine along the way.

This is not to say that Knight doesn’t care about journalism. Not at all. It still pours millions upon millions of dollars into clearly “newsy” projects — including investigative reporting, the grist of shoe-leather journalism. Rather, this is about Knight trying to rejigger the boundaries of journalism: opening them up to let other fields, actors, and ideas inside.

So, how should you define “news” in your application? My suggestion: broadly.

What will be the defining ethos of KNC 2.0?

This is the big, open, and most interesting question to me. My research on the first two years of KNC 1.0, using a regression analysis, found that contest submissions emphasizing participation and distributed knowledge (like crowdsourcing) were more likely to advance, all things being equal. My followup interviews with KNC winners confirmed this widely shared desire for participation — a feeling that the news process not only could be shared with users, but in fact should be.

I called this an “ethic of participation,” a founding doctrine of news innovation that challenges journalism’s traditional norm of professional control. But perhaps, to some extent, that was a function of the times, during the roughly 2007-2010 heyday of citizen media, with the attendant buzz around user-generated content as the hot early-adopter thing in news — even if news organizations then, as now, struggled to reconcile and incorporate a participatory audience. Even while participation has become more mainstream in journalism, there are still frequent flare-ups, like this week’s flap over breaking news on Twitter, revealing enduring tensions at the “collision of two worlds — when a hierarchical media system in the hands of the few collides with a networked media system open to all,” as Alfred Hermida wrote.

So what about this time around? Perhaps KNC 2.0 will have an underlying emphasis on Big Data, algorithms, news apps, and other things bubbling up at the growing intersection of computer science and journalism. It’s true that Knight is already underwriting a significant push in this area through the (also just-revised) Knight-Mozilla OpenNews project (formerly called the Knight-Mozilla News Technology Partnership — which Nikki Usher and I have written about for the Lab). To what extent is there overlap or synergy here? OpenNews, for 2012, is trying to build on the burgeoning “community around code” in journalism — leveraging the momentum of Hacks/Hackers, NICAR, and ONA with hackfests, code-swapping, and online learning. KNC 2.0, meanwhile, talks about embracing The Hacker Way described by Mark Zuckerberg — but at the same time backs away a bit from its previous emphasis on open source as a prerequisite. It’ll be interesting to see how computational journalism — explained well in this forthcoming paper (PDF here) by Terry Flew et al. in Journalism Practice — figures into KNC 2.0.

Regardless, the Knight News Challenge is worth watching for what it reveals about the way people — journalists and technologists, organizations and individuals, everybody working in this space — talk about and make sense of “news innovation”: what it means, where it’s taking us, and why that matters for the future of journalism.

January 20 2012

15:20

How to Create a Minimalist Map Design With OpenStreetMap

Mapping can be as much about choosing what data to leave out as what to include, so you can best focus your audience on the story you are telling. Oftentimes with data visualization projects, the story isn't about the streets or businesses or parks, but rather about the data you're trying to layer on the map.

To help people visualize data like this, I've started to design a new minimal base map for OpenStreetMap. What's great about OpenStreetMap is that the data is all open, which means I can take it and design a totally custom experience. Once finished, the map will serve as an alternative to the traditional OpenStreetMap base layer.

I'm designing the new map in TileMill, the open-source map design studio that Development Seed has written about before. The map can be used as a light, very subtle background for layering data on top of, either with our MapBox hosting platform's map builder or on its own. It still provides the necessary geographic context, but moves the focus to the data added on top of the map -- not to details that are irrelevant to its story.

Here's an early look at the features and design aspects I've been working on for the map.

Portland, Ore., on the new OpenStreetMap minimal base map.

Behind the design decisions

I used the open-source OSM Bright template that you can load into TileMill as a starting point for the design and removed all color, choosing to limit the palette to light grays. For simplicity, most land use and land cover area types have been dropped. However, wooded areas and parks remain, indicated with subtle textures instead of color. The fact that OpenStreetMap's data is open gives me full control of choosing exactly what I want to show up on the map.

The style now includes more types of roads. Tracks have been added, as have pedestrian routes, bike paths, and bridleways, which are shown as dotted lines. Roads without general public access (for example, private roads) are shown faded out. The rendering of overlapping tunnels, streets, and bridges has also greatly improved, with most overlapping lines now separated and stacked in the proper order.
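In TileMill, decisions like these are written as CartoCSS rules. An illustrative fragment (the layer and field names here are assumptions in the spirit of the OSM Bright template, not the actual stylesheet):

```mss
/* Illustrative only: grayscale palette, dotted pedestrian routes,
   faded private-access roads. */
#roads[type='footway'],
#roads[type='path'] {
  line-color: #bbb;
  line-dasharray: 2, 3;   /* render as dotted lines */
}
#roads[access='private'] {
  line-opacity: 0.4;      /* fade roads without public access */
}
```

Each rule selects features by their OpenStreetMap tags and restyles them, which is what makes a fully custom look possible on open data.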

Overlapping bridges in Boston.

Coming soon: OSM Bright

Many of the adjustments that I've made for this minimal style are things that can be pulled back into the OSM Bright template project. I'll be working on doing this in the near future as I wrap up work on the minimal design. Keep an eye on GitHub for these improvements as well as our blog for information about when the minimal design will become available for use.

MapBox for design

If you're interested in making your own custom maps, try using TileMill to style your data and pull in extracts from OpenStreetMap. Documentation is available on MapBox.com/Help. We are close to launching TileMill on Windows, so that in the coming weeks anyone using Windows, Mac or Ubuntu operating systems will be able to easily design custom web maps. You can see a preview and sign up for updates on MapBox.com/Windows, and we'll post details here on Idea Lab once it's available.

For more information on these tools and on hosting plans to share them online, check out MapBox.

October 07 2011

18:30

What newsrooms can learn from open-source and maker culture

“Newsosaur” blogger and media consultant Alan Mutter some time ago suggested that journalism has become a lot more like Silicon Valley. Newspapers are too risk-averse, he said, and so they “need some fresh DNA that will make them think and act more like techies and less like, well, newspaper people.”

When Seth was at the Hacks/Hackers hack day at ONA11 last month, as part of his larger project studying Hacks/Hackers, he mentioned this idea to Phillip Smith, a digital publishing consultant who has been instrumental in the Knight-Mozilla News Technology Partnership (the same collective we wrote about in August).

Maker culture is a way of thinking — a DIY aesthetic of tinkering, playing around, and rebuilding, all without fear of failure.

While Smith generally agreed with Mutter’s premise — of course Silicon Valley could bring a little dynamism to newspapers and journalism — he offered a caveat: The technology sector that Smith knew a decade ago was more about hacking-in-the-open and building cool stuff for others to enjoy, with a secondary emphasis on making money. Now the inverse is true: Silicon Valley is much less about the ideals of the open web, and much more about (as another observer has put it) short-sighted technology for the sake of “big exits and big profits.”

So it’s a bit of a mistake, we think, to go down the route of saying that journalism needs to become like Silicon Valley, in part because Silicon Valley is not simply a world of innovation, but also a highly competitive, secretive, and unstable metaphor. (Think: Groupon IPO, or even The Social Network.)

Instead, open source might be what people are hoping for when they think about remaking journalism — both in terms of innovating the business/system of news, and in terms of making it more transparent and participatory.

In a widely circulated recent post, Jonathan Stray suggested that the news industry could draw on “maker culture” to create a new kind of journalism — one that plumbs the institutional and technical complexities of major issues (like global finance) in a way that invites bright, curious “makers” to hack the system for the good of society. “This is a theory of civic participation based on empowering the people who like to get their hands dirty tinkering with the future,” Stray wrote. Josh Stearns built on this line of reasoning, arguing that “maker culture is the willingness to become experts of a system, and then use that expertise to change the system.”

Their approach to “maker culture,” we believe, can have a direct and practical implementation in journalism through a focus on integrating open source into the newsroom. As both Stray and Stearns point out: Maker culture is a way of thinking — a DIY aesthetic of tinkering, playing around, and rebuilding, all without fear of failure. Just the kind of thing journalism needs.

Maker culture is bound up in the technology and ethos of hacker culture, as James Losey of the New America Foundation has helpfully shown us. Losey and his colleague Sascha Meinrath think this kind of “internet craftsmanship” is instrumental to sustaining the very architecture of the web itself: having the freedom, control, and playful curiosity to shape networking technologies to meet one’s needs. Gutenberg, as Losey and Meinrath argue, was the first hacker, fundamentally rethinking technology. And it’s from this hacker mindset that we can take cues for rebooting the tools, practices, and frameworks of journalism.

Silicon Valley is not just a world of innovation, but also a highly competitive, secretive, and unstable metaphor.

So: Add maker/hacker culture, mix in a bit of theorist Richard Sennett, who believes in the capacity of individuals to reshape and innovate new media, and sprinkle some open-source juice into journalism, and you get the following:

1. New tools, stage one
We see this already. At ONA11, Knight-Mozilla’s Dan Sinker led a panel on open source in the newsroom that featured representatives from several major players: ProPublica (Al Shaw), The New York Times (Jacqui Cox), and the Chicago Tribune’s News Apps Team (Brian Boyer). Meanwhile, folks at The Seattle Times are using fairly simple tools like Tableau Public to visualize census data. In short: Already there are people inside newsrooms building cool and creative things.

2. New tools, stage two
This stage means going beyond the existing crop of databases, visualizations, and crowdsourcing applications (amazing as they are!) to look a bit more holistically at the system of news and the incorporation of open source in truly opening up the newsroom. In other words, can news organizations build open-source platforms for refining whole content management systems, or for building entirely new distribution systems?

Gutenberg was the first hacker, fundamentally rethinking technology.

Reflecting on Knight-Mozilla’s “hacking and making” week in Berlin — a gathering (dubbed, despite the month, “Hacktoberfest”) that featured journalists, designers, developers, and several news organization partners — Sinker made an interesting observation about open-source tools for newsrooms. Some of the news partners worried that “open-source code would reveal too much,” but then it dawned on them that coordination among them would actually be facilitated by “working in the open.” They realized that “it meant far more than just code — it meant a new way of working, of embracing collaboration, and of blazing a real way forward.”

Beyond the benefits to collaboration, it’s important to remember that “open source” doesn’t necessarily mean “non-commercial.” If newsrooms develop open-source tools that make newswork (or knowledge management generally) easier, they can find revenue opportunities in selling services around that open code.

3. New thinking: A maker mindset + open source
What does it mean to incorporate the tinkering and playing and reshaping of maker culture back into journalism? The news industry is one of the last great industrial hold-overs, akin to the car industry. Newsrooms are top-heavy, and built on a factory-based model of production that demands a specific output at the end of the day (or hour). They’re also hierarchical, and, depending on whom you ask, the skills of one journalist are, for the most part, interchangeable with those of most other journalists — in part because journalists share a common set of values, norms, and ideals about what it means to do their work.

Thus, merging elements of maker culture and journalism culture might not be easy. Challenging the status quo is hard. The expectations of producing content, of “feeding the beast,” might get in the way of thinking about and tinkering with the software of news, maker-style. It can’t just be the newsroom technologists hacking the news system; it has to be journalists, all of them, reflecting on what it means to do news differently. We have to make time for journalists to rethink and reshape journalism like a hacker would retool a piece of code.

4. New frameworks: The story as code
While observing the Knight-Mozilla digital learning lab, some of the coolest things we saw were the final projects designed to reimagine journalism. (See all of the projects here and here.) What made these pitches so interesting? Many of them tried to bring a fundamental rethink to problems journalism is struggling to resolve — for instance, how to make information accessible, verifiable, and shareable.

So if we think about the story as code, what happens? It might seem radical, but try to imagine it: Journalists writing code as the building blocks for the story. And while they write this code, it can be commented on, shared, fact-checked, or augmented with additional information such as photos, tweets, and the like. This doesn’t have to mean that a journalist loses control over the story. But it opens up the story, and puts it on a platform where all kinds of communities can actively participate as co-makers.

We have to make time for journalists to rethink and reshape journalism like a hacker would retool a piece of code.

In this way, it’s a bit like the “networked journalism” envisioned by Jeff Jarvis and Charlie Beckett — although this code-tweaking and collaboration can come after the point of initial publication. So, your investigative pieces are safe — they aren’t open-sourced — until they become the source code for even more digging from the public.

In all of this thinking of the ways open source is changing journalism, there are some clear caveats:

1. For open-source projects to succeed, they require lots of people, a truly robust network of regular contributors. Given the amount of time that people might be willing to spend with any one article, or with news in general, who knows whether the real public affairs journalism that might benefit the most from open source would, in fact, get the kind of attention it needs to change the framework.

2. Open source requires some form of leadership. Either you have someone at the top making all the decisions, or you have some distributed hierarchy. As one newspaper editor told a fellow academic, Sue Robinson, in her study of participatory journalism, “Someone’s gotta be in control here.”

Image by tiana_pb used under a Creative Commons license.

15:00

Apply or nominate for the Antonio Pizzigati Prize!

Through October 31, 2011, you can apply or submit a nomination for the Antonio Pizzigati Prize, an annual award for open source software developers. The $10,000 prize is funded by the Florence and Frances Family Fund of the Tides Foundation and honors the brief life of Tony Pizzigati, an early advocate of open source computing.

“And The Prize Goes To...”


The Pizzigati Prize challenge seeks to recognize developers who are making a two-faceted contribution to social change. First, they have an important practical impact: their software helps nonprofits both become more effective on a daily basis and build their capacity to better inform and mobilize their constituents. In addition, public interest software developers play a broader role. The ideals of public interest computing, as they have evolved inside the open source movement, promote collaboration and sharing.


Applicants will be evaluated on a range of criteria by an advisory panel that includes past winners of the Prize. The winner is expected to have:

  • Developed an elegant open source software product that serves a critical need in the broader U.S.-based nonprofit community
  • Evolved a plan to scale the product through wide distribution of the code
  • Exemplified the values of public interest computing
  • Demonstrated vision and inspired innovation in the field of public interest computing


All completed application materials for the 2012 prize competition, including

  • an application or nomination form, as well as
  • a link pointing to the relevant software,

must be sent in one email to pizzigatiprize@tides.org no later than 5 p.m. EST on October 31, 2011. The Tides Foundation, as host of the prize process, will name the next annual Pizzigati Prize winner at the Nonprofit Technology Network's 2012 Nonprofit Technology Conference in April 2012.


October 05 2011

12:05

3 Key Reflections From Knight-Mozilla's Hacktoberfest in Berlin

Last week, the Knight-Mozilla News Technology Partnership invited 20 developers, designers, and journalists to take part in a week of hacking and making in Berlin. I forget at what point in the planning one of the participants jokingly called it "Hacktoberfest," but the name stuck. And so now that the jet lag has worn off for the most part, I thought I'd reflect on three of my standout moments of Hacktoberfest and how they're influencing my thinking moving forward on the Knight-Mozilla project.

Working in the open

Sitting in a meeting with our news partners, I got to witness a great moment. At the start of that meeting, a discussion cropped up around the Partnership's core belief that code produced by Knight-Mozilla fellows should be open-sourced. There was hesitation on the part of some partners, worried that open-source code would reveal too much. An hour or so later, there was a discussion about possible collaborations among partners' newsrooms, but it wasn't making much headway, as collaboration with possible competitors is not the normal order of business.

But then it dawned on everyone: Open source made that a non-issue. By working in the open, fellows won't simply be producing things for their host organizations, but for any news organization that wants to use the code. You could see people linking back to the earlier conversation about open source and realizing that it meant far more than just code -- it meant a new way of working, of embracing collaboration, and of blazing a real way forward.

Quit yakking and start hacking

Sitting in the back of our main hackspace at Betahaus, watching team after team get up and present their work, it dawned on me how awesome it was to spend four days seeing people with disparate skill sets truly collaborate around building something.

Too often we orient getting people together around having a drink or listening to a speaker. "Quit yakking and start hacking" was the order of the day, and it worked. Multiple projects went from just an idea to a functioning demo in a matter of days. It's gratifying to me that there is a GitHub repo full of code from the week. Even more so that it was built through open collaboration among so many different types of people.

hacktoberfest.jpg

A new community

After dinner one evening, we took both the Hacktoberfest participants and representatives from our news partners to Cbase, a storied (and slightly ramshackle) hacker space in Berlin. Standing at the bar next to a guy with a huge beard and a leather kilt, I looked out over the main room and was genuinely moved as I watched many from our group moving a table strewn with their laptops over to join in with a table full of German hackers. My eyes adjusted to the blacklight, and I saw hackers, journalists, developers and news partners all sitting around together, socializing and drinking and making. It was awesome -- a real lasting image of a new community built in Berlin.

So what does all this mean for the Knight-Mozilla News Technology Partnership moving forward? Well, in the short term, Hacktoberfest was the last step in a lengthy process to arrive at our 2011 fellows -- expect an announcement in a few weeks. But longer term, I think there are some real lessons to be learned from the event in Berlin, and some real ways those lessons will help to shape the Partnership in 2012:

  • I think the news partners really enjoyed feeling a part of the process, of meeting people and being engaged in the ideas being bandied about. Getting the news partners to be partners throughout the year, instead of simply hosts for fellows at the end, is definitely a key step.
  • Also great: more opportunities to make code. The paths blazed by the Partnership in 2011 centered on design challenges and learning labs, which I think were both successful and should be replicated, but there wasn't anywhere near enough hacking going on -- so more code in 2012 is a great goal.
  • Finally, building community is important. It's easy to get focused on process and look inward for community, but figuring out ways to intersect with the community around news innovation and making, as well as with the many developer, design and open-gov communities that intersect with journalism, is crucial.

Three moments, three lessons learned. Let's hear it for a successful Hacktoberfest!

A version of this story first appeared here.

August 03 2011

14:00

Transparency, iteration, standards: Knight-Mozilla’s learning lab offers journalism lessons of open source

This spring, the Knight Foundation and Mozilla took the premise of hacks and hackers collaboration and pushed it a step further, creating a contest to encourage journalists, developers, programmers, and anyone else so inclined to put together ideas to innovate news.

Informally called “MoJo,” the Knight-Mozilla News Technology Partnership has been run as a challenge, the ultimate prize being a one-year paid fellowship in one of five news organizations: Al Jazeera English, the BBC, the Guardian, Boston.com, and Zeit Online.

We’ve been following the challenge from contest entries to its second phase, an online learning lab, where some 60 participants, selected on the basis of their proposals, took part in four weeks of intense lectures. At the end, they were required to pitch a software prototype designed to make news, well, better.

Through the learning lab, we heard from a super cast of web experts, like Chris Heilmann, one of the guys behind the HTML5 effort; Aza Raskin, the person responsible for Firefox’s tabbed browsing; and John Resig, who basically invented the jQuery JavaScript library; among other tech luminaries. (See the full lineup.)

There was a theme running through the talks: openness. Not only were the lectures meant to get participants thinking about how to make their projects well-designed and up to web standards, but they also generally stressed the importance of open-source code. (“News should be universally accessible across phones, tablets, and computers,” MoJo’s site explains. “It should be multilingual. It should be rich with audio, video, and elegant data visualization. It should enlighten, inform, and entertain people, and it should make them part of the story. All of that work will be open source, and available for others to use and build upon.”)

We also heard from journalists: Discussing the opportunities and challenges for technology and journalism were, among other luminaries, Evan Hansen, editor-in-chief of Wired.com; Amanda Cox, graphics editor of The New York Times; Shazna Nessa, director of interactive at the AP; Mohamed Nanabhay, head of new media at Al Jazeera English; and Jeff Jarvis.

In other words, over the four weeks of the learning lab’s lectures, we heard from a great group of some of the smartest journalists and programmers who are thinking about — and designing — the future of news. So, after all that, what can we begin to see about the common threads emerging between the open source movement and journalism? What can open source teach journalism? And journalism open source?

Finding 1:
* Open source is about transparency.
* Journalism has traditionally not been about transparency, instead keeping projects under wraps — the art of making the sausage and then keeping it stored inside newsrooms.

Because open-source software development often occurs among widely distributed and mostly volunteer participants who tinker with the code ad-hoc, transparency is a must. Discussion threads, version histories, bug-tracking tools, and task lists lay bare the process underlying the development — what’s been done, who’s done it, and what yet needs tweaking. There’s a basic assumption of openness and collaboration achieving a greater good.

Ergo: In a participatory news world, can we journalists be challenged by the ethics of open source to make the sausage-making more visible, even collaborative?

No one is advocating making investigative reporting an open book, but sharing how journalists work might be a start. As Hansen pointed out, journalists are already swimming in information overload from the data they gather in reporting; why not make some of that more accessible to others? And giving people greater space for commenting and offering correction when they think journalists have gone wrong — therein lies another opportunity for transparency.

Finding 2:
* Open source is iterative.
* Journalism is iterative, but news organizations generally aren’t (yet).

Software development moves quickly. Particularly in the open source realm, developers aren’t afraid to make mistakes and make those mistakes public as they work through the bugs in a perpetual beta mode rather than wait until ideas are perfected. The group dynamic means that participants feel free to share ideas and try new things, with a “freedom to fail” attitude that emphasizes freedom much more than failure. Failure, in fact, is embraced as a step forward, a bug identified, rather than backward. This cyclical process of iterative software development — continuous improvement based on rapid testing — stands in contrast to the waterfall method of slower, more centralized planning and deployment.

On the one hand, journalism has iterative elements, like breaking news. As work, journalism is designed for agility. But journalism within legacy news organizations is often much harder to change, and tends to be more “waterfall” in orientation: The bureaucracy and business models and organizational structures can take a long time to adapt. Trying new things, being willing to fail (a lot) along the way, and being more iterative in general are something we can learn from open-source software.

Finding 3:
* Open source is about standards.
* So is journalism.

We were surprised to find that, despite its emphasis on openness and collaboration, the wide world of open source is also a codified world with strict standards for implementation and use. Programming languages have documentation for how they are used, and there is generally consensus among developers about what looks good on the web and what makes for good code.

Journalism is also about standards, though of a different kind: shared values about newsgathering, news judgment, and ethics. But even while journalism tends to get done within hierarchical organizations and open-source development doesn’t, journalism and open source share essentially the same ideals about making things that serve the public interest. In one case, it’s programming; in the other case, it’s telling stories. But there’s increasingly overlap between those two goals, and a common purpose that tends to rise above mere profit motive in favor of a broader sense of public good.

However, when it comes to standards, a big difference between the open-source movement and journalism is that journalists, across the board, aren’t generally cooperating to achieve common goals. While programmers might work together to make a programming language easier to use, news organizations tend to go at their own development in isolation from each other. For example, The Times went about building its pay meter fairly secretly: While in development, even those in the newsroom didn’t know the details about the meter’s structure. Adopting a more open-source attitude could teach journalists, within news organizations and across them, to think more collaboratively when it comes to solving common industry problems.

Finding 4:
* Open-source development is collaborative, free, and flexible.
* Producing news costs money, and open source may not get to the heart of journalism’s business problems.

Open-source software development is premised on the idea of coders working together, for free, without seeking to make a profit at the expense of someone else’s intellectual property. Bit by bit, this labor is rewarded by the creation of sophisticated programming languages, better-and-better software, and the like.

But there’s a problem: Journalism can’t run on an open source model alone. Open source doesn’t give journalism any guidance for how to harness a business model that pays for the news. Maybe open-source projects are the kind of work that will keep people engaged in the news, thus bulking up traditional forms of subsidy, such as ad revenue. (Or, as in the case of the “open R&D” approach of open APIs, news organizations might use openness techniques to find new revenue opportunities. Maybe.)

Then again, the business model question isn’t, specifically, the point. The goal of MoJo’s learning lab, and the innovation challenge it’s part of, is simply to make the news better technologically — by making it more user-friendly, more participatory, etc. It’s not about helping news organizations succeed financially. In all, the MoJo project has been more about what open source can teach journalism, not vice versa. And that’s not surprising, given that the MoJo ethos has been about using open technologies to help reboot the news — rather than the reverse.

But as the 60 learning lab participants hone their final projects this week, in hopes of being one of the 15 who will win a next-stage invite to a hackathon in Berlin, they have been encouraged to collaborate with each other to fill out their skill set — by, say, a hack partnering with a hacker, and so forth. From those collaborations may come ideas not only for reinventing online journalism, but also for contributing to the iteration of open-source work as a whole.

So keep an eye out: Those final projects are due on Friday.

May 13 2011

14:45

Mapping the Japan Earthquake to Help Recovery Efforts

In the days following the earthquake in Japan, members of the Business Civic Leadership Center pledged more than $240 million to aid response and recovery efforts. Their challenge was to figure out how to dispense that money to the projects and people who needed it most. To help them visualize the scope of the disaster and identify the areas that were most affected, we developed an interactive map of the magnitude 5.0-plus aftershocks in the days after the initial 9.0 quake.


Interactive maps like this are great for communicating a lot of information quickly and putting information in context -- in this case, the impact of an earthquake that happened halfway around the world. With the increasing availability of open data sets and new mapping technologies, it's now much easier -- and cheaper -- to build maps like this.

We built this map using the open-source map design studio TileMill, a free tool that we've written about before that allows you to create custom maps using your own data, and open data released by the United States Geological Survey.

Below is a walk-through of how we built this map and details on how you can build your own interactive map using open data and TileMill.


Finding the Data

The U.S. Geological Survey publishes data feeds of recent earthquake readings in a variety of formats. The feeds are geocoded so you can plot the epicenters of each report using longitude and latitude coordinates. For this map, we converted the RSS feed of 5.0-plus magnitude earthquakes over seven days to a shapefile and used TileMill to style the data and add interactivity. You could also download the KML feeds and load them into TileMill directly.
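The selection step -- magnitude 5.0 or greater within a seven-day window -- can be sketched in a few lines of JavaScript. The record fields below are simplified stand-ins, not the exact USGS feed schema:

```javascript
// Filter a simplified list of earthquake records down to the
// aftershocks we want to map: magnitude 5.0+ within seven days
// of the main shock. Field names are illustrative.
function selectAftershocks(records, mainShockTime, days = 7, minMag = 5.0) {
  const windowMs = days * 24 * 60 * 60 * 1000;
  return records.filter(
    (r) =>
      r.mag >= minMag &&
      r.time >= mainShockTime &&
      r.time - mainShockTime <= windowMs
  );
}

// Example: main shock on March 11, 2011, 05:46 UTC.
const mainShock = Date.UTC(2011, 2, 11, 5, 46);
const feed = [
  { id: 'a', mag: 9.0, time: mainShock },
  { id: 'b', mag: 6.2, time: mainShock + 3 * 60 * 60 * 1000 },
  { id: 'c', mag: 4.1, time: mainShock + 60 * 60 * 1000 },          // too small
  { id: 'd', mag: 5.5, time: mainShock + 9 * 24 * 60 * 60 * 1000 }, // too late
];
const aftershocks = selectAftershocks(feed, mainShock);
console.log(aftershocks.map((r) => r.id)); // [ 'a', 'b' ]
```

Each record that passes the filter would then be written out as a point (with its longitude/latitude) for TileMill to style.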

In addition to the point-based epicenter data, we used shapefile data of the Shakemap from the initial earthquake, which shows the ground movement and intensity of shaking caused by a seismic event. This layer provides greater context to the impact felt around the epicenter points.


Building the Map

We used TileMill to design the map and apply an interactive "hover" layer, which allows you to show information when you mouse over a point on the map -- in this case, the epicenter of an aftershock. Below is a look at the editing interface in TileMill. For more on how interactivity works in TileMill, check out this blog post from Bonnie.

We then rendered the map to MBTiles, an open file format for storing lots of map tiles and interactivity information in one file. MBTiles can be hosted on the web or displayed offline on mobile devices like the iPad. For this map, we used TileStream Hosting to host the map online. It has an embed feature that let us embed the map on an otherwise static HTML page. The embed code is also publicly available, so others can embed your map on their own site. Check out this article on O'Reilly Radar for an example of this in action. You can make your own embed of this map by clicking on "embed" here.


Adding Advanced Interactivity

By default, the interactivity in TileMill lets you select to have a "hover" or "click" style for interactivity. When you embed your map on a webpage, you can override this default behavior with some client-side code. For this site, we added some CSS styles and used JavaScript to build a timeline based on the dates in the overlays of each interactive point.

Now when you hover over a point on the embedded map, instead of the usual popup, the corresponding element in the timeline expands. This lets users see the relationship between time, space, and magnitude in an intuitive way. All the code to make this work is available in the page -- just "view source" to check it out.
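The core of that timeline logic can be sketched as a pure function that turns the map's interactive points into sorted entries for the hover code to highlight. The field names here are hypothetical, not TileMill's actual overlay format:

```javascript
// Build a sorted timeline from the map's interactive points so a
// hover on the map can expand the matching entry. Field names are
// hypothetical stand-ins for the data attached to each point.
function buildTimeline(points) {
  return points
    .slice() // don't mutate the caller's array
    .sort((a, b) => a.time - b.time)
    .map((p, i) => ({
      index: i,
      id: p.id,
      label: `${new Date(p.time).toISOString().slice(0, 10)} M${p.mag.toFixed(1)}`,
    }));
}

const entries = buildTimeline([
  { id: 'q2', mag: 6.2, time: Date.UTC(2011, 2, 12) },
  { id: 'q1', mag: 9.0, time: Date.UTC(2011, 2, 11) },
]);
console.log(entries[0].id); // 'q1'
```

Wiring it up is then a matter of matching ids: on hover, expand the timeline entry whose id matches the hovered point.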

You can download TileMill for free here and find more documentation on how to use it at support.mapbox.com.

May 10 2011

15:23

May 05 2011

18:27

Is Non-Profit Journalism A Safeguard for Press Freedom?

wpfd2011logo200.jpg

WASHINGTON, DC -- Since May 3, 1991, World Press Freedom Day has been celebrated annually worldwide to raise awareness of the importance of freedom of the press and remind governments of their duty to respect it. Marking the 20th anniversary last Tuesday, an international conference was organized in Washington, DC, by the United Nations Educational, Scientific and Cultural Organization (UNESCO) and the U.S. State Department to debate the "new frontiers" of the media. You can see the entire agenda here.

Online freedom and the changing media landscape had pride of place, and I was given the opportunity to debate online censorship on May 2, as well as to discuss the current relationship between "traditional" and "new" media, as a representative of Reporters Without Borders. (Note that Reporters Without Borders also has a special World Day Against Cyber-Censorship focused entirely on online expression.)

In countries where online platforms are tightly controlled -- but are also some of the rare places to get uncensored information -- the line between traditional and new media is very blurry. It's possible that non-profit journalism websites (or sites where the news isn't a profit center) might help safeguard press freedom.

Reports from Malaysia, France

In Malaysia, Premesh Chandran had to adapt to the fact that advertisers were staying away because the information published on Malaysiakini.com did not fit with the controls imposed on media by the government. Malaysia is ranked 141st out of 178 countries in the 2010 Reporters Without Borders press freedom index. Without ads, Malaysiakini installed a paywall for its English version. The website considered a non-profit business model, but according to Chandran, "It became obvious that [they] had to become more professional." The subscription allows a core audience to support the website's news activities. But Chandran acknowledges that "readers don't pay."

In France, OWNI.fr depends on the expertise of reporters and licensed content for its free website, but makes money by selling journalism services to online publishers. (You can read more about OWNI in this story by Mark Glaser on MediaShift.)

"In terms of client acquisition, this is very helpful," according to OWNI's director of data journalism Nicholas Kayser-Bril. OWNI worked with WikiLeaks on a non-profit basis and organized the crowdsourcing for documents that were released. It is now an expertise that they can sell to other organizations. For this website, the content and features are a non-profit activity, because the income is generated by services instead. "This is a way of adapting journalism to the technologies," said Kayser-Bril.

Open Source Software at AllAfrica.com

Convinced that mobile phones are making a huge impact on the way media operate in Africa, Amadou Mahtar Ba, co-founder of AllAfrica.com, insisted that "traditional media need to adapt to technology. Many media organizations are losing relevance and there is a fundamental growth of mobile phones."

"Media owners and operators need guidelines and principles, as journalists have theirs," Ba said.

AllAfrica.com is a news content publisher and relies on the development of systems based on free and open source software, such as XML::Comma, released under the GNU General Public License. It has become the entry point to a global, Africa-interested audience, as well as a pioneering set of technologies. Here again, journalism is a non-profit activity.


According to Richard Tofel, general manager of ProPublica, non-profit journalism has a role in stepping in where "traditional" media have failed economically, taking on the risks the latter can no longer afford.

"We are going to a new territory based on a technological revolution," he said. "We need experimentation and a willingness to take risks almost every day to discover these new ways," said Tofel, when asked about the training journalists should receive to handle these different ways of making the news.

Press freedom is not only about journalists being killed and harassed and newspapers being forced to close by oppressive governments. It is also about guaranteeing independence -- independence from advertisers is no less complicated than independence from donors. At the panel discussion, one of the solutions was making money from readers and services. These publications do bring in money and are trying to get their readers to adapt to new technologies. Non-profit journalism, in the sense of news not being the profitable activity, is a way of helping to guarantee more editorial independence. This is one more possible safeguard for press freedom.

Photo of the Newseum by Clothilde Le Coz

Clothilde Le Coz has been working for Reporters Without Borders in Paris since 2007. She is now the Washington director for the organization, helping to promote press freedom and free speech around the world. In Paris, she was in charge of the Internet Freedom desk and worked especially on China, Iran, Egypt and Thailand. During her time in Paris, she also updated the "Handbook for Bloggers and Cyberdissidents," published in 2005. Her role now is to get the message out so that readers and politicians are aware of the constant threat journalists are subjected to in many countries.


April 22 2011

12:49

How to Design Fast, Interactive Maps Without Flash

Until recently, if you wanted to create a fast interactive map for your website you had two main options: design it in Flash, or use Google. With the prevalence of mobile devices, Flash isn't an option for many users, leaving Google and a few competitors (like Bing). But we are developing open source technologies in this space that provide viable alternatives for serving fast interactive maps online, ones that often give users more control over map design and the data displayed on it.

TileMill, our open source map design studio, now provides interactivity in the latest development version on GitHub. Once you design a map with TileMill, you can enable certain data in the shapefile to be interactive.

Map interactivity in the latest version of TileMill

When you export a map to MBTiles, a file format that makes it easy to manage and share map tiles, all the interaction data is stored within the MBTiles file itself. This allows us to host interactive maps that are completely custom designed, including the look and feel and the data points, and that are as fast as Google Maps.
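Under the hood, an MBTiles file is simply a SQLite database with a couple of well-known tables, which is what makes it easy to manage and share. Here is a minimal Python sketch of that layout, using only the standard library; the schema follows the MBTiles format, but the map name and tile bytes are made-up placeholders, and a real file would be opened by passing its path to sqlite3.connect() instead of building one in memory:

```python
import sqlite3

# Build a tiny MBTiles-style database in memory to show the schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE metadata (name TEXT, value TEXT);
    CREATE TABLE tiles (zoom_level INTEGER, tile_column INTEGER,
                        tile_row INTEGER, tile_data BLOB);
""")
conn.execute("INSERT INTO metadata VALUES ('name', 'demo-map')")
conn.execute("INSERT INTO metadata VALUES ('format', 'png')")
# A real tile would be a full PNG image; this is a placeholder blob.
conn.execute("INSERT INTO tiles VALUES (0, 0, 0, ?)", (b"\x89PNG...",))

# Read the map's metadata and look up the single tile at zoom 0.
meta = dict(conn.execute("SELECT name, value FROM metadata"))
tile = conn.execute(
    "SELECT tile_data FROM tiles WHERE zoom_level=0 "
    "AND tile_column=0 AND tile_row=0").fetchone()
print(meta["name"])  # demo-map
```

Because it's all just SQLite, any language with SQLite bindings can serve tiles straight out of the file, which is part of why the format travels so well.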

An example of an interactive map using TileMill is the map in NPR's I Heart NPR Facebook App, an app that asks users to choose and map their favorite member station.

NPR Using TileMill

Yesterday, Tom MacWright gave a talk at the Where 2.0 Conference, a leading annual geo conference, about designing fast maps and other emerging open source interactive mapping technologies, comparing them specifically to Google. If you're interested in learning more and weren't at the conference, check out his slides, which are posted on our blog.

March 10 2011

15:38

Attend Penguin Day DC for Open Source Learning on March 20

Just after the Nonprofit Technology Conference in Washington, another great event for nonprofits and social justice activists is coming up on March 20. Penguin Day is designed to let nonprofits and social justice activists learn about free and open source software that can support their work and potentially save them money, including tools for web publishing, fundraising, blogging, and campaigning. Some sessions are already planned, but the organizers strongly encourage participants to request a session topic. Penguin Day DC 2011 is organized by Aspiration, PICnet, NOSI and CiviCRM.

February 23 2011

20:10

Help Me Investigate is now open source

I have now released the source code behind Help Me Investigate, meaning others can adapt it, install it, and add to it if they wish to create their own crowdsourcing platform or support the idea behind it.

This follows the announcement two weeks ago on the Help Me Investigate blog (with more coverage on Journalism.co.uk and Editors Weblog).

The code is available on GitHub.

Collaborators wanted

I’m looking for collaborators and coders to update the code to Rails 3, write documentation to help users install it, improve the code and tests, or even act as project manager for the project.

Over the past 18 months the site has surpassed my expectations. It’s engaged hundreds of people in investigations, furthered understanding and awareness of crowdsourcing, and been runner-up for Multimedia Publisher of the Year. In the process it attracted attention from around the world – people wanting to investigate everything from drug running in Mexico to corruption in South Africa.

Having the code on one site meant we couldn’t help those people: making it open source opens up the possibility, but it needs other people to help make that a reality.

If you know anyone who might be able to help, please shoot them a link. Or email me at paul(at)helpmeinvestigate.com

Many thanks to Chris Taggart and Josh Hart for their help with moving the code across.

January 20 2011

15:30

Boston Hack Day Challenge: An open door to Boston.com

Count The Boston Globe among the growing number of organizations that want hackers to come in from the cold. On the weekend of Feb. 25 they’re holding a three-day event called the “Boston Hack Day Challenge” where developers, designers, coders and anyone else inclined to make apps will gather to “create new online and mobile products that can make life better for Bostonians.”

We’ve got our share of tech heads around the area thanks to schools like MIT and Harvard, not to mention start-ups (perhaps you’ve heard of SCVNGR?), and the Globe is looking to capitalize on that to help promote their new digital test kitchen, Beta.Boston.com.

In the last few years a number of companies, in and outside of media, have dabbled in hackathons, sometimes to try and associate their name with innovation, other times to try and find the best new talent and products to cherrypick. The New York Times started the Times Open series a few years ago to get New York’s tech community tied into the newspaper and help nudge along the concept of the journo-programmer. We’ve also seen journalists, programmers and developers come together in crises like last year’s earthquakes in Haiti, to try and build tools to aid in communication and emergency response. (And I would be remiss if I didn’t mention the work of Hacks/Hackers, which has held a number of developer events like Hacks/Hackers Unite.)

At the Boston Hack Day Challenge, teams will use the weekend to build a site or app dedicated to alleviating one problem or another in the Boston area. (One example would be something like the OpenMBTA app, which I can vouch for as making it easier to catch the bus or T.)

All of these fit quite nicely with Beta.Boston.com, where the Globe’s digital team has been quietly releasing online products, and highlighting apps and sites created by others, including Citizen’s Connect, an app to report issues to the mayor’s office. You’ll also find their early OpenBlock demo with news and data from Boston neighborhoods.

The team at the Globe says to keep an eye on the beta space as they roll out toys and features for BostonGlobe.com, the new subscriber site that will parallel Boston.com.
