
July 25 2012

13:29

Presentation: Tools for visual storytelling workshop by @Coneee

During news:rewired – full stream ahead on Friday (13 July), Conrad Quilty-Harper, interactive news editor at the Telegraph, ran a workshop on tools for visual storytelling.

Here is a copy of his presentation:

April 30 2012

14:00

How to Contribute to OpenStreetMap and Grow the Open Geodata Set

Hundreds of delegates from government, civil society, and business gathered in Brasilia recently for the first Open Government Partnership meetings since the inception of this initiative. Transparency, accountability, and open data as fundamental building blocks of a new, open form of government were the main issues debated. On the occasion of these meetings, we took the opportunity to expand an open data set by adding street names to OpenStreetMap.

Getting ready to survey the Cruzeiro neighborhood in Brasilia.

OpenStreetMap, sometimes dubbed the "Wikipedia of maps," is an open geospatial database. Anyone can go to openstreetmap.org, create an account, and add to the world map. The accessibility of this form of contribution, paired with the openness of its common data repository, holds a powerful promise of commoditized geographic data.

As this data repository evolves, along with corresponding tools, many more people gain access to geospatial analysis and publishing -- capabilities previously limited to a select few.

When Steve Coast founded OpenStreetMap in 2004, the proposition to go out and crowdsource a map of the world must have sounded ludicrous to most. After pivotal growth in 2008 and the widely publicized rallying around mapping Haiti in 2010, the OpenStreetMap community has proven how incredibly powerful a free-floating network of contributors can be. There are more than 500,000 OpenStreetMap contributors today. About 3 percent (that's still a whopping 15,000 people) contribute a majority of the data, with roughly 1,300 contributors joining each week. Around the time when Foursquare switched to OpenStreetMap and Apple began using OpenStreetMap data in iPhoto, new contributors jumped to about 2,300 per month.

As the Open Government Partnership meetings took place, we wanted to show people how easy it is to contribute to OpenStreetMap. So two days before the meetings kicked off, we invited attendees to join us for a mapping party, where we walked and drove around neighborhoods surveying street names and points of interest. This is just one technique for contributing to OpenStreetMap, one that is quite simple and fun.

Here's a rundown of the most common ways people add data to OpenStreetMap.

Getting started

It takes two minutes to get started with contributing to OpenStreetMap. First, create a user account on openstreetmap.org. You can then immediately zoom to your neighborhood, hit the edit button, and get to work. We recommend that you also download the JOSM editor, which is needed for more in-depth editing.

Once you start JOSM, you can download an area of OpenStreetMap data, edit it, and then upload it. Whatever you do, it's crucial to add a descriptive commit message when uploading -- this is very helpful for other contributors to figure out the intent and context of an edit. Common first edits are adding street names to unnamed roads, fixing typos, and adding points of interest like a hospital or a gas station. Keep in mind that any information you add to OpenStreetMap must be observed fact or taken from data in the public domain -- so, for instance, copying street names from Google is a big no-no.
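Under the hood, every OpenStreetMap element is just geometry plus free-form key/value tags. As a rough sketch of that data model (the id, coordinates, and name below are invented for illustration), here is how a single point of interest could be assembled as OSM-style XML in Python:

```python
import xml.etree.ElementTree as ET

# A single point of interest in OSM's data model: a node plus key/value
# tags. The id, coordinates, and name are invented for illustration.
node = ET.Element("node", id="-1", lat="-15.7942", lon="-47.8822")
ET.SubElement(node, "tag", k="amenity", v="fuel")
ET.SubElement(node, "tag", k="name", v="Posto Cruzeiro")

print(ET.tostring(node, encoding="unicode"))
```

Editors like JOSM build and upload exactly this kind of tagged structure for you; the point is that an "edit" is nothing more exotic than attaching tags like `amenity=fuel` or `name=...` to a coordinate.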

Satellite tracing and GPS data

JOSM allows for quick tracing of satellite images. You can simply turn on a satellite layer and start drawing the outlines of features visible there, such as streets, building footprints, rivers, and forests. Using satellite imagery is a great way to create coverage fast. We've blogged before about how to do this. Here's a look at our progress tracing Brasilia in preparation for the OGP meetings:

Brasilia progress

OpenStreetMap contributions in Brasilia between April 5 and April 12.

In places where good satellite imagery isn't available, a GPS tracker goes a long way. OpenStreetMap offers a good comparison of GPS units. Whichever device you use, the basics are the same -- you track an area by driving or walking around and later load the data into JOSM, where you can clean it up, classify it, and upload it into OpenStreetMap.
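For a sense of what that raw data looks like, GPS units typically export GPX, a simple XML format of timestamped track points. Here is a minimal sketch of pulling coordinates out of a GPX file (the track below is invented for illustration):

```python
import xml.etree.ElementTree as ET

# A tiny inline GPX track of the kind a handheld GPS unit exports;
# the coordinates are invented for illustration.
GPX = """<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1" creator="demo">
  <trk><trkseg>
    <trkpt lat="-15.7900" lon="-47.8800"><ele>1100</ele></trkpt>
    <trkpt lat="-15.7905" lon="-47.8810"><ele>1101</ele></trkpt>
  </trkseg></trk>
</gpx>"""

NS = {"gpx": "http://www.topografix.com/GPX/1/1"}
root = ET.fromstring(GPX)

# Collect (lat, lon) pairs from every track point.
points = [(float(p.get("lat")), float(p.get("lon")))
          for p in root.findall(".//gpx:trkpt", NS)]
```

JOSM reads these files directly; a script like this is only useful if you want to pre-filter or inspect a track before loading it.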

Synchronizing your camera with your tracker

Synchronizing your camera with the GPS unit.
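One common reason to synchronize the camera's clock with the GPS unit is geotagging: you can match each photo's timestamp against the track log and interpolate a position. A hedged sketch of that idea, using an invented two-point track:

```python
from datetime import datetime, timedelta

# Hypothetical track: (timestamp, lat, lon) fixes logged by the GPS unit.
t0 = datetime(2012, 4, 1, 10, 0, 0)
track = [
    (t0,                         -15.7900, -47.8800),
    (t0 + timedelta(seconds=60), -15.7910, -47.8820),
]

def locate(photo_time, track):
    """Linearly interpolate a (lat, lon) for a photo's timestamp."""
    for (ta, la, lo_a), (tb, lb, lo_b) in zip(track, track[1:]):
        if ta <= photo_time <= tb:
            f = (photo_time - ta).total_seconds() / (tb - ta).total_seconds()
            return (la + f * (lb - la), lo_a + f * (lo_b - lo_a))
    return None  # photo falls outside the track

# A photo taken 30 s into the track lands halfway between the two fixes.
lat, lon = locate(t0 + timedelta(seconds=30), track)
```

This is why the clocks need to agree: a camera running a minute fast would place every photo one fix further down the road.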

Walking papers

For our survey in Brasilia, we used walking papers, which are simple printouts of OpenStreetMap that let you jot down notes on paper. This is a great tool for on-the-ground surveys to gather street names and points of interest. It's as simple as you'd imagine. You walk or drive around a neighborhood and write down information you see that's missing from OpenStreetMap. Check out our report of our efforts doing this in Brasilia on our blog.

Walking papers for Brasilia.

Further reading

For more details on how to contribute to OpenStreetMap, check out Learn OSM -- it's a great resource with step-by-step guides for the most common OpenStreetMap tasks. Also feel free to send us questions directly via @mapbox.

January 20 2012

15:20

How to Create a Minimalist Map Design With OpenStreetMap

Mapping can be as much about choosing what data not to include as what to include, so you can best focus your audience on the story you are telling. Oftentimes with data visualization projects, the story isn't about the streets or businesses or parks, but rather about the data you're trying to layer on the map.

To help people visualize data like this, I've started to design a new minimal base map for OpenStreetMap. What's great about OpenStreetMap is that the data is all open. This means I can take the data and design a totally custom experience. Once finished, the map will serve as another option to the traditional OpenStreetMap baselayer.

I'm designing the new map in the open-source map design studio TileMill, which Development Seed has written about before here. The map can be used as a light, very subtle background to add data on top of, either with our MapBox hosting platform's map builder or on its own. It still provides the necessary geographic context for a map, but moves the focus to the data added on top of the map -- and not details that are irrelevant to its story.

Here's an early look at the features and design aspects I've been working on for the map.

Portland, Ore., on the new OpenStreetMap minimal base map.

Behind the design decisions

As a starting point for the design, I used the open-source OSM Bright template, which you can load into TileMill, and removed all color, choosing to limit the palette to light grays. For simplicity, most land use and land cover area types have been dropped. However, wooded areas and parks remain, indicated with subtle textures instead of color. The fact that OpenStreetMap's data is open gives me full control over exactly what shows up on the map.

The style now includes more types of roads. Tracks have been added, as have pedestrian routes, bike paths, and bridleways, which are shown as dotted lines. Roads without general public access (for example, private roads) are shown faded out. The rendering of overlapping tunnels, streets, and bridges has also greatly improved, with most such lines now separated and stacked in the proper order.

Example Boston bridges
Overlapping bridges in Boston.

Coming soon: OSM Bright

Many of the adjustments that I've made for this minimal style are things that can be pulled back into the OSM Bright template project. I'll be working on doing this in the near future as I wrap up work on the minimal design. Keep an eye on GitHub for these improvements as well as our blog for information about when the minimal design will become available for use.

MapBox for design

If you're interested in making your own custom maps, try using TileMill to style your data and pull in extracts from OpenStreetMap. Documentation is available on MapBox.com/Help. We are close to launching TileMill on Windows, so that in the coming weeks anyone using Windows, Mac or Ubuntu operating systems will be able to easily design custom web maps. You can see a preview and sign up for updates on MapBox.com/Windows, and we'll post details here on Idea Lab once it's available.

For more information on these tools and on hosting plans to share them online, check out MapBox.

August 01 2011

11:26

The Takeaway, Gas prices, Haiti - crowdsourced maps: how to get started and stories to consider

Reynolds Center :: As the ranks of journalists at news organizations shrink, one of our biggest news-gathering assets is our audience. We increasingly rely on users for tips and information via social media, and some companies are working overtime to make crowdsourcing news easier. One of the most interesting emerging uses for all that crowdsourced news is mapping.

All kinds of individual stories can include mapped components. NPR’s “The Takeaway” set up a national gas prices map. The New York Times and WNYC asked users to share bird-watching spots. Following Haiti’s earthquake, users all over the world cobbled together a map of earthquake damage and relief sites to assist aid workers.

Continue to read Rebekah Monson, businessjournalism.org

July 28 2011

13:43

Visualizing 10 Years of Violence Against Journalists in Afghanistan

Internews and Nai, an Afghan media advocacy organization, have collected hundreds of reports of threats, intimidation, and violence faced by journalists in Afghanistan. We recently announced a new site, which features 10 years of these reports. While Nai's data previously resided in spreadsheets, the new site allows the public to access hundreds of reports through visualizations and to download the data directly. With this site we're raising the profile of media freedom in a country often characterized as among the most dangerous in the world for journalists.

Violence Against Journalists - data.nai.org.af

A screenshot of data.nai.org.af.

The site is packed with functionality that allows visitors to interact with the dataset in a variety of ways. Visitors can quickly scan the map to get a national overview of the data. They can drill down on individual provinces and individual years, seeing charts that depict violence over time when they mouse over the dots. If a visitor clicks on a year, they can even browse the data itself in a table just below the map.

We've also allowed visitors to turn on layers that can increase contextual understanding, such as the number of active journalists in each province, the number of media organizations in each province, and so on. Finally, users can download the full dataset and easily generate the code necessary to embed the map on other websites, in electronic press releases, and so on.

We included all this functionality without compromising one of the most important and desirable features of the site: speed. The maps are composited ahead of time, significantly reducing the loading time in Afghanistan and other bandwidth-constrained environments. We've also included a bit of code that dynamically evaluates each visitor's connection and serves map tiles that reflect that visitor's constraints. At the end of the day the maps are fast in spite of poor connections and remain fully interactive.

Violence Against Journalists map featuring ten years of incidents (see the website for more data and details):

July 11 2011

16:02

How TileMill Improved Ushahidi Maps to Protect Children in Africa

In May I worked with Plan Benin to improve its Violence Against Children (VAC) reporting system. The system uses FrontlineSMS and Ushahidi to collect and visualize reports of violence against children. Ushahidi develops open-source software for information collection, visualization and interactive mapping. While in Benin, I was frustrated by the lack of local data available through Google Maps, Yahoo, and even OpenStreetMap -- the three mapping applications Ushahidi allows administrators to use without customization.

While these mapping services are great for places rich in geographic data, many places -- like Benin and other countries in the developing world -- are poorly represented by the major mapping services. Making matters worse is the fact that even when good data is available, slow and unreliable Internet access turns geolocating incidents and browsing the map into a frustrating, time-consuming challenge for staff and site visitors in-country.

In an effort to create a custom map with more local data, I tested out TileMill, Development Seed's open-source map design studio, with successful results.

An area of northwest Benin shown with Google Maps (left) and a custom map built with TileMill (right). Note the number of towns and villages that appear in the map at right.

With little hands-on experience with map design or GIS (geographic information systems), I was happy to find TileMill's Carto-based code intuitive and easy to use.

Because of the lack of data on Benin available through the major mapping services, I thought it would be interesting to visualize the VAC Benin data on a custom map using geographic data obtained by Plan Benin through CENATEL, the National Centre of Remote Sensing and Forest Cover Observation in Benin. I exported reports of violence from Ushahidi into a CSV file using Ushahidi's built-in export functionality. From there, I used Quantum GIS -- an open-source GIS tool -- to convert the data into GeoJSON, an open standard for data interchange that works very well with TileMill.
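The CSV-to-GeoJSON step is mechanical: each exported row becomes a Point feature, with the remaining columns kept as properties. A minimal sketch of the transformation (the rows below are invented, not real Ushahidi exports; note that GeoJSON orders coordinates as [longitude, latitude]):

```python
import csv, io, json

# A few fake Ushahidi-style export rows; real exports carry more columns.
CSV_DATA = """title,lat,lon
Report A,9.3077,2.3158
Report B,10.3000,1.3800
"""

features = []
for row in csv.DictReader(io.StringIO(CSV_DATA)):
    features.append({
        "type": "Feature",
        # GeoJSON coordinate order is [longitude, latitude].
        "geometry": {"type": "Point",
                     "coordinates": [float(row["lon"]), float(row["lat"])]},
        "properties": {"title": row["title"]},
    })

geojson = {"type": "FeatureCollection", "features": features}
print(json.dumps(geojson, indent=2))
```

Quantum GIS does this conversion (and reprojection, which a script this simple skips) through its GUI; the sketch just shows there's no magic in the format itself.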

I then used TileMill to create a map that includes only the data relevant to Plan Benin's activities on this particular project, which helps users focus on the information they need. The map includes geographic data for Atacora and Couffo, the two "Program Units" where Plan Benin operates. (These are highlighted in light blue on the map.)

I also included labels for the important cities in both Program Units and, if you zoom in several levels, village names in Atacora. The red dots indicate reports of violence, and if you mouse over or click on a dot, you can see a summary of the incident. The reports were geolocated by hand using information sent via text message. The map also incorporates MapBox's open-source World Bright base-layer map, adding country borders, custom labels, population centers (in light yellow/brown tones), and other information to the map.

The Tip of the Iceberg

This is really the tip of the iceberg in terms of what TileMill can do. It would also be possible to add as many cities and villages as there are in the dataset, include multimedia-rich interactivity, use a choropleth scheme to indicate hotspots of violence, cluster reports, and so on.

With just a few design choices, this custom map dramatically improves the experience of interacting with data collected through Ushahidi. Highlighting the Program Units draws the eye to the important areas; using deep datasets and custom map labels solves the problem of missing local data; and the built-in interactivity means that visitors don't need to browse to multiple pages (a killer in low-bandwidth environments) to view information on individual reports.

Compositing, which was just rolled out on TileStream Hosting, helps the map load quickly, even in low-bandwidth environments (the maps are now faster than Google Maps), and this map can also be used offline via either the MapBox Appliance or the MapBox iPad app. Finally, TileStream Hosting makes it easy to host the map and generates embed code so the map can be widely shared.

Take a look at the map below and feel free to click over to the VAC Benin Ushahidi site to see the difference for yourself.

VAC Benin data collected with Ushahidi and visualized with TileMill:

Paul Goodman is a master's student at the UC-Berkeley School of Information and is spending the summer working with Development Seed.

May 13 2011

14:45

Mapping the Japan Earthquake to Help Recovery Efforts

In the days following the earthquake in Japan, members of the Business Civic Leadership Center pledged more than $240 million to aid response and recovery efforts. Their challenge was to figure out how to dispense that money to the projects and people who needed it most. To help them visualize the scope of the disaster and identify the areas that were most affected, we developed an interactive map of the aftershocks felt at more than magnitude 5.0 in the days after the initial 9.0 quake.


Interactive maps like this are great for communicating a lot of information quickly and putting information in context -- in this case, the impact of an earthquake that happened halfway around the world. With the increasing availability of open data sets and new mapping technologies, it's now much easier -- and cheaper -- to build maps like this.

We built this map using open data released by the United States Geological Survey and TileMill, the free, open-source map design studio we've written about before, which lets you create custom maps using your own data.

Below is a walk-through of how we built this map and details on how you can build your own interactive map using open data and TileMill.


Finding the Data

The U.S. Geological Survey publishes data feeds of recent earthquake readings in a variety of formats. The feeds are geocoded, so you can plot the epicenter of each event using longitude and latitude coordinates. For this map, we converted the RSS feed of 5.0-plus magnitude earthquakes over seven days to a shapefile and used TileMill to style the data and add interactivity. You could also download the KML feeds and load them into TileMill directly.
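As a sketch of what parsing such a feed involves, assuming entries carry GeoRSS-style `<georss:point>` elements with the magnitude encoded in the title (the two entries below are invented, trimmed-down stand-ins for real feed entries):

```python
import xml.etree.ElementTree as ET

# An invented, trimmed feed in the Atom + GeoRSS style; real USGS
# feeds carry many more fields per entry.
FEED = """<feed xmlns="http://www.w3.org/2005/Atom"
              xmlns:georss="http://www.georss.org/georss">
  <entry><title>M 6.1, near the east coast of Honshu, Japan</title>
         <georss:point>38.05 144.59</georss:point></entry>
  <entry><title>M 5.4, near the east coast of Honshu, Japan</title>
         <georss:point>37.60 143.20</georss:point></entry>
</feed>"""

NS = {"atom": "http://www.w3.org/2005/Atom",
      "georss": "http://www.georss.org/georss"}

quakes = []
for entry in ET.fromstring(FEED).findall("atom:entry", NS):
    # GeoRSS points are "lat lon", space separated.
    lat, lon = map(float, entry.find("georss:point", NS).text.split())
    # Magnitude is encoded in the title, e.g. "M 6.1, ..."
    mag = float(entry.find("atom:title", NS).text.split(",")[0].lstrip("M "))
    quakes.append({"mag": mag, "lat": lat, "lon": lon})
```

From a list like `quakes` it's a short hop to a shapefile or GeoJSON layer that TileMill can style.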

In addition to the point-based epicenter data, we used shapefile data of the Shakemap from the initial earthquake, which shows the ground movement and intensity of shaking caused by a seismic event. This layer provides greater context to the impact felt around the epicenter points.


Building the Map

We used TileMill to design the map and apply an interactive "hover" layer, which allows you to show information when you mouse over a point on the map -- in this case, the epicenter of an aftershock. Below is a look at the editing interface in TileMill. For more on how interactivity works in TileMill, check out this blog post from Bonnie.

We then rendered the map to MBTiles, an open file format for storing lots of map tiles and interactivity information in one file. MBTiles can be hosted on the web or displayed offline on mobile devices like the iPad. For this map, we used TileStream Hosting to host the map online. It has an embed feature that let us embed the map on an otherwise static HTML page. The embed code is also publicly available, so others can embed your map on their own site. Check out this article on O'Reilly Radar for an example of this in action. You can make your own embed of this map by clicking on "embed" here.
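Concretely, an MBTiles file is a SQLite database: a key/value `metadata` table plus a `tiles` table keyed by zoom level, column, and row. A minimal sketch of that schema (the tile bytes here are a stand-in, not a real rendered tile):

```python
import sqlite3

# Build a toy MBTiles database in memory to show the core schema:
# a key/value `metadata` table plus a `tiles` table keyed by z/x/y.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE metadata (name TEXT, value TEXT);
    CREATE TABLE tiles (zoom_level INTEGER, tile_column INTEGER,
                        tile_row INTEGER, tile_data BLOB);
""")
db.execute("INSERT INTO metadata VALUES ('name', 'japan-quakes')")
# Stand-in bytes where a rendered PNG tile would go.
db.execute("INSERT INTO tiles VALUES (0, 0, 0, ?)", (b"\x89PNG...",))

# Serving a tile is a single query by zoom/column/row.
row = db.execute("SELECT tile_data FROM tiles WHERE zoom_level=0 "
                 "AND tile_column=0 AND tile_row=0").fetchone()
```

Packing millions of tiles into one file like this is what makes the format easy to upload, host, or carry offline on a device.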


Adding Advanced Interactivity

By default, TileMill lets you choose a "hover" or a "click" style for interactivity. When you embed your map on a webpage, you can override this default behavior with some client-side code. For this site, we added some CSS styles and used JavaScript to build a timeline based on the dates in the overlays of each interactive point.

Now when you hover over a point on the embedded map, instead of the usual popup, the corresponding element in the timeline expands. This lets users see the relationship between time, space and magnitude in an intuitive way. All the code to make this work is available in the page -- just "view source" to check it out.

You can download TileMill for free here and find more documentation on how to use it at support.mapbox.com.

April 22 2011

12:49

How to Design Fast, Interactive Maps Without Flash

Until recently, if you wanted to create a fast interactive map for your website, you had two main options: design it in Flash or use Google. With the prevalence of mobile devices, Flash isn't an option for many users, leaving Google and a few competitors (like Bing). But we are developing open source technologies in this space that provide viable alternatives for serving fast interactive maps online -- ones that often give users more control over map design and the data displayed on it.

TileMill, our open source map design studio, now provides interactivity in the latest development version on GitHub. Once you design a map with TileMill, you can enable certain data in the shapefile to be interactive.

Map interactivity in the latest version of TileMill

Any map made in TileMill can be exported to MBTiles, a file format that makes it easy to manage and share map tiles, and all the interaction is stored within the MBTiles file. This allows us to host interactive maps that are completely custom designed -- including the look and feel and the data points -- and that are as fast as Google Maps.

An example of an interactive map using TileMill is the map in NPR's I Heart NPR Facebook App, an app that asks users to choose and map their favorite member station.

NPR Using TileMill

Yesterday, at the Where 2.0 Conference, a leading annual geo conference, Tom MacWright gave a talk about designing fast maps and other emerging open source interactive mapping technologies, specifically comparing them to Google. If you're interested in learning more and weren't at the conference, check out his slides, which are posted on our blog.

February 17 2011

18:30

How public is public data? With Public Engines v. ReportSee, new access standards could emerge

A recently settled federal court case out in Utah may affect the way news organizations and citizens get access to crime data.

Public Engines, a company that publishes crime statistics for law enforcement agencies, sued ReportSee, which provides similar services, for misappropriating crime data Public Engines makes available on CrimeReports.com. In the settlement, ReportSee is barred from using data from Public Engines, as well as from asking for data from agencies that work with Public Engines.

At first glance, the companies seem virtually identical, right down to their similar mapping sites CrimeReports.com (Public Engines) and SpotCrime.com (ReportSee). The notable exception is that Public Engines contracts with police and sheriff departments for its data and provides tools to manage information. ReportSee, on the other hand, relies on publicly available feeds.

In the settlement between the two websites, a new question arises: Just what constitutes publicly available data? Is it raw statistics or refined numbers presented by a third party? Governments regularly farm out their data to companies that prepare and package records, but what stands out in this case is that Public Engines effectively laid claim to the information provided to it by law enforcement. This could be problematic for news organizations, developers, and citizens looking to get their hands on data. While still open and available to the public, the information (and the timing of its release) could potentially be dictated by a private company.

“The value in this kind of crime data is distributing it as quickly as possible so the public can interact with it,” Colin Drane, the founder of SpotCrime, told me.

In its news release on the settlement, Public Engines notes that it works with more than 1,600 law enforcement agencies in the US. Greg Whisenant, CEO of Public Engines, said in the statement that the company is pleased with the outcome of the case, concluding, “The settlement ushers in a new era of transparency and accessibility for the general public. It clearly validates our perspective that law enforcement agencies should retain the right to manage and control the data they decide to share.”

Naturally, Drane sees things differently. “I just don’t think people recognize that the data is being, essentially, privatized,” he said.

That may be a slight exaggeration, evidenced by the fact that SpotCrime is still operating. Instead of signing contracts with law enforcement agencies, SpotCrime requests data that is available for free and runs ads on its map pages. The company also partners with local media to run crime maps on news sites.

Though Drane sought to create a business through data mapping, his methods are largely similar to those of news organizations, relying on open data and free mapping tools. And just like news organizations, Drane finds that the hardest part of the job can be negotiating to get records.

“The technology has been here for years, but the willingness to use it is just starting for many cities,” Drane said.

The open data movement has certainly exploded in recent years, from property and tax records at the municipal level all the way up to Data.gov. As a result, news organizations are not only doing data-backed reporting, but also building online features and news apps. And news organizations are not alone, as developers and entrepreneurs like Drane are mining open datasets to try to create tools and fill information needs within communities.

I asked David Ardia of the Citizen Media Law Project whether this case could hinder development of more data products or have broader ramifications for journalists and citizens. The short answer is no, he said, since no ruling was issued. But Public Engines could be emboldened to take action against competitors, Ardia noted — and, as a result, developers looking to do something similar to what Drane has done may think twice about using public data.

“This is just the tip of the iceberg,” Ardia said. “There are tremendous amounts of money to be made in government information and data.”

In this case, Public Engines saw crime data as a proprietary product — and Drane’s company as infringing on its contracts. It also claimed misappropriation under the hot news doctrine, arguing that it gathers and publishes information in a timely manner as part of its business. (An interesting link Ardia points out: On its FAQ page, CrimeReports.com says it does not make crime data downloadable “to the general public for financial and legal reasons.”)

Ardia said the larger question is twofold: first, whether government agencies will let third parties exert control over public data, and, second, who can access that data. As more local and state departments use outside companies to process records, the tax dollars that go toward managing data effectively pay to limit public access. Drane and his company were barred from using or asking to use public crime data in certain cities: If crime data is the property of a third party, the police department could either direct people to CrimeReports.com or, Ardia worries, claim it is not free to make the information available to others.

“This is a problematic trend as governments adapt to and adopt these technologies that improve their use and analysis of information,” Ardia said.

Obviously all of this runs counter to established practice for public records and data in journalism, and Ardia said that it’s likely the issue won’t be settled until a case similar to Public Engines v. ReportSee makes its way to the courts. (We should have a better view of how the hot news doctrine holds up overall, though, after an appeals court rules on the FlyOnTheWall case.) But a better option could be to adapt current open records laws to reflect changes in how data is stored, processed, and accessed, Ardia said. Businesses and developers should be able to build products on a layer of public data, he said, but not exclusively — or at the expense of greater access for the broader public.

“We don’t have to wait for the courts to resolve this. Part of this can be addressed through changes in open records laws,” Ardia said. “Put the onus on agencies to make this data available when they sign agreements with third parties.”

January 21 2011

15:23

Turning the iPad into an Open, Offline Mapping Platform

We've talked here before about TileMill, an open source tool for creating your own custom map tiles (the individual pieces that make up a full map of a city, country, and so on). But what sorts of things can you do with these map tiles? One area we wanted to explore was using them on Apple's latest touch-based device, the iPad. Providing a touch interface for maps is a serious usability win and the long battery life, huge available storage, and opportunistic network connectivity combine to make a really attractive mobile mapping platform.

The result? The MapBox iPad app. This app allows you to use custom maps on the iPad (and in an open format), use OpenStreetMap (OSM) map tiles, overlay custom data in Google Earth's popular KML format or in GeoRSS, save and share map snapshots, and much more.

To create the app the first thing we had to figure out was an alternative to Apple's standard MapKit toolset, which only uses online Google Maps. This was accomplished with the open source route-me library. Once this was decided, we created a file format called MBTiles to easily exchange potentially millions of tile images so they could be used offline.

We then layered on data visualizations, creating an open source library called Simple KML in order to parse and display the KML and KMZ file formats, something that hasn't really been done much on the iPhone or iPad outside of Google's own app.

MapBox for iPad

To round out the initial release, we added the ability to save the current view -- coordinates, zoom level, and data overlays -- as a document for later, as well as the ability to email a picture of the current map straight from the app.

As a whole, we've been really happy with the iPad as an open mapping platform. We've used some tools, made some new ones available, and combined them all in new ways.

Do you have any ideas for open mapping on the iPad? We'd love to hear your thoughts in the comments or on Twitter, where you can follow our progress at @MapBox.

January 20 2011

17:37

Dotspotting Expands to Track Homicides, Food Vendors, Road Trips

Since my last post, we've been busily working on extending the functionality of Dotspotting, the first project in our larger Citytracking project aimed at helping people tell stories about cities. It's still, as my colleague Aaron puts it, very much in a "super alpha-beta-disco-ball" state -- which mainly just means we don't want anyone to put data in there that they expect to keep -- but it's getting there.

A few things have happened that I want to update you about:

  • Import has been expanded from only accepting .csv files to include .gpx, .json, .kml, .rss and .xls files. Various people around the studio use a variety of GPS tracking software. I use Trails for the iPhone, Julie uses My Tracks for Android, and so on. We've been starting with those formats as a baseline, using the files that different applications export and pulling them into Dotspotting.
  • Export has been expanded to include all of the above file types, and also includes .png files. We're hoping this is going to be particularly useful for journalists who want to include images of geographic content in their articles but don't want to use the screen-capping-a-google-map-and-hoping-the-legal-department-doesn't-catch-on technique. So these kinds of images become easy to export out of the system.
  • Search is coming along. This report from DataSF about 311 activity in District 6, where the studio is, has 392 dots, showing the wide variety of calls for service that the system handles in a week. You can now search these kinds of reports fairly comprehensively, so it's now possible to make maps of only those requests having to do with Catch Basin Maintenance, graffiti, or tree maintenance. These are the kinds of queries that we want to enable journalists and others to make when telling stories about city data, and they're the kind of thing that lots of current city data services don't report, so it's gratifying to see those come together.
  • Search is also working in a limited fashion relative to position, and PDF export is next on the list. More on these next time.
  • We've squashed a lot of bugs related to importing and exporting, and found a bunch more of course. We're keeping track of these on the project GitHub account; if you find one, please let us know.
  • Uploads are starting to trickle in from outside the studio walls: Homicides in Richmond, New Food Vendors in Vancouver, and trips along the coast of California are a few of the things we've seen. We're in conversations with a couple of cities and other groups, more on that next time.
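As a sketch of the kind of format normalization the importer performs, here is how GPX track points might be reduced to simple lat/lon dots. This is a minimal illustration under assumed names, not Dotspotting's actual code:

```python
import xml.etree.ElementTree as ET

GPX_NS = "{http://www.topografix.com/GPX/1/1}"

def gpx_to_dots(gpx_text):
    """Flatten GPX track points into plain lat/lon 'dot' dicts."""
    root = ET.fromstring(gpx_text)
    dots = []
    for trkpt in root.iter(GPX_NS + "trkpt"):
        dots.append({
            "lat": float(trkpt.get("lat")),
            "lon": float(trkpt.get("lon")),
        })
    return dots

# A tiny GPX document of the sort Trails or MyTracks might export.
sample = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1" creator="demo">
  <trk><trkseg>
    <trkpt lat="37.7749" lon="-122.4194"/>
    <trkpt lat="37.7793" lon="-122.4193"/>
  </trkseg></trk>
</gpx>"""

print(gpx_to_dots(sample))
```

Each importer (KML, GeoRSS, .xls) would do the same job for its own format: extract coordinates plus whatever attributes survive, and hand back a uniform list of dots.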

Onwards!

January 11 2011

17:45

How Mapping, SMS Platforms Saved Lives in Haiti Earthquake

This article was co-authored by Mayur Patel

Tomorrow marks the anniversary of the devastating earthquake that shook Haiti last January, killing more than 230,000 people and leaving several million inhabitants of the small island nation homeless. Though natural disasters are common, the humanitarian response this time was different: New media and communications technologies were used in unprecedented ways to aid the recovery effort.

A report released today by Communicating with Disaster Affected Communities, with support from Internews and funding from the Knight Foundation, takes a critical look at the role of communications in the crisis and recommends ways to improve the effectiveness of utilizing media in future disaster relief efforts. (The Knight Foundation is a major funder for MediaShift and its sister site MediaShift Idea Lab.)

In the weeks after the crisis, Haiti quickly became a real world laboratory for several new applications, such as interactive maps and SMS texting platforms. In the aftermath of the quake, these tools were used for the first time on a large scale to create dialogue between citizens and relief workers, to help guide search-and-rescue teams and find people in need of critical supplies. The report, Lessons from Haiti [PDF download] (co-authored by Anne Nelson and Ivan Sigal, with assistance from Dean Zambrano), recounts the stories of media participants, technologists, humanitarian organizations, Haitian journalists and response teams involved in the relief. Many of these players were first brought together to share their experiences at a roundtable convened by the Knight Foundation and Internews last May.

Notable Innovations

"The most notable innovations to emerge from Haiti were: the translation of crowdsourced data to actionable information; the use of SMS message broadcasting in a crisis; and crowdsourcing of open maps for humanitarian application," according to the report. A dizzying array of new media and information technology groups, Haitian diaspora networks and media development partners were involved in these initiatives (see the infographic below). Although these innovations had varying levels of impact in Haiti, they showcased the potential for use in future crises.

[Infographic: the new media, diaspora, and media development groups involved in the Haiti response]

One of the most notable developments was the application of Ushahidi, an online crisis mapping platform that was born only a few years earlier in Kenya. Ushahidi had already been used to map political violence, but it had not yet been used in the context of large-scale natural disasters. When the earthquake struck, an ad hoc coalition quickly took shape, anchored by a group of graduate students at Tufts University in Boston.

The Ushahidi teams, supported by translators from the Haitian diaspora community in the U.S., gathered information from news reports and individuals about the most acute needs on the ground: rescue, food and water, and security, among others. The coordinates were placed on a map and made available to rescue and relief teams.

Soon they were able to include SMS texts in their bank of information. A few days after the quake, Digicel, one of Haiti's leading telecom companies, agreed to offer a free short code (4636) for SMS texts in service of the relief efforts, with the help of InSTEDD, a technology-focused humanitarian organization. The four-digit code enabled cell phone users to send free messages to central information centers about missing persons and emergency needs. SMS messages and direct reports from Haitian citizens began to flow within four days of the quake.
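As an illustration only: the triage that volunteers and diaspora translators performed by hand on 4636 messages can be caricatured as keyword matching. The categories and keywords below are invented; the real pipeline depended on human translation and judgment, not rules like these:

```python
# Hypothetical category -> keyword lists for triaging incoming SMS reports.
CATEGORIES = {
    "rescue": ["trapped", "rescue", "collapsed"],
    "water": ["water", "thirsty"],
    "food": ["food", "hungry"],
    "medical": ["injured", "medicine", "doctor"],
}

def triage(message):
    """Return the list of categories a message appears to belong to."""
    text = message.lower()
    hits = [cat for cat, words in CATEGORIES.items()
            if any(w in text for w in words)]
    return hits or ["uncategorized"]

print(triage("Family trapped under collapsed building, need rescue"))
```

Categorized reports like these are what ended up as coordinates on the Ushahidi map for rescue and relief teams.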

OpenStreetMap, an open community of volunteer mappers, joined the effort to create online maps of Haiti's improvised and unnamed neighborhoods. These maps became the standard reference points: Users included not just information technology platforms such as Ushahidi, but also large providers of humanitarian services, such as the UN Office for the Coordination of Humanitarian Affairs (UNOCHA) and the International Federation of the Red Cross (IFRC).

Not Necessarily a Success Story

However, the CDAC report cautions against calling the Haitian experience a "new media success story," as some of the approaches -- attempted for the first time -- faltered. The crisis threw together volunteer technology communities and professional humanitarian organizations without a common language or operating procedures. A lack of coordination and understanding of how to use and integrate the new tools into existing disaster relief structures further complicated efforts on the ground.

In addition, new media efforts did not diminish the importance of traditional media. As in past crises in the developing world, radio continued to be the most effective tool for serving the information needs of the local population. With Haiti's newspapers and television broadcasters knocked out of production for the first few weeks after the quake, radio provided a heroic lifeline. One Haitian station, SignalFM, was able to broadcast continuously throughout the crisis, and worked closely with both international relief organizations and the digital innovators in support of the population. Popular radio host Cedre Paul reached his audience via Twitter as well as on the air.

"We have always known that one of the best ways to communicate with affected population in crises is through radio broadcasts," said Mark Frohardt, vice president of humanitarian programs for Internews, a media development organization. "We found in Haiti that innovative technologies not only had an impact on information delivery on their own, but also greatly enhanced the reach and effectiveness of radio."

Still Work to be Done

For all the welcome innovation, the report makes it clear that digital humanitarian action has a long way to go. One of the big obstacles to the Haiti effort was the lack of pre-existing connections between the large government and international institutions and the new tech activists. Large institutions tend to mean weighty protocol, some of it based on long and bitter experience. The International Committee of the Red Cross (ICRC), for example, has strict rules of confidentiality, which have allowed it to play a uniquely useful role in conflicted and tense situations, while the open source community's hallmarks are spontaneity and transparency.

Nonetheless, the connections among the various sectors advanced in Haiti, stimulated by a common desire to help, and there are many signs that new synapses are emerging. For example, CDAC has made some progress bridging the gaps between the humanitarian and media communities. The report calls for more of this kind of cross-sector collaboration to improve future recovery efforts. Specifically, it recommends that media, new technology developers and humanitarian agencies (both UN and international NGOs) engage in joint preparation and simulation exercises for future emergency responses.

We should not forget that Haiti's crisis is far from over. Many donors have yet to fulfill their commitments for reconstruction funds, and much of the rubble remains. New digital initiatives are still appearing; one promising new effort from MIT is an online labor exchange for Haitians called Konbit.

Disasters will continue to occur. But their damage can be mitigated by relief efforts that are well-planned and executed in concert with the local population. Digital media technologies offer a unique opportunity to advance these goals with the right on-the-ground coordination. As the report notes: Haiti demonstrated "the culmination of a vision and the beginning of the hard work of implementation."

Anne Nelson is an educator, consultant and author in the field of international media strategy. She created and teaches New Media and Development Communications at Columbia's School of International and Public Affairs (SIPA) and teaches an international teleconference course at Bard College. She is a senior consultant on media, education and philanthropy for Anthony Knerr & Associates. She is on Twitter as @anelsona, was a 2005 Guggenheim Fellow, and is a member of the Council on Foreign Relations.


December 13 2010

18:11

How OpenStreetMap Helps to Curb Haiti's Cholera Epidemic

In order to respond to the current cholera epidemic in Haiti, it's essential that citizens, aid groups and others are aware of the locations of functioning health and sanitation facilities. The challenge is that maps showing this information don't currently exist -- at least not in a comprehensive and up-to-date way.

Guensmork Alcin is attempting to change this. He is working with the International Organization for Migration (IOM) to expand OpenStreetMap, a free and open source map of the world that has one of the most detailed GIS data sets in existence on Haiti. Guensmork, known as Guens, is training local IOM staff and folks from the International Committee for the Red Cross and the World Food Programme responding to the epidemic on how to use hand-held GPS devices to collect data to add to maps on OpenStreetMap. He is one of 30 Haitians recently hired by IOM to work full-time contributing to OpenStreetMap to improve map details and grow the community around it.

Humanitarian OpenStreetMap Team

I first met Guens last March when I traveled to Haiti with the Humanitarian OpenStreetMap Team (HOT). In the days immediately following the January earthquake, hundreds of volunteers from all over the world used recently liberated satellite imagery to trace roads, building footprints, and other map features of Haiti into the OpenStreetMap database. The data they produced quickly became critical to the response and was used on the GPS devices of first responders and as a resource in planning the response by the UN cluster system. As part of the Humanitarian OpenStreetMap Team, I was in Port au Prince to help UN and NGO staff understand how to use and participate in OpenStreetMap. We also wanted to find ways to engage with civil society members and NGOs with a long-term stake in Haiti -- and not just with the humanitarian workers that would cycle out after the initial stages of the response.

Photo: Guensmork Alcin leading an OpenStreetMap training, courtesy of Todd Huffman

Guens became involved as a representative of the Cite Soleil Community Forum. He had seen aid workers survey Cite Soleil and believed that, with a little help in the form of a loaned GPS or two, the people of Haiti's most famous slum could collect this information -- vital to planning the distribution of aid -- themselves. Inspired by Mikel Maron's work mapping Kibera, Nairobi, we partnered with the Community Forum to throw a mapping party in Cite Soleil to bring folks together to learn about OpenStreetMap and find out how they could get involved and contribute to it by mapping their neighborhoods.

Importance of Accurate Information

Eight months later I returned to Haiti on the fifth trip undertaken by the Humanitarian OpenStreetMap Team. Since that first mapping party, we've worked with Guens and other Cite Soleil residents to map first their community and then other parts of the country. The team they've built is collecting data critical to the cholera response, building the local OpenStreetMap community, and ensuring that the best maps of the country are created by Haitians and are free to use by anyone who needs them.

As the cholera epidemic worsens, the work that Guens and his team are doing is only more important. Accurate information about the location and quality of water and sanitation infrastructure and health facilities is critical to efforts to combat the disease. With the continued support of IOM, this data will be public, regularly updated, and available for use by all aspects of the response.

November 12 2010

15:00

Hacking data all night long: A NYC iteration of the hackathon model

In the main room of the Eyebeam Art and Technology Center’s massive 15,000-square-foot office and lab space in the Chelsea neighborhood of Manhattan, more than sixty developers, designers, and journalists pore over their computer screens. A jumble of empty coffee cups and marked-up scraps of butcher paper litters the tabletops while networks of power cords fan out underneath.

The event is The Great Urban Hack, a two-day overnight hackathon, organized by the meetup group Hacks/Hackers, that took place over an intense 30-hour stretch this past weekend. Starting early Saturday morning journalists and developers came together to “design, report on, code and create projects to help New Yorkers get the information they need while strengthening a sense of community.”

The eleven teams that participated in the event worked on a varied set of projects that ranged in scope from collaborative neighborhood mapping to live action urban gaming.

Rearranging and visualizing data

The team that worked on the project “Who’s My Landlord?,” based on Elizabeth Dwoskin’s article of the same name in the Village Voice last Wednesday, took on the task of helping residents determine who owns a given piece of property. Dwoskin’s article points out that for many of the most derelict buildings in the city this link is obfuscated, a huge barrier for city agencies in their task of regulating to protect tenants. The team built a tool that draws from three databases: two from the city to pull the names of building owners, and one state database to look up the address of the owner when there is an intermediate company.
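The lookup chain the team describes can be sketched in a few lines. Every record, name, and address here is invented for illustration; the real tool queries live city and state databases:

```python
# Hypothetical stand-ins for the two city databases and the state registry.
CITY_DB = {"123 Main St": "Sunshine Holdings LLC"}
STATE_DB = {"Sunshine Holdings LLC": "J. Doe, 45 Elm Ave, Albany NY"}

def find_landlord(address):
    """Resolve an address to an owner, following intermediate companies."""
    owner = CITY_DB.get(address)
    if owner is None:
        return None
    # If the owner is a holding company, chase it through the state registry.
    return STATE_DB.get(owner, owner)

print(find_landlord("123 Main St"))
```

The hard part in practice is not the join itself but the dirty, inconsistently keyed records on each side of it.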

Several groups worked on visualizations of some form of city data. The “Drawing Conclusions” team created a “Roach Map” using the raw data set of restaurant inspection results from the NYC Data Mine. The group wrote a script that scans the data line-by-line and counts each violation by zip code. They then analyze the data, taking into account variation in the number of inspections across zip codes, and plot it on a map of the city which auto-generates every week.
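Stripped to its essentials, the line-by-line counting the “Drawing Conclusions” team describes looks like this. The column names and sample rows are invented; the real NYC Data Mine file has its own schema:

```python
import csv
from collections import Counter
from io import StringIO

# Invented sample standing in for the raw inspection-results file.
SAMPLE = """zip,violation
10001,Evidence of roaches
10001,Unclean food-contact surface
10011,Evidence of roaches
"""

def violations_by_zip(csv_text):
    """Scan the data row by row and count violations per zip code."""
    counts = Counter()
    for row in csv.DictReader(StringIO(csv_text)):
        counts[row["zip"]] += 1
    return counts

print(violations_by_zip(SAMPLE))
```

A real version would, as the team did, also normalize by the number of inspections per zip code before plotting, so busy districts don't look dirty simply because they get inspected more.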

How hackathons work is simple: They define goals and create artificial constraints (like time) to catalyze the process of innovation. The closest journalistic equivalent might be the collaborative rush of a good newsroom working a big breaking story. But is this really the best environment to incubate projects of a journalistic nature? What are the different circumstances that foster the healthiest practices of innovation? And what is the best way to set expectations for an event like this?

The hackathon model

Hackathons like this are a growing trend. A lot can be said for bringing these groups together and into a space outside of their normal work environment. What’s maybe most fascinating to me is the opportunity for cultural interplay as these two groups find themselves more and more immersed in each other’s creative work. As John Keefe, one of the hosts of the event and a senior producer at WNYC, says: “It’s not really journalistic culture to come together and build stuff like this.”

Chrys Wu, a co-organizer of Hacks/Hackers and both a journalist and developer, talked about the two groups’ different philosophies of sharing information: “Your traditional reporter has lots and lots of notes, especially if they’re a beat reporter. There’s also their rolodex or contacts database, which is extremely valuable and you wouldn’t necessarily want to share that. But there are pieces of things that you do that you can then reuse or mine on your own…at the same time technologists are putting up libraries of stuff, they say: ‘I’m not going to give you the secret sauce but I’m definitely going to give you the pieces of the sandwich.’”

Lots of questions remain: what is the best way to define the focus or scope for an event like this? Should they be organized around particular issues and crises? And what’s the best starting point for a journalistic project? Is it with a problem, a data set, a question, or as in the case of the landlord project: the research of a journalist? For all of the excitement around hackathons, this seems like just the beginning.

Photo by Jennifer 8. Lee used under a Creative Commons license.

October 29 2010

14:55

Mapnik: The Coolest Mapping Software You've Never Heard Of

On the MapBox website we describe TileMill — the project we’re working on with our 2010 Knight News Challenge grant — as “a toolkit for rendering map tiles”. To be more specific, it’s essentially a “glue layer.” TileMill is built on top of a cocktail of other open source mapping software projects, and its biggest value is streamlining other more complex tools into a clean and easier workflow. For users to take advantage of TileMill, it can be useful to understand some of the underlying parts. Perhaps the most important part of that cocktail is a lesser known open source project called Mapnik. In this post I’ll talk a little about what Mapnik is and the important role it plays in helping users style their maps, as well as how it relates to TileMill.

The goal of the TileMill project is to make it easy for anyone with some basic web design familiarity to design their own custom maps. In past posts on this site we’ve introduced readers to the general reasons why we think custom online maps are valuable and have shared a couple examples for when custom maps have been particularly helpful on websites. Mapnik makes all this possible by providing the core technology to apply styles to GIS data and then render maps based on those styles.

Here’s the basic idea with styling maps: raw GIS data in the form of shapefiles contains information about various “features” — for instance, place names, points (e.g. center of a city), lines (e.g. roads), or polygons (e.g. state or country borders). If you have the data in its raw form, you’re only part of the way toward turning it into a map. Next you need to decide how to style each element.

Mapnik in action, styling maps of Kabul, Afghanistan

The style of each feature (or lack thereof) is why maps of the same location might look different from others. At a simple level, you might want your primary roads to be red versus orange. Compare MapQuest to OpenStreetMap for instance, at the exact same zoom level — note the difference in the styles for the same features.

Screenshot of Boulder, CO on MapQuest

Screenshot of Boulder, CO on OpenStreetMap

Setting aside conversations about which features you decide to show on a map and assuming your data is accurate (both are huge factors), how you choose to style certain features might be the next most important part of map design. Getting styling right is essential for your users and is central to map design. If you over style or under style features, it has a direct impact on the readability and effectiveness of your maps.

This is where Mapnik comes in — it provides the framework for styling map data and then rendering new maps based on those styles. Mapnik is an open source project that is heavily used by the team at Cloudmade, who are involved in styling OpenStreetMap, and it’s been used by MapQuest, who have even released their Mapnik map style files for the public. Our team uses it heavily too, and AJ Ashton and Tom MacWright from the MapBox team were recently in London at Cloudmade’s offices with a group of core contributors, including Mapnik’s creator Artem Pavlenko, for the first ever Mapnik code sprint.
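To make the styling idea concrete, here is a minimal sketch of a Mapnik XML stylesheet that draws primary roads wide and red and everything else thin and grey. The layer name, the `highway` field, and the shapefile path are all hypothetical; production stylesheets like the ones MapQuest released run to thousands of lines:

```xml
<Map background-color="#f4f1e8">
  <Style name="roads">
    <!-- Primary roads: wider, red -->
    <Rule>
      <Filter>[highway] = 'primary'</Filter>
      <LineSymbolizer stroke="#c33" stroke-width="2.5"/>
    </Rule>
    <!-- Everything else: thin, grey -->
    <Rule>
      <ElseFilter/>
      <LineSymbolizer stroke="#999" stroke-width="1"/>
    </Rule>
  </Style>
  <Layer name="roads" srs="+proj=longlat +datum=WGS84">
    <StyleName>roads</StyleName>
    <Datasource>
      <Parameter name="type">shape</Parameter>
      <Parameter name="file">roads.shp</Parameter>
    </Datasource>
  </Layer>
</Map>
```

Rules pair a filter over feature attributes with one or more symbolizers; Mapnik walks the data, applies the first matching rule, and renders the result to tiles.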

But where professional mappers are able to leverage Mapnik in complex ways, it has its downsides for the average would-be map designer. For starters, it’s not easy for noobs to even install it, before anyone worries about using it. This is part of why we’re working on TileMill — we want to make it easier for people to take advantage of these powerful tools. TileMill puts a wrapper around Mapnik that makes it simple to set up and leverage the powerful map styling capacity that it provides.

If you’re interested in more details about Mapnik, check out the Mapnik website or a recent Q&A with Mapnik developer Dane Springmeyer about Mapnik performance on Development Seed’s blog.

October 20 2010

16:16

OpenStreetMap's Audacious Goal: Free, Open Map of the World

In our previous posts on TileMill, we’ve focused on how open data can be used to create custom maps and tell unique stories. One question we run into a lot is, “Where does open data come from?”

One exciting source is a global mapping project called OpenStreetMap (OSM). Founded in 2004 with the goal of creating a free and open map of the world, OSM now boasts over 300,000 contributors and, for many countries, has data comparable to or better than the popular proprietary or closed datasets. The premise is simple and powerful: Anyone can use the data, and anyone can help improve it.

OSM-based map of Port au Prince made with TileMill

With this huge amount of data, activity, and adoption, we’re excited about how TileMill is going to give more people ways to leverage OSM data to make their own maps. Users will be able to mash up OSM data on their own using TileMill and turn it into their very own custom map.

Humanitarian OpenStreetMap Team

To get a sense of the practicality of OSM, just look at the role it played in the response to the January 12 earthquake in Haiti. Reliable maps are critical to disaster response efforts and there simply wasn’t much data available for the affected areas. Within hours of the quake, the OSM community mobilized and hundreds of volunteers from all over the world began tracing available satellite imagery, importing available datasets, and coordinating with relief workers on the ground to ensure that new data was being created and distributed in ways that would best support their work.

Using OpenStreetMap as a platform and leveraging the existing, engaged community paid off — within days, volunteers had created the best available maps of Port au Prince and nearby cities. OSM data quickly appeared on the GPS devices of search and rescue teams, and in the planning tools of the international response community.

Members of the Humanitarian OpenStreetMap Team (HOT), of which I’m a member, have continued to support the use of OSM in Haiti through trainings with local NGOs, the Haitian government, and international responders. In November, I’ll be part of the fifth deployment of HOT team members to Haiti to support the International Organization for Migration (IOM) in their work to map the camps for people displaced by the earthquake, using OSM as a platform.

Through this effort by the OSM community, anyone looking to make a map of Haiti has a great database of roads, hospitals, and even collapsed buildings that they can use in their work. We see this kind of data sharing as important capacity-building to help people make useful custom maps. With TileMill, we’re working to create a practical toolset for working with this data.

Beyond Haiti

Moving beyond Haiti and thinking about maps of other places, what’s exciting about OpenStreetMap is the hundreds of community groups around the world getting together and using OSM to map their own cities and neighborhoods. If map data doesn’t exist yet, there’s a chance it could be created through the efforts of the OSM community. For instance, the image below is a picture of work the local OSM community did in Washington, DC, to make a very detailed map of the National Zoo.

Mapping the National Zoo in Washington, DC by ajturner

If you’re looking for open map data for your next project, a great place to start would be to reach out to the local OSM community in your area — there’s a good chance they can help you figure out how to get it.

October 18 2010

13:10

Mapping the budget cuts


Richard Pope and Jordan Hatch have been building a very useful site tracking recent budget cuts, building up to this week’s spending review.

Where Are The Cuts? uses the code behind the open source Ushahidi platform (covered previously on OJB by Claire Wardle) to present a map of the UK representing where cuts are being felt. Users can submit their own reports of cuts, or add details to others via a comments box.

It’s early days in the project – currently many of the cuts are to national organisations with local-level impacts yet to be dug out.

Closely involved is the public expenditure-tracking site Where Does My Money Go? which has compiled a lot of relevant data.

Meanwhile, in Birmingham a couple of my MA Online Journalism students have set up a hyperlocal blog for the 50,000 public sector workers in the region, primarily to report those budget cuts and how they are affecting people. Andy Watt, who – along with Hedy Korbee – is behind the site, has blogged about the preparation for the site’s launch here. It’s a good example of how journalists can react to a major issue with a niche blog. Andy and Hedy will be working with the local newspapers to combine expertise.

October 16 2010

13:39

ScraperWiki: Hacks and Hackers day, Manchester.

If you’re not familiar with ScraperWiki, it’s “all the tools you need for Screen Scraping, Data Mining & visualisation”.

These guys are working really hard at convincing journos that data is their friend, by staging a steady stream of events that bring journos and programmers together to see what happens.

So I landed at NWVM’s offices, amid what seemed like a mountain of laptops, fried food, coke and biscuits, to be one of the judges of their latest hacks and hackers day in Manchester (#hhhmcr). I was expecting some interesting stuff. I wasn’t disappointed.

The winners

We had to pick three prizes from the six or so projects started that day, and here’s what we (Tom Dobson, Julian Tait and me) ended up with.

The three winners, in reverse order:

Quarternote: A website that would ‘scrape’ Myspace for band information. The idea was that you could put a location and style of music into the system and it would compile a line-up of bands.

A great idea (although more hacker than hack), and if I were a dragon I would consider investing. These guys also won the ScraperWiki ‘cup’ award for actually being brave enough to have a go at scraping data from Myspace. Apparently Myspace content has less structure than custard! The collective gasps from the geeks in the room when they said that was what they wanted to do underlined that.

Second was Preston’s summer of spend. Local councils are supposed to make details of any invoice over 500 pounds available, and many have. But many don’t make the data very usable. Preston City Council is no exception. PDFs!

With a little help from ScraperWiki the data was scraped, tidied, put in a spreadsheet and then organised. It threw up some fun stuff – 1,000 pounds to The Bikini Beach Band! – and some really interesting areas for exploration, like a single payment of over 80,000 to one person (why?). I’m sure we’ll see more from this as the data gets a good running through. A really good example of how a journo and a hacker can work together.
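Once scraped out of the PDFs, the spreadsheet step boils down to filtering and aggregating. A toy version, with invented suppliers and amounts, might look like:

```python
import csv
from io import StringIO

# Invented rows standing in for the scraped Preston invoice data.
SAMPLE = """supplier,amount
The Bikini Beach Band,1000
Acme Paving Ltd,80500
Acme Paving Ltd,2300
"""

def totals_over(csv_text, threshold=500):
    """Sum payments per supplier, keeping only invoices above the threshold."""
    totals = {}
    for row in csv.DictReader(StringIO(csv_text)):
        amount = float(row["amount"])
        if amount > threshold:
            totals[row["supplier"]] = totals.get(row["supplier"], 0) + amount
    return totals

print(totals_over(SAMPLE))
```

Sorting those totals is exactly how the odd 80,000-pound payment jumps out for a journalist to chase.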

The winner was one of a number of projects that took the tweets from the GMP 24hr tweet experiment; one group retitled it ‘Genetically modified police’ tweeting :). Enrico Zini and Yuwei Lin built a searchable GMP24 tweet database (with a great write-up of the process) which allowed searching by location, keyword, all kinds of things. It was a great use of the data, and the working prototype was impressive given the time they had.
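A searchable tweet database of this kind can be prototyped in a few lines with SQLite. The schema and sample tweets below are invented, not the actual GMP24 data or the winners' code:

```python
import sqlite3

# In-memory store with a minimal, hypothetical schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tweets (time TEXT, location TEXT, body TEXT)")
conn.executemany(
    "INSERT INTO tweets VALUES (?, ?, ?)",
    [
        ("10:02", "Salford", "Report of stolen bicycle"),
        ("10:15", "Didsbury", "Noise complaint at party"),
        ("10:40", "Salford", "Shoplifting reported at store"),
    ],
)

def search(keyword=None, location=None):
    """Filter tweets by optional keyword and/or location."""
    query = "SELECT time, location, body FROM tweets WHERE 1=1"
    params = []
    if keyword:
        query += " AND body LIKE ?"
        params.append(f"%{keyword}%")
    if location:
        query += " AND location = ?"
        params.append(location)
    return conn.execute(query, params).fetchall()

print(search(location="Salford"))
```

Swap the in-memory database for a file and put a small web front end on `search()` and you have roughly the shape of the winning prototype.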

Credit should go to Michael Brunton-Spall of the Guardian for turning the tweets into a usable dataset, which saved a lot of work for those groups using them as the raw data for their projects.

Other projects included mapping deprivation in Manchester and a legal website that, if it comes off, will really be one to watch. All brilliant stuff.

Hacks and hackers we need you

Given the increasing amount of raw data that organisations are pumping out, journalists will find themselves vital in making sure those organisations stay accountable. But as I said in an earlier post, good journalists don’t need to know how to do everything; they just need to know who to ask.

The day proved to me and, I think, to lots of people there that asking a hacker to help sort out data is really worth it.

I’m sure there will be more blogs etc about the day appearing over the next few days.

Thanks to everyone concerned for asking me along.

October 07 2010

14:00

Los Angeles Times collaborates across the newsroom and with readers to map neighborhood crime

There’s something about the immediacy of the web that makes interactive features seem effortless: One click and the information is there. But of course the feel of the end product is not the same as the process required to get it there. Just ask the Los Angeles Times.

Last week the Times unveiled a new stage in its ongoing mapping project, Mapping L.A. The latest piece lets users check out crime data by neighborhood, including individual crimes and crime trends. Ultimately, the goal is to give locals access to encyclopedia-style information about their neighborhoods, including demographic, crime, and school information. And for reporters, it’s a helpful tool to add context to a story or spot trends. Getting the project where it is now has been a two-year process, drawing on talent across the newsroom and tapping the expertise of the crowd. I spoke with Ben Welsh, the LAT developer working on the project, about what it’s taken to piece it together. Hint: collaboration.

“I was lucky to find some natural allies who had a vision for what we could find out,” Welsh told me. “In some sense it’s the older generation of geek reporters. There’s this whole kind of tradition of that. We talk the same language. They collect all this data — and I want data so we can do stuff online. Even though we don’t have the same bosses, we have this kind of ad hoc alliance.”

Before Welsh could start plotting information, like crime or demographics data, the Times had to back up to a much simpler question: What are the neighborhood boundaries in Los Angeles city and county?

“Because there are no official answers and there are just sort of consensus and history and these things together, we knew from the get-go it was going to be controversial,” Welsh said. “We designed it from the get-go to let people tell us we suck.”

And people did. About 1,500 people weighed in on the first round of the Times’ mapping project. A tool allowed users to create their own boundary maps for neighborhoods. Between the first round and second round, the Times made 100 boundary changes. (Compare the original map to the current one.) “I continue to receive emails that we’re wrong,” more than a year later, Welsh said.

An offshoot project of the neighborhood project was a more targeted question that every Angeleno can answer: “What is the ‘West Side’?” Welsh said the hundreds of responses were impassioned and creative. The West Side project was recently named a finalist for the Online News Association’s annual awards in the community collaboration category.

Welsh has now layered census, school, and crime data into the project. Working with those varied government data sets brings unique problems. “We put all kinds of hours in to clean the data,” Welsh said. “I think a lot of times journalists don’t talk about that part.” At one point, the Times discovered widespread errors in the Los Angeles Police Department data, for example. The department got an early look at the project and supports the Times’ efforts, and has actually abandoned its own mapping efforts, deciding to use the Times’ instead.

Welsh doesn’t talk about the project in terms of it ever being “finished.” “With everything you add, you hope to make it this living, breathing thing,” he said. In the long-run, he hopes the Times will figure out a way to offer a more sophisticated analysis of the data. “That’s a challenging thing,” he said. In the more immediate future, he hopes to expand the geographic footprint of the project.
