Tumblelog by Soup.io

April 19 2012

13:31

Public Lab's Community-Created Maps Land on Google Earth

We've just announced that community-generated open-source maps from the Public Laboratory for Open Technology and Science (PLOTS) -- captured from kites and balloons -- have been added to Google Earth. The 45-plus maps are the first aerial maps produced by citizens to be featured on the site, and are highlighted on the Google Lat Long Blog.

The Public Laboratory is an expansion of the Grassroots Mapping community. During an initial project mapping the BP oil spill, local residents used helium-filled balloons and digital cameras to generate high-resolution DIY "satellite" maps documenting the extent of the Deepwater Horizon oil spill in the Gulf of Mexico -- at a time when there was little public information available. Expanding the toolkit beyond aerial mapping, Public Laboratory has been growing into a diverse community, both online and offline, experimenting with new ways to produce information about our surroundings. The lab's DIY kits cost less than $100 to assemble.

"We're very excited to be able to include some of the balloon and kite imagery from the Public Laboratory in Google Earth. It provides a unique, high-resolution view of interesting places, and highlights the citizen science work of the Public Laboratory community," said Christiaan Adams of Google Earth Outreach.

"The Public Laboratory is demonstrating that low-cost tools, in the hands of everyday people, can help generate information citizens need about their communities," added John Bracken, Knight Foundation program director for journalism and media innovation.

a mission of civic science

Especially exciting is a map of the Gowanus Canal Superfund site in Brooklyn, N.Y., created during the winter of 2011 and now added to the primary layer of Google Earth/Google Maps. The New York chapter of Public Laboratory has begun an ongoing monitoring campaign in partnership with the Gowanus Canal Conservancy, a local environmental advocacy group. The canal was designated a Superfund cleanup site by the Environmental Protection Agency in 2010 due to decades of coal tar accumulation in its sediments, and roughly 300 million gallons of untreated sewage are released into it each year. To monitor it, local activists have adapted and improved many of the techniques developed for tracking the effects of oil contamination in the Gulf of Mexico. That a group of local activists could create a high-resolution map of an area they care about -- and that such imagery could replace commercial and government data as a recognized representation of that place -- is a powerful example of the civic science mission of Public Laboratory.


Democratizing DIY

Public Lab is a community which develops and applies open-source tools to environmental exploration and investigation. By democratizing inexpensive and accessible "Do-It-Yourself" techniques, Public Laboratory creates a collaborative network of practitioners who actively re-imagine the human relationship with the environment.

The core PLOTS program is focused on "civic science" in which we research open-source hardware and software tools and methods to generate knowledge and share data about community environmental health. Our goal is to increase the ability of underserved communities to identify, redress, remediate, and create awareness and accountability around environmental concerns. PLOTS achieves this by providing online and offline training, education and support, and by focusing on locally relevant outcomes that emphasize human capacity and understanding.

Please watch for the follow-up post by Public Lab's Stewart Long in the next week.

April 12 2012

14:00

SocMap: Why a Small Map App Can Be Better Than a Big Geo-Social Platform

We experimented with various concepts for SocMap.com for a whole year in an effort to create a map-based social network for connecting and informing people in local neighborhoods.

The conclusion: Even though we can reach commendable levels of new user registration, our users don't create content and so the platform doesn't grow. Experimenting with usability didn't solve this, so we dug deeper.

We came up with the idea of decentralizing SocMap -- creating small, useful map applications instead of a big geo-social platform. Building applications is cheaper and easier than managing a large website, so we find them much better suited for experimenting with, and finding the right concept for, SocMap.com.

In February, we launched our very first application, HotBills, created in partnership with the Baltic Centre for Investigative Journalism (Re:Baltica). The idea behind the app is to determine how much people pay for heating in various parts of Latvia. The data can then feed journalists' research into heating prices, their transparency and validity, and it gives people an incentive to talk to their landlords about the prices, ask for explanations, and get adequate answers. We asked users to scan their bills and submit them.

Developing this application took just a couple of weeks -- so we saw it as a minor experiment that wouldn't derail development of SocMap.com even if it failed.

The outcome

The idea was well-received from the start -- we secured partnerships with the largest media outlets in Latvia, including LR1, the national radio broadcaster; TVNET, the second-largest news site; TV3, the largest TV channel; DIENA, the largest newspaper; and DRAUGIEM.LV, the top local social network.

Within a month, the application had been used by almost 2 percent of the population, or 37,800 people, almost 2,400 of whom uploaded real bills. Analyzing these bills revealed that the cost of heating per square meter can differ severalfold; that even neighboring houses can have vastly different costs; that people do not know how their bills are calculated; and that there is widespread confusion about how the calculations are carried out and what some entries in the bills mean, since there are no national guidelines or methodologies for this.
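Aggregations like these are simple to reproduce. The sketch below groups submitted bills by neighborhood and computes the average heating cost per square meter -- the neighborhoods, areas, and prices are invented for illustration, not HotBills' actual data or code:

```python
from collections import defaultdict

# Hypothetical submitted bills: (neighborhood, heated area in m2, monthly cost)
bills = [
    ("Kengarags", 54.0, 48.60),
    ("Kengarags", 62.0, 55.80),
    ("Purvciems", 54.0, 81.00),
    ("Purvciems", 70.0, 98.00),
]

# Sum cost and area per neighborhood, then divide to get cost per m2
totals = defaultdict(lambda: [0.0, 0.0])  # neighborhood -> [cost sum, area sum]
for hood, area, cost in bills:
    totals[hood][0] += cost
    totals[hood][1] += area

per_m2 = {hood: cost / area for hood, (cost, area) in totals.items()}
for hood, rate in sorted(per_m2.items()):
    print(f"{hood}: {rate:.2f} per m2")
```

Plotted on a map, differences like these become visible at a glance, which is exactly what made the findings easy for non-experts to grasp.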

Thanks to data being visible on a map, it was easy for people to understand. Following the launch of the app, the minister of economy promised to look into these and other issues that were raised by journalists at a conference.


During the first two weeks, we managed to get four out of every 100 users to upload a bill. This was unexpectedly high, especially considering the effort required: on the web, even a couple of seconds of a user's attention is hard-won, yet here users had to find the bill, scan it, and send it over, which can take several minutes. Good results notwithstanding, we decided to push them even higher -- we improved the landing page and usability and reached a conversion rate of 6.4 percent!

You're welcome to check out the user experience before:

[Screenshot: the original HotBills landing page]

... and after:

[Screenshot: the redesigned HotBills landing page]

Key facts about HotBills (Jan. 9 - Feb. 15)

  • 6.4% of users uploaded their bill
  • 37,800 unique visitors - 1.9% of the population of Latvia
  • 2,400 submitted bills (20 of which arrived by snail mail)
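As a quick sanity check, these headline figures hang together arithmetically. The population figure below is an assumed round 2 million for 2012-era Latvia, not from the original post, and the rounded inputs land near, not exactly on, the reported 6.4 percent:

```python
# Headline numbers from the campaign (Jan. 9 - Feb. 15)
visitors = 37_800        # unique visitors
bills = 2_400            # bills submitted (including 20 by mail)
population = 2_000_000   # assumed approximate population of Latvia

conversion = bills / visitors * 100   # share of visitors who uploaded a bill
reach = visitors / population * 100   # share of the population reached

print(f"conversion: {conversion:.1f}%")  # ~6.3% with these rounded inputs
print(f"reach: {reach:.1f}%")            # ~1.9%
```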

Receiving bills from all across Latvia convinced us that an application like this is an indispensable tool for crowdsourcing and displaying location-based data. This prompted us to develop a tool that lets journalists without technical skills set up similar studies within minutes. We built it together with Re:Baltica, and TVNET, one of the biggest Latvian news portals, has agreed to become our pilot client!

It seems that SocMap can succeed when we focus on creating task-tailored applications -- and we expect to introduce new concepts in the coming months. After years of searching, SocMap.com may finally have found its right path. This summer will show us for sure.

March 28 2012

14:00

Colorful City Tracking Maps Launch Under Creative Commons

Maps.stamen.com, the second installment of the City Tracking project funded by the Knight News Challenge, is live.

These unique cartographic styles and tiles, based on data from OpenStreetMap, are available for the entire world, free to download and use under a Creative Commons Attribution 3.0 license.

takes deep breath

There are three styles available: toner, terrain, and watercolor:

  • Toner is about stripping online cartography down to its absolute essentials. It uses just black and white, describing a baseline that other kinds of data can be layered on. Stripping out color and imagery makes it easier to focus on the interactive nature of online cartography: When do different labels show up for different cities? What should the thickness of freeways be at different zoom levels? And so forth. This is the project that Nathaniel Vaughn Kelso is hacking on at all hours, and it's great to see Natural Earth data getting more tightly integrated into it over time.
  • Terrain occupies a middle ground: "shaded hills, nice big text, and green where it belongs." In keeping with City Tracking's mandate to make it easier for people to tell stories about cities, this is an open-source alternative to Google's terrain maps, and it uses open-source software like Skeletron to improve on the baseline cartographic experience. Mike Migurski has been heading up this design, with help from Gem Spear and Nelson Minar.
  • Watercolor pushes through to the other side of normal, bending the rules of traditional legibility in order to explore some new terrain. It incorporates hand-painted textures and algorithmic rule sets into a design that looks like it's been done by 10,000 slaves in the basement, but is rendered on the fly. Geraldine Sarmiento and Zach Watson did the lion's share of the design and development on this one. This design is a mixed bag for me: I'm delighted to see it out in the world, but it's the thing that's pretty much kept me from looking at anything else for the last month and a half.

The code that runs Toner and Terrain is available for download and use at the City Tracking GitHub repository; we're going to hold Watercolor back a little while until we can iron out some of the kinks. We talked about waiting to launch until Watercolor was all buttoned up, but with all the attention OpenStreetMap has been getting, we decided to just bite the bullet and go for it.
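The tiles follow the standard OpenStreetMap "slippy map" addressing scheme, so any web-mapping library can consume them. Here is a minimal sketch of the lat/lon-to-tile arithmetic, using the tile.stamen.com URL pattern as an assumption (check the site for current hosting and usage terms):

```python
import math

def deg2tile(lat, lon, zoom):
    """WGS84 lat/lon -> OSM slippy-map tile indices (x, y) at a zoom level."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    # Web Mercator y: asinh(tan(lat)) compresses toward the poles
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

# Tile covering central London at zoom 12, rendered in the Toner style
x, y = deg2tile(51.5074, -0.1278, 12)
url = f"http://tile.stamen.com/toner/12/{x}/{y}.png"
print(url)
```

Swapping `toner` for `terrain` or `watercolor` in the path selects the other styles.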

We'll follow up this week with some posts on how everything works and how the sausage is made, and I've got a lot more to say about what I think this implies for what can be done with online maps and data visualization.

In the meantime, have you seen how awesome Los Angeles, Washington, D.C., the Forbidden City, Massachusetts Bay, Key West, London, New Orleans, New York, Versailles, and every other city in the cotton-pickin' world look when you point this thing at it? Holy heck.

Los Angeles

Washington, D.C.

The Forbidden City

Massachusetts Bay

Key West

London

New Orleans

New York

San Francisco

Tokyo

Versailles

January 04 2012

15:20

Public Lab Produces Wetlands Maps From Balloon and Kite Flights

The Public Laboratory for Open Technology and Science (PLOTS) is an organization and membership community that develops and applies open-source tools to environmental exploration and investigation. Public Laboratory's mapping tools, openly available and easy to use, put processes such as georectification in the hands of people who may never have created a map.

Using aerial mapping techniques, residents and volunteers of the Gulf Coast region began field mapping trips in 2010 to document the impact of the BP oil spill. Between May 2010 and April 2011, tens of thousands of images were collected and 50 regional maps created. Between May and October 2011, Public Laboratory partnered with Dr. Alex Kolker of the Louisiana Universities Marine Consortium to begin bi-monthly monitoring of selected oil-impacted and non-impacted sites in the Barataria Bay region. The intent of this phase of wetlands mapping was to monitor change over time with high-resolution aerial and ground imagery.

[Aerial image: Wilkinson Bay in Louisiana.]

MAKING MAPS WITH GEORECTIFICATION
The fieldwork that goes into collecting images is only the first step in creating maps. On the back end, the next step is georectification -- the process by which balloon and kite aerial images are "published" into geographic data. Simply put, it is the alignment of an aerial image with a map or other spatial data of the same area. This is the stage where images become maps and are associated with standardized geographic formats, so that other users and programs can exchange and experience them in the same context.

Public Laboratory map production has used some specific techniques in creating wetlands maps. The maps are made through a georectification process by which adjacent images from the flights are merged in overlay as they are aligned to existing mapping information. Different "base" data can be used in these types of projects. In this case, we're georectifying the imagery through examination of the new images with existing imagery.

[Image: A new unaligned image (right) about to be georectified against the other imagery; the base layer is visible in the background.]

Distortion correction is applied to the new images, and they're moved around so that the same features of the base map are in perfect overlay with the new image on top. In the bayou setting where lots of change is occurring on the outer coastlines, features such as vegetation composition and interior waterways are used to match each overlay. Although waterways in marshland may shift quite rapidly, the center of a 3-way intersection is quite stable, as a rule.

Matching the imagery based on interior features has proven effective in this simple map-making technique. While the outer coastline in our new images has changed since the base data was captured, fitting the imagery to all of the interior ground-control features lets us locate the new coastline with some measure of confidence. Put differently: once most of the image is aligned with the historic data, the newly changed areas fall into place reliably.
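The alignment step can be thought of as fitting an affine transform from ground control points: locations identified both in the new image (pixel coordinates) and in the base imagery (geographic coordinates). This is a simplified illustration of the general technique, not MapKnitter's actual code, and the control points below are invented:

```python
# Fit an affine transform (x' = a*x + b*y + c, y' = d*x + e*y + f) from
# three ground control points, then map any pixel to geographic coordinates.

def solve3(A, b):
    """Solve a 3x3 linear system with Gaussian elimination (partial pivoting)."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):  # back-substitution
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def affine_from_gcps(gcps):
    """Fit the six affine parameters from three (pixel, geo) control-point pairs."""
    A = [[px, py, 1.0] for (px, py), _ in gcps]
    abc = solve3(A, [g[0] for _, g in gcps])   # parameters for longitude
    def_ = solve3(A, [g[1] for _, g in gcps])  # parameters for latitude
    return abc, def_

def apply_affine(params, px, py):
    (a, b, c), (d, e, f) = params
    return a * px + b * py + c, d * px + e * py + f

# Invented control points: stable channel intersections located both in the
# new balloon image (pixel coords) and in the base imagery (lon, lat).
gcps = [
    ((120.0, 80.0),  (-89.950, 29.420)),
    ((900.0, 95.0),  (-89.930, 29.419)),
    ((500.0, 700.0), (-89.940, 29.405)),
]
params = affine_from_gcps(gcps)
print(apply_affine(params, 120.0, 80.0))  # reproduces the first control point
```

With more than three control points, the same system would be solved in a least-squares sense; stable interior features like channel intersections make good control points precisely because they move little between image dates.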

[Image: The new image aligned in overlay with the base data during georectification.]

Visit the Public Lab site for guides and discussion about the process and to view image sets and maps that have already been published in the map archive. The archive is a home and distribution channel for published open-source maps.

Visit MapKnitter.org to learn about our map-stitching tools and to view maps that are being created by people who are using aerial mapping techniques in new ways to document and monitor sites and events that are of importance to their community.

December 08 2011

18:30

Mapping the Story of Climate Change

For this week's climate meetings in Durban, the World Bank released a series of maps showing the predicted impact of climate change on the world between now and 2100.

The data is dismal. If climate change continues unmitigated as it has for the past century, global temperatures will increase 5 degrees Celsius (9 degrees Fahrenheit) by 2100 -- a change comparable in magnitude to the difference between today's climate and the last ice age. This change won't affect the world equally: local changes vary from almost none to more than 10 degrees Celsius, depending on scenario, location and season.

All of these maps were designed using Development Seed's TileMill, an easy-to-use open-source map design tool that we've written about here before, and hosted on MapBox Hosting. TileMill is free to download and has loads of documentation to help people get started making maps. For design tips on map making, check out a blog post from Development Seed's AJ Ashton on the thinking behind the design of these maps.

Preparing for climate change

These maps tell the story of the anticipated impact of climate change, from the basics of where we'll see the biggest increase in temperature and fluctuation in precipitation levels to larger societal impacts on food security, countries' economies, and people's vulnerability to natural disasters. With these maps, the World Bank aims to not only show the urgency in preparing for climate changes, but also to target efforts to the countries and regions that will be most affected.

This map shows the expected worldwide temperature increases, assuming that global population continues to increase and regionally oriented economic growth is slower than in other scenarios.

Agriculture is expected to be one of the most affected industries, with consequences for countries' economies -- all the more so for those whose GDP (gross domestic product) is made up largely of agriculture-related business. For example, agriculture accounts for 61.3 percent of Liberia's GDP and 47.68 percent of Ethiopia's, but just 1.24 percent of the U.S. GDP.

Low-lying coastal areas will likely be more vulnerable to increased flooding, with countries such as Bangladesh, Myanmar and India at highest risk due to the huge populations that live there.

More details on the maps are available in this blog post by Development Seed's Alex Barth.

The data powering the maps is all publicly available from the World Bank, as part of its larger open data push with data.worldbank.org. This and other related climate data is all housed in its Open Data Resources for Climate Change. The World Bank is encouraging people to use this data and is hosting an Apps for Climate challenge to promote and reward this use. Check out the details, and be sure to submit your app by March 16.
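Because the data sits behind the World Bank's public API, figures like the agriculture shares above can be pulled programmatically. A small sketch that builds query URLs for the agriculture-value-added indicator -- the v2 URL pattern and the NV.AGR.TOTL.ZS indicator code follow the World Bank's published API conventions, used here as assumptions:

```python
def indicator_url(country, indicator, date_range="2010:2012", per_page=100):
    """Build a World Bank open-data API query (v2, JSON output)."""
    return (f"http://api.worldbank.org/v2/country/{country}"
            f"/indicator/{indicator}?format=json"
            f"&date={date_range}&per_page={per_page}")

# Agriculture, value added (% of GDP) -- indicator code NV.AGR.TOTL.ZS
for code in ("LR", "ET", "US"):
    print(indicator_url(code, "NV.AGR.TOTL.ZS"))
```

Fetching each URL (with `urllib.request` or any HTTP client) returns a JSON array whose second element holds the per-year values, ready to join against country geometries for mapping.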

July 31 2011

16:44

6 Proposals for Journalism Education Today

I’ve spent a huge amount of time this year thinking about and working on journalism curriculum. From developing and teaching a four-week program to train journalism educators in Africa in the practice of online journalism, to helping with a major overhaul of the undergraduate curriculum in my own department, to my current preparations to teach journalism at a university in Indonesia, I have been thinking a lot about what students need to learn today.

Here are six proposals in three distinct areas of journalism that are increasingly important today.

Data Journalism

My colleague Ron Rodgers sent me this post from the Guardian, and it has great value in its brevity and directness: Data journalism at the Guardian: What is it and how do we do it? It addresses 10 big themes that a journalism educator could build a whole course around, but you can read the whole post in about 10 minutes.

In contrast, a paper produced last August as the outcome of a conference in Europe about data-driven journalism is quite long — 78 pages. The paper, Data-driven journalism: What is there to learn?, provides many details in a very well organized format, and it includes lots of links to examples and tools (free tools!).

Moreover, there’s a new book to help us teach students about data! The video below explains it.

Proposals: (1) A journalism degree program should ensure that all students are introduced to basic data journalism, using current examples and demonstrating how to apply concepts. (2) A journalism degree program should offer at least one 3-credit elective course that focuses exclusively on data journalism.

Social Media and Participation

Just about everyone who teaches journalism is trying to figure out how to integrate social media into the mix. We all know that young people are already active users of social media — but that doesn’t mean they understand how to use those media ethically and effectively to do journalism.

Did you know that journalists in Al Jazeera’s Arabic and English newsrooms have had intensive social-media training? Read about it here. The same article discusses how social media links drive traffic to news websites.

As well as getting involved (if they choose) in newsgathering, verification and curation of news, readers and viewers have also become part of the news-distribution system as they share and recommend items of interest via e-mail and social networks. [source]

The phrase participatory journalism is not precisely defined, but I take it to mean that the audience participates in setting the agenda for news. This requires that journalists make themselves open to listening more, and listening to more sources (not only official ones), as well as making a commitment to go beyond superficial (and sometimes denigrating) man-on-the-street interviews.

Another important term is crowdsourcing. This is one kind of audience participation in gathering news — but not the only kind. This BBC story provides a good overview of crowdsourcing, and this article from the scholarly journal Journalism Practice discusses some excellent examples.

Proposals: (3) All journalism students need to learn how to use social media for specific journalistic goals. Assignments should focus on distinct uses such as identifying experts, crowdsourcing, and crisis mapping. (4) In any journalism program, the instructors must work together to eliminate unnecessary repetition in the program — for example, two or more required courses might have almost identical Twitter assignments or blogging assignments. This is a particular danger because it’s easy to integrate social media into almost any course — but redundancy risks trivializing the experience for students.

Presentation

This is not just a matter of design (as in “page layout and design”), and it should never be a mere afterthought in the production of news materials. A wonderful post by designer Andy Rutledge illustrates better than anything else I have seen why news websites — and many news applications for mobile devices — are more likely to repel readers than to attract them.

Sometimes I think the students who choose to major in journalism came to us through a time machine from a place where people still read text that is printed on paper. What’s especially strange is that most of these students do not themselves read any text on paper — but they imagine that someone will give them a job where they will spend all their time writing text, text, text that will not interact with any other media.

In the early days of print newspapers, pictures were added to help attract people who would buy the product and read the text. Formats and font sizes (among other things) make journalism more appealing. When the product is appealing, it does not drive people away.

Unfortunately, many online and digital news products since the mid-1990s have been doing just that -- driving people away. Why was this permitted? Why didn't the entire newsroom stand up and protest that the website was hideous, slow, impossible to read, horrible, off-putting, unusable? They didn't do it because it wasn't their job -- the way their stories looked was of no concern to them. As the readers abandoned them, the journalists continued to be silent and even ignorant about the destructive effects of bad digital design.

Educators could use this book, for example, and assign students to evaluate news web pages according to its principles: Neuro Web Design: What Makes Them Click?

Proposals: (5) Every journalism program needs a required course in visual design. (6) A journalism course in visual design must educate students in the principles that make an image, a frame, a page, and a screen appealing -- or off-putting. The course does not need to produce skilled designers; rather, it should produce journalists who recognize when a presentation of news or journalism is effective, and when it is confusing, difficult, and fails.


In the early days of print newspapers, pictures were added to help attract people who would buy the product and read the text. Formats and font sizes (among other things) make journalism more appealing. When the product is appealing, it does not drive people away.

Unfortunately, many online and digital news products since the mid-1990s have been doing just that — driving people away. Why was this permitted? Why didn’t the entire newsroom stand up and protest that the website was hideous, slow, impossible to read, off-putting, unusable? They didn’t do it because it wasn’t their job — the way their stories looked was of no concern to them. As readers abandoned them, the journalists remained silent, even ignorant, about the destructive effects of bad digital design.

Educators could use this book, for example, and assign students to evaluate news web pages according to its principles: Neuro Web Design: What Makes Them Click?

Proposals: (5) Every journalism program needs a required course in visual design. (6) A journalism course in visual design must educate students in the principles that make an image, a frame, a page, and a screen appealing — or off-putting. The course does not need to produce skilled designers; rather, it should produce journalists who recognize when a presentation of news is effective, and when it is confusing, difficult, or simply fails.

July 20 2011

14:31

Dotspotting + Embeds = Great Maps of Prisons, Crime, Pavement Dots

There are three basic parts to working with online representations of urban civic data in Dotspotting: collating the data, manipulating it, and then sharing and publishing it. Up until now, we've been focused on the first two, which makes sense. Obviously you need to be able to gather and work with the data before you can share it.

Today we're announcing the inclusion of the project's most requested feature: embedding the maps that people make into sites of their own.

Dotspotting makes tools to help people gather data about cities and make that information more legible. It's the first project Stamen released as part of Citytracking, a project funded by the Knight News Challenge.

Dotspotting's "embed/export" feature has been reworked to include the ability to generate HTML code that you can configure to your own specs, depending on how your site is formatted. Basic embed code is available in default mode, which will generate a map that looks pretty much the way it does on Dotspotting:

California state prisons on Dotspotting

There are a couple of different options in embed; so, for example, you can swap out the normal toner cartography for Bing's new (awesome) map tiles:

California state prisons on Dotspotting

We've been working with Mission Local, a news organization that reports on our home base of the Mission District, to find ways to take the lessons learned from the Crimespotting project and give this ability to local publications and advocates. The crime theme we've developed with them lets you generate maps that look like the one below, if you provide a "crime type" value in your data:

Crime June 21-28 updated on Dotspotting

And my favorite so far is the photo theme, which takes a "flickr:id" or "photo_url" field from your data (say, a set on flickr) and generates a visual mapping of where the photos are:

Dots on the pavement from flickr on Dotspotting

We're planning on releasing more of these as time goes by; if you've got ideas for a theme you'd like to see, please upload some data and get in touch!

May 02 2011

18:32

Tying a slideshow and a map together?

I have some geographically-tied data, and I'd like to be able to do a slideshow (text and/or images) with a map that highlights the current slide's location along side it. Anyone seen this sort of thing out in the wild? Any tips on how to make it happen? I've seen a lot of examples of maps and texts embedded in the map, but I'd like to have the map below the text/images.

April 01 2011

16:59

Map Mashup Shows Broadband Speeds for Schools in U.S.

The Department of Education (DOE) recently launched Maps.ed.gov/Broadband, an interactive map that shows schools and their proximity to broadband Internet access speeds across the country. This is an important story for DOE, an agency with a stated goal that all students and teachers have access to a sufficient infrastructure for learning -- which nowadays includes a fast Internet connection. The map is based on open data released last month by the Federal Communications Commission (FCC). As you can see below, the result is a custom map that tells a unique story -- how schools' Internet access compares across the country.

In addition to being an example of an open data mashup, this map also serves as an example of what can be built with emerging open-source mapping tools. We worked with DOE to process and merge the two data sets, and then generated the new map tiles using Mapnik, an open-source toolkit for rendering map tiles. Then we created the custom overlay of schools and universities using TileMill, our open-source map design studio. Finally, a TileMill layer was added on top of the broadband data.
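The map tiles mentioned above are addressed by the standard Web Mercator "slippy map" scheme used by Mapnik, TileMill, and most web maps. As a rough illustration (plain Python, not part of the DOE/Mapnik pipeline; the coordinates are just an example), here is how a longitude/latitude pair resolves to a tile address at a given zoom level:

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """Convert a WGS84 longitude/latitude to slippy-map tile coordinates.

    At each zoom level the world is covered by a 2**zoom by 2**zoom grid
    of tiles; renderers like Mapnik draw one image per (x, y, zoom) cell.
    """
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

# Example: a point near Washington, DC at zoom 12.
print(lonlat_to_tile(-77.02, 38.89, 12))
```

A tile server simply returns the pre-rendered image for each (zoom, x, y) address, which is what makes these maps cheap to cache and easy to layer custom overlays onto.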

The Feds' Open-Source Leadership

It is great to see both the DOE and FCC able to leverage open data to make smarter policy decisions. Karen Cator, the director of the office of educational technology at DOE, has an awesome blog post about why this mashup matters:

"The Department of Education's National Education Technology Plan sets a goal that all students and teachers will have access to a comprehensive infrastructure for learning, when and where they need it," Cator writes. "Broadband access is a critical part of that infrastructure. This map shows the best data to date and efforts will continue to gather better data and continually refresh the maps."

February 06 2011

19:11

The Visual Appeal of Super Bowl Sunday

In case you haven’t heard or seen, Super Bowl XLV TV coverage begins on Fox Sports at 2 p.m. ET today, with kickoff at 6:29 p.m. ET.

Fans, sponsors, and more are pulling out all the stops for what’s being described as a classic matchup between two old-school, cheerleader-less football franchises in an unexpectedly icy stadium.

For a sport that has never failed to capture national attention, it’s interesting to see how large each team’s respective fan nation is in landmass — and to notice that the Green Bay Packers and Pittsburgh Steelers areas are almost evenly matched.

Here’s graphic designer Jared Fanning‘s take:

The United States of Football by Jared Fanning

A slightly different, visually exciting version was posted on I Love Charts:

The United States of Football, from I Love Charts

National spectacle knows no bounds, however, and Visa, smartly, is taking advantage with dynamic visualizations of Twitter chatter, including a look at football-related trending topics in the days leading up to today’s big game:

Visa Super Bowl Twitter trending topics map

Not everyone will be focused on Super Bowl pre-game coverage, or at least that’s what Animal Planet is counting on.

The Puppy Bowl is back, offering entertainment to those who prefer tumbling fuzzy animals to the charging bulls of the gridiron. Broadcast starts at 3 p.m. ET (tape delayed to 3 p.m. Pacific).

Meanwhile, advertisers have put up big bank to be a part of today’s big game. “Fox was seeking between $2.8 million and $3 million for 30 seconds of time,” writes AdAge, which rounds up facts on all the spots.

January 21 2011

15:23

Turning the iPad into an Open, Offline Mapping Platform

We've talked here before about TileMill, an open source tool for creating your own custom map tiles (the individual pieces that make up a full map of a city, country, and so on). But what sorts of things can you do with these map tiles? One area we wanted to explore was using them on Apple's latest touch-based device, the iPad. Providing a touch interface for maps is a serious usability win and the long battery life, huge available storage, and opportunistic network connectivity combine to make a really attractive mobile mapping platform.

The result? The MapBox iPad app. This app allows you to use custom maps on the iPad (and in an open format), as well as use OpenStreetMap (OSM) map tiles, overlay custom data in Google Earth's popular KML format as well as GeoRSS, save and share map snapshots, and much more.

To create the app, the first thing we had to figure out was an alternative to Apple's standard MapKit toolset, which only uses online Google Maps. This was accomplished with the open source route-me library. Once that was decided, we created a file format called MBTiles to easily exchange potentially millions of tile images so they could be used offline.
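MBTiles is, at bottom, a single SQLite database with a known table layout, which is what makes shipping millions of tiles as one file practical. Here is a minimal sketch using only Python's standard sqlite3 module (the table layout follows the published MBTiles spec; the tile bytes are placeholder values, not real PNG data):

```python
import sqlite3

# An MBTiles file is one SQLite database; a real one would live on disk,
# e.g. "haiti.mbtiles" -- we use an in-memory database here for brevity.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The two core tables defined by the MBTiles spec.
cur.execute("CREATE TABLE metadata (name TEXT, value TEXT)")
cur.execute(
    "CREATE TABLE tiles (zoom_level INTEGER, tile_column INTEGER, "
    "tile_row INTEGER, tile_data BLOB)"
)
cur.execute("INSERT INTO metadata VALUES ('name', 'Demo map')")

# Store one placeholder tile; a renderer would supply real PNG bytes.
cur.execute("INSERT INTO tiles VALUES (?, ?, ?, ?)", (0, 0, 0, b"\x89PNG..."))

# Offline lookup: no network needed, just a local SQL query per tile.
row = cur.execute(
    "SELECT tile_data FROM tiles "
    "WHERE zoom_level = 0 AND tile_column = 0 AND tile_row = 0"
).fetchone()
print(row is not None)
```

Because each lookup is a single query against a local file, the app can pan and zoom with no connectivity at all — which was the whole point of the format.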

We then layered on data visualizations, creating an open source library called Simple KML in order to parse and display the KML and KMZ file formats, something that hasn't really been done much on the iPhone or iPad outside of Google's own app.

MapBox for iPad

To round out the initial release, we added the ability to save the current view -- coordinates, zoom level, and data overlays -- as a document for later, as well as the ability to email a picture of the current map straight from the app.

As a whole, we've been really happy with the iPad as an open mapping platform. We've used some tools, made some new ones available, and combined them all in new ways.

Do you have any ideas for open mapping on the iPad? We'd love to hear your thoughts in the comments or on Twitter, where you can follow our progress at @MapBox.

January 11 2011

16:00

A year later, lessons for media from Haiti earthquake response

Tomorrow marks the one-year anniversary of Haiti’s devastating earthquake, as well as the anniversary of one of the largest humanitarian responses to a natural disaster, with almost $3.8 billion in aid given or pledged. The international response required coordination between the Haitian government, foreign governments, NGOs, and the people directly affected by the disaster, becoming a real-world laboratory for testing new tools — from SMS short messaging to crowdsourced crisis mapping with Ushahidi — according to a new report from the Knight Foundation. And the response’s successes and failures could prove valuable lessons both for responding to the next crisis and for better understanding mass media.

Three innovations in particular, the report says, were put to the test: Broadcasting crisis information with SMS, crowdsourcing data into actionable information, and using open mapping tools to meet humanitarian needs.

The report found that none of them, however, would have been as effective without one very low-tech tool: radio.

Radio still Haiti’s dominant medium

In a country with a literacy rate of just 52 percent, traditional newspapers and Internet access would have been of low value even if the presses and power lines hadn’t been knocked out of commission. Inexpensive, resilient, and nearly universal radio access, however, cut past literacy and economic boundaries, particularly since one station, Signal FM, managed to continue operating throughout the crisis. Signal FM, and other stations as they returned to broadcasting, served as vital information sources, detailing aid and emergency procedures and helping connect survivors with other resources.

“Although much of the attention has been paid to new media technologies, radio was the most effective tool for serving the needs of the public,” the report notes. But while radio provided the first line of communications, new technologies were critical in actually connecting communities with the aid they needed.

For help, text 4636

Despite erratic cellular service (cell towers would often go live for a few hours, followed by hours of silence), SMS text messaging proved an invaluable tool: Even when coverage was down, messages could be queued and then sent when access returned. The Knight report states that the first usage was informal: Local journalists received pleas for help or reports from the ground. But local service provider Digicel, collaborating with non-profit InSTEDD, set up a dedicated, free short code service at the number 4636, which was up and running four days after the quake and allowed Haitians to text in reports and even requests for emergency help — even as the system was used by Thomson Reuters Foundation to broadcast basic shelter, hygiene and security alerts to roughly 26,000 subscribers.

Across the Atlantic, SMS messages were also playing a very important relief role: raising money. SMS was widely used by the Red Cross and other organizations in the United States to spur convenient giving, helping raise $30 million in the 10 days after the earthquake. In fact, more people donated via text messaging (14 percent) than telephone (12 percent) or e-mail (5 percent).

Despite the invaluable data in the flood of wireless bits, though, the mix of languages, formats, and levels of urgency sent in to relief workers also posed serious logistical hurdles.

Crowdsourcing through the diaspora

Critical to parsing through all the data were centers far outside of Haiti, like one group in Boston that helped geolocate emergency texts, information that was then passed along to relief workers on location. Groups of Haitian expatriates helped translate the flood of data from Creole, French, and Spanish into English, passing it along to the most appropriate aid organizations as well as the U.S. Marines, who often served as the basis for search-and-rescue missions.

In Haiti, the report found, the use of crowdsourced emergency information had hit a turning point, helping inform real-time decision-making.

“In most previous efforts, information was collected mostly to understand when, where and why events were occurring. It had been relatively rare for such information to be useful for actual response to a specific problem,” the report states. “In Haiti, by contrast, limited numbers of humanitarian responders attempted to include crowdsourced information to help form their decisions about where to respond, to send search-and-rescue teams, to identify collapsed structures and to deliver resources. While these efforts were not systemic in nature, they were nonetheless groundbreaking.”

Ushahidi mapping provides real-time relief

The report also highlights the use of crowdmapping tool Ushahidi, which volunteers used to map many of the incoming pleas for help to determine trouble “hot spots” and inform rescue operations where they were needed. Haiti.ushahidi.com served as one focal point for cataloging submitted public health, security and other dangers, making it easy to quickly see what problems a particular area was facing and quickly deploy help there.

While the report is careful to note that these efforts were limited, it highlighted their potential, noting that even these early efforts were making a difference. According to a Ushahidi team leader at Tufts, “On the third day, FEMA (Federal Emergency Management Agency) called us to say keep mapping no matter what people say – it’s saving lives.”

The changed role of media in crisis

While the report focuses primarily on local and “humanitarian media,” it notes that journalists often played an important role not just in documenting the damage and recovery, but in connecting local communities with information when traditional lines of communication were severely disrupted. Trusted on-air radio personalities switched from delivering hit music to health bulletins, while reporters passed along reports of danger and distress. Radio host Cedre Paul of “Radio One Haiti” sent out “prolific Twitter messages that provided real-time updates to his many followers,” the report noted.

While the blurring of roles between NGOs and the media is by no means new, Haiti served as a powerful reminder of the dual role — both documenting and aiding — that news organizations can serve. Even traditional outlets, like CNN and the New York Times, served in relief capacities by partnering with NGOs and Google to create a unified Haitian people finder.

Unmet potential

While the report focuses on the successes, it recognizes that many barriers still need to be dealt with for the highlighted technologies to have maximum impact, particularly in regard to education and policies within more traditional aid and government organizations, whose strict privacy rules, for example, often conflict with the transparency required by crowdsourced projects like Ushahidi. Still, both sides, the technologists and the aid organizations, have realized the gains to be had from cooperation, the report states, and strides are being made toward better integration of these technologies into traditional response plans.

“This process of creating new forms of collaboration between different organizational cultures will not be quick or easy,” the report concludes. “However, the promise of such collaboration is recognized by many of those involved, on both sides of the equation. The question is not whether this process will advance, but how.”

The full Knight report and related materials are available for download.

December 13 2010

03:04

Custom/simplified maps with OpenHeatMap and/or OpenStreetMap

I'm working on a project with OpenHeatMap, which overall seems like a great, simple well-executed project. It's meeting all my needs, except it relies on OpenStreetMap for its mapping, and I can't find a way to reduce the level of detail it presents. Ideally, I'd like it to simply show a very high-level view of one state, and keep out the clutter, but I can't find any way to do this on either OpenStreetMap or OpenHeatMap.

Any advice? OpenHeatMap allows you to set your map tile source, but I couldn’t find another source besides OSM.

December 03 2010

14:19

Views part 1 – Canadian weather stations

(This is the first of two posts announcing ScraperWiki “views,” a new feature that Julian, Richard and Tom worked away on and secretly launched a couple of months ago. Once you’ve scraped your data, how can you get it out again in just the form you want?)

Canadian weather stations

Clear Climate Code is a timely project to reimplement the software of climate science academics in nicely structured and commented Python. David Jones has been using ScraperWiki views to find out which areas of the world they don’t have much surface temperature data for, so they can look for more sources.

Take a look at his scraper Canada Climate Sources. If you scroll down, there’s a section “Views using this data from this scraper”. That’s where you can make new views – small pieces of code that output the data the way you want. Think of them as little CGI scripts you can edit in your browser. This is a screenshot of the Canada Weather Station Map view.

It’s a basic Google Map, made for you from a template when you choose “create new view”. But David then edited it, to add conditional code to change the colours and letters on the pins according to the status of the stations.

This is the key powerful thing about ScraperWiki views – even if you start with a standard chart or map, you have the full power of the visualisation APIs you are using, and of HTML, Javascript and CSS, to do more interesting things later.

There’s more about ScraperWiki and the Canada weather stations in the posts Canada and Analysis of Canada Data on the Clear Climate Code blog.

Next week – part 2 will be about how to use views to output your data in the machine readable format that you want.


October 23 2010

00:28

Which scripting language for a novice who wants to get into Google/Bing Mapping APIs?

I have a journo background, and am a programming novice (except for a single undergrad PASCAL course in 1994). My goal is to learn how to use the Google and/or Bing Maps APIs, but first, to get a handle on a scripting language like Java or Python. My question: which of these languages is most useful in the context of a Maps API?

From what I've read, Python is a great language to get started with for a novice programmer, but when I read about the Google Maps API, a knowledge of Java always seems assumed. Is Java needed for working with Google Maps API? With Bing Maps' API? Can either of these APIs adapt to different scripts? As you can see, I'm a little confused.

Thanks in advance.

October 20 2010

16:16

OpenStreetMap's Audacious Goal: Free, Open Map of the World

In our previous posts on TileMill, we’ve focused on how open data can be used to create custom maps and tell unique stories. One question we run into a lot is, “Where does open data come from?”

One exciting source is a global mapping project called OpenStreetMap (OSM). Founded in 2004 with the goal of creating a free and open map of the world, OSM now boasts over 300,000 contributors and has comparable or better data for many countries than the popular proprietary or closed datasets. The premise is simple and powerful: Anyone can use the data, and anyone can help improve it.

OSM-based map of Port au Prince made with TileMill

With this huge amount of data, activity, and adoption, we’re excited about how TileMill is going to give more people ways to leverage OSM data to make their own maps. Users will be able to mash up OSM data on their own using TileMill and turn it into their very own custom map.

Humanitarian OpenStreetMap Team

To get a sense of the practicality of OSM, just look at the role it played in the response to the January 12 earthquake in Haiti. Reliable maps are critical to disaster response efforts and there simply wasn’t much data available for the affected areas. Within hours of the quake, the OSM community mobilized and hundreds of volunteers from all over the world began tracing available satellite imagery, importing available datasets, and coordinating with relief workers on the ground to ensure that new data was being created and distributed in ways that would best support their work.

Using OpenStreetMap as a platform and leveraging the existing, engaged community paid off — within days, volunteers had created the best available maps of Port au Prince and nearby cities. OSM data quickly appeared on the GPS devices of search and rescue teams, and in the planning tools of the international response community.

Members of the Humanitarian OpenStreetMap Team (HOT), of which I’m a member, have continued to support the use of OSM in Haiti through trainings with local NGOs, the Haitian government, and international responders. In November, I’ll be part of the fifth deployment of HOT team members to Haiti to support the International Organization for Migration (IOM) in their work to map the camps for people displaced by the earthquake, using OSM as a platform.

Through this effort by the OSM community, anyone looking to make a map of Haiti has a great database of roads, hospitals, and even collapsed buildings that they can use in their work. We see this kind of data sharing as important capacity-building to help people make useful custom maps. With TileMill, we’re working to create a practical toolset for working with this data.

Beyond Haiti

Moving beyond Haiti and thinking about maps of other places, what’s exciting about OpenStreetMap is the hundreds of community groups around the world getting together and using OSM to map their own cities and neighborhoods. If map data doesn’t exist yet, there’s a chance it could come to exist through the efforts of the OSM community. For instance, the image below shows work the local OSM community did in Washington, DC, to make a very detailed map of the National Zoo.

Mapping the National Zoo in Washington, DC by ajturner

If you’re looking for open map data for your next project, a great place to start would be to reach out to the local OSM community in your area — there’s a good chance they can help you figure out how to get it.

October 05 2010

21:46

Six Stunning Projects That Show the Power of Data Visualization

Data visualization is taking the web by storm and, with a little luck, it might be the next big thing in online journalism. Buoyed by the open data movement and accelerating change in newsrooms around the country, it has become something more than just flashy graphics and charts -- it is a new form of visual communication for the 21st century.

In the coming months, I'll be writing about this emerging field for MediaShift. We'll cover best practices, free tools and resources. We'll also analyze the best of the best and talk to data visualization ("viz") bloggers about what's hot and what's not. From time to time, I'll share some of my own data viz experiences with you and seek your feedback.

What is Data Visualization?

At its core, data visualization is the visual representation of information served up with a healthy dose of innovation and creativity. A truly stunning data viz becomes more than the sum of its parts. This new digital alchemy can turn a simple spreadsheet into something that can shake up the debate, measure progress or even change the world.

This periodic table of visualization methods by the folks over at VisualLiteracy.org illustrates a number of different elements or viz building blocks. A data viz can take the form of an infographic, a timeline or a map. It can be a motion chart, a short video clip, an interactive dashboard, or even a web app.
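Whatever form it takes, every viz starts with the same move: mapping values in a table onto a visual property such as length, area or color. A toy Python sketch, with made-up budget figures, shows that move at its most basic:

```python
# The core of any data viz: map numbers to a visual channel.
# Here, made-up budget figures become bar lengths in a toy text chart.
data = {"Health": 48, "Education": 36, "Defense": 60, "Transport": 24}

max_value = max(data.values())
WIDTH = 30  # length of the longest bar, in characters

lines = []
for label, value in sorted(data.items(), key=lambda kv: -kv[1]):
    bar = "#" * round(value / max_value * WIDTH)
    lines.append(f"{label:10s} {bar} {value}")

print("\n".join(lines))
```

Everything that follows — motion charts, tree maps, choropleths — is this same mapping, with more interesting visual channels.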

Below, you'll find six examples of data visualization from around the web and across the globe that provide an overview of the techniques and approaches to data visualization.

1. Work With Passion Like Hans Rosling


Any discussion about data visualization has to start with Hans Rosling. He is a professor of international health and co-founder/director of the Gapminder Foundation. He created the Trendalyzer, an advanced motion chart that makes statistics come alive.

If you are not excited about the power of data visualization, you will be after watching this video of his TED talk on health in the developing world. The magic begins at around three minutes in:

You can make your own international health comparisons using an interactive motion chart or download the free Gapminder desktop application for a hands-on data experience.
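Under the hood, a motion chart like the Trendalyzer is a scatter plot whose points glide between observed years; the in-between animation frames are interpolated. A small Python sketch of that idea, using made-up figures rather than Gapminder's real data:

```python
# Sketch of a motion chart's engine: linear interpolation between two
# observed years produces the in-between animation frames.
# All figures here are invented for illustration.
observations = {
    # country: {year: (income, life_expectancy)}
    "A": {1960: (1000, 45.0), 1970: (1400, 52.0)},
    "B": {1960: (8000, 69.0), 1970: (9500, 71.0)},
}

def frame(year, start=1960, end=1970):
    """Interpolated (income, life expectancy) per country for any year."""
    t = (year - start) / (end - start)
    out = {}
    for country, obs in observations.items():
        (x0, y0), (x1, y1) = obs[start], obs[end]
        out[country] = (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
    return out

print(frame(1965))  # halfway between the 1960 and 1970 values
```

Render each frame as a bubble chart (bubble area scaled to population) and you have the skeleton of an animated motion chart.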

2. Visual Can Also Be Visceral

Latoya Egwuekwe, a former classmate of mine at American University's Interactive Journalism program, made national headlines with her visualization of county-level unemployment data. See it for yourself: The Geography of a Recession. This viz has received over 1 million hits since it was launched in October 2009.


Every day, I work with labor statistics and I am still floored every time I see this viz. It goes to show that you don't have to be a pro to have an impact. Around the web, students, citizen journalists and bloggers are breaking new ground.
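The engine of a county-level viz like this is simple: each county's rate is binned into a color bucket, and the map is redrawn for every month of data. A Python sketch with hypothetical counties and rates (the thresholds below are illustrative, not the ones Egwuekwe used):

```python
# A choropleth's core step: bin each county's unemployment rate into a
# color bucket. Counties, rates and thresholds are all hypothetical.
BUCKETS = [(4.0, "light"), (7.0, "medium"), (10.0, "dark"), (float("inf"), "darkest")]

def shade(rate):
    """Return the color bucket for an unemployment rate (in percent)."""
    for upper, color in BUCKETS:
        if rate < upper:
            return color

rates = {"County A": 3.1, "County B": 6.8, "County C": 9.9, "County D": 14.2}
shades = {county: shade(rate) for county, rate in rates.items()}
print(shades)
```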

3. Making a Story Hit Home

Data visualizations can be used to tease out and illustrate trends from data in new and unexpected ways.

Timothy Noah over at Slate introduced the concept of "the Great Divergence," then used a data viz to take readers on a visual tour of income inequality in America.


Dubbed the United States of Inequality, the 10-part series and its accompanying viz show a widening income gap.

4. Use Motion to Move Your Audience

A visual look at aging around the world by General Electric incorporates motion beautifully. It allows you to compare age cohorts from different countries over time -- think Baby Boomers, Generation X, etc. Watch as your generation grows old and dies based on United Nations population projections.


This viz is called Our Aging World and is presented as an interactive motion chart.

5. Seeing Something in a New Light

This viz by NMAP.org shows the web like you've never seen it before. If you've ever clicked a mouse before, you're probably familiar with favicons -- the tiny images that appear next to the website URL in the address bar of your browser. This viz includes close to 300,000 of them.


The size of a company's favicon corresponds to the reach of its website on the web. As you might have guessed, Google is the largest. Check out Icons of the Web, a gigapixel image with an interactive viewer.
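Sizing icons by reach hides a subtle point: if each icon's *area* is to be proportional to reach, its side length has to grow with the square root of reach, or the biggest sites would dwarf everything else. A Python sketch with invented reach figures:

```python
import math

# If a favicon's area should be proportional to a site's reach, its side
# length must scale with the square root of reach. Figures are made up.
reach = {"siteA": 1_000_000, "siteB": 250_000, "siteC": 10_000}

BASE = 16  # pixels for the smallest icon
smallest = min(reach.values())
# 100x the reach -> 10x the side length -> 100x the area.
sizes = {site: round(BASE * math.sqrt(r / smallest)) for site, r in reach.items()}
print(sizes)  # {'siteA': 160, 'siteB': 80, 'siteC': 16}
```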

6. What's A Billion Dollars Between Friends?

Visualizing numbers can add context to any story. Last but not least, we have a viz by Information is Beautiful's David McCandless. It's called the Billion Dollar-O-Gram and is an interactive tree map. He created this viz out of frustration with media reports citing billion-dollar amounts without providing the proper context.


Not only is this viz useful and informative, it's also an example of open data in action. McCandless does something that should be an industry standard -- he links to the entire data set used to create the viz. You can also see how he has updated the viz over time; view the original version, which uses different facts and figures.
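A tree map divides one rectangle into tiles whose areas are proportional to the underlying values, which is what lets billion-dollar amounts be compared at a glance. A minimal "slice" layout in Python (far simpler than McCandless's layout, with illustrative amounts rather than his data):

```python
# Minimal "slice" treemap: split a rectangle into vertical tiles whose
# areas (width x fixed height) are proportional to each dollar amount.
# Amounts are illustrative, not from the Billion Dollar-O-Gram.
amounts = {"Program X": 300, "Program Y": 150, "Program Z": 50}

WIDTH, HEIGHT = 100.0, 60.0
total = sum(amounts.values())

tiles, x = {}, 0.0
for name, value in amounts.items():
    w = WIDTH * value / total
    tiles[name] = (x, 0.0, w, HEIGHT)  # (x, y, width, height)
    x += w

# Each tile's share of the total area equals its share of the total amount.
for name, (_, _, w, h) in tiles.items():
    print(name, w * h / (WIDTH * HEIGHT), amounts[name] / total)
```

Production tree maps use "squarified" layouts to keep tiles closer to square, but the area-proportional principle is the same.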

How Else Can Journalists Use This?

Besides using them to tell data stories, journalists can use visualizations in the newsroom or on the go for several essential activities. Data visualization can play a role in finding and processing information, not just communicating it.

The Most Beautiful Viz You Have Ever Seen

What is the most beautiful viz you have ever seen? What is your favorite viz of all time?

My pick for most beautiful is more form than function. It's Chris Harrison's Visualizing the Bible. Check it out for yourself.


My current favorite viz is a triple threat. It's beautiful, useful and also a great way to link to old movie reviews. It's the New York Times' Ebb and Flow of Movies.

I'd be doing you a disservice if I didn't also share a data visualization that I produced. This viz examines the state of the nuclear stockpile, and is called Between Five Countries, Thousands of Nukes.

Please share your favorite examples of data visualization in the comments, and stay tuned for my future posts about this emerging storytelling form.

Anthony Calabrese is a journalist based in Washington, D.C., who specializes in data visualization and digital storytelling. He works as a web producer for the State of the USA, where he blogs about measuring the nation's progress. Previously, he worked as a data researcher on the Best Colleges and Best Graduate Schools publications at U.S. News & World Report. He holds a master's degree in interactive journalism from American University and is an active member of the Online News Association.
You can follow him on Twitter @2mrw
