Tumblelog by Soup.io

August 20 2012


How Wikipedia Manages Sources for Breaking News

Almost a year ago, I was hired by Ushahidi to work as an ethnographic researcher on a project to understand how Wikipedians managed sources during breaking news events.

Ushahidi cares a great deal about this kind of work because of a new project called SwiftRiver that seeks to collect and enable the collaborative curation of streams of data from the real-time web about a particular issue or event. If another Haiti earthquake happened, for example, would there be a way for us to filter out the irrelevant, the misinformation, and build a stream of relevant, meaningful and accurate content about what was happening for those who needed it? And on Wikipedia's side, could the same tools be used to help editors curate a stream of relevant sources as a team rather than individuals?


Ranking sources

When we first started thinking about the problem of filtering the web, we naturally thought of a ranking system that would rank sources according to their reliability or veracity. The algorithm would consider a variety of variables involved in determining accuracy, as well as whether sources have been chosen, voted up or down by users in the past, and eventually be able to suggest sources according to the subject at hand. My job would be to determine what those variables are -- i.e., what were editors looking at when deciding whether or not to use a source?
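To make the idea concrete, here is a minimal, purely hypothetical sketch of the kind of ranking system we were imagining: the weights, vote counts and 0-to-1 accuracy history below are illustrative assumptions, not anything the project actually specified.

```python
from dataclasses import dataclass, field

@dataclass
class Source:
    """A news source with a track record and community votes (illustrative only)."""
    name: str
    accuracy_history: list = field(default_factory=list)  # 1.0 = accurate, 0.0 = not
    upvotes: int = 0
    downvotes: int = 0

def reliability_score(source, accuracy_weight=0.7, vote_weight=0.3):
    """Combine past accuracy and community votes into a single 0-1 score."""
    if source.accuracy_history:
        accuracy = sum(source.accuracy_history) / len(source.accuracy_history)
    else:
        accuracy = 0.5  # no track record: assume neutral
    total_votes = source.upvotes + source.downvotes
    vote_ratio = source.upvotes / total_votes if total_votes else 0.5
    return accuracy_weight * accuracy + vote_weight * vote_ratio

bbc = Source("BBC", accuracy_history=[1.0, 1.0, 0.0], upvotes=8, downvotes=2)
print(round(reliability_score(bbc), 3))  # → 0.707
```

As the rest of this post explains, a single number like this turned out to be the wrong framing, because much of what makes a source usable depends on the article's context rather than on the source itself.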

I started the research by talking to as many people as possible. Originally I was expecting that I would be able to conduct 10 to 20 interviews as the focus of the research, finding out how those editors went about managing sources individually and collaboratively. The initial interviews enabled me to hone my interview guide. One of my key informants urged me to ask questions about sources not cited as well as those cited, leading me to one of the key findings of the report (that the citation is often not the actual source of information and is often provided in order to appease editors who may complain about sources located outside the accepted Western media sphere). But I soon realized that the editors with whom I spoke came from such a wide variety of experience, work areas and subjects that I needed to restrict my focus to a particular article in order to get a comprehensive picture of how editors were working. I chose a 2011 Egyptian revolution article on Wikipedia because I wanted a globally relevant breaking news event that would have editors from different parts of the world working together on an issue with local expertise located in a language other than English.

Using Kathy Charmaz's grounded theory method, I chose to focus on editing activity (in the form of talk pages, edits, statistics and interviews with editors) from January 25, 2011, when the article was first created (within hours of the first protests in Tahrir Square), to February 12, when Mubarak resigned and the article changed its name from "2011 Egyptian protests" to "2011 Egyptian revolution." After reviewing big-picture analyses of the article using Wikipedia statistics on top editors, locations of anonymous editors, and so on, I started with an initial coding of the actions taking place in the text, asking the question, "What is happening here?"

I then developed a more limited codebook using the most frequent and significant codes and proceeded to compare different events with the same code (looking up relevant edits of the article in order to get the full story), and to look for tacit assumptions that the actions left out. I did all of this coding in Evernote because it seemed the easiest (and cheapest) way of importing large amounts of textual and multimedia data from the web, but it wasn't ideal: talk pages needed to be re-formatted after import, and I ended up coding all the data in a single column, since putting each talk-page conversation in its own cell would have been too time-consuming.


I then moved to writing a series of thematic notes on what I was seeing, trying to understand, through writing, what the common actions might mean. I finally moved to the report writing, bringing together what I believed were the most salient themes into a description and analysis of what was happening according to the two key questions that the study was trying to ask: How do Wikipedia editors, working together, often geographically distributed and far from where an event is taking place, piece together what is happening on the ground and then present it in a reliable way? And how could this process be improved?

Key variables

Ethnography Matters has a great post by Tricia Wang that talks about how ethnographers contribute (often invisible) value to organizations by showing what shouldn't be built, rather than necessarily improving a product that already has a host of assumptions built into it.

And so it was with this research project that I realized early on that a ranking system conceptualized this way would be inappropriate -- for the single reason that along with characteristics for determining whether a source is accurate or not (such as whether the author has a history of presenting accurate news articles), a number of important variables are independent of the source itself. On Wikipedia, these include variables such as the number of secondary sources in the article (Wikipedia policy calls for editors to use a majority of secondary sources), whether the article is based on a breaking news story (in which case the majority of sources might have to be primary, eyewitness sources), or whether the source is notable in the context of the article. (Misinformation can also be relevant if it is widely reported and significant to the course of events, as Judith Miller's New York Times stories were for the Iraq War.)


This means that you could have an algorithm for determining how accurate the source has been in the past, but whether you make use of the source or not depends on factors relevant to the context of the article that have little to do with the reliability of the source itself.

Another key finding weighing against source ranking is that Wikipedia's authority rests on its requirement that each potentially disputed phrase be backed up by reliable sources that readers can check, whereas source ranking necessarily requires that the calculation be invisible in order to prevent gaming. It is already a source of potential weakness that Wikipedia citations are often not the original source of information (since editors often choose citations that will be deemed more acceptable to other editors), so further hiding how sources are chosen would undermine this important value.

On the other hand, having editors provide a rationale for choosing particular sources, as well as showing the full variety of sources considered rather than only those that survived page-loading constraints, may be useful -- especially since these discussions do often take place on talk pages but are practically invisible because they are difficult to find.

Wikipedians' editorial methods

Analyzing the talk pages of the 2011 Egyptian revolution article case study enabled me to understand how Wikipedia editors set about the task of discovering, choosing, verifying, summarizing, adding information and editing the article. It became clear through the rather painstaking study of hundreds of talk pages that editors were:

  1. storing discovered articles, either in their own user space (by sorting relevant articles into categories) or by alerting other editors to breaking news on the talk page,
  2. choosing sources by finding at least two independent sources that corroborated what was being reported but then removing some of the citations as the page became too heavy to load,
  3. verifying sources by finding sources to corroborate what was being reported, by checking what the summarized sources contained, and/or by waiting to see whether other sources corroborated what was being reported,
  4. summarizing by taking screenshots of videos and inserting captions (for multimedia) or by choosing the most important events of each day for a growing timeline (for text),
  5. adding text to the article by choosing how to reflect the source within the article's categories and providing citation information, and
  6. editing, by disputing the way that other editors reflected information from various sources and by replacing primary sources with secondary sources over time.

It was important to discover the work process that editors were following because any tool that assisted with source management would have to accord as closely as possible with the way that editors like to do things on Wikipedia. Since the process is managed by volunteers and because volunteers decide which tools to use, this becomes really critical to the acceptance of new tools.



After developing a typology of sources and isolating different types of Wikipedia source work, I made two sets of recommendations as follows:

  1. The first would be for designers to experiment with exposing variables that are important for determining the relevance and reliability of individual sources as well as the reliability of the article as a whole.
  2. The second would be to provide a trail of documentation by replicating the work process that editors follow (somewhat haphazardly at the moment) so that each source is provided with an independent space for exposition and verification, and so that editors can collect breaking news sources collectively.


Regarding a ranking system for sources, I'd argue that a descriptive repository of major media sources from different countries would be incredibly beneficial, but that a system for determining which sources are ranked highest according to usage would yield really limited results. (We know, for example, that the BBC is the most used source on Wikipedia by a high margin, but that doesn't necessarily help editors in choosing a source for a breaking news story.) Exposing the variables used to determine relevancy (rather than adding them up in invisible amounts to come up with a magical number) and showing the progression of sources over time offers some opportunities for innovation. But this requires developers to think out of the box in terms of what sources (beyond static texts) look like, where such sources and expertise are located, and how trust is garnered in the age of Twitter. The full report provides details of the recommendations and the findings and will be available soon.

Just the beginning

This is my first comprehensive ethnographic project, and one of the things I've noticed, having done other design and research projects using different methodologies, is that the process can seem painstaking, and it can prove difficult to turn hundreds of small observations into findings that are actionable and meaningful to designers. Even so, getting close to the experience of editors is extremely valuable work that is rare in Wikipedia research. I realize now that, until I actually studied an article in detail, I knew very little about how Wikipedia works in practice. And this is only the beginning!

Heather Ford is a budding ethnographer who studies how online communities get together to learn, play and deliberate. She currently works for Ushahidi and is studying how online communities like Wikipedia work together to verify information collected from the web and how new technology might be designed to help them do this better. Heather recently graduated from the UC Berkeley iSchool where she studied the social life of information in schools, educational privacy and Africans on Wikipedia. She is a former Wikimedia Foundation Advisory Board member and the former Executive Director of iCommons - an international organization started by Creative Commons to connect the open education, access to knowledge, free software, open access publishing and free culture communities around the world. She was a co-founder of Creative Commons South Africa and of the South African nonprofit, The African Commons Project as well as a community-building initiative called the GeekRetreat - bringing together South Africa's top web entrepreneurs to talk about how to make the local Internet better. At night she dreams about writing books and finding time to draw.

This article also appeared at Ushahidi.com and Ethnography Matters. Get the full report at Scribd.com.

August 14 2012


What's Next for Ushahidi and Its Platform?

This is part 2 in a series. In part 1, I talked about how we think of ourselves at Ushahidi and how we think of success in our world. It set up the context for this post, which is about where we're going next as an organization and with our platform.

We realize that it's hard to understand just how much is going on within the Ushahidi team unless you're in it. I'll try to give a summarized overview, and will answer any questions through the comments if you need more info on any of them.

The External Projects Team

Ushahidi's primary source of income is private foundation grant funding (Omidyar Network, Hivos, MacArthur, Google, Cisco, Knight, Rockefeller, Ford), and we don't take any public funding from any country so that we are more easily able to maintain our neutrality. Last year, we embarked on a strategy to diversify our revenue stream, endeavoring to decrease our percentage of revenues based on grant funding and offset that with earned revenue from client projects. This turned out to be very hard to do within our current team structure, as the development team ended up being pulled off of platform-side work and client-side work suffered for it. Many internal deadlines were missed, and we found ourselves unable to respond to the community as quickly as we wanted.

This year we split out an "external projects team" made up of some of the top Ushahidi deployers in the world, and their first priority is to deal with client and consulting work, followed by dev community needs. We're six months into this strategy, and it seems like this team format will continue to work and grow. Last year, 20% of our revenue was earned; this year we'd like to get that to the 30-40% range.

Re-envisioning Crowdmap

When anyone joins the Ushahidi team, we tend to send them off to some conference to speak about Ushahidi in the first few weeks. There's nothing like knowing that you're going to be onstage talking about your new company to galvanize you into really learning about and understanding everything about the organization. Basically, we want you to understand Ushahidi and be on the same mission with us. If you are, you might explain what we do in a different way than I do onstage or in front of a camera, but you'll get the right message out regardless.


You have a lot of autonomy within your area of work -- or so we always claimed internally. This was tested earlier this year, when David Kobia, Juliana Rotich and I, as founders, were forced to ask whether we were serious about that claim or just paying it lip service. Brian Herbert leads the Crowdmap team, which in our world means he's in charge of the overall architecture, strategy and implementation of the product.

The Crowdmap team met up in person earlier this year and hatched a new product plan. They re-envisioned what Crowdmap could be, started mocking up the site, and began building what would be a new Crowdmap, a complete branch off the core platform. I heard this was underway, but didn't get a brief on it until about six weeks in. When I heard what they had planned, and got a complete walk-through by Brian, I was floored. What I was looking at was so different from the original Ushahidi, and thus what we have currently as Crowdmap, that I couldn't align the two in my mind.

My initial reaction was to shut it down. Fortunately, I was in the middle of a random 7-hour drive between L.A. and San Francisco, so I had ample time to think by myself before making any snap judgments. More importantly, it also gave me time to call up David and talk it through with him. Later that week, Juliana, David and I had a chat. It was at that point that we realized that, as founders, we might have blinders of our own. Could we be stuck in our own 2008 paradigm? Should we trust our team to set the vision for a product? Did the product answer the questions that guide us?

The answer was yes.

The team has done an incredible job of thinking deeply about Crowdmap users and translating that usage into a complete redesign that is both beautiful and functional. It's user-centric, as opposed to map-centric, which is the greatest change. But, after getting past our initial sense that it was foreign to us, we are confident that this is what we need to do. We need to experiment and disrupt ourselves -- after all, if we aren't willing to take risks and try new things, then we fall into the same trap as those we disrupted.

A New Ushahidi

For about a year we've been asking ourselves, "If we rebuilt Ushahidi, with all we know now, what would it look like?"

To redesign, re-architect and rebuild any platform is a huge undertaking. Usually this means part of the team is left to maintain and support the older code while the others build the shiny new thing. It means that while you're spending months and months building the new thing, you appear stagnant and less responsive to the market. It means that you might get it wrong, and what you build may be irrelevant by the time it's launched.

Finally, after many months of internal debate, we decided to go down this path. We've started with a battery of interviews with users, volunteer developers, deployers and internal team members. Heather Leson's recent blog post on the design direction we're heading in shows where we're going. Ushahidi v3 is a complete redesign of Ushahidi's core platform, from the first line of code to the last HTML tag. The front end is mobile-web-focused out of the gate, and the backend admin area is about streamlining the publishing and verification process.

At Ushahidi we are still building, theming and using Ushahidi v2.x, and will continue to do so for a long time. This idea of a v3 is just vaporware until we actually decide to build it, but the exercise has already borne fruit because it forces us to ask what the platform might look like if we weren't constrained by the legacy structure we had built. We'd love to get more input from everyone on this as we go forward.

SwiftRiver in Beta

After a couple of fits and starts, SwiftRiver is now being tried out by 500-plus beta testers. It's about 75% of the way to completion, but usable, so it's out and we're gathering feedback from everyone on what needs to be changed, added and removed to make it the tool we all need for managing large amounts of data. It's an expensive, server-intensive platform to run, so those who use it on our servers in the future will have to pay for that use. As always, the core code will be made available, free and open source, for those who would like to set it up and run it on their own.

In Summary

The amount of external and internal change that Ushahidi is undertaking is truly breathtaking to us. We're cognizant of just how much we're putting on the line. However, we know this: in our world of technology, those who don't disrupt themselves will themselves be disrupted. In short, we'd rather go all-in to make this change happen ourselves than be mired in stagnancy and defensive activity.

As always, this doesn't happen in a vacuum for Ushahidi. We've relied on those of you who are the coders and deployers to help us guide the platforms for over four years. Many of you have been a part of one of these product rethinks. If you aren't already, and would like to be, get in touch with me or Heather and help us re-envision and build the future.

Raised in Kenya and Sudan, Erik Hersman is a technologist and blogger who lives in Nairobi. He is a co-founder of Ushahidi, a free and open-source platform for crowdsourcing information and visualizing data. He is the founder of AfriGadget, a multi-author site that showcases stories of African inventions and ingenuity, and an African technology blogger at WhiteAfrican.com. He currently manages Ushahidi's operations and strategy, and is in charge of the iHub, Nairobi's Innovation Hub for the technology community, bringing together entrepreneurs, hackers, designers and the investment community. Erik is a TED Senior Fellow, a PopTech Fellow and speaker and an organizer for Maker Faire Africa. You can find him on Twitter at @WhiteAfrican

This post originally appeared on Ushahidi's blog.


April 20 2012


How Ushahidi Deals With Data Hugging Disorder

At Ushahidi, we have interacted with various organizations around the world. The key thing we remember from reaching out to some NGOs (non-governmental organizations) in Kenya when we began in 2008 is that we faced a lot of resistance, with organizations unwilling to share data, which was often locked in PDFs rather than machine-readable formats.

This was especially problematic as we were crowdsourcing information about the events that happened that year in Kenya. Our partners in other countries have faced similar challenges in gathering relevant and useful data that is locked away in cabinets despite having been paid for by taxpayers. The progress in the Gov 2.0 and open data space around the world has greatly encouraged our team and community.

When you've had to deal with data hugging disorder of NGOs, open data is a welcome antidote and opportunity. Our role at Ushahidi is to provide software to help collect data, and visualize the near real-time information that's relevant for citizens. The following are some thoughts from our team and what I had hoped to share at OGP in Brazil.


Government Data is important

  • Comprehensive - It covers the entire country. A national census, for example, covers the whole population and so has a very large sample, whereas other questionnaires sample far fewer people.
  • Verified - Government data is "clean" data; it has been verified -- for example, the number of schools in a particular region. Crowdsourcing projects done by government can be quite dependable. (Read this example of how Crowdmap was used by the Ministry of Agriculture in Afghanistan to collect commodity prices.)
  • Official - Government data forms the basis of government decision-making and policy. If you want to influence government policy and interventions, your case needs to be based on official data.
  • Expensive - Because it is comprehensive and verified, government data is expensive to collect -- an expense covered by the taxpayer.

Platforms are important

Libraries were built before people could read; libraries drove the demand for literacy. Therefore, it makes sense that data and data platforms exist before citizens have become literate in data. As David Eaves wrote on the Open Knowledge Foundation blog:

It is worth remembering: We didn't build libraries for an already literate citizenry. We built libraries to help citizens become literate. Today we build open data portals not because we have a data or public policy literate citizenry, we build them so that citizens may become literate in data, visualization, coding and public policy.

Some countries, like Kenya, now have the data, and open-source platforms are now available not just in Kenya but worldwide. What are we missing?

Platforms like Ushahidi are like fertile land, and having open data is like having good seeds. (Good data equals very good seeds.) But fertile land and seeds are not much without people working that land. We often say that technology is 10 percent of what needs to go into a deployment project -- the rest is partnership, hard work and, most of all, community. Ordinary citizens can be the farmers of this land; we need to put ordinary citizens at the heart of open government for it to be powerful.

Ushahidi's role

Accessible data: The ownership debate has been settled as we agree government data belongs to the citizens. However, ownership is useless without access. If you own a car that you do not have access to, that car is useless to you. In the same way, if our citizens own data they have no access to, it's useless to them. Ownership is exercised through access. Ushahidi makes data accessible -- our technology "meets you where you are." No new devices are needed to interact with the data.

Digestible data: Is Africa overpopulated? If Africa is overpopulated or risks overpopulation, what intervention should we employ? Some have suggested sterilization. However, the data shows us that the more education a woman has, the fewer children she has. Isn't increasing education opportunities for women a better intervention? It also has numerous additional advantages for a country -- more educated people are usually more economically productive.

Drive demand for relevant data: Governments are frustrated that the data they have released is not being used. Is this because data release is driven mainly by the supply side, not the demand side -- governments release what they want to release, not what is wanted? How do we identify data that will be useful to the grassroots? We can crowdsource demand for data. For example: The National Taxpayer Alliance in Kenya has shown that when communities demand and receive relevant data, they become more engaged and empowered. There are rural communities suing MPs for misusing constituency development funds. They knew the funds were misused because of the availability of relevant data.

Closing the feedback loop: The key to behavioral change lies in feedback loops. These are very powerful, as exemplified by the incredible success of platforms like Facebook, which are dashboards of our social lives and those of our networks. What if we had a dashboard of accountability and transparency for government? How about a way to find out whether the services funded and promised to the public were indeed delivered, and at what service level? For example: The Huduma concept in Kenya showed an early prototype of what such a dashboard could look like. We are working on more ways of using the Ushahidi platform for this specific use case. Partnership announcements will be made in due course.

All this, to what end? Efficiency and change

If we as citizens can point out what is broken, and if governments can be responsive to the various problems raised, we can perhaps see a real reduction in corruption and an improvement in service provision.

Our role at Ushahidi is to make sure there's no lack of technology to address citizens' concerns. Citizens can also be empowered to assist each other if the data is provided in an open way.

Open Data leading to Open Government

It takes the following to bridge open data and open government:

  • Community building - Co-working spaces allow policy makers, developers and civic hackers to congregate, have conversations, and build together. Examples include the iHub in Kenya, Bongo Hive in Zambia, and Code for America meetups in San Francisco, to name a few.
  • Information gathering and sharing - Crowdsourcing plus traditional methods give not only static data but a near real-time view of what's going on on the ground.
  • Infrastructure sharing - Build capacity once, reuse many times -- e.g., Crowdmap.
  • Capacity building - If it works in Africa, it can work anywhere. Developing countries have a particularly timely opportunity to build an ecosystem that is responsive to citizens and to leapfrog by taking open data, adding real-time views and, most of all, acting on that data to change the status quo.
  • Commitment from government - We can learn from Chicago (a city with a history of graft and fraud), where CTO John Tolva and Mayor Rahm Emanuel have been releasing high-value data sets, running hackathons, and putting up performance dashboards. The narrative of Chicago is changing to one of a startup haven! What if we could do that for other cities, with the goal of making smart cities truly smart from the ground up? At the very least, we could surface a real-time view of conditions on the ground -- traffic, energy, environment and other information useful for urban planners and policy makers. Our city master plans need a dose of real-time information so we can build for our future and not for our past.
  • Always including local context and collaboration in the building, implementation and engagement with citizens.

We would love to hear from you about how Ushahidi can continue to partner with you, your organization or community to provide tools for processing data easily and, most importantly, collaboratively.

Daudi Were, programs director for Ushahidi, contributed to this post.

A longer version of this story can be found on Ushahidi's blog.

July 11 2011


How TileMill Improved Ushahidi Maps to Protect Children in Africa

In May I worked with Plan Benin to improve its Violence Against Children (VAC) reporting system. The system uses FrontlineSMS and Ushahidi to collect and visualize reports of violence against children. Ushahidi develops open-source software for information collection, visualization and interactive mapping. While in Benin, I was frustrated by the lack of local data available through Google Maps, Yahoo, and even OpenStreetMap -- the three mapping applications Ushahidi allows administrators to use without customization.

While these mapping services are great for places rich in geographic data, many places -- like Benin and other countries in the developing world -- are poorly represented by the major mapping services. Making matters worse is the fact that even when good data is available, slow and unreliable Internet access turns geolocating incidents and browsing the map into a frustrating, time-consuming challenge for staff and site visitors in-country.

In an effort to create a custom map with more local data, I tested out TileMill, Development Seed's open-source map design studio, with successful results.

An area of northwest Benin shown with Google Maps (left) and a custom map built with TileMill (right). Note the number of towns and villages that appear in the map at right.

With little hands-on experience with map design or GIS (geographic information systems), I was happy to find TileMill's Carto-based code intuitive and easy to use.

Because of the lack of data on Benin available through the major mapping services, I thought it would be interesting to visualize the VAC Benin data on a custom map using geographic data obtained by Plan Benin through CENATEL, the National Centre of Remote Sensing and Forest Cover Observation in Benin. I exported reports of violence from Ushahidi into a CSV file using Ushahidi's built-in export functionality. From there, I used Quantum GIS -- an open-source GIS tool -- to convert the data into GeoJSON, an open standard for data interchange that works very well with TileMill.
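Quantum GIS handled the conversion in this case, but for small datasets the same CSV-to-GeoJSON step can also be scripted directly. A minimal sketch in Python (the column names are assumptions -- check the headers of your own Ushahidi export):

```python
import csv
import io
import json

def csv_to_geojson(csv_text, lat_field="LATITUDE", lon_field="LONGITUDE"):
    """Convert an Ushahidi-style CSV export into a GeoJSON FeatureCollection."""
    features = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        try:
            lon, lat = float(row[lon_field]), float(row[lat_field])
        except (KeyError, ValueError):
            continue  # skip rows without usable coordinates
        props = {k: v for k, v in row.items() if k not in (lat_field, lon_field)}
        features.append({
            "type": "Feature",
            # GeoJSON coordinate order is [longitude, latitude]
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": props,
        })
    return {"type": "FeatureCollection", "features": features}

# Hypothetical two-row export; the second row's coordinates are unusable.
sample = "TITLE,LATITUDE,LONGITUDE\nReport A,10.3,1.4\nReport B,bad,1.0\n"
print(json.dumps(csv_to_geojson(sample), indent=2))
```

The resulting FeatureCollection can then be loaded into TileMill as a layer, just like the output of the Quantum GIS conversion described above.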

I then used TileMill to create a map that includes only the data relevant to Plan Benin's activities on this particular project, which helps users focus on the information they need. The map includes geographic data for Atacora and Couffo, the two "Program Units" where Plan Benin operates. (These are highlighted in light blue on the map.)

I also included labels for the important cities in both Program Units and, if you zoom in several levels, village names in Atacora. The red dots indicate reports of violence, and if you mouse over or click on a dot, you can see a summary of the incident. The reports were geolocated by hand using information sent via text message. The map also incorporates MapBox's open-source World Bright base-layer map, adding country borders, custom labels, population centers (in light yellow/brown tones), and other information to the map.

The Tip of the Iceberg

This is really the tip of the iceberg in terms of what TileMill can do. It would also be possible to add as many cities and villages as there are in the dataset, include multimedia-rich interactivity, use a choropleth scheme to indicate hotspots of violence, cluster reports, and so on.

With just a few design choices, this custom map dramatically improves the experience of interacting with data collected through Ushahidi. Highlighting the Program Units draws the eye to the important areas; using deep datasets and custom map labels solves the problem of missing local data; and the built-in interactivity means that visitors don't need to browse to multiple pages (a killer in low-bandwidth environments) to view information on individual reports.

Compositing, which was just rolled out on TileStream Hosting, helps the map load quickly, even in low-bandwidth environments (the maps are now faster than Google Maps), and this map can also be used offline via either the MapBox Appliance or the MapBox iPad app. Finally, TileStream Hosting makes it easy to host the map and generates embed code so the map can be widely shared.

Take a look at the map below and feel free to click over to the VAC Benin Ushahidi site to see the difference for yourself.

VAC Benin data collected with Ushahidi and visualized with TileMill:

Paul Goodman is a master's student at the UC-Berkeley School of Information and is spending the summer working with Development Seed.

June 25 2011


The problem of real-time verification - SwiftRiver: add context to content

Niemanlab :: One of the biggest challenges news organizations face is the real-time aspect of newsgathering: the massive problem that is making sense of the torrent of information that floods in when breaking-news events take place. How do you process, and then verify, hundreds and often thousands and sometimes millions of information pieces (SMS, photos, tweets and more discrete data points), even before those points transform into something that resembles useful information?

[Jon Gosier] ... it is all about adding context to content

SwiftRiver Dataflow 1 from Ushahidi on Vimeo.


A team at Ushahidi has been working on that problem since early 2010, developing a way to help users manage large quantities of data more efficiently — and to help verify and prioritize information as it flows in from the crowd.

Continue to read Megan Garber, www.niemanlab.org

March 24 2011


Using Technology to Aid Disaster Relief for Japan and Beyond

The March 11 earthquake in Japan triggered a flurry of concern in the Media Lab community at MIT. The natural desire to help was amplified by the fact that the disaster had hit many of our friends close to home in a very literal sense. Most messages suggested donations to support relief organizations -- a worthy cause indeed -- but there was also a more unique reaction: A call for relief technology.

It turns out that the use of digital tools in crisis situations is a concept with rich communities and plenty of solid examples. Within the Media Lab there are a handful of projects designed to assist in crisis situations and outside of the Lab there are hundreds more. Because I just started exploring the area, I want to share my initial observations along with a quick description of the projects I'm most familiar with.

Framing the Challenge

I always thought that social systems were complicated, but I will never complain again. The issues surrounding a crisis tool are almost unreal. Take a minute to think of the types of challenges that a relief technology project needs to consider up front, before even targeting a particular problem. Here are a few examples to get your brain running:

  • Limited technology access -- The primary stakeholders of a relief technology are those living in a disaster area. Maybe the area is a developing country with no digital infrastructure or a more developed country whose digital infrastructure just got annihilated. Maybe people have the Internet, maybe they just have cell phones, or maybe they really have nothing on the ground. Nothing is a given here.
  • Shifting impact windows -- What are the phases of disaster relief, how long do they last, and what changes between them? I like the general structure presented in this document [PDF]. It starts with an immediate response, which focuses on things like saving lives, cleaning up, and tending to basic human needs. This is followed by mid-term planning, which involves getting the basics back such as temporary housing and lifeline utilities. Finally the long term reconstruction phase begins. Each of these phases has drastically different needs, requirements, and timelines.
  • Global-scale requirements -- National disaster relief is a problem that even government-scale organizational structures struggle to deal with. What are the core needs and how can they be met? Who is being impacted? Who is going to participate and how? What are the implications of your solution? What is already being done and how can you fit in effectively? How many different cultures are going to be using your tool? What languages are involved? These are all vital questions which have to be thought through.
  • Lives are at stake -- Behind this entire process is an ultimate fact: These technologies are dealing with matters of life and death. If an organization relies on a tool for some portion of its operations and the tool fails there could be very real and serious consequences.

What exactly do these issues mean from a system design standpoint? How do these concerns end up shaping a project? Hopefully some examples can help illustrate that.

Example 1: Konbit

konbit jpg

Greg Elliott and Aaron Zinman, two students at the MIT Media Lab, noticed a major problem with the way reconstruction efforts were being approached after the 2010 Haiti earthquake. The issue was simple: Haiti lacked the information infrastructure needed to effectively identify local skill sets at scale and hire Haitians to assist in the rebuilding tasks.

For instance, instead of hiring a local plumber or electrician, someone might be flown in from the United States to get the job done. Plumbers, bricklayers, drivers, nurses, and translators all live within the crisis areas, but without a way for foreign NGOs to easily discover them and their skill sets they simply aren't hired. To make the problem more challenging, access to the Internet in Haiti is uncommon, meaning a purely web-based solution would offer no help at all. Additionally, 50% of the country is illiterate, so even SMS-based solutions are not appropriate.

They created Konbit to address this problem. It provides a phone-based interface that allows Haitians to register their skills and life experiences via an automated interview process. The interviews are then translated and categorized, resulting in an online searchable directory that employers and NGOs can use to discover local workers. They have more than 3,000 workers ready to be hired right now and are looking for NGOs and employers who can use their database.

Example 2: Ushahidi

ushahidi map.jpg

While Konbit focuses on the rebuilding phase of a disaster, Ushahidi, a 2008 Knight News Challenge winner, has proven how powerful an information-mapping platform can be in the immediate response to a crisis. Within days of the tsunami in Japan, an instance of the platform was set up to track reports and needs from the ground. This particular map (see above) has an aggregation of reports with labels such as "Wanted!" "Disaster Area," and "Available Service."

As those of you who are familiar with the Ushahidi platform already know, it is brilliant because the open tools are general enough to be easily and quickly adapted for use in new situations. Information transfer based on geography is going to be needed in any crisis situation and many non-crisis situations as well. This helps separate the technology from the context, which means that at the very least a general information flow can be quickly set up almost immediately after disaster strikes.

Example 3: Grassroots Mapping

One of the projects that has come out of the Center for Future Civic Media is a set of tools and techniques for community-driven maps called Grassroots Mapping. The tools allow individuals to create high resolution maps using what boils down to a kite and a camera. Unlike Konbit and Ushahidi, this project is much more focused on the documentation of geographic change.

The project had immediate application after the oil spill in the Gulf of Mexico in 2010, where thousands of miles of coastline were contaminated with oil over a period of several weeks. Satellite imagery gave a sense of the damage, but grassroots maps made it possible to create high resolution maps of the damage, which are now part of the public record for anyone to view and use.

Designing for a Need

I want to wrap up the post with a story about my own miniature attempt to contribute to the world of relief technology. As reactors were beginning to overheat in Japan, I heard someone comment on a desire to better understand what was actually needed and how they could help. That night I threw together a quick Google Maps mash-up with the hope to make it easy for people to help log needs and support organizations on the ground:

schultz map.jpg

The next day I learned about the Ushahidi instance and put my project on temporary hold; a few days of rushed hacking wasn't going to save Japan and I needed some giant shoulders to stand on. One week later, it seems that the original need I was approaching -- making it possible for American donors to understand how and where they could help -- is still not being met. Ushahidi has the information buried inside of its maps but the interface is simply not designed for that purpose and the reports are in Japanese.

I want to tap into Ushahidi by creating a layer which can frame the information for charitable supporters rather than for NGOs and survivors. The goal is a system that turns information into action by helping people with resources understand where needs are being met, who is meeting them, and how they can help. In the meantime, though, I would love to hear about any type of relief technology that you have seen which stood out as successful or unique.

February 25 2011


Ushahidi Takes First Steps in Evaluating Kenya Projects

This post was written by Melissa Tully and Jennifer Chan. It is the first in a series of blog posts documenting a 9-month Ushahidi evaluation project in partnership with the Harvard Humanitarian Initiative and supported by the Knight Foundation. A version of the post below was originally published on the Ushahidi blog.

During the first two weeks of January, we traveled to Nairobi, Kenya, to begin phase one of a 9-month evaluation of Ushahidi's Kenya projects. Ushahidi is a web application created to map the reported incidents of violence during the post-election crisis in Kenya.

As part of a team, Jennifer and I met with individuals and groups who have incorporated the Ushahidi software into their programming, as well as other partners, to better understand how organizations have implemented and used the platform to advance their programming and organizational goals.

This evaluation has multiple purposes. In addition to writing case studies of some interesting and dynamic projects that use the Ushahidi platform (Unsung Peace Heroes, Building Bridges, and Uchaguzi in both Kenya and Tanzania), we plan to document our progress through a series of blog posts and to create practical and interactive tools.

Tracking Progress

These resources can help organizations decide if Ushahidi is right for them through a self-assessment and evaluation process. Implementers can use these resources throughout the entire project period to track their progress and strengthen monitoring and evaluation.

We're in the very early stages of development, but based on discussions with people in Kenya who have used Ushahidi and members of the Ushahidi team and community, we think we're developing some very useful stuff. Currently, we're focusing on the "pre-implementation assessment" and "implementation" resources so that we can get feedback from current and future deployments on these key areas.

We're working closely with the Ushahidi team and others involved in developing the Ushahidi Community page to integrate the case studies and tools into this part of the site and to add to the already existing resources for Ushahidi users.

Another goal is to link to guides, case studies, tips, and tricks -- or anything else out there created by the vast Ushahidi community worldwide -- to better serve the entire user community. Let us know in the comments what you think about our approach and how we might improve it.

January 11 2011


How Mapping, SMS Platforms Saved Lives in Haiti Earthquake

This article was co-authored by Mayur Patel

Tomorrow marks the anniversary of the devastating earthquake that shook Haiti last January, killing more than 230,000 people and leaving several million inhabitants of the small island nation homeless. Though natural disasters are common, the humanitarian response this time was different: New media and communications technologies were used in unprecedented ways to aid the recovery effort.

A report released today by Communicating with Disaster Affected Communities, with support from Internews and funding from the Knight Foundation, takes a critical look at the role of communications in the crisis and recommends ways to improve the effectiveness of utilizing media in future disaster relief efforts. (The Knight Foundation is a major funder for MediaShift and its sister site MediaShift Idea Lab.)

In the weeks after the crisis, Haiti quickly became a real world laboratory for several new applications, such as interactive maps and SMS texting platforms. In the aftermath of the quake, these tools were used for the first time on a large scale to create dialogue between citizens and relief workers, to help guide search-and-rescue teams and find people in need of critical supplies. The report, Lessons from Haiti [PDF download] (co-authored by Anne Nelson and Ivan Sigal, with assistance from Dean Zambrano), recounts the stories of media participants, technologists, humanitarian organizations, Haitian journalists and response teams involved in the relief. Many of these players were first brought together to share their experiences at a roundtable convened by the Knight Foundation and Internews last May.

Notable Innovations

"The most notable innovations to emerge from Haiti were: the translation of crowdsourced data to actionable information; the use of SMS message broadcasting in a crisis; and crowdsourcing of open maps for humanitarian application," according to the report. A dizzying array of new media and information technology groups, Haitian diaspora networks and media development partners were involved in these initiatives (see the infographic below). Although these innovations had varying levels of impact in Haiti, they showcased the potential for use in future crises.


One of the most notable developments was the application of Ushahidi, an online crisis mapping platform that was born only a few years earlier in Kenya. Ushahidi had already been used to map political violence, but it had not yet been used in the context of large-scale natural disasters. When the earthquake struck, an ad hoc coalition quickly took shape, anchored by a group of graduate students at Tufts University in Boston.

The Ushahidi teams, supported by translators from the Haitian diaspora community in the U.S., gathered information from news reports and individuals about the most acute needs on the ground: rescue, food and water, and security, among others. The coordinates were placed on a map and made available to rescue and relief teams.

Soon they were able to include SMS texts in their bank of information. A few days after the quake, Digicel, one of Haiti's leading telecom companies, agreed to offer a free short code (4636) for SMS texts in service of the relief efforts, with the help of InSTEDD, a technology-focused humanitarian organization. The four-digit code enabled cell phone users to send free messages to central information centers about missing persons and emergency needs. SMS messages and direct reports from Haitian citizens began to flow within four days of the quake.

OpenStreetMap, an open community of volunteer mappers, joined the effort to create online maps of Haiti's improvised and unnamed neighborhoods. These maps became the standard reference points: Users included not just information technology platforms such as Ushahidi, but also large providers of humanitarian services, such as the UN Office for the Coordination of Humanitarian Affairs (UNOCHA) and the International Federation of the Red Cross (IFRC).

Not Necessarily a Success Story

However, the CDAC Report cautions against calling the Haitian experience a "new media success story," as some of the approaches -- attempted for the first time -- faltered. The crisis threw together volunteer technology communities and professional humanitarian organizations, without a common language and operating procedures. A lack of coordination and understanding of how to use and integrate the new tools into existing disaster relief structures further complicated efforts on the ground.

In addition, new media efforts did not diminish the importance of traditional media. As in past crises in the developing world, radio continued to be the most effective tool for serving the information needs of the local population. With Haiti's newspapers and television broadcasters knocked out of production for the first few weeks after the quake, radio provided a heroic lifeline. One Haitian station, SignalFM, was able to broadcast continuously throughout the crisis, and worked closely with both international relief organizations and the digital innovators in support of the population. Popular radio host Cedre Paul reached his audience via Twitter as well as on the air.

"We have always known that one of the best ways to communicate with affected population in crises is through radio broadcasts," said Mark Frohardt, vice president of humanitarian programs for Internews, a media development organization. "We found in Haiti that innovative technologies not only had an impact on information delivery on their own, but also greatly enhanced the reach and effectiveness of radio."

Still Work to be Done

For all the welcome innovation, the report makes it clear that digital humanitarian action has a long way to go. One of the big obstacles to the Haiti effort was the lack of pre-existing connections between the large government and international institutions and the new tech activists. Large institutions tend to mean weighty protocol, some of it based on long and bitter experience. The International Committee of the Red Cross (ICRC), for example, has strict rules of confidentiality, which have allowed it to play a uniquely useful role in conflicted and tense situations, while the open source community's hallmarks are spontaneity and transparency.

Nonetheless, the connections among the various sectors advanced in Haiti, stimulated by a common desire to help, and there are many signs that new synapses are emerging. For example, CDAC has made some progress bridging the gaps between the humanitarian and media communities. The report calls for more of this kind of cross-sector collaboration to improve future recovery efforts. Specifically, it recommends that media, new technology developers and humanitarian agencies (both UN and international NGOs) engage in joint preparation and simulation exercises for future emergency responses.

We should not forget that Haiti's crisis is far from over. Many donors have yet to fulfill their commitments for reconstruction funds, and much of the rubble remains. New digital initiatives are still appearing; one promising new effort from MIT is an online labor exchange for Haitians called Konbit.

Disasters will continue to occur. But their damage can be mitigated by relief efforts that are well-planned and executed in concert with the local population. Digital media technologies offer a unique opportunity to advance these goals with the right on-the-ground coordination. As the report notes: Haiti demonstrated "the culmination of a vision and the beginning of the hard work of implementation."

Anne Nelson is an educator, consultant and author in the field of international media strategy. She created and teaches New Media and Development Communications at Columbia's School of International and Public Affairs (SIPA) and teaches an international teleconference course at Bard College. She is a senior consultant on media, education and philanthropy for Anthony Knerr & Associates. She is on Twitter as @anelsona, was a 2005 Guggenheim Fellow, and is a member of the Council on Foreign Relations.



A year later, lessons for media from Haiti earthquake response

Tomorrow marks the one-year anniversary of Haiti’s devastating earthquake, as well as the anniversary of one of the largest humanitarian responses to a natural disaster, with almost $3.8 billion in aid given or pledged. The international response required coordination between the Haitian government, foreign governments, NGOs, and the people directly affected by the disaster, becoming a real-world laboratory for testing new tools — from SMS short messaging to crowdsourced crisis mapping with Ushahidi — according to a new report from the Knight Foundation. And the response’s successes and failures could prove valuable lessons both for responding to the next crisis and for better understanding mass media.

Three innovations in particular, the report says, were put to the test: Broadcasting crisis information with SMS, crowdsourcing data into actionable information, and using open mapping tools to meet humanitarian needs.

The report found that none of them, however, would have been as effective without one very low-tech tool: radio.

Radio still Haiti’s dominant medium

In a country with a literacy rate of just 52 percent, traditional newspapers and Internet access would have been of low value even if the presses and power lines hadn’t been knocked out of commission. Inexpensive, resilient, and nearly universal radio access, however, cut past literacy and economic boundaries, particularly since one station, Signal FM, managed to continue operation throughout the crisis. Signal FM, and other stations as they returned to broadcasting, served as vital information sources, detailing aid and emergency procedures and helping connect survivors with other resources.

“Although much of the attention has been paid to new media technologies, radio was the most effective tool for serving the needs of the public,” the report notes. But while radio provided the first line of communications, new technologies were critical in actually connecting communities with the aid they needed.

For help, text 4636

Despite erratic cellular service (cell towers would often go live for a few hours, followed by hours of silence), SMS text messaging proved an invaluable tool: Even when coverage was down, messages could be queued and then sent when access returned. The Knight report states that the first usage was informal: Local journalists received pleas for help or reports from the ground. But local service provider Digicel, collaborating with non-profit InSTEDD, set up a dedicated, free short code service at the number 4636, which was up and running four days after the quake and allowed Haitians to text in reports and even requests for emergency help — even as the system was used by Thomson Reuters Foundation to broadcast basic shelter, hygiene and security alerts to roughly 26,000 subscribers.

Across the Atlantic, SMS messages were also playing a very important relief role: raising money. SMS was widely used by the Red Cross and other organizations in the United States to spur convenient giving, helping raise $30 million in the 10 days after the earthquake. In fact, more people donated via text messaging (14 percent) than telephone (12 percent) or e-mail (5 percent).

Despite the invaluable data in the flood of wireless bits, though, the mix of languages, formats, and levels of urgency sent in to relief workers also posed serious logistical hurdles.

Crowdsourcing through the diaspora

Critical to parsing through all the data were centers far outside of Haiti, like one group in Boston that helped geolocate emergency texts, information that was then passed along to relief workers on location. Groups of Haitian expatriates helped translate the flood of data from Creole, French, and Spanish into English, passing it along to the most appropriate aid organizations as well as the U.S. Marines, who often served as the basis for search-and-rescue missions.

In Haiti, the report found, the use of crowdsourced emergency information had hit a turning point, helping inform real-time decision-making.

“In most previous efforts, information was collected mostly to understand when, where and why events were occurring. It had been relatively rare for such information to be useful for actual response to a specific problem,” the report states. “In Haiti, by contrast, limited numbers of humanitarian responders attempted to include crowdsourced information to help form their decisions about where to respond, to send search-and-rescue teams, to identify collapsed structures and to deliver resources. While these efforts were not systemic in nature, they were nonetheless groundbreaking.”

Ushahidi mapping provides real-time relief

The report also highlights the use of crowdmapping tool Ushahidi, which volunteers used to map many of the incoming pleas for help to determine trouble “hot spots” and inform rescue operations where they were needed. Haiti.ushahidi.com served as one focal point for cataloging submitted public health, security and other dangers, making it easy to quickly see what problems a particular area was facing and quickly deploy help there.
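The "hot spot" idea described here amounts to binning geolocated reports into coarse cells and ranking the densest ones. The sketch below is a toy illustration of that concept, not Ushahidi's actual algorithm; the 0.01-degree cell size (roughly 1 km) is an arbitrary illustrative choice.

```python
# Toy sketch of "hot spot" detection: bin geolocated reports into coarse
# grid cells and rank cells by how many reports they contain. This is an
# illustration of the concept, not the Ushahidi platform's implementation.
from collections import Counter

def hot_spots(reports, cell=0.01, top=3):
    """reports: iterable of (lat, lon) pairs; returns the `top` densest
    cells as ((cell_lat, cell_lon), report_count) tuples."""
    cells = Counter(
        (round(lat / cell), round(lon / cell)) for lat, lon in reports
    )
    return [((i * cell, j * cell), n) for (i, j), n in cells.most_common(top)]
```

Ranking cells this way lets responders see at a glance which neighborhoods are generating the most distress reports, which is the essence of what the volunteer mappers were doing by hand.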

While the report is careful to note that these efforts were limited, it found that even these early efforts were making a difference. According to a Ushahidi team leader at Tufts, “On the third day, FEMA (Federal Emergency Management Agency) called us to say keep mapping no matter what people say – it’s saving lives.”

The changed role of media in crisis

While the report focuses on primarily local and “humanitarian media,” it notes that journalists often played an important role in not just documenting the damage and recovery, but in connecting the local communities with information when traditional lines of communication were severely disrupted. Trusted on-air radio personalities switched from delivering hit music to health bulletins, while reporters passed along reports of danger and distress. Radio host Cedre Paul, the host of “Radio One Haiti,” sent out “prolific Twitter messages that provided real-time updates to his many followers,” the report noted.

While the blurring of roles between NGOs and the media is by no means new, Haiti served as a powerful reminder of the dual role — both documenting and aiding — that news organizations can serve. Even traditional outlets, like CNN and the New York Times, served in relief capacities by partnering with NGOs and Google to create a unified Haitian people finder.

Unmet potential

While the report focuses on the successes, it recognizes that many barriers still need to be dealt with for the highlighted technologies to have maximum impact, particularly in regard to education and policies within more traditional aid and government organizations, which often have strict privacy rules that conflict with the transparency required by crowdsourced projects like Ushahidi. Still, both sides, the technologists and the aid organizations, have realized the clear gains from cooperation, the report states, and strides are being made toward better integration of these technologies into traditional response plans.

“This process of creating new forms of collaboration between different organizational cultures will not be quick or easy,” the report concludes. “However, the promise of such collaboration is recognized by many of those involved, on both sides of the equation. The question is not whether this process will advance, but how.”

The full Knight report and related materials are available for download.

November 29 2010


The best online projects that monitored Brazil’s 2010 Elections

“Last year, electoral reform opened the door for politics 2.0 by authorizing parties to use social networks to raise campaign donations and participate in streamlined debates”, claims Manuella Ribeiro about the recent Brazilian election that made Dilma Rousseff the new president.

Ribeiro made a compilation of the best online projects that worked on transparency, civic engagement and public policies monitors. Here are my personal favorites:


Eu lembro: “Be a voter with an elephant’s memory. Vote and remember everything that happens to politicians”.

VotenaWeb: “A site where you can approach the decisions of National Congress that directly affect your life. Vote and be heard”. Citizens can compare, with an easy interface, their votes on bills and the votes of politicians. The congressional bills are translated into simple language and you can monitor the voting records of different candidates.

Quanto vale seu candidato?: in English, "How much is your candidate worth?" is a nice piece of data journalism with information about the declared assets of candidates.


Eleitor 2010: developed with Ushahidi to monitor the elections, receive and map complaints about electoral crimes through Twitter, SMS, email and social networks.

Adote um Vereador: encourages citizens to “adopt” a city councilman and blog about their work, keeping an eye on them and their parliamentary activities.

November 09 2010


Overcoming the Challenges of Using Ushahidi in Low Bandwidth Areas

With the increased adoption of Ushahidi around the world, we are finding that one problem (which we anticipated in the very beginning of the initiative) is that of low bandwidth regions. In the early days of testing the platform in Kenya, we found that the map would take ages to load, and so the development team worked very hard to change this. This was of course before the installation of fiber optic links in Kenya, which made connection speeds much better after September 2009.

Our current solution for integrating SMS in areas with low bandwidth (but good wireless service coverage) is to have a FrontlineSMS hub with a compatible mobile phone attached to a computer via USB or even Bluetooth for those who prefer it.

Ushahidi plus FrontlineSMS

That has worked reasonably well, but we are always looking for ways to improve access to maps containing crowdsourced information, particularly in areas with low Internet penetration rates. Recent statistics indicate that mobile networks are now available to 90 percent of the world's population overall, and to 80 percent of the people living in rural areas. This means it's even more important for Ushahidi to be able to collect and then visualize information from mobile phones. It's worth remembering that for many people with mobile phones, their first social network is their address book.

What follows below are several updates on developments to improve the ability for people to use Ushahidi in low bandwidth areas. We welcome everyone in our greater community to try these applications out and provide us with feedback. Let's see if we can continue this process of "real-time sense making," even in rural areas. At the very least, we would like to have the tools well tested and used in various locales.


We have an upcoming version of Ushahidi dubbed "Luanda" that will be released soon; it will have many improvements that will be of interest to deployers around the world.

There are two options for using Ushahidi in low bandwidth regions:

1. Configure the mobile version of the site you build with Ushahidi. You will need the 2.0 build of the platform (caveat: it's a test build). Then add and activate the mobile plug-in from our plug-ins database.

2. Use the offline mapping tab, available as an OS X test build. Dale Zak and Emmanuel Kala are still working on this, but we'd like to invite users to test it out. Caveat: it's a test build and Mac OS X only for now.

Please submit issues/suggestions on the GitHub issue tracker, as this will help us greatly.

Frontline Mapping

The upcoming Frontline Mapping plug-in allows new ways for Ushahidi incident reports to be gathered in the field:

  • SMS-to-Report -- Any incoming text message can be converted into an incident report and synced once Internet access becomes available. For example, a text message that reads "Riots in the streets, several people injured" would be received by Frontline. A person managing the application double-clicks that message and the new incident report dialog is pre-populated with that information, along with the sender's contact info if available.
  • FrontlineForms-to-Report -- The Mapping plug-in can generate a FrontlineForm with all the required Ushahidi fields, and send that Form to any contact with a Java-enabled phone. The incoming FrontlineForm response is automatically converted to an incident report, and can be synced once the Internet becomes available.
  • FrontlineSurveys-to-Report -- The Mapping plug-in can also populate the new FrontlineSurveys plug-in with Ushahidi-specific questions (such as, "What is the incident description?"). You can send a survey to any contact via SMS, which initiates a series of questions; the next question is sent once the previous answer is received.
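The SMS-to-Report conversion above can be sketched in a few lines. This is a hypothetical illustration, not the plug-in's actual code: the field names follow the classic Ushahidi 2.x reporting API, so check them against your own deployment before relying on them.

```python
# Hedged sketch: turning an incoming SMS into an Ushahidi incident report.
# Field names follow the classic Ushahidi 2.x reporting API (an assumption --
# verify against your deployment's API documentation).
from datetime import datetime


def sms_to_report_params(sender: str, text: str, when: datetime,
                         category_id: int = 1) -> dict:
    """Pre-populate an incident report from a raw text message,
    mirroring what the Frontline Mapping plug-in does on double-click."""
    return {
        "task": "report",
        "incident_title": text[:50],  # first words of the SMS become the title
        "incident_description": f"{text}\n\n(from {sender})",
        "incident_date": when.strftime("%m/%d/%Y"),
        "incident_hour": when.strftime("%I"),
        "incident_minute": when.strftime("%M"),
        "incident_ampm": when.strftime("%p").lower(),
        "incident_category": str(category_id),
    }


params = sms_to_report_params(
    "+254700000000",
    "Riots in the streets, several people injured",
    datetime(2010, 11, 9, 14, 30))
print(params["incident_title"])
# Once Internet access returns, the queued report would be POSTed to the
# deployment's /api endpoint.
```

The sync step is then just a matter of queuing these parameter dictionaries until connectivity is available.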

Four demo videos showing the Mapping plug-in in action accompanied the original post.

Note that the FrontlineForms and FrontlineSurveys options require less work for administrators because the data received is structured; however, they may require multiple SMS messages to gather all the information. In times of crisis, a user may only be able to send one text message. Community health care workers, on the other hand, may choose the FrontlineForms or FrontlineSurveys options to submit structured patient information.
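To see why structured submissions save administrators work, consider a hypothetical form-style message whose answers map straight onto report fields, versus a free-text message that must be triaged by hand. The `key: value` format and field names below are invented for illustration, not part of FrontlineSMS:

```python
# Hedged sketch: a structured SMS parses directly into report fields,
# whereas free text would need a human to fill in each field.
def parse_structured_sms(text: str) -> dict:
    """Parse 'key: value' pairs separated by semicolons,
    e.g. 'title: Clinic stockout; location: Kisumu; category: health'."""
    fields = {}
    for part in text.split(";"):
        if ":" in part:
            key, value = part.split(":", 1)
            fields[key.strip().lower()] = value.strip()
    return fields


report = parse_structured_sms(
    "title: Clinic stockout; location: Kisumu; category: health")
print(report)
# {'title': 'Clinic stockout', 'location': 'Kisumu', 'category': 'health'}
```

A message with no recognizable fields parses to an empty dictionary, which is exactly the case that falls back to manual triage.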

Do subscribe to our blog feed or follow us on Twitter for the latest announcements about the continuing evolution of the platform.

October 26 2010


Innovative SMS-Driven News Project Takes Root in Afghanistan

In Afghanistan, a documentary media company and an independent news agency have teamed up to integrate mobile phones and SMS into news reports. From election-day text messages to stories of homemade airplanes, they're demonstrating how a willingness to adapt mobile platforms to the landscape can contribute to a successful intersection of technology and media.

Small World News is a documentary and new media company that provides tools to journalists and citizens around the world to tell stories about their lives. Pajhwok Afghan News is an independent news agency headquartered in Kabul with eight regional bureaus and a nationwide network of reporters delivering stories in Dari, Pashto, and English. Together, the two launched Alive in Afghanistan, a website originally meant to showcase reports from the 2009 election in Afghanistan.


Alive in Afghanistan

Danish Karokhel, director of Pajhwok Afghan News, said social and multimedia platforms are new for many in Afghanistan. So he hired Small World News to help train Pajhwok staff on how to use these tools and equipment in the context of the 2009 elections. 

Unlike other initiatives that bridge mobile technology and journalism, the project did not promote or encourage citizen journalism per se, said Brian Conley, founder of Small World News. Instead, it grew from a rudimentary, informal election observation tool to a broader platform for media dialogue and journalism support for trained reporters.

Alive in Afghanistan launched in 2009 when Small World News set up an SMS reporting system for Pajhwok reporters during the Afghan presidential election that year.

At launch, Alive in Afghanistan received attention for how it posted citizen reports from "ordinary Afghanis" alongside verified reports from Pajhwok reporters. More than 100 reports came in on election day from Twitter, SMS, and directly from Pajhwok reporters. These were mapped using Ushahidi, a platform for map- and time-based visualizations of text reports.

Conley said the site turned out to be one of the only sources of real-time news on election day. But though individuals could submit messages via mobile phone, many could not access the website because reliable Internet access is not widespread in Afghanistan.

"Although, as the founders of the site readily admit, only a minority of Afghanis know how to use the site and have access to it, it's still a great resource for real-time election news from Afghanistan," reported a 2009 story in the Los Angeles Times.

Alive in Afghanistan was intended to be used to report on a looming presidential run-off election. When the run-off was called off in November, the site and functionality could easily have been abandoned by either Small World News or Pajhwok.

Instead, it was retooled and it "turned into more of a journalism-strengthening project for supporting a free and fair media in Afghanistan," Conley said. 

Karokhel, who is also a 2008 recipient of an International Press Freedom Award from the Committee to Protect Journalists, said Pajhwok received user feedback following the launch that suggested news about daily life and daily activities would be an interesting addition.

So they did just that. SMS reports on the site are now sorted into multiple categories, including security, election, governance, construction, sport, health, and innovation. The reports also appear in a special "SMS Updates" section on the Pajhwok website.

On October 19, for example, 15 SMS reports were posted on the Pajhwok site. They ranged from commentary on last month's parliamentary election to a story about sexual performance drugs to a report on a teenager who constructed the country's first homemade plane.


SMS Reports Alert Both Readers and the Media

While many mobile-based journalism projects capitalize on geo-coding technology and hyper-local conversations, Karokhel sees larger, international potential for this initiative. Pajhwok posts SMS reports on stories that are interesting to international readers. This comes, in part, from understanding that the audience of the site goes far beyond Afghanis, many of whom don't have Internet access.

Another important function of the adapted platform is that the SMS reports foster a dialogue for other media outlets and help Pajhwok set the agenda and alert other journalists of breaking news from provinces across the country.

Pajhwok SMS reports are read by 93 radio stations, 25 TV news channels, and 14 daily newspapers. Reports are often picked up by many media outlets, Karokhel said.

In the near future, Karokhel plans to rev up the citizen journalism component of the project to provide SMS reports, in multiple languages, to mobile phones.

Overall, the project is a step forward for both Pajhwok and the media landscape. By taking advantage of training and equipment from Small World News -- and running with it -- "they will be very cutting edge for a news agency anywhere, and not just in Afghanistan," Conley said.


Ushahidi's David Kobia Named Humanitarian of the Year

Ushahidi is a web-based platform that allows anyone to gather distributed data via SMS, email or web and visualize it on a map or timeline. The goal is to create a way of aggregating information from the public for use in crisis response. In 2008, Ushahidi was awarded $25,000 after winning the NetSquared N2Y3 Mashup Challenge.

read more

October 18 2010


Mapping the budget cuts


Richard Pope and Jordan Hatch have been building a very useful site tracking recent budget cuts, building up to this week’s spending review.

Where Are The Cuts? uses the code behind the open source Ushahidi platform (covered previously on OJB by Claire Wardle) to present a map of the UK representing where cuts are being felt. Users can submit their own reports of cuts, or add details to others via a comments box.

It’s early days for the project – currently many of the cuts are to national organisations, with local-level impacts yet to be dug out.

Closely involved is the public expenditure-tracking site Where Does My Money Go? which has compiled a lot of relevant data.

Meanwhile, in Birmingham a couple of my MA Online Journalism students have set up a hyperlocal blog for the 50,000 public sector workers in the region, primarily to report those budget cuts and how they are affecting people. Andy Watt, who – along with Hedy Korbee – is behind the site, has blogged about the preparation for the site’s launch here. It’s a good example of how journalists can react to a major issue with a niche blog. Andy and Hedy will be working with the local newspapers to combine expertise.

October 14 2010


New Media as a Force for Mobilizing Political Change

Does the dramatic uptake of new media tools such as mobile applications, digital media, blogging, social networking and video activism mean that citizens, citizen groups and service organizations have the power to challenge the state and mobilize political change?

This is a question that I'll be pondering along with my fellow participants at the New Media: Alternative Politics Conference at the University of Cambridge. Below are some of my thoughts on the topic, as well as a specific look at the situation in Zimbabwe. After the conference is over, I'll share some of the opinions expressed by key researchers and practitioners in this area.

Digital Media Affecting Change

In a recent article, researchers Adi Kuntsman and Rebecca L. Stein argue that in the Middle East "digital media is becoming a new war zone." Digital media has changed the nature of the Israeli-Palestinian conflict and the Israeli military occupation by offering local populations new tools to "interface with, support, contest, and/or agitate against state policies."

Social media sites are being used, some more successfully than others, to rally interest for online political demonstration, election campaigning and to logistically organize on-the-ground activity. These platforms, such as Facebook, are enabling citizens, groups and politicians alike. Using these types of tools certainly seems to be a quick and simple method for supporters to demonstrate their political allegiance or to air their views. But are members and fans genuine and active supporters? Does the fact that they join a Facebook group imply that they can be relied upon to take action in a material way? Will they turn up to vote, boycott a product or participate in a rally?

One positive example came after a tragedy -- when Khaled Said was beaten to death by the Egyptian police. A Khaled Said Facebook group was launched in his memory, and that group, along with Twitter and YouTube, was central in bringing together Egyptian activists and organizing protests to demand justice for Said.

Websites and blogs have similar power. In just one example, Kubatana.net, with its archive of over 17,000 reports from the NGO sector, is documenting the history of Zimbabwe's political and economic decline over the past nine years. It serves not only as a digital record, but also as an aid to the community's collective memory and a factual reference point for international media. Likewise, the Kubatana blogs site, with over 34 different authors, allows ordinary voices to be heard on a wide array of subject matter. BBC, CNN, Sky and the New York Times have looked to this site for a range of opinions from Zimbabweans.

Mobile Applications

Mobile telephony applications have also been widely used for political mobilization. In early September 2010 in Maputo, Mozambique, food riots were mobilized through the viral spread of text messages. According to a report from Russell Southwood on Pambazuka News, this may have resulted in the government putting pressure on the three local network providers to temporarily ban the SMS function.

In Kenya, SMS was used to incite ethnic violence during the 2008 post-election violence. This period gave rise to Ushahidi, a Kenyan crowdsourcing and mapping initiative and News Challenge grantee, which was developed to monitor and map the election violence in Kenya. (Read more about the project here.)

FrontlineSMS, which is free software for sending and receiving SMS and MMS messages, has been used in many countries, including Pakistan and Zimbabwe, to deploy SMS for election-related logistics and results reporting.

Mobile phones -- through voice, SMS and interactive voice menus -- are increasingly important tools for citizens to receive, validate, gather and offer information. Mobiles offer a large percentage of the population a new means through which to stay informed and share opinions. For instance, mobile phones can be used to rally and organize participation, monitor elections, poll opinion, track human rights abuses, assist with more transparent modes of governance, and report back on government service delivery. They can also be used for numerous other positive social benefits in the health, agriculture, education and emergency response sectors, and in meaningful ways to improve the lives of people at the bottom of the pyramid.

But to what extent do these examples of new media activities translate into meaningful change? Certainly, they facilitate remote participation; but how often does this convert into direct participation and/or action on the ground? We can assess opinion, reflect outrage, inform and inspire recipients, crowdsource information and record events without ever meeting any of the contributors or consumers. Is there a danger that we will mistakenly believe our armchair activists will meet us in the street or at the polls? And how do we effectively measure the impact of new media? This inability to quantify new media's impact could lead to false optimism or pessimism, incorrect presumptions and misaligned reactions.

Zimbabwe's Traditional Media Landscape

In Zimbabwe, in spite of the two-year-old inclusive government, the media continues to suffocate under draconian laws like the Access to Information and Protection of Privacy Act (AIPPA), the Broadcasting Services Act (BSA) and the Public Order and Security Act (POSA).

Television and radio remain firmly in the hands of the old guard, offering biased and highly politicized coverage. Community radio remains an elusive dream, and the government has redoubled its efforts to jam the shortwave radio signals of external broadcasters.

The licensing of five newspapers in May 2010 has not yet materially changed the media landscape, as only one of them, NewsDay, is actually operating. Added to this is the fact that the majority of the country's population lives in rural areas, where they struggle to access newspapers due to cost barriers and limited distribution infrastructure.

So while there has been some token liberalization of the print media, overall Zimbabwe's traditional media landscape continues to be tightly restricted, repressed and controlled. In its stead, new media initiatives are rising like green shoots in the cracks of the concrete, providing citizens with an alternate voice and means to bypass the state's road blocks.

In Zimbabwe access to the Internet is largely limited to the elite. Due to poor infrastructure and high costs, only about 10 percent of the population have access to the Internet. This limits the number of people who can take advantage of the net's abundance of news, social networking, blogs and services such as email. Not surprisingly, the 10 percent of Zimbabweans who have access to the Internet are benefiting from improved communications, information consumption, organizing ability and productivity. But the lack of overall access means that web-based media have limited power to mobilize political change within Zimbabwe.

Mobile Growth in Zimbabwe

While Internet access lags far behind, there is much improved GSM coverage and a rapidly growing mobile phone penetration rate, which according to government data is up to 49 percent from 9 percent in 2008. This sudden growth coincided with the signing of the Global Political Agreement (GPA) in late 2008 and stands in stark contrast to the preceding years, when people had to source SIM cards on the black market. Increased competition has brought down the cost of phone lines, but has had little impact on the cost of local calls and SMS. The lack of cooperation between mobile network operators has also resulted in duplicate mobile phone towers serving the same areas, slowing the roll-out of infrastructure and raising costs for callers.

At US$0.25 a minute, Zimbabwe has one of the highest mobile call tariffs in the world. This partly reflects the government's lack of vision on how mobile communications can be embraced to benefit the nation. The sad result is that these costs and attitudes constrain the potential of mobile communications to increase productivity and improve lives.

The government has been clear about its unease with civic and political initiatives using interactive voice response phone services to share information with mobile phone users. Actions to date have largely been directed at the mobile network operators, whose licenses they threaten to revoke or not renew. However, out-dated legislation and the inevitable convergence of radio, telephony and web technologies make this a losing battle in the long run.

September 30 2010


How to Get Past the Fear of New Technology with Ushahidi

This post was authored by Rebecca Wanjiku

There is something nice about people who attend our Ushahidi 101 sessions in Kenya. They come to us having heard lots of good things about our information-mapping platform, and many have intentions to implement it within their organization or project.

These sessions, which we started earlier this year, attract new users, current users and those who have deployed and have lessons to share. The majority are new users and they come to the iHub in Nairobi in search of answers and a way forward.

For many, the intention is there, but they wonder about when or how to deploy Ushahidi. Many are already collecting very interesting information that they can share with users on the ground and get feedback, but they are not sure of how to go about it. In the end, they often hesitate and never get going. The fear of using new technology seems to take hold.

This has been the case for a variety of people and organizations. Whether it is the United Nations or a community-based organization, the decision on how or when to use our platform seems to be a challenge. I have been assisting users for the last year, and out of more than 30 presentations that have been given about our project and our tool, only five groups have gone ahead and implemented Ushahidi. There is optimism, but not as much action.

For those who don't move ahead with an implementation, I always drop an email to ask why. The most common reason given is a lack of technical expertise or a lack of funds to outsource implementation to an outside party. Others tell me they are still considering the platform.

Just Do It

When people express hesitation about using Ushahidi, or ask when they should get started, my default answer is to tell them to start now. The platform is free, after all. So if you have information to share that would be better presented graphically, then just start putting up the information and get to work; the platform doesn't require high-tech knowledge. You can start with the basics and work your way up.

People often worry that Ushahidi will be too difficult for users who have little or no familiarity with technology. In our experience, people have a way of surprising you. If the information is vital, people will go to any length to share it.

With the growth of fiber optic connections and the availability of affordable connectivity, the next frontier is content. Ushahidi provides an easier way for organizations to upload their content online and to present it graphically.


A good example of an implementation is the Stop Stock-Outs campaign (above), which was created to monitor the supply of the vital drugs that treat TB and malaria, as well as the supply of morphine. The campaign was successful because health organizations in rural areas worked with local communities to report the information. One can argue that people in rural areas do not have a lot of familiarity with technology, but their use of SMS ensured their voices were heard.

All it takes is the initial effort to try and then anything is possible.

September 28 2010


Net2 Recommends - September's Interesting Posts From Around The Web

The NetSquared team reads and shares lots of different blog posts, articles, reports, and surveys within our team. We have a lot of fun sharing within the team and it occurred to us that we should start sharing them with you, too! Net2 Recommends is a monthly series of news and blog posts from around the web that we found interesting or inspiring, mind-bending or opinion-changing, fun or just plain weird.

read more

September 18 2010


What I read today…

September 09 2010


What the BBC learned from using Crowdmap tool to cover tube strikes

On Tuesday, Journalism.co.uk reported that the BBC were using Ushahidi’s new Crowdmap technology to record and illustrate problems on the London Underground caused by the day’s tube strikes.

The BBC’s Claire Wardle has helpfully followed up on her experiences with a post on the College of Journalism website explaining how it went, what they changed and what they would like to do with the technology next time.

She explains the reasoning behind decisions taken throughout the day to amend their use of the platform, such as moving across to Open Street Map as a default mapping tool and the introduction of a time stamp at the start of each headline. She also provides some suggestions on how the platform could be improved in the future, including provisions for greater information outside of the map.
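The time-stamp convention Wardle describes is simple to automate when reports are posted. A minimal sketch, assuming a plain `HH:MM` prefix rather than the BBC's exact scheme:

```python
# Hedged sketch: prefixing each report headline with a time stamp, so map
# readers can tell fresh reports from stale ones. The format is an assumption.
from datetime import datetime


def stamp_headline(headline: str, when: datetime) -> str:
    """Return the headline prefixed with the report's time of posting."""
    return f"{when.strftime('%H:%M')} - {headline}"


print(stamp_headline("Severe delays on the Victoria line",
                     datetime(2010, 9, 7, 8, 15)))
# 08:15 - Severe delays on the Victoria line
```

Baking the time into the headline itself works around the platform's lack of a visible per-report timestamp on the map view.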

It would have been useful if there’d been a scrolling news bar at the top so we could have put out topline information which we knew everyone could see by just going to the map. Something like ‘the Circle Line is suspended’ or ‘the roads are really starting to build with traffic’ was very hard to map.

See the full post here…
