The information in this wiki hasn't been maintained for a good while. Some of the projects described have since been deprecated.

In particular, the "Ushahidi Platform v3.x" section contains information that is often misleading. Many details of this version of the Platform have changed since then.

This website is an extraction of the original Ushahidi wiki into a static form. Because of that, functions like logging in, commenting or searching will not work.

For more documentation, please refer to


A Crowdsourcing Toolkit for Journalists

Good crowdsourcing projects are 10% technology, 90% logistics. In 2011, Ushahidi worked with members of the Harvard Humanitarian Initiative (HHI) to develop three toolkits to help our users start to consider what those outreach and deployment plans should look like, available to all at Ushahidi Toolkits. Using their model as a starting point, we'll brainstorm ideas for creating an open toolkit focused on supporting journalists working not just with Ushahidi but with crowdsourced data in general. We'll make the end result available to our community on this page for sharing and further collaboration.

For: Journalists, Writers, Content creators, Project managers

Difficulty: Medium, subject matter experts requested (developers and journalists)

Status: This has not been done yet and would be great to have for the next Data Journalism handbook.

Requirements: None to contribute; an account with to officially record
Directions: See the discussion questions on brainstorming new ideas, available at

Draft Table contents

  • Tool overview
  • Which tool for which purpose?
  • Resources to get you started (links to relevant wiki pages, existing or to be created)
  • User Models
  • Planning your project
  • Verification Techniques
  • Security notes
  • Privacy notes
  • Analytics and Reports
  • Outreach

Questions for the Crowdsourcing Toolkit for Journalists

Our goal:

Last year, Ushahidi worked with Harvard University to create three toolkits for deployments, the first tangible products for our users to have when considering the before, during, and after of their own deployments and how to make them as successful as possible. These toolkits, while helpful, are intentionally general to fit the broadest possible context, country to country and situation to situation.

Ushahidi would now like to work with its community and media professionals around the world to develop a second set of toolkits specific to different types of uses. In this case, a toolkit by journalists and for journalists to help approach crowdsourcing. This toolkit will serve to inform citizen journalists and technologists about standard journalism practices, verification requirements, and what it takes to make sure that what is captured is clear and understandable for larger media sources to review and craft better stories.

Current issues:

Ushahidi, as a company and a platform, has made strides in not only collecting crowdsourced data from mobile and social channels but also curating this data for everyone from emergency personnel and government actors to media sources large and small. However, the data that is published is usually unverified and requires a lot of additional research to be "usable" by journalists. In most cases this effort is not possible under tight deadlines, or better sources are simply available, making these public contributions interesting but largely unusable.


  1. From what you've seen of the Ushahidi platform or from your experience in trying to accurately source social media channels (Twitter, Foursquare, YouTube, etc.), what are the immediate challenges in using this data? Accuracy? Verification? Not enough information provided? What should users and technologists understand from a journalist's perspective?
  2. Data from SMS, Twitter, Facebook, or other social channels usually requires a lot of work -- categorization, location, translation -- before it can be published. What should the priorities be for presenting this data, and can we do a better job collecting it? Would you, as journalists, writers, and content creators, prefer a structured approach?
  3. Verification is a big issue. On average, 90% of data in crowdsourced deployments remains unverified. Can you share your verification process and requirements, and how citizen journalists and technologists can adopt some of these practices?
  4. Open for comments! Around the world, citizen journalists have increased access to mobile technology, new and innovative web technology, and social media -- but not always the knowledge and approach of trained, professional journalists. What advice can you offer these users to focus not just on the collection and curation of data but on the craft of telling a thorough and engaging story?
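As one way of framing the "structured approach" raised in question 2, a crowdsourced report could be modeled as a record that tracks the curation steps (category, location, verification) journalists say they need before publishing. This is a minimal sketch only; the field names, the `Verification` states, and the publishability rule are hypothetical illustrations, not part of Ushahidi's actual data model:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Verification(Enum):
    """Hypothetical verification states for a crowdsourced report."""
    UNVERIFIED = "unverified"
    IN_REVIEW = "in review"
    VERIFIED = "verified"


@dataclass
class Report:
    """A single crowdsourced report with the curation metadata
    (categorization, location, translation) discussed above."""
    source: str                       # channel, e.g. "sms" or "twitter"
    text: str                         # the raw message
    category: Optional[str] = None    # assigned during curation
    location: Optional[str] = None    # geolocated during curation
    language: Optional[str] = None    # filled in if translated
    verification: Verification = Verification.UNVERIFIED

    def is_publishable(self) -> bool:
        # Example rule: a report is only ready for journalists once it
        # has been categorized, located, and verified.
        return (
            self.category is not None
            and self.location is not None
            and self.verification is Verification.VERIFIED
        )


raw = Report(source="sms", text="Flooding reported near the central market")
print(raw.is_publishable())  # prints False: uncategorized and unverified
```

A structure like this makes the backlog visible: filtering reports on `is_publishable()` separates what a newsroom can use immediately from what still needs curation work.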