The Center for Forced Migration Studies (CFMS) at Northwestern University (NU) launched an Ushahidi map to monitor the recent Presidential elections in the Democratic Republic of Congo. Early last November, Dr. Galya Ruffer, Director of the CFMS, who spearheaded efforts on the DRC elections map, requested support from the SBTF for a small team to help with Media Monitoring tasks. About six volunteers were initially handpicked for this side-deployment of the SBTF to monitor media sources and identify relevant information to be reported on the map.
On November 28th, election day in Congo, the Skype chat was set up and work started on the deployment. Liz Casano, student coordinator at Northwestern University, shared the overall plan for SBTF support and the guidelines for media monitoring. Veteran SBTF Media Monitoring coordinators Leesa Astrado and Christina Kraich Rogers helped coordinate tasks and guide all volunteers on the Media Monitoring expectations. For security reasons, it was decided that the SBTF would create reports using the Ushahidi web-form rather than by logging on to the platform. Given this and the small scale of the deployment, it didn’t make sense to have additional teams for Geo-location and Approval. For the DRC map, identified volunteers in the Media Monitoring team also doubled up as “geo-locators”, and approval of reports was handled by NU.
After about two weeks focused on mainstream media reports, the deployment shifted to the incoming Twitter feed on the system. The SBTF volunteers sorted through about 20,000 tweets to identify relevant messages to be reported. Special thanks to Media Monitoring coordinator Estella Reed for her invaluable help in structuring efforts on the Twitter feed!
A few days into the deployment, we realized that we had underestimated the effort required for the map and needed more volunteers. This can be attributed to the following reasons:
1. The initial size of the Media Monitoring team, agreed upon by NU and the SBTF, was more of a rough guess than an estimate based on factors like the extent of media coverage of the elections, the likely frequency of relevant incoming reports, or the minimum number of reports NU required for analysis. Consequently, we had to scale up the team as we gained more clarity on these factors.
2. A lot of the incoming reports were in French, something we had not factored in while building the initial team. At the start of the deployment, there was only one French-speaking SBTF volunteer, which considerably slowed down efforts on monitoring media reports.
3. The deployment was scheduled in December, right before the start of the holiday season, so volunteers were understandably pressed for time. Keeping this in mind, at least 4-5 more volunteers should have been added to the initial team.
4. The Twitter tasks, introduced about two weeks into the deployment, necessitated adding more volunteers to the team.
By mid-December, we had several additional volunteers helping with the map, including quite a few French speakers. Many thanks to SBTF member Joseph Richard Pollack for his help in reaching out to interested alumni at Georgetown University, and to Dr. Colette Mazzucelli, who was instrumental in having a small team from the Montreal Institute for Genocide and Human Rights Studies (MIGS) in Canada participate. Overall, about 40 volunteers, including about 25 SBTF members, helped with the map at different phases of the deployment. While the SBTF mostly wound down efforts on the map around the first week of January, a handful of volunteers are still helping clear the final 100-odd French messages on the Twitter feed.
We’d like to thank Dr. Galya Ruffer and Liz Casano of Northwestern University, who were easy to work with and always around to respond to our queries. Liz Casano’s frequent presence in chat made it easy for volunteers to communicate directly with the deploying organization. (Please see Dr. Ruffer’s post on the Ushahidi blog for a detailed description of this deployment and events on the ground in Congo during the elections.)
Monitoring mainstream media and sorting through 20,000 tweets would not have been possible without our amazingly dedicated volunteers who stepped in to help even during the peak holiday season! A big thank you to the following SBTF volunteers: Virginia Brussa, Eliana Zemmer, Fiona Gideon Achi, Fairuz Alfadia, Ajibola Oseni, Luciana Torrioni, Ellen Kay Endriss, Fiifi Baidoo, Brian Quinn, Michelle Steffens, Lauriane Bisbort, Bruno Pilogret, Raphael Githinji, I. K. Oyekanmi, Neil Catford, Oludotun Babayemi, Kerstin Reisdorf, Patrick Keenan, Ali Bn Kalifah and Thaís Pinheiro.
Following are key observations and lessons learned from this deployment:
Setting expectations with deploying organizations: A clear understanding of the organization’s exact requirements and expectations for the map will help us better estimate effort and mobilize the required number of volunteers. A key point to keep in mind during initial communication is that we need to ask deploying organizations the right questions, in a language they understand and devoid of mapping parlance, in order to accurately understand what they are looking for. Otherwise, we run the risk of grossly over- or under-estimating the effort expected from the SBTF. In some ways, Media Monitoring lays the foundation on which other teams base their efforts, so if the size of the Media Monitoring team exactly fits the needs, all other tasks and teams will automatically fall into place. As mentioned above, we didn’t do this very well in the initial stages of the DRC map when estimating the team size. Instead of simply asking the deploying organization if x number of volunteers would suffice for the effort, which is what we did initially, we should have asked questions of the following nature:
i) How and when will the information sourced from the media be used?
If the organization needs the information right away to be sent to first responders, then we need more media monitors, across multiple time zones, to ensure relevant information gets pulled in as and when it trickles in. If the organization needs it for later analysis, as with the DRC map, media monitors have more flexibility and time to browse through media sources.
ii) How would you describe the extent of media coverage for this event? (In other words, ease of finding relevant information for this map)
a) Very frequent and regular
b) Very sporadic
iii) Is the event likely to be featured on International media as well or only on regional media?
iv) What is the likely primary language for most media reports?
v) Is the event likely to garner a lot of attention/discussion/reportage/on social media sources?
A yes to this question would mean more volunteers are required, as social media reports are more difficult to source than pre-determined mainstream media sources.
vi) How exhaustive would you like the SBTF’s report coverage for the map to be?
a) 100% (report all relevant information)
b) Would suffice if the important/ significant messages are tracked
c) Whatever the team can get done within the duration of the deployment
Hopefully, questions like these will give us a better idea of the effort involved and the level of volunteer participation required.
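Answers to questions like these can feed a simple back-of-envelope estimate of team size. A minimal sketch of that arithmetic follows; the function name and all the figures are hypothetical placeholders, to be replaced with a deploying organization’s actual answers:

```python
# Back-of-envelope estimate of how many media monitors a deployment needs.
# All names and numbers here are hypothetical illustrations, not SBTF policy.

def monitors_needed(reports_per_day, minutes_per_report,
                    minutes_per_volunteer_per_day, language_factor=1.0):
    """Rough head count: total daily workload in minutes divided by what
    one volunteer can contribute per day, scaled up by a factor for
    translation overhead (e.g. French-language sources)."""
    workload = reports_per_day * minutes_per_report * language_factor
    # Ceiling division: always round the head count up, never down.
    return -(-workload // minutes_per_volunteer_per_day)

# Example: ~120 relevant items/day, ~5 minutes each to find and report,
# volunteers averaging 30 minutes/day, 1.5x overhead for French sources.
print(monitors_needed(120, 5, 30, language_factor=1.5))  # 30 monitors
```

Even a crude estimate like this would have flagged early that six volunteers were too few once French-language reports and the Twitter feed were factored in.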
Distribution of tasks: One observation from the DRC map was that people are more likely to participate if tasks are distributed and assigned based on volunteers’ availability. On the Congo map, we noticed that tasks like browsing media sources to identify relevant news items and creating new reports garnered less interest than tasks like deleting irrelevant tweets and RTs. This does not indicate that any one task is more interesting than another, but simply that volunteers whose bandwidth is very restricted, yet who still want to help in some way, are more likely to take on tasks that require very little time (and thought). This was a very important lesson we learnt. Take, for instance, volunteer X, who can spend only about 10 minutes every day on a particular deployment. If he were assigned the task of verifying reports, which can sometimes stretch to 30-40 minutes per report, he’s bound to drop out. Instead, if there are a few less demanding tasks that he can complete within the 10 minutes he has to spare, he’s bound to show up every day! A suggestion for future deployments would be to provide all volunteers with an effort estimate for each task, so that they can choose tasks based on the effort required.
Orientation for volunteers before the start of a deployment: Researching media sources for relevant information during natural disasters is fairly straightforward, in that there’s not much contextual background that volunteers need to have. However, for conflicts and events like election monitoring, some background on the situation helps volunteers work faster and more efficiently. For instance, in an election monitoring map like Congo’s, knowing who the candidates are and having a basic idea of the story leading up to the elections is extremely helpful to a media monitor trying to identify news reports relevant to the map. Not every volunteer who steps in to map will have this knowledge, and it would be very helpful if volunteers were provided at least a basic overview of the context, the various actors involved, etc., right at the start of the deployment. Otherwise, volunteers simply spend too much time researching this information themselves to understand the context. We believe that for future deployments, it would be helpful to request that deploying organizations either conduct a short orientation or provide an overview video or presentation deck that can be circulated to all incoming volunteers to help them understand the story leading up to the map.