This guest post is a Q&A with Meredith Blair Pearlman, Evaluation and Learning Manager at the David and Lucile Packard Foundation. It is Part III of a series of posts on a collaboration between UN Global Pulse and the Packard Foundation. The collaboration emphasized the “learning by doing” process of designing a big data project, and tested the value of leveraging new sources of digital data to understand public perceptions.
Can you start by telling readers a little more about this project?
In 2014, the Packard Foundation’s Evaluation and Learning team made a grant to the UN Secretary-General’s Global Pulse initiative to explore two things: (1) how big data might be used to better understand the environments we work in, and (2) how big data might serve as a faster, cheaper way to gather evaluative data. Our primary research question was: Is it possible to develop alternatives to traditional monitoring and evaluation (M&E) methods (e.g., surveys) for gaining evaluative insights into our philanthropic programmatic work?
We selected two projects to dig into: one that focused on understanding sentiment about biofuels as expressed on social media, and a second that explored various ambient data sources to try to understand sentiment around the topic of child marriage. These projects were intended as an experiment to develop a “proof of concept” demonstrating how data-driven, evidence-based solutions can be applied to the real-world challenges of tracking progress and completing evaluations. Our hope was that the findings of this exploration would showcase the basic concepts of applying big data to monitoring, evaluation, and learning—demonstrating how new data sources and analytical techniques can provide insights on the activities of the Foundation.
Reflecting on this experience, how would you define big data?
That’s a good question. What we learned through this project is that the term “big data” can be confusing, because it doesn’t simply refer to working with large data sets. The simplest way to think of big data is as data that we can’t crunch using the traditional M&E tools we have typically relied on. Big data analysis really does require new data sources, such as mobile or social media data, along with new tools and a different analytical background.
What has been the main learning in terms of working on a big data project?
The two projects—and the exploration as a whole—were a constructive challenge for us. That is, prior to commissioning Global Pulse, our Evaluation and Learning team had little experience with, and understanding of, big data analysis techniques and possibilities. Now, we appreciate the complexity of such analysis, the amount of effort and time that it takes, and the importance of carefully scoping suitable projects as well as diligently managing them.
While our primary research question (Is it possible to develop alternatives to traditional M&E methods for gaining evaluative insights into our philanthropic programmatic work?) still hasn’t been answered, this exploration was the first step in a longer journey.
Reflecting on these projects and on not ultimately answering your primary research question, what do you think made this so challenging?
There are a couple of things that made this particularly challenging:
(1) Communicating the value of a not-yet-proven methodology in the social sector to folks who are typically overstretched, especially our program staff, is just hard.
(2) We were exploring new data analysis methods (which takes time) while also trying to build demand by quickly showing value; in the end, it was hard to do both at once.
So…what’s next? Are you planning to support other data innovation projects in the future?
Absolutely. We’re in the early stages of this work, and as connectivity proliferates and big data analysis techniques are refined, these techniques will become part of the Evaluation and Learning team’s toolbox. In the meantime, we have a lot of work to do internally to demonstrate the added value of big data science for M&E and to build a shared understanding of when to employ big data analysis.