
Open UN Keynote at Social Media Week

Robert Kirkpatrick
Feb 10, 2011



Global Pulse Director, Robert Kirkpatrick.
Photo credit: Sarah Shatz

I have begun to suspect that the 21st Century’s take on the oft-cited Chinese curse is something along the lines of, “May you have to figure out how to keep your organization running halfway through a massive global paradigm shift.”

Good morning. My name is Robert Kirkpatrick. I direct the Global Pulse initiative in the Secretary-General’s Office. I’d like to express my thanks to Social Media Week for giving us the opportunity to hold these discussions here today, to our sponsors the UN Foundation and the Vodafone Foundation for their generous support, to Helsinki Group for their hard work in bringing all the pieces together, to PSFK for their compelling report, to David Brancaccio for agreeing to moderate, and to the talented panelists who will help us explore what it means for institutions such as the UN to engage in the age of real-time. And thank you, all of you, for joining us here today.

I’d like to begin with a bit of background on the journey that brought me here today. I’ve spent the past 15 years designing and developing technologies to help teams work more effectively. For the past decade, I’ve had the privilege of supporting relief and development organizations in their responses to everything from natural disasters and conflicts to food shortages and outbreaks of infectious diseases. I was Chief Technology Officer of an NGO dedicated to building capacity for grassroots innovation, and I’ve led software development for two private-sector innovation teams focused on improving collaboration during emergencies. I’ve done fieldwork in Iraq, Afghanistan, New Orleans, Pakistan, and Cambodia. I’ve seen firsthand what works, what doesn’t, and why, in harsh and demanding environments where both technologies and people perform poorly, where information overload never relents, and where the cost of delay may be measured in human lives.

In humanitarian relief, certainly, where the standard tool for coordination is a two-way radio and every second counts, anything less than real-time is unacceptable. 
The problem with two-way radios, though, is that the communities you are trying to help usually don’t have them.  Affected communities have local knowledge, useful expertise, and time-sensitive information responders need to target assistance, and citizens, too, need information to participate effectively in the response.  Without a common platform, however, they are often left out of the loop in real-time coordination on the ground.  This problem has dogged relief work for years.

The response to the tragic earthquake in Haiti, however, showed us the first glimpse of what might be possible in the years ahead.  To be clear, much about the Haiti response did not go well at all.  Yet by wiring up SMS text messaging to Web-based services, volunteer technologists created a proof-of-concept for disaster coordination in real-time over the first common platform – a hybrid of mobile phones and the Web – that linked government, first responders, affected communities, and volunteers around the world.  

It is now possible to imagine a world in which institutions with a mandate to respond to emergencies engage openly, directly, and in real time, with both affected communities and communities of practice around the world who are eager to help. Governments and institutions such as the UN are now immersed in discussions over what kinds of tools, policies, processes and organizational structures will be needed to enable them to join forces effectively with a self-organizing swarm of volunteers. 
The UN Foundation and the Vodafone Foundation are partnering with the UN Office for the Coordination of Humanitarian Affairs and the Harvard Humanitarian Initiative on an eagerly awaited report on these very issues. The first of our panel discussions today will explore some of the changes we are seeing in how institutions engage with communities in an operational context, and hopefully give us a sense of where things might be heading.

Yet while disaster relief moves into the world of real-time community engagement, global development remains mired in slow time. It took the largest financial crisis in human history for us to realize just how badly we need real-time data. When the crisis hit in 2008, world leaders needed to make policy decisions quickly to protect their most vulnerable populations from shocks, and they needed to base their decisions on hard evidence about the impact the crisis was already having at the household level. But what they found was a massive information gap: the only hard evidence on how these populations were doing was statistics that were 2-5 years out of date.

And there’s the problem. Statistics are essentially a form of historical record keeping.  They simply take too long to collect and validate to be useful in a crisis.  In the next few years, as the statistical evidence from 2008 and 2009 begins to trickle in, we will piece together our first accurate picture of what actually happened to vulnerable populations in many developing countries.  We may learn for the first time, for example, that parents in a hard-hit community in 2008 began pulling their kids out of school to work in the market.  Finding this out in 2011, though, is years too late to prevent harm that will last a lifetime.   Few if any of these children will ever return to school.

But what if world leaders had been able, in 2008, to discover within a matter of weeks that this was happening? There would still have been time for them to take steps to provide assistance to families that would keep those kids in school. It was this tantalizing alternative history that drew me to Global Pulse last year. Global Pulse was created to develop a global system capable of detecting, in real time, the early impacts of slow-moving crises. For me, it represents an extraordinary opportunity to work within a global institution to help bring development into the world of real-time that the disaster relief community takes for granted.

So let’s talk about what it would take to close the gap. So much of what we do today, we do online, whether through computers or mobile phones. We search for information, buy and sell goods, do our banking, coordinate with coworkers and share our experiences with networks of friends, wherever we are, whenever we wish, in real-time. As we do so, we are generating vast quantities of what has come to be called “data exhaust” – the ambient information that is produced – largely for free – as a by-product of our going about our daily lives.

People in developing countries, too, are generating data exhaust. Mobile phone coverage in the developing world is expanding, and governments, agencies, and the private sector are providing communities with phone-based access to services such as banking, online commerce, healthcare, agricultural information, and job hunting. Increasingly, members of the same vulnerable communities Global Pulse seeks to protect are generating data exhaust through their use of these services.

This kind of information is being produced in real-time across the face of the globe, and the cost of generating it is rapidly approaching zero, as is the cost of storing it.  Yet while corporations use this information to better understand their customers, identify emerging markets, and make investment decisions, governments and institutions such as the UN aren’t.  We are practically swimming in real-time data, and it isn’t being used for development.

It is well known that as populations begin to feel the effects of slow-moving crises, their collective behavior changes. At Global Pulse, our working hypothesis is that these changes extend to how affected populations use private sector services through mobile phones, and to how they participate in programs run by governments, agencies and NGOs. You’ve seen Google Flu Trends, the interesting things people are doing with Twitter data, and the other projects described in The Future of Real-Time. We believe it should be possible to develop the capacity to mine the data exhaust being generated by vulnerable populations to detect the characteristic signatures of those changes in collective behavior that first show up in the early days of a slow-onset crisis. In essence, we propose to use community services as human sensor networks for crisis monitoring. Who knows? One day, perhaps, entrepreneurs will even design commercial services with this dual purpose in mind.

Let me give you an example of what we’re thinking about here. There’s a service in Bangladesh called CellBazaar where 20 million people use mobile phones linked to bank accounts to buy and sell everything from cows to sacks of rice to household items. When the crisis hit three years ago, what changed? We’re curious: did people begin to sell items they don’t normally sell, or at lower or higher prices? Were there corresponding anomalies in the data exhaust generated by mobile banking, money transfers, farmers’ hotlines, health hotlines, or other services? If we’d known what to look for in 2008, if we’d been able to detect these early anomalies, teams could have been sent to the communities to investigate, governments could have responded more quickly, and perhaps a lot more kids would still be in school. By analyzing this kind of data from past crises, we hope to learn how to protect populations more effectively in the future.
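To make the idea concrete, here is a minimal sketch of the kind of anomaly detection we have in mind: flag any week whose activity deviates sharply from a trailing baseline. The weekly counts, the "distress-type listings" interpretation, and the window and threshold values are all invented for illustration; a real pipeline would run over anonymized, aggregated service records, not this toy list.

```python
# Illustrative sketch (hypothetical data): flag weeks whose activity
# deviates sharply from a trailing baseline of recent weeks.
from statistics import mean, stdev

def flag_anomalies(weekly_counts, window=8, threshold=3.0):
    """Return indices of weeks whose count lies more than `threshold`
    standard deviations from the mean of the trailing `window` weeks."""
    anomalies = []
    for i in range(window, len(weekly_counts)):
        baseline = weekly_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(weekly_counts[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Hypothetical weekly counts of distress-type listings (e.g. livestock sales):
# a sudden spike appears in week 9.
counts = [20, 22, 19, 21, 20, 23, 18, 21, 22, 64, 70, 21]
print(flag_anomalies(counts))  # → [9]: the week the spike first appears
```

The point of the sketch is not the statistics, which are deliberately crude, but the workflow: an automated filter surfaces a handful of anomalous signals, and human teams then investigate on the ground before deciding whether to respond.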

So that’s our mission. In developing our approach, we realized early on that a purely top-down approach wasn’t going to work. We must harness the forces – and the tools – of self-organization and build this global system from the bottom up. I’m sure the UN isn’t the first organization most of you would traditionally turn to for high-tech innovation. Yet as we see it, the UN in the 21st century is actually in a remarkable position, with its global reach, development expertise, and convening power, to fulfill its charter in a new way. We have massive potential to serve as a catalyst for the grassroots technical innovation needed to make Global Pulse a reality.

The challenges before us, however, are humbling, and we need your help.  This year, as we grapple with how to bring global development into the world of real-time big data, we are actively exploring ways to engage with new partners, in new ways, using new tools.

Private sector data exhaust clearly has a critical role to play in helping us fulfill our mission to protect vulnerable communities, and we need to figure out how to make it possible for corporations to make it available for development. If we can develop a robust framework for using anonymized and aggregated data exhaust in ways that protect both individual privacy and intellectual property, imagine the possibilities! We envision a world in which openly sharing rich sets of real-time data is widely recognized as a high-impact form of corporate social responsibility. Last week in Davos, some of you may have noticed that the phrase “data philanthropy” spontaneously emerged in a conversation with the World Economic Forum. I find it rather catchy, but then, I’m biased. Global Pulse needs data philanthropy to become a reality. To this end, in the coming months, we’ll be working to identify a leadership circle of public and private sector partners to work with us to take this idea forward. We’d love your ideas on who we should be working with.

Emerging technologies, too, have a role to play in Global Pulse.  As we begin developing our technology platform for analysis of real-time data, we’ll be doing so in close collaboration with the open source software community.  Now, open-source developers and UN development experts might not at first glance appear to have much in common.  Yet open source grew out of the hacker movement of the 1970s, and the UN of today still practices a grassroots approach known as participatory development that also appeared in the 1970s. Both movements spring from common ideological foundations:  the belief that when communities are involved in the process of solving their own problems, they gain a sense of ownership, and the solutions that emerge are more likely to be appropriate, effective, and sustainable. 

As we develop the reference architecture for our technology platform, we’ll be hosting, and participating in, bar camps, hack-a-thons, and code sprints around the world to move us to implementation.  Longer term, we’ll be exploring how we might work with our agency partners to establish a sustainable mechanism for harnessing the creative energies of talented technologists around the world with a passion for social impact and a willingness to volunteer their nights and weekends to build better tools for relief, development, peace and human rights.  Again, we’d be grateful for your ideas and suggestions, and if you’d like to get involved in building our platform, please let us know.

So we are experimenting with new approaches, new partnerships, and new tools. Yet as we started to work through the operational implications of the Global Pulse mission last year, we realized that the only way to obtain – and maintain – access to reliable streams of real-time information on the wellbeing of vulnerable populations is to engage directly with networks of individuals on a massive scale. In this fast-changing world, the UN must continue to change – and evolve – as an organization in order to be able to continue to fulfill our mandate.

Information is power, and in the information age, information may increasingly be purchased only with information. As we move into the world of real-time engagement, the emerging culture is one of reciprocity: if institutions want useful information from communities, they must provide them with useful information in return – and quickly. At some point, this exchange becomes an interactive feedback loop, and then blurs into something truly participatory: collaborative problem solving in a virtual team that spans an institution and the communities outside it. Here, organizational boundaries and institutional mandates begin to blur as well.

We are just beginning to come to terms with what will likely be recognized as one of the defining challenges of the decade. The rules of the game are changing in ways that are exciting but also deeply unsettling for institutions. The high-latency way of doing business is a thing of the past. Now is king. And if you want it real-time, you have a responsibility to reciprocate – indeed to interact – in kind. Sustainable access to the kind of real-time information institutions increasingly need to fulfill their mandates requires open engagement, and the technologies they must use are the ones communities themselves embrace: decentralized, social, peer-to-peer tools designed for grassroots empowerment.

Citizens are playing an ever-increasing role in helping institutions fulfill their mandates, but both the roles and the balance of power are shifting. The new technologies powering much of this transformation are profoundly disruptive, not only for the open and direct forms of engagement that they allow, but also because institutions now recognize that citizens can use these tools to hold them accountable in real time as well. These evolutionary pressures – and the fitness landscape that seems to be emerging – will be the topic of our second panel discussion today.

This particular paradigm shift isn’t something to be afraid of, and at any rate, it’s coming whether we like it or not. In fact, it’s already arrived, and it’s changing facts on the ground. For institutions such as the UN to remain relevant in the 21st century, we need to begin adapting to this new reality, and indeed we already are. Today is part of that process. You are part of that process. Thank you for joining us here today. Now let’s get started!
