Scaling Big Data Applications for Good - Global Pulse at the 2018 UN World Data Forum
UN Global Pulse hosted three sessions at the World Data Forum, including a plenary on how to scale big data applications for the SDGs, gathering more than 1,000 participants.
Representatives from government, UN agencies, the private sector, civil society, academia, and experts in statistics and data innovation gathered last week in Dubai, the United Arab Emirates. The UN World Data Forum, only the second ever held, was established following recommendations in the “A World That Counts” report and is now a biennial event. The level of participation at the forum this year showed a growing interest in big data by public officials and various other stakeholders, reinforcing the recognition that new data sources can accelerate the achievement of the 2030 Agenda.
To advance the conversations started during the first World Data Forum, UN Global Pulse hosted a plenary session titled “Big Data for Sustainable Development: What Does it Take to Get to the Next Level?”
Robert Kirkpatrick, Director of UN Global Pulse, in his opening remarks pointed out that the world has changed with regard to its need for and use of data. “The Member States of the United Nations set ambitious sustainable development goals that do not allow trade-offs between prosperity, the health of the planet, and social progress. We need big data to reach these goals and to transform society. However, this data is largely produced by people, often without their knowledge, collected by machines, and owned by the private sector,” he said.
Kirkpatrick also pointed out that there is a general sense that the ‘data revolution’ has progressed more slowly than experts predicted just a few years ago. In his view, a key part of addressing this is the need to build and rebuild public trust in how data is collected, secured and kept private.
The Pulse Labs introduced examples of tools that were scaled up through national systems. Derval Usher, manager of Pulse Lab Jakarta, explained how Haze Gazer, the crisis analysis and visualization tool the Lab developed to inform official policy on issues around forest and peatland fires, was adopted and installed in the situation room of the President of Indonesia. Building on the tool, the Lab is now refining CycloMon, a cyclone monitoring platform that ingests social media data and combines it with weather data, which can be adapted at the global level.
Martin Mubangizi, Data Scientist at Pulse Lab Kampala, emphasised the importance of scaling up big data projects to achieve the SDGs in Africa. He shared examples from Uganda, where the team created speech-recognition technology for languages not served by any existing programs, starting with Ugandan-accented English, Acholi and Luganda. “We now have a prototype that works — it can listen to the radio streams and flag when a relevant topic comes up. And it is doing so in communities where we traditionally got very sporadic and unreliable data,” he highlighted.
These presentations were followed by an all-female panel discussion and Q&A featuring Paula Hidalgo Sanchis, manager of Pulse Lab Kampala; Heather Savory, Director-General for Data Capability, Office for National Statistics, UK; Jeanine Vos, Head of SDG Accelerator, GSMA; and Elena Alfaro, Global Head of Data & Open Innovation, BBVA.
The panellists discussed the opportunities and pitfalls of scaling big data applications for sustainable development and shared a number of key lessons, echoing the earlier remarks:
- It is important to identify champions within the public sector to facilitate public adoption of innovative tools and methodologies. Political will is key to adoption.
- Capacity building of local government officials is crucial to scaling up innovations and requires more specific training modules on how to contextualise and use new tools and methodologies at national and sub-national levels of government.
- New tools and solutions should always be designed with the user in mind. There is no point in creating a snazzy dashboard using data science wizardry if it will not be used.
- Flexible core funding is an advantage when it comes to experimentation, as are dedicated funding streams for scaling up proven big data methodologies and technologies.
- Moving from proof-of-concept to scaled projects can only be achieved through co-operation and multistakeholder partnerships and by avoiding duplication of efforts.
- Trust building and awareness raising need to take place early on, as both are crucial to overcoming challenges and moving towards operationalising big data applications.
- Responsible use of big data requires data privacy, protection and ethical frameworks for the anonymisation, sharing and usage of the most relevant data sets.
- Regulations must protect against the risk of misuse, but they must also address the real harms of missed use, that is, of leaving crucial data resources untapped in the global fight against famine, plague and war.
- Data innovation should be available to all and its use for achieving sustainable development goals be made a priority.
Kirkpatrick concluded the session by comparing the guiding principle for the future work of data scientists and policy makers to the Hippocratic oath: do no harm. This means both avoiding harmful practices and actively preventing potential future harm.
The World Data Forum came to a close with the adoption of the Dubai Declaration to increase financing for better data and statistics for sustainable development. The Declaration specifically acknowledged the need to leverage the power of new data sources and technologies to achieve the 2030 Agenda for Sustainable Development.