Scaling and Development Intern

  • Intern/Fellow
  • New York, USA
  • Applications have closed

Website: UN Global Pulse

Deadline: 11 August 2020

Purpose and Scope of Assignment

Under the guidance and direct supervision of the Data Innovation Strategist, the intern will have a dual focus: project and product coordination, and software development:

  • Assist in the product management of Global Pulse tools, including the coordination of user and feature requests, creating product roadmaps, documenting requirements, and creating scaling plans.
  • Assist in the project management of software development, including setting up, maintaining, and updating project management tools, project planning, task management, and resource allocation.
  • Assist in the development and testing of software tools to extract, process and analyze data sets from a variety of sources.
  • Assist in the development of machine learning pipelines to support the creation of insights in Global Pulse’s software tools.
  • Map functionalities and code bases of UN Global Pulse’s in-house tools and similar open-source efforts, and support the creation of any integration and merging plans.
  • Train users, create help videos and other documentation aimed at increased adoption, and document processes and workflows related to the use and maintenance of big data software platforms.

Qualifications and Experience

Applicants must be currently enrolled in a university degree program (or the equivalent) or have graduated from such studies within three months prior to the internship application date. Areas of study include Computer Science, Software Engineering, Computer Engineering, Electrical Engineering, Telecommunications Engineering, Data Science, and other related disciplines.

  • Knowledge of big data, software engineering, architecture design, and application development is highly desired.
  • Knowledge of project management processes and tools is an asset.
  • Familiarity with deep learning, NLP/NLU, and computer vision (PyTorch, Keras/TensorFlow, scikit-learn, NLTK, spaCy) is an asset.
  • Familiarity with database performance monitoring and optimization (DynamoDB, MongoDB, PostgreSQL, MySQL) is an asset.
  • Knowledge of scripting and coding (Python, Node.js/JavaScript, Java, C++) for data access and analysis is an asset.
  • Familiarity with Unix-like command-line based tools and workflows is an asset.
  • Knowledge of interactive data visualization development and information design, such as working knowledge of D3.js, Leaflet, Mapbox, or similar libraries and tools, is an asset.
  • Familiarity with repositories, wikis, and creating documentation is an asset.
  • Ability to maintain clear and concise documentation, files, and notes.