Job Opportunities: Services

WORK WITH US

“It always seems impossible until it’s done.”
- Nelson Mandela


DATA ENGINEER


Scope of Work:

  • Research and assess industry-leading technical solutions for master data management and provide a technical recommendation on the selected solution's implementation in the UN data management environment. In close collaboration with the data architecture implementation team, the data catalogue development team, and the API management platform deployment team, lead the deployment of the selected enterprise master data management solution for the UN Secretariat.

  • Participate in the establishment of master data management technical standards, guidelines and procedures, including but not limited to defining data domains, data modelling templates, and API development requirements and/or templates.

  • Participate in developing training modules on master data management, metadata management, data quality and access management.

  • Assist the Team Leader in service management, focusing on automation of the IM service process and service request analytics.
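To give a flavour of the record-matching work at the core of master data management, here is a minimal deduplication sketch in Python. It is purely illustrative: the sample records and the similarity threshold are hypothetical, and real MDM platforms such as those listed under Experience apply far richer matching and survivorship rules.

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Normalize a name for matching: lowercase, drop punctuation, collapse spaces."""
    cleaned = "".join(c for c in name.lower() if c.isalnum() or c.isspace())
    return " ".join(cleaned.split())

def is_match(a: str, b: str, threshold: float = 0.85) -> bool:
    """Fuzzy-match two names on a simple similarity ratio (threshold is illustrative)."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

def dedupe(records):
    """Group records whose names match; keep the first of each group as the golden record."""
    golden = []
    for rec in records:
        for g in golden:
            if is_match(rec["name"], g["name"]):
                break  # duplicate of an existing golden record
        else:
            golden.append(rec)
    return golden

records = [
    {"name": "Acme Corp."},
    {"name": "ACME Corp"},   # duplicate of the first after normalization
    {"name": "Globex Inc."},
]
print(len(dedupe(records)))  # → 2
```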


Competencies:

Bachelor's or higher degree in Computer Science, Data Science, Information Science, or a directly related field, or its equivalent in education and/or work experience.


Experience:

  • At least 2 years of experience in information technology with a focus on data architecture, data modelling, data management, data analytics, deployment, configuration and use of master data management tools.

  • At least 1 year of experience working with well-established master data management tools such as Informatica MDM, Profisee, Microsoft MDS, SAP Master Data Governance, TIBCO EBX, Stibo STEP MDM, etc.

  • Ability to write computer code in well-established languages such as R, Python, Java, SQL, C, and C++.

  • Knowledge of statistics, predictive modelling, data science, machine learning, and text analytics.

  • Knowledge of and experience with well-established data visualization tools such as Qlik Sense, Tableau or Microsoft Power BI.

  • Knowledge of and experience with front-end JavaScript frameworks such as React, Angular or Vue.

  • Knowledge and experience with developing User Interface (UI) and User Experience (UX) designs using web technologies (HTML/CSS/JavaScript) and open-source data visualization libraries for interactive chart building.

  • Knowledge of and experience with geo-analytics and visual representation of geographical data.
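As a small illustration of the geo-analytics item above, a common building block is great-circle distance via the haversine formula. This is a stdlib-only sketch; the coordinates (roughly UN Headquarters in New York and the UN Office at Geneva) are only illustrative.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = radians(lat1), radians(lat2)
    dp = radians(lat2 - lat1)
    dl = radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * r * asin(sqrt(a))

# Approximate distance from New York (UNHQ) to Geneva (UNOG), about 6,200 km
print(round(haversine_km(40.7489, -73.9680, 46.2220, 6.1390)))
```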


LIFECYCLE SPECIALIST

Data Governance / Data Management

Scope of Work:

  • Under the guidance of the IMT team leader, and in collaboration with the Enterprise Solutions Service team, enhance the technical tool(s) for monitoring compliance with the data and information lifecycle management technical procedure, and establish an ongoing compliance management mechanism.

  • Coordinating with major stakeholders, explore, research, propose, and implement an operational model for managing data and information lifecycle in the O365 environment, focusing on:

    • enterprise level technical governance and policy configuration in the O365 tenant;

    • template(s) and guidance for entity-level technical configuration and associated procedures;

  • Coordinating with major stakeholders, explore, research, propose, and implement data and information lifecycle management in other enterprise platforms.

  • Participate in developing training modules on data lifecycle management, data quality and access management.
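A minimal sketch of the retention side of data lifecycle management described above. The policy names and retention periods below are hypothetical examples, not actual UN or O365 retention settings.

```python
from datetime import date, timedelta

# Hypothetical retention periods per data domain, in days
RETENTION_DAYS = {
    "correspondence": 365 * 2,
    "financial": 365 * 7,
    "temporary": 90,
}

def disposition_date(created: date, domain: str) -> date:
    """Date on which a record becomes eligible for disposition under its domain's policy."""
    return created + timedelta(days=RETENTION_DAYS[domain])

def is_due_for_disposition(created: date, domain: str, today: date) -> bool:
    """True once the record's retention period has elapsed."""
    return today >= disposition_date(created, domain)

# A 'temporary' record created on 1 Jan 2020 becomes disposable after 90 days
print(is_due_for_disposition(date(2020, 1, 1), "temporary", date(2020, 6, 1)))  # → True
```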



Competencies:

Bachelor's or higher degree in Computer Science, Data Science, Information Science, or a directly related field, or its equivalent in education and/or work experience.


Experience:

  • At least 2 years of experience in information technology with a focus on data architecture, data modelling, data management, data analytics, deployment, configuration and use of master data management tools.

  • At least 1 year of experience working with well-established master data management tools such as Informatica MDM, Profisee, Microsoft MDS, SAP Master Data Governance, TIBCO EBX, Stibo STEP MDM, etc.

  • Ability to write computer code in well-established languages such as R, Python, Java, SQL, C, and C++.

  • Knowledge of statistics, predictive modelling, data science, machine learning, and text analytics.

  • Knowledge of and experience with well-established data visualization tools such as Qlik Sense, Tableau or Microsoft Power BI.

  • Knowledge of and experience with front-end JavaScript frameworks such as React, Angular or Vue.

  • Knowledge and experience with developing User Interface (UI) and User Experience (UX) designs using web technologies (HTML/CSS/JavaScript) and open-source data visualization libraries for interactive chart building.

  • Knowledge of and experience with geo-analytics and visual representation of geographical data.


AWS DEVOPS

Remote

Location and Modality:

Offsite (available at least partly at New York business hours).

9-month engagement with possibility of extension.


The applicant should be an experienced DevOps specialist with a history of working in an agile environment. Ideally, the applicant should have AWS experience with build and release processes, continuous integration (CI), and continuous delivery/deployment (CD) for the automated deployment of microservices components using Docker containers, AWS services and technologies (Cognito, IAM, ECS, KMS, Secrets Manager, Lambda, Elasticsearch, SES, SNS, SQS, EC2, S3 and X-Ray), and GitLab for version control, to implement infrastructure as code for the e-deleGATE suite of applications.


Tasks and deliverables:

  • Work with an agile team to implement process automation for MEAN platform apps

  • Implement the CI/CD pipelines using GitLab and AWS services

  • Automate code builds and deployments, incorporating automated QA gates

  • Implement process tracking and monitor application performance in the AWS cloud environment

  • Build automated DEV, QA and PROD deployment pipelines that match industry standards

  • Continuously monitor and record configuration changes of AWS resources.
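The staged CI/CD flow behind the tasks above can be sketched as a toy pipeline runner in Python. In practice the pipeline would be defined in GitLab CI configuration and call AWS services; the stage functions here are hypothetical stand-ins that only illustrate the gating logic (a failed QA gate blocks every downstream deployment).

```python
def build() -> bool:
    """Stand-in for building and tagging a Docker image."""
    print("building image")
    return True

def run_qa_gate() -> bool:
    """Stand-in for the automated test suite acting as a QA gate."""
    print("running automated tests")
    return True

def deploy(env: str) -> bool:
    """Stand-in for deploying the built image to one environment."""
    print(f"deploying to {env}")
    return True

def pipeline(envs=("DEV", "QA", "PROD")) -> str:
    """Run stages in order; any failure stops the pipeline before later stages."""
    if not build():
        return "build failed"
    if not run_qa_gate():
        return "blocked by QA gate"
    for env in envs:
        if not deploy(env):
            return f"deployment to {env} failed"
    return "success"

print(pipeline())  # → success
```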


Required skills and experiences:

  • At least 3 years of hands-on experience supporting application build and deployment in an AWS Cloud environment

  • Experience with and deep understanding of core DevOps practices (configuration management, continuous integration, infrastructure as code, continuous delivery, continuous deployment, continuous monitoring)

  • Experience with and deep understanding of DevOps integration with automated testing and/or automated deployment verification

  • Experience and expertise with scripting languages, tools and databases such as Python, Terraform, MongoDB and Redis

  • Knowledge of/experience with AWS services – EC2, ECS, ELB/ALB, Security Groups, Route53, IAM, KMS, Cognito, Secrets Manager, Parameter Store, Lambda, SES, SNS, SQS

  • Familiarity with branching strategies, the pull-request workflow, Git commands and Bash

  • Knowledge of/experience with microservices architecture and Docker container implementation.


English language proficiency (oral and written)

 

Academic qualifications:

Bachelor's degree; Master's degree is a plus.

DevOps certifications from industry-leading institutions are a plus.


AZURE DATA ENGINEER

Remote

Key Tasks:

  • Design, build and maintain data integration solutions and data management architecture.

  • Build infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.

  • Collaborate with Giga application teams in building data pipelines and a solution architecture that meets the required data processing needs, with the objective and vision of creating a truly open-source model for distributing the solution.

  • Be responsible for implementing and ultimately expanding Giga's MVP system components so that subsystems function according to their solution architecture definitions.

  • Under the guidance of database administrators, build and implement high availability (HA) and disaster recovery (DR) for enterprise database systems using failover clustered instances, availability groups, multi-site clusters and replication.

  • Assist in housekeeping activities of database systems including building data pipelines and tools to reduce barriers, decrease manual testing and speed up delivery of data science as well as customer-facing products such as https://projectconnect.unicef.org/about.

  • Follow established environmental change control processes for promoting changes from development to production, providing the Giga team with guidance to ensure code can be scaled, standardized and, whenever possible, automated.

  • Own issue resolution, root-cause (Tier 3) analysis and problem solving.

  • Document processes and procedures for completed solutions to ensure continuity. Provide recommendations on near- and long-term plans to improve efficiency and performance across Giga data science and product.
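The extraction, transformation and loading tasks above can be sketched end to end with the standard library alone. The schema and sample rows below are illustrative only, not Giga's actual data model.

```python
import csv
import io
import sqlite3

# Extract: a small CSV source (in practice this would come from an external system)
raw = io.StringIO("school_id,country,connected\n1,Kenya,yes\n2,Kenya,no\n3,Brazil,yes\n")
rows = list(csv.DictReader(raw))

# Transform: normalize the 'connected' flag into an integer
for r in rows:
    r["connected"] = 1 if r["connected"].lower() == "yes" else 0

# Load: write into a relational store and query it
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE schools (school_id INTEGER, country TEXT, connected INTEGER)")
conn.executemany("INSERT INTO schools VALUES (:school_id, :country, :connected)", rows)

connected_count = conn.execute(
    "SELECT COUNT(*) FROM schools WHERE connected = 1"
).fetchone()[0]
print(connected_count)  # → 2
```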


The minimum qualifications required are:

  • University degree in Computer Science, Information systems management or related field.

  • 5+ years of hands-on experience in software development, data engineering or data science, developing and deploying integration solutions, with a concrete understanding of core Microsoft Azure Integration Services to connect a mixture of on-premises, SaaS, and cloud-hosted applications. Experience supporting database and application architecture for deployment in hybrid cloud environments is a plus.

  • 2+ years of experience in Python programming and developing solutions using Apache Airflow, as well as serverless, low-code Azure Integration offerings including Azure Logic Apps, Azure Data Factory, and Azure Functions.

  • 3+ years' experience with SDLC/Agile methodologies and release management

  • Experience configuring MySQL, MongoDB, PostgreSQL and/or Microsoft SQL database systems.

  • Proficient in SQL and NoSQL Database systems.

  • CI/CD process leveraging ARM Templates and Azure DevOps, including git, DevOps pipelines and migrations from development to production environments.

  • Deep understanding of large-scale, distributed systems.

  • Experience with Big Data technologies (Hadoop, Spark, etc.).

  • Good knowledge of data formats, as well as structured and unstructured storage, and orchestration tools on Docker Swarm or Kubernetes.

  • Strong understanding of open API standards, particularly REST, SOAP and GraphQL, as well as the ability to research, develop, deploy and integrate with custom internal and third-party APIs. Knowledge of JSON and XML is a plus.

  • Experience and comfort working and collaborating with highly cross-functional teams. Must have excellent communication skills, both written and verbal.

  • Language Requirements: Fluency in English is required. Additional UN languages will be considered an asset.
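As a small illustration of the API-integration skills listed above, the same record can be parsed from JSON and XML payloads with the standard library alone. Both payloads are hypothetical examples of what a REST or SOAP-style API might return.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical payloads from a REST (JSON) and a SOAP-style (XML) API
json_payload = '{"school": {"id": 1, "country": "Kenya"}}'
xml_payload = "<school><id>1</id><country>Kenya</country></school>"

# JSON: loads() yields nested dicts directly
record = json.loads(json_payload)["school"]

# XML: walk the element tree and coerce types explicitly
root = ET.fromstring(xml_payload)
xml_record = {"id": int(root.findtext("id")), "country": root.findtext("country")}

print(record == xml_record)  # → True
```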


O365 POWER APPS SPECIALIST

Remote

Location and Modality: Offsite (available at least half a day during New York business hours). 12-month contractor engagement with possibility of extension.
 
The contractor position is located in the Business Analysis Section (BAS) of the Department for General Assembly and Conference Management. The contractor will report to a Team Lead in BAS. 
 
The applicant should be an experienced O365 Specialist who can work with business users to gather their requirements and then architect, design and develop low code applications that utilize and extend the Microsoft SharePoint/O365 platform for document management and office workflow applications.   Coordinated by a Team Lead, the specialist is expected to work mostly independently with stakeholders on the required solutions. 
 
Tasks and deliverables:

  • Work with business users in the department to gather and document their needs for small to medium-sized office applications supporting basic document management and office workflows in the department.

  • Create low-code applications using O365, MS Power Apps and MS Power Automate.

  • Guide end users in testing the applications and refine them in an agile manner with a focus on business value.

  • For larger applications, work with the QA team on the creation of automated tests.

  • Work with other developers in the section on integrations with enterprise solutions as needed.

  • Maintain knowledgebase documentation in SharePoint.
 
Quality Assurance:

  • All tasks will be assigned in a Kanban-type process by the Team Lead, with progress being tracked in weekly meetings (morning hours, EST time zone).

  • Tasks will be considered completed after sign-off by the Business Owner or Technical Lead.
 
Required skills and experiences:

  • 3+ years of experience in design and development of Power Apps/Power Automate/O365 solutions

  • Experience in business analysis and visualizing/documenting user requirements

  • Experience developing complex custom workflows on O365 with Power Apps and Power Automate

  • Experience in SharePoint site page customizations with JSON and CSS

  • Experience with Power BI

  • Experience with Azure integrations

  • Very good interpersonal skills to work with business users and distributed developer teams in an agile environment

  • Experience with test automation using Selenium would be a plus
 
English language proficiency (oral and written) 
 
Qualifications: Bachelor's degree; Master's degree is a plus. Relevant Microsoft certifications desirable.
