Join Our Team as a Cloud Data Engineer
Are you passionate about working in a dynamic data team with state-of-the-art cloud technology? Do you thrive on building efficient data pipelines and taking data platforms to the next level? If so, the role of Cloud Data Engineer at NPO is perfect for you!
What You'll Do
At NPO, data is central to our public service mission: it informs both how we steer our work and how we account for it. As a Cloud Data Engineer, you will play a pivotal role in the Datahub team, delivering insights into program impact and supporting reporting to the government. You will be the technical expert in our Google Cloud Platform (GCP) environment, responsible for data integrations, database setup, and the design of robust data models.
Your role will involve:
- Designing and implementing new ETL (or ELT) processes as microservices in Golang (see the sketch after this list).
- Developing and enhancing existing data systems to centralize data in Google BigQuery.
- Monitoring and validating data integration processes.
- Designing and developing data models for analysis purposes.
- Optimizing queries, views, and jobs for use in Looker and Power BI.
- Maintaining the structure and dependencies of the data platform in relation to its users.
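
To give a concrete impression of this kind of work, here is a minimal, illustrative sketch of a Go-based ELT step that loads newline-delimited JSON from Cloud Storage into BigQuery using the official cloud.google.com/go/bigquery client. The project, bucket, dataset, and table names are placeholders, not part of NPO's actual setup.

```go
// Illustrative sketch only: one load step of an ELT microservice that
// ingests newline-delimited JSON from Cloud Storage into BigQuery.
// All project, bucket, dataset, and table names below are placeholders.
package main

import (
	"context"
	"log"

	"cloud.google.com/go/bigquery"
)

func main() {
	ctx := context.Background()

	// Create a BigQuery client for the target project (placeholder ID).
	client, err := bigquery.NewClient(ctx, "my-gcp-project")
	if err != nil {
		log.Fatalf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	// Point the loader at a newline-delimited JSON export in Cloud Storage.
	gcsRef := bigquery.NewGCSReference("gs://example-bucket/exports/events.json")
	gcsRef.SourceFormat = bigquery.JSON

	loader := client.Dataset("analytics").Table("events").LoaderFrom(gcsRef)
	loader.WriteDisposition = bigquery.WriteAppend // append to the central table

	// Run the load job and wait for it to finish.
	job, err := loader.Run(ctx)
	if err != nil {
		log.Fatalf("loader.Run: %v", err)
	}
	status, err := job.Wait(ctx)
	if err != nil {
		log.Fatalf("job.Wait: %v", err)
	}
	if err := status.Err(); err != nil {
		log.Fatalf("load job failed: %v", err)
	}
	log.Println("load job completed")
}
```

In practice, a service like this would typically run as a container (for example on Cloud Run) and be triggered by a scheduler or a Pub/Sub message, with monitoring and validation wrapped around the load job.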
Your Profile
- HBO working and thinking level (professional bachelor's degree level).
- Experience with statically typed languages, preferably Golang.
- Experience with scripting languages like Python or JavaScript.
- Hands-on experience with cloud platforms, preferably GCP.
- Experience with deployment and provisioning automation tools such as containers/Docker, Terraform, and CloudFormation.
- Experience in building microservices in a cloud environment.
- Experience with scalable data processing in a big data environment.
- Some DevOps experience in the cloud is a plus.
Why Join Us?
- Work in a team that values collaboration and knowledge sharing.
- Enjoy a flexible work environment with a 36-hour workweek.
- Benefit from a 6% year-end bonus and 200 vacation hours.
- Access to the NPO Academy for continuous learning and development.
About NPO
NPO is committed to making a societal impact through our media offerings. We believe in an inclusive organization that leverages diversity to achieve better results. We invite candidates who can contribute to this diversity to apply.
Application Process
Interested in joining us? We aim for a swift recruitment process. Based on your CV, you may be invited for an interview, followed by a short test. Successful candidates will receive an offer promptly.
NPO is open to everyone. We look forward to your application!
Similar jobs
Cloud Engineer with AWS and Kubernetes
Join NPO as a Cloud Engineer to manage AWS and Kubernetes infrastructure, ensuring high availability and scalability.
Data Platform Engineer
Join RTL Nederland as a Data Platform Engineer to manage cloud platforms and develop data solutions.
Medior Data Engineer
Join DPG Media as a Medior Data Engineer to build and maintain cloud-based data pipelines in Amsterdam.
Data Engineer with GCP and Big Data Experience
Join Nextory as a Data Engineer in Stockholm. Work with GCP, Big Data, and more to enrich lives through reading.
DevOps Engineer with Google Cloud Platform and CI/CD Experience
Join Channable as a DevOps Engineer to enhance cloud infrastructure and CI/CD processes in a dynamic eCommerce environment.
Solutions Engineer with Data Analytics and GCP Experience
Join Databricks as a Solutions Engineer in Amsterdam, leveraging data analytics and GCP expertise to solve complex data challenges.
Cloud Engineer, Data and Analytics
Join Google as a Cloud Engineer in Data and Analytics, focusing on data processing and cloud solutions.
Lead Data Engineer with GCP Expertise
Lead Data Engineer role in Berlin, focusing on GCP, BigQuery, and data pipelines.
Senior GCP Data Engineer (Databricks)
Join Xebia Poland as a Senior GCP Data Engineer, focusing on Databricks, Python, and SQL for cloud-based solutions.
Software Engineer - Cloud Applications and Python
Join Topicus as a Software Engineer in Arnhem to develop cloud applications using Python, REST APIs, and ETL processes for healthcare data services.
Cloud Engineer
Join Tibo Energy as a Cloud Engineer to drive energy transition with cloud architecture skills in a dynamic team.
Remote Data Engineer - Google BigQuery
Join Hostaway as a Remote Data Engineer, leveraging Google BigQuery and Python to optimize data infrastructure and support revenue operations.
Cloud Data Engineer
Seeking a Cloud Data Engineer with expertise in AWS, Python, and CI/CD for a hybrid role in Hannover. Join our dynamic team!
Senior Software Engineer - Java & GCP
Join bol.com as a Senior Software Engineer to develop high-performance Java solutions on Google Cloud Platform.
Senior Data Engineer
Senior Data Engineer role focusing on GCP, data pipelines, and automation in Madrid, hybrid work setup.
Experienced Data Engineer
Join Tropos.io as an Experienced Data Engineer in Antwerp, Belgium. Engage in building high-quality data solutions with a leading IT consulting firm.
Senior Data Engineer at Nederlandse Loterij
Senior Data Engineer needed at Nederlandse Loterij in Rijswijk, focusing on Big Data AI platform development using Azure, Python, Spark.
Senior DevOps Cloud Engineer
Senior DevOps Cloud Engineer in Utrecht, NL. Expertise in AWS, GCP, Azure, CI/CD, Python, Terraform. Hybrid work, competitive salary & benefits.
Senior Data Engineer / Tech Lead with GCP Expertise
Join Bel Group as a Senior Data Engineer / Tech Lead with GCP expertise, leading data solutions and development teams.
Cloud Engineer with DevSecOps Focus
Join Efficy as a Cloud Engineer with DevSecOps focus, managing cloud environments and CI/CD pipelines.
Software Engineer - Python, React.js, GCP
Join Bloom & Wild as a Software Engineer working with Python, React.js, and GCP in a flexible, innovative environment.
DevOps Engineer with AWS and Kubernetes Experience
Join NU.nl as a DevOps Engineer to enhance AWS EKS infrastructure and CI/CD pipelines. Work with Kubernetes, Terraform, and more.
Data Engineer with Apache Spark Experience
Join Mapiq as a Data Engineer to build scalable data pipelines using Apache Spark in a hybrid work environment.
Cloud Engineer
Join our team as a Cloud Engineer in Cologne, Germany. Work with GCP, Azure, and serverless computing in a dynamic, hybrid environment.