Senior Cloud Engineer - Unit: BTO | Data Platform | 40 h/wk

Reference number: 94183

Publication date: 30-06-2025

Location: ARNHEM

Contact

DC Professionals

info@dcprofessionals.nl

085 020 1022

Duty station: ARNHEM
Duration: ASAP - 31-12-2025
Option to extend: Yes
Respond before: 30-06-2025

Europe's energy transition is among the greatest challenges of our time. Its goal: to make Europe the first climate-neutral continent by 2050 at the latest. The Netherlands and Germany have therefore set themselves ambitious goals for expanding offshore wind energy: by as early as 2030, Germany and the Netherlands want to reach capacities of 30 and 22.2 gigawatts, respectively. Energy from offshore wind farms in the North Sea, the European powerhouse, will play a crucial role in this. TenneT guarantees that this energy reaches the mainland - and in the most environmentally friendly way possible. That is why we have developed the 2GW program with a unique transnational approach, increasing the pace and efficiency of the energy transition.

For TenneT we are looking for three Senior Cloud Engineers.

Unit: The Digital & Data organization at TenneT is focused on driving innovation and leveraging digital technology to enhance data-driven decision-making across the company. As part of this mission, the organization has developed the TenneT Data Cloud (TDC), a modern cloud-based data platform built on Azure. This platform supports a wide range of data integration, processing, and analytics tasks, serving as the foundation for data initiatives across TenneT. Within this structure, DevOps teams play a central role, working closely with stakeholders to deliver high-quality, scalable, and reliable data solutions that meet the evolving needs of the business.

Function: As a Cloud Data Platform Engineer in TenneT’s Digital & Data organization, you will be a crucial member of a DevOps team responsible for designing, implementing, and maintaining the TenneT Data Cloud (TDC) on Azure. Your role involves setting up and managing Azure services like Azure Data Factory, Azure Databricks, and Microsoft Fabric, ensuring seamless integration with various data sources and automating workflows to enhance efficiency. Additionally, you’ll monitor and optimize the performance of the TDC to uphold high standards of availability and reliability, staying current with the latest Azure technologies and best practices to continuously improve the platform.

Tasks and responsibilities:

• Design, develop, and implement scalable data solutions using Microsoft Azure services.
• Manage containerized applications with Azure Kubernetes Service (AKS).
• Build and maintain CI/CD pipelines to support efficient, automated deployment and testing of data engineering workflows.
• Develop and maintain data processing solutions using Python, Java, or other relevant programming languages.
• Ensure effective data storage, ingestion, transformation, and analytics leveraging Azure data services.
• Design, develop, and integrate APIs to facilitate seamless data exchange with external systems.
• Implement automated workflows and system integrations to streamline operations.
• Use Infrastructure as Code (IaC) tools to provision and manage cloud infrastructure on Azure.
• Design, build, test, deploy, and maintain applications with a focus on performance, fault tolerance, observability (logging and monitoring), and reliability.
• Write and maintain unit and integration tests to ensure code quality and reliability.
• Troubleshoot and resolve issues identified through testing or reported by users.
• Continuously identify opportunities to improve existing technical solutions and team practices.
• Actively participate in knowledge sharing, design discussions, and technical reviews within the team.


Profile:

• Bachelor’s in Computer Science, Engineering, or a related field (or equivalent practical experience).
• Extensive experience (min 7 years) with Microsoft Azure services, including but not limited to Azure Kubernetes Service (AKS), Azure Data Lake Storage, Azure Data Factory, and Azure Databricks (must-have).
• Proven track record of designing and deploying scalable, production-grade data pipelines and distributed data processing solutions.
• Strong proficiency in Databricks development, including notebook orchestration, Delta Lake, structured streaming, and performance optimization.
• Deep understanding of CI/CD practices and tools (e.g., Azure DevOps, GitHub Actions, Jenkins) for automating deployment and testing workflows.
• Advanced scripting and development skills in Python, Java, and SQL, with the ability to write clean, testable, and maintainable code.
• Experience provisioning and managing cloud infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Bicep.
• Familiarity with building and integrating RESTful APIs for data access and interaction with external systems.
• Experience with automated workflows, event-driven architectures, and data integration pipelines.
• Solid understanding of data engineering principles including data modeling, ETL/ELT patterns, and data governance.
• Knowledge of big data technologies and frameworks such as Apache Spark, Kafka, and Hadoop.
• Strong analytical and problem-solving skills with the ability to debug and optimize complex systems in production.
• Excellent communication and interpersonal skills; ability to collaborate effectively in agile, cross-functional teams.
• High proficiency in English; Dutch is not mandatory.

Soft skills:

• Team player and communicative
• Proactive
• Open minded and flexible
• Ambitious and driven
• Involved and motivated

Conditions:

• At entry, TenneT performs a pre-employment screening.
• The duty station for this position is officially Arnhem MCE | twice a week in the office (team day on Thursday), the rest hybrid.
• One interview with a panel of 2 or 3 partners | online via Teams.

Additional information:

• Suppliers must be aware of the laws and regulations regarding employment conditions and TenneT’s Collective Labour Agreement. This assignment is placed in scale 8.
• We would like to receive the personal motivation of the candidate and CV in English or Dutch.

Screening:

• Pre-employment screening: if the candidate is selected to start, a pre-employment screening will be executed. We will send you the required documents to be filled in and returned as soon as possible. Your candidate is only allowed to start after the pre-employment screening has been completed successfully. The VOG application is part of the screening; the VOG must be received before the candidate can start.
• All submitted candidates must be in possession of a valid Passport or ID card, which must be taken along to the interview and at the start of the assignment.

Availability:

• Important: the candidate must be available for the entire duration of the assignment.
• By submitting a candidate, you agree to the terms of this specific client. If you are not familiar with these terms, you can request a copy from our recruiters. IMPORTANT: fill in all the candidate's information correctly. When asked for the address, fill in the candidate's address; the client needs to receive this information with the submission. Incomplete or incorrect submissions risk being rejected. Apply now!