Data Engineer
Role Summary
The Data Engineer is responsible for designing, building, implementing, and maintaining data infrastructure and pipelines that enable efficient and secure processing of structured and unstructured data. The role supports Data Analysts, Data Scientists, and other stakeholders by ensuring reliable data availability for analytics, AI, and operational use.
The focus is on end-to-end data lifecycle management, including ingestion, transformation, storage, monitoring, and optimization of data systems.
Key Duties & Responsibilities
The role includes:
- Designing and implementing data pipelines (ETL/ELT) for batch and near real-time processing
- Building and maintaining data infrastructure such as data lakes and data warehouses
- Integrating data from multiple structured and unstructured sources
- Ensuring data integrity, availability, security, and governance across systems
- Supporting development of data science, analytics, and AI applications
- Monitoring and optimizing data pipelines for performance and reliability
- Implementing logging and monitoring solutions for data workflows
- Applying secure data handling practices (encryption, access control, authentication/authorization)
- Working with stakeholders to translate data requirements into technical solutions
- Using modern software engineering practices (version control, CI/CD, testing)
Requirements
Education / Experience
- Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field, plus relevant experience
- OR, exceptionally: 6 years of relevant professional experience in data engineering roles
Technical Skills & Knowledge
- Strong knowledge of data engineering, including:
  - Data warehousing
  - ETL / ELT pipelines
  - Data governance
- Experience with data lakes and batch and near real-time processing systems
- Experience supporting analytics, AI, and data science use cases
- Strong proficiency in Python
- Experience with cloud platforms: AWS, Azure, or Google Cloud
- Knowledge of containerization and orchestration tools:
  - Docker
  - Kubernetes
- Experience with monitoring and logging tools for data pipelines
- Strong understanding of:
  - CI/CD pipelines
  - Version control systems
  - Unit and functional testing
- Solid understanding of data security:
  - Encryption
  - Access control
  - Authentication and authorization mechanisms
Additional Information
- Location: The Hague, Netherlands (100% on-site)
- Duration: 08 June 2026 – 31 December 2026
- Total effort: 836 hours
- Security Clearance: NATO SECRET
- NATO Grade: G15