Build a greenfield data platform from scratch! This logistics company is halfway through a massive digital transformation. Legacy monoliths are being dismantled and replaced with a clean, cloud-native, modular stack - and the new data team is central to making it all talk. You'll be working directly with their Head of Data, building Azure-based pipelines from the ground up, and shaping how the company uses data across 50+ systems. There's real investment, real urgency, and real interest in doing this properly - not endless meetings and PowerPoints.

* Designing, building, and optimising Azure-based data pipelines using Databricks, PySpark, ADF, and Delta Lake
* Implementing a medallion architecture - from raw to curated (a minimal sketch follows this list)
* Collaborating with analysts to make data business-ready
* Applying CI/CD and DevOps best practices (Git, Azure DevOps)
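To give a flavour of what the medallion pattern (bronze → silver → gold) looks like in practice on this stack, here is a minimal PySpark/Delta Lake sketch. It is illustrative only: the storage paths, the `shipments` source, and the `shipment_id`/`carrier` columns are assumptions for the example, not details of the company's actual pipelines.

```python
# Minimal medallion-style flow on Databricks (assumed paths and schema).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw shipment events as-is, preserving source fidelity.
raw = spark.read.json("/mnt/landing/shipments/")  # assumed landing path
raw.write.format("delta").mode("append").save("/mnt/bronze/shipments")

# Silver: cleanse and conform types so downstream joins are reliable.
bronze = spark.read.format("delta").load("/mnt/bronze/shipments")
silver = (
    bronze
    .dropDuplicates(["shipment_id"])                # assumed business key
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("shipment_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/shipments")

# Gold: curated, business-ready aggregate for analysts.
gold = (
    silver
    .groupBy("carrier", F.to_date("event_ts").alias("event_date"))
    .agg(F.count("*").alias("shipment_count"))
)
gold.write.format("delta").mode("overwrite").save("/mnt/gold/daily_shipments")
```

In a real setup each layer would typically be orchestrated by ADF or Databricks Workflows and versioned through Git/Azure DevOps, in line with the CI/CD practices listed above.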