About the Client:
My client is an exciting, agile, and forward-thinking healthcare company with ambitious growth plans. Committed to excellence, their aim is to be 'Beyond Better' in everything they do as they build an outstanding healthcare organization. Their innovative model is based on partnering with doctors to build and operate highly specialized healthcare centers, each focused on a single medical specialty such as Endoscopy, Orthopaedics, or Cardiology. Headquartered in London with a growing number of centers in Central London, they are also expanding to new locations in Oxford and Cambridge, with many more exciting projects planned across the UK and worldwide.
About the Role:
My client is seeking a talented Data Engineer to join their Business Intelligence Team. In this crucial role, you will be responsible for designing, building, managing, and optimizing the data infrastructure and pipelines that drive analytics and insights across their business and digital platforms. Your work will be pivotal in ensuring that their data is reliable, accurate, and readily available for analysis.
Your Impact:
- Design, build, and optimize robust and scalable data pipelines to ingest, process, and store data across the business.
- Develop, manage, and maintain ETL/ELT processes, ensuring data quality, integrity, and timely delivery to their data warehouse.
- Implement data modeling solutions within the data warehouse to support efficient querying and analytics for performance monitoring and executive reporting needs.
- Collaborate with BI Analysts and commercial stakeholders to understand data requirements and translate them into effective technical solutions.
- Monitor, troubleshoot, and optimize data pipeline performance, reliability, and cost-effectiveness.
- Implement data quality checks and validation processes throughout the data lifecycle.
- Contribute to the architecture and strategy of their data platform and tooling.
Required Skills:
- Strong SQL proficiency with experience writing complex, optimized queries for data transformation and modeling.
- Proven experience (3+ years) in data engineering, specifically designing, building, and maintaining data pipelines and ETL/ELT processes.
- Proficiency in Python for data engineering tasks (e.g., scripting, automation, data manipulation libraries like Pandas, interacting with APIs).
- Solid understanding of data warehousing concepts (e.g., dimensional modeling, star/snowflake schemas) and database design principles.
- Hands-on experience with dbt (data build tool) or similar data transformation frameworks.
Nice to Have:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related technical field.
- Experience with workflow orchestration tools (e.g., Airflow, Prefect, Dagster).
- Experience ingesting data from diverse sources, including marketing analytics platforms (Google Analytics, HubSpot), APIs, databases, and call tracking systems.
- Familiarity with data visualization tools (e.g., Power BI) from a data provisioning perspective.
- Healthcare or medical industry experience.
- Knowledge of containerization (Docker) and CI/CD practices for data pipelines.
- Experience with data governance and data quality management frameworks.
