6-month Fixed-Term Contract
Our client is seeking an experienced Data Engineer with strong Databricks and Python skills on AWS. The role involves developing, supporting, and maintaining cloud-based data solutions with a focus on reliability, technical excellence, and timely delivery.
Design, develop, and maintain data pipelines, ETL processes, and Lakehouse environments.
Build and optimize SQL databases, schemas, tables, and stored procedures.
Collaborate with business and technical teams to gather and implement data requirements.
Support data integration, transformation, and migration between systems.
Develop Databricks notebooks, ETL workflows, and job orchestration.
Contribute to reporting and visualization initiatives using BI tools.
Participate in Agile/SCRUM processes, version control, and release management.
Communicate technical issues and solutions clearly to stakeholders.
5-7 years' experience in Data Warehousing / BI development and data modeling.
3-5 years' experience with Databricks on AWS and related cloud data solutions.
Strong proficiency in Python, SQL, and relational databases (Postgres, Oracle, SQL Server).
Experience with REST/SOAP/JSON integrations and version control (e.g., Git).
Excellent communication, analytical, and problem-solving skills.
Bachelor's degree in Computer Science or a related field.
Background in financial services or similar data-driven industries.
Familiarity with ML/AI concepts.
Experience with Agile methods and data visualization tools.
