Job description
Data Engineer (Teradata / Informatica)
Start: ASAP
Duration: 3-9 months
Pay: inside IR35, up to £510 per day
Location: West London, Hounslow area
Join a major data transformation programme that is modernising its data estate—blending robust legacy systems with cutting-edge, cloud-first architecture. This is not a standard lift-and-shift project; it’s a long-term, forward-looking initiative focused on AI-driven decision-making and operational excellence.
Key Responsibilities
- Design and maintain scalable, cloud-based data pipelines.
- Modernise legacy ETL processes using tools such as Airflow, dbt, and AWS Glue.
- Collaborate with architects and engineers to deliver high-quality, production-ready solutions.
- Optimise workflows and ensure data quality, reliability, and documentation.
- Champion best practices and a culture of continuous improvement.
Essential Skills
- Solid data engineering experience in both on-prem and cloud environments.
- Strong hands-on knowledge of Teradata and Informatica.
- Proven experience with SQL, Python, and large-scale data pipelines.
- Familiarity with data warehousing, modelling, and performance tuning.
- Experience using AWS and Agile working methods.
- CI/CD and version control proficiency.
Desirable
- Experience with Snowflake or other cloud-native warehouses.
- Exposure to GraphQL, DataOps, Terraform/CloudFormation.
- Interest in AI/ML integration in data engineering.
- Experience in enterprise-grade or regulated environments.