Job description
Data Engineer (SC cleared)
Start: ASAP
Duration: 12 months
Location: Mostly remote - must be able to travel to London or Bristol
Pay: negotiable, INSIDE IR35
Responsibilities:
- Design and implement robust ETL/ELT data pipelines using Apache Airflow (a minimal DAG sketch follows this list)
- Build ingestion processes from internal systems and APIs, using Kafka, Spark, and AWS services
- Develop and maintain data lakes and warehouses (AWS S3, Redshift)
- Ensure data governance using automated testing tools
- Collaborate with DevOps to manage CI/CD pipelines for data deployments and ensure version control of DAGs
- Apply best practices in security and compliance
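
For context, the Airflow work above typically looks something like the minimal sketch below: a two-task extract/load DAG. This is illustrative only; the dag_id, task ids, and helper functions are hypothetical placeholders, not details of this role's actual pipelines.

# Minimal Airflow 2.x DAG sketch (hypothetical names throughout)
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator  # Airflow 2.x import path


def extract():
    # Placeholder: pull records from an internal system or API.
    print("extracting records")


def load():
    # Placeholder: write transformed records to the warehouse.
    print("loading records")


with DAG(
    dag_id="example_etl",             # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # run once per day (Airflow 2.4+ argument name)
    catchup=False,                    # do not backfill missed past runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task         # load runs only after extract succeeds

Keeping DAG files this small and declarative is what makes the version-control and CI/CD responsibilities above straightforward.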
Required Tech Skills:
- Python and SQL for data processing
- Apache Airflow, including writing DAGs and configuring Airflow jobs
- AWS cloud platform and services such as S3 and Redshift
- Familiarity with big data processing using Apache Spark
- Knowledge of data modelling, schema design and partitioning strategies (see the Spark sketch after this list)
- Understanding of batch vs streaming data paradigms
- Docker or Kubernetes (containerisation)
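
As a rough illustration of the Spark, S3, and partitioning skills listed above, the sketch below reads raw events in batch and writes them to S3 as Parquet partitioned by date. Bucket paths, column names, and the app name are hypothetical, and the S3 URI scheme may vary by environment (e.g. s3a:// outside EMR).

# PySpark batch job sketch: date-partitioned Parquet write to S3
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_batch_job").getOrCreate()

# Read raw CSV events from a (hypothetical) landing bucket.
events = spark.read.csv("s3://example-landing/events/", header=True)

# Derive a partition column so downstream queries can prune by date.
events = events.withColumn("event_date", F.to_date("event_timestamp"))

# Write Parquet partitioned by date into the (hypothetical) lake bucket.
(events.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3://example-lake/events/"))

Partitioning by a date column like this is one common strategy; the right partition key depends on the query patterns against the lake.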