Job description
Analytics Engineer (GCP / dbt)
Sector: Telecoms
Location: London or Reading (2 days in office)
Pay Rate: £467.50 - £552.50 (Inside IR35)
Tech Stack: Google Cloud Platform (BigQuery), dbt (Cloud/Core), SQL
The Role: The Bridge Between Data & Decisions
In the world of Telecoms, the data is vast, but the insights need to be surgical. We are looking for an Analytics Engineer who lives exactly where Data Engineering meets Data Analysis.
You won't just be moving data; you'll be the architect of the "Single Source of Truth." Your mission is to take raw, complex telecoms data and transform it into clean, scalable, high-quality models that power our Finance, Planning, and Commercial teams. If you're tired of "black box" engineering and want to see the direct business impact of your code, this is the seat for you.
What You’ll Be Doing
- Architecting the Truth: Design and implement scalable business-layer models, marts, and OBTs (One Big Table models). You'll decide whether a dimensional model or a flat OBT is the right tool for the specific business problem.
- Owning the Lifecycle: You take business requirements and turn them into technical reality. You'll develop, test, document, and deploy via CI/CD, treating analytics code with the same rigour as software engineering.
- Enforcing Excellence: You'll be the gatekeeper for SQL and dbt standards, leading code reviews that focus on logic, consistency, and future maintainability.
- Security & Integrity: Data quality isn't an afterthought: a data-quality issue is treated as a critical defect. You'll implement generic and custom dbt tests while ensuring PII is handled safely (masking, hashing, and access control).
- Deep-Dive Problem Solving: When a pipeline breaks or a number looks "off," you'll trace the lineage from source to BI layer, debugging failures independently until they're resolved.
- Performance & Cost Ops: You'll hunt for refactoring opportunities in older models and use BigQuery tools to optimize query performance and cloud costs.
- Stakeholder Storytelling: You'll translate the technical "why" into plain business language, making sure downstream teams know exactly how a model change will affect their dashboards before you hit deploy.
Your Profile
The Must-Haves:
- Expert SQL: You write complex, performant queries and understand the "under the hood" mechanics of cloud data warehouses (specifically GCP BigQuery).
- dbt Specialist: Strong hands-on experience with dbt Cloud or dbt Core is essential.
- Modelling Pro: You understand dimensional modelling and how to structure data for varied business use cases.
- Git & CI/CD Native: You're comfortable with branching, merging, rebasing, and resolving conflicts in a professional development environment.
- Analytical Detective: You have the "lineage mindset": the ability to trace data flows backward to find and fix the root cause of an issue.
The Nice-to-Haves:
- Experience in Agile environments and a deep understanding of the data development lifecycle.
- Familiarity with BI tools (like Tableau) so you can see the data through the eyes of the end user.
- A background in performance tuning and cost optimization within GCP.
- A "community" mindset, whether that's mentoring others or contributing to internal forums and meetups.