Senior Analytics Engineer
Company Description
Since launching in Kuwait in 2004, talabat, the leading on‑demand food and Q‑commerce app for everyday deliveries, has been offering convenience and reliability to its customers. talabat's local roots run deep, giving us a real understanding of the needs of the communities we serve in eight countries across the region.
We harness innovative technology and knowledge to simplify everyday life for our customers, optimise operations for our restaurants and local shops, and provide our riders with reliable earning opportunities daily.
Here at talabat, we are building a high‑performance culture through an engaged workforce and growing talent density. We're all about keeping it real and making a difference. Our 6,000+ strong team of talabaty are on an awesome mission to spread positive vibes. We are proud to be a multiple‑time Great Place to Work award winner.
Job Description
As the leading delivery company in the region, we have a great responsibility and opportunity to impact the lives of millions of customers, restaurant partners, and riders. To realise our potential, we need to advance our platform to become much more intelligent in how it understands and serves our users.
Responsibilities
Design, build, and maintain clean, efficient, and scalable data models in SQL and dbt. Optimize query performance and data processing efficiency.
Ensure data is structured to support self‑service analytics and business intelligence.
Work closely with data engineers to define data requirements and enhance ETL pipelines.
Partner with product analysts, data scientists, and business teams to ensure data meets analytical and reporting needs.
Implement and enforce data quality best practices to ensure accuracy and consistency.
Develop and maintain data transformation workflows using dbt, SQL, and cloud‑based data platforms. Automate data validation and reporting processes.
Monitor data integrity and troubleshoot data issues proactively. Advocate for best practices in documentation, testing, and version control.
Qualifications
Bachelor's degree in engineering, computer science, technology, or similar fields. A postgraduate degree is a plus but not required.
Strong proficiency in Python and SQL, and experience with dbt for data modelling.
Experience working with cloud‑based data warehouses (e.g., Snowflake, BigQuery, Redshift) on platforms such as GCP.
Familiarity with version control (Git) and CI/CD practices for data workflows.
Understanding of data engineering principles, including ETL/ELT processes.