Position Summary:
We are seeking a Professional Data Engineer to join our dynamic team, where you will play a crucial role in developing and maintaining robust data solutions.
As a Professional Data Engineer, you will collaborate with data science, business analytics, and product development teams to deploy cutting‑edge techniques and utilise best‑in‑class third‑party products. The Data team operates with engineering precision, prioritising security, privacy, and regulatory compliance in every initiative, and you will contribute to its commitment to the latest tools and methodologies, ensuring that our data solutions align with industry best practices.
Technical Stack:
- Languages: SQL and Python
- Pipeline orchestration: Dagster (legacy: Airflow)
- Data stores: Redshift, ClickHouse
- PaaS: AWS (ECS/EKS, DMS, Kinesis, Glue, Athena, S3, and others)
- ETL: Fivetran, with dbt for transformation
- IaC: Terraform (with Terragrunt)
- GenAI: Bedrock, LangChain, LLMs
Responsibilities:
- Develop and maintain ETL pipelines using SQL and/or Python.
- Use tools like Dagster/Airflow for pipeline orchestration.
- Collaborate with cross‑functional teams to understand and deliver data requirements.
- Ensure a consistent flow of high‑quality data using stream, batch, and CDC processes.
- Use data transformation tools such as dbt to prepare datasets that enable business users to self‑serve.
- Ensure data quality and consistency in all data stores.
- Monitor and troubleshoot data pipelines for performance and reliability.
Requirements:
- 3+ years of experience as a data engineer.
- Proficiency in SQL is a must.
- Experience with modern cloud data warehousing and data lake solutions (Snowflake, BigQuery, Redshift, Azure Synapse).
- Experience with ETL/ELT, batch, streaming data processing pipelines.
- Excellent ability to investigate and troubleshoot data issues, providing fixes and proposing short and long‑term solutions.
- Knowledge of AWS services (S3, DMS, Glue, Athena, etc.).
- Familiar with dbt or other data transformation tools.
- Familiarity with GenAI and how to leverage LLMs to resolve engineering challenges.
- Experience with AWS services and concepts (EC2, ECS, EKS, VPC, IAM, etc.).
- Familiar with Terraform and Terragrunt.
- Experience with Python.
- Experience with orchestration tools (Dagster, Airflow, AWS Step Functions, etc.).
- Experience with pub‑sub, queuing, and streaming frameworks such as AWS Kinesis, Kafka, SQS, SNS.
- Familiar with CI/CD pipelines and automation.
Seniority level: Mid‑Senior level
Employment type: Full‑time
Industries: IT Services and IT Consulting