WFS Senior Data Ops Engineer
Data Engineer, Cloud Computing
Woolworths Financial Services | Full time
Cape Town, South Africa | Posted on 09/03/2025
Woolworths Financial Services, better known as WFS, is a joint venture with Absa Bank that supports the Woolworths retail business by providing in-store credit in the form of the Woolworths Store Card, and by offering value-added products such as credit cards, personal loans, short-term insurance, and life insurance linked to other products.
Job Description

Main Purpose
As a Senior Data Ops Engineer, you will be responsible for designing, automating, and maintaining robust data pipelines and infrastructure that enable continuous data integration, delivery, and observability in a dynamic cloud environment. You will work alongside data engineers, architects, analysts, and platform teams to ensure data flows securely, efficiently, and reliably from source to destination.
Key Responsibilities
- Design, build, and maintain CI/CD pipelines for data services and workflows.
- Automate data pipeline orchestration using tools like Apache Airflow, AWS Step Functions, or Prefect (a minimal Airflow sketch follows this list).
- Ensure data quality, testing, and monitoring are integrated into pipelines using tools like dbt, Great Expectations, or similar.
- Collaborate with engineering and analytics teams to promote data infrastructure as code using Terraform or AWS CloudFormation.
- Implement logging, monitoring, and alerting for data operations using services like Amazon CloudWatch, Datadog, or Prometheus.
- Support deployment and version control of data models, transformations, and schemas.
- Drive adoption of DevOps and Agile best practices in data projects.
- Lead initiatives to improve data pipeline performance, cost optimization, and scalability.
- Troubleshoot and resolve pipeline failures or data integrity issues in a production environment.
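For a sense of what this orchestration work looks like in practice, below is a minimal, hypothetical Apache Airflow sketch (Airflow 2.x syntax; the DAG name, task names, and steps are invented placeholders, not WFS systems). It chains an extract step, a data-quality gate, and a load step so that failed checks block the downstream load.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder extract step: in practice this might read from S3, RDS, or an API.
    print("extracting orders from the source system")


def validate_orders():
    # Placeholder data-quality gate: a real pipeline might run dbt tests or
    # Great Expectations suites here and raise on any violation.
    print("running data-quality checks on extracted orders")


def load_orders():
    # Placeholder load step: e.g. COPY into Redshift or write curated data to S3.
    print("loading validated orders into the warehouse")


with DAG(
    dag_id="orders_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older releases use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    validate = PythonOperator(task_id="validate_orders", python_callable=validate_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)

    # The quality gate sits between extract and load, so bad data never lands.
    extract >> validate >> load
```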
Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 6+ years in data engineering, DevOps, or DataOps roles.
- Hands-on experience with AWS data services such as S3, Glue, Lambda, Redshift, RDS, Athena, and Step Functions.
- Strong scripting and programming skills in Python or Shell.
- Deep understanding of CI/CD tools (e.g., Bitbucket, GitLab CI, Jenkins, GitHub Actions).
- Experience with infrastructure as code (e.g., Terraform, CloudFormation).
- Familiarity with modern orchestration frameworks (e.g., Airflow, Dagster, Prefect).
- Expertise in data pipeline design, data testing, and metadata management.
- Understanding of data governance frameworks and cataloguing tools.
- AWS certification (e.g., DevOps Engineer, Data Analytics Specialty, or Solutions Architect) is a plus.
- Experience with real‑time streaming technologies such as Kafka, Kinesis, or Confluent Cloud.
- Familiarity with containerization and Kubernetes.
- Exposure to dbt, Great Expectations, or similar testing frameworks (a brief validation sketch follows this list).
- Strong problem‑solving and incident response skills.
- High attention to detail and focus on reliability.
- Effective communicator, capable of working cross‑functionally.
- Passion for automation, efficiency, and clean documentation.
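To make the testing-framework expectation concrete, here is a small, hypothetical validation sketch using the classic pandas-based Great Expectations API (the 0.x series; the dataframe and column names are invented for illustration). The idea is that a failed expectation blocks the downstream load, mirroring the quality gate in the Airflow sketch above.

```python
import pandas as pd
import great_expectations as ge

# Toy stand-in for a freshly extracted dataset.
orders = pd.DataFrame({"order_id": [1, 2, 3], "amount": [100.0, 250.0, 80.0]})

# Wrap the dataframe so expectation methods become available on it.
df = ge.from_pandas(orders)
df.expect_column_values_to_not_be_null("order_id")
df.expect_column_values_to_be_between("amount", min_value=0)

# validate() re-runs the recorded expectations and reports overall success.
results = df.validate()
if not results["success"]:
    raise ValueError("Data-quality checks failed; blocking downstream load")
```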
What We Offer
- A cloud-native environment focused on innovation and learning.
- Opportunities for professional development and AWS certification support.
- Competitive compensation and benefits.
- Flexible work options, including hybrid or remote work.
- A culture of collaboration and data‑driven impact.
Who You Are
- Curiosity and passion for understanding technology.
- Analytical, with an affinity for problem-solving.
- Ability to work under pressure in a fast‑paced, dynamic environment.
- Eager to embrace unfamiliar situations with a positive mindset.