AWS Data Engineer (Big Data & Python) – Gauteng (Hybrid) – ISB
Midrand, Gauteng, South Africa
Listed on 2026-01-11
IT/Tech
Data Engineer, Cloud Computing
Build and maintain enterprise-scale Big Data pipelines using AWS cloud services and Group Data Platforms, delivering innovative data solutions that power analytics, operational processes and business intelligence across global automotive operations!
Become the data engineering expert behind mission-critical data infrastructure, where your expertise will drive data ingestion, transformation and provisioning while ensuring compliance, security and data quality across enterprise-wide data assets!
Expert-level data engineering role requiring Python, PySpark, Terraform, AWS and Big Data expertise.
Hybrid and remote working flexibility with 1960 flexible annual hours.
Technical leadership role with mentoring responsibilities and enterprise data platform ownership.
Position: Contract, 01 February 2026 – 31 December 2028
Experience: 8 years related experience
Commencement: 01 February 2026
Location: Hybrid – Midrand / Menlyn / Rosslyn / Home Office rotation
Data Science and Engineering – Enterprise Data & Analytics.
The product focuses on creation and provisioning of enterprise-wide data spanning DGOs and Data Assets, including data protection and other compliance and security aspects. This includes data ingests for the Enterprise D&A Use Cases (TOP 20) and data provisioning for operational processes.
- Relevant IT / Business / Engineering Degree
- AWS Certified Cloud Practitioner
- AWS Certified SysOps Administrator – Associate
- AWS Certified Developer – Associate
- AWS Certified Solutions Architect – Associate
- AWS Certified Solutions Architect – Professional
- HashiCorp Certified: Terraform Associate
- Above average experience/understanding in data engineering and Big Data pipelines
- Experience working with enterprise collaboration tools such as Confluence and JIRA
- Experience developing technical documentation and artefacts
- Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV
- Knowledge of the Agile Working Model
- Python 3.x (above average experience)
- PySpark
- PowerShell / Bash
- Boto3 (an illustrative usage sketch follows the technology list below)
- Terraform (above average experience)
- SQL – Oracle / PostgreSQL (above average experience)
- ETL (above average experience)
- Big Data (above average experience)
- Technical data modelling and schema design (not drag and drop)
- Group Cloud Data Hub (CDH)
- Group CDEC Blueprint
- AWS Glue
- CloudWatch
- SNS (Simple Notification Service)
- Athena
- S3
- Kinesis Data Streams / Kinesis Data Firehose
- Lambda
- DynamoDB
- Step Functions
- Parameter Store
- Secrets Manager
- CodeBuild / CodePipeline
- CloudFormation
- AWS EMR
- Redshift
- Kafka
- Docker
- Linux / Unix
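By way of illustration only, and not a specification from the employer: a minimal Boto3 sketch of the kind of glue code this stack implies, fetching a credential from Secrets Manager and publishing a pipeline status notification to SNS. The secret name, topic ARN and pipeline name are hypothetical placeholders.

```python
import json

import boto3

# Hypothetical identifiers for illustration only.
SECRET_NAME = "enterprise-data/warehouse-loader"
TOPIC_ARN = "arn:aws:sns:eu-west-1:123456789012:pipeline-notifications"


def get_db_credentials(secret_name: str) -> dict:
    """Fetch database credentials stored as a JSON secret in Secrets Manager."""
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_name)
    return json.loads(response["SecretString"])


def notify_pipeline_status(topic_arn: str, pipeline: str, status: str) -> None:
    """Publish a simple status message to an SNS topic for downstream alerting."""
    sns = boto3.client("sns")
    sns.publish(
        TopicArn=topic_arn,
        Subject=f"{pipeline}: {status}",
        Message=json.dumps({"pipeline": pipeline, "status": status}),
    )


if __name__ == "__main__":
    creds = get_db_credentials(SECRET_NAME)
    # ... run the ingest/transform step with the retrieved credentials ...
    notify_pipeline_status(TOPIC_ARN, pipeline="daily-vehicle-telemetry", status="SUCCEEDED")
```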
- Business Intelligence (BI) Experience
- Self-driven team player with ability to work independently and multi-task
- Strong written and verbal communication skills with precise documentation
- Strong organizational skills
- Ability to work collaboratively in a team environment
- Problem-solving capabilities
- Above-board work ethic
- Demonstrated expertise in data modelling and Oracle SQL
- Exceptional analytical skills for analysing large and complex data sets
- Perform thorough testing and data validation to ensure the accuracy of data transformations (a minimal validation sketch follows this list)
- Experience building data pipeline using AWS Glue or Data Pipeline or similar platforms
- Familiar with data store such as AWS S3 and AWS RDS or DynamoDB
- Experience and solid understanding of various software design patterns
- Experience preparing specifications from which programs will be written, designed, coded, tested and debugged
- Experience working with Data Quality Tools such as Great Expectations
- Experience developing and working with REST APIs
- Basic experience in Networking and troubleshooting network issues
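As a minimal sketch of the data-validation work mentioned above (using plain PySpark assertions rather than Great Expectations itself): row-count, null-key and duplicate-key checks after a transformation. The bucket paths and key columns are hypothetical.

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

# Hypothetical business-key columns for illustration only.
KEY_COLUMNS = ["vin", "event_date"]


def validate_transform(source: DataFrame, target: DataFrame) -> None:
    """Lightweight post-transformation checks: row counts, null keys and duplicates."""
    # Here we assume the transformation should preserve the number of rows.
    assert target.count() == source.count(), "row count changed during transform"

    # Business keys must be populated.
    null_keys = target.filter(
        F.greatest(*[F.col(c).isNull().cast("int") for c in KEY_COLUMNS]) == 1
    ).count()
    assert null_keys == 0, f"{null_keys} rows have null business keys"

    # Business keys must be unique.
    duplicates = target.groupBy(*KEY_COLUMNS).count().filter("count > 1").count()
    assert duplicates == 0, f"{duplicates} duplicate business keys found"


if __name__ == "__main__":
    spark = SparkSession.builder.appName("validation-sketch").getOrCreate()
    raw = spark.read.parquet("s3a://hypothetical-raw-bucket/telemetry/")
    curated = spark.read.parquet("s3a://hypothetical-curated-bucket/telemetry/")
    validate_transform(raw, curated)
```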
- Build and maintain Big Data Pipelines using Group Data Platforms
- Design, develop and optimize ETL processes for large-scale data ingestion and transformation (an illustrative PySpark sketch follows this list)
- Implement data…
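Purely as an illustration of the ETL responsibilities described above, a minimal PySpark job that ingests raw CSV from S3, applies a simple transformation and writes partitioned Parquet for downstream consumers. Bucket paths and column names are hypothetical, and reading s3a:// paths assumes the hadoop-aws connector is available to Spark.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical locations for illustration only.
RAW_PATH = "s3a://hypothetical-raw-bucket/vehicle-events/*.csv"
CURATED_PATH = "s3a://hypothetical-curated-bucket/vehicle-events/"


def main() -> None:
    spark = SparkSession.builder.appName("vehicle-events-etl").getOrCreate()

    # Ingest: read raw CSV with header row and inferred types.
    raw = spark.read.option("header", True).option("inferSchema", True).csv(RAW_PATH)

    # Transform: drop exact duplicates, normalise names, derive a partition key.
    curated = (
        raw.dropDuplicates()
        .withColumnRenamed("VIN", "vin")
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .withColumn("event_date", F.to_date("event_ts"))
        .filter(F.col("vin").isNotNull())
    )

    # Provision: write Parquet partitioned by event date.
    curated.write.mode("overwrite").partitionBy("event_date").parquet(CURATED_PATH)

    spark.stop()


if __name__ == "__main__":
    main()
```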