Data Engineer
Job in Pretoria, 0002, South Africa
Listing for: CLS Human Capital Specialists
Full Time position, listed on 2026-01-11
Job specializations:
- IT/Tech: Data Engineer
Job Description
Job title: Data Engineer
Job location: Gauteng, Pretoria
Our client, a fast-growing data solutions start-up based in Pretoria East, is looking for a motivated and detail-oriented Data Engineer to join their dynamic team. This is an exciting opportunity for someone passionate about data, technology, and innovation to contribute to building scalable data systems and play a key role in shaping the company’s data engineering capability.

The purpose of the role is to design, develop and maintain robust, scalable and secure data engineering solutions across the full data lifecycle, from data ingestion and transformation to warehousing and integration.
Minimum education (essential):
- BSc in Computer Science, Engineering or relevant field
Minimum applicable experience (years):
- 2-4 years
Required nature of experience:
- Experience with SQL Server and Azure Synapse Analytics/Microsoft Fabric for query writing, indexing, performance tuning and schema design.
- Hands‑on experience developing ETL pipelines, including data extraction from REST/SOAP APIs, databases and flat files.
- Proficiency in data transformation using Python and Azure-native tools.
- Experience with data warehousing.
- Background in data modelling, including dimensional modelling, schema evolution and versioning.
- Practical knowledge of cloud-based data storage and processing using Azure Blob Storage.
- Familiarity with pipeline optimisation, fault tolerance, monitoring and security best practices.
- Experience developing web applications using C# and the .NET platform.
- Experience with front‑end development using Blazor, React.js, JavaScript/TypeScript, HTML, CSS/SCSS.
Skills required:
- SQL Server, Azure Synapse Analytics, Azure Blob Storage, Microsoft Fabric
- Python
- REST/SOAP APIs, Data Extraction, Transformation, Loading (ETL)
- Azure Data Factory, Pipeline Orchestration
- Dimensional Modelling, Schema Evolution, Data Warehousing
- Power BI
- Performance Optimisation, Indexing, Query Tuning
- Cloud Data Processing, Backups
- C#, .NET, Blazor
- JavaScript/TypeScript, HTML, CSS/SCSS
Other requirements:
- Proficient in Afrikaans and English
- Own transport and driver's license
ETL and Pipeline Development
- Design, build, and orchestrate efficient ETL pipelines using Azure Synapse for both batch and near‑real‑time data ingestion.
- Extract data from structured and unstructured sources including REST APIs, SOAP APIs, databases, and flat files.
- Apply robust data transformation logic using Python and native Azure Synapse transformation tools.
- Optimise data flows for performance, scalability and cost‑effectiveness.
- Implement retry mechanisms, logging and monitoring within pipelines to ensure data integrity and fault tolerance (an illustrative Python sketch follows this list).
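Purely as an illustration of the kind of work described above, and not part of the employer's specification, the sketch below shows a minimal batch-ingestion step in Python: pulling records from a hypothetical REST endpoint with basic retries, backoff and logging. The API URL, query parameters and output file are assumed placeholders, not details from the listing.

```python
# Illustrative only: a minimal batch-ingestion step with retries and logging.
# The API URL, parameters and output path are hypothetical placeholders.
import json
import logging
import time

import requests

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("ingest")

API_URL = "https://api.example.com/v1/orders"   # hypothetical source endpoint
MAX_RETRIES = 3

def extract_orders(since: str) -> list[dict]:
    """Pull raw records from the REST source, retrying transient failures."""
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            resp = requests.get(API_URL, params={"updated_since": since}, timeout=30)
            resp.raise_for_status()
            return resp.json()  # assumes the endpoint returns a JSON array
        except requests.RequestException as exc:
            log.warning("Extract attempt %d/%d failed: %s", attempt, MAX_RETRIES, exc)
            if attempt == MAX_RETRIES:
                raise
            time.sleep(2 ** attempt)  # simple exponential backoff
    return []

if __name__ == "__main__":
    records = extract_orders(since="2026-01-01")
    log.info("Extracted %d records", len(records))
    # In a real pipeline the records would be landed in Azure Blob Storage or
    # staged for transformation in Synapse; here we simply write a local file.
    with open("orders_raw.json", "w", encoding="utf-8") as fh:
        json.dump(records, fh)
```

The retry loop, warning logs and backoff correspond to the fault-tolerance and monitoring points above; in practice this logic would typically sit inside an orchestrated pipeline activity rather than a standalone script.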
Data Architecture and Management
- Design and manage scalable and efficient data architectures using Microsoft SQL Server and Azure services, including Synapse Analytics/Microsoft Fabric and Blob Storage.
- Develop robust schema designs, indexes and query strategies to support analytical and operational workloads.
- Support schema evolution and version control, ensuring long‑term maintainability and consistency across datasets.
- Implement and maintain metadata repositories and data dictionaries for improved data governance and transparency.
- Define and maintain role‑based access control to ensure data security and compliance.
Data Warehousing
- Architect and manage enterprise data warehouses using Azure Synapse Analytics.
- Apply best practices for data loading, partitioning strategies and storage optimisation (an illustrative loading sketch follows this list).
- Integrate warehousing solutions with Power BI and other analytics platforms for seamless business intelligence consumption.
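As a hedged illustration of one common loading pattern, not a prescribed approach from the listing, the sketch below uses pyodbc to issue a Synapse COPY INTO statement that loads staged Parquet files from Blob Storage into a staging table. The connection string, storage path and table name are hypothetical placeholders.

```python
# Illustrative only: loading a staged Parquet extract from Azure Blob Storage
# into a Synapse dedicated SQL pool table using the T-SQL COPY INTO statement.
# The connection string, storage path and table name are hypothetical.
import pyodbc

CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:myworkspace.sql.azuresynapse.net,1433;"  # placeholder workspace
    "Database=dw;Authentication=ActiveDirectoryInteractive;"
)

COPY_SQL = """
COPY INTO dbo.FactOrdersStaging
FROM 'https://mystorage.blob.core.windows.net/raw/orders/*.parquet'
WITH (
    FILE_TYPE = 'PARQUET',
    CREDENTIAL = (IDENTITY = 'Managed Identity')
);
"""

# Autocommit so the bulk load is not held inside an open transaction.
with pyodbc.connect(CONN_STR, autocommit=True) as conn:
    conn.execute(COPY_SQL)
```

In a production setup this statement would usually be triggered from a pipeline activity rather than an interactive script, with the staging table then merged into the warehouse tables consumed by Power BI.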
Data Modelling
- Develop and maintain conceptual, logical and physical data models.
- Implement dimensional modelling techniques (e.g., star/snowflake schemas) to support advanced analytics and reporting (a minimal star-schema sketch follows this list).
- Apply normalisation standards and relational modelling techniques to support OLTP and OLAP workloads.
- Ensure consistency of data models across systems and support schema versioning and evolution.
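To make the dimensional-modelling point concrete, the sketch below holds T-SQL DDL for a minimal star schema, two dimensions and one hash-distributed fact table, as Python strings. All table and column names are illustrative assumptions rather than the client's actual model.

```python
# Illustrative only: a minimal star schema (one fact table, two dimensions)
# expressed as T-SQL DDL strings. Names are hypothetical; in practice the DDL
# would be applied to the Synapse dedicated SQL pool rather than printed.
DIM_DATE = """
CREATE TABLE dbo.DimDate (
    DateKey      INT           NOT NULL,  -- surrogate key, e.g. 20260111
    FullDate     DATE          NOT NULL,
    [Month]      TINYINT       NOT NULL,
    [Year]       SMALLINT      NOT NULL
);
"""

DIM_CUSTOMER = """
CREATE TABLE dbo.DimCustomer (
    CustomerKey  INT           NOT NULL,  -- surrogate key
    CustomerName NVARCHAR(200) NOT NULL,
    Region       NVARCHAR(100) NULL
);
"""

FACT_ORDERS = """
CREATE TABLE dbo.FactOrders (
    DateKey      INT           NOT NULL,  -- references DimDate.DateKey
    CustomerKey  INT           NOT NULL,  -- references DimCustomer.CustomerKey
    OrderAmount  DECIMAL(18,2) NOT NULL,
    Quantity     INT           NOT NULL
)
WITH (
    DISTRIBUTION = HASH(CustomerKey),     -- spread the fact table across nodes
    CLUSTERED COLUMNSTORE INDEX           -- default analytical storage format
);
"""

for ddl in (DIM_DATE, DIM_CUSTOMER, FACT_ORDERS):
    print(ddl.strip())
```

The hash distribution and columnstore options shown on the fact table are the usual Synapse choices for large analytical tables; dimension tables are often replicated instead, a decision that would depend on the client's actual data volumes.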
- Provide clear,…