Job Description
The Data Engineering Lead is responsible for the development and support of internally created or supported ETL and database programs (Big Data, data warehouse), including gathering business requirements, designing the data model, and developing the solution.
Responsibilities
- Understand and document business requirements
- Apply a deep understanding of ETL methodologies, Big Data and data warehousing principles, approaches, technologies, and architectures, including the concepts, design, and usage of data warehouses
- Work closely with business and technical teams to understand, document, design, develop, code, and test ETL processes
- Apply demonstrated experience in ETL design and techniques to improve load and extract performance
- Translate source-to-target mapping documents into ETL processes
- Build and integrate APIs
- Deploy ML solutions in the cloud
- Design, develop, test, optimize, and deploy ETL code and stored procedures to perform all ETL-related functions
- Work in an Agile environment and apply CI/CD and DevOps best practices
- Design the data model for Big Data solutions
- Design, implement, and manage ETL processes using Talend and Azure Data Factory
- Administer and maintain the Azure/MS SQL Server databases and the Microsoft enterprise data warehouse
- Develop Spark jobs in Scala or Java
- Write, implement, and maintain appropriate ETL processes
- Lead, train, and support the work of other staff engaged in similar functions
- Monitor and maintain the databases and all ETL components
- Design, code (in Python and SQL), orchestrate, and monitor jobs in Azure Databricks, Snowflake, or any other cloud data warehouse
Requirements
- Bachelor's Degree in Computer Science, Engineering, or a related field
- 5+ years of experience in the relevant area of expertise
- 5+ years' experience in ETL development using Talend or Airflow
- 5+ years' experience developing SQL queries, stored procedures, and views
- 3+ years' experience developing/coding in Python
- 3+ years' experience in database administration
- 3+ years' experience working with cloud solutions
- 3+ years' experience with Azure Databricks or a cloud data warehouse such as Snowflake or Redshift
- 2+ years' experience with Azure Data Factory
- Advanced certification in Azure (Azure Data Factory, Azure ML)
- Certification in Azure Databricks or any cloud data warehouse
- Any certification in data science is a big plus
Role Level: Mid-Level
Work Type: Full-Time
Country: United Arab Emirates
City: Abu Dhabi