Key Responsibilities
Data Engineering & ETL Development
Design and build scalable ETL/ELT pipelines using Databricks (Spark/SQL/PySpark).
Develop and optimize workflows in Databricks Workflows/Jobs.
Implement Delta Lake architectures for reliable and performant data storage.
Perform data modeling (dimensional, star schema, ELT structures).
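As a rough illustration of the dimensional-modeling responsibility above, the sketch below splits denormalized records into a star schema: a dimension table with surrogate keys and a fact table referencing them. This is a minimal plain-Python sketch with hypothetical table and field names; in the role itself this kind of transformation would typically be done in Databricks with PySpark and stored in Delta Lake.

```python
# Hypothetical denormalized source rows (e.g. raw order extracts).
raw_orders = [
    {"order_id": 1, "customer": "Acme", "region": "EU", "amount": 120.0},
    {"order_id": 2, "customer": "Acme", "region": "EU", "amount": 75.5},
    {"order_id": 3, "customer": "Globex", "region": "US", "amount": 200.0},
]

def build_star_schema(rows):
    """Split denormalized rows into (dim_customer, fact_orders)."""
    dim_lookup = {}   # (customer, region) -> surrogate key
    fact_orders = []
    for row in rows:
        dim_key = (row["customer"], row["region"])
        if dim_key not in dim_lookup:
            # Assign surrogate keys in order of first appearance.
            dim_lookup[dim_key] = len(dim_lookup) + 1
        fact_orders.append({
            "order_id": row["order_id"],
            "customer_sk": dim_lookup[dim_key],  # FK into the dimension
            "amount": row["amount"],
        })
    # Materialize the lookup as dimension-table rows.
    dim_customer = [
        {"customer_sk": sk, "customer": name, "region": region}
        for (name, region), sk in dim_lookup.items()
    ]
    return dim_customer, fact_orders

dim_customer, fact_orders = build_star_schema(raw_orders)
```

The same split in PySpark would use `dropDuplicates` on the dimension columns plus a join to attach surrogate keys, but the shape of the result (one narrow dimension table, one fact table of keys and measures) is the same.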
Position Requirements
10+ years of work experience