At Al Tayer Insignia, your career is more than a job — it’s a journey into the heart of luxury retail. For over 40 years, we’ve partnered with the world’s most iconic brands, creating award-winning retail experiences across our boutiques, department stores, and leading online platforms. With stores and outlets across the GCC and a truly seamless omnichannel presence, we bring style, innovation, and heritage together.
Here, you’ll join a diverse, customer-obsessed, passionate team that celebrates creativity, values individuality, and empowers you to grow. Join us on our journey, reimagining fashion and redefining the meaning of luxury in the region.
As a Data Engineer, you will design and build new data pipelines for Al Tayer's omnichannel brands. This involves ingesting data from sources such as Salesforce Marketing Cloud, Google Analytics, Firebase, Commerce Cloud, Service Cloud, and various other internal and external sources.
You will be responsible for maintaining data integrity and data quality, and for reporting on them on a weekly/monthly basis.
You will create several prototype products, including ML models and basic frontends, and support building them end to end.
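As an illustration of the data-quality reporting described above, here is a minimal sketch of an integrity check over ingested records. The `quality_report` helper, the field names, and the sample rows are all hypothetical, not Al Tayer's actual schema:

```python
def quality_report(rows: list[dict], required: list[str]) -> dict:
    """Summarise basic integrity checks: total row count and the
    number of missing values per required field."""
    report = {"row_count": len(rows), "missing": {}}
    for field in required:
        # Count rows where the field is absent, None, or empty.
        report["missing"][field] = sum(1 for r in rows if not r.get(field))
    return report

# Hypothetical sample of ingested order records.
orders = [
    {"order_id": "A1", "amount": 120.5},
    {"order_id": "A2", "amount": None},
    {"order_id": None, "amount": 75.0},
]
print(quality_report(orders, ["order_id", "amount"]))
# {'row_count': 3, 'missing': {'order_id': 1, 'amount': 1}}
```

In practice, a report like this would be generated per pipeline run and aggregated into the weekly/monthly summaries the role calls for.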
Essential Roles and Responsibilities
- Design, develop, and implement data pipelines to extract, transform, and load (ETL) data from various sources into our data warehouse.
- Optimize data pipelines for performance, scalability, and reliability to handle large volumes of data.
- Work with AWS services such as AWS Glue, AWS Data Pipeline, and Amazon S3 to build robust data processing workflows and API gateways.
- Utilize Tableau to create interactive and insightful data visualizations and dashboards for business stakeholders.
- Troubleshoot data-related issues, identify bottlenecks, and implement effective solutions.
- Exposure to ML, NLP, and various LLMs is an added advantage.
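The extract-transform-load responsibilities above can be sketched in plain Python. The field names, sample CSV, and in-memory "warehouse" below are illustrative assumptions standing in for a real source system and warehouse table:

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse a raw CSV export (e.g. a report from a source system)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalise field names, cast amounts to floats,
    and drop rows with missing amounts."""
    return [
        {"order_id": r["OrderID"], "amount": float(r["Amount"])}
        for r in rows
        if r.get("Amount")
    ]

def load(rows: list[dict], warehouse: list[dict]) -> None:
    """Load: append cleaned rows to the target store (a list here,
    standing in for a warehouse table)."""
    warehouse.extend(rows)

# Hypothetical raw export; the middle row has a missing amount.
raw = "OrderID,Amount\nA1,120.50\nA2,\nA3,75.00\n"
warehouse: list[dict] = []
load(transform(extract(raw)), warehouse)
print(warehouse)  # two cleaned rows; the row with a missing amount is dropped
```

A production pipeline would swap the in-memory pieces for real connectors (e.g. an AWS Glue job writing to the warehouse), but the extract/transform/load separation stays the same.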
Education/Certification and Continued Education
- Bachelor's degree in Computer science/ Software Engineering / Information Technology.
Years of Experience
- Minimum 3 years of experience, an open mindset, and eagerness to learn and experiment with data to create value and knowledge.
Knowledge and Skills
- Python and SQL.
- Streaming and batch processing of data.
- Any visualization tool: Tableau / Data Studio / Metabase.
- CI/CD with Bitbucket / GitHub / GitLab.
- Basic understanding of ML techniques such as logistic regression, NLP, and clustering.