Senior Data Engineer, Data Platforms
Listed on 2026-01-17
-
IT/Tech
Data Engineer, Cloud Computing
The Senior Data Engineer, Data Platforms is a pivotal role on our Data Team, with broad responsibility. You're not just managing data; you're pioneering the platforms that underpin our data and analytics engineering pursuits across the company's extensive landscape. You will own state-of-the-art big data platforms that power the Global Partners data stack, with your work standing as the backbone supporting all data-centric innovation.
You are fluent with platforms like AWS, Snowflake, Dagster, and dbt, and deploying via tools such as Kubernetes, Docker, and Terraform is second nature to you, so you are primed to spearhead our pursuit of data excellence. Your experience, extending from data storage best practices to continuously assessing and integrating new technologies, ensures that Global Partners stays ahead of the curve.
You are a problem solver and an automation enthusiast who will be responsible for deploying robust solutions for data engineering and data platform management.
At the heart of it all, you're not just an engineer; you see the art in orchestrating data. As you engage with teams, provide strategic guidance, and champion the consistent adoption of best practices, you're also shaping the future of data analytics. If you're ignited by the prospect of being at the helm of technological evolution, where every decision melds strategy with data, join us.
Global Partners offers a collaborative team and an environment where we actively invest to create a culture of data-driven excellence.
At Global Partners, business starts with people. Since 1933, we’ve believed in taking care of our customers, our guests, our communities, and each other—and that belief continues to guide us.
The Global Spirit is how we work to fuel that long-term commitment to success. As a Fortune 500 company with 90+ years of experience, we're proud to fuel communities—responsibly and sustainably. We show up every day with grit, passion, and purpose—anticipating needs, building lasting relationships, and creating shared value.
Responsibilities
- Architect and implement scalable, cloud-native data platforms that serve as the foundation for all data engineering initiatives across the organization, using technologies such as AWS, GCP, or Azure together with Python, Docker, and Kubernetes.
- Automate CI/CD pipelines for data infrastructure and applications, leveraging tools like Jenkins, GitLab CI, or GitHub Actions to ensure rapid, reliable deployments.
- Implement Infrastructure as Code (IaC) practices using tools such as Terraform or CloudFormation to manage and version-control cloud resources.
- Develop and maintain robust data orchestration workflows using modern tools like Apache Airflow, Dagster, or Prefect, ensuring efficient data processing and transformation.
- Develop automated solutions and self-service platforms that enable developers to efficiently set up, configure, and monitor their data environments.
- Optimize data storage and processing systems, including data lakes and data warehouses (e.g., Snowflake, BigQuery, Redshift), to ensure cost-effectiveness and performance at scale.
- Implement observability and monitoring solutions for data pipelines and infrastructure using tools like Prometheus, Grafana, or Datadog to ensure system reliability and performance.
- Lead the adoption of DataOps practices, fostering collaboration between data engineering, data science, and operations teams to streamline the entire data lifecycle.
Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field (or equivalent experience in Data Engineering, DataOps, MLOps, or Software Engineering), with a minimum of 5 years' experience, or 7 years' experience in lieu of an applicable degree.
- Strong proficiency in designing and implementing scalable, cloud-native (containerized) data platforms using Infrastructure as Code (e.g., Terraform, Docker, Kubernetes).
- Advanced programming skills in Python, focusing on data-intensive applications. Strong SQL proficiency and experience with cloud data warehouses (e.g., Snowflake, BigQuery) required.
- Proven track record in…