Data Platform Operations Engineer
Listed on 2026-01-12
IT/Tech
Data Engineer, Cloud Computing, Data Security, Data Analyst
Overview
Double Good's mission is to create joy. We create joy with our delectable and award-winning popcorn. We create joy with our easy-to-use fundraising platform that raises a meaningful amount of money for youth sports and activities, empowering kids to pursue their dreams. We create joy through our Kids Foundation which hosts Double Good Days events across the country to bring all-ability fun to children with special needs and their families.
As featured on the Today Show, Double Good is about more than the product; we have a strong social mission. In recent years, Double Good has seen 40% year-over-year growth, and we're excited about our future! We are building a best-in-class data platform to support that growth, and we are looking for a Data Platform Operations Engineer to join our team as we scale our modern data stack.
Location: This is a hybrid role based out of our downtown Chicago office.
About the role
We are seeking a Data Platform Operations Engineer to join us in building, automating, and operating our Enterprise Data Platform. This role is ideal for someone with a unique combination of DataOps/DevOps, Data Engineering, and Database Administration expertise. As a key member of our Data & Analytics team, you will ensure our data infrastructure is reliable, scalable, secure, and high-performing, enabling data-driven decision-making across the business.
Responsibilities
- Snowflake Administration: Own the administration, monitoring, configuration, and optimization of our Snowflake data warehouse. Implement and automate user/role management, resource monitoring, scaling strategies, and security policies.
- Fivetran Management: Configure, monitor, and troubleshoot Fivetran pipelines for seamless ingestion from SaaS applications, ERPs, and operational databases. Resolve connector failures and optimize sync performance and cost.
- DataOps/Automation: Build and improve CI/CD workflows using Git and other automation tools for data pipeline deployment, testing, and monitoring.
- Infrastructure as Code (IaC): Implement and maintain infrastructure using tools like Terraform and Titan to ensure consistent, repeatable, and auditable environments.
- Platform Monitoring & Reliability: Implement automated checks and alerting across Snowflake, Fivetran, and dbt processes to ensure platform uptime, data freshness, and SLA compliance. Proactively identify and resolve platform issues and performance bottlenecks.
- Database Performance and Cost Optimization: Monitor and optimize database usage (queries, compute, storage) for speed and cost-effectiveness. Partner with data engineers and analysts to optimize SQL and refine warehouse utilization (a minimal monitoring sketch follows this list).
- Security & Compliance: Enforce security best practices across the data platform (access controls, encryption, data masking). Support audits and compliance requirements (e.g., SOC2).
- Data Quality Operations: Build and automate data health and quality checks (using dbt tests and/or custom monitors). Rapidly triage and resolve data pipeline incidents with root cause analyses.
- Documentation & Process: Ensure all operational procedures (runbooks, escalation paths, knowledge base) and infrastructure documentation are accurate, up-to-date, and easily accessible.
- Collaboration: Partner with Data Architects, Data Engineers, and Dev Ops Engineers to understand data flow requirements, troubleshoot issues, and continuously enhance platform capabilities.
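To make the monitoring and cost-optimization bullets above concrete, here is a minimal sketch of the kind of check this role might automate: flagging the most expensive Snowflake queries from the last day. It assumes the snowflake-connector-python package, read access to the SNOWFLAKE.ACCOUNT_USAGE share, and placeholder environment variables for credentials; the five-minute threshold is an arbitrary assumption, not a Double Good standard.

```python
# Hypothetical sketch: surface yesterday's slowest Snowflake queries.
# Assumes snowflake-connector-python and ACCOUNT_USAGE access;
# account/user/password env vars and the threshold are placeholders.
import os
import snowflake.connector

EXPENSIVE_MS = 5 * 60 * 1000  # assumed threshold: queries slower than 5 minutes

QUERY = """
    SELECT query_id,
           warehouse_name,
           total_elapsed_time,
           LEFT(query_text, 120) AS query_snippet
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
      AND total_elapsed_time > %(threshold)s
    ORDER BY total_elapsed_time DESC
    LIMIT 20
"""

def main() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],  # placeholder env vars
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="ACCOUNTADMIN",
    )
    try:
        cur = conn.cursor()
        cur.execute(QUERY, {"threshold": EXPENSIVE_MS})
        for query_id, warehouse, elapsed_ms, snippet in cur:
            print(f"{warehouse or '-'}: {elapsed_ms / 1000:.1f}s {query_id} {snippet!r}")
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```

In practice a script like this would run on a schedule and feed an alerting channel rather than print to stdout; ACCOUNT_USAGE views lag real time by up to a few hours, so near-real-time checks would query INFORMATION_SCHEMA instead.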
Experience & Skills we value
- 5+ years in a DataOps, DevOps, Data Engineering, or Database Administration role in cloud data environments.
- Hands-on experience administering Snowflake, including security, performance tuning, cost management, and automation.
- Strong expertise with Fivetran setup, management, and incident troubleshooting.
- Proficiency in dbt for ETL development, testing, and orchestration (a minimal test-automation sketch follows this list).
- Advanced SQL skills for troubleshooting, diagnostics, and optimization.
- Proficient with version control (Git) and experience designing/deploying data pipelines in a collaborative environment.
- Scripting skills (Python, Bash, etc.) for workflow automation, data operations tasks, and deployment pipelines.
- Experience with cloud platforms…
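As a companion to the dbt bullet above, here is a minimal sketch of automated test triage, assuming a dbt project in the working directory and the dbt CLI on PATH. Parsing target/run_results.json is standard dbt behavior, but field names should be checked against the dbt version in use; the paging hook is a placeholder.

```python
# Hypothetical sketch: run dbt tests and triage failures from run_results.json.
# Assumes the dbt CLI is installed and the script runs from the project root.
import json
import subprocess
from pathlib import Path

def run_dbt_tests() -> list[dict]:
    # `dbt test` writes target/run_results.json; a non-zero exit code
    # just means some tests failed, so we inspect the artifact instead.
    subprocess.run(["dbt", "test"], check=False)
    results = json.loads(Path("target/run_results.json").read_text())
    return [r for r in results["results"] if r["status"] in ("fail", "error")]

def main() -> None:
    failures = run_dbt_tests()
    for r in failures:
        # unique_id looks like "test.project.not_null_orders_order_id.<hash>"
        print(f"{r['status'].upper()}: {r['unique_id']} -> {r.get('message')}")
    if failures:
        raise SystemExit(1)  # placeholder for a paging/alerting integration

if __name__ == "__main__":
    main()
```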