DevOps Engineer
Listed on 2026-01-14
IT/Tech
Data Engineer, Cloud Computing, Data Security, Systems Engineer
DevOps Engineer – EOHHS
Client is seeking an experienced DevOps Engineer to support our cloud data warehouse modernization initiative, migrating from a SQL Server/AWS-based system to a Snowflake-based data platform. The DevOps Engineer is responsible for developing, maintaining, and optimizing the data pipelines and integration processes that support analytics, reporting, and business operations. The role will design and implement CI/CD pipelines, automate data pipeline deployments, and ensure operational reliability across Snowflake, Informatica, and Apache Airflow environments.
Responsibilities:
- Build and maintain CI/CD pipelines for Snowflake, Informatica (IICS), and Airflow DAG deployments.
- Implement automated code promotion between development, test, and production environments.
- Integrate testing, linting, and security scanning into deployment processes.
- Develop and maintain Infrastructure as Code (IaC) using Terraform for Snowflake objects, network, and cloud resources.
- Manage configuration and environment consistency across multi‑region/multi‑cloud setups.
- Maintain secure connectivity between cloud and on‑prem systems (VPNs, private links, firewalls).
- Implement logging and alerting for Airflow DAGs, Informatica workflows, and Snowflake performance.
- Develop proactive monitoring dashboards for job failures, data quality triggers, and warehouse usage.
- Optimize pipeline performance, concurrency, and cost governance in Snowflake.
- Support user access provisioning & RBAC alignment across Snowflake, Informatica, and Airflow.
- Troubleshoot platform and orchestration issues, lead incident response during outages.
- Enforce DevSecOps practices including encryption, secrets management, and key rotation.
- Implement audit, logging, compliance, and backup/restore strategies aligned with governance requirements.
- Participate in testing, deployment, and release management for new data workflows and enhancements.
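As one concrete illustration of the CI/CD and deployment responsibilities above, a pipeline for promoting Airflow DAGs might resemble the following GitLab CI sketch. All job names, image tags, directory paths, and the `deploy_dags.sh` helper script are hypothetical assumptions for illustration, not part of this role's actual tooling:

```yaml
# Hypothetical GitLab CI sketch: lint, test, and deploy Airflow DAGs.
# Stage names, images, paths, and the deploy script are illustrative only.
stages:
  - lint
  - test
  - deploy

lint_dags:
  stage: lint
  image: python:3.11
  script:
    - pip install ruff
    - ruff check dags/          # static linting of DAG source code

test_dags:
  stage: test
  image: python:3.11
  script:
    - pip install apache-airflow pytest
    - pytest tests/             # DAG import checks and unit tests

deploy_dags:
  stage: deploy
  environment: production
  script:
    - ./scripts/deploy_dags.sh  # hypothetical promotion script (dev → test → prod)
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
```

The same stage structure (lint, test, gated deploy to production) would apply to Informatica and Snowflake object deployments, with environment-specific promotion rules enforcing the dev/test/prod separation described above.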
Qualifications:
- 3–7+ years in DevOps, Cloud Engineering, or Data Platform Engineering roles.
- Bachelor’s degree or equivalent in Computer Science, Information Systems, Data Engineering, Health Informatics, or related field.
- Experience with Snowflake (roles, warehouses, performance tuning, cost control).
- Experience with Apache Airflow (DAG orchestration, monitoring, deployments).
- Strong CI/CD skills using GitLab, GitHub Actions, Azure DevOps, Jenkins, or similar.
- Proficiency with Terraform, Python, and Shell scripting.
- Deep understanding of cloud platforms: AWS, Azure, or GCP.
- Experience with containerization (Docker, Kubernetes), especially for Airflow.
- Strong knowledge of networking concepts and security controls.
- Effective communication with technical and non‑technical stakeholders.
- Ability to troubleshoot complex distributed data workloads.
- Strong documentation and cross‑team collaboration skills.
- Proactive and committed to process improvement and automation.
- Detail‑oriented, with a focus on data accuracy.
Preferred:
- Experience migrating from SQL Server or other legacy DW platforms.
- Knowledge of FinOps practices for Snowflake usage optimization.
- Background in healthcare, finance, or regulated industries.
Seniority Level: Mid‑Senior level
Employment Type: Contract
Job Function: Analyst and Business Development
Industry: IT Services and IT Consulting
Boston, MA