Data Engineer; EL Focus - Azure/Snowflake
Listed on 2026-01-12
IT/Tech
Data Engineer
About Life Science Connect
Life Science Connect is dedicated to uniting life sciences professionals and suppliers to accelerate research, development, and manufacturing. We help professionals discover market opportunities by facilitating mutually beneficial connections between audiences and strategic partners. This accelerates the advancement of life-improving, life-extending, and life-saving therapies and devices. We serve a loyal, satisfied readership that demands original, compelling, useful content. Our comprehensive suite of B2B sales and marketing enablement capabilities helps our partners build and maintain robust business development pipelines.
The Mission: Data Force Multiplier
Life Science Connect is pivoting from a traditional publisher to a Data Authority. We are building a modern “Efficiency Stack” centered on Azure, Snowflake, and dbt to power our proprietary intent scoring and analytics products.
Staff Data Engineer
This is a critical, high-impact role focused specifically on the Extraction and Load (EL) portion of our architecture. You will not just build pipelines; you will architect the ingestion framework that feeds our entire analytics ecosystem. Your success will be measured by your ability to deliver high-quality, reliable, and timely raw data to our Analytics Engineering team, acting as a force multiplier that lets them focus purely on business logic and transformation.
Key Responsibilities
EL Pipeline Architecture & Execution
- Ingestion Architecture: Own the design, development, and optimization of scalable data ingestion pipelines using Azure Data Factory (ADF). Move beyond basic “drag-and-drop” configurations to build resilient, parameterized frameworks.
- Complex Source Integration: Design robust pipelines for high-volume, complex sources including Salesforce, Google Analytics (GA4), and internal APIs. Build custom connectors (using Python/Azure Functions) when native ADF connectors encounter API limits or sampling constraints (a minimal connector sketch follows this list).
- Snowflake Landing: Architect efficient loading patterns into Snowflake (Snowpipe, External Stages), ensuring that the “Raw Layer” is optimized for cost and performance before transformation begins.
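For illustration, a minimal sketch of the kind of custom connector this bullet describes: a cursor-paginated REST pull that backs off when rate-limited. The endpoint and response fields (`data`, `next_cursor`) are hypothetical stand-ins; real sources such as the Salesforce and GA4 APIs each have their own pagination and quota schemes.

```python
import time

import requests

API_URL = "https://api.example.com/v1/events"  # hypothetical internal source

def extract_pages(token: str, page_size: int = 500):
    """Yield raw records from a cursor-paginated REST API, honoring rate limits."""
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {token}"
    cursor = None
    while True:
        params = {"limit": page_size}
        if cursor:
            params["cursor"] = cursor
        resp = session.get(API_URL, params=params, timeout=30)
        if resp.status_code == 429:
            # Rate limited: wait as instructed by the server, then retry the page.
            time.sleep(int(resp.headers.get("Retry-After", "5")))
            continue
        resp.raise_for_status()
        payload = resp.json()
        yield from payload["data"]           # hypothetical response envelope
        cursor = payload.get("next_cursor")  # hypothetical cursor field
        if not cursor:
            break
```

Packaged as an Azure Function, the same loop would write each page to a Blob landing zone for Snowflake to pick up.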
Analytics Collaboration & Schema Governance
- dbt Bridge: Act as the primary partner to the Analytics Engineering team. Collaborate on Raw Layer schema design, ensuring that data lands in a structure that is easily consumable by dbt, preventing “garbage in” scenarios (see the landing-pattern sketch after this list).
- Data Reliability: Provide the downstream teams with thoroughly documented, reliable raw data feeds. Guarantee that the data in the warehouse matches the source of truth.
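As one sketch of such a landing pattern (all object names hypothetical): payloads land untouched in a VARIANT column alongside load metadata, which keeps the raw layer cheap to load and easy for dbt sources and freshness checks to consume.

```python
import snowflake.connector

# All identifiers below are hypothetical illustrations.
DDL = """
CREATE TABLE IF NOT EXISTS RAW.SALESFORCE.ACCOUNT (
    RECORD       VARIANT,                                   -- untouched source payload
    _SOURCE_FILE STRING,                                    -- lineage back to the staged file
    _LOADED_AT   TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP()  -- feeds dbt freshness checks
)
"""

COPY = """
COPY INTO RAW.SALESFORCE.ACCOUNT (RECORD, _SOURCE_FILE)
FROM (
    SELECT $1, METADATA$FILENAME
    FROM @RAW.SALESFORCE.LANDING_STAGE      -- external stage over Azure Blob
)
FILE_FORMAT = (TYPE = JSON)
ON_ERROR = 'ABORT_STATEMENT'                -- fail loudly rather than land partial data
"""

conn = snowflake.connector.connect(
    account="my_account",   # placeholders; real credentials belong in Key Vault
    user="loader",
    password="...",
    warehouse="LOAD_WH",
    role="LOADER",
)
try:
    cur = conn.cursor()
    cur.execute(DDL)
    cur.execute(COPY)
finally:
    conn.close()
```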
Pipeline Orchestration & Optimization
- Advanced Orchestration: Design dependency-aware pipeline orchestrations that manage the full data lifecycle, ensuring data arrives in the correct order and at the required frequency (sketched below).
- Performance Tuning: Continuously monitor pipeline performance (latency, throughput) and optimize ADF resource allocation to control costs without sacrificing speed.
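To make “dependency-aware” concrete, here is a toy version using Python's standard-library topological sort; the pipeline names are invented, and in production ADF would express the same graph declaratively through pipeline dependencies and tumbling-window triggers.

```python
from graphlib import TopologicalSorter

# Hypothetical dependency map: each pipeline runs after its listed upstreams.
PIPELINES = {
    "ingest_salesforce": set(),
    "ingest_ga4": set(),
    "load_snowflake_raw": {"ingest_salesforce", "ingest_ga4"},
    "notify_analytics_eng": {"load_snowflake_raw"},
}

def run_pipeline(name: str) -> None:
    print(f"triggering {name}")  # stand-in for an ADF run-pipeline API call

def orchestrate() -> None:
    """Trigger pipelines in an order that respects every dependency."""
    for name in TopologicalSorter(PIPELINES).static_order():
        run_pipeline(name)

if __name__ == "__main__":
    orchestrate()
```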
Engineering Standards & Security
- CI/CD Implementation: Define and lead the implementation of CI/CD pipelines for data workflows. Enforce automated testing and deployment processes using Git/GitHub, treating infrastructure as code.
- Security & Compliance: Implement security best practices within the ingestion layer, specifically regarding Azure Key Vault for credential management and PIPL/GDPR compliance for PII handling (see the sketch below).
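For example, a minimal sketch of runtime credential retrieval with the Azure SDK (the vault URL and secret name are hypothetical), so tokens never live in code, Git, or pipeline JSON:

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://lsc-data-kv.vault.azure.net"  # hypothetical vault

# Uses managed identity when running in Azure, or developer credentials locally.
credential = DefaultAzureCredential()
client = SecretClient(vault_url=VAULT_URL, credential=credential)

# Fetched at runtime, so secret rotation never requires a code change.
salesforce_token = client.get_secret("salesforce-api-token").value  # hypothetical name
```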
Requirements
- Experience: 7+ years of professional experience in data engineering with a focus on high-volume production pipelines.
- Hybrid Skillset: Expert-level proficiency with Azure Data Factory (ADF) and strong coding skills. Proficient in Python and SQL, capable of writing custom scripts for API interactions, data validation, and complex logic that GUI tools cannot handle.
- API Mastery: Understand the nuances of integrating with complex SaaS APIs (Salesforce, GA4). Handle rate limits, pagination, and token management programmatically.
- Warehouse Expertise: Extensive experience loading data into Snowflake and an understanding of the architectural implications of loading patterns on warehouse costs.
- St…