Job Description
Experience: 5–8 years
Role Summary
We are seeking a Senior Data Engineer to architect and implement scalable ETL frameworks for transforming Excel-based RCM operational data into a governed, analytics-ready data model within Domo.
The ideal candidate will drive data engineering strategy, automation, performance optimization, and data governance across the reporting lifecycle.
Key Responsibilities
Architecture & Design
Design end-to-end ETL/ELT architecture for ingesting multi-source data (Excel, flat files, APIs)
Build scalable ingestion pipelines using Domo Workbench, APIs, and automated scheduling
Define enterprise-grade data modeling standards (fact/dimension, star schema, conformed dimensions)
Establish naming conventions, dataset versioning, and environment promotion standards
Data Engineering & Transformation
Develop advanced SQL transformations and Magic ETL dataflows
Implement complex business logic including AR aging, denial categorization, payment posting, adjustments, and revenue analytics
Perform data normalization, deduplication, and reconciliation against source systems
Optimize dataflows for performance and refresh efficiency
Implement incremental loads and change data capture (CDC) logic
Automation & Optimization
Automate file ingestion from network/shared drives or secure storage environments
Develop Python-based pre-processing scripts where required
Configure job scheduling, monitoring, logging, and failure alerts
Reduce manual reporting dependencies through process automation
Data Governance & Quality
Implement data validation frameworks and reconciliation checkpoints
Design audit trails for report traceability
Ensure SLA adherence for daily, weekly, and monthly reporting cycles
Establish data quality KPIs and monitoring dashboards
Collaboration & Leadership
Partner with MIS, Operations, and Business stakeholders to translate RCM logic into structured datasets
Mentor junior data engineers and analysts
Conduct technical design reviews and enforce engineering best practices
Drive continuous improvement initiatives in reporting modernization
Required Technical Skills
6+ years of hands-on experience in ETL pipeline development
Advanced SQL expertise (joins, window functions, performance tuning, query optimization)
Strong experience in Domo (Workbench, Magic ETL, SQL Dataflows, Dataset Governance)
Experience handling large structured Excel datasets and automating ingestion
Proficiency in Python for data manipulation (pandas, file automation, scripting)
Strong understanding of data modeling and warehousing concepts
Experience designing incremental and full refresh strategies
Exposure to secure data handling and role-based access control
Preferred Qualifications
Experience in Healthcare / RCM analytics
Familiarity with AR, Denials, Payments, Adjustments, Claims lifecycle
Experience migrating legacy Excel reporting to BI platforms
Knowledge of data orchestration tools or workflow schedulers
Position Requirements
10+ years of work experience