Overseas Contractor Job in Decatur, Macon County, Illinois, 62523, USA
Listed on 2026-03-09
Listing for: LTM
Contract position
Job specializations:
- IT/Tech: Data Engineer, Data Analyst, Data Science Manager, Cloud Computing
Job Description & How to Apply Below
About Us:
LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to 700+ clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree - a Larsen & Toubro Group company - combines the industry-acclaimed strengths of erstwhile Larsen and Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit .
Job Title: Azure Databricks Lead
Location: Decatur, IL (onsite)
Job Summary
We are looking for a Data Replication Specialist with expertise in Microsoft Fabric and Databricks Lakeflow Connect to design, implement, and manage data replication and synchronization solutions across multiple data environments. The ideal candidate will ensure that business-critical data is consistently available, accurate, and up to date across systems for analytics, reporting, and operational needs.
Key Responsibilities
- Design and implement data replication strategies using Microsoft Fabric Copy Data and Mirroring features.
- Configure and maintain replication pipelines between on-premises and cloud-based data sources (e.g., SQL Server, Azure SQL, Synapse, Data Lake, Power BI).
- Monitor and optimize replication performance, latency, and reliability.
- Troubleshoot replication failures, data drift, and synchronization issues.
- Collaborate with data engineers, architects, and analytics teams to align replication processes with business and data governance requirements.
- Develop and maintain automation scripts for replication monitoring and alerting using PowerShell, Python, or REST APIs.
- Document replication architectures, data flow diagrams, and operational procedures.
- Implement security, compliance, and data governance standards within replicated environments.
- Stay updated on Microsoft Fabric enhancements and best practices for data replication and integration.
- Design, develop, and maintain data ingestion pipelines using Databricks and Lakeflow Connect.
- Integrate data from various structured and unstructured sources into Delta Lake and other data storage systems.
- Implement real-time and batch ingestion workflows to support analytics and reporting needs.
- Optimize data ingestion performance, ensuring scalability, reliability, and cost efficiency.
- Collaborate with data architects, analysts, and business stakeholders to define data requirements and ingestion strategies.
- Ensure data quality, lineage, and governance compliance across the ingestion process.
- Automate data ingestion monitoring, alerting, and error-handling mechanisms.
- Stay up to date with emerging Databricks Lakehouse and data integration technologies and best practices.
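The Delta Lake ingestion work described above centers on upsert (MERGE) semantics: an incoming batch either updates a matched row or inserts a new one. A minimal sketch in plain Python, where the `merge_batch` helper, the record shape, and the `id` key are all hypothetical stand-ins for a Delta table:

```python
# Sketch of upsert (MERGE) semantics as used in Delta Lake ingestion.
# The record shape and the "id" key are hypothetical illustrations;
# a plain dict stands in for the Delta table to keep this self-contained.

def merge_batch(target: dict, batch: list, key: str = "id") -> dict:
    """Upsert each record in `batch` into `target`, keyed by `key`.

    Mirrors MERGE INTO ... WHEN MATCHED THEN UPDATE /
    WHEN NOT MATCHED THEN INSERT.
    """
    for record in batch:
        # Merge onto any existing row so unspecified columns are preserved.
        target[record[key]] = {**target.get(record[key], {}), **record}
    return target

# Existing target table state
table = {1: {"id": 1, "name": "alpha", "qty": 10}}

# Incoming batch: one update (id 1) and one insert (id 2)
batch = [
    {"id": 1, "qty": 12},                 # matched -> update
    {"id": 2, "name": "beta", "qty": 5},  # not matched -> insert
]

merge_batch(table, batch)
```

In Databricks this would typically be a single `MERGE INTO` SQL statement or a `DeltaTable.merge` call rather than hand-rolled Python.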
Required Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 4 years of experience in data engineering or ETL development.
- Hands-on experience with Databricks (SQL, PySpark, Delta Lake).
- Proficiency with Lakeflow Connect for building and managing data ingestion workflows.
- Strong understanding of data integration patterns, data modeling, and data lakehouse architectures.
- Experience with cloud platforms (Azure, AWS, or GCP) and associated data services.
- Knowledge of CI/CD, version control (Git), and infrastructure-as-code practices.
- Familiarity with data governance, security, and compliance standards.
Required Skills & Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 3-5 years of experience in data replication, ETL/ELT, or data integration.
- Hands-on experience with Microsoft Fabric, Azure Data Factory, or Synapse Pipelines.
- Strong understanding of Copy Data and Mirroring capabilities in Microsoft Fabric.
- Proficiency in SQL and scripting languages (PowerShell, Python).
- Experience working with Azure Data Lake, OneLake, and Power BI.
- Knowledge of change data capture (CDC), incremental loads, and real-time…
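The incremental-load pattern named above is commonly implemented with a high watermark: each run replicates only rows modified since the last recorded watermark, then advances it. A minimal sketch, where the `updated_at` column and the in-memory rows are hypothetical:

```python
# Sketch of a high-watermark incremental load: replicate only rows changed
# since the last successful run. The "updated_at" column is a hypothetical
# change-tracking column, not a specific Fabric or Databricks API.

def incremental_load(rows: list, last_watermark: int):
    """Return (changed_rows, new_watermark) for rows after last_watermark."""
    changed = [r for r in rows if r["updated_at"] > last_watermark]
    # If nothing changed, the watermark stays where it was.
    new_watermark = max((r["updated_at"] for r in changed), default=last_watermark)
    return changed, new_watermark

source = [
    {"id": 1, "updated_at": 100},
    {"id": 2, "updated_at": 205},
    {"id": 3, "updated_at": 310},
]

# Previous run covered everything through t=200; this run picks up the rest.
changed, watermark = incremental_load(source, last_watermark=200)
```

In a real pipeline the watermark would be persisted (e.g., in a control table) between runs, and the filter would be pushed down to the source query rather than applied in memory.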