
AI Data Enablement Lead

Job in Stamford, Fairfield County, Connecticut, 06925, USA
Listing for: Ascot Group
Full Time position
Listed on 2026-03-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Salary/Wage Range or Industry Benchmark: 155,000 - 175,000 USD per year
Job Description & How to Apply Below

This is an opportunity to join Ascot Group - one of the world’s preeminent specialty risk underwriting organizations.

Designed as a modern-era company operating through an ecosystem of interconnected global operating platforms, we’re bound by a common mission and purpose:
One Ascot. Our greatest strength is a talented team that flourishes in a collaborative, inclusive, and entrepreneurial culture, steeped in underwriting excellence, integrity, and a passion to find a better way: The Ascot Way.

The Ascot Way guides our people and our organization. Our underwriting platforms collaborate to find creative ways to deploy our capital in a true cross-product and cross-platform approach. These platforms work as one, deploying our capital creatively through our unique Fusion Model:
Client Centric, Risk Centric, Technology Centric.

Built to be resilient, Ascot maximizes client financial security while delivering bespoke products and world-class service, both pre- and post-claims. Ascot exists to solve for our clients’ brightest tomorrow through agility, collaboration, resilience, and discipline.

Position Overview:

Reporting to the Senior AI Data Program Manager, the AI Data Enablement Lead is a senior individual contributor role responsible for the end-to-end delivery of enterprise data solutions on the Databricks Lakehouse platform. This role combines technical leadership with hands‑on data engineering to ensure timely, high‑quality delivery of scalable, secure, and performant data pipelines and analytics solutions. The AI Data Enablement Lead partners with business stakeholders to translate data requirements into actionable delivery plans, leads technical teams across onsite and offshore locations, and ensures adherence to enterprise data standards and best practices.

This role is office-based with a hybrid work schedule.

Responsibilities:

Own end-to-end delivery of data initiatives on the Databricks platform, from design through implementation and production support.

Plan and manage delivery timelines, milestones, dependencies, and risks to ensure successful outcomes.

Coordinate work across onsite and offshore teams to ensure alignment, productivity, and quality delivery.

Act as the primary technical delivery point of contact for business and technology stakeholders.

Provide technical leadership and solution design guidance for Databricks-based data integration and analytics solutions.

Ensure data architecture and implementation align with enterprise standards for scalability, security, and performance.

Enforce best practices for data modeling, ETL/ELT design, analytics, and platform governance.

Design, develop, and manage data pipelines using Databricks, Apache Spark, Azure Data Factory (ADF), and SQL.

Lead ingestion, transformation, and processing of data from enterprise and regional source systems into the data Lakehouse in Databricks.

Review and optimize SQL queries, Spark jobs, and ADF workflows for performance, reliability, and maintainability.

Ensure data quality, accuracy, data lineage, and consistency for downstream analytics and reporting.

Monitor and troubleshoot data pipelines and platform issues to ensure stable operations.

Drive performance tuning, reliability improvements, and operational best practices.

Ensure data security, access controls, and compliance with enterprise governance standards.

Partner with business leaders, data architects, analysts, and data scientists to support analytics and decision‑making needs.

Mentor and guide data engineers, promoting strong engineering discipline and delivery ownership.

Drive continuous improvement, modernization, and innovation across data delivery processes and platforms.

Commitment to The Ascot Way:
Embody The Ascot Way in daily interactions with colleagues, fostering colleague engagement and development, collaboration, inclusivity, and individual accountability.

Requirements:

Bachelor’s degree in Information Systems, Computer Science, Data Analytics, or a related field (or equivalent experience).

Minimum of 7 years of hands‑on experience delivering enterprise data solutions using Databricks.

Hands‑on expertise with Databricks workspaces, Unity Catalog, Apache…
