
Senior Data Architect

Job in Charlotte, Mecklenburg County, North Carolina, 28245, USA
Listing for: CRC Insurance Services, Inc.
Full Time position
Listed on 2026-01-24
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager
Salary/Wage Range or Industry Benchmark: 100,000 - 125,000 USD per year
Job Description & How to Apply Below

Regular or Temporary:
Regular

Language Fluency:
English (Required)

Work Shift:

1st Shift (United States of America)

If you have a disability and need assistance with the application, you can request a reasonable accommodation. Send an email to Accessibility () (accommodation requests only; other inquiries won't receive a response).

Job Description

We are seeking an experienced Data Architect with deep expertise in Databricks and a strong understanding of the insurance industry. The ideal candidate will design and implement scalable data architectures that support advanced analytics, reporting, and AI/BI/ML initiatives across Broking, Binding, underwriting, claims, actuarial, and policy management functions.

Key Responsibilities

Data Architecture & Strategy
  • Design and implement enterprise data architecture leveraging the Databricks Lakehouse, Delta Lake, and Azure cloud-native services.
  • Define data integration, modeling, and governance frameworks tailored to insurance data domains such as agency, carrier, policy, claims, invoicing and billing, and reinsurance.
  • Create scalable and secure data pipelines that handle structured and unstructured data from internal and external sources.
  • Author and maintain the technology roadmap, with mappings to business capabilities.
  • Experiment with emerging capabilities in Databricks and the broader data technology landscape, and establish adoption blueprints for them.
  • Secure data with the appropriate NIST cybersecurity controls.
Data Engineering & Analytics Enablement
  • Lead the design of ETL/ELT pipelines using PySpark, SQL, Delta Live Tables, Lakeflow Connect, Auto Loader, Lakeflow Declarative Pipelines, dbt, and Databricks Workflows (a minimal sketch follows this list).
  • Partner with actuarial, underwriting, and BI teams to design semantic layers and analytics‑ready datasets.
  • Optimize performance and cost of Databricks clusters and workflows.
  • Enable machine learning and predictive analytics by providing clean, governed, and feature‑rich data sets.
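For illustration only, the following is a minimal PySpark sketch of the kind of incremental ingest pipeline referenced above, using Databricks Auto Loader to land data into a Delta table. The landing path, schema/checkpoint locations, column names, and target table are hypothetical examples, not details from this posting.

  # Minimal Auto Loader -> Delta Lake ingest sketch (illustrative names only).
  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

  raw_claims = (
      spark.readStream.format("cloudFiles")                         # Auto Loader source
      .option("cloudFiles.format", "json")
      .option("cloudFiles.schemaLocation", "/mnt/schemas/claims")   # hypothetical path
      .load("/mnt/landing/claims/")                                 # hypothetical landing zone
  )

  curated = (
      raw_claims
      .withColumn("ingested_at", F.current_timestamp())             # capture load time
      .dropDuplicates(["claim_id"])                                 # basic de-duplication on a hypothetical key
  )

  (
      curated.writeStream
      .option("checkpointLocation", "/mnt/checkpoints/claims")      # hypothetical path
      .trigger(availableNow=True)                                   # incremental, batch-style run
      .toTable("insurance.silver.claims")                           # hypothetical Unity Catalog table
  )

In practice the same pattern would extend to policy, premium, and billing feeds, with scheduling and dependencies handled by Databricks Workflows.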
Governance & Quality
  • Review solutions developed by divisional teams and vendors to ensure they use modern technology, are built to scale, are resilient, and are cost-effective to operate.
  • Implement data quality, lineage, and metadata management frameworks using tools such as Unity Catalog, Collibra, or Alation (a brief sketch follows this list).
  • Establish and enforce data security and compliance policies aligned with insurance regulations (e.g., NAIC, Privacy, HIPAA, GDPR, CFIUS).
  • Ensure consistency of master data across operational and analytical systems.
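To ground the Unity Catalog work mentioned above, here is a short sketch of table-level access control, column documentation, and tagging run from a Databricks notebook. The catalog, schema, table, group, and tag names are assumptions for illustration only.

  # Unity Catalog governance sketch (hypothetical names; assumes a Databricks
  # workspace attached to Unity Catalog).
  from pyspark.sql import SparkSession

  spark = SparkSession.builder.getOrCreate()

  # Grant read access on a curated table to an analyst group.
  spark.sql("GRANT SELECT ON TABLE insurance.silver.claims TO `actuarial-analysts`")

  # Document a key column so catalog search and lineage stay meaningful.
  spark.sql(
      "ALTER TABLE insurance.silver.claims "
      "ALTER COLUMN claim_id COMMENT 'Business key from the policy admin system'"
  )

  # Tag the table for data-domain and sensitivity classification.
  spark.sql(
      "ALTER TABLE insurance.silver.claims "
      "SET TAGS ('data_domain' = 'claims', 'sensitivity' = 'confidential')"
  )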
Collaboration & Leadership
  • Collaborate with business leaders, data engineers, divisional data teams, data visualization experts, and the platform lead to align architecture with organizational goals.
  • Mentor teams on best practices for data modeling, Databricks optimization, and cloud data architecture.
  • Evaluate new technologies to continuously improve data strategy and capabilities.
Required Qualifications
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
  • 7+ years of experience in data architecture or data engineering, with at least 4 years in Databricks.
  • Strong experience with insurance data models, including policy, claims, premium, fees, agency, underwriting, accounting, and insured domains.
  • Expertise in cloud platforms (AWS, Azure, or GCP) and modern data lakehouse architecture.
  • Proficiency in SQL, PySpark, Delta Lake, and Databricks SQL.
  • Experience integrating with BI tools (Power BI, Tableau, Looker) and data governance tools.
  • Excellent communication and stakeholder management skills.
Preferred Skills
  • Experience with streaming data frameworks (Kafka, Delta Live Tables).
  • Familiarity with AI/ML pipelines in Databricks.
  • Certification(s):
      • Databricks Certified Data Engineer Professional
      • Azure/AWS Certified Data Architect
      • Insurance Data Management Association (IDMA) certifications
Benefits

At CRC Group, we're committed to supporting every aspect of teammates' well-being – physical, emotional, financial, social, and professional. Eligible full-time teammates enjoy access to medical, dental, vision, life, disability, and AD&D insurance; tax-advantaged savings accounts; and a 401(k) plan with company match. CRC Group also offers generous paid time off programs,…

Position Requirements
10+ years of work experience