
Databricks Architect with MDM exp.

Job in Woodbridge Township, Middlesex County, New Jersey, USA
Listing for: VeridianTech
Full-time position
Listed on 2026-01-11
Job specializations:
  • IT/Tech: Data Engineer
Job Description

Job Title: Databricks Architect with MDM exp.
Location: Iselin, NJ (Hybrid)
Duration: 12 months

Job Summary

We are seeking a highly skilled Databricks Architect with strong expertise in MDM, SQL, Python, data warehousing, and cloud ETL tools to join our data team. The ideal candidate will design, implement, and optimize large-scale data pipelines, ensuring scalability, reliability, and performance. This role involves working closely with multiple teams and business stakeholders to deliver cutting-edge data solutions.

Key Responsibilities
  • Data Pipeline Development
    • Build and maintain scalable ETL/ELT pipelines using Databricks (see the sketch after this list).
    • Leverage PySpark/Spark and SQL to transform and process large datasets.
    • Integrate data from multiple sources, including Azure Blob Storage, ADLS, and other relational/non-relational systems.
  • Collaboration & Analysis
    • Work closely with multiple teams to prepare data for dashboards and BI tools.
    • Collaborate with cross‑functional teams to understand business requirements and deliver tailored data solutions.
  • Performance & Optimization
    • Optimize Databricks workloads for cost efficiency and performance.
    • Monitor and troubleshoot data pipelines to ensure reliability and accuracy.
  • Governance & Security
    • Implement and manage data security, access controls and governance standards using Unity Catalog.
    • Ensure compliance with organizational and regulatory data policies.
  • Deployment
    • Leverage Databricks Asset Bundles for seamless deployment of Databricks jobs, notebooks and configurations across environments.
    • Manage version control for Databricks artifacts and collaborate with the team to maintain development best practices.
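
As a rough illustration of the pipeline and governance responsibilities above, here is a minimal PySpark sketch of a Databricks ETL job; the storage path, table name, and group name are placeholders chosen for illustration, not details taken from this posting.

    # Minimal ETL sketch for a Databricks notebook or job.
    # `spark` is provided by the Databricks runtime; all paths and names below are hypothetical.
    from pyspark.sql import functions as F

    raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/customers/"   # hypothetical ADLS source
    target_table = "main.mdm.customers_clean"                                 # hypothetical Unity Catalog table

    # Ingest raw JSON files from ADLS.
    raw_df = spark.read.format("json").load(raw_path)

    # Basic cleansing/standardization typical of an MDM-style pipeline.
    clean_df = (raw_df
                .dropDuplicates(["customer_id"])
                .withColumn("email", F.lower(F.trim(F.col("email"))))
                .withColumn("load_ts", F.current_timestamp()))

    # Persist as a Delta table registered in Unity Catalog.
    clean_df.write.format("delta").mode("overwrite").saveAsTable(target_table)

    # Governance: grant read access through Unity Catalog (group name is a placeholder).
    spark.sql(f"GRANT SELECT ON TABLE {target_table} TO `data_analysts`")

In practice, a job like this would typically be scheduled as a Databricks job and promoted across environments with Databricks Asset Bundles, per the Deployment responsibilities above.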
Technical Skills
  • Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse architecture, table triggers, Delta Live Tables, Databricks Runtime, etc.).
  • Proficiency in Azure cloud services.
  • Solid understanding of Spark and PySpark for big data processing.
  • Experience with relational databases.
  • Knowledge of Databricks Asset Bundles and GitLab.

Preferred Experience
  • Familiarity with Databricks Runtimes and advanced configurations.
  • Knowledge of streaming frameworks such as Spark Streaming (see the streaming sketch below).
  • Experience developing real-time data solutions.
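
For the streaming items above, the following is a minimal Spark Structured Streaming sketch using Databricks Auto Loader; the source path, checkpoint location, and table name are illustrative assumptions rather than requirements stated in this posting.

    # Minimal Structured Streaming sketch (incremental ingestion into a Delta table).
    # `spark` is provided by the Databricks runtime; all paths and names are hypothetical.
    events_path = "abfss://events@examplestorage.dfs.core.windows.net/clickstream/"
    checkpoint_path = "/Volumes/main/mdm/checkpoints/clickstream"
    target_table = "main.mdm.clickstream_bronze"

    stream_df = (spark.readStream
                 .format("cloudFiles")                          # Databricks Auto Loader
                 .option("cloudFiles.format", "json")
                 .option("cloudFiles.schemaLocation", checkpoint_path)
                 .load(events_path))

    (stream_df.writeStream
     .option("checkpointLocation", checkpoint_path)
     .trigger(availableNow=True)                                # process available files, then stop
     .toTable(target_table))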