Data Architect, Motor Vehicle Manufacturing Domain
Listed on 2026-02-28
IT/Tech
Data Engineer, Data Science Manager, Data Analyst
Minimum 11+ years of experience required. Candidates without architect experience should not apply for this position.
Motor vehicle manufacturing domain experience is a must.
Additionally, it is important that each candidate's LinkedIn profile matches their resume, as the customer may review their profiles.
Looking forward to your support in finding the right candidates.
We need only strong resumes, with at least a 90% match to the requirements.
About the Role
We are seeking a skilled Data Architect to lead the migration, analysis, and reporting of service and repair data, including external data sourced from dealers in the truck industry. This role plays a critical part in enabling data-driven decision-making for our Aftersales operations by designing and implementing robust data pipelines, data models, and real-time reporting solutions.
Key Responsibilities
- Design and architect scalable data ingestion pipelines to process service and repair data from multiple sources, including external dealer data streams.
- Ingest real-time data using Confluent Kafka and ensure reliable, high-throughput data flow.
- Utilize Google Dataflow to perform data cleansing, normalization, and transformation operations.
- Model and optimize data storage in Snowflake to support efficient querying and reporting.
- Develop and maintain interactive and insightful dashboards using Tableau to enable business users to monitor Aftersales service performance.
- Implement real-time reporting capabilities by pushing transformed data through APIs to dashboards.
- Collaborate closely with data engineers, analysts, and business stakeholders to understand data requirements and deliver optimal solutions.
- Establish and enforce data governance, quality, and security standards.
- Continuously improve data architecture to support scalability, performance, and evolving business needs.
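To make the ingest-cleanse-normalize flow above concrete, here is a minimal, hedged Python sketch of the kind of per-record transform that might run inside a Dataflow (Apache Beam) job between the Kafka source and the Snowflake sink. All field names (`dealer_id`, `vin`, `repair_code`, `ts`) are illustrative assumptions, not part of the posting; a real dealer feed would follow its agreed schema.

```python
import json
from datetime import datetime, timezone


def normalize_repair_event(raw: bytes) -> dict:
    """Cleanse and normalize one dealer service/repair event.

    Field names are hypothetical; a production job would validate
    against the actual dealer feed schema.
    """
    event = json.loads(raw)
    return {
        # Trim whitespace and canonicalize case on identifiers.
        "dealer_id": str(event["dealer_id"]).strip(),
        "vin": str(event["vin"]).strip().upper(),
        "repair_code": str(event["repair_code"]).strip().upper(),
        # Normalize epoch-millisecond timestamps to ISO-8601 UTC.
        "occurred_at": datetime.fromtimestamp(
            event["ts"] / 1000, tz=timezone.utc
        ).isoformat(),
    }


# In a Beam/Dataflow pipeline this would typically sit in a Map/ParDo step,
# roughly: ReadFromKafka(...) | beam.Map(lambda kv: normalize_repair_event(kv[1]))
# followed by a write to Snowflake or a push to the reporting API.
```

Keeping the transform a pure function, separate from the Beam plumbing, makes it easy to unit-test without a running pipeline.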
Required Qualifications
- Proven experience as a Data Architect or in a similar role in data-intensive environments.
- Strong expertise in data ingestion technologies, especially Confluent Kafka.
- Hands-on experience with Google Dataflow (Apache Beam) for data processing and transformation.
- Deep knowledge of cloud data warehousing concepts, with proficiency in Snowflake.
- Experience creating reports and dashboards using Tableau.
- Solid understanding of data modeling techniques (star schema, snowflake schema, normalized forms).
- Familiarity with API integration for real-time data delivery.
- Strong problem-solving skills and attention to data quality and reliability.
- Excellent communication skills and ability to translate technical concepts for business stakeholders.
Preferred Qualifications
- Experience in the automotive or truck industry, particularly with Aftersales service data.
- Knowledge of additional cloud platforms and data tools (e.g., AWS, GCP, Azure).
- Programming skills in Python, SQL, or Java.
- Understanding of dealer management systems and external data integration challenges.
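As a reference point for the data-modeling qualification above, a star schema for this domain might look like the following Snowflake DDL sketch. This is a non-authoritative illustration: every table and column name here is hypothetical, chosen only to show a fact table surrounded by conformed dimensions.

```sql
-- Illustrative star schema for Aftersales repair reporting (hypothetical names).
CREATE TABLE dim_dealer (
    dealer_key   INTEGER IDENTITY PRIMARY KEY,
    dealer_id    VARCHAR NOT NULL,     -- natural key from the dealer feed
    dealer_name  VARCHAR,
    region       VARCHAR
);

CREATE TABLE dim_vehicle (
    vehicle_key  INTEGER IDENTITY PRIMARY KEY,
    vin          VARCHAR NOT NULL,
    model        VARCHAR,
    model_year   INTEGER
);

CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,  -- e.g. 20231114
    full_date    DATE NOT NULL
);

-- Central fact table: one row per repair event, keyed to the dimensions.
CREATE TABLE fact_repair (
    repair_key   INTEGER IDENTITY PRIMARY KEY,
    dealer_key   INTEGER REFERENCES dim_dealer (dealer_key),
    vehicle_key  INTEGER REFERENCES dim_vehicle (vehicle_key),
    date_key     INTEGER REFERENCES dim_date (date_key),
    repair_code  VARCHAR,
    labor_hours  NUMBER(6,2),
    parts_cost   NUMBER(12,2)
);
```

A snowflake-schema variant would further normalize the dimensions (e.g. splitting `region` into its own table); the star form shown here generally favors simpler, faster reporting queries in Tableau.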