
Platform Data Engineer

Job in Bengaluru 560001, Karnataka, India
Listing for: Confidential
Full Time position
Listed on 2026-02-05
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing, Big Data
Salary/Wage Range or Industry Benchmark: INR 300,000 per year
Job Description & How to Apply Below
Location: Bengaluru

Key Responsibilities:
Design & Implement Data Architecture:
Design, implement, and maintain the overall data platform architecture, ensuring the scalability, security, and performance of the platform.
Data Technologies Integration:
Select, integrate, and configure data technologies (cloud platforms like AWS, Azure, and GCP; data lakes; data warehouses; streaming platforms like Kafka; containerization technologies).
Infrastructure Management:
Set up and manage the infrastructure for data pipelines, data storage, and data processing across platforms like Kubernetes and Airflow.
Develop Frameworks & Tools:
Develop internal frameworks to improve the efficiency and usability of the platform for other teams, such as Data Engineers and Data Scientists.
Data Platform Monitoring & Observability:
Implement and manage monitoring and observability for the data platform, ensuring high availability and fault tolerance.
Collaboration:
Work closely with software engineering teams to integrate the data platform with other business systems and applications.
Capacity & Cost Optimization:
Participate in capacity planning and cost optimization for data infrastructure, ensuring efficient utilization of resources.
Tech Stack Requirements:
Apache Iceberg (version 0.13.2):
Experience in managing table formats for scalable data storage.
Apache Spark (version 3.4 and above):
Expertise in building and maintaining batch processing and streaming data processing capabilities.
Apache Kafka (version 3.9 and above):
Proficiency in managing messaging platforms for real-time data streaming.
Role-Based Access Control (RBAC):
Experience with Apache Ranger (version 2.6.0) for implementing and administering security and access controls.
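The core idea Ranger administers — roles granting permitted actions on named resources — can be illustrated with a minimal stdlib-only sketch. This is plain Python, not the Apache Ranger API; the roles, resources, and actions below are invented, and real Ranger policies add wildcards, deny rules, and tag-based access that are not modeled here.

```python
# Toy RBAC check: roles grant sets of permitted actions on resources.
# Illustrative only -- not how Apache Ranger evaluates policies.

POLICIES = {
    # role -> {resource -> permitted actions}
    "data_engineer": {"sales_db.orders": {"read", "write"}},
    "analyst": {"sales_db.orders": {"read"}},
}

def is_allowed(role: str, resource: str, action: str) -> bool:
    """Return True if the role's policy grants the action on the resource."""
    return action in POLICIES.get(role, {}).get(resource, set())

print(is_allowed("analyst", "sales_db.orders", "read"))   # True
print(is_allowed("analyst", "sales_db.orders", "write"))  # False
```

Unknown roles or resources fall through to empty defaults, so the check fails closed — the same default-deny posture a production policy engine enforces.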
RDBMS:
Experience working with near real-time data storage solutions, specifically Oracle (version 19c).
Great Expectations (version 1.3.4):
Familiarity with implementing Data Quality (DQ) frameworks to ensure data integrity and consistency.
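A DQ framework in the Great Expectations style boils down to declarative checks that return a success flag plus diagnostics. A stdlib-only sketch of that idea (this is not the Great Expectations API; the dataset and column names are invented):

```python
# Sketch of a declarative data-quality check, in the spirit of an
# "expectation": run a predicate over rows, report success + failures.

def expect_no_nulls(rows, column):
    """Return a result dict: success flag plus count of null values."""
    failures = sum(1 for r in rows if r.get(column) is None)
    return {"success": failures == 0, "unexpected_count": failures}

orders = [
    {"order_id": 1, "amount": 250.0},
    {"order_id": 2, "amount": None},
]

print(expect_no_nulls(orders, "order_id"))  # success: True
print(expect_no_nulls(orders, "amount"))    # success: False, 1 failure
```

The value of the pattern is that checks are data, not ad-hoc asserts: a suite of them can be versioned, run in a pipeline, and gate promotion of a dataset.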
Data Lineage & Cataloging:
Experience with OpenLineage and DataHub (version 0.15.0) for managing data lineage and catalog solutions.
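OpenLineage models lineage as JSON run events that jobs emit to a collector such as DataHub. A rough stdlib-only sketch of such an event follows — the field names loosely mirror the OpenLineage spec, but the job, namespaces, and dataset names are invented, and a real producer would use an OpenLineage client library rather than hand-built dicts:

```python
import json
import uuid
from datetime import datetime, timezone

# Rough sketch of an OpenLineage-style run event: one job run that
# read a source dataset and wrote a curated one. Names are invented.
event = {
    "eventType": "COMPLETE",
    "eventTime": datetime.now(timezone.utc).isoformat(),
    "run": {"runId": str(uuid.uuid4())},
    "job": {"namespace": "airflow", "name": "daily_orders_load"},
    "inputs": [{"namespace": "oracle://prod", "name": "SALES.ORDERS"}],
    "outputs": [{"namespace": "iceberg://lake", "name": "curated.orders"}],
}

print(json.dumps(event, indent=2))
```

Because every run declares its inputs and outputs, a catalog can stitch events from many jobs into an end-to-end lineage graph.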
Trino (version 470):
Proficiency with query engines for batch processing.
Container Platforms:
Hands-on experience in managing container platforms such as SKE (version 1.29 on AKS).
Airflow (version 2.10.4):
Experience using workflow and scheduling tools for orchestrating and managing data pipelines.
dbt (Data Build Tool):
Proficiency in using ETL/ELT frameworks like dbt for data transformation and automation.
Data Tokenization:
Experience with data tokenization technologies like Protegrity (version 9.2) for ensuring data security.
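Vault-style tokenization — the general approach behind products like Protegrity — replaces each sensitive value with a random surrogate and keeps the mapping in a protected store. A toy stdlib sketch of the idea (this is not Protegrity's actual mechanism; real products add format preservation, key management, and audited access):

```python
import secrets

# Toy vault-based tokenization: each sensitive value maps to a random
# token; only the vault can reverse the mapping. Illustrative only.

class TokenVault:
    def __init__(self):
        self._forward = {}   # value -> token
        self._reverse = {}   # token -> value

    def tokenize(self, value: str) -> str:
        """Return a stable random token for the value, minting one if needed."""
        if value not in self._forward:
            token = "tok_" + secrets.token_hex(8)
            self._forward[value] = token
            self._reverse[token] = value
        return self._forward[value]

    def detokenize(self, token: str) -> str:
        """Map a token back to the original value (vault-side only)."""
        return self._reverse[token]

vault = TokenVault()
t = vault.tokenize("4111-1111-1111-1111")
print(t)                    # e.g. tok_9f2c... (random each run)
print(vault.detokenize(t))  # 4111-1111-1111-1111
```

Unlike encryption, the token carries no mathematical relationship to the original value, so downstream systems can store and join on tokens without ever holding sensitive data.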
Desired Skills:
Domain Expertise:
Familiarity with the Banking domain is a plus, including working with financial data and regulatory requirements.