
Kafka Integration Specialist

Job in 261201, Ahmedabad, Uttar Pradesh, India
Listing for: Forward Eye Technologies
Full Time position
Listed on 2026-03-05
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range: INR 500,000 – 1,000,000 per year
Job Description & How to Apply Below
We are seeking a highly skilled Kafka Integration Specialist with extensive experience in designing, developing, and integrating Apache Kafka solutions. You will be responsible for creating and maintaining Kafka-based data pipelines, managing clusters, and ensuring high availability and performance. This role requires a strong understanding of distributed systems and data streaming concepts to deliver robust, real-time integration solutions.

Roles & Responsibilities:
Design, implement, and maintain Kafka-based data pipelines.
Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
Manage Kafka clusters, ensuring high availability, scalability, and performance.
Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
Implement best practices for data streaming, including message serialization, partitioning, and replication.
Monitor and troubleshoot Kafka performance, latency, and security issues.
Ensure data integrity and implement failover strategies for critical data pipelines.

Skills Required:

Strong experience in Apache Kafka (Kafka Streams, Kafka Connect).
Proficiency in programming languages such as Java, Python, or Scala.
Experience with distributed systems and data streaming concepts.
Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
Expertise in creating and managing topics, partitions, and consumer groups.
Hands-on experience with integration tools such as REST APIs, MQ, or ESB.
Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment.

Experience with monitoring tools like Prometheus, Grafana, or Datadog is a plus.
Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation is a plus.
Knowledge of data serialization formats like Avro, Protobuf, or JSON is a plus.

Qualification:

Bachelor's degree in Computer Science, Information Technology, or a related field.