
Confluent Kafka Engineer

Job in Woodlawn, Prince George's County, Maryland, USA
Listing for: Galaxy Infotech INC
Full Time position
Listed on 2026-03-14
Job specializations:
  • IT/Tech
    Data Engineer, Big Data
Salary/Wage Range: 80,000 - 100,000 USD per year
Job Description & How to Apply Below
Position: CONFLUENT KAFKA ENGINEER

  • T+S USC
  • Local candidates only; the position is on-site
  • LinkedIn profile required
  • Must have active or prior Public Trust Clearance.
Description

Seeking a Confluent Kafka Engineer to work on-site in Woodlawn, Maryland. The engineer will provide expertise in the development, testing, and production support of Confluent Kafka-based systems, collaborating closely with cross-functional teams to keep data streaming services running smoothly. The role requires deep knowledge of Kafka architecture, including Confluent Control Center, Kafka Streams, and Kafka Connect.

Responsibilities
  • Design Confluent Kafka cluster environments, configure and manage Kafka instances, and monitor system performance.
  • Ensure data integrity and availability in a big data environment.
  • Apply expertise in a programming language, such as Java or Python.
  • Collaborate with product design teams and SMEs to understand data pipeline needs.
  • Participate in all Agile ceremonies.
  • Write and maintain high-quality code for Kafka producers, consumers, and stream processing applications.
  • Develop and manage Kafka connectors for seamless integration with external systems, ensuring data consistency and reliability.
  • Utilize Kafka Streams for real-time processing of streaming data, transforming and enriching data as it flows through the pipeline.
  • Employ ksqlDB for stream processing tasks, including real-time analytics and transformations.
  • Collaborate with data engineers, software developers, and DevOps teams to integrate Kafka solutions with existing systems.
  • Ensure all Kafka-based solutions are scalable, secure, and optimized for performance.
  • Troubleshoot and resolve issues related to Kafka performance, latency, and data integrity, including issues specific to Kafka Streams, ksqlDB, and Kafka Connect.
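To give a flavor of the ksqlDB work described above, here is a minimal sketch of a real-time transformation; the stream names, topic, and schema are hypothetical illustrations, not details from this posting:

```sql
-- Hypothetical example: register a stream over an existing Kafka topic,
-- then derive a filtered, enriched stream from it in real time.
CREATE STREAM orders_raw (
  order_id VARCHAR KEY,
  customer_id VARCHAR,
  amount DOUBLE,
  ts BIGINT
) WITH (
  KAFKA_TOPIC = 'orders',   -- assumed pre-existing topic
  VALUE_FORMAT = 'JSON',
  TIMESTAMP = 'ts'
);

-- Continuous query: keep only large orders and add a derived column.
CREATE STREAM large_orders AS
  SELECT order_id,
         customer_id,
         amount,
         amount * 0.07 AS estimated_tax
  FROM orders_raw
  WHERE amount > 1000
  EMIT CHANGES;
```

The second statement is a persistent query: ksqlDB runs it continuously, writing matching records to a new Kafka topic as they arrive.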
Requirements

Minimum Education and Years of Experience:

  • Bachelor's degree in Computer Science, Information Technology, or a related field, plus 10+ years of experience in a technical field.
    • A technical Master's or Doctorate degree may substitute for 5 years of the required experience.

Minimum Skills:

  • Software development experience with a solid understanding of building, deploying, and maintaining applications that leverage the Confluent Kafka platform, focusing on data streaming and messaging solutions.
  • 5+ years of experience on an Agile development team
  • Extensive experience with Apache Kafka and Confluent Kafka, including proficiency with Confluent Control Center, ksqlDB, Kafka Streams, and Kafka Connect.
  • Proven experience in Kafka development, including producer and consumer API, stream processing, and connector development.
  • Experience with Kafka cluster management, including setup, configuration, monitoring, and troubleshooting.
  • Familiarity with distributed systems, microservices architecture, and event-driven design patterns.
  • Experience with AWS and containerization (Kubernetes) is a plus.
  • Proficiency in programming languages, such as Java.
  • Strong knowledge of Kafka internals, including brokers, ZooKeeper, topics, partitions, and offsets.
  • Experience with monitoring tools (e.g., Prometheus, Grafana) and logging frameworks (e.g., Log4j).
  • Proficiency in using Confluent Control Center for monitoring, managing, and optimizing Kafka clusters.
  • Expertise in Kafka Streams for building scalable, fault-tolerant stream processing applications.
  • Experience with ksqlDB for real-time processing and analytics on Kafka topics.
  • Strong understanding of Kafka Connect for integrating Kafka with external data sources and sinks.
  • Understanding of networking, security, and compliance aspects related to Kafka.
  • Familiarity with CI/CD pipelines and automation tools (e.g., Jenkins, Git).
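As an illustration of the Kafka Connect integration work listed above, a source-connector configuration might resemble the sketch below; the connector name, database URL, credentials, and table are hypothetical assumptions, though the connector class and property names are those of Confluent's standard JDBC source connector:

```json
{
  "name": "jdbc-orders-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://db.example.com:5432/sales",
    "connection.user": "connect_user",
    "connection.password": "********",
    "mode": "incrementing",
    "incrementing.column.name": "order_id",
    "table.whitelist": "orders",
    "topic.prefix": "pg-",
    "tasks.max": "1"
  }
}
```

Posted to the Connect REST API, this config streams new rows from the `orders` table into the `pg-orders` Kafka topic, tracking progress by the incrementing `order_id` column.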
Desired
  • Experience in an AWS environment.
  • Experience with Hadoop or another big data platform.
  • Excellent troubleshooting and analytical skills to quickly identify and resolve issues.
  • Proficiency in software development, preferably in Java.
  • Experience working on Agile projects and understanding Agile terminology.
  • Willingness to participate in daily scrums and provide updates.