
Kafka Engineer

Job in Woodlawn, Prince George's County, Maryland, USA
Listing for: Social Security Administration
Full Time position
Listed on 2026-02-10
Job specializations:
  • IT/Tech
    Data Engineer
Job Description

Summary

IT Specialist (APPSW) Kafka Engineer positions are being filled through the Office of Personnel Management's delegated Direct Hire Authority, open to all U.S. citizens. Selections made under this bulletin will be processed as new appointments to the civil service; current civil service employees will therefore receive new appointments. Under the provisions of the Direct Hire Authority, Veterans' Preference and the "Rule of Many" do not apply.

Additional summary details may apply at the time of posting.

Duties

This announcement serves as public notice. Applications submitted will be placed into a pool and will remain on file for selection as positions become available. Vacancies may be filled for up to 6 months after the closing date of this announcement. Applicants may not receive notification of referral status until the full 6-month eligibility period has elapsed. Applicants selected from this pool will be placed into positions where artificial intelligence duties constitute the majority of the work performed.

Qualifications

Resumes exceeding two pages in length will not be considered. Please visit the new resume guidance for more information.

Duties

• Design, develop, and maintain robust Kafka-based applications and data pipelines that support SSA's business operations, including delivering real-time or near-real-time data to AI/ML models.
• Collaborate with development, operations, and infrastructure teams to deliver reliable, scalable, and high-performing Kafka solutions.
• Ensure the availability, reliability, and performance of Kafka clusters and related systems.
• Work closely with architects, data engineers, and stakeholders to define requirements and deliver solutions.
• Troubleshoot and resolve issues in Kafka applications, ensuring minimal downtime and optimal performance.
• Document code, design decisions, processes, configurations, and best practices for future reference and team knowledge sharing.
• Mentor junior developers and share Kafka expertise, fostering a culture of learning and growth.
• Stay current with the latest Kafka releases, features, and ecosystem advancements.
• Perform statistical analysis to monitor team performance, improve processes, and ensure customer satisfaction.
• Define and set SLAs for projects, ensuring high standards of service delivery.

Read this announcement in its entirety; this information is crucial to submitting a successful application. Applicants must qualify for the series and grade of the posted position. Experience must be IT related; the experience may be demonstrated by paid or unpaid experience and/or completion of specific, intensive training (for example, an IT certification).
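The produce/consume pattern behind these duties can be sketched without a live broker. The following is a minimal illustration in Python (listed in this posting as a desired skill) that uses the standard-library `queue` as a stand-in for a Kafka topic; the event fields and function names are hypothetical, and a real pipeline would use a Kafka client library and a broker-backed topic instead:

```python
import json
import queue

# Stand-in for a Kafka topic: in a real pipeline this would be a
# broker-backed topic accessed through a Kafka client library.
topic = queue.Queue()

def produce(event: dict) -> None:
    """Serialize an event and append it to the topic, as a producer
    would when publishing a record."""
    topic.put(json.dumps(event).encode("utf-8"))

def consume() -> dict:
    """Take the next record and deserialize it, as a consumer would
    inside its poll loop."""
    return json.loads(topic.get(timeout=1).decode("utf-8"))

# Near-real-time flow: an event is produced, then handed to a
# downstream consumer (e.g. a feature pipeline for an AI/ML model).
produce({"claim_id": 12345, "status": "received"})
record = consume()
print(record["status"])  # -> received
```

The decoupling shown here — producers and consumers sharing only a serialized record format, never calling each other directly — is what makes the event-driven, microservices-based solutions described in this announcement scalable.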

Your resume must provide sufficient experience and/or education, knowledge, skills, abilities, and proficiency of any required competencies to perform the specific position for which you are applying.

To qualify for the 2210 IT Specialist series, the applicant must demonstrate the following competencies:
Attention to Detail;
Customer Service;
Oral Communication;
Problem Solving.

Minimum Qualifications

Grade 14

To qualify at the GS-14 level, you must have at least 52 weeks of specialized experience at the GS-13 level, or equivalent, including:

• Designing, developing, and maintaining scalable, fault-tolerant data pipelines using Apache Kafka;
• Managing and administering Kafka clusters throughout the Systems Development Life Cycle (SDLC), including upgrades and patching;
• Leading large-scale projects, serving as a Product Owner or Agile/Scrum team lead;
• Demonstrating strong programming skills in Java, with Python experience as a plus;
• Utilizing Kafka APIs (Producer, Consumer, Streams, Connect) for event-driven and microservices-based solutions;
• Applying knowledge of serialization formats (Avro, Protobuf, JSON) and schema registry/data governance, including Hackolade for data modeling;
• Optimizing producer/consumer performance and handling large-scale data ingestion;
• Implementing unit and integration testing for Kafka applications;
• Configuring and tuning Kafka clusters for performance, reliability, and scalability;
• Monitoring and troubleshooting Kafka clusters using tools such as Prometheus and Grafana;
• Supporting hybrid integration architecture patterns.
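Consumer lag — the gap between a partition's log-end offset and the consumer group's committed offset — is the metric most commonly scraped into Prometheus and graphed in Grafana for the kind of cluster monitoring described above. A minimal sketch of the arithmetic (the offset values below are invented for illustration):

```python
def consumer_lag(log_end_offsets: dict, committed_offsets: dict) -> dict:
    """Per-partition lag: records written to each partition that the
    consumer group has not yet committed. Steadily growing lag signals
    a slow or stalled consumer."""
    return {
        partition: log_end_offsets[partition] - committed_offsets.get(partition, 0)
        for partition in log_end_offsets
    }

# Hypothetical offsets for a three-partition topic.
end = {0: 1500, 1: 980, 2: 2100}
committed = {0: 1500, 1: 950, 2: 1700}
lag = consumer_lag(end, committed)
print(lag)  # -> {0: 0, 1: 30, 2: 400}
```

In practice these offsets come from the broker (e.g. via an exporter or the consumer-groups admin tooling) rather than hand-built dictionaries; the subtraction is the same.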

Grade 15

To qualify at the GS-15 level, you must have at least 52 weeks of specialized experience at the GS-14 level, or equivalent, including:

• Leading the design, development, and implementation of enterprise-scale, fault-tolerant data pipelines using Apache Kafka;
• Providing expert-level management and administration of Kafka clusters throughout the Systems Development Life Cycle (SDLC), including upgrades and patching;
• Overseeing large-scale, cross-functional projects as a senior Product Owner or Agile/Scrum leader, ensuring alignment with organizational goals;
• Demonstrating advanced proficiency in Java programming, with experience in Python as a plus;
• Architecting event-driven and microservices-based solutions leveraging Kafka APIs…
