Cloud Developer (Security Clearance Required)
Location: Melbourne, Brevard County, Florida, 32901, USA
Listed on: 2026-02-28
Listing for: Leidos
Position type: Full Time
Job specializations: IT/Tech (Data Engineer, Cloud Computing)
Job Description
Position: Cloud Developer
Location: Melbourne, FL (Patrick SFB)

Make an Impact with Leidos
At Leidos, we deliver innovative solutions through the dedication of our diverse and talented teams. United by a shared commitment to our customers’ success, we empower our people, support our communities, and operate sustainably. Our Mission, Vision, and Values guide every aspect of our work, ensuring we do the right thing for our customers, our people, and the world around us.
Thrive in an Impactful Environment
The Leidos Defense Sector offers a broad portfolio of systems, solutions, and services across land, sea, air, space, and cyberspace. We support critical defense missions with capabilities in enterprise and mission IT, large-scale intelligence systems, command and control, geospatial and data analytics, cybersecurity, logistics, training, and intelligence operations. Our teams tackle the world’s toughest security challenges for customers with “can’t fail” missions.
Leidos Defense Sector is seeking a Cloud Developer to join our engineering team to design, build, and maintain scalable cloud-native solutions. The ideal candidate combines strong serverless development skills (Lambda, API Gateway, Step Functions), hands-on data engineering experience (ETL/ELT, streaming, data lakes), and expertise with XML transformation tools. This role partners with product and data teams to deliver reliable, performant systems that mobilize trusted data for customer services.
Primary Responsibilities:
* Design, develop, and deploy cloud‑native applications and microservices on AWS using serverless and container‑based architectures
* Implement, maintain, and optimize XML transformation workflows using tools such as XSLT, Apache DFDL, Saxon, or Altova MapForce
* Build and maintain data ingestion, transformation, and storage pipelines (batch and streaming) to support analytics and ML workloads
* Develop APIs and backend services using serverless frameworks (Lambda, API Gateway) and event‑driven architectures (SNS, SQS, EventBridge, Kinesis)
* Collaborate with data engineers to model, optimize, and operationalize data workflows across S3, Glue, Redshift, Snowflake, and related platforms
* Automate infrastructure provisioning and deployments using IaC tools (CloudFormation, Terraform, CDK) and CI/CD pipelines
* Monitor, troubleshoot, and optimize system performance, cost, and reliability; implement observability using CloudWatch, X‑Ray, and Prometheus/Grafana
* Apply security best practices across the stack (IAM, KMS, VPC, encryption, least privilege) and ensure compliance with DoD security policies
* Produce and maintain documentation and runbooks; perform code reviews; participate in post‑incident reviews
* Support on‑call rotation and occasional after‑hours work as needed
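To illustrate the serverless API pattern the responsibilities above describe, here is a minimal sketch of an AWS Lambda handler for an API Gateway proxy event, written in Python. All names and the echo behavior are hypothetical, not part of this listing:

```python
import json


def handler(event, context):
    """Minimal AWS Lambda handler for an API Gateway proxy event.

    Hypothetical example: reads an optional 'name' query parameter
    and returns a JSON greeting in the proxy-integration response shape
    (statusCode, headers, and a JSON-encoded body string).
    """
    # API Gateway sends None (not a missing key) when there are no query params.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

In a real deployment this function would be packaged and wired to an API Gateway route via IaC (CloudFormation, Terraform, or CDK, as the listing mentions); the handler itself stays a plain function, which keeps it easy to unit-test locally.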
Basic Qualifications:
* Bachelor’s degree in Computer Science, Engineering, or a related discipline with 4+ years of relevant experience. Additional experience, training, or certifications may substitute for the degree
* U.S. citizenship with an active DoD Top Secret clearance and eligibility for Top Secret/SCI
* IAT Level II certification (e.g., CompTIA Security+) or ability to earn within 6 months of hire
* 2+ years of hands‑on AWS experience (Lambda, API Gateway, S3, IAM, RDS/DynamoDB, VPC, CloudWatch)
* Strong programming skills in at least one serverless‑friendly language (Python, Node.js/TypeScript, Java, or Go)
* Experience building data pipelines and working with data stores (S3, Redshift, DynamoDB, RDS)
* Familiarity with IaC tools (Terraform, CloudFormation, CDK) and CI/CD pipelines
* Hands‑on experience with version control tools and platforms such as Git, GitLab, and GitHub
* Knowledge of RESTful API design, event‑driven architectures, and asynchronous processing
* Experience with observability and troubleshooting in production environments
Preferred Qualifications:
* Experience with streaming technologies (Kinesis, Kafka, Glue Streaming)
* Hands‑on experience with Apache NiFi, including custom processors, flow versioning, and integration with Kafka, HDFS/S3, and downstream ETL systems
* Familiarity with serverless orchestration (Step Functions)…