Full-Stack Software Engineer; Data Pipeline & Analytics Focus
Listed on 2026-02-28
Software Development
Data Engineer, Software Engineer
👋🏼 Hey!
Bitwise is a leading provider of mission-focused intelligence solutions that advance national security for the Intelligence Community and Department of Defense. We’re a small and growing company, so you can expect to get in on the ground floor with us and be a consequential member of the team. You'll be more than a contract performer for us – you’ll also be asked for ideas to improve our company and advance your career, and you’ll contribute to our team culture.
We value growth and community above almost all else, so we gather regularly for game nights, happy hours, tech talks, and plenty more. We think you’ll like it here!
Remember, Bitwise is not a cult. 🖖🏼
🎯 What We Look For
Bitwise hires talented engineers who are driven by purpose and who value a culture of technical excellence, growth, and overall wellness. We deliver new and innovative intelligence solutions to our customers at the very forefront of our country’s national security missions. And we do it every single day. Our work matters, and so will you.
We ask that every new hire be able to:
- Contribute meaningful thought leadership — if not right away, over time
- Interact with our customers and earn their confidence in your abilities
- Be detailed, even if that means taking a little more time to get it right
- Contribute your ideas and ideals to improve all aspects of our company
- Allow us to invest in your education so we both grow together
- Know and live our core values every single day
As a Full-Stack Software Engineer with a focus on Data Pipelines & Analytics, you will build and maintain systems that ingest, transform, enrich, and present large-scale data sets in support of mission objectives. Your work will span backend data processing, API development, and user-facing analytics interfaces that enable stakeholders to explore and act on complex information.
This role blends software engineering with data engineering. You will help design scalable ingestion workflows, implement ETL and streaming pipelines, and ensure data integrity across multiple storage technologies. At the same time, you’ll contribute to application layers that expose this data through intuitive visualizations and mission-aligned workflows.
You will work in a collaborative environment alongside analysts, data scientists, DevOps engineers, and other software developers to ensure solutions are performant, reliable, and adaptable to evolving requirements.
Responsibilities will include:
- Maintain an active TS/SCI with Polygraph. Candidates without a current clearance will not be considered.
- Design and implement data ingestion and transformation pipelines
- Develop backend services to process and normalize structured and unstructured data
- Build APIs that expose processed data for mission applications
- Implement ETL and/or streaming workflows using modern dataflow tools
- Develop and maintain data models and schema frameworks
- Integrate with relational, document, graph, and search databases
- Optimize performance for large-scale queries and high-volume data environments
- Ensure data integrity, validation, and monitoring across the pipeline lifecycle
- Collaborate with mission stakeholders to refine data requirements and analytics use cases
Required qualifications:
- Active TS/SCI with Polygraph
- Strong experience in backend development using Python, Java, or similar languages
- Experience building or maintaining data pipelines (ETL and/or streaming)
- Familiarity with JSON and schema-driven data modeling
- Experience working with NoSQL and/or relational databases
- Familiarity with RESTful API design
- Experience with Git-based development workflows
- Comfortable working in Linux development environments
- Ability to manage complex data sets and translate analytical requirements into technical implementations
Don't quite meet these requirements? Peep our other prime positions.
Preferred qualifications:
- Experience with dataflow tools such as Apache NiFi, Kafka, Airflow, or similar
- Experience with Elasticsearch, MongoDB, Redis, graph databases, or similar technologies
- Familiarity with containerization technologies (Docker, Kubernetes)
- Experience with CI/CD pipelines and automated testing
- Experience supporting production CNO…