
Software Engineer, Data Scientist, AI Engineer

Job in New York, New York County, New York, 10261, USA
Listing for: Keru.ai
Full Time position
Listed on 2026-01-20
Job specializations:
  • Software Development
    Data Scientist, AI Engineer, Software Engineer
Job Description & How to Apply Below
Location: New York

About Kepler

Kepler is building a transparent, intelligent deep-research platform.

Financial professionals spend 60-70% of their time manually gathering and consolidating data in a $26.5 trillion industry where speed and accuracy directly impact outcomes. The research landscape has fragmented into dozens of specialized systems — analysts toggle between platforms for financials, transcripts, market data, and macro indicators, reviewing hundreds of documents across disconnected sources for a single investment thesis.

Generic AI tools promise efficiency but fail the trust test. They hallucinate data, confabulate reports, and provide insights without showing their work, forcing analysts back into manual verification. In an industry where being wrong costs millions, opacity isn't acceptable.

Kepler solves this by automating research while maintaining the accuracy and traceability financial decisions demand. The result: faster decisions, deeper analysis, and a competitive advantage where synthesizing information more thoroughly than competitors translates directly to performance.

Kepler was founded by two Palantir veterans with 20 years of combined experience building core parts of Palantir's Gotham and Foundry platforms. Our founders created Palantir Quiver, the analytics engine behind $100M+ enterprise deals with BP and Airbus, architected core compute and data systems, led major Department of Defense projects, and served as Head of Business Engineering at Citadel.

We're backed by founders of OpenAI, Facebook AI, MotherDuck, dbt, and Outerbounds.

The Role

As a Software Engineer at Kepler, you'll architect and build the intelligent backend infrastructure that powers our autonomous AI research agents. You'll design the core orchestration systems that coordinate multiple specialized AI agents, manage complex multi-step research workflows, and ensure reliable execution of mission-critical financial analysis tasks.

Within your first 90 days, you will:

  • Design and ship your first multi-agent workflow system into production with senior mentorship

  • Build and deploy an agent orchestration service that handles real financial research tasks

  • See your code power autonomous research workflows at top financial institutions

  • Take ownership of a core agent system from architecture to production deployment

This role is ideal for engineers who want to build foundational agentic infrastructure at the intersection of AI and finance, where robust system architecture enables autonomous agents to augment human decision-making at enterprise scale.

What You'll Do
  • Agent Orchestration & Workflow Engineering: Design and implement sophisticated agent coordination systems that manage complex, multi-step research workflows. Build state machines, task queues, and execution engines that coordinate specialized AI agents across diverse data sources and analysis tasks.

  • Multi-Agent System Architecture: Architect scalable systems for agent communication, task delegation, and result synthesis. Implement patterns for agent specialization, load balancing, and dynamic resource allocation across research workflows.

  • Autonomous Task Execution: Build robust execution frameworks that handle long-running, multi-phase research tasks with automatic retry logic, error recovery, and graceful degradation. Ensure agents can autonomously navigate complex decision trees while maintaining human oversight capabilities.

  • AI Model Integration & Management: Integrate and orchestrate multiple language models (GPT, Claude, specialized financial models) with intelligent routing, fallback mechanisms, and cost optimization. Build abstraction layers that allow seamless model swapping and A/B testing.

  • Real-Time Data Pipeline Architecture: Design high-throughput data ingestion systems that process streaming financial data, SEC filings, news feeds, and alternative datasets. Build event-driven architectures that trigger agent workflows based on real-time market events.

  • Agent Memory & Context Management: Implement sophisticated memory systems that allow agents to maintain context across long research sessions, learn from past interactions, and build upon previous analysis. Design vector…
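To give a concrete flavor of the orchestration and autonomous-execution work described above, here is a minimal sketch of a task queue that runs agent steps in order, retries failures with exponential backoff, and degrades gracefully when retries are exhausted. All names (`AgentTask`, `run_workflow`), field choices, and retry limits are illustrative assumptions for this posting, not Kepler's actual implementation.

```python
import queue
import time

class AgentTask:
    """One step in a research workflow, handled by a specialized agent."""
    def __init__(self, name, handler, max_retries=3):
        self.name = name
        self.handler = handler          # callable taking prior results dict
        self.max_retries = max_retries  # total attempts allowed
        self.attempts = 0

def run_workflow(tasks):
    """Execute tasks in order; retry failures with backoff, else record None."""
    results = {}
    pending = queue.Queue()
    for t in tasks:
        pending.put(t)
    while not pending.empty():
        task = pending.get()
        try:
            task.attempts += 1
            # Each handler sees results of earlier tasks, enabling chaining.
            results[task.name] = task.handler(results)
        except Exception:
            if task.attempts < task.max_retries:
                time.sleep(0.01 * 2 ** task.attempts)  # exponential backoff
                pending.put(task)                      # requeue for retry
            else:
                results[task.name] = None              # graceful degradation
    return results
```

A workflow like "fetch filings, then summarize them" would be expressed as two `AgentTask`s, where the second reads the first's output from the shared `results` dict.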
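The model-integration bullet (intelligent routing with fallback mechanisms) can likewise be sketched in a few lines: try providers in preference order and fall through on failure. The provider names and `call` functions below are stand-ins, not real API clients.

```python
class ModelRouter:
    """Route a prompt to an ordered list of model providers with fallback."""
    def __init__(self, providers):
        # providers: ordered (name, callable) pairs, e.g. cheapest first
        self.providers = providers

    def complete(self, prompt):
        errors = {}
        for name, call in self.providers:
            try:
                return name, call(prompt)   # first success wins
            except Exception as exc:
                errors[name] = exc          # fall through to next provider
        raise RuntimeError("all providers failed: %s" % list(errors))
```

Because providers sit behind one interface, swapping models or A/B testing routing orders reduces to reordering the list.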
