GSET Data Platform Eng - GBM Public - Software Engineer - NYC
Listed on 2026-03-01
Equities Data Platform & Analytics Engineer
As a Data Platform & Analytics Engineer, you will design, build, and maintain a high‑volume, high‑throughput, low‑latency post‑trade stack used to source, transform, persist, and distribute equities trading data to various consumers, including regulatory, compliance, billing, risk, analytics, and client reporting.
This stack is being rebuilt as a cloud‑native platform and presents a once‑in‑a‑generation opportunity to architect high‑volume (multi‑billion messages per day), low‑latency streaming solutions.
This is an analytics‑heavy role: your work drives the routing decisions of trading algorithms, optimizing them for exchange fees and rebates. We are currently integrating agentic AI capabilities into our analytics platform to deepen our client outreach.
You will also be responsible for developing integrated data quality controls and visibility/entitlement platforms that enforce fine‑grained data access and validate data completeness and accuracy in real time.
How will you fulfill your potential?
- Implementing and/or enhancing the data platform ecosystem, which is challenging as it must address scale, resiliency, and performance/throughput with optimal use of resources.
- Working with trading and post‑trade teams, as well as regulatory and compliance operations and coverage teams, to ensure the smooth rollout and migration of any upgrades or changes.
- Liaising with senior stakeholders across the firm to implement technology strategies.
- Working on the latest technologies in high‑performance computing, cloud‑stack, big data & distributed processing.
- Rolling out new agentic AI capabilities to drive operational efficiency and revenue growth.
Basic Qualifications
- Minimum 1 year of work experience in high‑performance server‑side/data processing applications.
- Minimum 1 year of Java or 2 years of Python programming experience.
- Exposure to scripting languages, relational databases (OLTP/OLAP), and cloud services (e.g., Snowflake, SingleStore).
- Clear understanding of algorithms and data structures.
- Familiarity with core programming concepts and techniques (e.g. concurrency, memory management), I/O, performance optimization, high volume, near real‑time processing.
- Comfortable with standard SDLC tools, e.g. version control systems, build & debugging tools.
- Strong written and oral communication skills.
- Highly motivated, committed and capable of working against timelines with minimal guidance.
- Finance domain expertise and exposure to trading or investment banking is preferred but not required.
- Bachelor’s degree / Master’s degree in Computer Science, Computer Engineering or related field.
- Experience with several of the following:
- Large‑scale, distributed enterprise systems – Flink, Kafka, Spark, Hadoop, etc.
- High performance, high availability systems.
- Databases/cloud services, including Snowflake/SingleStore.
- Agentic AI workflows, exposure to working with generative AI models.
The expected base salary for this New York, New York, United States‑based position is $110,000‑$130,000. In addition, you may be eligible for a discretionary bonus if you are an active employee as of fiscal year‑end.
Benefits
Goldman Sachs is committed to providing our people with valuable and competitive benefits and wellness offerings, as these are a core part of a strong overall employee experience. A summary of these offerings, which are generally available to active, non‑temporary, full‑time and part‑time US employees who work at least 20 hours per week, can be found here.