Data Platform Engineer
Listed on 2026-02-28
IT/Tech
Data Engineer, Data Analyst
Company Description
Tradeweb is a global leader in electronic trading for rates, credit, equities, and money markets. As financial markets become increasingly interconnected, our technology enables efficient, multi-asset trading on a global scale. We serve more than 3,000 clients in more than 85 countries, including many of the world’s largest banks, asset managers, hedge funds, insurers, corporations, and wealth managers.
Creative collaboration and sharp client focus have helped fuel our organic growth. We facilitated average daily trading volume (ADV) of more than $2.2 trillion over the past four fiscal quarters, topping $2.5 trillion in ADV for the first quarter of 2025.
Since our IPO in 2019, Tradeweb has completed four acquisitions and doubled our revenues – and 2024 was our 25th consecutive year of record revenues.
Tradeweb is a great place to work, recognized in 2024 by Forbes as one of America’s Best Companies and by U.S. News & World Report as one of the Best Financial Services Companies to Work For.
Mission:
Move first and never stop.
Collaborate with clients to create and build solutions that drive efficiency, connectivity, and transparency in electronic trading.
Be part of a new function at Tradeweb, providing data engineering support directly to our International Data Science and Analytics teams. You will use your experience to curate datasets, optimize code, and build ETL flows to accelerate the development and deployment of business-critical data and analytics products. This position relies on close alignment and integration with the business-facing teams and provides an opportunity to learn the fundamentals of electronic trading and the data that powers our markets.
Job Responsibilities
- Build and run processes on Tradeweb’s data platform using technologies such as public cloud infrastructure (AWS and GCP), Kafka, databases, and containers
- Enhance Tradeweb’s data science platform, built on open-source software and cloud services
- Build and run ETL pipelines to onboard data into the platform, define schemas, build DAG processing pipelines, and monitor data quality
- Manage and run mission-critical production services
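As a rough illustration of the ETL work described above (a sketch only; the function names, sample schema, and data-quality rules below are hypothetical and do not reflect Tradeweb’s actual platform or tooling), a minimal extract-transform-load flow in plain Python might look like:

```python
# Minimal ETL sketch. All names (extract, transform, load) and the sample
# trade schema are hypothetical illustrations, not Tradeweb's actual platform.

def extract():
    # Stand-in for reading raw records from a source such as a Kafka topic
    # or an S3 object.
    return [
        {"trade_id": "T1", "notional": "1000000", "currency": "USD"},
        {"trade_id": "T2", "notional": "not-a-number", "currency": "EUR"},
    ]

def transform(records):
    """Enforce a simple schema and separate rows that fail data-quality checks."""
    clean, rejected = [], []
    for rec in records:
        try:
            clean.append({
                "trade_id": str(rec["trade_id"]),
                "notional": float(rec["notional"]),  # raises ValueError on bad input
                "currency": str(rec["currency"]),
            })
        except (KeyError, ValueError):
            rejected.append(rec)  # quarantined for monitoring, not silently dropped
    return clean, rejected

def load(records, store):
    # Stand-in for writing to a warehouse table (Snowflake, Redshift, etc.).
    store.extend(records)
    return len(records)

warehouse = []
clean, rejected = transform(extract())
loaded = load(clean, warehouse)
```

In a production DAG each of these steps would typically be a separate task (e.g. in Airflow or Prefect), with the rejected-row count feeding a data-quality alert rather than a local list.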
Job Requirements
- 3+ years of experience in a Data Platform Engineering role
- Strong software engineering experience, particularly in Python
- Strong experience working with SQL and databases/engines such as MySQL, PostgreSQL, SQL Server, Snowflake, Redshift, Presto, etc.
- Experience building ETL and stream processing pipelines using Kafka, Spark, Flink, Airflow/Prefect, etc.
- Familiarity with the data science stack: e.g. Jupyter, Pandas, scikit-learn, Dask, PyTorch, MLflow, Kubeflow, etc.
- Experience using AWS/GCP (S3/GCS, EC2/GCE, IAM, etc.), Kubernetes, and Linux in production
- Strong proclivity for automation and DevOps practices
- Experience managing increasing data volume, velocity, and variety
- Agile self-starter focused on getting things done
- Ability to deal with ambiguity
- Willingness to participate in an on-call rotation outside of regular business hours
- Development skills in C++, Java, Go, Rust
- Understanding of TCP/IP and distributed systems
- Experience managing time series data
- Familiarity with open source communities
We offer a comprehensive range of benefits to support our employees at every stage of life and career. Our programs include enhanced parental leave, family-building and postpartum support through Maven, subsidized gym membership and a wide range of learning and development opportunities, to name a few! While specific offerings may vary by location, our teams will be happy to provide more detailed information about the benefits available in your region as you move through the recruitment process.