
Software Engineer II - Data Engineer

Job in Seattle, King County, Washington, 98127, USA
Listing for: The Trade Desk
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range or Industry Benchmark: USD 125,000 - 150,000 per year
Job Description


The Trade Desk is a global technology company with a mission to create a better, more open internet for everyone through principled, intelligent advertising. Handling over 1 trillion queries per day, our platform operates at an unprecedented scale. We have also built something even stronger and more valuable: an award‑winning culture based on trust, ownership, empathy, and collaboration. We value the unique experiences and perspectives that each person brings to The Trade Desk, and we are committed to fostering inclusive spaces where everyone can bring their authentic selves to work every day.

Do you have a passion for solving hard problems at scale? Are you eager to join a dynamic, globally‑connected team where your contributions will make a meaningful difference in building a better media ecosystem? Come and see why Fortune magazine consistently ranks The Trade Desk among the best small‑to‑medium‑sized workplaces globally.

What We Do

This specialized role sits within the Technology Operations group of The Trade Desk’s Engineering organization, which is focused on delivering world‑class solutions for enterprise needs across The Trade Desk.

What you’ll do
  • Data Pipeline Development:
    Design, build, and optimize scalable ETL/ELT pipelines for both batch and real‑time data processing from disparate sources.
  • Infrastructure Management:
    Assist in the design and implementation of data storage solutions, including data warehouses and data lakes (e.g., Snowflake, S3, Spark), ensuring they are optimized for performance and cost efficiency.
  • Data Quality and Governance:
    Implement data quality checks, monitor data pipeline performance, and troubleshoot issues to ensure data accuracy, reliability, and security, adhering to compliance standards (e.g., GDPR, CCPA).
  • Collaboration:
    Work closely with product managers, data scientists, business intelligence analysts, and other software engineers to understand data requirements and deliver robust solutions.
  • Automation and Optimization:
    Automate data engineering workflows using orchestration tools (e.g., Apache Airflow, Dagster, Azure Data Factory) and implement internal process improvements for greater scalability; see the sketch after this list.
  • Mentorship:
    Participate in code reviews and provide guidance or mentorship to junior team members on best practices and technical skills.
  • Documentation:
    Produce comprehensive and usable documentation for datasets, data models, and pipelines to ensure transparency and knowledge sharing across teams.
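
To make the responsibilities above concrete, here is a minimal, hypothetical sketch of the kind of orchestrated batch pipeline this role describes, written against Apache Airflow (one of the tools named above). The DAG name, tasks, data, and quality check are illustrative placeholders only, not a description of The Trade Desk's actual stack or pipelines.

# Minimal Airflow sketch: extract -> data-quality gate -> load.
# All names and data below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull a daily batch from an upstream source (e.g., S3).
    return [{"impression_id": 1, "spend_usd": 0.42}]


def check_quality(ti, **context):
    # Simple data-quality gate: fail the run if the batch came back empty.
    rows = ti.xcom_pull(task_ids="extract")
    if not rows:
        raise ValueError("Empty batch - refusing to load")


def load(ti, **context):
    # Placeholder: write the validated batch to the warehouse (e.g., Snowflake).
    rows = ti.xcom_pull(task_ids="extract")
    print(f"Loading {len(rows)} rows")


with DAG(
    dag_id="daily_impressions_etl",   # hypothetical pipeline name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",                # Airflow 2.4+; use schedule_interval on older versions
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    quality_task = PythonOperator(task_id="check_quality", python_callable=check_quality)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> quality_task >> load_task

In practice the extract and load steps would call real connectors (S3, Spark, Snowflake), and the quality gate would check schemas, row counts, and freshness rather than a single emptiness test; the point of the sketch is the shape of the work, not its implementation.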
Who you are
  • Bachelor's degree in computer science, information security, or a related field, or equivalent work experience. Master’s degree preferred.
  • 4+ years of experience in a data engineering role, with a broad understanding of data modeling, SQL, OLAP, and ETL. Experience with data pipelines, including data modeling at petabyte scale, is a bonus.
  • 4+ years of experience designing and implementing data and analytics solutions across multiple database platforms, using technologies such as Snowflake, Databricks, Vertica, SQL Server, and MySQL.
  • 4+ years of experience in one or more programming languages, particularly SQL. Proficiency in PL/SQL, Python, C#, Scala, or Java is also required.
  • Experience with workflow technologies like Spark, Airflow, Glue, Prefect or Dagster.
  • Experience with version control systems, specifically Git.
  • Familiarity with DevOps best practices and automation of processes such as building, configuration, deployment, documentation, testing, and monitoring.
  • Understanding of BI and reporting platforms, awareness of industry trends in the BI/reporting space, and how they can apply to an organization’s product strategies.
  • Knowledge of the AdTech industry and its trends preferred.
  • Experience with containerization tools like Docker and Kubernetes preferred.
  • Familiarity with data modeling and data warehousing concepts preferred.
  • Basic understanding of machine learning concepts and how models work.
  • Strong analytical and problem‑solving skills with attention to detail.
  • Excellent communication and collaboration skills to work effectively with diverse…