
Senior Data Engineer

Job in Renton, King County, Washington, 98056, USA
Listing for: Ziply Fiber
Full Time position
Listed on 2026-02-12
Job specializations:
  • IT/Tech
    Data Engineer
Salary/Wage Range: 60,000 – 80,000 USD yearly
Job Description

Benefits

Medical, dental, vision, 401k, flexible spending account, paid sick leave, paid time off, parental leave, quarterly performance bonus, training, career growth and education reimbursement programs.

Ziply Fiber Overview

Ziply Fiber is a local internet service provider dedicated to elevating the connected lives of the communities we serve. We offer the fastest home internet in the nation, a refreshingly great customer experience, and affordable plans that put customers in charge.

Our Company Values
  • Genuinely Caring: We treat customers and colleagues like neighbors, with empathy and full attention.
  • Empowering You: We help customers choose what is best for them, and we support employees in implementing new ideas and solutions.
  • Innovation and Improvement: We constantly seek ways to improve how we serve customers and each other.
  • Earning Your Trust: We build trust through clear, honest, human communication.
Job Summary

The Senior Data Engineer will be responsible for designing, building, and maintaining scalable data pipelines, data models, and infrastructure that support business intelligence, analytics, and operational data needs.

Essential Duties and Responsibilities

The essential duties and responsibilities listed below represent a range of duties performed by the employee and are not intended to reflect all duties performed.

  • Design, develop, and maintain scalable data pipelines for ingestion, transformation, and storage of large datasets.
  • Troubleshoot and resolve data pipeline and ETL failures, implementing robust monitoring and alerting systems.
  • Automate data workflows to increase efficiency and reduce manual intervention.
Data Infrastructure, Modeling & Governance
  • Optimize data models for analytics and business intelligence reporting.
  • Build and maintain data infrastructure, ensuring performance, reliability, and scalability.
  • Implement best practices for data governance, security, and compliance.
  • Work with structured and unstructured data, integrating data from various sources including databases, APIs, and streaming platforms.
Cross‑Functional Collaboration, Leadership & Documentation
  • Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and design appropriate solutions.
  • Mentor and train junior engineers, fostering a culture of learning and innovation.
  • Develop and maintain documentation for data engineering processes and workflows.
Other Duties

Perform other duties as required to support the business and evolving organization.

Required Qualifications
  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Minimum of eight (8) years of experience in data engineering, ETL development, or related fields.
  • Strong proficiency in SQL and database technologies (PostgreSQL, MySQL, Oracle, SQL Server, etc.).
  • Familiarity with Linux/Unix and scripting technologies.
  • Proficiency in Python for data engineering tasks.
  • Hands‑on experience with Microsoft Azure and its data services such as Azure Data Factory and Azure Synapse Analytics.
  • Experience working with data warehouses such as Snowflake or Azure SQL Data Warehouse.
  • Familiarity with workflow automation tools such as Autosys.
  • Knowledge of data modeling, schema design, and data architecture best practices.
  • Strong understanding of data governance, security, and compliance standards.
  • Ability to work independently in a remote environment across different time zones.
  • Exposure to GraphQL and RESTful APIs for data retrieval and integration.
  • Familiarity with NoSQL databases such as MongoDB.
  • Experience with version control software such as GitLab.
Preferred Qualifications
  • Proven aptitude for independently managing complex procedures, even when encountered infrequently.
  • Proactive approach to learning and optimizing operational workflows.
  • Familiarity with DevOps practices and CI/CD pipelines for data engineering, including Azure DevOps.
  • Proficiency in designing, writing, and maintaining complex stored procedures and ETL workflows for robust data processing.
Knowledge, Skills, and Abilities
  • Strong problem‑solving and analytical skills.
  • Ability to manage multiple priorities and work in a fast‑paced environment.
  • Excellent verbal and written…
Position Requirements
10+ Years work experience