Principal Data Engineer

Job in Lehi, Utah County, Utah, 84043, USA
Listing for: DigiCert, Inc.
Full Time position
Listed on 2026-01-13
Job specializations:
  • Software Development
    Data Engineer
Salary/Wage Range or Industry Benchmark: $147,000 - $185,000 USD per year
Job Description & How to Apply Below
Principal Data Engineer (DigiCert, Inc., Lehi, UT): Architect and contribute to the design of distributed data processing systems, focusing on modularity, reusability, and fault tolerance. Collaborate on design reviews and provide input into architectural decisions to help evolve the data platform and support new analytics and machine learning (ML) use cases. Develop framework-level components for data quality, observability, and recovery, enabling self-service capabilities across teams.
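
As a rough illustration of the kind of framework-level data-quality component described above, the sketch below shows a minimal reusable check in PySpark. The names (NullRateCheck, run_checks) and the example data are hypothetical and not part of this posting.

```python
# Hypothetical sketch of a reusable data-quality check component in PySpark.
from dataclasses import dataclass
from pyspark.sql import DataFrame, SparkSession
import pyspark.sql.functions as F


@dataclass
class NullRateCheck:
    """Flags a dataset when the null rate of a column exceeds a threshold."""
    column: str
    max_null_rate: float = 0.01

    def evaluate(self, df: DataFrame) -> dict:
        total = df.count()
        nulls = df.filter(F.col(self.column).isNull()).count()
        rate = nulls / total if total else 0.0
        return {"check": f"null_rate({self.column})",
                "observed": rate,
                "passed": rate <= self.max_null_rate}


def run_checks(df: DataFrame, checks: list) -> list:
    """Run every check and return the results; callers decide whether to quarantine or fail."""
    return [check.evaluate(df) for check in checks]


if __name__ == "__main__":
    spark = SparkSession.builder.appName("dq-sketch").getOrCreate()
    df = spark.createDataFrame([(1, "a"), (2, None)], ["id", "value"])
    print(run_checks(df, [NullRateCheck(column="value", max_null_rate=0.5)]))
```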

Standardize and implement patterns for incremental data processing, ensuring consistency, idempotency, and scalability. Design and implement schema evolution strategies, metadata-driven processing, and lineage tracking to support auditability and regulatory requirements. Optimize data storage formats, partitioning strategies, and compute resource utilization for cost-effective performance. Contribute to the technical roadmap by evaluating new technologies and proposing architectural improvements aligned with long-term platform goals.
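
For illustration only, the sketch below shows one common way to keep an incremental load idempotent: a keyed MERGE into a Delta table (Delta Lake on Databricks is assumed here because Databricks appears in the requirements; the table names, column names, and watermark value are hypothetical).

```python
# Hypothetical sketch: idempotent incremental load via a keyed MERGE (Delta Lake assumed).
from delta.tables import DeltaTable
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("incremental-merge-sketch").getOrCreate()

# Read only the records newer than the last processed watermark (the incremental slice).
last_watermark = "2026-01-01T00:00:00"
updates = (
    spark.read.table("raw.events")  # illustrative source table
    .where(F.col("updated_at") > F.lit(last_watermark))
)

# MERGE keyed on the business key: re-running the same slice rewrites the same rows,
# so the load stays consistent and idempotent under retries.
target = DeltaTable.forName(spark, "curated.events")  # illustrative target table
(
    target.alias("t")
    .merge(updates.alias("s"), "t.event_id = s.event_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```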

Collaborate with product, compliance, and platform teams to design reusable data interfaces and contract-driven ingestion frameworks. Salary: $147,000 - $185,000 per year.
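
As a loose illustration of contract-driven ingestion, the sketch below treats an explicit PySpark schema as the contract and rejects input whose fields do not match it. The schema, field names, and function are hypothetical, not taken from this posting.

```python
# Hypothetical sketch of a contract-driven ingestion check in PySpark.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

# The expected schema acts as the data contract for this feed.
ORDERS_CONTRACT = StructType([
    StructField("order_id", StringType(), nullable=False),
    StructField("customer_id", StringType(), nullable=False),
    StructField("created_at", TimestampType(), nullable=False),
])


def ingest_orders(spark: SparkSession, path: str):
    # Infer the incoming schema once to detect missing contract fields,
    # then read with the contract schema instead of silently coercing.
    incoming = spark.read.json(path).schema
    missing = set(ORDERS_CONTRACT.fieldNames()) - set(incoming.fieldNames())
    if missing:
        raise ValueError(f"Contract violation: missing fields {sorted(missing)}")
    return spark.read.schema(ORDERS_CONTRACT).json(path)
```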

MINIMUM REQUIREMENTS:

Bachelor’s degree or U.S. equivalent in Computer Science, Statistics, Informatics, Information Systems, Mechanical Engineering, Computer Engineering, or a related field, plus 5 years of professional experience as a Data Engineer, Data Platform Engineer, Machine Learning Engineer, or any occupation/position/job title working in hybrid deployments.

Must also have the following special skills:
  • 4 years of professional experience designing, building, and optimizing large-scale data pipelines;
  • 4 years of professional experience utilizing big data tools including Spark, Kafka, PySpark, and Databricks;
  • 4 years of professional experience utilizing pipeline and workflow management tools including Airflow, and working with stream-processing systems including Spark Streaming;
  • 4 years of professional experience working with object-oriented and object-function scripting languages including Python, and with continuous integration and continuous delivery (CI/CD) tools including Jenkins;
  • 4 years of professional experience building monitoring dashboards, including Grafana;
  • 4 years of professional experience performing change data capture (CDC) streaming for NoSQL databases;
  • 4 years of professional experience performing root cause analysis and processes to answer business questions and identify opportunities for improvement; and
  • 4 years of professional experience with message queuing, stream processing, and highly scalable data stores.

CONTACT:
Submit resume online at: ; or via email to . Must specify job code (ABDA).