
Senior Engineer - Data Analytics & Engineering

Job in Snowflake, Navajo County, Arizona, 85937, USA
Listing for: GEICO
Full Time position
Listed on 2025-10-29
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager
  • Engineering
    Data Engineer, Data Science Manager
Salary/Wage Range: 100,000 - 215,000 USD per year
Job Description & How to Apply Below
Location: Snowflake
At GEICO, we offer a rewarding career where your ambitions are met with endless possibilities.
Every day we honor our iconic brand by offering quality coverage to millions of customers and being there when they need us most. We thrive through relentless innovation to exceed our customers’ expectations while making a real impact for our company through our shared purpose.
When you join our company, we want you to feel valued, supported, and proud to work here. That’s why we offer The GEICO Pledge: Great Company, Great Culture, Great Rewards and Great Careers.
GEICO is seeking an experienced Senior Data Engineer with a passion for building high-performance, low-maintenance, zero-downtime data solutions. You will help drive our insurance business transformation as we transition from a traditional IT model to a tech organization with engineering excellence as its mission. Within the Data Analytics and Vertical Engineering team, you will play a key role in leveraging modern technologies to enhance our data capabilities, while championing innovation, best practices, and continuous learning.
Position Description
As a Senior Data Engineer, you will build and maintain robust data systems that power a state-of-the-art analytics platform. Our team thrives on delivering high-quality data solutions in a hyper-growth environment where priorities shift quickly. The ideal candidate has broad and deep technical knowledge, typically ranging from data pipeline development and data transformation to data storage and processing optimization.
Position Responsibilities
As a Senior Engineer, you will:
* Team up with tech leads and managers in an Agile environment to tackle organization-wide goals centered on data engineering and analytical reporting
* Scope, design, and build scalable, resilient distributed systems
* Utilize programming languages like Python and SQL, NoSQL databases, Apache Spark for data processing, dbt for data transformation, containerization and orchestration tools such as Docker and Kubernetes, and various Azure tools and services
* Collaborate with data producers and consumers to define data requirements, data models, and transformation logic.
* Use your technical expertise to shape product definitions and drive towards optimal solutions.
* Troubleshoot and resolve technical issues related to data ingestion, processing, and consumption within the Snowflake platform
* Engage in cross-functional collaboration throughout the entire development lifecycle
* Lead design sessions and code reviews with peers to elevate the quality of engineering across the organization
* Define, create, and support reusable data components and patterns that align with both business and technology requirements
* Build a world-class analytics platform to satisfy reporting needs
* Mentor junior engineers and contribute to knowledge sharing within the team and across the organization.
* Stay at the forefront of emerging industry trends, technologies, and best practices, and apply this knowledge to enhance GEICO’s data protection strategies
* Consistently share best practices and improve processes within and across teams
Qualifications
* 6+ years of experience in data engineering, data warehousing, or a related data-focused discipline, with a focus on data ingestion and onboarding
* Experience working with data technologies such as SQL, Python, PySpark, Spark, Scala, JSON, Kafka, dbt (Data Build Tool), Iceberg, Snowflake, Airflow, ADO, and Azure Data Factory (ADF) is preferred
* Experience building and operating data pipelines using technologies like Apache Spark, Flink, Kafka, etc.
* Experience with SQL and data modeling techniques. Proficiency in programming languages such as Python, Scala, or Java.
* Experience with Apache Iceberg for managing large-scale tabular data is a plus
* Experience with orchestration tools such as Apache Airflow or similar technologies to automate and manage complex data pipelines
* Experience with business intelligence tools (Power BI or Superset preferred)
* Hands-on experience with cloud data platforms, such as AWS, Azure,…
Position Requirements
10+ years of work experience