
Senior Data Engineer

Job in Houston, Harris County, Texas, 77246, USA
Listing for: Alava Consulting
Full Time position
Listed on 2026-03-06
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager, Data Warehousing, Data Analyst
Salary/Wage Range: 60,000 - 80,000 USD per year
Job Description & How to Apply Below

NO THIRD PARTIES. NO SPONSORSHIP. MUST BE LOCAL TO HOUSTON.

Alava Consulting's client is looking for a Senior Data Engineer. This is a permanent, onsite position in downtown Houston. Candidates must have experience in upstream oil and gas.

The Data Engineer is responsible for designing, building, and operating scalable, reliable, and secure data pipelines that support enterprise analytics, reporting, and advanced data use cases. This role is critical to enabling data-driven decision-making by ensuring trusted, high-quality data is available across operational, engineering, financial, and HSE domains.

This position requires hands‑on experience in upstream oil and gas and proven expertise working with enterprise‑scale cloud data platforms, with Snowflake (or comparable platforms) as a core prerequisite. The Data Engineer will play a key role in advancing the company’s cloud‑based data platform and accelerating its journey toward becoming a data‑driven, technology‑enabled enterprise.

Key Responsibilities
  • Design, build, and maintain scalable, fault‑tolerant ETL/ELT pipelines supporting structured and semi‑structured data across upstream operational, engineering, financial, and ESG domains.
  • Develop and operate data ingestion pipelines using Fivetran (or similar tools), including connector configuration, schema management, and change data capture (CDC) patterns.
  • Engineer and maintain data pipelines within the Snowflake environment, ensuring reliability, performance, and cost efficiency.
  • Monitor, troubleshoot, and remediate data pipeline failures, latency issues, and data inconsistencies in production environments.
  • Implement logging, alerting, and basic observability for data pipelines to support operational reliability.
Data Transformation & Medallion Architecture
  • Implement data transformations using dbt, applying modular, testable, and version‑controlled transformation logic.
  • Develop and maintain datasets aligned to a medallion architecture ensuring clear separation between raw, refined, and analytics‑ready data layers.
  • Apply data modeling best practices (dimensional, star schema, domain‑oriented models) to support analytics and reporting use cases.
  • Maintain documentation and tests within dbt to improve data transparency, lineage, and maintainability.
Analytics & Reporting Enablement
  • Design and deliver standardized, production‑grade dashboards and reports using Power BI and Spotfire.
  • Work with business users to define KPIs, metrics, and reporting requirements, translating them into governed, scalable data models.
  • Transition ad‑hoc and spreadsheet‑based reporting into certified, reusable analytics assets.
  • Support semantic models and curated datasets that enable consistent, performant reporting across teams.
Data Quality, Governance & Reliability
  • Implement automated data quality checks, validations, and reconciliation processes within data pipelines.
  • Publish and maintain certified datasets and analytics‑ready tables to support self‑service consumption.
  • Enforce metric consistency by aligning transformations and reporting logic to approved definitions and standards.
  • Apply CI/CD, version control, and testing practices to data pipelines and dbt projects.
Business Partnership & Enablement
  • Work directly with business analysts, operations users, and reporting consumers to clarify data requirements and troubleshoot issues.
  • Support day‑to‑day data needs by delivering reliable datasets and reports aligned to defined standards and priorities.
  • Provide technical input into data solutions while following established architecture, governance, and platform guidelines.
  • Assist in onboarding users to certified datasets and reports, helping reduce dependency on ad‑hoc data extracts.
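The medallion flow and automated quality checks described above can be sketched in plain Python. This is a minimal stand-in for what would actually live in Snowflake and dbt; the field names (well_id, report_date, oil_bbl) are illustrative assumptions, not taken from the posting.

```python
import logging
from collections import defaultdict

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

# Bronze layer: raw, semi-structured records as they arrive from ingestion.
bronze = [
    {"well_id": "W-101", "report_date": "2026-03-01", "oil_bbl": "1250"},
    {"well_id": "W-101", "report_date": "2026-03-02", "oil_bbl": "1180"},
    {"well_id": "W-102", "report_date": "2026-03-01", "oil_bbl": None},  # bad row
]

def to_silver(rows):
    """Clean and type the raw rows; quarantine rows that fail quality checks."""
    silver, rejects = [], []
    for row in rows:
        if row["oil_bbl"] is None:
            log.warning("quarantined row for %s: missing oil_bbl", row["well_id"])
            rejects.append(row)
            continue
        silver.append({**row, "oil_bbl": float(row["oil_bbl"])})
    return silver, rejects

def to_gold(rows):
    """Aggregate cleaned rows into an analytics-ready per-well total."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["well_id"]] += row["oil_bbl"]
    return dict(totals)

silver, rejects = to_silver(bronze)
gold = to_gold(silver)
print(gold)          # {'W-101': 2430.0}
print(len(rejects))  # 1
```

In a real implementation each layer would typically be a dbt model materialized in Snowflake, with the quarantine logic expressed as dbt tests and the warning routed to the pipeline's alerting channel.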
Qualifications & Experience
Education
  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field.
Technical Skills & Experience
  • Experience with Power BI and Spotfire.
  • Advanced SQL, dbt, and Snowflake. Familiarity with medallion architecture, CI/CD, and Git.
  • Industry experience in upstream oil and gas is required, ideally with exposure to drilling, production operations, reserves, and regulatory reporting.
  • Proven experience deploying and managing enterprise cloud data platforms is essential; Snowflake expertise is required.
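The dimensional modeling the posting calls for (star schemas with fact and dimension tables) can be sketched minimally in Python. The table and column names here are hypothetical illustrations, not part of the role description.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DimWell:          # dimension table: descriptive attributes of a well
    well_key: int
    well_name: str
    field: str

@dataclass(frozen=True)
class FactProduction:   # fact table: measures keyed to dimensions
    well_key: int       # foreign key into DimWell
    date_key: int       # foreign key into a date dimension
    oil_bbl: float
    gas_mcf: float

wells = {1: DimWell(1, "W-101", "Permian")}
facts = [
    FactProduction(1, 20260301, 1250.0, 3400.0),
    FactProduction(1, 20260302, 1180.0, 3150.0),
]

# A reporting query joins facts to a dimension and aggregates the measures.
oil_by_field = {}
for f in facts:
    field = wells[f.well_key].field
    oil_by_field[field] = oil_by_field.get(field, 0.0) + f.oil_bbl
print(oil_by_field)  # {'Permian': 2430.0}
```

The same shape is what a governed Power BI or Spotfire semantic model would sit on top of: conformed dimensions, additive measures in the fact table, and aggregation done at query time.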
Position Requirements
10+ years of work experience