
Senior Cloud Data Engineer

Job in Tulsa, Tulsa County, Oklahoma, 74145, USA
Listing for: BOK Financial
Full Time position
Listed on 2026-01-06
Job specializations:
  • IT/Tech
    Data Engineer

Summary

If you're looking for a role that combines a passion for technology, opportunities for career growth, and a collaborative work environment, you've come to the right place. We have an exciting opportunity for a tech-savvy individual like you! At BOK Financial, we're building a workplace where exceptional talent can showcase their skills, aim for the highest standards, and contribute to top-tier projects.

Job Description

As a Cloud Data Engineer, you'll be at the heart of our data‑driven initiatives, responsible for designing, building, and maintaining robust and scalable data solutions on Azure Databricks and the Amazon Web Services (AWS) cloud platform. You will leverage your expertise in a wide array of AWS services and big data technologies to transform raw data into valuable, usable insights.

This critical role involves developing end-to-end data pipelines, managing data lakes built on open table formats such as Delta Lake and Apache Iceberg, and creating secure, high-performance APIs to deliver data products to stakeholders. You will work closely with business partners, data scientists, analysts, and other engineering teams to understand data requirements and deliver effective, reliable, and compliant solutions that drive key business decisions.

If you are a problem‑solver with a strong technical background and a passion for building innovative data architectures in the cloud, this role offers the opportunity to make a significant impact on our business success.

Team Culture

BOK Financial is a place where your passion for technological innovation is valued and career development is encouraged. The company fosters an environment where unique talents can thrive, achieve high standards, and contribute to prestigious projects. It's an ideal platform to advance your IT career within a vibrant and supportive culture.

How You'll Spend Your Time
  • You will design and develop data architecture: create scalable, reliable, and efficient data lakehouse solutions on Azure, leveraging Databricks Unity Catalog and native Azure data services.
  • You will build and maintain data pipelines: design, construct, and automate ETL/ELT processes to ingest data from diverse sources into the Azure ecosystem.
  • You will create and manage data APIs: design, develop, and maintain secure and scalable RESTful and other APIs to facilitate data access for internal teams and applications, typically leveraging AWS services.
  • You will manage Delta tables: build and manage Unity Catalog tables on Azure Blob Storage to enable data lakehouse features like ACID transactions, time travel, and schema evolution.
  • You will optimize data performance: implement partitioning strategies, data compaction, and fine‑tuning techniques for Unity Catalog Delta tables to enhance query performance.
  • You will ensure data quality and integrity: implement data validation and error‑handling processes, leveraging Delta's transactional capabilities for consistent data.
  • You will collaborate with stakeholders: work closely with data scientists, analysts, software engineers, and business teams to understand their data needs and deliver effective solutions.
  • You will provide technical support: offer technical expertise and troubleshooting for data‑related issues related to pipelines and API endpoints.
  • You will maintain documentation: create and maintain technical documentation for data workflows, processes, and API specifications.
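The data-quality responsibility above can be sketched in plain Python. This is a minimal, framework-agnostic illustration of a validate-then-quarantine step as it might run inside an ingestion pipeline; the record fields and rules shown are hypothetical, not taken from the posting:

```python
from datetime import datetime

# Hypothetical validation rules for incoming records before they are
# written to a curated table. In a real pipeline these checks would run
# inside the ETL job, with failing records routed to a quarantine table.
REQUIRED_FIELDS = {"account_id", "amount", "posted_at"}

def validate_record(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        errors.append("amount must be numeric")
    if "posted_at" in record:
        try:
            datetime.fromisoformat(record["posted_at"])
        except (TypeError, ValueError):
            errors.append("posted_at must be an ISO-8601 timestamp")
    return errors

def partition_records(records):
    """Split a batch into (valid_records, quarantined) where quarantined
    pairs each bad record with its list of errors."""
    valid, quarantined = [], []
    for rec in records:
        errs = validate_record(rec)
        if errs:
            quarantined.append((rec, errs))
        else:
            valid.append(rec)
    return valid, quarantined
```

In a Delta-based lakehouse, the same pattern is typically expressed with transactional writes so that a batch either lands consistently or not at all, which is the property the bullet above refers to.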
Education & Experience Requirements

This level of knowledge is normally acquired through a Bachelor's degree in a data-centric field (Computer Science, Economics, Information Systems, Data Analytics, etc.) and 7+ years of experience with a demonstrated track record of successful technical leadership on large-scale data projects, or an equivalent combination of education and experience.

  • Proven experience in data engineering, with significant hands‑on experience using Databricks and Azure.
  • Programming proficiency in Python, Java, or Scala.
  • Strong SQL skills for querying, data modeling, and database design.
  • Experience administering and managing Databricks Unity Catalog.
  • Experience with big data…
Position Requirements
10+ Years work experience