
Data Engineer

Job in Tulsa, Tulsa County, Oklahoma, 74145, USA
Listing for: Bayer CropScience Limited
Full Time position
Listed on 2026-03-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range: 80,000 – 100,000 USD yearly
Job Description & How to Apply Below
Position: Staff Data Engineer

At Bayer we’re visionaries, driven to solve the world’s toughest challenges and striving for a world where ‘Health for All, Hunger for None’ is no longer a dream but a real possibility. We’re doing it with energy, curiosity and sheer dedication, always learning from the unique perspectives of those around us, expanding our thinking, growing our capabilities and redefining ‘impossible’. There are so many reasons to join us.

If you’re hungry to build a varied and meaningful career in a community of brilliant and diverse minds to make a real difference, there’s only one choice.

Staff Data Engineer

YOUR TASKS AND RESPONSIBILITIES

A Staff Data Engineer designs and leads the implementation of data flows that connect operational systems, analytics data, and business intelligence (BI) systems. The primary responsibilities of this role are to:

  • Recognize opportunities to reuse existing data flows;
  • Lead the build of data streaming systems;
  • Optimize code so that processes run efficiently;
  • Lead work on database management;
Communication Between Technical and Non-Technical Colleagues
  • Communicate effectively with technical and non-technical stakeholders;
  • Support and host discussions within a multidisciplinary team, with potentially difficult dynamics;
  • Advocate for the team externally and manage differing perspectives.
Data Analysis and Synthesis
  • Undertake data profiling and source system analysis;
  • Present clear insights to colleagues to support the end use of the data.
Data Development Process
  • Design, build and test data products that are complex or large scale;
  • Build teams to complete data integration services.
Data Innovation
  • Understand the impact on the organization of emerging trends in data tools, analysis techniques and data usage;
Data Integration Design
  • Select and implement the appropriate technologies to deliver resilient, scalable and future-proofed data solutions and integration pipelines;
Data Modeling
  • Produce relevant data models across multiple subject areas;
  • Explain which models to use for which purpose;
  • Understand industry-recognized data modeling patterns and standards, and when to apply them;
  • Compare and align different data models.
Metadata Management
  • Design an appropriate metadata repository and present changes to existing metadata repositories;
  • Understand a range of tools for storing and working with metadata;
  • Provide oversight and advice to less experienced members of the team.
Problem Resolution
  • Respond to problems in databases, data processes, data products and services as they occur;
  • Initiate actions, monitor services and identify trends to resolve problems;
  • Determine the appropriate remedy and assist with its implementation, and with preventative measures.
Programming and Build
  • Use agreed standards and tools to design, code, test, correct and document moderate‑to‑complex programs and scripts from agreed specifications and subsequent iterations;
  • Collaborate with others to review specifications where appropriate.
Technical Understanding
  • Understand the core technical concepts related to the role, and apply them with guidance;
Testing
  • Review requirements and specifications, and define test conditions;
  • Identify issues and risks associated with work;
  • Analyze and report test activities and results.
WHO YOU ARE

Bayer seeks a candidate who possesses the following:

Required Qualifications:
  • Proficiency in a programming language such as Python or Java;
  • Experience with Big Data technologies such as Hadoop, Spark, and Kafka;
  • Familiarity with ETL processes and tools;
  • Knowledge of SQL and NoSQL databases;
  • Strong understanding of relational databases;
  • Experience with data warehousing solutions;
  • Proficiency with cloud platforms;
  • Expertise in data modeling and design;
  • Experience in designing and building scalable data pipelines;
  • Experience with RESTful APIs and data integration.
Preferred Qualifications:
  • Relevant certifications (e.g., GCP Certified, AWS Certified, Azure Certified);
  • Bachelor’s degree in Computer Science, Data Engineering, Information Technology, or a related field;
  • Strong analytical and communication skills;
  • Ability to work collaboratively in a team environment;
  • High level…