
Data Engineer

Job in Indianapolis, Marion County, Indiana, 46262, USA
Listing for: Eli Lilly and Company
Full Time position
Listed on 2026-01-19
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager, Data Warehousing
Salary/Wage Range or Industry Benchmark: USD 60,000 - 80,000 per year
Job Description
Location: Indianapolis

Locations: US, Indianapolis IN
Time type: Full time
Posted on: Posted today
Job requisition: R-99281

At Lilly, we unite caring with discovery to make life better for people around the world. We are a global healthcare leader headquartered in Indianapolis, Indiana. Our employees around the world work to discover and bring life-changing medicines to those who need them, improve the understanding and management of disease, and give back to our communities through philanthropy and volunteerism. We give our best effort to our work, and we put people first.

We’re looking for people who are determined to make life better for people around the world.

The Global Services Tech at Lilly team is actively seeking a Data Engineer in Indianapolis, IN, to partner with internal business and Tech at Lilly partners to accelerate the delivery of data solutions for analytics and business purposes.

## What You'll Be Doing:

A Data Engineer onsite in Indianapolis, IN, is responsible for designing, developing, and maintaining the data solutions that ensure the availability and quality of data for analysis and/or business transactions.
They design and implement efficient data storage, processing, and retrieval solutions for datasets; build data pipelines; optimize database designs; and work closely with data scientists, architects, and analysts to ensure data quality and accessibility. Data engineers require strong skill sets in data integration, acquisition, cleansing, harmonization, and transformation. They play a crucial role in turning raw data into datasets designed for analysis, enabling organizations to unlock valuable insights for decision making.
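To make that last point concrete, here is a minimal, hypothetical sketch in Python/pandas of turning a raw extract into an analysis-ready dataset. The file names, columns, and cleaning rules are illustrative assumptions only; they are not part of this role or of Lilly's stack.

```python
import pandas as pd

# Hypothetical raw extract; file name and columns are illustrative assumptions.
raw = pd.read_csv("raw_orders_extract.csv")

# Cleanse: drop exact duplicates and rows missing the business key.
clean = raw.drop_duplicates().dropna(subset=["order_id"])

# Harmonize: normalize types and inconsistent codes from different source systems.
clean["order_date"] = pd.to_datetime(clean["order_date"], errors="coerce")
clean["country"] = clean["country"].str.upper().str.strip()

# Transform: aggregate into an analysis-ready dataset for downstream consumers.
analysis_ready = (
    clean.groupby(["country", clean["order_date"].dt.to_period("M")])
         .agg(order_count=("order_id", "count"), revenue=("amount", "sum"))
         .reset_index()
)

analysis_ready.to_parquet("orders_monthly_summary.parquet")
```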

## ## How You’ll Succeed:
* Engage and partner with cross-functional tech teams across Global Services (Finance), third-party solution delivery providers, and data architects to understand the business problem and enhance or develop the appropriate data solution leveraging the modern tech stack
* Design and implement highly performant data ingestion/processing pipelines from multiple sources (a brief, illustrative sketch follows this list)
* Develop technical solutions which combine disparate information to create meaningful insights for business partners
* Operate with a quality mindset, always considering the impact of design decisions on the long-term support and maintenance of data pipelines/jobs
* Ensure data integrity, security, and privacy requirements are met
* Stay abreast of tools and technologies to influence our strategy so that it provides the best usage opportunities for the business
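As a rough illustration of the ingestion bullet above, the sketch below shows one simple way to pull from multiple sources, tag lineage, and land one combined dataset. It is a hedged example only: the connection strings, file paths, and table names are assumptions, not a description of Lilly's pipelines.

```python
import pandas as pd
from sqlalchemy import create_engine

# Every connection string, path, and table name below is an illustrative assumption.
SOURCES = {
    "erp_orders": lambda: pd.read_sql(
        "SELECT * FROM orders",
        create_engine("postgresql://user:pass@erp-host/erp"),
    ),
    "web_orders": lambda: pd.read_csv("s3://example-bucket/web_orders.csv"),
}

def ingest() -> pd.DataFrame:
    """Pull each source, tag its origin for lineage, and land one combined dataset."""
    frames = []
    for name, extract in SOURCES.items():
        df = extract()
        df["source_system"] = name  # keep lineage so harmonization can trace values back
        frames.append(df)
    combined = pd.concat(frames, ignore_index=True)
    combined.to_parquet("landing/orders_combined.parquet")
    return combined

if __name__ == "__main__":
    ingest()
```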
## What You Should Bring:
* A foundational set of knowledge in: communication, leadership, teamwork, problem-solving skills, solution/blueprint definition, business acumen, architectural processes (e.g., blueprinting, reference architecture, governance), technical standards, project delivery, and industry knowledge
* Strong skill sets in data integration, acquisition, cleansing, harmonization, and transformation
* Experience designing large-scale data models for functional, operational, and analytical environments (Conceptual, Logical, Physical, and Dimensional)
* Experience in several of the following disciplines: statistical methods, data modeling, ontology development, semantic graph construction and linked data, relational schema design.
* Demonstrated SQL/PLSQL and data modeling proficiency.
* Experience with data modeling tools such as ER Studio, Erwin, or TOAD
* Experience with the AWS or Azure tech stack
* Experience in building/integrating APIs
* Experience in creating data products using APIs
* Experience with security models and development on large data sets
* Experience with Microsoft Fabric
* Experience with multiple database technologies and solutions (e.g., Postgres, Redshift, Aurora, Athena) and formal database designs (3NF, dimensional models)
* Experience with Agile development, CI/CD, Jenkins, GitHub, and automation platforms
* Experience implementing an effective data loading strategy (CDC, incremental loads); see the watermark-based sketch after this list
* Demonstrated ability to analyze large, complex data domains and craft practical solutions for subsequent data exploitation via analytics.
*…
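For the incremental-load item above, one common pattern is a high-watermark load, sketched below in Python with SQLAlchemy and pandas. All connection strings, table names, and the etl_watermarks bookkeeping table are hypothetical; this is an assumed approach for illustration, not Lilly's implementation.

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical connection strings and table/column names (illustration only).
source = create_engine("postgresql://user:pass@source-host/sales")
target = create_engine("postgresql://user:pass@warehouse-host/dw")

def incremental_load(table: str, watermark_col: str = "updated_at") -> int:
    """Load only rows changed since the last successful run (high-watermark style CDC)."""
    # 1. Read the last watermark recorded for this table in the warehouse.
    with target.connect() as conn:
        last = conn.execute(
            text("SELECT COALESCE(MAX(loaded_through), '1900-01-01') "
                 "FROM etl_watermarks WHERE table_name = :t"),
            {"t": table},
        ).scalar()

    # 2. Pull only the rows that changed after that watermark.
    changed = pd.read_sql(
        text(f"SELECT * FROM {table} WHERE {watermark_col} > :last"),
        source,
        params={"last": last},
    )
    if changed.empty:
        return 0

    # 3. Stage the delta; a downstream MERGE/upsert would apply it to the target table.
    changed.to_sql(f"stg_{table}", target, if_exists="replace", index=False)

    # 4. Advance the watermark only after the delta has landed.
    with target.begin() as conn:
        conn.execute(
            text("UPDATE etl_watermarks SET loaded_through = :ts WHERE table_name = :t"),
            {"ts": changed[watermark_col].max(), "t": table},
        )
    return len(changed)
```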