Data Engineer/AWS III
Job in Columbus, Franklin County, Ohio, 43224, USA
Listed on 2026-01-12
Listing for: J.P. Morgan
Full Time position
Job specializations:
- IT/Tech: Cloud Computing, Data Engineer, AI Engineer, Data Analyst
Job Description & How to Apply Below
Job responsibilities
- Provide direction, oversight, and coaching for a team of entry-level to mid-level engineers working on basic to moderately complex tasks.
- Execute solution design, development, and technical troubleshooting, with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and translate them into effective visual solutions.
- Work in an Agile development environment with team members, including Product Managers and SRE Engineers.
- Develop secure, high-quality production code, review and debug code written by others, and drive decisions influencing product design, application functionality, and technical operations.
- Serve as a subject matter expert in one or more areas of focus and actively contribute to the engineering community as an advocate of firmwide frameworks, tools, and practices of the end to end Development Life Cycle.
- Influence peers and project decision-makers to consider the use and application of leading-edge technologies.
- Stay current with industry trends and emerging technologies in Data Management, Artificial Intelligence and Machine Learning.
Required qualifications, capabilities, and skills
- 5+ years of applied experience in data engineering, plus demonstrated coaching and mentoring experience.
- Hands-on experience writing Python code using libraries and tools such as Pandas, Boto3, PySpark, and Jupyter notebooks, along with AWS services and related technologies including Glue, S3, Kafka, and Kubernetes.
- Hands-on practical experience delivering system design, application development, testing, and operational stability.
- Collaborate with various stakeholders and independently tackle design and functionality challenges with minimal oversight.
- Proficient in automation and continuous delivery methods.
- Skilled in resolving code issues and proficient in Git for managing repositories and team collaboration.
- Experience and proficiency across the data lifecycle
- Evaluate and report on access control processes to determine the effectiveness of data asset security, with minimal supervision.
- Advanced understanding of agile methodologies, Application Resiliency, and Security
- Bachelor’s degree in Data Science, Computer Science, Information Systems, Statistics, or a related field.
- Strong Python experience, especially in the context of developing solutions for large financial platforms.
- Strong experience with AWS serverless services, including expertise in AWS Step Functions, Lambda, DynamoDB, and NoSQL database services.
- Experience using the AWS Lake Formation service.
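To give a concrete sense of the Pandas/Boto3 skills listed above, here is a minimal sketch of a Glue-style transform step. The column names and sample data are hypothetical, and the S3 read is shown only in comments since it requires AWS credentials:

```python
import pandas as pd

def clean_trades(raw: pd.DataFrame) -> pd.DataFrame:
    """Typical ETL transform: normalize column names, drop rows missing
    the key field, and coerce the amount column to numeric."""
    df = raw.copy()
    df.columns = [c.strip().lower() for c in df.columns]
    df = df.dropna(subset=["trade_id"])
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0.0)
    return df

# In a real pipeline the raw frame would come from S3, e.g.:
#   s3 = boto3.client("s3")
#   obj = s3.get_object(Bucket="my-data-bucket", Key="trades.csv")  # hypothetical bucket/key
#   raw = pd.read_csv(obj["Body"])
raw = pd.DataFrame({"Trade_ID ": ["T1", None, "T3"],
                    "Amount": ["10.5", "7", "oops"]})
print(clean_trades(raw))
```

The same function body could run unchanged inside an AWS Glue Python job; only the read/write steps differ.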
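The serverless experience described above often takes the shape of a Lambda handler invoked as a Step Functions task. A minimal sketch follows; the event shape and table name are hypothetical, and the DynamoDB write is left in comments because it needs a deployed table:

```python
import json

def lambda_handler(event, context):
    """Validate an incoming record, as a Step Functions task might.

    A deployed handler would then persist the record, e.g.:
        boto3.resource("dynamodb").Table("trades").put_item(Item=record)
    ("trades" is a hypothetical table name).
    """
    record = event.get("record", {})
    if "trade_id" not in record:
        return {"statusCode": 400,
                "body": json.dumps({"error": "missing trade_id"})}
    return {"statusCode": 200,
            "body": json.dumps({"trade_id": record["trade_id"]})}

print(lambda_handler({"record": {"trade_id": "T1"}}, None))
```

Returning a small JSON-serializable dict keeps the task's output usable in the Step Functions state machine's `ResultPath`.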