Lead Software Engineer - Databricks/PySpark

Job in Columbus, Franklin County, Ohio, 43224, USA
Listing for: J.P. Morgan
Full Time position
Listed on 2026-01-14
Job specializations:
  • IT/Tech: Data Engineer
Job Description
Position: Lead Software Engineer - Databricks/PySpark

We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible.

As a Lead Software Engineer at JPMorgan Chase within the Corporate Technology Workforce Data Analytics team, you play a crucial role in an agile team dedicated to enhancing, building, and delivering trusted, market-leading technology products that are secure, stable, and scalable. As a key technical contributor, you are tasked with implementing critical technology solutions across multiple technical domains, supporting various business functions to achieve the firm’s business objectives.

Job Responsibilities:
  • Execute creative data solution design, development, and technical troubleshooting, thinking beyond routine or conventional approaches to build solutions and break down technical problems.
  • Develop secure high-quality production data pipelines, and review and debug data processes implemented by others.
  • Identify opportunities to eliminate or automate remediation of recurring issues to improve overall operational stability of data applications and systems.
  • Lead evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture.
  • Lead communities of practice across Data Engineering to drive awareness and use of new and leading-edge technologies.
  • Contribute to a team culture of diversity, opportunity, inclusion, and respect.
Required Qualifications, Capabilities, and Skills:
  • Formal training or certification on software engineering concepts and 5+ years of applied experience
  • 3+ years of experience in Data Engineering, specifically design, application development, testing, and operational stability in Python, PySpark, Glue, Lambda, Databricks, and AWS.
  • Knowledge of Unity Catalog and of data formats including Delta tables and Iceberg tables (see the illustrative sketch after this list).
  • Hands-on practical experience delivering system design, application development, testing, and operational stability.
  • Advanced proficiency in data processing frameworks and tools, including knowledge of Parquet and Iceberg.
  • Proficiency in automation and continuous delivery methods.
  • Proficient in all aspects of the Software Development Life Cycle.
  • Advanced understanding of agile methodologies such as CI/CD, Application Resiliency, and Security.
  • Demonstrated proficiency in data applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.).
  • In-depth knowledge of the financial services industry and its IT systems.
  • Practical cloud-native experience.
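
For illustration only, not part of the posting's requirements: a minimal PySpark sketch of the kind of Delta-table pipeline work the qualifications above describe. It assumes a Databricks or other Delta-enabled Spark environment; the S3 path and the catalog.schema.table name are hypothetical.

# Minimal illustrative sketch; path and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("workforce-pipeline-sketch").getOrCreate()

# Read raw Parquet data from a hypothetical landing location.
raw = spark.read.parquet("s3://example-bucket/raw/workforce_events/")

# Basic cleanup: de-duplicate and derive a partition-friendly date column.
cleaned = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("event_date").isNotNull())
)

# Write to a Delta table; the three-level name assumes Unity Catalog.
(
    cleaned.write.format("delta")
           .mode("overwrite")
           .saveAsTable("main.workforce_analytics.events_cleaned")
)
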
Preferred Qualifications, Capabilities, and Skills:
  • AWS Certification
  • Databricks Certification