Lead Data Integration Engineer

Job in Memphis, Shelby County, Tennessee, 37544, USA
Listing for: Raymond James Financial, Inc.
Full-time position
Listed on 2026-01-12
Job specializations:
  • Software Development: Data Engineer
Job Description

This position follows our hybrid workstyle policy: employees are expected to be in a Raymond James office location a minimum of 10-12 days a month.

Please note:
This role is not eligible for Work Visa sponsorship, either currently or in the future.

Responsibilities
  • Deep expertise in Microsoft SQL Server, SSIS, and SQL development.
  • Strong proficiency in writing and optimizing complex stored procedures, functions, and packages.
  • Hands-on experience with Python for data manipulation, automation, and pipeline development (a minimal sketch follows this list).
  • Familiarity with Oracle databases and PL/SQL development is required for cross-platform data integration.
  • Experience implementing CI/CD pipelines and DevOps practices for data solutions.
  • Understanding of data warehousing concepts, ETL methodologies, and data modeling techniques.
  • Experience with Unix and shell scripting.
  • Experience with job scheduler tools such as BMC Control-M.
  • A proven track record working in both waterfall and agile SDLC frameworks.
  • Knowledge of the financial services industry, including middle- and back-office functions.
  • Experience collaborating with business counterparts to understand detailed requirements.
  • Excellent verbal and written communication skills.
  • Produce and maintain detailed technical documentation for all development efforts.
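
For illustration only: below is a minimal sketch of the kind of Python extract step the pipeline work above describes, using pyodbc against SQL Server. The connection string, the dbo.Trades table, and the column names are hypothetical placeholders, not actual Raymond James systems.

    import logging

    import pyodbc  # assumes a Microsoft ODBC driver for SQL Server is installed

    logging.basicConfig(level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")
    log = logging.getLogger("nightly_extract")

    # Hypothetical connection string and table names -- placeholders only.
    SQLSERVER_CONN = (
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=src-host;Database=Trades;Trusted_Connection=yes;"
    )

    EXTRACT_SQL = """
        SELECT trade_id, account_id, symbol, quantity, price
        FROM dbo.Trades
        WHERE trade_date = CAST(DATEADD(day, -1, GETDATE()) AS date)
    """

    def extract_trades(conn):
        """Pull the prior day's rows from the (hypothetical) source table."""
        with conn.cursor() as cur:
            cur.execute(EXTRACT_SQL)
            return cur.fetchall()

    def main():
        conn = pyodbc.connect(SQLSERVER_CONN)
        try:
            rows = extract_trades(conn)
            log.info("extracted %d rows", len(rows))
            # Transform and load steps would follow here, e.g. a bulk insert
            # into an Oracle staging table for cross-platform integration.
        finally:
            conn.close()

    if __name__ == "__main__":
        main()
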
Skills
  • MS SQL Server & SQL Proficiency:
    Deep expertise in writing and optimizing complex SQL queries, stored procedures, functions, and triggers is fundamental.
  • SSIS Expertise:
    In-depth knowledge of designing, developing, deploying, and maintaining ETL (Extract, Transform, Load) processes and packages using SQL Server Integration Services (SSIS). This includes robust error handling and logging mechanisms.
  • ETL & Data Warehousing:
    Strong understanding of ETL methodologies, data warehousing concepts (e.g., Kimball methodology, star schemas), and data modeling techniques (normalization/denormalization); a dimensional-load sketch follows this list.
  • Performance Tuning:
    Ability to identify, investigate, and resolve database and ETL performance issues, including capacity and scalability planning.
  • Programming Languages:
    Proficiency in additional programming/scripting languages, such as Python or PowerShell/shell scripting, for automation, data manipulation, and pipeline development.
  • Cloud & DevOps (Desired):
    Familiarity with cloud platforms (e.g., Azure Data Factory, AWS Glue, Google Cloud) and experience implementing CI/CD pipelines and DevOps practices for data solutions are a strong advantage.
  • Exposure to streaming technologies such as Kafka is a plus.
  • Experience in financial services or enterprise‑scale applications is preferred.
  • Excellent communication, analytical, and problem-solving skills.
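
As a concrete illustration of the dimensional loading mentioned under ETL & Data Warehousing, here is a minimal Python sketch of a star-schema load: a T-SQL MERGE upserts a dimension row, then the fact row is inserted through the dimension's surrogate key. The DimAccount and FactTrade objects, their columns, and the DSN are hypothetical names invented for this sketch.

    import datetime

    import pyodbc

    # Hypothetical star-schema objects: DimAccount (dimension, surrogate key
    # account_key) and FactTrade (fact table). All names are illustrative.
    UPSERT_DIM = """
    MERGE dbo.DimAccount AS tgt
    USING (SELECT CAST(? AS int) AS account_id,
                  CAST(? AS nvarchar(100)) AS account_name) AS src
        ON tgt.account_id = src.account_id
    WHEN MATCHED THEN
        UPDATE SET tgt.account_name = src.account_name
    WHEN NOT MATCHED THEN
        INSERT (account_id, account_name)
        VALUES (src.account_id, src.account_name);
    """

    INSERT_FACT = """
    INSERT INTO dbo.FactTrade (account_key, trade_date, quantity, price)
    SELECT d.account_key, ?, ?, ?
    FROM dbo.DimAccount AS d
    WHERE d.account_id = ?;
    """

    def load_trade(conn, account_id, account_name, trade_date, quantity, price):
        """Upsert the dimension row, then insert the fact via its surrogate key."""
        with conn.cursor() as cur:
            cur.execute(UPSERT_DIM, account_id, account_name)
            cur.execute(INSERT_FACT, trade_date, quantity, price, account_id)
        conn.commit()

    if __name__ == "__main__":
        cnxn = pyodbc.connect("DSN=warehouse")  # placeholder DSN
        load_trade(cnxn, 42, "Sample Account", datetime.date.today(), 100, 25.50)
        cnxn.close()

Resolving the fact row through the dimension's surrogate key, rather than storing the natural key on the fact table, is the standard Kimball pattern the posting alludes to.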