Data Engineer - U.S.

Job in Cincinnati, Hamilton County, Ohio, 45208, USA
Listing for: Luma Financial Technologies, LLC
Full Time position
Listed on 2026-01-18
Job specializations:
  • Software Development
    Data Engineer
Job Description
Position: Data Engineer - U.S.

Founded in 2018, Luma Financial Technologies (“Luma”) has pioneered a cutting-edge fintech software platform that has been adopted by broker/dealer firms, RIA offices, and private banks around the world. By using Luma, institutional and retail investors have a fully customizable, independent, buy-side technology platform that helps financial teams more efficiently learn about, research, purchase, and manage alternative investments as well as annuities.

Luma gives these users the ability to oversee the full, end-to-end process lifecycle by offering a suite of solutions. These include education resources and training materials; creation and pricing of custom structured products; electronic order entry; and post-trade management. By prioritizing transparency and ease of use, Luma is a multi-issuer, multi-wholesaler, and multi-product option that advisors can utilize to best meet their clients’ specific portfolio needs.

Headquartered in Cincinnati, OH, Luma also has offices in New York, NY; Miami, FL; Zurich, Switzerland; and Lisbon, Portugal. For more information, please visit Luma’s website.

About the role

We are looking for an experienced Data Engineer to lead our data infrastructure development, focusing on building robust, scalable, and efficient data solutions. The ideal candidate will bring expertise in modern data engineering technologies and a proven track record of delivering high-performance data pipelines.

Please note:
  • This is a hybrid position requiring work from Luma Financial Technologies’ New York, NY or Cincinnati, OH office 2-3 days/week
  • Sponsorship for U.S. work authorization is not available for this opportunity
Data Pipeline Development
  • Design, develop, and maintain advanced data pipelines in Snowflake using dbt (see the illustrative sketch after this list)
  • Design, develop, and maintain data pipelines using Python
  • Implement and optimize complex ETL/ELT processes
  • Ensure comprehensive data quality and consistency across multiple systems
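As a minimal, hypothetical sketch of the kind of Snowflake/dbt pipeline work described above (not taken from the posting), the Python snippet below materializes a cleaned table from a staging table, much as a dbt model would express declaratively. It assumes the snowflake-connector-python package, and every account, warehouse, table, and column name is an assumption.

    # Illustrative sketch only (hypothetical names throughout): load a cleaned
    # fact table from a staging table in Snowflake, mirroring what a dbt model
    # would express declaratively. Assumes the snowflake-connector-python package.
    import os
    import snowflake.connector

    def run_elt_step() -> None:
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="TRANSFORM_WH",   # hypothetical warehouse
            database="ANALYTICS",       # hypothetical database
            schema="STAGING",           # hypothetical schema
        )
        try:
            # Materialize a cleaned table: cast types, drop rows missing a key.
            conn.cursor().execute("""
                CREATE OR REPLACE TABLE ANALYTICS.MARTS.FCT_ORDERS AS
                SELECT
                    order_id,
                    CAST(order_ts AS TIMESTAMP_NTZ)     AS order_ts,
                    TRY_CAST(notional AS NUMBER(18, 2)) AS notional
                FROM ANALYTICS.STAGING.RAW_ORDERS
                WHERE order_id IS NOT NULL
            """)
        finally:
            conn.close()

    if __name__ == "__main__":
        run_elt_step()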
Performance and Optimization
  • Create and optimize sophisticated SQL queries for advanced reporting and analysis (an illustrative sketch follows this list)
  • Develop efficient database queries with a focus on performance optimization
  • Troubleshoot complex data transformation challenges
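Purely as a hedged illustration of the query-tuning work listed above (nothing here comes from the posting), the snippet below contrasts an unoptimized reporting query with a tuned version of the kind this role might write against Snowflake; the tables, columns, and 12-month window are assumptions.

    # Hypothetical before/after for a reporting query: the tuned version selects
    # only the columns it needs, filters to a recent window before aggregating,
    # and groups at the grain the report actually requires.
    SLOW_REPORT_SQL = """
        SELECT *
        FROM fct_orders o
        JOIN dim_products p ON p.product_id = o.product_id
    """

    FAST_REPORT_SQL = """
        SELECT
            p.product_type,
            DATE_TRUNC('month', o.order_ts) AS order_month,
            SUM(o.notional)                 AS total_notional
        FROM fct_orders o
        JOIN dim_products p
            ON p.product_id = o.product_id
        WHERE o.order_ts >= DATEADD('month', -12, CURRENT_DATE)  -- prune old data
        GROUP BY 1, 2
    """

    if __name__ == "__main__":
        print(FAST_REPORT_SQL)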
Monitoring and Reliability
  • Implement and manage production data pipeline monitoring
  • Develop proactive health checks and monitoring protocols (a sketch follows this list)
  • Diagnose and rapidly resolve data integration issues
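As a hedged sketch of the kind of health check mentioned above (not part of the original posting), the snippet below flags a pipeline as stale when its target table has not received rows within an expected window; the table name, freshness threshold, and alerting behavior are all assumptions.

    # Illustrative pipeline health check (hypothetical names and thresholds):
    # flag a target table as stale when its most recent load falls outside the
    # allowed freshness window.
    from datetime import datetime, timedelta, timezone

    FRESHNESS_LIMIT = timedelta(hours=2)

    def is_fresh(latest_loaded_at: datetime) -> bool:
        """Return True if the table received data within the freshness window."""
        return datetime.now(timezone.utc) - latest_loaded_at <= FRESHNESS_LIMIT

    def alert_if_stale(table_name: str, latest_loaded_at: datetime) -> None:
        # In production this might page an on-call channel; here it just prints.
        if not is_fresh(latest_loaded_at):
            print(f"ALERT: {table_name} last loaded {latest_loaded_at.isoformat()}")

    if __name__ == "__main__":
        # A table last loaded three hours ago trips the alert.
        alert_if_stale(
            "ANALYTICS.MARTS.FCT_ORDERS",
            datetime.now(timezone.utc) - timedelta(hours=3),
        )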
Cross-Functional Collaboration
  • Interface effectively with product, engineering, and business intelligence teams
  • Translate complex technical requirements into comprehensive data solutions
  • Provide technical leadership and guidance on data engineering challenges
Qualifications
  • 3-5 years of professional experience in data engineering
  • Bachelor's degree in Computer Science, Data Science, or a related field
  • Excellent written and verbal communication skills
  • Proven ability to collaborate effectively across geographical boundaries