Senior Data Engineer - Databricks
Listed on 2026-01-19
IT/Tech
Data Engineer, Cloud Computing
SMBC Group is a top-tier global financial group. Headquartered in Tokyo and with a 400-year history, SMBC Group offers a diverse range of financial services, including banking, leasing, securities, credit cards, and consumer finance. The Group has more than 130 offices and 80,000 employees worldwide in nearly 40 countries. Sumitomo Mitsui Financial Group, Inc. (SMFG) is the holding company of SMBC Group, which is one of the three largest banking groups in Japan.
SMFG’s shares trade on the Tokyo, Nagoya, and New York (NYSE: SMFG) stock exchanges.
In the Americas, SMBC Group has a presence in the US, Canada, Mexico, Brazil, Chile, Colombia, and Peru. Backed by the capital strength of SMBC Group and the value of its relationships in Asia, the Group offers a range of commercial and investment banking services to its corporate, institutional, and municipal clients. It connects a diverse client base to local markets and the organization’s extensive global network.
The Group’s operating companies in the Americas include Sumitomo Mitsui Banking Corp. (SMBC), SMBC Nikko Securities America, Inc., SMBC Capital Markets, Inc., SMBC MANUBANK, JRI America, Inc., SMBC Leasing and Finance, Inc., Banco Sumitomo Mitsui Brasileiro S.A., and Sumitomo Mitsui Finance and Leasing Co., Ltd.
The Databricks Developer is responsible for implementing, supporting, and enhancing the internal fraud detection platform by developing scalable data pipelines, integrating batch processing methods, and ensuring the platform aligns with the bank’s risk management, legal, and regulatory requirements for fraud detection and prevention.
This role requires deep functional and technical expertise in Databricks development, including strong development skills in PySpark and the Azure cloud ecosystem, as well as proven expertise in designing and managing CI/CD pipelines using tools such as Azure DevOps, GitHub, or similar.
The developer will work closely with business units and support teams to deliver the initial application, system enhancements, perform upgrades, and provide on‑call user support. The ideal candidate holds a degree in Computer Science or a related field and has at least 5 years of professional experience in data engineering and cloud‑based development.
Key Responsibilities
Role Objectives: Delivery
- Design, develop, and optimize large‑scale batch data pipelines using Databricks and PySpark on the Azure cloud platform.
- Lead technical architecture and implementation of Azure‑based solutions, supporting cloud migration and consolidation initiatives.
- Build and maintain ETL processes, ensuring seamless data integration and high data quality across diverse sources.
- Develop orchestration workflows using Azure Functions, Azure Data Factory (ADF), Logic Apps, and other Azure services.
- Design and manage CI/CD pipelines using tools such as Azure DevOps, GitHub, or similar.
- Implement secure and scalable solutions leveraging Blob Storage, Key Vault, Managed Identities, and Azure DevOps.
- Provide technical guidance and support for architectural decisions and platform enhancements.
- Own end‑to‑end project delivery, working closely with business stakeholders, IT teams, and third‑party vendors.
- Incorporate a variety of data processing techniques, including batch and streaming workflows, and expose and integrate APIs and external services into Databricks pipelines to enhance platform functionality and enable seamless data exchange across systems.
- Review and contribute to core code changes, ensuring best practices and supporting production deployments.
- Develop and implement disaster recovery strategies for cloud‑based applications.
Required Qualifications
- Bachelor’s degree in Computer Science, Information Systems, or a related technical field.
- Minimum 5 years of experience in data engineering, with a focus on Databricks, PySpark, and Azure.
- Strong understanding of data integration, transformation, and migration strategies.
- Experience with CI/CD pipelines and version control using Azure DevOps or GitHub.
- Excellent problem‑solving skills and ability to resolve moderately…