
GSC: Senior Data Engineer

Job in Town of Poland, Jamestown, Chautauqua County, New York, 14701, USA
Listing for: HSBC
Full Time position
Listed on 2026-03-01
Job specializations:
  • IT/Tech
    Data Engineer
Salary/Wage Range or Industry Benchmark: USD 60,000 - 80,000 per year
Job Description & How to Apply Below
Location: Town of Poland

Some careers shine brighter than others.

If you’re looking for a career that will help you stand out, join HSBC, and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

Your career opportunity

HSBC Markets and Securities is an emerging markets-led and financing-focused investment banking and trading business that provides tailored financial solutions to major government, corporate and institutional clients worldwide.

In IT we provide HSBC with a genuine competitive advantage across the globe. Global Business Insights (GBI) provides critical metrics and reports to Markets and Securities Services Operations, enabling them to monitor the health of their business and make data-driven decisions.

The GBI Transformation is a large and complex data integration programme spanning all of MSS Ops globally. We serve a diverse audience of users and data visualisation requirements, from ExCo down, drawing on over 80 data sources in multiple time zones across Middle Office, Post-Trade and Securities Services IT and elsewhere. We are a critical enabler for the Rubix 2025 Strategy and the MSS control agenda, providing operational KPI and KRI metrics that allow senior management to measure the success of their BAU and CTB investment dollars.

We are looking for a GCP developer who can design, develop, test and deploy ETL/SQL pipelines connected to a variety of on-prem and cloud data sources - both data stores and files. We will mainly use GCP technologies such as Cloud Storage, BigQuery, and Data Fusion.
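To give a concrete flavour of the BigQuery work described above, the sketch below generates DDL for a partitioned, clustered table in Python. This is an illustrative example only; the project, dataset, table and column names are invented, not taken from the posting.

```python
# Sketch: generating BigQuery DDL for a partitioned, clustered table.
# All project/dataset/table/column names here are hypothetical examples.

def build_trade_table_ddl(project: str, dataset: str, table: str) -> str:
    """Return a CREATE TABLE statement partitioned by trade date and
    clustered by desk and instrument. Partitioning lets BigQuery prune
    data by date; clustering co-locates rows that are frequently
    filtered together, which reduces bytes scanned (and so cost)."""
    return f"""
CREATE TABLE IF NOT EXISTS `{project}.{dataset}.{table}` (
  trade_id STRING NOT NULL,
  desk STRING,
  instrument STRING,
  notional NUMERIC,
  trade_date DATE NOT NULL
)
PARTITION BY trade_date
CLUSTER BY desk, instrument
OPTIONS (description = 'Example operational-metrics fact table');
""".strip()

ddl = build_trade_table_ddl("my-project", "mss_ops", "trades")
print(ddl)
```

Generating DDL as strings like this is one common way to keep table definitions in version control and deploy them through a CI/CD pipeline rather than by hand.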

You will also work with our DevOps tooling to deliver continuous integration/deployment capabilities, automated testing, security, and IT compliance.

The role will provide subject matter expertise to support the Enterprise Risk Management (ERM) Leadership Team (LT) and ERM Assurance teams in discharging their responsibilities for operational risk and resilience risk steward delivery across all service areas, delivering assurance activities, and embedding assurance practices, stewardship activities and the service catalogue in the respective GB/GF/Specialist teams.

What you’ll do
  • Design, build, test and deploy Google Cloud data models and transformations in the BigQuery environment (e.g. SQL, stored procedures, indexes, clusters, partitions, triggers, etc.).
  • Create and manage ETL/ELT data pipelines that model raw/unstructured data into a Data Vault universal model, and enrich, transform and optimise raw data into a form suitable for end-consumer usage.
  • Review and refine, interpret and implement business and technical requirements.
  • Deliver non-functional requirements, IT standards, and developer and support tools to ensure our applications are secure, compliant, scalable, reliable and cost-effective.
  • Monitor data pipelines for failures or performance issues and implement fixes or improvements as needed.
  • Optimise ETL/ELT processes for performance and scalability, ensuring they can handle large volumes of data efficiently.
  • Integrate data from multiple sources, ensuring consistency and accuracy.
  • Manage code artefacts and CI/CD using tools like Git, Jenkins, Google Secret Manager, etc. Fix defects and provide enhancements during the development period, and hand over knowledge, expertise, code and support responsibilities to the support team.
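The Data Vault modelling mentioned above can be made concrete: in Data Vault 2.0 style, hubs are keyed by a hash of the normalised business key, and links by a hash of the combined keys they join. A minimal sketch of that keying rule (the business-key values are invented examples):

```python
import hashlib

def dv_hash_key(*business_keys: str) -> str:
    """Data Vault 2.0 style hash key: normalise each business key
    (trim whitespace, upper-case), join with a fixed delimiter, then
    hash. Applying the same rules in every pipeline makes hub and
    link keys reproducible across loads and sources."""
    normalised = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# Hub key for a counterparty, and a link key joining a trade to it.
hub_counterparty = dv_hash_key("CPTY-001")
link_trade_cpty = dv_hash_key("TRD-42", "CPTY-001")
```

Because the key is derived deterministically from the business key, the same counterparty arriving from two different source systems lands on the same hub row without a central sequence generator.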
What you need to have to succeed in this role
  • Proven (3+ years) hands-on experience in SQL querying and optimisation of complex queries/transformations in BigQuery, with a focus on cost- and time-effective SQL coding and concurrency/data integrity. Proven (3+ years) hands-on experience in SQL data transformation/ETL/ELT pipeline development, testing and implementation, ideally in GCP Data Fusion.
  • Proven experience in Data Vault modelling and usage.
  • Hands-on experience with Cloud Composer/Airflow, Cloud Run and Pub/Sub. Hands-on development in Python and Terraform.
  • Proficiency in Git for version control and collaboration. Proficiency in designing, creating and maintaining CI/CD processes/pipelines in DevOps tools like…
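On the Python side of the role, orchestrated pipeline tasks are commonly wrapped with retry logic so that transient failures (quota limits, network blips) do not fail a whole run. A minimal, library-free sketch of that pattern; the task function here is purely illustrative:

```python
import time

def retry(task, attempts: int = 3, base_delay: float = 0.01):
    """Run `task`, retrying with exponential backoff between attempts;
    re-raise the last error once attempts are exhausted."""
    for i in range(attempts):
        try:
            return task()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))  # 0.01s, 0.02s, ...

calls = {"n": 0}

def flaky_extract():
    # Hypothetical extract step that fails twice, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient error")
    return "rows loaded"

result = retry(flaky_extract)
```

Orchestrators such as Airflow provide retries as task-level configuration; the sketch just shows the underlying pattern in plain Python.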
Position Requirements
10+ years’ work experience