Senior Data Engineer - Trade Surveillance
Job in Belfast, County Antrim, BT1, Northern Ireland, UK
Listed on 2026-03-01
Listing for: TP ICAP Group
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Data Security, Data Science Manager, Data Analyst
Job Description & How to Apply Below
Location: Belfast
Time type: Full time
Posted on: Today
Job requisition: R4971
**Role Overview**

Effective Market Abuse Surveillance is highly dependent on data of multiple types and from many different sources. Trade Surveillance requires complete Trade and Pre-Trade transactional data, Instrument and Client Reference Data, and Market Data. Communications Surveillance requires ingestion of messages from multiple electronic platforms.
This role is critical to ensuring the reliability, scalability, and compliance of the data pipelines that support surveillance systems across communications and trading activities, covering the structured and unstructured data described above.
The role bridges engineering and operations, enabling robust data ingestion, transformation, and monitoring to meet regulatory and internal compliance requirements. The Data Ops Engineer will collaborate with upstream teams to ensure that data completeness, accuracy, and timeliness are as expected, and that any data completeness or quality issues are made visible.
The role will also work on other Surveillance data initiatives such as persisting Surveillance Alerts in the firm’s data lake for analytics purposes.
Role Responsibilities
* Design, build, maintain and optimise end-to-end data pipelines and workflows between source data points and target destinations, working with the wider Surveillance Technology team to put automation, scalability and strategy at the heart of the design.
* Implement automated data completeness and quality checks, validation rules, and reconciliation processes to ensure the accuracy, completeness, and timeliness of ingested data, and to make visible any data that is not processed.
* Identify Critical Data Elements and implement failover and recovery strategies for the respective Data Flows.
* Build AWS infrastructure using Terraform or CDK.
* Write unit, integration, and infrastructure tests.
* Monitor, investigate and resolve data anomalies through collaboration with Business Analysts, Developers, and Testers across functions and verticals.
* Implement data management and governance frameworks to ensure data is ingested and loaded per the requirements of the consuming platform: Scila for Trade Surveillance, and Global Relay for Communications Surveillance.
* Partner with the Data Strategy and Data Infrastructure team to ensure that data lineage, auditability and retention policies are enforced across all necessary pipelines.
* Ensure that data consumed and processed complies with regulatory, legal, and security protocols.
* Work closely with surveillance analysts, compliance officers, and engineering teams to translate business rules into technical specifications.
* Partner closely with stakeholders and subject matter experts, such as the Cloud Infrastructure team, to optimise performance and costs.
* Stay up to date on industry trends and emerging technologies to ensure continuous improvement.
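To give a flavour of the automated completeness checks and reconciliation described above, here is a minimal sketch in Python; all names, dates and record counts are hypothetical, not taken from any actual TP ICAP system:

```python
from dataclasses import dataclass


@dataclass
class LoadStats:
    source_count: int  # records reported by the upstream system for a date
    loaded_count: int  # records actually landed in the target store


def completeness_report(stats_by_date: dict[str, LoadStats]) -> dict[str, int]:
    """Return, per date, the number of records missing from the target.

    Any non-empty result means a load is incomplete and should be surfaced
    (e.g. as an alert) rather than silently ignored.
    """
    return {
        date: s.source_count - s.loaded_count
        for date, s in stats_by_date.items()
        if s.source_count != s.loaded_count
    }


# Hypothetical example: one complete day, one day short by two records.
stats = {
    "2026-02-27": LoadStats(source_count=10_000, loaded_count=10_000),
    "2026-02-28": LoadStats(source_count=12_500, loaded_count=12_498),
}
print(completeness_report(stats))  # {'2026-02-28': 2}
```

In practice a check like this would run per pipeline inside an orchestrator (e.g. Airflow or Step Functions) and feed a monitoring dashboard, but the core idea is the same: compare source and target counts and make any gap visible.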
**Experience / Competences**

**Essential Criteria**

* Strong experience with ETL/ELT data pipeline builds, from design through implementation to maintenance, in relation to financial market messaging platforms and trade & order systems.
* Solid understanding of CI/CD pipelines, ideally with a background in software engineering, product management or data analytics.
* Experience with some of: EKS, Lambda, EventBridge, Step Functions, S3, DynamoDB, AWS Glue, Snowflake, Terraform and Transfer Family.
* Strong proficiency in Python or Java, SQL, and data pipeline frameworks (e.g., Airflow, dbt, Spark), with solid experience with the AWS ecosystem.
* Proven expertise with data governance frameworks and compliance regulations in financial services.
* Knowledge of streaming technologies (Kafka, Kinesis) and API integrations, and hands-on experience with monitoring tools (e.g. Grafana) and observability practices.
* Excellent problem-solving skills and ability to work in a fast-paced environment.
* Bachelor’s degree in Computer Science, Data Science, Engineering, or related field.
* Previous experience in Data Ops and Data Engineering.
* Strong communication and collaboration skills to engage with technical and non-technical stakeholders.
* Strong experience with Agile software delivery.
**Non-Essential**

* Experience with market data ingestion, metadata extraction, and event-driven architectures.
* Proficient with Terraform or CDK (infrastructure-as-code).
* Experience with business communications technology, e.g. Bloomberg, ICE, Symphony, Teams Chat, etc.
* Familiarity with security best practices, IAM, and VPN configuration.
* Experience with regulatory compliance and data security in financial services.
* Knowledge of financial markets and trading platforms.
* Experience with GitLab, Qlik Sense & Alation.
* Certifications in Data Ops, cloud platforms, or related areas.
Job Band: Manager; Job Level: 6
**Company Statement**

We know that the best…
Position Requirements: 10+ years of work experience