Data Engineer
London, W1B, England, UK
Listed on 2026-02-28
Software Development
PCI Pal is a leading provider of SaaS solutions that empower companies to take payments securely, adhere to strict industry governance, and remove their business from the significant risks posed by non‑compliance and data loss. We are integrated and resold by some of the world’s leading business communications vendors, as well as major payment service providers. We are currently looking for a Data Engineer to join our UK team.
Location: Hybrid, with travel to our London and Ipswich offices at least once a month for collaboration and team meetings.
The Opportunity
The Data Engineer role is a new position created to reflect the rapid growth of the business.
We are looking for a skilled AWS Data Engineer to take ownership of our existing cloud‑based data infrastructure and evolve it into a best‑practice, scalable, and business‑critical platform. You will be responsible for refining and expanding our AWS‑based data ecosystem, ensuring it supports the organisation’s growing analytical and reporting needs across Development, Finance, Operations, and beyond.
This is a hands‑on role where you will define technical direction, embed best practices, and drive data quality, efficiency, and cost optimisation across the platform.
You Will Be Responsible For:
- Identify and implement improvements to our current data infrastructure, whether through cost optimisation or adherence to best practice.
- Collaborate with business stakeholders - including Development, Finance, and Product teams - to identify new data opportunities and translate requirements into practical solutions.
- Design, build, and maintain data ingestion and transformation pipelines and visualisations using AWS services (e.g., S3, Glue, Lambda, Athena, Redshift, Step Functions, QuickSight).
- Develop Python scripts and ETL processes for data processing and automation.
- Integrate data from various APIs and third‑party sources into cloud‑based data systems.
- Write and optimise SQL queries for data extraction, transformation, and validation.
- Use GitLab for version control, CI/CD, and collaborative development.
- Implement best practices for data quality, scalability, and security.
- Monitor and control data platform costs.
You Will Need:
- Strong hands‑on experience with AWS data services (e.g., S3, Glue, Lambda, Athena, Redshift, DynamoDB).
- Strong hands‑on experience with data visualisation tools such as QuickSight.
- Proficiency in Python for data processing, API integration, and automation.
- Strong SQL skills for data extraction and transformation.
- Experience working with REST APIs and JSON data structures.
- Practical experience with GitLab (version control, branching, and CI/CD pipelines).
- Solid understanding of data pipeline design and ETL best practices.
- Experience with infrastructure as code (e.g., Terraform or CloudFormation).
Advantageous to have:
- Knowledge of Salesforce and/or Sage Intacct data flows.
Offer:
- 25 days holiday, rising to 28 days per annum with length of service.
- Medical, dental and optical insurance cover.
- An exciting and flexible working environment surrounded by friendly and committed co‑workers.
- UK only: Electric Vehicle Scheme and a “Work from anywhere” policy (2 weeks per year).
- Training and development opportunities.
- Access to an employee assistance programme and wellbeing support hub.
- Team events.
- Ad‑hoc incentives and competitions.
If you have any questions or want to find out more, we’d love to hear from you. Please contact the People Team.