Data Engineer
Job in Glasgow, Glasgow City Area, G1, Scotland, UK
Listed on 2026-02-28
Listing for: N Consulting Limited
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing
Job Description & How to Apply Below
Role: Data Engineer
Location: Glasgow, UK
Work Mode: Hybrid (3 days from office)
Contract Role: 6 months
Experience: 10+ years
Start Date: Immediate joiners only, or candidates with a maximum 2-3 weeks' notice
Visa Sponsorship: Not available
Must-have skills: AWS cloud ecosystem, Snowflake, Python, Apache Spark, banking domain
Role Overview
We are seeking an experienced Data Engineer with strong expertise in the AWS cloud ecosystem, Snowflake, Python, and Apache Spark, along with proven experience in the banking domain. The ideal candidate will be responsible for designing, developing, and optimizing scalable data pipelines and modern data platforms that support analytics, reporting, and regulatory requirements.
Key Responsibilities
- Design, build, and maintain scalable data pipelines using AWS services and modern data engineering practices.
- Develop and optimize ETL/ELT workflows using Python and Apache Spark.
- Implement and manage Snowflake data warehouse solutions, including data modeling, performance tuning, and optimization.
- Work closely with business stakeholders, data analysts, and architects to understand banking data requirements.
- Integrate data from multiple banking systems such as payments, transactions, customer, and risk platforms.
- Ensure data quality, governance, security, and compliance aligned with banking regulations.
- Develop data ingestion frameworks for structured and semi-structured data.
- Optimize data processing performance and cost efficiency within AWS environments.
- Support real-time and batch data processing solutions.
- Document data architecture, data flows, and technical processes.
Required Skills & Qualifications
- 6+ years of experience in Data Engineering.
- Strong hands-on experience with AWS services (S3, Glue, Lambda, Redshift, EMR, Athena, Step Functions, IAM).
- Extensive experience with Snowflake, including schema design and performance tuning.
- Strong programming skills in Python.
- Hands-on experience with Apache Spark / PySpark.
- Experience building ETL/ELT pipelines and data integration frameworks.
- Strong SQL and data modeling skills.
- Experience working with large-scale datasets.
Note: applications are not currently being accepted from your jurisdiction for this job via this jobsite. Candidate preferences are the decision of the Employer or Recruiting Agent, and are controlled by them alone.