
Senior Data Engineer

Remote / Online - Candidates ideally in
Edinburgh, EH1, Scotland, UK
Listing for: Comcarde Ltd
Remote/Work from Home position
Listed on 2026-02-28
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description & How to Apply Below
Location: City of Edinburgh

Benefits

  • Flexible and remote working
  • Remote working allowance
  • 33 days holiday including public holidays
  • Your birthday as a day off
  • Family healthcare
  • Life insurance
  • Employee assistance programme
  • A culture that champions rapid career progression
  • Investment in your learning and development
  • Regular team events, socials, and off-sites
  • A collaborative culture where documentation is treated as a first-class product
Why this role exists
  • Data is becoming a critical part of DGE’s next growth phase, powering internal analytics and customer-facing insights and monitoring.
  • The data engineering space is largely greenfield. We need a production-grade data platform that can ingest, transform, validate, and monitor data from core systems and operational tooling.
  • The robustness, scalability, and governance of our data architecture impact our ability to grow safely and meet regulatory expectations.
  • This role owns the insights data platform, while partnering closely with Analytics, Product, and Engineering to ensure the platform delivers trusted datasets and timely signals.
What You Will Do
  • Design and ship a tiered data platform that supports multiple latency needs, including low-latency pipelines for operational monitoring and customer-facing insights, plus batch pipelines for reporting and deeper analysis.
  • Build and own end-to-end ingestion patterns across batch, micro-batch, and selected near-real-time use cases, with strong orchestration and dependency management.
  • Implement schema evolution, data contracts, and approaches for late-arriving and corrected data so consumers can trust the outputs.
  • Treat curated datasets as products that are well defined, documented, reliable, and safe to use for both internal and external consumers.
  • Set platform standards for idempotent ingestion, deduplication, data quality, lineage, and observability.
  • Ensure the platform meets regulated fintech and payments expectations for access control, security, and governance while staying cost-efficient as volumes grow.
  • Partner with Product and Engineering on event and domain modelling. Decide what data gets emitted and what latency and granularity are needed for analytics and product goals.
  • Support Data Science with reliable feature-ready datasets and pragmatic collaboration, without owning reporting or business analysis.
  • Evolve the current lightweight tooling into a more observable, structured platform. Improve standards without creating unnecessary platform complexity.
  • Automate data infrastructure and workflows using infrastructure as code and CI/CD practices.
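As one illustration of the ingestion standards above (idempotency, deduplication, and handling of late or corrected data), here is a minimal sketch. It uses SQLite's upsert syntax as a lightweight stand-in for PostgreSQL's `INSERT ... ON CONFLICT`; the `events` table and the sample records are hypothetical, not part of the actual platform.

```python
import sqlite3

def ingest(conn, events):
    """Idempotently upsert events keyed by event_id.

    Replaying the same batch (e.g. after a retry) leaves the table
    unchanged; a corrected record with a higher version overwrites
    the earlier one, so late or corrected data wins deterministically.
    """
    conn.executemany(
        """
        INSERT INTO events (event_id, payload, version)
        VALUES (?, ?, ?)
        ON CONFLICT(event_id) DO UPDATE SET
            payload = excluded.payload,
            version = excluded.version
        WHERE excluded.version > events.version
        """,
        events,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (event_id TEXT PRIMARY KEY, payload TEXT, version INTEGER)"
)

batch = [("e1", "created", 1), ("e2", "created", 1)]
ingest(conn, batch)
ingest(conn, batch)                      # replayed batch: no duplicates
ingest(conn, [("e1", "corrected", 2)])   # late correction supersedes v1

rows = conn.execute(
    "SELECT event_id, payload, version FROM events ORDER BY event_id"
).fetchall()
```

Keying on a stable `event_id` plus a monotonic `version` is one common way to make ingestion safe to re-run without duplicating or regressing data.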
What We Are Looking For
Must have
  • Proven experience designing, building, and operating production-grade data pipelines and platforms.
  • Strong SQL, specifically PostgreSQL, plus at least one programming language such as Python or Java.
  • Experience with data processing or orchestration tooling such as Spark, Airflow, or Kafka.
  • Experience designing data models for analytics and reporting workloads.
  • Practical knowledge of data quality, testing, observability, lineage, and governance patterns.
  • Strong experience with AWS-based data platforms, with hands-on use of services like S3, Glue, Athena, Redshift, Kinesis, EMR, or MSK.
  • Infrastructure-as-code experience using Terraform or CloudFormation, and comfort running systems in production.
  • Ability to collaborate across Engineering, Product, Analytics, and Data Science, and drive standards through influence.
Nice to have
  • Experience building customer-facing data products where latency and correctness affect user outcomes.
  • Experience in regulated fintech or payments environments, especially around access control and auditability.
  • Experience with cost and performance optimisation at scale in AWS data stacks.
Tech context
  • This role will work across ingestion, orchestration, modelling, governance, and observability in an AWS-centric environment, with PostgreSQL and modern data tooling. Current tooling is intentionally lightweight, and the platform is evolving as DGE grows. In some cases you do not need to be hands-on day to day, but you must be fluent enough to make strong technical decisions and review work.

Apply for this role

Position Requirements
10+ Years work experience