
Software Developer - Senior / Senior Data Engineer

Job in Southlake, Tarrant County, Texas, 76092, USA
Listing for: Aquent
Full Time position
Listed on 2026-03-08
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range: USD 80,000 to 100,000 per year
Job Description & How to Apply Below
Position: Software Developer - Senior / Senior Data Engineer [208286]

Imagine being at the forefront of protecting financial integrity and enabling critical data‑driven decisions within a leading financial services institution. Our client is dedicated to innovation and safeguarding their operations through robust data solutions. They are seeking a talented and driven individual to join their team and significantly impact their fraud data analytics and reporting capabilities. This is an exceptional opportunity to leverage cutting‑edge cloud technologies and contribute directly to the security and stability of financial systems.

As a key member of the team, you will be instrumental in designing, building, and maintaining scalable data pipelines on a leading cloud platform. Your expertise will directly support the development of effective data solutions crucial for fraud data analytics and reporting. You will collaborate with cross‑functional teams, ensuring data integrity, reliability, and scalability, and your work will empower data‑driven insights that protect customers and assets.

What You’ll Do:
  • Design, build, and maintain robust and scalable data pipelines using cloud platform tools such as BigQuery, Cloud Storage, Dataflow (Apache Beam), Cloud Composer (Airflow), and Pub/Sub.
  • Develop high‑performance, production‑grade Python and SQL code, optimizing queries for efficient data extraction, transformation, and loading (ETL) processes.
  • Implement complex data models in BigQuery, leveraging partitioning, clustering, and materialised views to achieve optimal performance.
  • Collaborate closely with cross‑functional teams, including business customers and Subject Matter Experts, to gather data requirements and deliver impactful solutions.
  • Implement and uphold best practices for data quality, data governance, and data security.
  • Proactively monitor and troubleshoot data pipeline issues, ensuring high availability and performance of critical data flows.
  • Contribute to strategic data architecture decisions, providing recommendations for continuous improvement of data pipelines.
  • Stay current with emerging trends and technologies in cloud‑based data engineering and cybersecurity to drive innovation.
  • Lead investigation and resolution efforts for identified data issues, taking ownership to resolve them in a timely manner.
  • Document processes and procedures thoroughly for producing accurate metrics and ensuring operational clarity.
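The pipeline responsibilities above can be sketched in miniature. The following is a hedged, self-contained Python example of an ETL-style transform with a data-quality filter, a partition key, and a fraud-review flag; the record shape and the `FLAG_THRESHOLD` value are illustrative assumptions, and in production this logic would typically run on Dataflow (Apache Beam) with rows read from Cloud Storage or Pub/Sub and loaded into a partitioned BigQuery table:

```python
from datetime import date

# Hypothetical raw record shape -- in a real pipeline these rows would be
# ingested from Cloud Storage or Pub/Sub rather than defined inline.
RAW_TRANSACTIONS = [
    {"id": "t1", "amount": 125.00, "country": "US", "ts": "2026-03-01"},
    {"id": "t2", "amount": 9800.00, "country": "US", "ts": "2026-03-01"},
    {"id": "t3", "amount": -50.00, "country": "GB", "ts": "2026-03-02"},  # malformed
]

FLAG_THRESHOLD = 5000.00  # illustrative threshold for fraud review


def transform(rows):
    """Clean raw rows and flag high-value transactions for fraud review."""
    out = []
    for row in rows:
        if row["amount"] <= 0:  # data-quality rule: drop malformed amounts
            continue
        out.append({
            "id": row["id"],
            "amount": round(row["amount"], 2),
            "country": row["country"],
            # Date column suitable as a BigQuery partition key.
            "partition_date": date.fromisoformat(row["ts"]),
            "needs_review": row["amount"] >= FLAG_THRESHOLD,
        })
    return out


clean = transform(RAW_TRANSACTIONS)
print(len(clean), sum(r["needs_review"] for r in clean))  # prints: 2 1
```

Keeping the transform as a pure function, as here, is one common way to make the business logic unit-testable independently of the Beam pipeline or BigQuery load step that would surround it in production.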
What You’ll Bring:
  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field.
  • 8+ years of hands‑on experience in data management, including gathering data from diverse sources, consolidating it into centralised locations, and transforming it with business logic for consumption in visualisation and data analysis.
  • Strong expertise in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Composer, and related cloud platform services.
  • Proficiency in Python and SQL for data processing and automation.
  • Extensive experience with ETL processes and data pipeline design.
  • Excellent problem‑solving skills and meticulous attention to detail.
  • Strong communication and collaboration skills, with the ability to actively listen, dialogue freely, and verbalise ideas effectively.
  • Ability to thrive in an Agile work environment, delivering incremental value to customers by effectively managing and prioritising tasks.
Bonus Points:
  • Deep expertise in real‑time processing using Kafka or Pub/Sub.
  • Experience with Power BI development and visualisation.
  • Familiarity with modern data stacks such as Snowflake or Databricks (though the role’s primary focus is Google Cloud).
  • Knowledge of DevOps practices and tools such as Terraform.
  • Familiarity with data visualisation tools such as Tableau, Grafana, and/or Looker.
  • Google Professional Data Engineer certification.
  • Demonstrated domain knowledge in Fraud and Financial Crime.
About Aquent Talent:
    Aquent Talent connects the best talent in marketing, creative, and design with the world’s biggest brands. Our eligible talent get access to benefits like subsidised health, vision, and dental plans, paid sick leave, and retirement plans with a match. Aquent is an equal-opportunity employer: we evaluate qualified applicants without regard to race, colour, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other legally protected characteristics.

    We’re about creating an inclusive environment—one where different backgrounds, experiences, and perspectives are valued, and everyone can contribute, grow their careers, and thrive.
Position Requirements
10+ years’ work experience