Data Engineer
Job in Boca Raton, Palm Beach County, Florida, 33481, USA
Listed on 2026-02-28
Listing for: AgileEngine, LLC.
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Data Analyst, Data Science Manager
Job Description & How to Apply Below
WHY JOIN US
If you're looking for a place to grow, make an impact, and work with people who care, we'd love to meet you!
ABOUT THE ROLE
This Senior Data Engineer (Python) role is central to transforming large, diverse datasets into reliable insights that support research and strategic decisions across a global financial platform. You will help shape a unified data ecosystem, partnering with data scientists, researchers, and stakeholders to connect technology with real business impact. What makes this opportunity unique is the scale of data, use of modern cloud, AI, and data engineering practices, and strong influence on platform evolution.
It’s a chance to grow technically while contributing to a mission-driven, collaborative environment.
RESPONSIBILITIES
- Design and build scalable Data Lakes, Data Warehouses, and Data Lakehouses
- Design and implement robust ETL/ELT processes at scale using Python and pipeline orchestration tools like Airflow
- Develop ingestion workflows from diverse third-party APIs and data sources
- Manage and optimize file formats such as Parquet, Avro, and ORC for high-performance data retrieval
- Work with AI development tools to support machine learning initiatives and advanced analytics
- Act as a technical consultant to gather requirements, understand business goals, and translate them into technical roadmaps
- Work with Terraform and other tools to build AWS and on-prem infrastructure
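The ingestion and ETL responsibilities above can be sketched as a minimal extract/transform/load pipeline in Python. The record shape, field names, and validation rules here are illustrative assumptions, not details from the listing; a production pipeline would paginate a real API and write Parquet to the lake instead of returning a list.

```python
from dataclasses import dataclass

# Hypothetical raw records, standing in for one page of third-party API results.
RAW = [
    {"id": "1", "price": "101.5", "ticker": "AAA"},
    {"id": "2", "price": "not-a-number", "ticker": "BBB"},  # malformed row
    {"id": "3", "price": "99.0", "ticker": "ccc"},
]

@dataclass
class Quote:
    id: int
    ticker: str
    price: float

def extract(raw):
    """Extract: yield raw dicts (in practice, paginate a REST API)."""
    yield from raw

def transform(rows):
    """Transform: coerce types, normalize tickers, drop unparseable rows."""
    for row in rows:
        try:
            yield Quote(id=int(row["id"]),
                        ticker=row["ticker"].upper(),
                        price=float(row["price"]))
        except (KeyError, ValueError):
            continue  # in production, route to a dead-letter sink instead

def load(quotes):
    """Load: collect into a list (in practice, write Parquet to object storage)."""
    return list(quotes)

quotes = load(transform(extract(RAW)))
print([q.ticker for q in quotes])  # the malformed row is dropped
```

In an orchestrated setup, each of these stages would typically become a task in an Airflow DAG, with the load step emitting Parquet files for downstream query engines.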
REQUIREMENTS
- You must be authorized to work for ANY employer in the US, as employment visa sponsorship is not available
- Bachelor’s degree in computer science/engineering or other technical field, or equivalent experience
- 5+ years of hands-on Python experience
- 5+ years of experience with data processing and analytics libraries such as Pandas, Polars, PySpark, and DuckDB
- 2+ years of experience with Big Data technologies such as Spark and Snowflake
- Expert‑level knowledge of Airflow or similar pipeline orchestration tools
- Deep understanding of Medallion Architecture, columnar file formats, and database technologies including SQL, NoSQL, and Lakehouse architectures
- Proven ability to work with third-party APIs for complex data ingestion
- Proficiency with cloud platforms such as AWS, GCP, and Snowflake, including advanced SQL optimization
- Familiarity with the fintech industry and financial data domains
- Documentation skills for data pipelines, architecture designs, and best practices
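The Medallion Architecture named in the requirements layers data as bronze (raw as ingested), silver (validated and typed), and gold (business-level aggregates). A toy sketch of the idea, with an assumed record shape; real implementations would use tables in Spark, Snowflake, or a Lakehouse rather than Python lists:

```python
# Bronze layer: raw events exactly as ingested, including bad records.
bronze = [
    {"ticker": "AAA", "price": "10.0"},
    {"ticker": "AAA", "price": "12.0"},
    {"ticker": "BBB", "price": None},      # bad record is preserved in bronze
    {"ticker": "BBB", "price": "20.0"},
]

# Silver layer: validated and typed; bad records filtered out.
silver = [
    {"ticker": r["ticker"], "price": float(r["price"])}
    for r in bronze
    if r["price"] is not None
]

# Gold layer: aggregates ready for analysts (average price per ticker).
gold = {}
for r in silver:
    gold.setdefault(r["ticker"], []).append(r["price"])
gold = {t: sum(ps) / len(ps) for t, ps in gold.items()}

print(gold)  # {'AAA': 11.0, 'BBB': 20.0}
```

Keeping the raw bronze copy means the silver and gold layers can always be rebuilt when cleaning rules or aggregations change.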
NICE TO HAVE
- OpenSearch or Elasticsearch
- AWS SageMaker Studio and Jupyter for data analysis
- Terraform
- Scala
BENEFITS
- Professional growth: Mentorship, Tech Talks, and personalized growth roadmaps
- Competitive compensation: USD‑based pay with education, fitness, and team activity budgets
- Exciting projects: Modern solutions with Fortune 500 and top product companies
- Flextime: Flexible schedule with remote and office options