Data Engineer
Listed on 2026-03-01
-
IT/Tech
Data Engineer, Data Science Manager
RAPP Chicago is looking for a Data Engineer to join our award-winning Technology team.
WHO WE ARE
We are RAPP – world leaders in activating growth with precision and empathy at scale.
As a global, next-generation precision marketing agency we leverage data, creativity, technology, and empathy to foster client growth. We champion individuality in the marketing solutions we create, and in our workplace. We fight for solutions that adapt to the individual’s needs, beliefs, behaviors, and aspirations.
We foster an inclusive workplace that emphasizes personal well-being.
HOW WE DO IT
At RAPP, our fearless superconnectors help to create value from personal brand experiences by focusing on three key areas: connected data, connected content and connected decisioning.
Our data analysts identify who that person is, our strategists understand what they want, and our award-winning technologists and creatives know how to deliver it – ensuring we’re able to activate authentic customer connections for our clients.
Part of Omnicom’s Precision Marketing Group, RAPP comprises 2,000+ creatives, technologists, strategists, and data and marketing scientists across 15+ global markets.
YOUR ROLE
We are looking for a Data Engineer who is eager to learn and grow while contributing to the development of scalable, cloud‑native data pipelines and platforms. The ideal candidate has foundational knowledge of Python and an interest in building data workflows using modern technologies such as Apache Airflow, AWS Lambda, DynamoDB, and dbt. You should have a strong curiosity about how data systems operate, a willingness to learn data engineering best practices, and a motivation to support advanced analytics as you gain hands‑on experience.
- Data Pipeline Development
- Design, build, and maintain robust ETL/ELT pipelines using Python and Airflow.
- Develop serverless workflows leveraging AWS Lambda for scalable event‑driven data processing.
- Implement and optimize dbt models for analytics and transformations.
- Data Architecture & Storage
- Design schemas and manage data in DynamoDB and other cloud‑native storage solutions.
- Ensure high availability, scalability, and performance of data systems.
- Integrate structured, semi‑structured, and unstructured data sources.
- Automation & Orchestration
- Build workflow orchestration strategies using Airflow for scheduling and monitoring pipelines.
- Automate infrastructure deployment and CI/CD pipelines for data services.
- Quality & Governance
- Implement data validation, testing, and monitoring frameworks.
- Ensure compliance with security, privacy, and governance standards.
- 1–3 years of experience in data engineering, software engineering, or a related role.
- Proficiency in Python for data engineering and automation.
- Familiarity with Apache Airflow for workflow orchestration.
- Basic understanding of AWS Lambda and serverless design patterns.
- Exposure to DynamoDB (schema design and performance considerations).
- Knowledge of dbt for data transformation and analytics modeling.
- Experience working in cloud environments (AWS preferred).
- Understanding of CI/CD workflows, Git, and DevOps practices.
- Strong analytical, problem‑solving, and communication skills.
- Experience with other AWS services (S3, Glue, Redshift, Kinesis).
- Familiarity with data warehouse and data lake architectures.
- Exposure to real‑time streaming and event‑driven data pipelines.
- Knowledge of containerization (Docker, Kubernetes).
- Exceptional attention to detail and organizational skills.
- Strong written and verbal communication skills, with the ability to explain complex data systems to non‑technical users.
- Ability to work collaboratively and cross‑functionally with creative, marketing, and IT teams.
- Proactive problem‑solver who can identify issues and suggest improvements.
- Time management skills with the ability to prioritize and manage multiple tasks in a fast‑paced environment.
RAPP's current hybrid model is designed to enable in‑person connections and collaboration that is core to our culture, while also supporting flexibility for all employees. As such, we have the option to work…