Data Engineer
Listed on 2026-01-13
IT/Tech
About Air Apps
Air Apps is a family‑founded company on a mission to create the world’s first AI‑powered Personal & Entrepreneurial Resource Planner (PRP). Born in Lisbon in 2018 and now with offices in Lisbon and San Francisco, we have remained self‑funded while reaching over 100 million downloads worldwide.
The Role
As a Data Engineer at Air Apps, you will design, build, and optimize data pipelines, warehouses, and lakes to ensure efficient data processing and analytics. You will work closely with data analysts, scientists, and software engineers to create scalable, reliable data infrastructure for business intelligence and machine learning initiatives.
Responsibilities
- Design, build, and maintain scalable data pipelines and ETL workflows to support analytics and reporting.
- Develop and optimize data warehouses and data lakes using cloud platforms such as AWS, Google Cloud, or Azure.
- Implement real‑time and batch data processing solutions for various business needs.
- Work with structured and unstructured data, ensuring proper data modeling and storage strategies.
- Ensure data reliability, consistency, and scalability through best practices in architecture and engineering.
- Collaborate with data analysts, scientists, and software engineers to enable efficient data access and analysis.
- Automate data ingestion, transformation, and validation processes to improve data quality.
- Monitor and optimize query performance and data processing efficiency.
- Implement security, compliance, and governance standards for data storage and access control.
- Stay up to date with emerging data engineering trends, tools, and technologies.
Requirements
- 4+ years of experience in data engineering, software engineering, or database management.
- Proficiency in SQL, Python, or Scala for data processing and automation.
- Hands‑on experience with cloud‑based data solutions (AWS Redshift, Google BigQuery, Azure Synapse, Snowflake).
- Experience building ETL pipelines with tools such as Apache Airflow, dbt, Talend, or Fivetran.
- Strong understanding of data modeling, schema design, and database optimization.
- Experience with big data frameworks (Apache Spark, Hadoop, Kafka, Flink) is a plus.
- Familiarity with orchestration tools, containerization (Docker, Kubernetes), and CI/CD workflows.
- Knowledge of data security, governance, and compliance (GDPR, CCPA, SOC 2).
- Strong problem‑solving and debugging skills with the ability to handle large‑scale data challenges.
- Experience working in fast‑paced, data‑driven environments with cross‑functional teams.
Benefits
- Apple hardware ecosystem for work.
- Annual Bonus.
- Medical Insurance (including vision & dental).
- Disability insurance – short and long‑term.
- 401(k) with up to 4% contribution.
- Air Conference – opportunity to meet the team, collaborate, and grow together.
- Transportation budget.
- Free meals at the hub.
- Gym membership.
We are committed to fostering a diverse, inclusive, and equitable workplace. Applicants from all backgrounds, experiences, and perspectives are welcomed.
Application Disclaimer
Applicants must submit their own work without any AI‑generated assistance. Any use of AI in application materials, assessments, or interviews will result in disqualification.