
Data Engineer / Apache Airflow Specialist

Remote / Online - Candidates ideally in
Town of Poland, Jamestown, Chautauqua County, New York, 14701, USA
Listing for: Intetics
Full Time, Remote/Work from Home position
Listed on 2026-02-06
Job specializations:
  • Software Development
    Data Engineer
Job Description

Intetics Inc. is a global technology company providing custom software application development, distributed professional teams, software product quality assessment, and "all-things-digital" solutions. Based on its proprietary business model of Remote In-Sourcing®, advanced Technical Debt Reduction Platform (TETRA™), and measurable quality management platform (Predictive Software Engineering), Intetics enables clients to achieve measurable business results.

Position details
  • Position: Data Engineer / Apache Airflow Specialist
  • Level: Senior
  • Technologies: Apache Airflow, Python, Flask, Elasticsearch, Unix/Linux, Oracle, PostgreSQL, GitLab
  • Workload: 1100 hrs/year
  • Location: Remote — work from anywhere
  • English level: Advanced
  • Education: Technical degree
Role Description

Intetics is looking for an experienced Data Engineer / Apache Airflow Specialist to join our distributed team for a data-driven project focused on large-scale ETL workflows, data indexing, and performance optimization.

The specialist will design, implement, and optimize data pipelines using Apache Airflow, manage database performance on Oracle and PostgreSQL, and support Elasticsearch integration for efficient data retrieval and search operations.

You will also be responsible for ensuring smooth deployment pipelines in GitLab and for collaborating with a cross-functional engineering team to enhance the quality and reliability of complex data processes.
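For candidates unfamiliar with the Elasticsearch side of this role, a small illustrative sketch (not part of the posting itself) may help: an indexing pipeline typically assembles the newline-delimited JSON body that Elasticsearch's `_bulk` API expects. The index name `products` and the sample documents below are hypothetical.

```python
import json

def bulk_body(index, docs):
    """Build an Elasticsearch _bulk request body (NDJSON)."""
    lines = []
    for doc in docs:
        # Each document is preceded by an action/metadata line.
        lines.append(json.dumps({"index": {"_index": index, "_id": doc["id"]}}))
        lines.append(json.dumps(doc))
    # The bulk API requires a trailing newline after the last line.
    return "\n".join(lines) + "\n"

body = bulk_body("products", [{"id": 1, "sku": "A-100"}])
print(body)
```

In a real pipeline this body would be sent to the cluster's `_bulk` endpoint; building it by hand like this is just a way to see the action-line/source-line pairing the API requires.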

What You'll Do
  • Develop, orchestrate, and maintain complex Apache Airflow DAGs for ETL and data-processing pipelines.
  • Build and optimize Python-based ETL scripts, integrating with Flask APIs when needed.
  • Design and manage Elasticsearch indexing and performance tuning workflows.
  • Handle Unix/Linux scripting and operations for automation and monitoring.
  • Work with Oracle and PostgreSQL databases for large-scale data processing.
  • Implement and maintain GitLab CI/CD pipelines for build, test, and deploy stages.
  • Collaborate with the project team to ensure scalability, reliability, and quality of data solutions.
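As a hedged illustration of the first two bullets above, the kind of extract-transform-load logic an Airflow `PythonOperator` task might wrap can be sketched in plain Python. The record fields and cleaning rules here are hypothetical examples, not requirements from the posting; a real task would read from Oracle/PostgreSQL and write to a target store rather than use in-memory rows.

```python
def extract():
    # A real pipeline would query Oracle/PostgreSQL here; sample rows
    # keep the sketch self-contained.
    return [
        {"id": 1, "name": " Alice ", "score": "10"},
        {"id": 2, "name": "Bob", "score": "n/a"},
    ]

def transform(rows):
    # Normalize strings and coerce numeric fields, dropping unparseable rows.
    cleaned = []
    for row in rows:
        try:
            cleaned.append({
                "id": row["id"],
                "name": row["name"].strip(),
                "score": int(row["score"]),
            })
        except ValueError:
            continue  # skip rows with non-numeric scores
    return cleaned

def load(rows):
    # A real task would bulk-index into Elasticsearch or insert into a DB;
    # here we simply return the rows that would be written.
    return rows

result = load(transform(extract()))
print(result)  # → [{'id': 1, 'name': 'Alice', 'score': 10}]
```

In Airflow, each of these functions would typically become its own task, with the DAG expressing the extract → transform → load dependency.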
What We're Looking For
  • ≥ 3 years of Apache Airflow DAG orchestration.
  • ≥ 5 years of Python (ETL focus), with Flask API experience as a plus.
  • ≥ 3 years of Elasticsearch (data indexing & optimization).
  • ≥ 3 years of Unix/Linux scripting & operations.
  • ≥ 3 years with Oracle or PostgreSQL (ideally both).
  • ≥ 3 years of GitLab pipelines (build/test/deploy).
  • Advanced English and a technical degree.
Nice-to-Have / Bonus
  • Experience with Great Expectations or similar data-quality tools.
  • Airflow on Kubernetes.
  • Proven performance tuning experience.
Seniority level
  • Mid-Senior level
Employment type
  • Full-time
Job function
  • Other
Industries
  • IT Services and IT Consulting
