Data Engineer; Remote from Poland
Listed on 2026-01-22
Software Development
Data Engineer
Location: Poland (Remote)
This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Data Engineer in Poland.
This role offers the opportunity to shape and maintain a high-performing, scalable data infrastructure in a dynamic, fast-growing environment. You will be responsible for designing, developing, and optimizing data pipelines while ensuring data quality, security, and performance. Collaborating closely with development and operations teams, you will influence the evolution of the data warehouse platform and experiment with new tools to drive innovation.
The position emphasizes hands‑on work with cloud‑based technologies, modern data warehousing solutions, and ELT frameworks. Ideal candidates are proactive, solution‑oriented, and passionate about enabling data‑driven decision‑making across the organization. This is a chance to work in a flexible, creative, and supportive culture while making a tangible impact on product and business outcomes.
Responsibilities:
- Maintain, configure, and optimize the existing data warehouse platform and pipelines.
- Design and implement incremental data integration solutions prioritizing data quality, performance, and cost‑efficiency.
- Drive innovation by experimenting with new technologies and recommending platform improvements.
- Collaborate with development and operations teams to ensure seamless data flow and integration.
- Implement and enforce data security, auditing, and monitoring best practices.
- Support the scaling and evolution of the data architecture to meet growing business needs.
Requirements:
- 7+ years of experience in data warehousing, database administration, or database development.
- 5+ years of hands‑on experience as a Data Engineer using SQL and Python.
- Strong experience with cloud platforms such as AWS, GCP, or Azure, including containerization (Docker/Kubernetes).
- Proven ability to work with large datasets using tools like Snowflake, BigQuery, Redshift, Databricks, Vertica, Teradata, or Hadoop/Hive/Spark.
- Experience building maintainable, high‑performance, and scalable data pipelines.
- Proficiency with ELT tools and data integration frameworks such as Airflow, dbt, S3, and REST APIs.
- Positive, solution‑oriented mindset and willingness to learn new technologies.
- Excellent written and verbal communication skills in English.
Benefits:
- Competitive total compensation package.
- Strong work‑life balance initiatives and flexible remote work environment.
- Autonomy and freedom to make decisions and propose improvements.
- Opportunities for professional growth, continuous learning, and career development.
- Collaborative, people‑oriented environment with a supportive culture.
- Exposure to innovative technologies and the chance to make a tangible impact on products and business outcomes.
We use an AI-powered matching process to ensure your application is reviewed quickly, objectively, and fairly against the role's core requirements. Our system identifies the top-fitting candidates, and this shortlist is then shared directly with the hiring company. The final decision and next steps (interviews, assessments) are managed by their internal team.
We appreciate your interest and wish you the best!
Data Privacy Notice
By submitting your application, you acknowledge that Jobgether will process your personal data to evaluate your candidacy and share relevant information with the hiring employer. This processing is based on legitimate interest and pre‑contractual measures under applicable data protection laws (including GDPR). You may exercise your rights (access, rectification, erasure, objection) at any time.