
Data Engineer (Senior), Poland: Kielce, Kraków, Wrocław + Remote – PLN NET B2B

Remote / Online - Candidates ideally in Poland (Kielce, Kraków or Wrocław)
Listing for: Virtus Lab Sp. z o.o.
Remote/Work from Home position
Listed on 2026-01-14
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Salary/Wage Range or Industry Benchmark: 90,000 – 120,000 USD yearly
Job Description
Position: Data Engineer (Senior), Poland: Kielce, Kraków, Wrocław + Remote, 21 000 – 27 000 PLN NET B2B Da[...]
Location: Poland (Kielce, Kraków or Wrocław) + Remote

We are #VLteam – tech enthusiasts constantly striving for growth. The team is our foundation, which is why we care most about a friendly atmosphere, plenty of self‑development opportunities and good working conditions. Trust and autonomy are two essential qualities that drive our performance. We simply believe in the idea of “measuring outcomes, not hours”. Join us & see for yourself!

About the role

You will participate in defining the requirements and architecture for the new platform, implement the solution, and remain involved in its operations and maintenance post‑launch. Your work will introduce data governance and management, laying the foundation for accurate and comprehensive reporting that was previously impossible. You will adhere to and actively promote engineering best practices, data governance standards, and the use of open standards.

You will build data ingestion and processing pipelines, collaborating with stakeholders to define requirements, develop the pipelines, and establish data quality metrics.
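
As a purely illustrative sketch (not the project’s actual code), here is roughly what a small Airflow pipeline chaining extraction, a data quality gate, and a load step could look like using the TaskFlow API; every name in it (DAG id, columns, threshold) is a hypothetical placeholder.

# Hypothetical sketch only: not the project's real pipeline.
# Assumes Apache Airflow 2.4+ (TaskFlow API) and pendulum.
import pendulum
from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=pendulum.datetime(2026, 1, 1, tz="UTC"),
    catchup=False,
    tags=["example"],
)
def example_ingestion_pipeline():
    @task
    def extract():
        # Placeholder: a real task would pull from a source system or API.
        return [
            {"policy_id": 1, "premium": 1200.0},
            {"policy_id": 2, "premium": 950.0},
        ]

    @task
    def check_quality(rows):
        # Toy data quality metric: share of rows with a non-null premium.
        completeness = sum(r["premium"] is not None for r in rows) / len(rows)
        if completeness < 0.9:  # hypothetical threshold
            raise ValueError(f"Premium completeness {completeness:.0%} below threshold")
        return rows

    @task
    def load(rows):
        # Placeholder: a real task would write to Databricks/Snowflake.
        print(f"Loading {len(rows)} rows")

    load(check_quality(extract()))


example_ingestion_pipeline()

In practice the extract and load steps would talk to the real source systems and warehouse, and the quality checks would be far richer than a single completeness ratio.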

Required skills
  • Python – Advanced
  • Databricks/Snowflake – Advanced
  • SQL – Regular
  • Apache Airflow or other orchestration tool – Regular
  • Data modeling – Nice to have
  • LLM productivity tools like Cursor/Claude Code – Nice to have
  • dbt – Nice to have

Project: Compass BI

Project scope

Our client is a company that specialises in insurance, reinsurance, and asset management. It focuses on delivering scalable capital solutions that cater to various areas, including property, casualty, and speciality insurance lines.

Currently in its startup phase, the company is actively working to enhance its operations and expand its service capabilities. A key part of this plan is the adoption of modern technologies, which will let it streamline processes and improve the overall efficiency of its offerings through better data management. By leveraging modern technology, our client aims to position itself as a competitive player in the insurance industry while also addressing the evolving needs of its clients.

As part of this transformation, VirtusLab (VL) will accelerate progress on the client’s roadmap: building a modern data platform with scalable compute capabilities, enhancing reporting and workflow automation, and embedding cloud‑native engineering practices.

The project involves building a comprehensive reporting and analytics platform from the ground up. Key challenges include integrating data from multiple complex sources, ensuring high data quality and consistency, and designing scalable data models that support both operational and analytical reporting.

It also requires close collaboration with business stakeholders to understand reporting needs and translate them into effective data solutions.
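
To make the data quality theme a little more concrete, here is a small, hedged sketch of the kind of metric such a platform might track; it assumes pandas, and the column names and sample values are invented for illustration.

# Illustrative sketch only: basic data quality metrics of the kind the
# platform might compute. Column names and sample data are hypothetical.
import pandas as pd


def quality_metrics(df: pd.DataFrame, key_column: str) -> dict:
    """Return simple completeness and key-uniqueness metrics for a dataframe."""
    total = len(df)
    if total == 0:
        return {"completeness": 0.0, "key_uniqueness": 0.0}
    return {
        # Share of non-null cells across the whole frame.
        "completeness": float(df.notna().to_numpy().mean()),
        # Share of distinct business keys (1.0 means no duplicates).
        "key_uniqueness": df[key_column].nunique(dropna=True) / total,
    }


if __name__ == "__main__":
    sample = pd.DataFrame(
        {"policy_id": [1, 2, 2, 3], "premium": [1200.0, 950.0, None, 700.0]}
    )
    print(quality_metrics(sample, key_column="policy_id"))

Metrics like these would typically be computed on each load and surfaced alongside the reporting layer, so regressions in completeness or key uniqueness are caught early.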

Team

The team is small but highly motivated, taking on a broad scope of responsibilities as the platform is built and expanded.

What we expect in general
  • Hands‑on experience with Python
  • Proven experience with data warehouse solutions (e.g., BigQuery, Redshift, Snowflake)
  • Experience with Databricks or data lakehouse platforms
  • Strong background in data modelling, data catalogue concepts, data formats, and data pipelines/ETL design, implementation and maintenance
  • Ability to thrive in an Agile environment, collaborating with team members to solve complex problems with transparency
  • Experience with AWS/GCP/Azure cloud services, including: GCS/S3/ABS, EMR/Dataproc, MWAA/Composer or Microsoft Fabric, ADF/AWS Glue
  • Experience working in ecosystems that need improvement, and the drive to establish best practices as a long‑term process
  • Experience with Infrastructure as Code practices, particularly Terraform, is an advantage
  • Proactive approach
  • Familiarity with Spark is a plus
  • Familiarity with Streaming tools is a plus

Don’t worry if you don’t meet all the requirements. What matters most is your passion and willingness to develop. Apply and find out!

A few perks of being with us
  • Building tech community
  • Home office reimbursement
  • Training Package
  • Virtusity / in‑house training
  • And a lot more!
Apply now

We encourage interested candidates to submit their application. For more details, please visit our careers page.

Position Requirements: 10+ years of work experience