
Sr Data Engineer

Job in Minneapolis, Hennepin County, Minnesota, 55400, USA
Listing for: Jack Links Protein Snacks
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Science Manager
Salary/Wage Range or Industry Benchmark: USD 80,000 – 100,000 per year
Job Description & How to Apply Below

Company Description

Running with Sasquatch is more than just a clever marketing campaign. As a Jack Link’s team member, Running with Sasquatch means we roll up our buffalo plaid sleeves and do the hard work first. We don’t shy away from challenges. In fact, we push hard and take risks. True to our North Woods roots, we're a bunch of ordinary people who accomplish extraordinary things by driving results with innovation, creativity and a clear sense of urgency.

Like our awesome protein products, we have an unwavering passion for quality, and you won’t find anything artificial here. What you see is what you get…authentic, humble and fun people who Run with Sasquatch!

Running with Sasquatch takes a team. We invite you to run with us, succeed with us, and celebrate with us. Most importantly, Feed Your Wild Side® with us on our journey to be the dominant global leader of branded protein snacks!

Jack Link's Protein Snacks is a global leader in snacking and the No. 1 meat snack manufacturer worldwide. Still family-owned and operated with headquarters in Minong, Wisconsin, Jack Link’s also has a large corporate hub in Downtown Minneapolis, Minnesota, and operates a total of 11 manufacturing and distribution facilities in four countries. Jack Link’s produces high-quality, great-tasting protein snacks that feed the wild sides of consumers around the world.

Jack Link's Protein Snacks family of brands includes Jack Link's, LK, World Kitchens Jerky, Bifi and Peperami.

Job Description

Are you ready to shape the future of data at Jack Link’s? We’re looking for a Senior Data Engineer to play a pivotal role in evolving our modern data foundation, enabling the next generation of analytics, automation, and insights. At Jack Link’s, we are building an analytics capability that fuels smarter decisions, faster innovation, and stronger business outcomes.

We’re looking for a Senior Data Engineer to lead the development of scalable, end-to-end data pipelines that power analytics, automation, and external product integrations. This full‑stack role spans the entire data lifecycle, from ingestion and transformation to governance and infrastructure. A key focus area is building and maintaining scalable data pipelines from multiple data sources such as SAP S/4HANA and Datasphere into Microsoft Fabric.

You’ll work closely with IT professionals, product owners, business relationship managers, and analytics teams to design and maintain data models, schemas, and tables that support reporting, dashboards, and ML/AI workflows. A strong focus will be on building and maintaining data pipelines, and on data preparation and feature engineering, to ensure data is structured, accessible, and optimized for decision‑making. This role collaborates with enterprise data architects, platform engineers, and analytics product owners, and is ideal for someone who thrives in a cross‑functional, product‑oriented environment.

Core Responsibilities
  • Build & Manage Data Pipelines: Design and maintain scalable pipelines for ingesting, transforming, and storing data from SAP and non‑SAP sources into Microsoft Fabric.
  • Lead Medallion Architecture Design in Microsoft Fabric: Design and manage bronze, silver, and gold layer structures within Microsoft Fabric to support scalable, governed, and analytics‑ready data pipelines.
  • Develop in Microsoft Fabric: Use tools like Lakehouse, Data Warehouse, and Notebooks to process and transform data efficiently.
  • Model & Prepare Data for Analytics: Build robust data models and perform feature engineering to support reporting, dashboards, and ML/AI use cases.
  • Integrate SAP Systems: Connect SAP Datasphere and S/4HANA analytics into Fabric; orchestrate SAP and non‑SAP data flows.
  • Ensure Data Quality & Governance: Implement governance practices, maintain metadata, and ensure data integrity across platforms.
  • Automate with Code: Write clean, efficient Python and SQL for ETL workflows, automation, and API development.
  • Support MLOps & AIOps: Help deploy and monitor analytics models using modern DevOps practices.
  • Collaborate & Mentor: Partner with business, analytics, and IT teams; mentor junior engineers and promote best practices.
Required…