Expert, Data Platforms Engineer

Job in Louisville, Jefferson County, Kentucky, 40201, USA
Listing for: Schneider Electric
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech: Data Engineer
Salary/Wage Range: $114,400 - $171,600 USD per year
Job Description & How to Apply Below

For this U.S. based position, the expected compensation range is $114,400 - $171,600 per year, which includes base pay and short-term incentive.

The compensation range for this full-time position applies to candidates located within the United States. Our salary ranges are determined by reviewing roles of similar responsibility and level. Within the salary range, individual pay is determined by several factors including performance, knowledge, job-related skills, experience, and relevant education or training. Schneider Electric also offers a comprehensive benefits package to support our employees, including medical (with member reward points), dental, vision, and basic life insurance; Benefit Bucks (credits to apply toward your benefits); flexible work arrangements; paid family leave; 401(k) + match; well-being and recognition (including service anniversary) programs; 12 holidays per year; 15 days of paid time off per year (pro-rated in the first year of employment based on start date); the opportunity to purchase company stock (eligibility depends on start date); and military leave benefits.

You must submit an online application to be considered for the position. The Company will accept applications on an ongoing basis until the position is filled.

If you believe this job posting is not compliant with applicable state pay transparency laws in the U.S., please notify the Company as soon as possible upon discovery by completing the Job Posting Compliance Form.

Join Schneider Electric and help us power a sustainable future through data. As a Data Engineer within our Data Platforms organization, you’ll play a critical role in designing and delivering advanced data solutions that enable trusted analytics and AI across the enterprise. You will collaborate with business stakeholders, solution architects, and global technology teams to create efficient, scalable data flows while ensuring strong data governance practices.

Your expertise will help transform raw data into actionable insights that drive innovation and smarter decisions for our customers and the planet.

Job Description
  • Design, develop, and engineer the data lake through data source identification, data extraction, data modelling, data store design, and governance.
  • Develop data flows to ingest and transform both structured and unstructured data using warehousing and big data methodologies and tools to build refined and aggregated data assets for business intelligence and AI enablement.
  • Implement efficient patterns for data storage, partitioning, semantic layer and metadata structures to enable consumption of data as a product.
  • Implement automation capabilities to enable governance features for data privacy and protection, data inventory and cataloguing, authorization and data security.
  • Implement enhanced observability, cost management and reporting strategies for data engineering operations.
  • Build relevant integration testing and quality assurance capabilities while ensuring compliance with architectural standards for development.
  • Perform assessments (proofs of concept) of the latest tools and technologies to inform solutions and help evolve the technology landscape.
  • Lead, mentor, and support the Data Delivery teams with methodologies, tools and technical knowledge.
Qualifications and Experience
  • 6+ years of experience in technical roles with a focus on data engineering.
  • 3+ years of hands-on experience with AWS services, including but not limited to S3, KMS, Lambda, Glue (or Spark), SQS, EventBridge, and AWS Step Functions.
  • 3+ years of hands-on experience writing clean, modular, testable, and maintainable code using functional or OOP principles.
  • Good understanding of AWS networking concepts: VPC, subnets, routing, NAT gateways, security groups.
  • Proficiency in IAM concepts: roles, policies, assume‑role, and cross‑account access.
  • Solid experience in at least one programming language such as Python, Scala, or Java.
  • Good understanding of Spark architecture, execution plans, and performance tuning.
  • Strong SQL skills (complex joins, window functions, aggregations, etc.) and experience with Redshift, Athena, or other distributed compute engines.
  • Experience…