
PySpark / Teradata Data Warehouse Developer

Job in Lansing, Ingham County, Michigan, 48900, USA
Listing for: COOLSOFT
Full Time position
Listed on 2026-02-28
Job specializations:
  • IT/Tech
    Data Analyst, Data Engineer, Data Science Manager, Data Warehousing
Salary/Wage Range or Industry Benchmark: 80,000 - 100,000 USD yearly
Job Description & How to Apply Below
Position: PySpark / Teradata Data Warehouse Developer


Requirement

Job title: Developer

Job location: Lansing, MI

Skills required: Bachelor's degree in Computer Science, Information Systems, or Data Analytics; PySpark; Teradata Data Warehouse; problem-solving and communication skills

Open Date: 23-Feb-2026

Close Date:

Job type: Contract

Duration: 12 Months

Compensation: DOE

Status requirement: ---

Job interview type: ---

Email Recruiter: coolsoft

Job Description (Developer):
Bachelor's degree in Computer Science, Information Systems, or Data Analytics; PySpark; Teradata Data Warehouse; problem-solving and communication skills

Start date: 03/30/2026

End date: 1 year from projected start date

Submission deadline: 3/3/2026 10:00:00 AM

Client Info: DHHS

Note:

  • Interview Process: MS Teams Video Interviews
  • Duration: 1 year with possible extension

Hybrid schedule:
The resource will work a hybrid schedule. NO REMOTE-ONLY OPTION. The resource must be onsite from Day 1, two days a week (Wed/Thu onsite).

Open to local and non-local candidates. Non-local candidates must be willing to relocate starting Day 1. Please confirm this prior to submitting.

Description

We are seeking an experienced Senior Report Developer to serve as a Data Analyst and Dashboard Developer within the client's Analytics and Visualization Division. This role focuses on analyzing complex datasets, generating actionable insights, and developing interactive dashboards using Power BI, SAP Business Objects, Crystal Reports, and BI Query. In addition, the position will involve designing and optimizing cloud-native data and AI platforms leveraging Amazon Web Services (AWS) to support advanced analytics, machine learning, and real-time data processing initiatives.

The ideal candidate will have strong analytical skills, proficiency in reporting tools, and the ability to translate business requirements into effective data visualizations and scalable data solutions.

Key Responsibilities
  • Analyze large and complex datasets to identify trends, patterns, and actionable insights for business stakeholders.
  • Design, develop, and maintain interactive dashboards and reports using Power BI, SAP Business Objects, and Crystal Reports.
  • Develop and optimize SQL queries and scripts for data extraction and transformation.
  • Create and optimize queries in Hummingbird BI Query to support business intelligence and data analysis requirements.
  • Architect and implement end-to-end data pipelines using AWS services such as Amazon S3, Glue, Lambda, EMR, Redshift, Kinesis, and SageMaker.
  • Design, build, and optimize scalable cloud-native data and AI platforms leveraging AWS to support advanced analytics and machine learning initiatives.
  • Develop robust ETL/ELT workflows, ensuring data quality, governance, security, and compliance across distributed systems.
  • Collaborate with business units to gather requirements and translate them into technical specifications for reporting and data solutions.
  • Partner with cross-functional teams, including data architects, business analysts, and application developers, to ensure alignment of reporting and data solutions with organizational goals.
  • Ensure data accuracy, integrity, and compliance with client standards and policies.
  • Provide technical expertise in reporting tools and assist in troubleshooting and performance tuning of dashboards and reports.
  • Document processes, data flows, and reporting standards for ongoing maintenance and knowledge sharing.
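For candidates gauging fit, the ETL and data-quality responsibilities above amount to pipelines along these lines. This is an illustrative sketch only, not client code: the column names, figures, and the choice of plain-Python (rather than PySpark) are all invented for the example.

```python
import csv
import io

# Hypothetical extract step: raw records as CSV text (in practice this
# would come from a file, a Teradata export, or an S3 object).
RAW = """case_id,county,amount
1001,Ingham,250.00
1002,Wayne,
1003,Ingham,120.50
"""

def extract(text):
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Data-quality gate plus a typical reporting rollup:
    drop rows missing an amount, then total amounts per county."""
    totals = {}
    for row in rows:
        if not row["amount"]:  # quality check: reject incomplete rows
            continue
        county = row["county"]
        totals[county] = totals.get(county, 0.0) + float(row["amount"])
    return totals

totals = transform(extract(RAW))
print(totals)  # the two Ingham rows are summed; the incomplete Wayne row is dropped
```

In a production pipeline the same extract/validate/aggregate shape would be expressed with PySpark DataFrames and AWS Glue jobs, with rejected rows routed to a quarantine table rather than silently dropped.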
Required Qualifications
  • Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field.
  • 3+ years of experience in report development and data analysis within large-scale environments.
  • 3+ years of experience in AI/data engineering on AWS platforms.
  • 3+ years of experience with Python, PySpark, and advanced data structures.
  • 3+ years of experience with Teradata Data Warehouse and ETL utilities.
  • Experience with AWS cloud and AI platforms.
  • Proficiency in SQL, Hummingbird BI/Query, SAP Business Objects, Tableau, and Power BI.
  • Strong experience with dashboard development and data visualization best practices.
  • Solid understanding of relational databases and data warehouse concepts, preferably Teradata.
  • Excellent analytical, problem-solving, and communication skills.

Call  Ext 100 for more details. Please provide Requirement  while calling.

EOE Protected Veterans/Disability
