
GCP Data Engineer; W2 Position – No C2C

Job in Dearborn, Wayne County, Michigan, 48120, USA
Listing for: Systems Technology Group, Inc. (STG)
Full Time position
Listed on 2026-03-01
Job specializations:
  • Engineering
    Data Engineer
  • IT/Tech
    Data Engineer
Salary/Wage Range: 60,000 – 80,000 USD yearly
Job Description & How to Apply Below
Position: GCP Data Engineer (only W2 Position – No C2C Accepted)

STG is an SEI CMMI Level 5 company with several Fortune 500 and State Government clients. STG has an opening for a GCP Data Engineer.

Please note that this project assignment is with our own direct clients. We do not go through any vendors. STG only does business with direct end clients. This is expected to be a long‑term position. STG will provide immigration and permanent residency sponsorship assistance to those candidates who need it.

Position Description

We’re seeking an experienced GCP Data Engineer who can build a cloud analytics platform that meets ever-expanding business requirements with speed and quality, using lean Agile practices. You will analyze and manipulate large datasets, activating the enterprise’s data assets to support Enabling Platforms and Analytics on Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications into GCP.

Experience with large-scale solution delivery and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate the ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Key Responsibilities:

  • Work in a collaborative environment, including pairing and mobbing with other cross‑functional engineers
  • Work on a small agile team to deliver working, tested software
  • Work effectively with fellow data engineers, product owners, data champions and other technical experts
  • Demonstrate technical knowledge/leadership skills and advocate for technical excellence
  • Develop exceptional analytics data products using streaming and batch ingestion patterns in the Google Cloud Platform, grounded in solid data warehouse principles
  • Be the subject matter expert in data engineering and GCP tooling technologies
Skills Required
  • Experience working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment; implementing automation across all parts of the pipeline to minimize labor in development and production
  • Experience in analyzing complex data, organizing raw data and integrating massive datasets from multiple data sources to build subject areas and reusable data products
  • Experience working with architects to evaluate and productionize appropriate GCP tools for data ingestion, integration, presentation, and reporting
  • Experience working with all stakeholders to formulate business problems as technical data requirements, identifying and implementing technical solutions while ensuring key business drivers are captured in collaboration with product management
Experience Required
  • In‑depth understanding of Google’s product technology (or other cloud platform) and underlying architectures
  • 5+ years of analytics application development experience required
  • 5+ years of SQL development experience
  • 3+ years of cloud experience (GCP preferred) with solution designed and implemented at production scale
  • Experience working in GCP-based Big Data deployments (batch/real-time) leveraging Terraform, BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Airflow, etc.
  • 2+ years professional development experience in Java or Python, and Apache Beam
  • Experience extracting, loading, transforming, cleaning, and validating data, and designing pipelines and architectures for data processing
  • 1+ year of designing and building CI/CD pipelines
Education Required
  • Bachelor’s degree or equivalent qualification in computer science, engineering, or a related discipline
Location

The GCP Data Engineer position is based in Dearborn, MI. This is a great opportunity to gain corporate experience and advance your career.

Resume Submittal Instructions

Interested and qualified candidates should email their Word-formatted resume to Vasavi Konda – vasavi.konda(.@). In the subject line of the email, please include:
First and Last Name: GCP Data Engineer.

For more information about STG, please visit us at

“Opportunities don't happen, you create them.”

Troy, Michigan 48084
