
Data Engineer; GCP, SQL, BI Reporting, PySpark

Remote / Online - Candidates ideally in San Jose, Santa Clara County, California, 95199, USA
Listing for: Insight Global
Remote/Work from Home position
Listed on 2026-01-11
Job specializations:
  • IT/Tech
    Data Engineer, Database Administrator
Salary/Wage Range or Industry Benchmark: USD 55.00 - 62.00 per hour
Job Description & How to Apply Below
Position: Data Engineer (GCP, SQL, BI Reporting, PySpark)

This range is provided by Insight Global. Your actual pay will be based on your skills and experience — talk with your recruiter to learn more.

Base pay range

$55.00/hr - $62.00/hr

Fintech company - W2 contract, 6 months with extension

Pay Rate: $55/hr - $62/hr

Job Description

We are seeking a talented and motivated Data Engineer to join our dynamic team in San Jose, CA. As a Data Engineer, you will play a crucial role in designing, building, and maintaining our data infrastructure to support our innovative financial technology solutions.

Key Responsibilities
  • Data Pipeline Development: Design, develop, and maintain scalable data pipelines to process and analyze large datasets.
  • Data Integration: Integrate data from various sources, ensuring data quality and consistency.
  • Database Management: Optimize and manage databases, ensuring high performance and availability.
  • ETL Processes: Develop and maintain ETL (Extract, Transform, Load) processes to support data warehousing and analytics (a brief, illustrative PySpark sketch of this kind of job follows this list).
  • Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
  • Data Security: Implement and maintain data security measures to protect sensitive information.
  • Documentation: Create and maintain comprehensive documentation for data processes and systems.
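For context on the pipeline and ETL responsibilities above, here is a minimal illustrative PySpark sketch of the kind of job this role describes: read raw files from a GCS landing bucket, apply basic data-quality rules, aggregate, and write curated Parquet output. All bucket paths, column names, and the schema are hypothetical placeholders rather than details from this posting, and reading gs:// paths assumes the GCS connector is available (as it is on Dataproc).

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-transactions-etl").getOrCreate()

# Extract: raw CSV files from a (hypothetical) GCS landing bucket.
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("gs://example-landing-bucket/transactions/*.csv")
)

# Transform: de-duplicate, drop bad rows, and build a daily per-account aggregate.
daily = (
    raw
    .dropDuplicates(["transaction_id"])
    .filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
    .withColumn("txn_date", F.to_date("created_at"))
    .groupBy("account_id", "txn_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
)

# Load: partitioned Parquet in a (hypothetical) curated bucket.
(
    daily.write
    .mode("overwrite")
    .partitionBy("txn_date")
    .parquet("gs://example-curated-bucket/daily_transactions/")
)

spark.stop()

In a GCP stack like the one named here, the curated output would typically then be loaded into BigQuery (for example via the spark-bigquery connector) to back SQL-based BI reporting.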
Qualifications
  • 7+ years of experience in data engineering or a related role.
  • Required: hands-on experience with GCP, SQL, BI reporting, and PySpark.
  • Background at a large ("big tech") technology company.
  • Education: Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL).
  • Strong programming skills in Python or Java.
  • Experience with big data technologies (e.g., Hadoop, Spark).
  • Familiarity with cloud platforms (e.g., AWS, GCP, Azure).
  • Knowledge of data warehousing concepts and tools (e.g., Snowflake, Redshift).
  • Excellent problem‑solving and analytical skills.
  • Strong communication and collaboration abilities.
  • Ability to work in a fast‑paced, dynamic environment.
Benefits
  • Comprehensive health, dental, and vision insurance.
  • 401(k) plan with company match.
  • Flexible work hours and remote work options.
Seniority level

Mid‑Senior level

Employment type

Contract

Job function

Engineering and Analyst

Industries

Software Development
