
Senior Google Cloud Data Engineer

Job in New York, New York County, New York, 10261, USA
Listing for: Valtech
Part Time position
Listed on 2026-01-15
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description & How to Apply Below
Location: New York

Why Valtech? We’re advisors, visionaries, creatives and techies. We embrace all things digital. We talk to each other. We have fun. We love our clients. We’re looking ahead. We are global.

Why Valtech? We’re the experience innovation company - a trusted partner to the world’s most recognized brands. To our people we offer growth opportunities, a values‑driven culture, international careers and the chance to shape the future of experience.

The opportunity

At Valtech, you’ll find an environment designed for continuous learning, meaningful impact, and professional growth. Whether you're pioneering new digital solutions, challenging conventional thinking or building the next generation of customer experiences, your work will help transform industries.

We are seeking a Senior Google Cloud Data Engineer with a strong background in data engineering and a hands‑on approach to driving projects independently from start to finish. This role requires someone who thrives in a proactive environment and can deliver impactful solutions in cybersecurity analytics and anomaly detection.

The ideal candidate will have experience in developing dashboards and insights to detect and visualize security threats, including executive‑level dashboards. Comfort with Looker Core, BigQuery, SQL, and Python is essential, along with familiarity with Machine Learning for anomaly detection, alerting mechanisms, and data manipulation.

This position is onsite in New York City (Brooklyn area) 3 days per week, and remote 2 days per week.

Role responsibilities
  • Drive projects independently, ensuring timely delivery and high‑quality outcomes.
  • Lead and execute projects related to cybersecurity, including anomaly detection in data to identify potential security threats.
  • Develop dashboards and visuals for operational and executive use, leveraging Looker Core and BigQuery.
  • Implement alerting systems to proactively detect anomalies and security issues.
  • Design and implement data ingestion and transformation pipelines leveraging Google Cloud Dataflow / Apache Beam and BigQuery for analytics and large‑scale querying (a minimal pipeline sketch follows this list).
  • Implement Machine Learning models (specifically leveraging BigQuery ML or Vertex AI) to identify statistical outliers, potential intrusions, and fraudulent patterns in network traffic.
  • Build fault‑tolerant, self‑healing, adaptive, and highly accurate streaming and batch data pipelines on GCP.
  • Perform data manipulation and analysis to support threat detection and reporting.
  • Collaborate closely with cross‑functional teams to ensure data quality, availability, and reliability across cloud environments.
  • Develop and maintain technical documentation for all assigned systems and projects.
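
As a rough illustration of the streaming ingestion work described in the responsibilities above, here is a minimal sketch of an Apache Beam (Dataflow) pipeline in Python that reads events from Pub/Sub, windows them, and lands per‑source counts in BigQuery. The project, topic, table, and field names are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch of a streaming ingestion pipeline on GCP using Apache Beam
# (Python SDK). Project, topic, table, and field names are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows


def run():
    # streaming=True so the pipeline handles the unbounded Pub/Sub source;
    # in production this would run on the DataflowRunner.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Read raw security events from a hypothetical Pub/Sub topic.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/security-events")
            | "ParseJson" >> beam.Map(json.loads)
            # Count events per source IP over 1-minute fixed windows,
            # a simple signal that downstream anomaly detection can use.
            | "KeyBySourceIp" >> beam.Map(lambda event: (event["source_ip"], 1))
            | "Window" >> beam.WindowInto(FixedWindows(60))
            | "CountPerIp" >> beam.CombinePerKey(sum)
            | "ToRow" >> beam.Map(lambda kv: {"source_ip": kv[0], "event_count": kv[1]})
            # Land the windowed aggregates in a hypothetical BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:security.traffic_counts",
                schema="source_ip:STRING,event_count:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```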
Must have qualifications

To be considered for this role, you must meet the following essential qualifications:

  • Candidates must be legally authorized to work in the United States and must hold either U.S. citizenship or lawful permanent resident (Green Card) status.
  • Strong expertise in:
    • Python – for building data pipelines, processing tasks, and automation
    • SQL – Advanced SQL capabilities (nested fields, analytic functions).
  • Hands‑on experience with Core GCP Data Stack:
    • BigQuery:
      Expert‑level SQL, performance tuning, and specifically BigQuery ML (BQML) for implementing logistic regression or k‑means clustering on data in place (a hedged BQML sketch follows this list).
    • Dataflow (Apache Beam):
      Strong proficiency in writing data pipelines (Python or Java) handling windowing, watermarks, and triggers for streaming data.
    • Pub/Sub:
      Experience with event‑driven architecture and message queuing.
  • Hands‑on experience with Visualization & BI:
    • Looker Core – Advanced proficiency in LookML (not just drag‑and‑drop), including derived tables, explores, and Liquid syntax.
    • Semantic Modeling:
      Develop robust LookML models to create a trusted layer of data governance, ensuring metrics are consistent across the organization.
    • Dashboard Creation:
      Design intuitive, high‑impact dashboards in Looker for two distinct audiences:
      Operational teams (real‑time threat monitoring) and Executives (risk posture and compliance reporting).
  • Familiarity with Alerting Frameworks
  • Ability to work independently and drive projects solo with minimal…
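
For a flavor of the BigQuery ML requirement above (see the BigQuery bullet), the sketch below uses the google-cloud-bigquery Python client to train a k‑means model in place and surface the rows furthest from their nearest centroid as candidate anomalies. The dataset, table, and column names are illustrative assumptions rather than actual project resources.

```python
# Hedged sketch of in-place anomaly detection with BigQuery ML k-means,
# driven from Python via the google-cloud-bigquery client.
# Dataset, table, and column names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client()

# Train (or replace) a k-means model directly on data already in BigQuery.
client.query("""
    CREATE OR REPLACE MODEL `security.traffic_kmeans`
    OPTIONS (model_type = 'kmeans', num_clusters = 5) AS
    SELECT bytes_sent, request_count, distinct_ports
    FROM `security.network_traffic_features`
""").result()

# Score the same features and keep the rows furthest from their nearest
# centroid, a simple proxy for "anomalous" traffic worth alerting on.
outliers = client.query("""
    SELECT *
    FROM ML.PREDICT(
        MODEL `security.traffic_kmeans`,
        (SELECT * FROM `security.network_traffic_features`))
    ORDER BY NEAREST_CENTROIDS_DISTANCE[OFFSET(0)].DISTANCE DESC
    LIMIT 100
""").result()

for row in outliers:
    print(dict(row))
```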
Position Requirements
10+ years of work experience