Data Generalist / Canada, Part Time, Remote
Remote / Online - Candidates ideally in New York, New York County, New York, 10261, USA
Listed on 2026-02-28
Listing for: Braintrust
Part Time, Remote/Work from Home
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing
Job Description & How to Apply Below
Location: New York
We are seeking a senior, part‑time Data Generalist to support analytics, data engineering, and MarTech initiatives over a four‑month contract (March–June). The engagement may extend beyond that, but four months is the current expectation. This role is ideal for a highly autonomous engineer who can operate with minimal hand‑holding in a fast‑moving environment.
The role is remote, based in North America, averaging ~15 hours per week, with workload that may be uneven or “spiky” depending on project needs. Strong communication and comfort in client‑facing discussions are essential.
Responsibilities:
- Design, build, and optimize data pipelines and analytics solutions on Google Cloud Platform (GCP).
- Develop and optimize BigQuery datasets using partitioning, clustering, and query optimization.
- Build and maintain ETL/ELT pipelines using Python, SQL, and tools such as Fivetran and dbt.
- Develop and support custom APIs using Cloud Functions or Cloud Run.
- Implement event‑driven architectures using Pub/Sub and workflow orchestration via Cloud Composer (Airflow) or Cloud Workflows.
- Apply IAM and security best practices, including handling PHI/PII in HIPAA‑compliant environments.
- Support marketing data use cases including attribution, Google Ads, Meta Ads, and API‑based integrations.
- Partner closely with the MarTech Architect to translate business and marketing requirements into scalable technical solutions.
- Communicate clearly with internal teams and clients, explaining technical concepts and tradeoffs.
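To give a concrete flavor of the partitioning and clustering work described above, here is a minimal Python sketch that assembles DDL for a date‑partitioned, clustered BigQuery table. All dataset, table, and column names are hypothetical; this is an illustration of the technique, not part of the role's actual codebase.

```python
def build_events_ddl(dataset: str, table: str) -> str:
    """Assemble DDL for a date-partitioned, clustered BigQuery table.

    Partitioning by event_date prunes scans to only the dates a query
    touches; clustering by user_id and channel co-locates related rows
    within each partition. All identifiers here are illustrative.
    """
    return (
        f"CREATE TABLE IF NOT EXISTS `{dataset}.{table}` (\n"
        "  event_date DATE NOT NULL,\n"
        "  user_id STRING,\n"
        "  channel STRING,\n"
        "  revenue NUMERIC\n"
        ")\n"
        "PARTITION BY event_date\n"
        "CLUSTER BY user_id, channel"
    )

print(build_events_ddl("marketing", "ad_events"))
```

In practice a statement like this would be run once via the BigQuery console or client library; queries that filter on `event_date` then scan only the relevant partitions.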
Requirements:
- Strong expertise in Google Cloud Platform, including:
  - BigQuery (partitioning, clustering, optimization)
  - Cloud Functions and/or Cloud Run
  - Pub/Sub
  - IAM and security best practices
  - Cloud Composer (Airflow) or Cloud Workflows
- Solid data engineering background, including:
  - Python (pandas, requests, google‑cloud libraries)
  - Advanced SQL (complex queries, window functions, performance tuning)
  - ETL/ELT pipeline design
  - Data modeling (star and snowflake schemas)
  - Familiarity with dbt
  - Experience with data clean rooms
- Experience in healthcare and/or marketing technology, including:
  - HIPAA compliance requirements
  - PHI/PII handling
  - Marketing attribution concepts
  - API integrations (REST, OAuth)
- Hands‑on experience with:
  - Fivetran
  - Git version control
  - Terraform (infrastructure as code)
  - Monitoring tools (Datadog, Cloud Monitoring)
  - Analytics platforms such as Tableau or Looker
- Experience with LiveRamp or IQVIA
- Prior healthcare data background
- Backend development experience using JavaScript/TypeScript
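As a hedged illustration of the "marketing attribution concepts" requirement above, here is a minimal last‑touch attribution sketch in pure Python. The journey data and channel names are invented for illustration; real attribution work would run over warehouse tables rather than in‑memory lists.

```python
from collections import Counter

def last_touch_attribution(journeys):
    """Credit each conversion to the final touchpoint in the journey.

    `journeys` is a list of ordered channel lists, one per converting
    user, e.g. [["google_ads", "meta_ads"], ["email"]]. Returns a
    Counter mapping each channel to the conversions credited to it.
    """
    credit = Counter()
    for touchpoints in journeys:
        if touchpoints:  # skip empty journeys defensively
            credit[touchpoints[-1]] += 1
    return credit

journeys = [
    ["google_ads", "meta_ads"],   # converted after a Meta ad
    ["email", "google_ads"],      # converted after a Google ad
    ["meta_ads"],
]
print(last_touch_attribution(journeys))
# Counter({'meta_ads': 2, 'google_ads': 1})
```

Other models (first‑touch, linear, time‑decay) differ only in how credit is split across the journey; last‑touch is shown here because it is the simplest to express.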