AI Data Engineer

Job in Visakhapatnam 530001, Andhra Pradesh, India
Listing for: Quantum Gandiva AI
Full-time position
Listed on 2026-02-05
Job specializations:
  • IT/Tech
    Data Engineer, Data Science Manager
Job Description & How to Apply
AI Data Engineer (Agentic AI)

• Location: Visakhapatnam
• Team: Data/Platform
• Type: Full-time
• Company: Quantum Gandiva AI
• Salary: Negotiable

Why this role

Help build the data backbone that powers our multi-agent (agentic) AI systems, from real-time ingestion and feature pipelines to evaluation datasets, lineage, and cost-aware execution across clouds.

What you’ll do

• Design and run ETL/ELT pipelines for training, inference, and evaluation workflows (incremental loads, CDC, backfills).

• Build batch + streaming layers for telemetry, RAG corpora, events, and analytics.

• Orchestrate jobs with observability, lineage, data quality checks, and rollbacks (see the sketch after this list).

• Own cost-efficient platforms on AWS (primary) and support cross-cloud analytics on GCP BigQuery when needed.

• Partner with Agents/Research to shape datasets, evaluation harnesses, and prompt/versioned artifacts.

• Ship with security-by-design (IAM, least privilege, network boundaries, secrets hygiene).
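
For context, here is a minimal, hypothetical sketch of the orchestration pattern described above: an incremental (watermark-based) extract, a warehouse load, and a data quality gate wired together as an Airflow 2.x DAG. The DAG name, schedule, task bodies, and threshold are illustrative assumptions, not part of this posting.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_incremental(**context):
    # Pull only rows changed since the start of this run's logical window
    # (a simple watermark stand-in for CDC / incremental loads).
    watermark = context["data_interval_start"]
    print(f"extracting rows changed since {watermark}")


def load_to_warehouse(**context):
    # Upsert the staged batch into the target table (e.g. COPY/MERGE into a warehouse).
    print("loading staged batch")


def quality_gate(**context):
    # Fail the run if a basic check doesn't pass, blocking downstream consumers;
    # Airflow's retries and alerting then take over.
    row_count = 1  # placeholder: would come from a COUNT(*) on the loaded batch
    if row_count == 0:
        raise ValueError("quality gate failed: empty batch")


with DAG(
    dag_id="example_incremental_pipeline",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@hourly",                     # Airflow 2.4+ keyword
    catchup=False,                          # set True to backfill past intervals
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_incremental", python_callable=extract_incremental)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    check = PythonOperator(task_id="quality_gate", python_callable=quality_gate)

    extract >> load >> check

In practice the quality gate would more likely be backed by the Great Expectations or dbt tests mentioned under "Nice to have" than by a hand-rolled check.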

Must-have skills

Data & Compute

• ETL/ELT frameworks, PySpark/Spark, advanced SQL, PostgreSQL

• Airflow (orchestration) and/or dbt Cloud (transform + tests)

Cloud & Infra (Primary: AWS)

• S3, Glue, Lambda, EKS, EC2, Redshift, VPC/VPN

• Cost management (right-sizing, storage classes, quotas/limits)

Platforms & Ecosystem

• Kafka (streaming), Redis (caching), familiarity with Hadoop

Analytics & BI

• Power BI or Tableau (data modeling, semantic layers, dashboards)

Programming & Ops

• Python (packaging, testing, typing), CLI tooling, Git; CI/CD basics

Nice to have

• IaC (Terraform/CloudFormation), Great Expectations / dbt tests

• Lakehouse patterns (Delta/Iceberg), feature stores, vector DBs, S3 lifecycle for RAG

• Experience with agent telemetry, eval datasets, and cost/perf trade-offs

Qualifications

• 2-3+ years in data engineering (or a portfolio showing production pipelines)

• Proven ownership of pipelines with SLA delivery, ideally on AWS

How we work

• On-site in Vizag, fast iterations, tight feedback loops with Agents/Research

• Pragmatic engineering: measure → optimize → automate

Compensation & apply

• Competitive package based on experience and impact.

• Apply on LinkedIn with your resume/GitHub/portfolio.

QG AI is an equal-opportunity workplace. We welcome applicants from all backgrounds.