Senior Data Engineer
Listed on 2026-03-01
IT/Tech
Data Engineer, Data Security, Data Science Manager
About Avaya
Avaya is an enterprise software leader that helps the world’s largest organizations and government agencies forge unbreakable connections.
The Avaya Infinity™ platform unifies fragmented customer experiences, connecting the channels, insights, technologies, and workflows that together create enduring customer and employee relationships.
We believe success is built through strong connections – with each other, with our work, and with our mission. At Avaya, you'll find a community that values your contributions and supports your growth every step of the way.
Overview
You’ll build and scale the real-time and batch data platform that powers a large enterprise contact center solution. Our products demand ultra-low-latency decisioning for live interactions and cost-efficient big-data analytics for historical insights. We’re primarily on Azure today and expanding to GCP and AWS. Data is the backbone for our AI features and product intelligence.
Primary charter: complex contact center analytics and operational intelligence, delivered as an AI-enabled enterprise contact center analytics platform. Our vision is a flexible, AI-enabled data platform that unifies contact center KPIs, customer and business outcomes, and AI quality/performance metrics, and that applies AI pervasively so users can easily combine rich contact center data with business data and AI performance monitoring to drive decisions end-to-end.
Responsibilities
- Design, build, and operate low-latency streaming pipelines (Kafka, Spark Structured Streaming) and robust batch ETL/ELT on the Databricks Lakehouse (see the streaming sketch after this list).
- Establish reliable orchestration and dependency management (Airflow), with strong SLAs and on-call readiness for business-critical data flows (see the DAG sketch after this list).
- Model, optimize, and document curated datasets and interfaces that serve analytics, product features, and AI workloads.
- Implement data quality checks, observability, and backfills; drive root-cause analysis and incident prevention (see the quality-check sketch after this list).
- Partner with application teams (Go/Java), analytics, and ML/AI to ship data products into production.
- Build and maintain datasets and services that power RAG pipelines and agentic AI workflows (tool-use/function calling).
- When Spark/Databricks isn’t optimal, design and operate custom processors/services in Go to meet strict latency or specialized transformation requirements.
- Instrument prompt/response and token usage telemetry to support LLMOps evaluation and cost optimization; provide datasets for labeling and golden sets.
- Improve performance and cost (storage/compute), review code, and raise engineering standards.
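A minimal sketch of the kind of streaming job this role owns: read interaction events from Kafka, parse them, and append to a Delta table on Databricks. The broker, topic, schema, and paths are illustrative assumptions, not details of our actual stack.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("interactions-stream").getOrCreate()

# Hypothetical event schema for contact center interaction events.
schema = StructType([
    StructField("interaction_id", StringType()),
    StructField("channel", StringType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # illustrative broker
    .option("subscribe", "interaction-events")         # illustrative topic
    .load()
    # Kafka values arrive as bytes; parse the JSON payload into columns.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Append parsed events to a Delta table, with checkpointing so the
# stream recovers exactly where it left off after a restart.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/chk/interaction_events")
    .outputMode("append")
    .start("/tables/interaction_events")
)
```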
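For the orchestration side, a minimal Airflow 2-style DAG with retries and a task SLA, so late or failed runs page on-call. The DAG, task, and schedule names are placeholder assumptions.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_batch_etl():
    # Placeholder for triggering the actual Databricks/Spark batch job.
    print("running batch ETL")


with DAG(
    dag_id="curated_kpis_daily",  # hypothetical DAG name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        # Alert if the task has not finished within the SLA window.
        "sla": timedelta(hours=2),
    },
) as dag:
    etl = PythonOperator(task_id="build_curated_kpis", python_callable=run_batch_etl)
```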
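And for the data quality bullet, a small fail-fast sketch: block downstream consumers when a null rate breaches a threshold. The table name and 1% threshold are placeholder assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-check").getOrCreate()

# Hypothetical curated table; real checks would be configured per dataset.
df = spark.table("curated.interactions")

total = df.count()
null_ids = df.filter(F.col("interaction_id").isNull()).count()
null_rate = null_ids / max(total, 1)

# Raising here fails the orchestrator task, which alerts on-call and
# prevents bad data from flowing downstream.
if null_rate > 0.01:
    raise ValueError(f"interaction_id null rate {null_rate:.2%} exceeds 1% threshold")
```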
Security &amp; Compliance
- Design data solutions aligned to enterprise security, privacy, and compliance requirements (e.g., SOC 2, ISO 27001, GDPR/CCPA as applicable), partnering with Security/Legal.
- Implement RBAC/ABAC and least-privilege access; manage service principals, secrets, and key rotation; enforce encryption in transit and at rest.
- Govern sensitive data: classification, PII handling, masking/tokenization, retention/archival, lineage, and audit logging across pipelines and storage (see the masking sketch after this list).
- Build observability for data security and quality; support incident response, access reviews, and audit readiness.
- Embed controls in CI/CD (policy checks, dependency vulnerability scanning) and ensure infra‑as‑code adheres to guardrails.
- Partner with security engineering on penetration tests, threat modeling, and red‑team exercises; remediate findings and document controls.
- Contribute to compliance audits (e.g., SOC 2/ISO 27001) with evidence collection and continuous control monitoring; support DPIAs/PIAs where required.
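To make the PII masking bullet concrete, a short sketch of column-level masking in PySpark over a hypothetical contacts table. Deterministic hashing keeps a column usable as a join key without exposing the raw value.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii-masking").getOrCreate()

# Hypothetical curated dataset containing PII columns.
contacts = spark.table("curated.contacts")

masked = (
    contacts
    # Deterministic hash preserves joinability while removing the raw email.
    .withColumn("email_hash", F.sha2(F.col("email"), 256))
    # Partial masking: keep only the last four digits for support workflows.
    .withColumn("phone_masked", F.concat(F.lit("***-***-"), F.col("phone").substr(-4, 4)))
    .drop("email", "phone")
)

masked.write.mode("overwrite").saveAsTable("curated.contacts_masked")
```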
Qualifications
- 6+ years building production-grade data pipelines at scale (streaming and batch).
- Deep proficiency in Python and SQL; strong Spark experience on Databricks (or similar).
- Advanced SQL: window functions, CTEs, partitioning/z-ordering, query planning and tuning in lakehouse environments (see the query sketch after this list).
- Hands‑on with Kafka (or equivalent) and an orchestrator (Airflow preferred).
- Strong data modeling skills and performance tuning for low latency and high throughput.
- Production mindset: SLAs, monitoring, alerting, CI/CD, and on‑call participation.
- Proficient using AI…
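As a flavor of the advanced SQL expected here, a CTE-plus-window-function query that picks the latest interaction per customer, run through spark.sql to match the lakehouse context. The table and column names are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-example").getOrCreate()

# CTE + window function: latest interaction per customer, the kind of
# query this role writes and tunes routinely.
latest = spark.sql("""
    WITH ranked AS (
        SELECT
            customer_id,
            interaction_id,
            event_time,
            ROW_NUMBER() OVER (
                PARTITION BY customer_id
                ORDER BY event_time DESC
            ) AS rn
        FROM curated.interactions
    )
    SELECT customer_id, interaction_id, event_time
    FROM ranked
    WHERE rn = 1
""")
latest.show()
```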