The Alexa Daily Essentials team delivers experiences critical to how customers interact with Alexa as part of daily life. Alexa users engage with our products across Timers, Alarms, Calendars, Food, and News. Our experiences include critical time-saving tools, ad-supported news audio and video, and in-depth kitchen guidance aimed at serving the needs of the family from sunup to sundown.
Key job responsibilities
Data Infrastructure
- Architect and develop robust data pipelines that ingest and transform data for business intelligence analytics
- Maintain data pipelines using languages such as Python and SQL, frameworks such as Spark, and AWS services such as S3, Glue, Lambda, SNS, SQS, and KMS
- Design and implement scalable data infrastructure supporting Redshift clusters and visualization dashboards that serve product stakeholders
- Build self‑serve data platforms to enable scientists and business stakeholders to answer business questions with data. Implement and support reporting and analytics infrastructure for internal customers
- Ensure compliance with data governance. Address data access restrictions while maintaining adherence to data protection policies and security standards
- Implement data quality monitoring mechanisms with alerting for pipeline failures or anomalies. Ensure data and reporting consistency across the organization
- Excel at communicating complex ideas to technical and non‑technical audiences
- Build relationships with product and engineering stakeholders and counterparts
- Work with stakeholders to gather requirements, and translate business needs into data engineering solutions
- Work cross‑functionally across product and engineering teams to drive adoption of data assets
- Drive automation and operational excellence in data infrastructure. Reduce manual intervention and improve system reliability
- Scale existing solutions and create new ones as team and stakeholder needs evolve
Basic qualifications
- 5+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
- Experience mentoring team members on best practices
- Experience building data products incrementally and integrating and managing data sets from multiple sources
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience operating large data warehouses
- Experience communicating with users, other technical teams, and management to collect requirements, describe data modeling decisions and data engineering strategy
Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please contact your Recruiting Partner for more information.
The base salary range for this position is listed below. As a total compensation company, Amazon's package may include other elements such as sign‑on payments and restricted stock units (RSUs). Final compensation will be determined based on factors including experience, qualifications, and location. Amazon offers comprehensive benefits including health insurance (medical, dental, vision, prescription, basic life & AD&D insurance), Registered Retirement Savings Plan (RRSP), Deferred Profit Sharing Plan (DPSP), paid time off, and other resources to improve health and well‑being.
We thank all applicants for their interest; however, only those interviewed will be advised as to hiring status.
CAN, BC, Vancouver - - CAD annually