
AWS Cloud Engineer

Job in Seattle, King County, Washington, 98127, USA
Listing for: Tata Consultancy Services
Full Time position
Listed on 2026-02-28
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range: $100,000 - $120,000 USD per year

Overview

Job Description:
Work with AWS data services (S3, Glue, Redshift, Athena, Lambda, Step Functions, Kinesis, etc.) alongside Databricks Unity Catalog, PySpark, AWS Glue, Lambda, Step Functions, and Apache Airflow.

Responsibilities
  • Data Pipeline Development:
    Design, develop, and optimize ETL/ELT pipelines using AWS & Databricks services such as Unity Catalog, PySpark, AWS Glue, Lambda, Step Functions, and Apache Airflow.
  • Data Integration:
    Integrate data from various sources, including relational databases, APIs, and streaming data, ensuring high data quality and consistency.
  • Cloud Infrastructure Management:
    Build and manage scalable, secure, and cost-efficient data infrastructure using AWS services like S3, Redshift, Athena, and RDS.
  • Data Modeling:
    Create and maintain data models to support analytics and reporting requirements, ensuring efficient querying and storage.
  • Performance Optimization:
    Monitor and optimize the performance of data pipelines, databases, and queries to meet SLAs and reduce costs.
  • Collaboration:
    Work closely with data scientists, analysts, and software engineers to understand data needs and deliver solutions that enable business insights.
  • Security and Compliance:
    Implement best practices for data security, encryption, and compliance with regulations such as GDPR, CCPA, or ITAR.
  • Automation:
    Automate repetitive tasks and processes using scripting (Python, Bash) and Infrastructure as Code (e.g., Terraform, AWS CloudFormation).
  • Agile Development:
    Build and optimize CI/CD pipelines to enable rapid and reliable software releases using GitLab in an Agile environment.
  • Monitoring and Troubleshooting:
    Set up monitoring and alerting for data pipelines and infrastructure, and troubleshoot issues to ensure high availability.
Requirements
  • Programming skills in Python, Scala, or PySpark for data processing and automation.
  • Expertise in SQL and experience with relational and NoSQL databases (e.g., RDS, DynamoDB).
  • Education:
    Bachelor's degree in Computer Science.
  • Base Salary Range: $100,000 - $120,000 per annum.
Benefits
  • Discretionary Annual Incentive.
  • Comprehensive Medical Coverage:
    Medical & Health, Dental & Vision, Disability Planning & Insurance, Pet Insurance Plans.
  • Family Support:
    Maternal & Parental Leaves.
  • Insurance Options:
    Auto & Home Insurance, Identity Theft Protection.
  • Convenience & Professional Growth:
    Commuter Benefits & Certification & Training Reimbursement.
  • Time Off:
    Vacation, Time Off, Sick Leave & Holidays.
  • Legal & Financial Assistance:
    Legal Assistance, 401K Plan, Performance Bonus, College Fund, Student Loan Refinancing.
  • U.S. citizenship or U.S. permanent residency (Green Card) required to comply with applicable laws.