Sr Data Engineer

Job in Glendale, Los Angeles County, California, 91222, USA
Listing for: CCG Business Solutions, LLC
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range: USD 70 – 90 per hour

CCG Talent Management is a business solutions company. We provide business consulting and talent placement services. Our team understands the principles of connecting purpose to business and career placement. A client of CCG is currently seeking a Sr Data Engineer.

Job Description

Job Title: Sr Data Engineer

Location: Glendale, CA – Hybrid Onsite Schedule
The Company

Headquartered in Los Angeles, this leader in the Entertainment & Media space is focused on delivering world-class stories and experiences to its global audience. To offer the best entertainment experiences, its technology teams focus on continued innovation and the use of cutting-edge technology.

Platform / Stack
You'll work with technologies that include Python, AWS, Airflow and Snowflake.
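As a rough illustration of the day-to-day pipeline work described here (not drawn from the client's actual codebase), a minimal Airflow DAG might look like the sketch below. The DAG id, S3 path, and Snowflake table name are hypothetical placeholders.

    # Illustrative sketch only: a minimal Airflow DAG with an extract step and a
    # load step. All names (DAG id, S3 path, Snowflake table) are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Placeholder: pull raw event data from an S3 bucket (hypothetical path).
        print("extracting s3://example-bucket/raw/events/")


    def load(**context):
        # Placeholder: copy the extracted data into a Snowflake table (hypothetical name).
        print("loading into ANALYTICS.EVENTS")


    with DAG(
        dag_id="example_core_data_pipeline",  # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Run the extract step before the load step.
        extract_task >> load_task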

What You'll Do As a Sr Data Engineer:

  • Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
  • Build tools and services to support data discovery, lineage, governance, and privacy
  • Collaborate with other software/data engineers and cross-functional teams
  • Work on a Tech stack that includes Airflow, Spark, Databricks, Delta Lake, and Snowflake
  • Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
  • Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
  • Engage with our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements
  • Maintain detailed documentation of your work and changes to support data quality and data governance requirements
Qualifications
  • 5+ years of data engineering experience developing large data pipelines
  • Proficiency in at least one major programming language (e.g. Python, Java, Scala)
  • Strong SQL skills and ability to create queries to analyze complex datasets
  • Hands-on production experience with distributed processing systems such as Spark (see the illustrative sketch after this list)
  • Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
  • Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, BigQuery)
  • Experience in developing APIs with GraphQL
  • Deep understanding of AWS or other cloud providers, as well as infrastructure as code
  • Familiarity with Data Modeling techniques and Data Warehousing standard methodologies and practices
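As a rough illustration of the Spark and SQL-style skills listed above (not part of the role's requirements themselves), a minimal PySpark aggregation might look like this sketch. The input path, column names, and output path are hypothetical placeholders.

    # Illustrative sketch only: a small PySpark job that aggregates raw events
    # into a daily rollup. Paths and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example_daily_rollup").getOrCreate()

    # Read raw event data (hypothetical location).
    events = spark.read.parquet("s3://example-bucket/raw/events/")

    # Count plays per title per day.
    daily_rollup = (
        events
        .groupBy("title_id", F.to_date("event_ts").alias("event_date"))
        .agg(F.count("*").alias("play_count"))
    )

    # Write the rollup for downstream consumers (hypothetical path).
    daily_rollup.write.mode("overwrite").parquet("s3://example-bucket/rollups/daily_plays/")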
Additional Information

Base Salary: USD $70 – 90/hr (W2)

We offer a highly competitive compensation package and outstanding benefits.

All your information will be kept confidential according to EEO guidelines.
