Every day, we help clients engage with new technology paradigms, creatively building solutions that solve their most pressing business challenges and move them to the forefront of their industry.
Job Title : GCP Data Engineer
Key Skills : PySpark , Airflow , SQL , GCP
Job Location : Gurugram
Experience : 5+ years
Education Qualification : Any Degree Graduation
Work Mode : Hybrid
Employment Type : Contract
Notice Period : Immediate - 10 Days
Job description:
Job Summary
We are seeking a seasoned GCP Data Analytics professional with extensive experience in Big Data technologies and Google Cloud Platform services to design and implement scalable data solutions.
Job Description
- Design, develop, and optimize data pipelines using GCP BigQuery, Dataflow, and Apache Airflow to support large-scale data analytics (a minimal pipeline sketch follows below).
- Utilize the Big Data Hadoop ecosystem to manage and process vast datasets efficiently.
- Collaborate with cross-functional teams to gather requirements and deliver reliable data solutions.
- Ensure data quality, consistency, and integrity across multiple data sources.
- Monitor and troubleshoot data workflows to maintain high system availability and performance.
- Stay updated with emerging trends and best practices in GCP data analytics and big data technologies.
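For illustration only, a minimal sketch of the kind of Airflow-orchestrated BigQuery pipeline this role involves, assuming Airflow 2.x with the apache-airflow-providers-google package installed; the project, dataset, and table names are hypothetical placeholders, not taken from this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical daily pipeline: aggregate raw events into a reporting table in BigQuery.
# All project, dataset, and table names below are placeholders.
AGGREGATION_SQL = """
    SELECT event_date, COUNT(*) AS event_count
    FROM `my-project.raw_dataset.events`
    WHERE event_date = '{{ ds }}'
    GROUP BY event_date
"""

with DAG(
    dag_id="daily_event_aggregation",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # one run per day; Airflow fills {{ ds }} with the run date
    catchup=False,
) as dag:
    aggregate_events = BigQueryInsertJobOperator(
        task_id="aggregate_events",
        configuration={
            "query": {
                "query": AGGREGATION_SQL,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "reporting_dataset",
                    "tableId": "daily_event_counts",
                },
                # Append each day's aggregate row to the reporting table.
                "writeDisposition": "WRITE_APPEND",
            }
        },
    )
```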
Roles and Responsibilities
- Implement and manage ETL processes leveraging GCP services such as BigQuery, Dataflow, and Airflow.
- Develop scalable, maintainable, and reusable data pipelines to support business intelligence and analytics needs.
- Optimize SQL queries and data models for performance and cost efficiency in BigQuery (an illustrative cost-check sketch follows below).
- Integrate Hadoop ecosystem components with GCP services to enhance data processing capabilities.
- Automate workflow orchestration using Apache Airflow for seamless data operations.
- Collaborate with data engineers, analysts, and stakeholders to ensure alignment of data solutions with organizational goals.
- Participate in code reviews, testing, and deployment activities, adhering to best practices.
- Mentor junior team members and contribute to continuous improvement initiatives within the data engineering team.
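Again purely illustrative, a small sketch of one way BigQuery query cost can be checked before a job runs, assuming the google-cloud-bigquery client library and a hypothetical date-partitioned table; every project, dataset, table, and column name here is a placeholder.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical query against a date-partitioned table; the partition filter keeps
# BigQuery from scanning the whole table, which is the main cost/performance lever.
sql = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM `my-project.sales_dataset.orders`
    WHERE order_date BETWEEN '2024-01-01' AND '2024-01-31'
    GROUP BY customer_id
"""

# Dry run: BigQuery validates the query and reports the bytes it would scan,
# without actually executing it or incurring query cost.
dry_run_job = client.query(
    sql,
    job_config=bigquery.QueryJobConfig(dry_run=True, use_query_cache=False),
)
print(f"Estimated bytes processed: {dry_run_job.total_bytes_processed:,}")

# If the estimate looks reasonable, run the query for real.
for row in client.query(sql).result():
    print(row.customer_id, row.total_amount)
```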
Skills
Mandatory Skills : GCP Cloud Storage, GCP BigQuery, GCP Dataproc, GCP Cloud Composer, GCP DMS, Apache Airflow, Java, Python, Scala, GCP Datastream, Google Analytics Hub, GCP Workflows, GCP Dataform, GCP Data Fusion, GCP Pub/Sub, ANSI-SQL, GCP Dataflow, Big Data Hadoop Ecosystem