Cloud Engineer - Abinitio
Job in 500016, Prakāshamnagar, Telangana, India
Listing for: Confidential
Full Time position, listed on 2026-02-04
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Big Data
Job Description & How to Apply Below
Senior Data Engineer - Abinitio, GCP
Location: Hyderabad / Bangalore / Chennai
Experience: 3-8 years
Employment Type: Full-Time
About Us
Onix is a trusted cloud consulting company and leading Google Cloud partner that helps companies get the most out of their technology with cloud-powered solutions, best-in-class services, and the Datametica Birds suite of data migration products that unleash AI potential.
We are able to deliver exceptional results for our customers because of our 20+ year partnership with Google Cloud, depth of technology expertise, and IP-driven data and AI solutions.
We offer solutions across a wide range of use cases and industries, tailored to the unique needs of each customer. From advanced cloud security solutions to innovative AI capabilities and data migration products, we have you covered. Our global team of experts is the most reliable, talented, and knowledgeable in the industry.
Key Responsibilities:
- ETL Development: Design and develop robust ETL pipelines using Ab Initio for efficient data ingestion, transformation, and processing.
- Big Data & Cloud Technologies: Implement Data Warehousing and Data Lake solutions on GCP to handle large-scale structured and unstructured datasets.
- Cloud Data Streaming: Design and develop real-time streaming solutions using GCP Pub/Sub, Dataflow, and BigQuery for data ingestion and analytics (see the sketch after this list).
- Python Development: Write efficient, scalable, and reusable Python scripts for automation, data transformation, and integration with cloud services.
- Performance Optimization: Optimize queries, ETL workflows, and data processing pipelines to enhance efficiency and reduce latency.
- Collaboration & Best Practices: Work closely with data scientists, analysts, and cloud engineers to ensure seamless data flow and availability.
- Security & Compliance: Ensure data governance, security, and IAM policies are enforced across all data processing solutions.
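For context on the streaming responsibility above, here is a minimal sketch of a Pub/Sub-to-BigQuery pipeline written with the Apache Beam Python SDK (the SDK that Dataflow runs). The project, region, subscription, bucket, and table names are placeholders, not details from this listing, and the parsed messages are assumed to already match the target table's schema.

```python
# Minimal streaming sketch: Pub/Sub -> JSON parse -> BigQuery, run on Dataflow.
# All resource names below are illustrative placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        streaming=True,                      # unbounded (streaming) pipeline
        runner="DataflowRunner",             # use "DirectRunner" for local testing
        project="my-gcp-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-gcp-project/subscriptions/events-sub")
            # Pub/Sub delivers raw bytes; decode and parse each message as JSON.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-gcp-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

Swapping the runner to "DirectRunner" lets the same pipeline be exercised locally before deploying it to Dataflow.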
Required Skills & Qualifications:
- 6-8 years of experience in data engineering and cloud-based data solutions.
- Expertise in Ab Initio for ETL pipeline design, implementation, and optimization.
- Proficiency in Python for data processing, scripting, and automation.
- Hands-on experience with Data Warehouses and Data Lakes (BigQuery, Snowflake, or similar).
- Strong experience with GCP streaming technologies, including Pub/Sub, Dataflow, and BigQuery, for real-time data processing.
- Good understanding of cloud-based data architectures, including GCP components such as Cloud Storage, BigQuery, and Cloud Functions.
- Knowledge of batch and real-time data processing frameworks.
- Experience in SQL performance tuning and optimization for large-scale datasets (see the sketch after this list).
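As a sketch of the kind of SQL tuning the last item refers to, the snippet below uses the google-cloud-bigquery Python client to run a parameterized query against a hypothetical date-partitioned, clustered table; filtering on the partition column prunes partitions and limits the data scanned. Every project, dataset, table, and column name here is an assumption for illustration only.

```python
# Illustrative sketch: query a partitioned/clustered BigQuery table efficiently.
# Assumes a hypothetical table `analytics.events`, partitioned on event_date
# and clustered on customer_id.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project id

query = """
    SELECT customer_id, COUNT(*) AS event_count
    FROM `my-gcp-project.analytics.events`
    WHERE event_date BETWEEN @start_date AND @end_date  -- prunes partitions
      AND customer_id = @customer_id                     -- benefits from clustering
    GROUP BY customer_id
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2026-01-01"),
        bigquery.ScalarQueryParameter("end_date", "DATE", "2026-01-31"),
        bigquery.ScalarQueryParameter("customer_id", "STRING", "C-1001"),
    ]
)

for row in client.query(query, job_config=job_config).result():
    print(row.customer_id, row.event_count)
```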