Role: Data Ops Lead
Experience: 8-12 Years
Location: Bengaluru - Whitefield
Mode: Hybrid
Immediate joiners preferred.
About the role:
As a DataOps Lead, you will be responsible for designing and managing highly scalable, highly available data pipeline solutions that provide the foundation for collecting, storing, modelling, and analysing massive data sets from multiple channels.
Responsibilities:
Align Sigmoid with key client initiatives
Interface daily with customers across leading Fortune 500 companies to understand strategic requirements
Connect with VP- and Director-level clients on a regular basis.
Travel to client locations
Ability to understand business requirements and tie them to technology solutions
Strategically support technical initiatives
Design, manage & deploy highly scalable and fault-tolerant distributed components using Big Data technologies.
Ability to evaluate and choose technology stacks that best fit client data strategy and constraints
Drive automation and large-scale deployments
Ability to drive good engineering practices from the bottom up
Develop industry-leading CI/CD, monitoring, and support practices inside the team
Develop scripts to automate DevOps processes and reduce team effort
Work with the team to develop automation and resolve issues
Support TB-scale data pipelines
Perform root cause analysis for production errors
Support developers in day-to-day DevOps operations
Bring excellent experience in application support, integration development, and data management.
Design the roster and escalation matrix for the team
Provide technical leadership and manage the team on a day-to-day basis
Guide DevOps engineers in day-to-day design, automation & support tasks
Play a key role in hiring technical talent to build the future of Sigmoid.
Conduct training on the technology stack for developers in house and outside
Culture:
Must be a strategic thinker with the ability to think unconventionally / out of the box.
Analytical and data-driven orientation.
Raw intellect, talent and energy are critical.
Entrepreneurial and agile: understands the demands of a private, high-growth company.
Ability to be both a leader and a hands-on 'doer'.
Qualifications:
8-12 years' track record of relevant work experience and a degree in Computer Science or a related technical discipline are required
Proven track record of building and shipping large-scale engineering products and/or knowledge of cloud infrastructure such as Azure/AWS is preferred
Experience in Python/Java programming is a must.
Experience in managing Linux systems and build-and-release tools such as Jenkins
Effective communication skills (both written and verbal)
Ability to collaborate with a diverse set of engineers, data scientists and product managers
Comfort in a fast-paced start-up environment.
Support experience in the Big Data domain
Architecting, implementing, and maintaining Big Data solutions
Experience with the Hadoop ecosystem (HDFS, MapReduce, Oozie, Hive, Impala, Spark, Kerberos, Kafka, etc.)
Experience with container technologies such as Docker and Kubernetes, and with configuration management systems