DevOps Specialist
Location:
Nagpur, India / Remote
Job Summary
We are seeking a skilled DevOps Specialist to join our team and drive the development, deployment, and optimization of cloud-native applications on AWS.
Key Responsibilities
- Design and implement scalable, secure, and resilient cloud-native applications using Google Cloud/AWS.
- Design and manage Data Lake environments for large-scale data ingestion, processing, and analytics.
- Design and implement CI/CD pipelines using DevOps tooling such as GitHub Actions
- Develop and deploy cloud applications using AWS/Google Cloud services
- Automate infrastructure provisioning with tools like Terraform, ARM templates, or Bicep
- Monitor and optimize cloud environments using AWS Monitor, Application Insights, and Log Analytics
- Collaborate with development and operations teams to streamline release cycles and improve system reliability
- Troubleshoot and resolve issues in cloud infrastructure and application deployments
Required Skills & Qualifications
- Strong experience with Google Cloud/AWS services and cloud architecture
- Proficiency in DevOps tools: AWS DevOps services, Git, Docker, Kubernetes, and Jenkins
- Knowledge of infrastructure as code (IaC) and automation scripting (PowerShell, Bash, Python)
- Experience designing and maintaining robust ETL pipelines for ingesting and transforming large-scale, real-time datasets from APIs (e.g., Vortexa, Kpler, AIS/ship tracking, Market Pricing data).
- Familiarity with supporting data science workflows, including integration with Python-based analytics environments (e.g., Jupyter, Databricks).
- Experience working with time-series data, geospatial data, and event-driven architectures for real-time tracking and alerting.
- Familiarity with monitoring and logging tools in Google Cloud/AWS
- Exposure to monitoring and logging tools such as Prometheus, Grafana, or the ELK Stack.
- Experience with agile methodologies and collaborative development environments
- Familiarity with GitOps, DevSecOps practices, Zero Trust Architecture, and policy-as-code.
- Knowledge of data lakehouse architecture, Delta Lake, or Apache Spark on AWS.
- Experience with hybrid cloud environments.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Ability to support PCI DSS certification efforts.