Job Title: DataOps Engineer (AWS)
Experience: 5–7 Years
Employment Type: Full-Time
Work Mode: Remote
Location: India
Job Summary
We are looking for a skilled DataOps Engineer (AWS) to design, build, and manage scalable cloud infrastructure and data platforms. The ideal candidate will have strong hands-on experience with AWS, Terraform, CI/CD, and ETL/data pipelines, and will work closely with data engineers, analytics teams, and application developers to ensure reliable, secure, and high-performing systems.
Key Responsibilities
- Design, deploy, and manage AWS cloud infrastructure using Infrastructure as Code (Terraform)
- Build, maintain, and optimize ETL/data pipelines for large-scale data processing
- Automate infrastructure provisioning, deployment, and monitoring
- Develop and manage CI/CD pipelines for data and application workloads
- Ensure high availability, scalability, security, and cost optimization of AWS environments
- Support data ingestion, transformation, and orchestration workflows
- Monitor systems, troubleshoot issues, and improve system reliability
- Implement logging, monitoring, and alerting for infrastructure and data pipelines
- Collaborate with Data Engineers, Analytics teams, and DevOps teams
- Enforce security best practices, IAM policies, and compliance standards
Required Skills & Qualifications
- 5–7 years of experience in DevOps, Data Engineering, or Cloud Engineering
- Strong hands-on experience with AWS services (EC2, S3, RDS, Lambda, EMR, Glue, Redshift, etc.)
- Solid experience with Terraform for infrastructure automation
- Experience building and maintaining ETL/data pipelines
- Proficiency in Python and/or Shell scripting
- Experience with CI/CD tools (GitHub Actions, Jenkins, GitLab CI, etc.)
- Strong understanding of Linux systems and networking fundamentals
- Experience with monitoring tools (CloudWatch, Prometheus, Grafana, ELK, etc.)
Preferred / Nice-to-Have Skills
- Experience with Airflow, AWS Glue, or similar orchestration tools
- Knowledge of Docker and Kubernetes
- Experience with data warehouses and analytics platforms
- Understanding of security best practices and data governance
- Exposure to cost optimization and performance tuning on AWS
Top Skills
AWS
CI/CD
CloudWatch
ELK
ETL
Grafana
Linux
Prometheus
Python
Shell Scripting
Terraform