
Fractal

AWS Data Engineer

Posted 9 Days Ago
Pune, Maharashtra
Senior level
The AWS Data Engineer will create and manage data pipelines, support analytics solutions, and collaborate with AWS Professional Services and Product Owners on requirements and implementations. Key tasks include handling various data sources, developing in AWS, and using technologies like Python and Apache Airflow to manage and transform data for business decision-making.
The summary above was generated by AI

It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

Job Profile: Senior Engineer, AWS Cloud

Experience: 6-10 years

Job Location: Bangalore

Responsibilities:
As a Data Engineer, you will be responsible for implementing complex data pipelines and analytics solutions to support key decision-making business processes in our client's domain. You will gain exposure to a project that leverages cutting-edge AWS technology, applying Big Data and Machine Learning to solve new and emerging problems for our clients. You will also have the added advantage of working very closely with AWS Professional Services teams, executing directly within AWS services and technologies to solve complex and challenging business problems for enterprises.

Key responsibilities include:
- Work closely with Product Owners and AWS Professional Services Architects to understand requirements, formulate solutions, and implement them.
- Implement scalable data transformation pipelines as per design.
- Implement the data model and data architecture as per the laid-out design.
- Evaluate new capabilities of AWS analytics services, develop prototypes, assist in drawing POVs, and participate in design discussions.
Requirements:
• Minimum 3 years' experience implementing transformation and loading of data from a wide variety of traditional and non-traditional sources (structured, unstructured, and semi-structured) using SQL, NoSQL, and data pipelines for real-time, streaming, batch, and on-demand workloads
• At least 2 years implementing solutions using AWS services such as Lambda, Athena, Glue, S3, Redshift, Kinesis, and Apache Spark
• Experience working with data warehouse, data lake, or lakehouse concepts on AWS
• Experience implementing batch processing using AWS Glue/Lake Formation and AWS Data Pipeline
• Experience in EMR/MSK
• Experience with or exposure to Amazon DynamoDB will be a plus
• Ability to develop object-oriented code using Python, in addition to PySpark and SQL, plus one other language (Java or Scala preferred)
• Experience with streaming technologies, both on-premises and in the cloud, such as consuming from and producing to Kafka and Kinesis
• Experience building pipelines and orchestrating workflows in an enterprise environment using Apache Airflow/Control-M
• Experience implementing one of Redshift, Databricks, or Snowflake on AWS
• Good understanding of dimensional data modelling will be a plus
• Ability to multi-task and prioritize deadlines as needed to deliver results
• Ability to work independently or as part of a team
• Excellent verbal and written communication skills with great attention to detail and accuracy
• Experience working in an Agile/Scrum environment

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!


Top Skills

AWS
Java
PySpark
Python
Scala
SQL
