Fractal

Data Engineer (AWS + Python)

Posted 8 Days Ago
6 Locations
Mid level
The Data Engineer will implement data pipelines and analytics solutions using AWS technologies, collaborate with stakeholders, and optimize data processes.

It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

Data Analytics Engineer - AWS at Fractal.ai

Fractal is one of the most prominent players in the Artificial Intelligence space. Fractal's mission is to power every human decision in the enterprise, and it brings AI, engineering, and design to help the world's most admired Fortune 500® companies.

Fractal has more than 3,000 employees across 16 global locations, including the United States, UK, Ukraine, India, Singapore, and Australia.

Fractal has consistently been rated as one of India's best companies to work for by The Great Place to Work® Institute, featured as a leader in the Customer Analytics Service Providers Wave™ 2021, Computer Vision Consultancies Wave™ 2020, and Specialized Insights Service Providers Wave™ 2020 by Forrester Research, and recognized as an "Honorable Vendor" in the 2021 Magic Quadrant™ for data & analytics by Gartner.

Experience: 3-5 years

Location: Pan-India

Responsibilities:

As a Data Engineer, you will be responsible for implementing complex data pipelines and analytics solutions to support key decision-making business processes in our client's domain.

You will gain exposure to a project that leverages cutting-edge AWS technology, applying Big Data and Machine Learning to solve new and emerging problems for our clients. You will also have the added advantage of working closely with AWS Professional Services teams, executing directly within AWS services and technologies to solve complex and challenging business problems for enterprises.

Key responsibilities include:

- Work closely with Product Owners and AWS Professional Service Architects to understand requirements, formulate solutions, and implement them.

- Implement scalable data transformation pipelines as per design (see the PySpark sketch after this list).

- Implement the data model and data architecture as per the laid-out design.

- Evaluate new capabilities of AWS analytics services, develop prototypes, assist in drawing up POVs, and participate in design discussions.
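
To make the pipeline work concrete, here is a minimal PySpark sketch of the kind of scalable transformation pipeline the role involves. It is illustrative only: the bucket names, paths, and column names are hypothetical, not taken from any actual client project.

```python
# Minimal batch-transformation sketch in PySpark (illustrative; all S3
# paths and column names below are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-transform").getOrCreate()

# Read semi-structured JSON landed in a raw S3 zone.
raw = spark.read.json("s3://example-raw-zone/orders/2024-01-01/")

# Conform the data: drop malformed rows, cast types, derive a partition key.
conformed = (
    raw.dropna(subset=["order_id", "order_ts"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
)

# Write partitioned Parquet to a curated zone for downstream analytics
# (e.g., Athena or Redshift Spectrum queries).
(conformed.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-zone/orders/"))
```

A script like this can run on EMR or as an AWS Glue Spark job with little change, which is what makes the pattern a common starting point.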

Requirements:

• Minimum 3 years' experience implementing transformation and loading of data from a wide variety of traditional and non-traditional sources (structured, unstructured, and semi-structured) using SQL, NoSQL, and data pipelines for real-time, streaming, batch, and on-demand workloads

• At least 2 years implementing solutions using AWS services such as Lambda, Athena, Glue, S3, Redshift, Kinesis, and Apache Spark

• Experience working with data warehouse, data lake, or lakehouse concepts on AWS

• Experience implementing batch processing using AWS Glue/Lake Formation and AWS Data Pipeline

• Experience with EMR/MSK

• Experience with or exposure to AWS DynamoDB is a plus

• Ability to develop object-oriented code using Python, in addition to PySpark and SQL, plus one other language (Java or Scala preferred)

• Experience with streaming technologies, both on-premises and in the cloud, such as consuming from and producing to Kafka and Kinesis

• Experience building pipelines and orchestrating workflows in an enterprise environment using Apache Airflow/Control-M (see the orchestration sketch after this list)

• Experience implementing one of Redshift, Databricks, or Snowflake on AWS

• A good understanding of dimensional data modelling is a plus

• Ability to multi-task and prioritize deadlines as needed to deliver results

• Ability to work independently or as part of a team

• Excellent verbal and written communication skills with great attention to detail and accuracy

• Experience working in an Agile/Scrum environment
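
For the orchestration requirement above, a common enterprise pattern is an Airflow DAG that chains a Glue transformation with a Redshift load. A minimal sketch follows, assuming Airflow 2.4+ with the Amazon provider package installed; the DAG id, Glue job name, bucket, table, and connection id are all hypothetical.

```python
# Hedged sketch of a daily Airflow DAG: run a Glue batch job, then COPY the
# curated output into Redshift. Names and connection ids are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator
from airflow.providers.amazon.aws.transfers.s3_to_redshift import S3ToRedshiftOperator

with DAG(
    dag_id="orders_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run the (pre-existing) Glue job that writes curated Parquet to S3.
    transform = GlueJobOperator(
        task_id="transform_orders",
        job_name="orders-daily-transform",
        region_name="us-east-1",
    )

    # COPY the curated Parquet from S3 into a Redshift staging table.
    load = S3ToRedshiftOperator(
        task_id="load_orders",
        schema="staging",
        table="orders",
        s3_bucket="example-curated-zone",
        s3_key="orders/",
        copy_options=["FORMAT AS PARQUET"],
        redshift_conn_id="redshift_default",
    )

    transform >> load
```

Control-M shops would express the same dependency as a job flow rather than a DAG; the transform-then-load shape is the part that carries over.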

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Not the right fit?  Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page or create an account to set up email alerts as new job postings become available that meet your interest!

Top Skills

Apache Airflow
Spark
AWS
AWS Athena
AWS DynamoDB
AWS Glue
AWS Kinesis
AWS Lambda
AWS Redshift
AWS S3
Databricks
EMR
Kafka
NoSQL
Python
Snowflake
SQL
