
Exusia

Snowflake Data Engineer

Reposted 24 Days Ago
Remote
Hiring Remotely in India
Senior level
The Snowflake Data Engineer will develop and manage large-scale data pipelines, collaborate with stakeholders, and optimize data engineering solutions using Snowflake.

Department: Sales and Delivery Team - Empower

Industry: Information Technology & Services, Computer Software, Management Consulting

Location: WFH / India Remote

Experience Range: 6-10 years

Basic Qualification: Bachelor of Engineering or equivalent

Travel Requirements: Not required

Website: www.exusia.com

 

Exusia, a cutting-edge digital transformation consultancy, is looking for top talent in the DWH and Data Engineering space, with specific skills in Snowflake, Python, and DBT, to join our global delivery team's Industry Analytics practice in India.

What’s the Role?

 

·        Work full-time with Exusia's clients in the United States to deliver bleeding-edge, data-driven solutions

·        Develop and manage large-scale data pipelines and data repositories

·        Collaborate with Product Owners and Solution Architects to develop optimized data engineering solutions

 

Criteria for the Role!

 

·        Minimum of 6 years of experience working as a Data Engineer

·        Minimum of 2 years of experience with Snowflake and DBT

·        Master of Science (preferably in Computer and Information Sciences or Business Information Technology) or an Engineering degree in the above areas

·        Excellent communication skills, with the ability to work directly with business stakeholders; a creative problem solver who is flexible, proactive, and eager to learn new tools and technologies


Responsibilities

 

·      Implement end-to-end data pipelines to move data from source systems into a data lake or data warehouse

·      Build pipeline automation and orchestration processes

·      Develop Snowflake data models (e.g., star schema, snowflake schema) for optimized query performance

·      Work with Data Analysts to ensure pipelines are tested and optimized to provide accurate and timely data

·      Work in an agile software delivery model, managing changing requirements and priorities during the SDLC

Mandatory Skills

 

·      Experience developing and maintaining Snowflake data models (e.g., star schema, snowflake schema) for optimized query performance

·      Experience creating and maintaining Snowpipe and SnowSQL scripts for data loading, transformation, and retrieval

·      Proficiency in SQL for data manipulation, transformation, and processing

·      Expertise in DBT for developing modular, scalable, and well-documented data pipelines

·      Strong Python programming experience to support data processing and automation

·      Hands-on experience with Airflow for orchestrating data pipelines

·      Knowledge of cloud platforms, specifically storage and database services, for sourcing or staging data for Snowflake

·      Problem-solving skills, with the ability to work with large datasets and debug data pipeline issues
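As a flavor of the scripting this role involves, here is a minimal sketch of generating a Snowflake MERGE statement for an incremental load; the table and column names are hypothetical, and in practice the rendered SQL would be executed through SnowSQL or a DBT model rather than assembled by hand:

```python
# Minimal illustrative sketch: render a Snowflake MERGE statement for an
# incremental (upsert) load from a staging table into a target table.
# All table and column names below are hypothetical examples.

def build_merge_sql(target: str, staging: str, key: str, columns: list[str]) -> str:
    """Build a MERGE that updates matched rows and inserts new ones."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    insert_cols = ", ".join([key] + columns)
    insert_vals = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )

sql = build_merge_sql("analytics.orders", "raw.orders_stage", "order_id",
                      ["status", "amount"])
print(sql)
```

In day-to-day work the same upsert logic is usually expressed declaratively, for example as a DBT incremental model with a `unique_key`, rather than as raw string templating.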

 

Nice-to-Have Skills

 

·      Understanding of data modeling and legacy ETL technologies

·      Prior migration experience from on-premises legacy databases to Snowflake

·      Knowledge of Spark and Spark-based data processing tools

·      Exposure to one or more cloud platforms: Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP)

·      Exposure to data governance aspects such as metadata management, data dictionaries, data glossaries, and data lineage

Top Skills

Airflow
DBT
Python
Snowflake
SQL

Exusia Pune, Mahārāshtra, IND Office

Marisoft C, Wing C Vadgon Sheri, Kalyani Nagar Annex, Floors 1 & 2, Pune, Maharashtra, India, 411014


