
Aryng

Data Engineer

Posted 6 Days Ago
Remote
Hiring Remotely in Mumbai, Maharashtra
Mid level
Develop and implement data engineering solutions in cloud environments, handle asynchronous data ingestion, and optimize data pipelines for efficiency.
Description

Welcome! You made it to the job description page!

Aryng is looking for a Data Engineer with experience developing enterprise-class, distributed data engineering solutions in the cloud. We are seeking an entrepreneurial, technology-proficient Data Engineer who is expert in implementing large-scale, highly efficient data platforms, batch and real-time pipelines, and tools for Aryng clients. This role is based in India. You will work closely with a team of highly qualified data scientists, business analysts, and engineers to ensure we build effective solutions for our clients. Your biggest strength is creative and effective problem-solving.

Key Responsibilities:

  • Implement asynchronous data ingestion, high-volume stream data processing, and real-time data analytics using a range of data engineering techniques (an illustrative sketch follows this list).
  • Implement application components using cloud technologies and infrastructure.
  • Assist in defining data pipelines and identify bottlenecks to enable the adoption of data management methodologies.
  • Implement cutting-edge cloud platform solutions using the latest tools and platforms offered by GCP, AWS, and Azure (AWS is preferred).
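
To give a concrete flavor of the asynchronous ingestion and stream-processing work mentioned above, here is a minimal sketch using only the Python standard library. It is illustrative only, not a description of Aryng's actual stack; the source and sink functions (fetch_events, load_batch) are hypothetical placeholders for a real event stream and warehouse load.

```python
# Minimal sketch of asynchronous micro-batch ingestion (illustrative only).
# fetch_events and load_batch are hypothetical stand-ins for a real source/sink.
import asyncio
import random

async def fetch_events(queue: asyncio.Queue, n_events: int = 20) -> None:
    """Simulate an upstream producer pushing events onto a queue."""
    for i in range(n_events):
        await asyncio.sleep(random.uniform(0.01, 0.05))  # simulated arrival lag
        await queue.put({"event_id": i, "payload": f"record-{i}"})
    await queue.put(None)  # sentinel: stream finished

async def load_batch(batch: list[dict]) -> None:
    """Stand-in for a warehouse or data-lake write."""
    await asyncio.sleep(0.02)  # simulated I/O
    print(f"loaded batch of {len(batch)} records")

async def ingest(queue: asyncio.Queue, batch_size: int = 5) -> None:
    """Consume events and flush them to the sink in micro-batches."""
    batch: list[dict] = []
    while True:
        event = await queue.get()
        if event is None:          # end of stream
            break
        batch.append(event)
        if len(batch) >= batch_size:
            await load_batch(batch)
            batch = []
    if batch:                      # flush any trailing partial batch
        await load_batch(batch)

async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue(maxsize=100)
    await asyncio.gather(fetch_events(queue), ingest(queue))

if __name__ == "__main__":
    asyncio.run(main())
```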

Functional capabilities: requirements gathering, client management, team handling, program delivery, and project management (project estimation, project scoping, Agile methodology).

Requirements
  • 3-5 years of data engineering experience is a must.
  • 3+ years implementing and managing data engineering solutions on cloud platforms (GCP/AWS/Azure) or on-premise distributed servers; AWS is preferred.
  • Comfortable working and interacting with clients.
  • 2+ years’ experience in Python.
  • Strong command of SQL and its core concepts.
  • Experience with BigQuery, Snowflake, Redshift, and dbt.
  • Strong understanding of data warehousing, data lake, and cloud concepts.
  • Excellent communication and presentation skills
  • Excellent problem-solving skills, highly proactive and self-driven
  • Consulting background is a big plus.
  • Must have a B.S. in computer science, software engineering, computer engineering, electrical engineering, or a related area of study
  • Working knowledge of Airflow is preferred

Good to have:

  • Experience with some of the following: Apache Beam, Hadoop, Airflow, Kafka, Spark
  • Experience in Tableau, Looker, or other BI tools is preferred

Availability:

  • Available to join immediately

This role requires mandatory overlap hours with clients in the US from 8 am to 1 pm PST.

Benefits
  • Direct Client Access
  • Flexible work hours
  • Rapidly Growing Company
  • Awesome work culture
  • Learn From Experts
  • Work-life Balance
  • Competitive Salary
  • Executive Presence
  • End-to-End Problem Solving
  • 50%+ Tax Benefit
  • 100% Remote company
  • Flat Hierarchy
  • Opportunity to become a thought leader


Top Skills

Airflow
Apache Beam
AWS
Azure
BigQuery
dbt
GCP
Hadoop
Kafka
Looker
Python
Redshift
Snowflake
Spark
SQL
Tableau
