
Aryng

Data Engineer

In-Office or Remote
Hiring Remotely in Pune, Maharashtra
Mid level
The Data Engineer will develop and implement distributed data solutions using cloud technologies, focusing on data ingestion and analytics. Responsibilities include managing data pipelines, collaborating with clients, and utilizing tools like AWS, GCP, and SQL.
Description

Welcome! You made it to the job description page!

Aryng is looking for a Data Engineer with experience in developing enterprise-class distributed data engineering solutions on the cloud. We are seeking an entrepreneurial, technology-proficient Data Engineer who is expert in implementing large-scale, highly efficient data platforms, batch and real-time pipelines, and tools for Aryng clients. This role is based out of India. You will work closely with a team of highly qualified data scientists, business analysts, and engineers to ensure we build effective solutions for our clients. Your biggest strength is creative and effective problem-solving.

Key Responsibilities:

  • Implement asynchronous data ingestion, high-volume stream data processing, and real-time data analytics using various data engineering techniques.
  • Implement application components using cloud technologies and infrastructure.
  • Assist in defining data pipelines and identify bottlenecks to enable the adoption of data management methodologies.
  • Implement cutting-edge cloud platform solutions using the latest tools and platforms offered by GCP, AWS, and Azure (AWS preferred).

Functional capabilities: requirement gathering, client management, team handling, program delivery, and project management (project estimation, project scoping, Agile methodology).

Requirements
  • 3-5 years of data engineering experience is a must.
  • 3+ years implementing and managing data engineering solutions on GCP/AWS/Azure or on-premise distributed servers (AWS preferred).
  • Comfortable working and interacting with clients.
  • 2+ years of experience in Python.
  • Strong command of SQL and its concepts.
  • Experience with BigQuery, Snowflake, Redshift, and dbt.
  • Strong understanding of data warehousing, data lake, and cloud concepts.
  • Excellent communication and presentation skills.
  • Excellent problem-solving skills; highly proactive and self-driven.
  • Consulting background is a big plus.
  • Must have a B.S. in computer science, software engineering, computer engineering, electrical engineering, or a related area of study.
  • Working knowledge of Airflow is preferred.

Good to have:

  • Experience in some of the following: Apache Beam, Hadoop, Airflow, Kafka, Spark
  • Experience in Tableau, Looker, or other BI tools is preferred

Availability:

  • Available to join immediately

This role requires mandatory overlap hours with clients in the US from 8 am to 1 pm PST.

Benefits
  • Direct Client Access
  • Flexible work hours
  • Rapidly Growing Company
  • Awesome work culture
  • Learn From Experts
  • Work-life Balance
  • Competitive Salary
  • Executive Presence
  • End to End Problem Solving
  • 50%+ Tax Benefit
  • 100% Remote company
  • Flat Hierarchy
  • Opportunity to become a thought leader


Top Skills

Airflow
Apache Beam
AWS
Azure
BigQuery
Cloud Technologies
dbt
GCP
Hadoop
Kafka
Looker
Python
Redshift
Snowflake
Spark
SQL
Tableau


