Mindera

Senior Data Engineer

Posted 3 Days Ago
India
Senior level

Description

We are looking for an experienced Data Engineer to join our dynamic team. The ideal candidate will have a strong background in big data technologies, ETL/ELT workflows, and data modeling. This role focuses on designing and optimizing data pipelines, maintaining data integrity, and supporting our analytics initiatives.

Requirements

Candidates should have 5 to 7+ years of relevant data engineering experience, with a strong background in big data technologies, ETL/ELT processes, and data modeling.

Key Responsibilities:

  • Design, develop, and maintain scalable ETL/ELT pipelines using PySpark and Databricks to facilitate data ingestion and processing.
  • Implement and enhance data streaming solutions for real-time data processing.
  • Improve Spark job performance by tuning memory management, applying effective partitioning strategies, and using efficient data storage formats.
  • Collaborate with data scientists and analysts to gather data requirements and provide reliable datasets for analysis.
  • Create and refine complex SQL queries for data extraction, transformation, and analysis.
  • Maintain data quality and integrity through automated testing and validation methods.
  • Document data workflows and maintain metadata for governance purposes.
  • Research and adopt new data engineering technologies and methods to enhance efficiency and scalability.

Mandatory Skills:

  • PySpark: Proficient in using PySpark for data processing and ETL workflows.
  • Azure Databricks: Experience with the Databricks platform, including cluster setup and management.
  • Data Streaming: Knowledge of streaming data processing with frameworks such as Spark Streaming.
  • Python: Strong programming skills in Python for scripting and automation tasks.
  • SQL: Advanced skills in SQL for querying and managing relational databases.
  • Spark Optimization: Experience in optimizing Spark applications for enhanced performance.

Optional Skills:

  • Snowflake: Familiarity with Snowflake for data warehousing and query optimization.
  • Cloud Platforms: Understanding of cloud services (AWS, Azure, GCP) for data storage and processing.
  • ETL/ELT Concepts: Knowledge of ETL/ELT processes, data modeling, and data warehousing best practices.
  • Big Data Tools: Familiarity with tools and frameworks such as Kafka, Hadoop, and Hive.
  • CI/CD Practices: Understanding of CI/CD for automated deployment and version control using tools like Git, Jenkins, etc.

Benefits

We offer:
  • Flexible working hours (self-managed)
  • Competitive salary
  • Annual bonus, subject to company performance
  • Access to Udemy online training and opportunities to learn and grow within the role

At Mindera we use technology to build products we are proud of, with people we love.

Software Engineering Applications, including Web and Mobile, are at the core of what we do at Mindera.

We partner with our clients, to understand their products and deliver high-performance, resilient and scalable software systems that create an impact on their users and businesses across the world.

You get to work with a bunch of great people, and the whole team owns the project together.

Our culture reflects our lean, self-organising mindset.

We encourage our colleagues to take risks, make decisions, collaborate, and talk to everyone to enhance communication. We are proud of our work and we love to learn while navigating an Agile, Lean and collaborative environment.

Check out our Blog and our Handbook.

Our offices are located: Porto, Portugal | Aveiro, Portugal | Coimbra, Portugal | Leicester, UK | San Diego, USA | Chennai, India | Bengaluru, India

Top Skills

PySpark
Python
SQL
