Kone

Senior Lead Data Engineer

Pune, Maharashtra
Senior level

We are looking for a

Senior Data Engineer, Data Products (Pune)

To join the team developing the Data Foundation-based data products for KONE. Our Data Foundation-based data products are a key enabler in our digital transformation, creating the ability to develop new scalable analytics, AI and digital use cases by leveraging data across the whole KONE organization. Data products play a vital role in business value generation, and driving an optimized data architecture is crucial to ensure reusability of the data assets on the cloud-based Data Foundation.

This is a hands-on, roll-up-your-sleeves position that requires a passion for data, engineering and DevOps. In this role you will be doing hands-on data engineering tasks, from new development to resolving technical issues and maintaining and optimizing workloads. We offer a chance to work hands-on with state-of-the-art cloud technology in a global, multi-cultural work environment, located in our office in Pune. We expect multi-cloud data engineering experience as your professional background.

We are searching for an enthusiastic person to join the team who is excited about developing their professional skills further, learning new things and contributing to team success. An ideal candidate has a strong background in data engineering, software engineering and data integration with a modern multi-cloud stack, but above all the will to commit to a DevOps mindset and reach goals together.

We expect you to take a self-driven, proactive approach to your work: find and implement solutions, continuously look for improvement opportunities in your own area, solve problems, make decisions and share learnings with colleagues. We want to work with people who enjoy teamwork, are not afraid to step out of their comfort zone, want to help others and share information.

To succeed in this role, the following professional experience will play a key role:

  • Master's degree in software engineering, data engineering, computer science, or a related field
  • Hands-on data engineering professional experience (> 3 years)
    • Previous hands-on professional experience in developing and maintaining data pipelines on AWS, Azure and/or Databricks. Working proficiency in the tech stack: AWS, GitLab, Databricks, Airflow, SQL, Python, Scala and dbt for developing ETL jobs; AWS CDK and Terraform for IaC (a minimal Airflow sketch follows this list).
    • Hands-on development experience with lakehouse architecture based on Databricks and Delta Lake: a multi-hop medallion architecture that divides the data lake into bronze, silver and gold layers based on the quality and reusability of the data stored there, and data product publishing in Unity Catalog (a minimal medallion sketch follows this list).
    • Strong coding proficiency in multiple languages, SQL and Python in particular; additional languages are a bonus. Ability to write compelling code and technical documentation, and to visualize your technical design.
    • Fluency in industry-standard DevOps practices and tools.
    • Practical experience in working with enterprise data landscapes and data structures: structured data, unstructured data, metadata, master data, transactional data, batch/NRT. Experience with enterprise data sources, e.g. SAP ERP, Salesforce, product data management (PDM) and many others.
  • Way-of-working professional experience
    • Passion for utilizing agile development methodologies and tools (Jira, Confluence, draw.io).
    • Built-in cybersecurity awareness and an understanding of data privacy and compliance regulations.
    • Ability to work in a global, multi-cultural team and collaborate effectively within the team.
    • Ability to self-organize, take accountability and be proactive, seek feedback, be courageous and resilient, and have excellent problem-solving skills.
    • Experience with DataOps and ITSM processes.
    • Proficiency in spoken and written English, and strong facilitation and communication skills.
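
As a rough illustration of the orchestration part of the stack listed above, here is a minimal sketch of a daily ETL pipeline using Airflow's TaskFlow API; the DAG name, tasks and sample data are hypothetical placeholders, not KONE's actual pipelines.

```python
# A minimal sketch of a daily ETL pipeline orchestrated with Airflow's
# TaskFlow API. The DAG name, tasks and data are hypothetical
# illustrations only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def equipment_events_etl():
    @task
    def extract() -> list[dict]:
        # In a real pipeline this would read from a landing zone or API.
        return [{"equipment_id": "E-1", "status": "ok"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Normalize fields before loading into the target store.
        return [{**row, "status": row["status"].upper()} for row in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for a write to the warehouse / Delta Lake.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


equipment_events_etl()
```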
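
For the medallion architecture mentioned above, here is a minimal PySpark sketch, assuming a Delta-enabled Spark environment; the table names, path and cleansing rules are hypothetical, and on Databricks with Unity Catalog the tables would typically use three-level catalog.schema.table names.

```python
# A minimal PySpark sketch of a multi-hop medallion flow on Delta Lake
# (assumes a Databricks or Delta-enabled Spark environment; all table
# names, paths and rules are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: ingest raw source data as-is, append-only.
raw = spark.read.json("/landing/equipment_events/")  # hypothetical path
raw.write.format("delta").mode("append").saveAsTable("bronze.equipment_events")

# Silver: cleanse and conform (drop malformed rows, cast types).
silver = (
    spark.table("bronze.equipment_events")
    .where(F.col("equipment_id").isNotNull())
    .withColumn("event_ts", F.to_timestamp("event_ts"))
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.equipment_events")

# Gold: aggregate into a reusable data product for analytics consumers.
gold = (
    spark.table("silver.equipment_events")
    .groupBy("equipment_id")
    .agg(F.count("*").alias("event_count"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.equipment_usage")
```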

The position is based in Pune, India.

KONE Data Engineer role-related responsibilities

Quality focus

  • Responsible for the design and implementation of data pipelines according to business/analytics needs and best practices
  • Responsible for ensuring that data pipelines are monitored and reliable
  • Responsible for fixing defects in a timely manner
  • Responsible for assembling data sets into a useful format for analysis using fit-for-purpose database technologies
  • Responsible for building services and tools to make data more accessible to all data consumers
  • Responsible for the documentation of data transformations, data models, and data flows
  • Responsible for following the KONE cybersecurity guidelines

Collaboration focus

  • Works with cross-functional analytics, business and technology teams to deliver scalable solutions
  • Handles code reviews in the team
  • Continuously looks for improvement opportunities in own area and shares them with the rest of the team; explains own work and resulting conclusions both orally and in writing

Planning focus

  • Decomposes problems into component parts and effectively solves well-scoped problems
  • Participates in and actively contributes to agile ceremonies such as daily stand-ups, sprint planning and retros
  • Participates in backlog grooming and story estimations

Accountabilities and Decisions

  • Responsible for understanding the project goals, data, methods, and their limitations
  • Responsible for taking initiative on what needs to be done without being asked
  • Responsible for spotting opportunities to solve problems within the data engineering scope of work
  • Responsible for planning and designing the implementation and identifying required data sources within own scope
  • Accountable for adequate test coverage for backlog items in own scope
  • Accountable for documenting the code in design documents and in the code itself
  • Accountable for reviewing peer deliverables as planned
  • Accountable for following agreed best practices and guidelines
  • Accountable for defect fixing of the implementation in own scope
  • Accountable for defining the test cases
  • Accountable for providing knowledge transfer to production

At KONE, we are focused on creating an innovative and collaborative working culture where we value the contribution of each individual. Employee engagement is a key focus area for us, and we encourage participation and the sharing of information and ideas. Sustainability is an integral part of our culture and our daily practice. We follow ethical business practices, and we seek to develop a culture of working together where co-workers trust and respect each other and good performance is recognized. As a great place to work, we are proud to offer a range of experiences and opportunities that will help you to achieve your career and personal goals and enable you to live a healthy and balanced life.

Read more on www.kone.com/careers

Top Skills

AWS
Python
Scala
SQL
