Honeywell

Software Engr II

Posted 4 Days Ago
In-Office or Remote
Hiring Remotely in India
Entry level

We are seeking a Data Engineer with 2 to 4 years of experience to design, build, and maintain scalable data pipelines using Databricks and cloud‑based data platforms. The ideal candidate will have hands‑on experience with Databricks Lakehouse architecture, building reliable ETL/ELT pipelines, and enabling analytics and data science use cases across the organization.

Your role will also include overseeing, supervising, and reviewing tasks performed by team members to ensure effective execution of work; managing end-to-end processes and projects for both internal and external clients with responsibility for timely and accurate delivery; issuing clear instructions and guidance to team members on assigned tasks; and mentoring and guiding junior colleagues to support their skill development, professional growth, and overall success.


Responsibilities
  • Design, develop, and maintain data pipelines using Databricks (PySpark / Spark SQL)
  • Implement and optimize ETL/ELT workflows using Databricks jobs, notebooks, and workflows
  • Build and manage Delta Lake tables, ensuring data reliability, performance, and ACID compliance
  • Develop and optimize data models for analytics, BI, and downstream consumption
  • Work with batch and streaming data processing using Spark Structured Streaming (where applicable)
  • Collaborate with data scientists, analysts, and product teams to deliver trusted datasets
  • Ensure data quality, validation, and monitoring across pipelines
  • Optimize Spark jobs for cost and performance (partitioning, caching, tuning)
  • Follow best practices for code versioning, documentation, and deployment
  • Support production workloads and assist with troubleshooting data issues
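As an illustrative sketch of the pipeline work described above — all table and column names here are hypothetical — a typical Delta Lake ETL step with partitioning and file compaction might look like:

```sql
-- Hypothetical example: load cleaned rows into a partitioned Delta table,
-- then compact small files and co-locate frequently filtered columns.
CREATE TABLE IF NOT EXISTS silver.orders (
  order_id    STRING,
  customer_id STRING,
  amount      DECIMAL(18, 2),
  order_date  DATE
)
USING DELTA
PARTITIONED BY (order_date);

INSERT INTO silver.orders
SELECT order_id, customer_id, CAST(amount AS DECIMAL(18, 2)), CAST(ts AS DATE)
FROM bronze.raw_orders;

-- Performance tuning: compact small files and Z-order by a common filter key
OPTIMIZE silver.orders ZORDER BY (customer_id);
```

This is a sketch, not a prescribed implementation; in practice the partitioning column, data types, and Z-order keys depend on the workload's query patterns.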
Qualifications
  • 2–4 years of professional experience as a Data Engineer 
  • Strong hands‑on experience with Databricks platform
  • Proficiency in Python (PySpark) and Spark SQL
  • Solid experience with Delta Lake, including merges and time travel
  • Strong SQL skills for data transformation and analysis
  • Experience with cloud data storage (AWS S3 / Azure Data Lake / GCP Cloud Storage)
  • Understanding of data warehousing and lakehouse concepts
  • Experience with ETL orchestration tools (Databricks Workflows, Airflow, Azure Data Factory, etc.)
  • Familiarity with Git and version control practices
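To make the Delta Lake expectations concrete — again with hypothetical table names — an upsert via ACID `MERGE` and a time-travel query look roughly like:

```sql
-- Hypothetical upsert: apply change records into a Delta table atomically
MERGE INTO silver.customers AS t
USING staging.customer_updates AS s
  ON t.customer_id = s.customer_id
WHEN MATCHED AND s.op = 'DELETE' THEN DELETE
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;

-- Time travel: read the table as of an earlier version or timestamp
SELECT * FROM silver.customers VERSION AS OF 12;
SELECT * FROM silver.customers TIMESTAMP AS OF '2024-06-01T00:00:00Z';
```

These are standard Delta Lake SQL forms; the `op` column and version/timestamp values are placeholders for illustration.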

Good to Have

  • Experience with streaming technologies (Kafka, Event Hubs, Kinesis)
  • Exposure to dbt, Unity Catalog, or Databricks governance features
  • Knowledge of cloud security, IAM, and cost optimization
  • Experience supporting BI tools (Power BI, Tableau, Looker)
  • Understanding of data science or ML workflows on Databricks
  • Experience working in Agile/Scrum teams
About Us

Honeywell helps organizations solve the world's most complex challenges in automation, the future of aviation and energy transition. As a trusted partner, we provide actionable solutions and innovation through our Aerospace Technologies, Building Automation, Energy and Sustainability Solutions, and Industrial Automation business segments – powered by our Honeywell Forge software – that help make the world smarter, safer and more sustainable.

Top Skills

AWS
Azure
GCP
Power BI
Python
SQL
Tableau


