
Nexaminds

Senior Databricks Engineer (Exp: 8-12 yrs | Databricks, PySpark, Apache Spark, Python, SQL, Azure, DevOps)

Posted Yesterday
In-Office or Remote
4 Locations
Senior level

Unlock Your Future with Nexaminds!

At Nexaminds, we're on a mission to redefine industries with AI. We're passionate about the limitless potential of artificial intelligence to transform businesses, streamline processes, and drive growth.

Join us on our visionary journey. We're leading the way in AI solutions, and we're committed to innovation, collaboration, and ethical practices. Become a part of our team and shape the future powered by intelligent machines. If you're driven by ambition, success, fun, and learning, Nexaminds is where you belong.


🚀 Are you a pro at developing Databricks pipelines, and at enhancing or supporting existing ones?

Are you strong in Databricks, PySpark, Apache Spark, SQL, and Python, with sharp debugging skills for business-critical data and extensive Azure exposure?

Have you worked hands-on in E-Commerce or Retail domains?

Then don't wait any further: your next big opportunity is here! 🌟

Join us at Nexaminds and be part of an exciting journey where innovation meets impact. The benefits are unbelievable, and so is the experience you'll gain!


Role Summary

We are seeking a Data Engineer with strong Databricks expertise to design, build, and maintain scalable, high-performance data pipelines on cloud platforms. The role focuses on developing production-grade ETL/ELT pipelines, enabling data modernization initiatives, and ensuring data quality, governance, and security across enterprise data platforms.

You will work closely with data engineers, analysts, and business stakeholders to deliver reliable, cost-efficient, and scalable data solutions, primarily on Azure.

Key Responsibilities

  • Design, build, and maintain scalable data pipelines using Databricks (PySpark, Apache Spark, Delta Lake, SQL, Python).
  • Develop and optimize ETL/ELT workflows for performance, reliability, and cost efficiency.
  • Implement data quality, data profiling, governance, and security best practices.
  • Design and maintain data models to support analytics, reporting, and downstream consumption.
  • Collaborate with data engineers, analysts, and business stakeholders to define and implement data requirements.
  • Troubleshoot and resolve issues across data workflows, Spark jobs, and distributed systems.
  • Support cloud data platform modernization and migration initiatives.
  • Automate workflows using Databricks Workflows / Jobs and scheduling tools.
  • Participate in code reviews and contribute to engineering best practices.
  • Work within Agile/Scrum teams to deliver data solutions iteratively.
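To make the data-quality and profiling responsibilities above concrete, here is a minimal, framework-agnostic sketch of the kind of checks involved. The field names ("order_id", "amount") are purely illustrative, and on Databricks the same rules would typically be expressed as PySpark or Delta Lake expectations rather than plain Python:

```python
def profile_nulls(rows, fields):
    """Return the fraction of missing values per field (simple data profiling)."""
    total = len(rows)
    return {
        f: sum(1 for r in rows if r.get(f) in (None, "")) / total
        for f in fields
    }

def deduplicate(rows, key):
    """Keep the first row seen for each key value (simple data-quality rule)."""
    seen, out = set(), []
    for r in rows:
        k = r.get(key)
        if k not in seen:
            seen.add(k)
            out.append(r)
    return out

# Hypothetical sample records for illustration only.
orders = [
    {"order_id": 1, "amount": 25.0},
    {"order_id": 1, "amount": 25.0},   # duplicate key
    {"order_id": 2, "amount": None},   # missing amount
]

print(profile_nulls(orders, ["order_id", "amount"]))
print(len(deduplicate(orders, "order_id")))
```

In a production pipeline, checks like these would run as a validation stage before data lands in a curated Delta table, with failures routed to alerting rather than silently dropped.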

Must-Have Skills & Experience

  • 10+ years of experience in Data Engineering
  • Solid understanding of the Azure cloud platform.
  • Strong hands-on expertise with Databricks, PySpark, Apache Spark, Delta Lake, and Databricks SQL.
  • Excellent programming skills in Python and SQL.
  • Experience building production-grade ETL/ELT pipelines.
  • Experience with data modeling, data profiling, data warehousing, and distributed computing concepts.
  • Working knowledge of Shell Scripting for automation.
  • Experience with Azure Event Hubs, GitHub, and Terraform.
  • Experience using JFrog Artifactory or a similar artifact repository for artifact management.
  • Understanding of cloud security and access controls.

Nice-to-Have Skills

  • Exposure to CI/CD pipelines for data engineering workloads.
  • Knowledge of streaming data processing.
  • Familiarity with Azure DevOps or similar tools.
  • Experience supporting large-scale analytics or enterprise data platforms.

Soft Skills

  • Strong analytical and problem-solving skills.
  • Ability to work independently and in cross-functional teams.
  • Excellent communication skills to interact with technical and non-technical stakeholders.
  • Proactive mindset with attention to data accuracy and reliability.


What you can expect from us

Here at Nexaminds, we're not your typical workplace. We're all about creating a friendly and trusting environment where you can thrive. Why does this matter? Well, trust and openness lead to better quality, innovation, commitment to getting the job done, efficiency, and cost-effectiveness.

  • Stock options 📈
  • Remote work options 🏠
  • Flexible working hours 🕜
  • Benefits beyond the legal minimum
  • But it's not just about the work; it's about the people too. You'll be collaborating with some seriously awesome IT pros.
  • You'll have access to mentorship and tons of opportunities to learn and level up.

Ready to embark on this journey with us? 🚀🎉 If you're feeling the excitement, go ahead and apply!

Top Skills

Spark
Azure
Azure DevOps
Azure Event Hub
Databricks
Delta Lake
Git
JFrog Artifactory
PySpark
Python
Shell Scripting
SQL
Terraform


