
DATAMAXIS, Inc

Sr. Databricks Data Engineer - I

Posted 12 Days Ago
India
Senior level
This role involves designing and developing enterprise data solutions and robust data pipelines using Databricks, PySpark, and SQL, focusing on optimization and continuous improvement.

Job Title: Databricks Data Engineer - I
Experience: 5+ years
Location: Remote
Job Type: Full-time with AB2

We are seeking an experienced Databricks Data Engineer who can play a crucial role in our Fintech data lake project.

What You Bring
• 5+ years of experience working with data warehousing systems
• 3+ years of strong hands-on programming expertise in the Databricks landscape, including SparkSQL and Workflows, for data processing and pipeline development
• 3+ years of strong hands-on data transformation/ETL skills using Spark SQL, PySpark, and Unity Catalog, working in a Databricks Medallion architecture
• 2+ years of work experience on one of the cloud platforms: Azure, AWS, or GCP
• Experience using Git version control, and well versed in CI/CD best practices to automate the deployment and management of data pipelines and infrastructure
• Nice to have: hands-on experience building data ingestion pipelines from ERP systems (preferably Oracle Fusion) into a Databricks environment, using Fivetran or an alternative data connector
• Experience in a fast-paced, ever-changing, and growing environment
• Understanding of metadata management, data lineage, and data glossaries is a plus
• Must have report development experience using Power BI, SplashBI, or any enterprise reporting tool

What You'll Do
• Participate in the design and development of enterprise data solutions in Databricks, from ideation to deployment, ensuring robustness and scalability
• Work with the Data Architect to build and maintain robust, scalable data pipeline architectures on Databricks using PySpark and SQL (a minimal pipeline sketch follows the Qualifications section below)
• Assemble and process large, complex ERP datasets to meet diverse functional and non-functional requirements
• Contribute to continuous optimization efforts, implementing testing and tooling techniques to enhance data solution quality (a testing sketch also follows below)
• Focus on improving the performance, reliability, and maintainability of data pipelines
• Implement and maintain PySpark and Databricks SQL workflows for querying and analyzing large datasets
• Participate in release management using Git and CI/CD practices
• Develop business reports in the SplashBI reporting tool, leveraging data from the Databricks gold layer

Qualifications
• Bachelor's Degree in Computer Science, Engineering, Finance, or equivalent experience
• Good communication skills
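
To illustrate the Medallion-architecture work described above, here is a minimal PySpark sketch of a bronze-to-silver-to-gold flow on Databricks. The catalog, schema, table names, and columns (e.g., fintech.bronze.erp_invoices, invoice_id) are hypothetical placeholders, not details from this posting.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook or job

# Bronze: raw ERP invoice records as ingested (e.g., via Fivetran or another connector)
bronze = spark.table("fintech.bronze.erp_invoices")

# Silver: deduplicated, cleaned, and conformed records
silver = (
    bronze
    .dropDuplicates(["invoice_id"])
    .filter(F.col("invoice_amount").isNotNull())
    .withColumn("invoice_date", F.to_date("invoice_date"))
)
silver.write.mode("overwrite").saveAsTable("fintech.silver.invoices")

# Gold: business-level aggregate consumed by reporting tools such as SplashBI or Power BI
gold = (
    spark.table("fintech.silver.invoices")
    .groupBy("customer_id", F.date_trunc("month", "invoice_date").alias("invoice_month"))
    .agg(F.sum("invoice_amount").alias("total_invoiced"))
)
gold.write.mode("overwrite").saveAsTable("fintech.gold.monthly_invoice_totals")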

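The posting also calls for testing and tooling practices that improve data solution quality, alongside release management with Git and CI/CD. One common approach, shown here only as a hypothetical sketch, is to isolate a transformation in a plain Python function and unit-test it (for example with pytest) so the test can run automatically in a CI pipeline on every push. The function, data, and names below are illustrative, not taken from the posting.

from pyspark.sql import SparkSession, functions as F

def clean_invoices(df):
    # Drop duplicate invoices and rows with a missing amount (the same rule as the silver step above).
    return df.dropDuplicates(["invoice_id"]).filter(F.col("invoice_amount").isNotNull())

def test_clean_invoices():
    spark = SparkSession.builder.master("local[1]").appName("pipeline-tests").getOrCreate()
    raw = spark.createDataFrame(
        [("inv-1", 100.0), ("inv-1", 100.0), ("inv-2", None)],
        ["invoice_id", "invoice_amount"],
    )
    cleaned = clean_invoices(raw)
    assert cleaned.count() == 1  # duplicate removed and the null-amount row filtered out
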

Top Skills

AWS
Azure
CI/CD
Databricks
GCP
Git
Power BI
PySpark
SparkSQL
SplashBI
Unity Catalog
Workflows


