
Drivetrain

Software Engineer (Data Engineering)

Posted 13 Days Ago
Remote
Hiring Remotely in India
Junior

Drivetrain is on a mission to empower businesses to make better decisions. Our financial planning & decision-making platform helps companies scale and achieve their targets predictably.


Drivetrain is a remote-first company headquartered in the San Francisco Bay Area. Founded in 2021 by a couple of ex-Googlers, Drivetrain is a fast-growing company on a trajectory for success with backing from leading venture capital firms.


Drivetrain provides a culture in which its employees can thrive and be happy.


💜 Remote-friendly: Drivetrain brings together the best and the brightest, no matter where they are, and gives them a great degree of autonomy. We trust our people.

🗣️ Open & transparent: We know that when our creators have access to all the information they need, their best work will emerge.

👏 Idea-friendly: We provide an environment to explore new ideas, to take risks, to make mistakes, and to learn, so you can succeed. Anyone in the company can come up with great ideas and become a catalyst for positive change. We let the best ideas win.

👥 Customer-centric: We follow a product-led growth strategy, continuously learning from our customers and collaborating with them to build the amazing software that Drivetrain is.


About the role


Drivetrain is looking for a Data Engineer to join our team. In this role, you will lay the foundation of an exceptional data engineering practice.


The ideal candidate is confident with a programming language of their choice, able to learn new technologies quickly, has strong software engineering and computer science fundamentals, and has extensive experience with common big data workflow frameworks and solutions.

What you’ll be doing

  • Build and monitor large-scale data pipelines that ingest data from a variety of sources.
  • Develop and scale our DBT setup for transforming data (a brief illustrative sketch follows this list).
  • Work with our data platform team to solve customer problems.
  • Use your advanced SQL & big data skills to craft cutting-edge data solutions.
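
To make the day-to-day concrete, here is a minimal sketch of the kind of DBT model you would build and scale in this role. It is purely illustrative: the source, table, and column names (billing, raw_invoices, stg_invoices) are hypothetical and not part of Drivetrain's actual setup.

    -- models/stg_invoices.sql (hypothetical example model)
    -- Reads a raw source table and standardizes column names and types
    -- for downstream models.
    with source as (
        select * from {{ source('billing', 'raw_invoices') }}
    )
    select
        id as invoice_id,
        cast(amount as numeric) as invoice_amount,
        cast(issued_at as date) as invoice_date
    from source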

Requirements

  • 1-5+ years in a data engineering role (high-growth startup environments highly preferred).
  • Expert-level SQL skills.
  • At least 1 year of experience with DBT is required.
  • At least 1 year of experience with a leading data warehouse is required.
  • Track record of success in building new data engineering processes and an ability to work through ambiguity.
  • Willingness to roll up your sleeves and fix problems in a hands-on manner.
  • Intellectual curiosity and research abilities.

Sound exciting? Apply at [email protected]. It may just be the best next decision you ever make!

Top Skills

SQL


