Job Description
Where you’ll work: India (Remote)
Engineering at GoTo
We’re the trailblazers of remote work technology. We build powerful, flexible work software that empowers everyone to live their best life, at work and beyond. And blaze even more trails along the way. There’s ample room for growth – so you can blaze your own trail here too. When you join a GoTo product team, you’ll take on a key role in this process and see your work be used by millions of users worldwide.
Your Day to Day
As a Senior Software Engineer – Big Data, you will:
Design, develop, and maintain robust, scalable, and efficient ETL/ELT pipelines to process structured and unstructured data from various sources.
Apply expert-level Python programming across data pipeline development and automation.
Leverage AWS services (e.g., S3, EKS, Lambda, EMR) to architect and implement cloud-native data solutions.
Work with Apache Spark and Databricks to process large-scale datasets, optimize performance, and build reusable data transformations.
Design and implement data models (both relational and dimensional) that support analytics, reporting, and machine learning use cases.
Schedule, monitor, and orchestrate workflows using Apache Airflow or equivalent tools.
Collaborate with analysts, data scientists, and business stakeholders to deliver trusted, high-quality data for downstream consumption.
Build data quality checks, logging, monitoring, and alerting to ensure pipeline reliability and visibility.
Develop SQL-based transformations and optimize queries for performance in cloud data warehouses and lakehouses.
Enable data-driven decisions by supporting self-service BI tools like Tableau, ensuring accurate and timely data availability.
Ensure adherence to data governance, security, and compliance requirements.
Mentor junior engineers and contribute to engineering best practices, including CI/CD, testing, and documentation.
What We’re Looking For
As a Senior Software Engineer – Big Data, your background will look like:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering or similar roles, with proven ability to build and scale end-to-end data pipelines.
Strong expertise in ETL/ELT development, data ingestion, and transformation using SQL and scripting languages (Python preferred).
Hands-on experience with Apache Spark and Databricks for big data processing.
In-depth knowledge of AWS services such as S3, Lambda, and EMR, as well as Apache Hive.
Proficient in data modeling, including dimensional and normalized models.
Experience with Airflow or similar orchestration frameworks.
Familiarity with BI tools like Tableau for reporting and dashboarding.
Strong understanding of data warehousing, lakehouse architectures, and modern data stack concepts.
Excellent problem-solving skills, communication, and the ability to work in an agile and collaborative environment.
At GoTo, authenticity and inclusive culture are key to our thriving workplace, where diverse perspectives drive innovation and growth. Our team of GoGetters is passionate about learning, exploring, and working together to achieve success while staying committed to delivering exceptional experiences for our customers. We take pride in supporting our employees with comprehensive benefits, wellness programs, and global opportunities for professional and personal development. By maintaining an inclusive environment, we empower our teams to do their best work, make a meaningful impact, and grow their careers.