
SailPoint

Senior Data Engineer (Flink/Spark Streaming, Java)

Posted Yesterday
Hybrid
Pune, Maharashtra
Senior level

SailPoint is the leader in identity security for the cloud enterprise. Our identity security solutions secure and enable thousands of companies worldwide,
giving our customers unmatched visibility into the entirety of their digital workforce, ensuring workers have the right access to do their job – no more, no less.
Want to be on a team full of results-driven individuals who are constantly seeking to innovate? Want to make an impact?
At SailPoint, our Data Platform team does just that. SailPoint is seeking a Senior Data Engineer to help build robust data ingestion and processing systems to power our data platform. We are looking for well-rounded engineers who are passionate about building and delivering reliable, scalable data pipelines.
 

This is a unique opportunity to build something from scratch while having the backing of an organization with the muscle to take it to market quickly, and a very satisfied customer base.

Responsibilities:

  • Spearhead the design and implementation of ELT processes, especially focused on extracting data from and loading data into various endpoints, including RDBMS, NoSQL databases, and data warehouses.

  • Develop and maintain scalable data pipelines for both stream and batch processing leveraging JVM based languages and frameworks.

  • Collaborate with cross-functional teams to understand diverse data sources and environment contexts, ensuring seamless integration into our data ecosystem.

  • Utilize AWS service-stack wherever possible to implement lean design solutions for data storage, data integration and data streaming problems.

  • Develop and maintain workflow orchestration using tools like Apache Airflow.

  • Stay abreast of emerging technologies in the data engineering space, proactively incorporating them into our ETL processes.

  • Thrive in an environment with ambiguity, demonstrating adaptability and problem-solving skills.
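The stream-processing work described above centers on windowed aggregation over event streams. As a minimal, framework-free sketch of that idea in plain Java (JDK only; a real pipeline would use Flink's or Spark Streaming's windowing operators, and the class and method names here are hypothetical, not from SailPoint's codebase):

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Illustrative only: a tumbling-window count over timestamped events --
// the core idea behind the windowed aggregations a Flink or Spark
// Streaming job performs, minus the framework's state, watermarks,
// and fault tolerance.
public class TumblingWindowCount {

    /** Buckets event timestamps (ms) into fixed-size windows and counts events per window. */
    static Map<Long, Long> countPerWindow(List<Long> eventTimesMs, long windowMs) {
        Map<Long, Long> counts = new TreeMap<>();
        for (long t : eventTimesMs) {
            long windowStart = (t / windowMs) * windowMs; // align to window boundary
            counts.merge(windowStart, 1L, Long::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        // Events at 0.1s, 0.2s, and 1.5s with 1-second windows:
        // two events fall in the window starting at 0, one in the window starting at 1000.
        System.out.println(countPerWindow(List.of(100L, 200L, 1500L), 1000L));
    }
}
```

In a production Flink job this per-key windowed count would instead be expressed with the framework's window assigners, so the runtime handles out-of-order events and checkpointed state.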

Qualifications:

  • BS in computer science or a related field.

  • 5+ years of experience in data engineering or related field.

  • Demonstrated system-design experience orchestrating ELT processes targeting data stores such as RDBMS, NoSQL databases, and data warehouses.

  • Hands-on experience with at least one streaming or batch processing framework, such as Flink or Spark.

  • Hands-on experience with containerization platforms such as Docker and container orchestration tools like Kubernetes.

  • Proficiency in AWS service stack.

  • Familiarity with workflow orchestration tools such as Airflow.

  • Experience with dbt, Kafka, Jenkins, and Snowflake.

  • Experience leveraging tools such as Kustomize, Helm and Terraform for implementing infrastructure as code.

  • Strong interest in staying ahead of new technologies in the data engineering space.

  • Comfortable working in ambiguous team-situations, showcasing adaptability and drive in solving novel problems in the data-engineering space.

Preferred:

  • Experience with AWS

  • Experience with CI/CD

  • Experience instrumenting code for gathering production performance metrics

  • Experience working with a data catalog tool (e.g. Atlan or Alation)

What success looks like in the role

Within the first 30 days you will:

  • Onboard and set up your development and test environment and get access to the data platform.

  • Meet your team and key stakeholders and complete new-hire onboarding.

  • Build understanding of the company, the team’s mission, and your role, and identify the team’s top priorities.

  • Complete your first ticket and learn core workflows (e.g. version control and configuration).

  • Complete relevant product and platform training.

  • Shadow on-call and monitoring to see how the team supports production.

By 60 days:

  • Contribute regularly by working on well-scoped tickets and documenting how similar work is done.

  • Use team development practices (e.g. CI/CD, code review) and give or receive feedback on pull requests.

  • Complete required security and secure-coding training.

  • Build familiarity with the team’s data models, business logic, dashboards, and runbooks.

  • Participate in secondary on-call and monitoring.

By 90 days:

  • Take on more ownership: higher-point tickets, primary on-call, and helping improve runbooks and operations.

  • Learn observability, data, and streaming concepts used by the team and document key components and their interactions.

  • Share your experience (e.g. internal blog or project summary) and continue learning deployment and orchestration concepts relevant to the platform.

  • Collaborate with support and engineering management to resolve escalations and improve how the team operates.

SailPoint is an equal opportunity employer, and we welcome all qualified candidates to apply to join our team. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other category protected by applicable law.

Alternative methods of applying for employment are available to individuals unable to submit an application through this site because of a disability. Contact [email protected] or mail to 11120 Four Points Dr, Suite 100, Austin, TX 78726, to discuss reasonable accommodations.  NOTE: Any unsolicited resumes sent by candidates or agencies to this email will not be considered for current openings at SailPoint.

Top Skills

Apache Airflow
AWS
dbt
Docker
Flink
Helm
Java
Jenkins
Kafka
Kubernetes
Kustomize
Snowflake
Spark
Terraform

SailPoint Pune, Maharashtra, IND Office

Lohia Jain Arcade, Sr. No. 106/107, Near Chatursringi Temple, Senapati Bapat Road , Pune, Maharashtra , India, 411016


