As a Senior Data Engineer, you will build and manage data pipelines, processing solutions, and analytical tools using Big Data technologies and AI-assisted coding.
What is Findem:
Findem is the only talent data platform that combines 3D data with AI. It automates and consolidates top-of-funnel activities across your entire talent ecosystem, bringing together sourcing, CRM, and analytics into one place. Only 3D data connects people and company data over time, making an individual’s entire career instantly accessible in a single click, removing the guesswork, and unlocking insights about the market and your competition that no one else can. Powered by 3D data, Findem’s automated workflows across the talent lifecycle are the ultimate competitive advantage. Enabling talent teams to deliver continuous pipelines of top, diverse candidates while creating better talent experiences, Findem transforms the way companies plan, hire, and manage talent. Learn more at www.findem.ai
Experience: 5-9 years
We are looking for an experienced Big Data Engineer who will be responsible for building, deploying, and managing data pipelines, data lakes, and Big Data processing solutions using Big Data and ETL technologies.
Alongside strong core engineering skills, candidates must be highly proficient in AI-assisted coding. Fluency with AI tools such as Cline, Cursor, or similar is expected as part of a modern engineering workflow: candidates should be able to use these tools effectively, write good prompts, and guide the AI responsibly to produce high-quality, maintainable code.
Location: Delhi, India
Hybrid: 3 days onsite
Responsibilities
- Build data pipelines, Big Data processing solutions, and data lake infrastructure using a range of Big Data and ETL technologies
- Assemble and process large, complex data sets that meet functional and non-functional business requirements
- Perform ETL from a wide variety of sources such as MongoDB, S3, server-to-server transfers, and Kafka, and process the data using SQL and Big Data technologies (see the sketch after this list)
- Build analytical tools that provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- Build interactive, ad-hoc, self-serve query tools for analytics use cases
- Build data models and schemas designed for performance, scalability, and functional requirements
- Build processes supporting data transformation, metadata, dependency, and workflow management
- Research, experiment with, and prototype new tools and technologies, and drive them to successful adoption
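To illustrate the kind of pipeline work described above, here is a minimal PySpark sketch of an S3-to-data-lake ETL job. The bucket paths, column names (event_id, event_ts), and app name are illustrative placeholders, not Findem's actual systems.

```python
# Minimal, illustrative PySpark ETL sketch: read raw JSON events from S3,
# apply a simple transformation, and write partitioned Parquet to a data lake.
# All paths and column names are placeholders, not Findem specifics.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: raw JSON events landed in S3 (placeholder path)
raw = spark.read.json("s3a://example-raw-bucket/events/")

# Transform: basic cleanup and derivation of a partition column
cleaned = (
    raw
    .dropDuplicates(["event_id"])                     # assumes an event_id column
    .filter(F.col("event_ts").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))  # partition key
)

# Load: columnar, partitioned output for downstream SQL engines
(
    cleaned.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-lake-bucket/events_cleaned/")
)

spark.stop()
```

Writing partitioned Parquet keeps the output friendly to the SQL engines listed in the skill requirements, such as Athena and Presto.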
Skill Requirements
- Must have: strong proficiency in Python or Scala
- Must be highly proficient in AI-assisted coding: fluency with tools such as Cline or Cursor, the ability to write effective prompts, and the judgment to guide the AI responsibly toward high-quality, maintainable code
- Must have experience with Big Data technologies such as Spark, Hadoop, Athena/Presto, Redshift, and Kafka
- Experience with file formats such as Parquet, JSON, Avro, and ORC
- Experience with workflow management tools such as Airflow (see the sketch after this list)
- Experience with batch processing, streaming, and message queues
- Experience with visualization tools such as Redash, Tableau, or Kibana
- Experience working with structured and unstructured data sets
- Strong problem-solving skills
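As a hedged illustration of the workflow-management experience mentioned above, here is a minimal Airflow DAG sketch wiring a daily extract-transform-load sequence, assuming a recent Airflow 2.x installation. The DAG id, schedule, and task bodies are placeholders, not a description of Findem's pipelines.

```python
# Minimal, illustrative Airflow DAG for orchestrating a daily batch job.
# DAG id, schedule, and task callables are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull a day's worth of data from a source system
    print("extracting data for", context["ds"])


def transform(**context):
    # Placeholder: clean and reshape the extracted data
    print("transforming data for", context["ds"])


def load(**context):
    # Placeholder: write the result to the data lake / warehouse
    print("loading data for", context["ds"])


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```

The `>>` operator declares task dependencies, so the transform only runs after the extract succeeds and the load runs last.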
Good to have
- Exposure to NoSQL databases such as MongoDB
- Exposure to cloud platforms such as AWS and GCP
- Exposure to microservices architecture
- Exposure to machine learning techniques
The role is full-time and comes with full benefits. We are globally headquartered in the San Francisco Bay Area with our India headquarters in Bengaluru.
Equal Opportunity
As an equal opportunity employer, we do not discriminate on the basis of race, color, religion, national origin, age, sex (including pregnancy), physical or mental disability, medical condition, genetic information, gender identity or expression, sexual orientation, marital status, protected veteran status or any other legally-protected characteristic.
Top Skills
Airflow
Athena
AWS
Cline
Cursor
GCP
Hadoop
Kafka
Kibana
MongoDB
Presto
Python
Redash
Redshift
Scala
Spark
Tableau