Fractal

GCP Data Engineer

Reposted 11 Days Ago
In-Office
5 Locations
Junior
Design and develop frameworks for data ingestion and processing using various tools on GCP; ensure compliance with data governance and support the development lifecycle.

It's fun to work in a company where people truly BELIEVE in what they are doing!

We're committed to bringing passion and customer focus to the business.

• Design and develop data-ingestion frameworks, real-time processing solutions, and data processing and transformation frameworks leveraging open-source tools (a brief illustrative sketch follows this list).
• Hands-on experience with technologies such as Kafka, Apache Spark (SQL, Scala, Java), Python, the Hadoop platform, Hive, Presto, Druid, and Airflow.
• Deep understanding of BigQuery architecture, best practices, and performance optimization.
• Proficiency in LookML for building data models and metrics.
• Experience with Dataproc for running Hadoop/Spark jobs on GCP.
• Knowledge of configuring and optimizing Dataproc clusters.
• Offer system support as part of a support rotation with other team members.
• Operationalize open-source data analytics tools for enterprise use.
• Ensure data governance policies are followed by implementing or validating data lineage, quality checks, and data classification.
• Understand and follow the company's development lifecycle to develop, deploy, and deliver solutions.
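
For illustration only, here is a minimal sketch of the kind of streaming ingestion work described above: Kafka into Spark Structured Streaming on Dataproc, landing in BigQuery. The broker address, topic, schema, project, table, and bucket names are placeholders, and it assumes a Dataproc image with the Spark BigQuery connector available; it is not part of the role's requirements.

# Illustrative sketch: Kafka -> Spark Structured Streaming -> BigQuery.
# All names (broker, topic, project, dataset, bucket) are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-ingestion-sketch").getOrCreate()

# Assumed event schema; a real pipeline would typically pull this from a schema registry.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_ts", TimestampType()),
    StructField("payload", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events.raw")                  # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Requires the Spark BigQuery connector (bundled on recent Dataproc images).
(
    events.writeStream.format("bigquery")
    .option("table", "my-project.analytics.events")                     # placeholder table
    .option("temporaryGcsBucket", "my-staging-bucket")                  # placeholder bucket
    .option("checkpointLocation", "gs://my-staging-bucket/checkpoints/events")
    .start()
    .awaitTermination()
)

In practice a job like this would be submitted to Dataproc, for example from an Airflow DAG via the Google provider's DataprocSubmitJobOperator, which is where the Dataproc and Airflow experience called out above comes in.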

Minimum Qualifications: 
• Bachelor's degree in Computer Science, CIS, or related field 
• Experience on project(s) involving the implementation of software development life cycles (SDLC)

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Not the right fit?  Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page or create an account to set up email alerts as new job postings become available that meet your interest!

Top Skills

Airflow
Spark
BigQuery
Dataproc
Druid
Hadoop
Hive
Java
Kafka
Presto
Python
Scala
SQL

Similar Jobs

48 Minutes Ago
Hybrid
Mumbai, Maharashtra, IND
Senior level
Big Data • Food • Hardware • Machine Learning • Retail • Automation • Manufacturing
Responsible for driving integrated talent management activities, collaborating with senior leaders on talent initiatives to support business strategy, and improving processes for talent programs and employee engagement.
49 Minutes Ago
Hybrid
Mumbai, Maharashtra, IND
Senior level
Big Data • Food • Hardware • Machine Learning • Retail • Automation • Manufacturing
Lead integrated talent management activities, drive performance management, oversee rewards processes, ensure compliance, and implement continuous improvement in HR operations.
55 Minutes Ago
Hybrid
2 Locations
Senior level
Fintech • Professional Services • Consulting • Energy • Financial Services • Cybersecurity • Generative AI
Responsible for installing, configuring, and maintaining Hadoop ecosystem components, managing cluster performance, implementing security, and automating monitoring.
Top Skills: Airflow, Flume, Hadoop, HBase, HDFS, Hive, Kafka, Kerberos, Oozie, Ranger, Sentry, Spark, Sqoop, YARN, Zookeeper

What you need to know about the Pune Tech Scene

Once a far-out concept, AI is now a tangible force reshaping industries and economies worldwide. While its adoption will automate some roles, AI has created more jobs than it has displaced, with an expected 97 million new roles to be created in the coming years. This is especially true in cities like Pune, which is emerging as a hub for companies eager to leverage this technology to develop solutions that simplify and improve lives in sectors such as education, healthcare, finance, e-commerce and more.
