
Capco

Junior Big Data NiFi Developer

Hybrid
Pune, Maharashtra
Mid level
Seeking a Big Data NiFi Developer to design, develop, and maintain data flow pipelines, integrating and processing large data volumes using NiFi and Spark.

Job Title: Big Data NiFi Developer

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry: projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence and thought leadership to help our clients transform their businesses. Together with our clients and industry partners, we deliver disruptive work that is changing the energy and financial services industries.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: Big Data NiFi Developer

Location: Pune (Hybrid)

Experience: 3 to 5 Years

Work Mode: Hybrid (2-3 days from the client office, rest remote)

Job Description:

We are seeking a highly skilled and motivated Big Data NiFi Developer to join our growing data engineering team in Pune. The ideal candidate will have hands-on experience with Apache NiFi, a strong understanding of big data technologies, and a background in data warehousing or ETL processes. If you are passionate about working with high-volume data pipelines and building scalable data integration solutions, we’d love to hear from you.

Key Responsibilities:
  • Design, develop, and maintain data flow pipelines using Apache NiFi.
  • Integrate and process large volumes of data from diverse sources using Spark and NiFi workflows (an illustrative sketch of this kind of batch step follows the qualification lists below).
  • Collaborate with data engineers and analysts to transform business requirements into data solutions.
  • Write reusable, testable, and efficient code in Python, Java, or Scala.
  • Develop and optimize ETL/ELT pipelines for performance and scalability.
  • Ensure data quality, consistency, and integrity across systems.
  • Participate in code reviews, unit testing, and documentation.
  • Monitor and troubleshoot production data workflows and resolve issues proactively.

Skills & Qualifications:
  • 3 to 5 years of hands-on experience in Big Data development.
  • Strong experience with Apache NiFi for data ingestion and transformation.
  • Proficient in at least one programming language: Python, Scala, or Java.
  • Experience with Apache Spark for distributed data processing.
  • Solid understanding of Data Warehousing concepts and ETL tools/processes.
  • Experience working with large datasets, batch and streaming data processing.
  • Knowledge of Hadoop ecosystem and cloud platforms (AWS, Azure, or GCP) is a plus.
  • Excellent problem-solving and communication skills.
  • Ability to work independently in a hybrid work environment.

Nice to Have:
  • Experience with NiFi registry and version control integration.
  • Familiarity with containerization tools (Docker/Kubernetes).
  • Exposure to real-time data streaming tools like Kafka.
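
For context only, the sketch below illustrates the kind of batch step described above: a Spark job that picks up data already landed by a NiFi flow, cleanses it, and writes it to a curated zone for downstream warehouse loads. It is a minimal, hypothetical example written in PySpark; the paths, column names and schema are assumptions made for illustration, not part of any actual Capco or client codebase.

# Illustrative only: minimal PySpark batch transform; paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("nifi-handoff-batch-transform").getOrCreate()

# Assume a NiFi flow has landed raw JSON records in a staging directory (hypothetical path).
raw = spark.read.json("/data/staging/transactions/")

# Basic cleansing: drop malformed rows, normalise types, de-duplicate, stamp a load date.
clean = (
    raw.dropna(subset=["transaction_id", "amount"])   # hypothetical columns
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("load_date", F.current_date())
       .dropDuplicates(["transaction_id"])
)

# Write partitioned Parquet for downstream warehouse loads (hypothetical target path).
clean.write.mode("overwrite").partitionBy("load_date").parquet("/data/curated/transactions/")

spark.stop()

In practice a job like this would be parameterised and scheduled, with NiFi handling ingestion, routing and provenance upstream of Spark.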


Top Skills

Apache NiFi
AWS
Azure
Docker
ETL
GCP
Hadoop
Java
Kafka
Kubernetes
Python
Scala
Spark


