
Lytx

Senior Data Engineer (Streaming and Big Data)

Reposted 4 Days Ago
India
Senior level

Lytx:

Are you looking to take your career in Data Engineering to the next level? Do you enjoy building data pipelines that enable critical business insights? Are you an expert in designing and delivering scalable data streaming solutions? Lytx is looking for a Senior Data Engineer to join our growing team!

You’ll know you’re right for this job if you enjoy designing, developing, deploying, and maintaining Big Data pipelines in the cloud. In addition, you know the Data Engineering landscape intimately and enjoy advising software developers and mentoring more junior employees.

What You'll Do:
  • Participate in the full software development lifecycle (requirements, design, code, unit test, deployment, sustaining)
  • Provide senior-level contributions to the development teams responsible for implementing mission critical data applications  
  • Participate in the research, design, and testing of next generation data engine platforms
  • Develop and guide long-term strategy for data pipelines and persistent and NoSQL data storage 
  • Evaluate and recommend database infrastructure and tools including cloud technologies  
  • Build integrated and automated data pipelines using the Hadoop Ecosystem (Spark, Kafka, Hive, Yarn, Oozie)  
  • Develop and manage streaming data pipelines used for training and developing machine learning models
  • Document processes and create data flow diagrams
  • Mentor and coach junior engineers, software developers, and data analysts
  • Perform other related duties as assigned
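To give a concrete flavor of the streaming aggregation work described above, here is a minimal, self-contained Python sketch. It uses a plain list of events in place of a real Kafka topic, and the device names and window size are purely illustrative — in production this logic would run as a Spark/Kafka streaming job rather than an in-memory loop.

```python
from collections import defaultdict

def window_counts(events, window_secs=60):
    """Group (timestamp, device_id) events into fixed time windows and
    count events per device per window -- the basic shape of a windowed
    streaming aggregation. `events` stands in for a Kafka stream here."""
    counts = defaultdict(int)
    for ts, device in events:
        window_start = ts - (ts % window_secs)  # align to window boundary
        counts[(window_start, device)] += 1
    return dict(counts)

# Hypothetical example: three events from two camera devices
events = [(5, "cam-1"), (42, "cam-1"), (61, "cam-2")]
print(window_counts(events))
# {(0, 'cam-1'): 2, (60, 'cam-2'): 1}
```

In a real pipeline the same grouping would be expressed declaratively (e.g. a windowed `groupBy` over a Kafka source), with the framework handling late data and state.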

What You'll Need:

  • Bachelor’s degree in Information Technology or relevant experience.
  • Minimum 5 years of experience in Data Engineering and/or Software Engineering
  • Minimum 5 years’ experience developing data pipelines with Java, Python, or Scala (real-time pipelines preferred)
  • Strong knowledge of the Apache Hadoop ecosystem, including HDFS, MapReduce, Sqoop, YARN, Hive, and Oozie
  • Significant experience with Spark, Kafka, NiFi, and other streaming data tools
  • Experience deploying data pipelines in the cloud (AWS preferred)
  • Strong experience with Data Lake technologies and techniques
  • Solid understanding of T-SQL and ETL programming including SQL Server Integration Services (SSIS)  
  • Excellent conceptual, analytical, and problem-solving skills
  • Experience in a high growth, fast paced environment
  • Demonstrated record of delivering quality technology outcomes  
  • Strong organizational skills and a keen attention to detail
  • Self-motivated, engaged, and accountable

Innovation Lives Here

You go all in no matter what you do, and so do we. At Lytx, we’re powered by cutting-edge technology and Happy People. You want your work to make a positive impact in the world, and that’s what we do. Join our diverse team of hungry, humble and capable people united to make a difference.

Together, we help save lives on our roadways.

Find out how good it feels to be a part of an inclusive, collaborative team. We’re committed to delivering an environment where everyone feels valued, included and supported to do their best work and share their voices.

Lytx, Inc. is proud to be an equal opportunity/affirmative action employer and maintains a drug-free workplace. We’re committed to attracting, retaining and maximizing the performance of a diverse and inclusive workforce. EOE/M/F/Disabled/Vet.

Top Skills

AWS
Hadoop
HDFS
Hive
Java
Kafka
MapReduce
NiFi
Oozie
Python
Scala
Spark
SQL Server Integration Services (SSIS)
Sqoop
YARN


