Arista is excited to scale the Wi-Fi Team in the Pune Development Center to take its Cognitive Wi-Fi solution to the next level. Arista has ambitious plans to grow the Pune-based Development Center over the next couple of years, and now is a great time to join, when you can have a significant impact on the team's shape and direction as the office grows.
At Arista, we value the diversity of thought and perspectives each employee brings. We believe fostering an inclusive environment where individuals from various backgrounds and experiences feel welcome is essential for driving creativity and innovation.
Our commitment to excellence has earned us several prestigious awards, including recognition in the Great Place to Work Survey for Best Engineering Team and Best Company for Diversity, Compensation, and Work-Life Balance. At Arista, we take pride in our track record of success and strive to maintain the highest standards of quality and performance in everything we do.
Job Description

Who You'll Work With
As a Data Engineer, you will be a member of the Wi-Fi Data team, which is part of the broader Software Engineering team. With increasing amounts of data being ingested into the cloud, the Wi-Fi Data team will play a crucial role in the success of Arista's Cognitive Wi-Fi solution. Because the team is small and relatively new, there is plenty of room to grow and make an impact.
What You’ll Do
As part of the Wi-Fi Data team, you will work closely with Data Scientists to build and maintain data and AI/ML pipelines that operate at scale, covering use cases such as anomaly detection, root cause analysis, automatic remediation, and analytics. You will also develop ELT data pipelines that extract data from multiple Wi-Fi data sources and ingest it into a data warehouse. In most cases, you will work directly with the product backend to extract the data for further processing. Beyond these core responsibilities, you will develop and own the CI/CD pipelines that deploy your data pipelines. Depending on the project, you may also get the chance to share your work with the larger community through talks and blog posts.
Qualifications
- Bachelor's degree in Computer Science or a related field.
- Proficiency in Python or Go.
- Experience working with databases (Relational and/or NoSQL).
- Experience with data processing libraries, such as Apache Beam.
- Experience developing data pipelines and ELT/ETL workflows.
- Hands-on experience with DevOps tools such as Jenkins, Git, Docker, Kubernetes, Ansible, and CI/CD pipelines.
- Knowledge of data manipulation libraries, such as Pandas (Python), would be a plus.
All your information will be kept confidential according to EEO guidelines.