
Wells Fargo

Senior Software Engineer - Big Data Developer

Hybrid
Hyderabad, Telangana
Senior level

About this role:

Wells Fargo is seeking a Senior Software Engineer. We believe in the power of working together because great ideas can come from anyone. Through collaboration, any employee can have an impact and make a difference for the entire company. Explore opportunities with us for a career in a supportive environment where you can learn and grow.

In this role, you will:

  • Lead moderately complex initiatives and deliverables within technical domain environments
  • Contribute to large-scale planning of strategies
  • Design, code, test, debug, and document for projects and programs associated with technology domain, including upgrades and deployments
  • Review moderately complex technical challenges that require an in-depth evaluation of technologies and procedures
  • Resolve moderately complex issues and lead a team to meet existing client needs or potential new client needs while leveraging a solid understanding of the function, policies, procedures, or compliance requirements
  • Collaborate and consult with peers, colleagues, and mid-level managers to resolve technical challenges and achieve goals
  • Lead projects, act as an escalation point, and provide guidance and direction to less experienced staff

Required Qualifications:

  • 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:

  • 4+ years of experience building end-to-end business solutions using big data technologies such as HDFS, Hive, Kafka, Scala, Python, and Spark.
  • Demonstrated strength in data modeling, ETL development, and data warehousing.
  • Knowledge of and hands-on experience with the Hadoop ecosystem and big data technologies such as HDFS, Hive, Kafka, Spark, Scala, and Python preferred.
  • Experience in API design and development.
  • Good knowledge of Linux, with ETL development, deployment, and optimization experience using standard big data tools.
  • Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, and SharePoint.
  • Continually develop depth and breadth in key competencies.
  • Demonstrate curiosity toward learning and treat negative events as learning opportunities.
  • Ability to communicate facts, figures, and ideas to others clearly and concisely, in writing and verbally.
  • Deliver effective presentations and talks.
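As a rough illustration of the ETL skills listed above, the sketch below shows a minimal extract-transform-load step in plain Python. The record layout (trade_id, symbol, amount) and helper names are invented for the example; production work of the kind this role describes would typically run such logic at scale in Spark against HDFS or Hive tables.

```python
import csv
import io
import json

def transform_row(row):
    """Toy transform step: normalize one raw record (fields invented for illustration)."""
    return {
        "trade_id": row["trade_id"].strip(),
        "symbol": row["symbol"].upper(),
        "amount": round(float(row["amount"]), 2),
    }

def run_etl(raw_csv_text):
    """Extract rows from CSV text, transform each, and load them as JSON lines."""
    reader = csv.DictReader(io.StringIO(raw_csv_text))
    return "\n".join(json.dumps(transform_row(r)) for r in reader)

raw = "trade_id,symbol,amount\n 42 ,msft,101.239\n"
print(run_etl(raw))  # {"trade_id": "42", "symbol": "MSFT", "amount": 101.24}
```

The same shape (parse, per-record transform, serialized load) maps directly onto a Spark DataFrame pipeline, where the transform becomes column expressions and the load targets a Hive table or HDFS path.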

Job Expectations:

  • Work as an individual contributor on the delivery team responsible for data pipeline creation, data onboarding, and support for data streams within RDS.
  • Design and develop highly scalable applications and research technologies to solve complex business problems.
  • Develop reusable solutions that can be shared with multiple groups.
  • Define opportunities across IT to maximize business impact and innovate engineering processes to reduce software construction and maintenance costs. 
  • Contribute to integrating complex platforms, including several components, with business domain and process context.
  • Focus on building relevant engineering and business capabilities in the organization to keep pace with demand and best practices in the industry. 
  • Coordinate implementation activities across a broad range of functions and departments; work with client groups to identify, arrange, and/or deliver training needs. 
  • Lead organizational initiatives. Work with stakeholders to research new frameworks, tools, and proofs of concept.
  • Develop and lead focused groups and communities to facilitate technical discussions, source ideas, and provide engineering leadership.
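The data-pipeline work described above typically follows a consume-transform-commit loop. As a hedged, stdlib-only sketch (an in-memory queue stands in for a Kafka topic partition, and the batch size and transform are invented for illustration):

```python
import queue

def process_stream(topic, sink, batch_size=2):
    """Drain records from a queue (stand-in for a Kafka topic partition),
    transform them, and append them to the sink in small batches."""
    batch = []
    while True:
        try:
            record = topic.get_nowait()
        except queue.Empty:
            break                        # topic drained; stop consuming
        batch.append(record.upper())     # toy per-record transform
        if len(batch) >= batch_size:
            sink.extend(batch)           # "commit" the processed batch
            batch = []
    sink.extend(batch)                   # flush any partial final batch
    return len(sink)

topic = queue.Queue()
for msg in ["alpha", "beta", "gamma"]:
    topic.put(msg)
sink = []
process_stream(topic, sink)
print(sink)  # ['ALPHA', 'BETA', 'GAMMA']
```

With a real Kafka consumer the "commit" step would be an offset commit after the batch is durably written, which is what gives the pipeline its at-least-once delivery guarantee.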

Posting End Date: 

10 Jun 2025

*Job posting may come down early due to volume of applicants.

We Value Equal Opportunity

Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.

Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit’s risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants with Disabilities

To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy

 

Wells Fargo maintains a drug-free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements:

a. Third-Party recordings are prohibited unless authorized by Wells Fargo.

b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Top Skills

CI/CD
Git
HDFS
Hive
JIRA
Kafka
Linux
Python
Scala
Spark


