
Oportun

Senior Software ML Engineer - R12388

Reposted 12 Days Ago
Remote
Hiring Remotely in IN
Senior level
The role involves platform engineering focusing on real-time ML deployment, data pipeline development, and CI/CD automation. Responsibilities include designing scalable infrastructure and collaborating with teams to optimize workflows.
ABOUT OPORTUN

Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.

 

WORKING AT OPORTUN


Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.


Position Overview:

 

We are seeking a highly skilled Platform Engineer with expertise in building self-serve platforms that combine real-time ML deployment and advanced data engineering capabilities. This role requires a blend of cloud-native platform engineering, data pipeline development, and deployment expertise. The ideal candidate will have a strong background in designing data workflows and scalable infrastructure for ML pipelines while enabling seamless integrations and deployments.

 

Responsibilities:

  • Platform Engineering
    Design and build self-serve platforms that support real-time ML deployment and robust data engineering workflows.
    Develop microservices-based solutions using Kubernetes and Docker for scalability, fault tolerance, and efficiency.
    Create APIs and backend services using Python and FastAPI to manage and monitor ML workflows and data pipelines (see the FastAPI sketch after this list).
  • Real-Time ML Deployment
    Architect and implement platforms for real-time ML inference using tools like AWS SageMaker and Databricks (see the SageMaker sketch after this list).
    Enable model versioning, monitoring, and lifecycle management with observability tools such as New Relic.
  • Data Engineering
    Build and optimize ETL/ELT pipelines for data preprocessing, transformation, and storage using PySpark and Pandas (see the PySpark sketch after this list).
    Develop and manage feature stores to ensure consistent, high-quality data for ML model training and deployment.
    Design scalable, distributed data pipelines on platforms like AWS, integrating tools such as DynamoDB, PostgreSQL, MongoDB, and MariaDB.
    Implement data lake and data warehouse solutions to support advanced analytics and ML workflows.
  • CI/CD and Automation
    Design and implement robust CI/CD pipelines using Jenkins, GitHub Actions, and other tools for automated deployments and testing.
    Automate data validation and monitoring processes to ensure high-quality and consistent data workflows.
  • Documentation and Collaboration
    Create and maintain detailed technical documentation, including high-level and low-level architecture designs.
    Collaborate with cross-functional teams to gather requirements and deliver solutions that align with business goals.
    Participate in Agile processes such as sprint planning, daily standups, and retrospectives using tools like Jira.
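
To make the Platform Engineering item above more concrete, here is a minimal, illustrative FastAPI service of the kind described. Everything in it (the service name, request schema, and stub model) is a hypothetical placeholder, not Oportun's actual platform.

```python
# Minimal sketch of a FastAPI backend for serving and monitoring an ML model.
# Names ("ml-platform", load_model, /predict) are illustrative assumptions only.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="ml-platform")  # hypothetical service name


class PredictRequest(BaseModel):
    features: list[float]


class PredictResponse(BaseModel):
    score: float
    model_version: str


# In a real platform the model would come from a registry (e.g. SageMaker or MLflow);
# a stub keeps this sketch self-contained.
def load_model():
    return lambda features: sum(features) / max(len(features), 1)


MODEL = load_model()
MODEL_VERSION = "0.0.1-stub"


@app.get("/health")
def health() -> dict:
    # Simple liveness probe, e.g. for a Kubernetes readiness/liveness check.
    return {"status": "ok", "model_version": MODEL_VERSION}


@app.post("/predict", response_model=PredictResponse)
def predict(req: PredictRequest) -> PredictResponse:
    return PredictResponse(score=MODEL(req.features), model_version=MODEL_VERSION)
```

During development a service like this would be started with uvicorn (for example, "uvicorn main:app --reload"), then containerized with Docker and deployed on Kubernetes as described above.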
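For the real-time ML deployment item, one common pattern is for a backend service to call an already-deployed SageMaker endpoint through boto3. The sketch below assumes a JSON-serving endpoint; the endpoint name and payload shape are placeholders.

```python
# Sketch of calling a real-time SageMaker inference endpoint with boto3.
# The endpoint name and payload schema are placeholders, not real resources.
import json

import boto3

runtime = boto3.client("sagemaker-runtime")


def score(features: list[float], endpoint_name: str = "example-model-endpoint") -> dict:
    """Send one record to a deployed SageMaker endpoint and return its JSON response."""
    response = runtime.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/json",
        Body=json.dumps({"features": features}),
    )
    return json.loads(response["Body"].read())
```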
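And for the data engineering item, the following is a small, self-contained PySpark ETL sketch: read raw events, clean and aggregate them into daily per-member features, and write partitioned Parquet. The paths, column names, and aggregations are illustrative assumptions.

```python
# Sketch of a small PySpark ETL step: read raw events, clean and aggregate,
# then write a partitioned Parquet output. Paths and columns are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/events/")  # hypothetical input path

daily_features = (
    raw.filter(F.col("amount").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("member_id", "event_date")
    .agg(
        F.count("*").alias("txn_count"),
        F.sum("amount").alias("total_amount"),
    )
)

# Partitioned output that a feature store or warehouse load could pick up downstream.
daily_features.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_features/"
)
```

A production pipeline would add data validation and schema enforcement before the write, in line with the automation responsibilities listed above.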


Required Qualifications
5+ years of experience in platform engineering, DevOps, or data engineering roles.
Hands-on experience with real-time ML model deployment and data engineering workflows.

Technical Skills
Strong expertise in Python and experience with Pandas, PySpark, and FastAPI.
Proficiency in container orchestration tools such as Kubernetes (K8s) and Docker.
Advanced knowledge of AWS services like SageMaker, Lambda, DynamoDB, EC2, and S3.
Proven experience building and optimizing distributed data pipelines using Databricks and PySpark.
Solid understanding of databases such as MongoDB, DynamoDB, MariaDB, and PostgreSQL.
Proficiency with CI/CD tools like Jenkins, GitHub Actions, and related automation frameworks.
Hands-on experience with observability tools like New Relic for monitoring and troubleshooting.

We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status or any other category protected by the laws or regulations in the locations where we operate.

 

California applicants can find a copy of Oportun's CCPA Notice here:  https://oportun.com/privacy/california-privacy-notice/.

 

We will never request personal identifiable information (bank, credit card, etc.) before you are hired. We do not charge you for pre-employment fees such as background checks, training, or equipment. If you think you have been a victim of fraud by someone posing as us, please report your experience to the FBI’s Internet Crime Complaint Center (IC3).

Top Skills

AWS SageMaker
Docker
DynamoDB
FastAPI
GitHub Actions
Jenkins
Kubernetes
MariaDB
MongoDB
New Relic
Pandas
PostgreSQL
PySpark
Python


