
Nimble Gravity

Data Operations Analyst

Reposted 25 Days Ago
Remote
12 Locations
Mid level

Nimble Gravity is a team of outdoor enthusiasts, adrenaline seekers, and experienced growth hackers. We love solving hard problems and believe the right data can transform and propel growth for any organization.

We’re looking for a Data Operations Analyst to bridge the gap between business operations and technical data workflows. You’ll work hands-on with Python scripts, Databricks notebooks, and LLM API calls, ensuring seamless execution of pre-built analytics and AI tools. Beyond technical tasks, you'll collaborate with business stakeholders, drive operational efficiency, and help translate data insights into actionable strategies.

If you’re someone who enjoys solving business problems with data, working with pre-existing ML models and workflows, and keeping AI processes running smoothly, then you belong with us!

What You’ll Own
  • Ensure the execution and reliability of data pipelines, notebooks, and ML workflows within Databricks and similar tools.

  • Operate and maintain production-grade AI tools and LLM API calls—ensuring availability, performance, and business continuity.

  • Translate complex technical workflows into business impact, partnering with ops, analytics, and leadership teams.

  • Identify and resolve data bottlenecks; optimize Python scripts and SQL queries for performance and scalability.

  • Maintain clear documentation of processes and data products to support transparency and reusability.

  • Validate, monitor, and report on the performance of AI models and analytics outputs.

  • Partner with engineering and analytics teams to improve automation and elevate internal tooling.

  • Contribute to AI/data governance and compliance frameworks for scalable growth.

What You Bring
  • 3–5+ years of experience in data ops, analytics, engineering, or related technical/ops roles.

  • Strong hands-on proficiency in Python, with the ability to read, run, and troubleshoot code.

  • Experience in production environments working with Databricks, SQL, and cloud-based data platforms.

  • Exposure to AI/ML tooling, including working with pre-trained models and external APIs.

  • A proactive mindset with a demonstrated ability to connect data solutions to business priorities.

  • Strong communication and collaboration skills — you can speak “tech” and “business” fluently.

  • English proficiency at the C1 level.

Bonus Points If You Have
  • Experience integrating or managing LLM APIs (OpenAI, Anthropic, etc.) in operational workflows.

  • Familiarity with modern orchestration tools (e.g., Airflow, dbt, Dagster).

  • A background in program or project management within data-centric environments.

  • Passion for documentation, reproducibility, and process improvement.

Nimble Gravity is an Equal Opportunity Employer and considers applicants for employment without regard to race, color, religion, sex, sexual orientation, national origin, age, disability, genetics, or any other basis forbidden under federal, state, or local law. Nimble Gravity considers all qualified applicants.


Top Skills

Databricks
LLM APIs
Python
SQL
Workflow Automation Tools


