
Harrison.ai

Senior Data Engineer

Posted 23 Days Ago
Remote
Hiring Remotely in India
Senior level
As a Senior Data Engineer, you'll design and maintain scalable data platforms, implement data pipelines, and ensure data governance while collaborating with various stakeholders.
What we’re about

At Harrison.ai, we’re redefining what’s possible in healthcare. Through our diagnostic AI solutions, we’re building tools that support clinicians to deliver earlier, more accurate diagnoses and raise the standard of care for millions of patients worldwide.

Our mission is bold but simple: to scale global healthcare capacity and create a fairer, healthier world. By using AI as a co-pilot for clinicians, we’re tackling one of healthcare’s biggest challenges, the shortage of human expertise, and giving every patient the chance to access timely, high-quality care, no matter where they live.

Because while we’re building cutting-edge AI, what we’re really building is hope—that everyone can access the healthcare they deserve.

And we’re just getting started.

About Your Role

Our Analytics Team plays a central role in driving data-informed decision-making and maximising the value of our evolving data ecosystem. We are seeking a skilled and motivated Senior Data Engineer to architect and scale data platforms and pipelines that support analytics across the business.

As a Senior Data Engineer reporting to the Director of Analytics, you will design, build, and maintain scalable data platforms that power analytics and decision-making. You’ll collaborate with engineering, commercial and operations stakeholders to ensure accurate, timely and reliable data delivery. Your expertise will help us extract actionable insights and drive data-informed innovation across global markets. If you are passionate about building data systems that enable meaningful healthcare impact, we’d love to hear from you.

What You Do:
  • Design and implement end-to-end data pipelines to collect, process and transform data from multiple sources.

  • Develop and maintain scalable data platforms to enable efficient data storage, retrieval and analysis.

  • Collaborate with stakeholders to translate business and data requirements into robust technical designs.

  • Build and manage data orchestration workflows using tools such as Airflow and GitHub Actions.

  • Optimise and automate data ingestion and transformation processes to streamline workflows and reduce operational overhead.

  • Implement and maintain CI/CD pipelines to ensure consistent, testable and reliable data engineering deployments.

  • Implement data governance frameworks and ensure compliance with security, privacy, and regulatory standards.

  • Monitor data infrastructure performance, troubleshooting issues proactively to ensure reliability and scalability.

  • Define and maintain data models that support analytics and product reporting needs.
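To give a flavour of the pipeline work described above, here is a minimal sketch of an end-to-end extract-transform-load flow in plain Python. All source names, fields, and the data-quality rule are hypothetical, and a real pipeline at this scale would run inside an orchestrator such as Airflow rather than as a script:

```python
import csv
import io
import json

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV from a hypothetical source system into records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Normalise types and values; drop rows failing a basic quality gate."""
    out = []
    for r in records:
        if not r.get("site_id"):  # simple data-quality check: key must be present
            continue
        out.append({
            "site_id": r["site_id"],
            "scan_count": int(r["scan_count"]),
            "region": r["region"].strip().lower(),
        })
    return out

def load(records: list[dict]) -> str:
    """Serialise to newline-delimited JSON, standing in for a warehouse load."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)

# One bad row (missing site_id) is filtered out by the quality gate.
raw = "site_id,scan_count,region\ns1,3, APAC \n,9,EMEA\ns2,1,EU\n"
ndjson = load(transform(extract(raw)))
```

Each stage is a pure function over records, which keeps the steps individually testable before they are wired into an orchestrated workflow.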

What You Bring:
  • Proven experience designing, building, and maintaining data platforms in cloud environments (AWS preferred).

  • Proven experience with cloud data warehouses (e.g. AWS Redshift, Snowflake, BigQuery).

  • Strong programming skills in Python, SQL, and/or Spark.

  • Experience deploying dbt or similar frameworks for analytics engineering and data transformation.

  • Experience with Git and version control best practices.
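As a rough illustration of the Python-plus-SQL skill set above, the sketch below uses an in-memory SQLite database as a stand-in for a cloud warehouse; the table, columns, and figures are invented for the example:

```python
import sqlite3

# In-memory SQLite stands in for a warehouse such as Redshift or Snowflake.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scans (site TEXT, modality TEXT, n INTEGER)")
conn.executemany(
    "INSERT INTO scans VALUES (?, ?, ?)",
    [("sydney", "cxr", 120), ("sydney", "ct", 40), ("pune", "cxr", 75)],
)

# A typical aggregation a data model might expose to BI tools.
rows = conn.execute(
    """
    SELECT site, SUM(n) AS total_scans
    FROM scans
    GROUP BY site
    ORDER BY total_scans DESC
    """
).fetchall()
```

In practice a transformation like this would live as a versioned dbt model rather than inline SQL, so it can be tested and promoted through CI/CD.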

Nice-to-have skills and characteristics:

  • Experience working in a regulated healthcare or medical device environment.

  • Demonstrated experience implementing Terraform or similar infrastructure-as-code tools.

  • Hands-on experience with CI/CD automation and DevOps workflows (e.g. GitHub Actions).

  • Experience using modern workflow orchestration tools such as Apache Airflow.

  • Familiarity with GDPR and international data compliance.

  • Exposure to BI and reporting software (ideally Tableau and/or Metabase).

Why join us?

🌍 Innovate for Global Good. Join us to pioneer world-first AI technology that transforms patient outcomes and helps build a healthier, fairer world.

🤝 Collaboration Across Continents. Work with brilliant minds from every corner of the globe in a culture built on trust, autonomy, and genuine teamwork.

🚀 Well-Funded & Global. Backed by world-class investors including Aware Super, Blackbird Ventures, Skip Capital, and Horizons Ventures, we’ve raised over US$240M to accelerate our global impact.

🌱 Scale Your Potential. Tap into yearly L&D budgets, mentoring, hackathons, and secondments—all supported by a transparent growth framework to advance your career.

💻 Flex for Life. Work when and where you do your best—with WFH options, flexible hours, and the autonomy to make an impact your way.

🙌 Support for Every Family Journey. From fertility to parenthood, loss, and even grandparenthood—we provide inclusive, thoughtful policies to support families in every stage.

What's next?

If you’re inspired by what we're up to, please apply now and we'll be in touch soon.

We are proud to be an Equal Opportunity Employer. Diversity’s not a buzzword here; it’s in our DNA. Diverse perspectives shape our culture and make our work better. We’re committed to building inclusive teams that represent a variety of backgrounds and skills. We look forward to hearing from you.

Top Skills

AWS
AWS Redshift
BigQuery
Python
Snowflake
Spark
SQL
