
Blenheim Chalcot

Data Engineer

Posted Yesterday
In-Office
Mumbai, Maharashtra
Mid level
Are you excited by the idea of helping scale a fast-growing tech business across London, Mumbai and Austin?
Do you want to build a career in data engineering while working with cutting-edge GenAI tools?
Are you passionate about building scalable data infrastructure that powers analytics, machine learning, and customer insights?
 
If so, we’d love to hear from you!

About Us:

Fospha is dedicated to building the world's most powerful measurement solution for online retail. For over a decade, we've helped teams make smarter decisions with full-funnel marketing insights, forecasting, and optimisation. With Fospha, every team moves faster and grows smarter.
Trusted by over 200 leading brands across three continents, including Huel, Oh Polly, and Represent, Fospha manages $2.5 billion in annual ad spend.
 
We're scaling fast across London, Mumbai, and Austin – and we're now looking for a Core Data Engineer to join our Core Data Team to support the growth of the business through robust, scalable data infrastructure.
 
Ready to make your mark? Let’s go! 🚀

The Role:

As a Core Data Engineer at Fospha, you’ll be at the heart of our data ecosystem. You’ll design, build, and optimise pipelines that move, transform, and scale data across multiple systems. Working at the intersection of analytics, engineering, and machine learning, you’ll ensure our data infrastructure grows as fast as our ambitions.
 
You’ll define data quality standards, shape the data roadmap, and support high-quality data access for our Data Science and Analytics teams. This is a high-impact role where your work will directly empower teams to move faster and deliver smarter insights.

Key Responsibilities:

  • Design, build, and optimise data pipelines using dbt, Python, and SQL
  • Implement and maintain scalable ELT/ETL frameworks that power analytics and ML systems
  • Collaborate with Data Science and Platform teams to ensure robust and reliable model deployment pipelines
  • Own the reliability, scalability, and observability of data workflows in production
  • Contribute to data architecture decisions and documentation, ensuring data integrity and consistency across sources
  • Design and maintain data models used by ML Engineers, Data Analysts, and Data Scientists
  • Drive automation, versioning, and quality validation in data delivery
  • Conduct exploratory data analysis to uncover trends and inform strategic decision-making
  • Identify opportunities for process improvement and promote a culture of continuous data excellence
  • Beyond your core responsibilities, we expect you to be excited by the opportunities GenAI brings and to find new ways of working using our GenAI hub!
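To give a flavour of the day-to-day work, here is a minimal, purely illustrative Python sketch of the kind of quality-validation step a pipeline might run before publishing data. Field names, rules, and thresholds are hypothetical examples, not Fospha's actual code or schema:

```python
def validate_rows(rows, required_fields=("order_id", "spend")):
    """Split rows into (valid, errors).

    A row is rejected if any required field is missing (None) or if
    its spend is negative. Illustrative only: real pipelines would
    typically express checks like these as dbt tests or a dedicated
    data-quality framework.
    """
    valid, errors = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            errors.append((row, f"missing fields: {missing}"))
        elif row["spend"] < 0:
            errors.append((row, "negative spend"))
        else:
            valid.append(row)
    return valid, errors


rows = [
    {"order_id": 1, "spend": 120.0},    # passes
    {"order_id": 2, "spend": -5.0},     # fails: negative spend
    {"order_id": None, "spend": 10.0},  # fails: missing order_id
]
valid, errors = validate_rows(rows)
print(len(valid), len(errors))  # → 1 2
```

In practice checks like these gate each stage of an ELT flow, so bad records are quarantined with a reason attached rather than silently propagated downstream.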

What are we looking for?

We hire for potential – you should apply if you:
  • Have proven experience building data pipelines in dbt, Python, and SQL
  • Demonstrate a strong grasp of data modelling, warehousing, and orchestration tools
  • Understand data architecture and ELT flows
  • Are familiar with ML Ops principles and how they integrate with engineering systems
  • Have experience with cloud-native data stacks (preferably AWS)
  • Take a pragmatic approach to balancing perfection with delivery
  • Understand agile methodologies and best practices
  • Know how to apply data quality frameworks and version control in data delivery

Our Values and Principles:

You will be able to demonstrate examples of our core principles:
  • Seek inclusion & diversity: We create an environment where everyone feels welcome, and people are encouraged to speak and be heard
  • Work Hard, Work Well, Work Together: We take responsibility for making things happen, independently and together; we help colleagues in need and close loops, making sure our work is complete and has lasting impact
  • Grow: We are proactive, curious and unafraid of failure
  • Customer at the heart: We care about the customer, feel their pain and love building product that solves their biggest problems
  • Candour with caring: We deliver candid feedback with kindness and receive it with gratitude

What we can offer you:

  • Competitive salary
  • Be exposed to the right mix of challenges and learning and development opportunities
  • 25 days of paid holiday + your birthday off!
  • Quarterly team socials

By submitting your CV you understand that we have a legitimate interest to use your personal data for the purposes of assessing your eligibility for this role.  This means that we may use your personal data to contact you to discuss your CV or arrange an interview, or transfer your CV to the hiring manager(s) of the role you have applied for.  You can ask us at any time to remove your CV from our database by emailing [email protected] – but please note that this means we will no longer consider you for the role you have applied for. You can review our privacy policy here.

Top Skills

AWS
dbt
Python
SQL


