
Yassir

Senior Data Engineer

Remote or Hybrid
8 Locations
Senior level
Yassir is the leading super app in the Maghreb region, set to change the way daily services are provided. It currently operates in 45 cities across Algeria, Morocco and Tunisia, with recent expansions into France, Canada and Sub-Saharan Africa. It is backed by ~$200M in funding from VCs in Silicon Valley, Europe and other parts of the world.
We offer on-demand services such as ride-hailing and last-mile delivery. Building on this infrastructure, we are now introducing financial services to help our users pay, save and borrow digitally.
We are helping usher the continent into the digital economy era. We're not just about serving people; we're about creating a marketplace that brings people what they need while infusing social values.

Responsibilities

  • Build a centralized data lake on GCP Data services by integrating diverse data sources throughout the enterprise.
  • Develop, maintain, and optimize Spark-powered batch and streaming data processing pipelines. Leverage GCP data services for complex data engineering tasks and ensure smooth integration with other platform components (a minimal pipeline sketch follows this list).
  • Design and implement data validation and quality checks to ensure data's accuracy, completeness, and consistency as it flows through the pipelines.
  • Work with the Data Science and Machine Learning teams to engage in advanced analytics.
  • Collaborate with cross-functional teams, including data analysts, business users, operational and marketing teams, to extract insights and value from data.
  • Collaborate with the product team to design, implement, and maintain the data models for analytical use cases.
  • Design, develop, and maintain data dashboards for various teams using Looker Studio.
  • Engage in technology exploration, research and development, and POCs, and conduct deep investigations and troubleshooting.
  • Design and manage ETL/ELT processes, ensuring data integrity, availability, and performance.
  • Troubleshoot data issues and conduct root cause analysis when reporting data is in question.
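
For illustration, here is a minimal sketch of the kind of batch pipeline this role covers: a PySpark job on Dataproc that reads raw events from Cloud Storage and loads an aggregate into BigQuery. It assumes the spark-bigquery connector is available on the cluster; the bucket, project, dataset and table names are placeholders.

    # Read raw order events from Cloud Storage, aggregate them, and load the
    # result into BigQuery. All names below are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

    # Raw JSON events landed by the ingestion layer
    orders = spark.read.json("gs://example-landing-zone/orders/dt=2024-01-01/")

    # Basic cleansing and aggregation
    daily_totals = (
        orders
        .filter(F.col("status") == "completed")
        .groupBy("city", F.to_date("created_at").alias("order_date"))
        .agg(F.count("*").alias("orders"), F.sum("amount").alias("gross_amount"))
    )

    # Write to BigQuery through the spark-bigquery connector, staging via GCS
    (
        daily_totals.write
        .format("bigquery")
        .option("table", "example-project.analytics.daily_order_totals")
        .option("temporaryGcsBucket", "example-staging-bucket")
        .mode("overwrite")
        .save()
    )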

Required Technical Skills

  • PySpark (batch and streaming)
  • GCP: Dataproc, Dataflow, Datastream, Dataplex, Pub/Sub, BigQuery and Cloud Storage
  • NoSQL (preferably MongoDB)
  • Programming languages: Scala/Python
  • Great Expectations, or a similar data quality (DQ) framework
  • Familiarity with workflow management tools such as Airflow, Prefect or Luigi (an orchestration sketch follows this list)
  • Understanding of Data Governance, Data Warehousing and Data Modelling
  • Good SQL knowledge
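
As a sketch of how these tools typically fit together, the snippet below shows a small Airflow DAG that submits the daily PySpark job to Dataproc and then runs a lightweight row-count check against the target BigQuery table. It assumes the apache-airflow-providers-google package is installed; project, region, cluster and table names are placeholders.

    # Orchestration sketch: run the daily Spark load on Dataproc, then verify
    # that the target BigQuery table is non-empty. All names are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryCheckOperator
    from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

    with DAG(
        dag_id="orders_daily_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load_orders = DataprocSubmitJobOperator(
            task_id="load_orders",
            project_id="example-project",
            region="europe-west1",
            job={
                "placement": {"cluster_name": "example-cluster"},
                "pyspark_job": {"main_python_file_uri": "gs://example-jobs/orders_daily_load.py"},
            },
        )

        check_row_count = BigQueryCheckOperator(
            task_id="check_row_count",
            sql="SELECT COUNT(*) > 0 FROM `example-project.analytics.daily_order_totals`",
            use_legacy_sql=False,
        )

        load_orders >> check_row_count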

Business

  • Able to communicate effectively and distill technical knowledge into digestible messages in a succinct, visual way.
  • Proactively identify and contribute to team development initiatives, and support junior members.

Good to have skills

  • Infrastructure-as-Code, preferably Terraform
  • Docker and Kubernetes
  • Looker
  • AI / ML engineering knowledge
  • Data lineage, and relevant tools such as Atlan
  • dbt

At Yassir, we believe in the power of diversity and the importance of an inclusive culture. So, if you're ready to bring your unique perspective and experiences to the table, then we're excited to listen.

Don't just apply for a job, come and be a part of our journey. Let's create a better tomorrow together.

We look forward to receiving your application!

Best of luck,
Your Yassir TA Team


Top Skills

Airflow
dbt
Docker
GCP
Kubernetes
Looker
Luigi
NoSQL
Prefect
PySpark
Python
Scala
SQL
Terraform
