About Fusemachines
Founded in 2013, Fusemachines is a global provider of enterprise AI products and services, on a mission to democratize AI. Leveraging its proprietary AI Studio and AI Engines, the company helps drive its clients' enterprise AI transformation, regardless of where they are in their digital AI journeys. With offices in North America, Asia, and Latin America, Fusemachines provides a suite of enterprise AI offerings and specialty services that allow organizations of any size to implement and scale AI. Fusemachines serves companies in industries such as retail, manufacturing, and government.
Fusemachines continues to actively pursue the mission of democratizing AI for the masses by providing high-quality AI education in underserved communities and helping organizations achieve their full potential with AI.
About the role:
This is a remote, full-time consulting position (contract) responsible for designing, building, and maintaining the infrastructure required for data integration, storage, processing, and analytics (BI, visualization, and advanced analytics) in the Media domain.
We are seeking a Senior Data Engineer with Snowflake expertise and proven experience implementing clean rooms and data enrichment, delivering data and analytics products using Agile methodology. The ideal candidate will possess strong technical, analytical, and interpersonal skills.
Qualification / Skill Set Requirement:
5+ years of real-world data engineering development experience in Snowflake (certifications preferred)
Proven experience as a Snowflake Developer, with a strong understanding of Snowflake architecture and concepts.
Strong programming skills in SQL, with proficiency in writing efficient and optimized code for data integration, storage, processing, and manipulation.
Proficient in Snowflake features such as clean rooms, Snowpipe, stages, stored procedures, views, materialized views, tasks, and streams.
Robust understanding of data partitioning and other optimization techniques in Snowflake.
Knowledge of data security measures in Snowflake, including role-based access control (RBAC) and data encryption, and how those are applied to clean rooms.
Highly skilled in one or more languages such as Python or Scala, with proficiency in writing efficient, optimized code for data integration, storage, processing, and manipulation.
Proven experience in Azure.
Strong knowledge of SDLC tools and technologies, including project management software (Jira or similar), source code management (GitHub or similar), and CI/CD systems (GitHub Actions, AWS CodeBuild, or similar).
Skilled in data integration from different sources such as APIs, databases, flat files, and event streams.
Good understanding of data modeling and database design principles, with the ability to design and implement efficient database schemas that meet the requirements of the data architecture and support data solutions.
Strong experience in working with ELT and ETL tools and being able to develop custom integration solutions as needed.
Good understanding of data quality and governance, including implementation of data quality checks and monitoring processes to ensure that data is accurate, complete, and consistent.
Good problem-solving skills: able to troubleshoot data processing pipelines and identify performance bottlenecks and other issues.
Possesses strong leadership skills with a willingness to lead, create ideas, and be assertive.
Responsibilities:
Design and architect Snowflake data warehouses and databases to meet the data storage and processing requirements.
Design and architect Snowflake clean rooms to meet the data security and sharing requirements.
Create and optimize ELT (Extract, Load, Transform) processes to move, transform, and load data from various sources into Snowflake.
Identify and resolve query performance bottlenecks and optimize SQL queries to ensure efficient data retrieval and processing.
Integrate Snowflake with other data systems and tools, such as data lakes, analytics platforms, and BI tools.
Collaborate with Product, Engineering, Data Scientists, and Analysts to understand data requirements and develop data solutions, including reusable components.
Evaluate and implement new technologies and tools to improve data integration, data processing, and analysis.
Evaluate, design, and implement data governance solutions: cataloging, lineage, quality, and data governance frameworks suitable for a modern analytics solution, following industry-standard best practices and patterns.
Assess best practices and design schemas that match business needs for delivering a modern analytics solution (descriptive, diagnostic, predictive, prescriptive).
Be an active member of our Agile team, participating in all ceremonies and continuous improvement activities.
Equal Opportunity Employer: Fusemachines does not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, age, genetic information, disability, protected veteran status, or any other legally protected group status.