The Data Engineer is responsible for building scalable data models using DBT, managing Snowflake data warehouses, developing data pipelines, and ensuring data quality and governance. Strong SQL skills and experience with version control and automation tools are required to partner with stakeholders and enhance data processes.
- Strong SQL & DBT Expertise
  - Experience building and maintaining scalable data models in DBT.
  - Proficient in modular SQL, Jinja templating, testing strategies, and DBT best practices.
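As an illustration of the modular SQL and Jinja practices above, a minimal dbt staging model might look like the sketch below; the model, source, and column names are hypothetical assumptions, not part of the role description:

```sql
-- models/staging/stg_orders.sql (hypothetical model and source names)
-- A modular staging model: reads from a declared source and applies
-- light typing, with materialization set via a Jinja config block.
{{ config(materialized='view') }}

select
    order_id,
    customer_id,
    cast(order_total as numeric(12, 2)) as order_total,
    order_date
from {{ source('shop', 'raw_orders') }}
where order_id is not null
```

Downstream models would reference this with `{{ ref('stg_orders') }}`, which is what makes the SQL composable and testable in isolation.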
- Data Warehouse Proficiency
  - Hands-on experience with Snowflake, including:
    - Dimensional and data vault modeling (star/snowflake schemas)
    - Performance optimization and query tuning
    - Role-based access and security management
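The role-based access point above can be sketched in Snowflake SQL; the role, warehouse, database, schema, and user names here are hypothetical:

```sql
-- A minimal role-based access sketch (hypothetical object names).
create role if not exists analyst_role;

-- Grant compute and read access scoped to one schema.
grant usage on warehouse analytics_wh to role analyst_role;
grant usage on database analytics to role analyst_role;
grant usage on schema analytics.marts to role analyst_role;
grant select on all tables in schema analytics.marts to role analyst_role;

-- Assign the role to a user.
grant role analyst_role to user jdoe;
```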
- Data Pipeline & Integration Tools
  - Experience with Kafka (or similar event streaming tools) for ingesting real-time data.
  - Familiarity with SnapLogic for ETL/ELT workflow design, orchestration, and monitoring.
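As a sketch of the real-time ingestion work above, the function below validates and reshapes one raw event (such as a Kafka message value) before it is staged for loading into the warehouse. The consumer wiring itself (e.g. `confluent_kafka`) is assumed and elided, and the field names are hypothetical:

```python
import json
from datetime import datetime, timezone

def transform_event(raw: bytes) -> dict:
    """Validate and reshape one raw event (e.g. a Kafka message value).
    Field names here are illustrative assumptions."""
    event = json.loads(raw)
    # Reject malformed events early rather than loading bad rows.
    for field in ("order_id", "amount", "ts"):
        if field not in event:
            raise ValueError(f"missing field: {field}")
    return {
        "order_id": str(event["order_id"]),
        "amount": round(float(event["amount"]), 2),
        # Normalize the epoch-seconds timestamp to UTC ISO-8601.
        "loaded_at": datetime.fromtimestamp(
            event["ts"], tz=timezone.utc
        ).isoformat(),
    }

# In a real pipeline this would run inside a consumer loop
# (e.g. confluent_kafka.Consumer.poll) rather than be called directly.
```

Keeping the transform as a pure function like this makes it unit-testable without a running broker.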
- Version Control & Automation
  - Proficient in Git and GitHub for code versioning and collaboration.
  - Experience with GitHub Actions or other CI/CD tools to automate DBT model testing, deployment, and documentation updates.
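A minimal sketch of the CI/CD automation described above, as a GitHub Actions workflow that builds and tests dbt models on each pull request; the adapter, target, and secret names are assumptions:

```yaml
# .github/workflows/dbt_ci.yml (hypothetical workflow)
name: dbt CI
on:
  pull_request:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install dbt-snowflake
      # dbt build runs models and their tests together.
      - run: dbt deps && dbt build --target ci
        env:
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
```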
- Data Quality & Governance
  - Strong understanding of data validation, testing (e.g., dbt tests), and lineage tracking.
  - Emphasis on maintaining data trust across pipelines and models.
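The dbt testing mentioned above is typically declared alongside the models in a schema file; a minimal example, with hypothetical model and column names:

```yaml
# models/staging/schema.yml (hypothetical names)
version: 2
models:
  - name: stg_orders
    columns:
      - name: order_id
        tests:
          - not_null
          - unique
      - name: customer_id
        tests:
          # Every order must point at a known customer.
          - relationships:
              to: ref('stg_customers')
              field: customer_id
```

`dbt test` compiles each of these into a SQL query that fails if any violating rows exist, which is what keeps trust in the pipeline enforceable rather than aspirational.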
- Stakeholder Management
  - Partner with business and technical stakeholders to define data needs and deliver insights.
  - Ability to explain complex data concepts in clear, non-technical terms.
- Documentation & Communication
  - Maintain clear documentation for models, metrics, and data transformations (using DBT docs or similar).
  - Strong verbal and written communication skills; able to work cross-functionally across teams.
- Problem-Solving & Ownership
  - Proactive in identifying and resolving data gaps or issues.
  - Self-starter with a continuous improvement mindset and a focus on delivering business value through data.
- Infrastructure as Code (IaC)
  - Deploy scalable, secure, and high-performing Snowflake environments, aligned with established data governance and security policies, using Terraform and other automation scripts.
  - Automate infrastructure provisioning, testing, and deployment for seamless operations.
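The Terraform responsibilities above might look like the sketch below, assuming the community Snowflake provider; resource and object names are hypothetical:

```hcl
# Hypothetical names; assumes the Snowflake-Labs Terraform provider.
terraform {
  required_providers {
    snowflake = {
      source = "Snowflake-Labs/snowflake"
    }
  }
}

# A small warehouse that suspends itself when idle.
resource "snowflake_warehouse" "analytics_wh" {
  name           = "ANALYTICS_WH"
  warehouse_size = "XSMALL"
  auto_suspend   = 60
}

resource "snowflake_database" "analytics" {
  name = "ANALYTICS"
}
```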
Top Skills
Dbt
Git
GitHub Actions
Kafka
SnapLogic
Snowflake
SQL
Terraform