The role involves ensuring data quality through testing ETL workflows, developing test plans, and collaborating with technical teams to maintain data integrity.
Are you obsessed with data, partner success, taking action, and changing the game? If you have a whole lot of hustle and a touch of nerd, come work with Pattern! We want you to use your skills to push one of the fastest-growing companies headquartered in the US to the top of the list.
Pattern accelerates brands on global ecommerce marketplaces, leveraging proprietary technology and AI. Utilizing more than 46 trillion data points and sophisticated machine learning and AI models, Pattern optimizes and automates all levers of ecommerce growth for global brands, including advertising, content management, logistics and fulfillment, pricing, forecasting, and customer service. Hundreds of global brands depend on Pattern’s ecommerce acceleration platform every day to drive profitable revenue growth across 60+ global marketplaces, including Amazon, Walmart.com, Target.com, eBay, Tmall, TikTok Shop, JD, and Mercado Libre. To learn more, visit pattern.com or email [email protected].
Pattern has been named one of the fastest-growing tech companies headquartered in North America by Deloitte and one of the best-led companies by Inc. We place employee experience at the center of our business model and have been recognized as one of Newsweek’s Global Most Loved Workplaces®.
We are seeking a highly motivated and detail-oriented Data Quality Engineer to join our growing team. You’ll work closely with data engineers, data scientists, and business teams to ensure the accuracy, completeness, and reliability of our data through rigorous testing of ETL workflows and data pipelines, and by building robust data quality checks. You will play a critical role in maintaining data integrity and supporting data-driven decision-making across the organization.
Roles and Responsibilities
- Analyze business and technical requirements to design, develop, and execute comprehensive test plans for ETL pipelines and data transformations.
- Perform data validation, reconciliation, and integrity checks across various data sources and target systems.
- Build and automate data quality checks using SQL and/or Python scripting (see the sketch following this list).
- Identify, document, and track data quality issues, anomalies, and defects.
- Collaborate with data engineers, developers, QA, and business stakeholders to understand data requirements and ensure data quality standards are met.
- Define data quality KPIs and implement continuous monitoring frameworks.
- Participate in data model reviews and provide input on data quality considerations.
- Perform root cause analysis for data discrepancies and work with teams to drive resolution.
- Ensure alignment with data governance policies, standards, and best practices.
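To ground the automation point above, here is a minimal, illustrative Python sketch of the kind of SQL-based check you might build; the connection handling, table and column names, and thresholds are assumptions made for the example, not part of the role description.

```python
# Minimal sketch of an automated data quality check (illustrative only).
# Assumes a DB-API-style connection to the warehouse (e.g., snowflake-connector
# or psycopg2) is created elsewhere; table/column names and thresholds are hypothetical.

def check_table_quality(conn, table: str, key_column: str, min_rows: int = 1) -> dict:
    """Run basic row-count and null-rate checks and return the results."""
    cur = conn.cursor()

    # Row-count check: the table should not be empty (or below an expected floor).
    # Note: identifiers are interpolated directly here only because this is a sketch.
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    row_count = cur.fetchone()[0]

    # Null check: the key column should never be NULL.
    cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL")
    null_count = cur.fetchone()[0]

    results = {
        "row_count": row_count,
        "null_key_count": null_count,
        "row_count_ok": row_count >= min_rows,
        "no_null_keys": null_count == 0,
    }
    if not (results["row_count_ok"] and results["no_null_keys"]):
        # In practice this would feed a monitoring/alerting framework;
        # raising keeps the sketch simple.
        raise AssertionError(f"Data quality check failed for {table}: {results}")
    return results
```

A check like this could be wired into an Airflow task or run after each load so that failures surface as pipeline alerts rather than downstream surprises.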
Skills and Qualifications
- Experience with data quality frameworks (e.g., Soda, Great Expectations, Deequ, Monte Carlo, dbt tests); an illustrative example using one of these follows this list.
- Experience with modern cloud data ecosystems (AWS, Snowflake, Apache Spark, Redshift).
- Advanced knowledge of SQL, including stored procedures, triggers, analytic/window functions, and performance tuning.
- Familiarity with data pipeline tools like Airflow, dbt, or Informatica.
- Experience integrating data validation processes into CI/CD pipelines using tools like GitHub Actions, Jenkins, or similar; a pytest-style check suited to such a pipeline is sketched after this list.
- Background in big data platforms, data lakes, or non-relational databases.
- Understanding of data lineage and master data management (MDM) concepts.
- Experience with Agile/Scrum development methodologies.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 8+ years of experience as a Data Quality Engineer, ETL Tester, or a similar role.
- Strong understanding of ETL concepts, data warehousing principles, and relational database design.
- Proficiency in SQL for complex querying, data profiling, and validation tasks.
- Familiarity with data quality tools and testing methodologies.
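As one concrete (and purely illustrative) example of the frameworks named above, the sketch below uses Great Expectations' classic pandas-based API to declare a few expectations on a hypothetical orders extract; newer releases expose a different fluent API, so treat the exact calls as version-dependent rather than canonical.

```python
# Illustrative use of Great Expectations' legacy pandas API (details vary by version).
import pandas as pd
import great_expectations as ge

# Hypothetical extract pulled from a staging table.
df = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 25.5, 7.25]})
ge_df = ge.from_pandas(df)

# Declare expectations: keys are non-null and unique, amounts fall in a sane range.
checks = [
    ge_df.expect_column_values_to_not_be_null("order_id"),
    ge_df.expect_column_values_to_be_unique("order_id"),
    ge_df.expect_column_values_to_be_between("amount", min_value=0, max_value=100000),
]

# Each result carries a success flag indicating whether the expectation held.
failed = [c for c in checks if not c.success]
if failed:
    raise AssertionError(f"{len(failed)} data quality expectation(s) failed")
```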
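And for the CI/CD point, a validation step can be as simple as a pytest reconciliation test invoked from a GitHub Actions or Jenkins job; the source_conn/target_conn fixtures and table names below are hypothetical placeholders for this sketch.

```python
# Illustrative pytest reconciliation check; "source_conn" and "target_conn" would be
# fixtures defined in conftest.py, and the table names are hypothetical.
import pytest

TABLES = ["orders", "customers"]

def _row_count(conn, table: str) -> int:
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

@pytest.mark.parametrize("table", TABLES)
def test_row_counts_reconcile(source_conn, target_conn, table):
    # After a load, the target should contain every source row (no silent drops).
    assert _row_count(target_conn, table) == _row_count(source_conn, table)
```

Running pytest as a gate in the deployment workflow then blocks promotion whenever source and target counts diverge.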
Top Skills: Airflow, Spark, AWS, dbt, Informatica, Python, Redshift, Snowflake, SQL


