The company is hiring a Data Warehouse Quality Engineer to support and enhance its data warehousing systems and data marts. This role involves collaborating with data teams to test, validate, and maintain ETL pipelines, data transformations, and overall data accuracy throughout the development lifecycle.
Highlights
Develop test plans, test cases, and technical scripts based on requirements from project teams and business users
Participate in Agile/Scrum ceremonies and provide status updates on quality engineering activities
Test ETL pipelines, DBT transformations, and Snowflake data loads to ensure data integrity
Write Snowflake SQL queries to validate source-to-target data mappings from various sources including SQL Server, files, and AWS S3
Analyze and profile data to detect quality issues and assess ETL readiness
Create and maintain automated data validation and regression testing scripts using Python
Document and communicate test results, defects, and resolutions to stakeholders
Requires a bachelor's degree plus 3 years of professional experience or equivalent qualifications
Must have 3+ years in IT software development focusing on systems analysis, data warehousing, ETL, or quality engineering
Proficiency with cloud data warehouse technologies such as AWS, DBT, and Snowflake, along with strong SQL skills in SQL Server and Snowflake
The Data Warehouse Quality Engineer will support the ongoing development and maintenance of our data warehousing systems and data marts. Through your efforts, you’ll be making valuable contributions to the continuous transformation and improvement of our information systems.
This role works with the rest of the Data Warehouse team throughout the entire data development lifecycle, from data profiling and ETL and data model design through development, testing, and support. We’re looking for someone who has experience testing data warehouse systems and ETL processes, is team-oriented, quality-focused, and a strong communicator, and has the skills needed to ensure the successful growth and support of our data warehousing systems.
Responsibilities
Collaborate with project teams and business users to gather requirements and develop effective test plans, test cases, and technical scripts
Participate in Agile/Scrum ceremonies and provide clear QE status updates
Test ETL pipelines, DBT transformations, and Snowflake data loads to ensure data accuracy and integrity
Validate source-to-target mappings by writing Snowflake SQL queries to reconcile data from different sources like SQL Server, files, and AWS S3
Analyze and profile data to identify quality issues and assess ETL readiness
Develop and maintain automated Python scripts for data validation and regression testing (a minimal sketch follows this list)
Document test results, defects, and resolutions, and communicate findings to stakeholders
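To illustrate the kind of automation this role involves, here is a minimal sketch of a source-to-target reconciliation check written in Python with pytest and the Snowflake Python connector. The specifics are hypothetical: the table pairs, environment-variable names, and warehouse are placeholder assumptions, and the reconciliation rule is a simple row-count comparison rather than any prescribed project standard.

# Minimal sketch of an automated source-to-target reconciliation test.
# Assumes pytest and the snowflake-connector-python package are installed;
# all table names, env vars, and the QA_WH warehouse are hypothetical.
import os

import pytest
import snowflake.connector

# Hypothetical (source staging table, target mart table) pairs to reconcile.
TABLE_PAIRS = [
    ("STAGING.PUBLIC.ORDERS_RAW", "MART.PUBLIC.FCT_ORDERS"),
    ("STAGING.PUBLIC.CUSTOMERS_RAW", "MART.PUBLIC.DIM_CUSTOMERS"),
]

@pytest.fixture(scope="session")
def snowflake_conn():
    """Open one Snowflake connection for the whole test session."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ.get("SNOWFLAKE_WAREHOUSE", "QA_WH"),
    )
    yield conn
    conn.close()

def row_count(conn, table: str) -> int:
    """Return the row count of a fully qualified table."""
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        cur.close()

@pytest.mark.parametrize("source_table,target_table", TABLE_PAIRS)
def test_row_counts_match(snowflake_conn, source_table, target_table):
    """Source and target row counts should reconcile exactly."""
    source_rows = row_count(snowflake_conn, source_table)
    target_rows = row_count(snowflake_conn, target_table)
    assert source_rows == target_rows, (
        f"Row count mismatch: {source_table}={source_rows}, "
        f"{target_table}={target_rows}"
    )

In practice these checks usually go beyond row counts, to column-level aggregates or MINUS/EXCEPT comparisons against the source extracts, but the overall shape (parameterized pytest cases driven by a list of source and target objects) stays the same.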
Minimum Qualifications
A bachelor's degree and 3 years of professional work experience (or a master's degree, or equivalent experience) are required.
Additional Qualifications
3+ years of experience in IT software development (in areas such as systems analysis, data warehousing, ETL, and quality engineering)
Knowledge of data warehousing concepts
Experience with cloud data warehouse technologies such as AWS, DBT, and Snowflake
Strong SQL skills for data validation and analysis (experience with SQL Server and Snowflake)
Familiarity with DBT (Data Build Tool) for data transformation validation
Experience working with relational and dimensional data models
Must be a self-starter who is comfortable working in a fast-paced, flexible environment and takes the initiative to learn new tools and concepts quickly