Data Quality Management: Roles, Processes, Tools (2023)


  • What is data quality? Data quality dimensions
  • Data quality management: how to implement and how it works
    • 1. Defining the impact of poor data on performance via data quality assessment
    • 2. Defining data quality rules and metrics
    • 3. Defining data standards, metadata management standards, data validation rules
    • 4. Implementing data quality and data management standards
    • 5. Data monitoring and remediation
  • Data quality team: roles and responsibilities
    • Data quality analyst: a multitasker
  • Data quality tools
    • IBM InfoSphere Information Server for Data Quality: end-to-end for ongoing data monitoring and cleansing
    • Informatica Data Quality: automating data quality management for machine learning and AI
    • Trillium DQ: flexible and scalable data quality platform for various use cases
  • Final word

Reading time: 13 minutes

Data is the lifeblood of an organization and its commercial success. You've probably heard these words from a conference speaker or seen similar headlines online. And it's true: you need accurate information to complete a task or deliver a service, whether that's planning an optimal route to deliver sushi to numerous customers or sending customized newsletters. In the first case, you need accurate order details. In the second, you must segment customers based on their activity and interests. To do so, you need to collect and analyze the appropriate data, and that takes time and effort.

Flaws in data can lead to very different outcomes. For instance, Skyscanner suggested that customer James Lloyd take a "long wait" layover of 413,786 hours (about 47 years) in Bangkok on the way from Christchurch, New Zealand, to London. The story went viral thanks to the great sense of humor Skyscanner's social media manager Jen showed when replying to James's question about what he could do during those years:


A witty reply to James Lloyd’s question on Facebook. Source: Skyscanner Facebook

Using erroneous data may lead to tragic events, especially in the medical field. David Loshin, in The Practitioner's Guide to Data Quality Improvement, mentions the 2003 case of Jesica Santillan, who died after a botched heart-lung transplant. A surgeon used organs from a donor with an incompatible blood type. Inaccurate blood typing information caused surgery complications that resulted in death.

Low-quality data can also impede and slow down the integration of business intelligence and ML-powered predictive analytics. US company executives who participated in the Data trust pulse survey by PricewaterhouseCoopers noted that unreliable data is one of the obstacles to monetizing data. "Much of a company's historical data, acquired haphazardly, may lack the detail and demonstrable accuracy needed for use with AI and other advanced automation," the survey said.

Since operational efficiency and, sometimes, people's lives depend on whether a company uses reliable or unreliable information, it must devise a strategy for keeping data quality under control.

We've mentioned the concept of data quality several times, so let's explore it in full. In this article, we'll discuss which specialists take part in data quality activities and how they might contribute to the initiative. We'll also explore techniques and tools for improving the health of data and maintaining it at a level that allows a company to achieve its business goals.

What is data quality? Data quality dimensions

Data quality shows the extent to which data meets users’ standards of excellence or expectations. High-quality data can be easily processed and interpreted for various purposes, such as planning, reporting, decision making, or carrying out operating activities.

But what kind of data can we consider high quality? Views on what features make good quality data can differ not only across industries but also at the company level. Experts also suggest different combinations of data aspects and features to assess for quality evaluation. These measurable categories are called data quality dimensions.

In 1996, professors Richard Wang and Diane Strong described their conceptual framework for data quality in Beyond Accuracy: What Data Quality Means to Data Consumers. The authors consider four data quality categories: intrinsic, contextual, representational, and accessibility. Each category includes several dimensions, 15 in total.

Data quality categories and their dimensions. Source: Beyond Accuracy: What Data Quality Means to Data Consumers


Data Quality Assessment Framework (DQAF) by the International Monetary Fund (IMF) considers five data quality dimensions:

  • Integrity – statistics are collected, processed, and disseminated based on the principle of objectivity.
  • Methodological soundness – statistics are created using internationally accepted guidelines, standards, or good practices.
  • Accuracy and reliability – source data used to compile statistics are timely, obtained from comprehensive data collection programs that consider country-specific conditions.
  • Serviceability – statistics are consistent within the dataset, over time, and with major datasets, as well as revisioned on a regular basis. Periodicity and timeliness of statistics follow internationally accepted dissemination standards.
  • Accessibility – data and metadata are presented in an understandable way, statistics are up-to-date and easily available. Users can get a timely and knowledgeable assistance.

The DQAF includes seven dataset-specific frameworks for evaluating and maintaining unified quality standards (national accounts, price indexes, monetary statistics, government finance statistics, and other types of statistics).

To sum up expert opinions, the most commonly mentioned quality dimensions are:


Critical data quality dimensions and features of data that meet their criteria

Data quality expert Laura Sebastian-Coleman, in Measuring Data Quality for Ongoing Improvement, notes that data quality dimensions function just like length, width, and height function to express the size of a physical object. “A set of data quality dimensions can be used to define expectations (the standards against which to measure) for the quality of the desired dataset, as well as to measure the condition of an existing dataset,” explains Sebastian-Coleman.

Data quality dimensions also allow for monitoring how the quality of data stored in various systems and/or across departments changes over time. These attributes are among the building blocks of any data quality initiative. Once you know which data quality dimensions you will evaluate your datasets against, you can define metrics. For instance, the number or percentage of duplicate records indicates the uniqueness of data.
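As an illustration, a uniqueness metric of this kind can be sketched in a few lines of Python; the records and the helper function below are hypothetical, not taken from any specific tool:

```python
# Illustrative sketch: uniqueness measured as the share of distinct
# records in a dataset (1.0 means no duplicates at all).

def uniqueness(records):
    """Return the fraction of records that are distinct."""
    if not records:
        return 1.0  # an empty dataset trivially has no duplicates
    return len(set(records)) / len(records)

# Hypothetical customer records; the third row duplicates the first.
customers = [
    ("Ann Lee", "ann@example.com"),
    ("Bob Ray", "bob@example.com"),
    ("Ann Lee", "ann@example.com"),
]

score = uniqueness(customers)
print(f"uniqueness: {score:.2f}")  # 2 distinct out of 3 records -> 0.67
```

A score like this, tracked over time per system or department, is exactly the kind of signal the dimensions are meant to make measurable.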

Now let’s discuss the roadmap to implementing a data quality program in an organization.

Data quality management: how to implement and how it works

Data quality management (DQM) is a set of practices aimed at improving and maintaining the quality of data across a company's business units. Data management specialist David Loshin underlines the continuous nature of DQM. He notes that the process follows a "virtuous cycle" of ongoing observation, analysis, and improvement of information. The purpose of this cycle is to become proactive in controlling the health of data instead of fixing flaws once they are identified and dealing with their consequences.


The virtuous cycle of data quality management. Source: Business Intelligence

Let's explore each of these five stages and the processes that take place during them.

1. Defining the impact of poor data on performance via data quality assessment

First of all, the data quality analyst reviews data to find potential issues that cause delays in specific operations and, consequently, decrease revenue and squeeze margins. This qualitative data review gives a basic understanding of which data flaws have the biggest impact on business processes. Then the specialist outlines the data quality requirements and the critical data quality dimensions the company will use.

Next, the team starts data quality assessment using top-down and bottom-up approaches. The top-down approach allows for learning how employees create and use data, what data-related problems they face along the way, and which of them are the most critical. Data assessment also helps define the operations that are most affected by poor-quality data.

The data quality analyst may examine how data is organized in databases, interview users personally, or organize surveys in which users can document issues.

The bottom-up approach uses statistical and data analysis tools and techniques, for instance, data profiling. Data profiling employs various statistical and analytical algorithms and business rules to explore the content of datasets and the characteristics of their data elements. There are three types of data profiling:


  • Structure discovery (structure analysis) is used to learn whether data is consistent and formatted correctly. Pattern matching is one of the ways to explore data record structure. Analysts can also check statistics such as minimum and maximum values, medians, means, or standard deviations to learn about the validity of data.
  • Content discovery entails the examination of individual data records in a database to discover null or incorrectly formatted values.
  • Relationship discovery is about understanding the interconnections between datasets, data records, database fields, or cells. Relationship discovery starts with a metadata review. This analysis allows for spotting and eliminating issues such as duplicates, which may occur in non-aligned datasets.
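The first two profiling types above can be sketched with basic statistics and pattern matching. In the sketch below, the column names, sample values, and the phone-number pattern are assumptions for illustration only:

```python
# Minimal profiling sketch over two hypothetical columns:
# structure discovery via pattern matching, content discovery via
# null counts and basic statistics (min, max, mean).
import re
from statistics import mean

phone_pattern = re.compile(r"^\d{3}-\d{4}$")  # assumed expected format

phones = ["555-0101", "555-0102", None, "5550103"]  # one null, one malformed
ages = [34, 29, 41, 27]

profile = {
    "nulls": sum(1 for v in phones if v is None),
    "pattern_violations": sum(
        1 for v in phones if v is not None and not phone_pattern.match(v)
    ),
    "age_min": min(ages),
    "age_max": max(ages),
    "age_mean": mean(ages),
}
print(profile)
```

Real profiling tools automate this over whole schemas, but the underlying checks (nulls, pattern conformance, summary statistics) are of this shape.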

Analysts may then discuss the data issues they found with domain experts.

2. Defining data quality rules and metrics

First, data quality analysts compile data assessment results focusing on data elements that seem critical based on the specific user’s needs. “The results of the empirical analysis will provide some types of measures that can be employed to assess the level of data quality within a particular business context,” notes David Loshin in The Practitioner’s Guide to Data Quality Improvement.

Then DQ analysts correlate business impacts with data flaws via defined business rules. That way, specialists define the metrics they will use to ensure data is accurate enough for operational or analytical needs. They consult with data users on acceptability thresholds for metric scores. Data with metric scores lower than the acceptability levels doesn't meet user expectations and must be improved to avoid a negative impact on operations. Integrating acceptability thresholds with measurement methods allows for the framing of data quality metrics.
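A metric framed this way pairs a measurement method with an acceptability threshold. The following sketch assumes a completeness metric and a 0.95 threshold purely for illustration:

```python
# Sketch of a framed data quality metric: a measurement method plus
# an acceptability threshold agreed with data users (all values here
# are hypothetical).

def completeness(values):
    """Share of non-missing values in a field."""
    return sum(1 for v in values if v not in (None, "")) / len(values)

THRESHOLD = 0.95  # assumed acceptability level for this field

emails = ["a@x.com", "b@x.com", "", "d@x.com", "e@x.com"]
score = completeness(emails)
acceptable = score >= THRESHOLD
print(score, acceptable)  # 0.8 -> below threshold, flag for remediation
```

When a score falls below the agreed threshold, as here, the field is flagged so it can be improved before it affects operations.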

3. Defining data standards, metadata management standards, data validation rules

Once the impact of poor data is identified, the data is examined, and data quality rules and metrics are clear, it's time to introduce quality improvement techniques and activities. The goal of this stage is to document unified rules for data and metadata usage across the data lifecycle.

Data standards. Data quality standards are agreements on data entry, representation, formatting, and exchange used across the organization.

Metadata management standards. Policies and rules about metadata creation and maintenance are the baselines for successful data analytics initiatives and data governance. Metadata management standards can be grouped into three categories:

  • Business – the use of business terms and definitions in different business contexts, the use of acronyms, data security levels, and privacy settings.
  • Technical – structure, format, and rules for storing data (e.g., format and size for indexes, tables, and columns in databases; data models).
  • Operational – rules for using metadata that describes events and objects during the ETL process (e.g., ETL load date, update date, confidence level indicator).

Please note that some practitioners consider operational metadata a subtype of technical metadata.

Data validity rules. Data validity rules are used to check data for inconsistencies. Developers write data validity rules and integrate them into applications so that tools can identify mistakes during data entry, for instance. Data validity rules enable proactive data quality management.
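A minimal sketch of such entry-time validation follows; the rule set, the email pattern, and the ZIP pattern are illustrative assumptions, not a production-grade implementation:

```python
# Hypothetical validity rules checked at data entry time: each field
# maps to a predicate, and a record is rejected if any rule fails.
import re

RULES = {
    "email": lambda v: bool(re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", v)),
    "zip":   lambda v: bool(re.match(r"^\d{5}$", v)),
}

def validate(record):
    """Return the list of fields that violate their validity rule."""
    return [field for field, rule in RULES.items()
            if field in record and not rule(record[field])]

print(validate({"email": "jane@example.com", "zip": "90210"}))  # []
print(validate({"email": "not-an-email", "zip": "ABCDE"}))      # ['email', 'zip']
```

Catching such mistakes at entry, rather than discovering them later in a warehouse, is what makes validity rules a proactive control.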

It's also crucial to decide how to track data problems. A data quality issue tracking log provides information about flaws, their status and criticality, responsible employees, and report notes. George Firican, director of data governance and BI at the University of British Columbia, has written an informative yet concise post in which he advises on the attributes to include in the log.
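A tracking log entry of this kind might be modeled as follows; the attribute set is an assumed minimal subset, not Firican's full list:

```python
# Sketch of a data quality issue log entry; the attributes shown
# (ID, description, criticality, assignee, status, notes) are an
# assumed minimal set for illustration.
from dataclasses import dataclass, field

@dataclass
class DQIssue:
    issue_id: str
    description: str
    criticality: str          # e.g., "high" | "medium" | "low"
    assignee: str
    status: str = "open"      # open -> in_progress -> resolved
    notes: list = field(default_factory=list)

issue = DQIssue(
    issue_id="DQ-042",
    description="Duplicate customer records in CRM export",
    criticality="high",
    assignee="data.steward@example.com",
)
issue.notes.append("Root cause: missing unique key in import job")
issue.status = "resolved"
print(issue.issue_id, issue.status)
```

Whether kept in a spreadsheet or a ticketing system, the point is the same: every flaw gets an owner, a severity, and an auditable history.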

Another aspect to consider and approve is how to improve data. We'll talk about that in the following sections.

4. Implementing data quality and data management standards

During this step, the data quality team implements the data quality standards and processes it documented earlier to maintain solid data quality across the lifecycle.

The team may organize meetings to explain the new data management rules to employees and/or introduce a business glossary – a document with common terminology approved by stakeholders and managers.

Also, data quality team members can train employees on how to use a data quality tool to perform remediation, whether it's a custom or an off-the-shelf solution.

5. Data monitoring and remediation

Data cleaning (remediation, preparation) entails detecting erroneous or incomplete records in data and removing or modifying them. There are many ways of performing data preparation: manually, automatically with data quality tools, as batch processing through scripting, via data migration, or using a combination of these methods.

Data remediation includes numerous activities, such as:

  • Root cause analysis – identifying the source of erroneous data, reasons for the error to occur, isolating factors that contribute to the issue, and finding the solution.
  • Parsing and standardization – reviewing records in database tables against defined patterns, grammar, and representations to identify erroneous data values or values in the wrong fields and to reformat them. For example, a data quality analyst may standardize values from different measurement systems (lbs and kg) or geographic abbreviations (CA and US-CA).
  • Matching – identifying the same or similar entities in a dataset and merging them into one. Data matching is related to identity resolution and record linkage. The technique can be applied when joining datasets and when data from multiple sources is integrated into one destination (the ETL process). Identity resolution is used in datasets containing records about individuals to create a single view of the customer. Record linkage deals with records that may or may not refer to the same entity (e.g., a database key, Social Security number, or URL) and that may differ due to record shape, storage location, or curator style and preference.
  • Enhancement – adding extra data from internal and external sources.
  • Monitoring – evaluating data in given intervals to ensure it can well serve its purposes.
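Two of these activities, standardization and matching, can be sketched together. The unit conversions, abbreviation mapping, and records below are hypothetical:

```python
# Remediation sketch: standardize units and abbreviations first,
# then match duplicates on the standardized form. Mappings and
# records are hypothetical.
UNIT_TO_KG = {"lbs": 0.453592, "kg": 1.0}
REGION = {"CA": "US-CA"}  # assumed abbreviation mapping

def standardize(rec):
    value, unit = rec["weight"]
    return {
        "name": rec["name"].strip().lower(),
        "weight_kg": round(value * UNIT_TO_KG[unit], 2),
        "region": REGION.get(rec["region"], rec["region"]),
    }

# The same person recorded twice with different units and abbreviations.
records = [
    {"name": "Ann Lee ", "weight": (150, "lbs"), "region": "CA"},
    {"name": "ann lee",  "weight": (68.04, "kg"), "region": "US-CA"},
]

std = [standardize(r) for r in records]
# Matching: records identical after standardization collapse to one entity.
merged = {tuple(sorted(r.items())) for r in std}
print(len(merged))  # 1 -> the two records merge into a single entity
```

Real matching engines use fuzzy comparison and survivorship rules rather than exact equality, but standardize-then-match is the common backbone.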

And now we need to find out which specialists define the metrics and standards that make data so good it deserves a spot in perfectionist heaven, who assesses data and trains other employees on best practices, and who is in charge of the strategy's technical side.


Data quality team: roles and responsibilities

Data quality is one of the aspects of data governance, which aims at managing data in a way that gains the greatest value from it. The senior executive in charge of data usage and governance at the company level is the chief data officer (CDO). The CDO is the one who must gather a data quality team.

The number of roles in a data quality team depends on the company size and, consequently, on the amount of data it manages. Generally, specialists with both technical and business backgrounds work together in a data quality team. Possible roles include:

Data owner – controls and manages the quality of a given dataset or several datasets, specifying data quality requirements. Data owners are generally senior executives representing the team’s business side.

Data consumer – a regular data user who defines data standards and reports errors to the team members.

Data producer – captures data, ensuring it complies with data consumers' quality requirements.

Data steward – usually in charge of data content, context, and associated business rules. The specialist ensures employees follow documented standards and guidelines for data and metadata generation, access, and use. The data steward can also advise on how to improve existing data governance practices and may share responsibilities with a data custodian.

Data custodian – manages the technical environment of data maintenance and storage. The data custodian ensures the quality, integrity, and safety of data during ETL (extract, transform, and load) activities. Common job titles for data custodians are data modeler, database administrator (DBA), and ETL developer.

Data analyst – explores, assesses, summarizes data, and reports on the results to stakeholders.

Since a data analyst is one of the key roles within the data quality teams, let’s break down this person’s profile.

Data quality analyst: a multitasker

The data quality analyst's duties may vary. The specialist may perform the data consumer's duties, such as defining and documenting data standards, or maintain the quality of data before it's loaded into a data warehouse, which is usually the data custodian's work. According to an analysis of job postings by Elizabeth Pierce, an associate professor at the University of Arkansas at Little Rock, and job descriptions we found online, the data quality analyst's responsibilities may include:

  • Monitoring and reviewing the quality (accuracy, integrity) of data that users enter into company systems and of data that is extracted, transformed, and loaded into a data warehouse
  • Identifying the root cause of data issues and solving them
  • Measuring and reporting to management on data quality assessment results and ongoing data quality improvement
  • Establishing and overseeing service level agreements, communication protocols with data suppliers, and data quality assurance policies and procedures
  • Documenting the ROI of data quality activities.

Companies may require the data quality analyst to organize and provide data quality training to employees and to recommend actions for making data more fit for purpose. The specialist may also be in charge of ensuring compliance with the company's data privacy policy.

It's up to you how to assign duties across the data quality team. However, any team must include a person who manages the whole process; someone who does quality checks, manages data quality rules, and develops data models; and a techie who maintains the flow and storage of data across the organization.

Data quality tools

Off-the-shelf data quality tools automate remediation and quality control with features like profiling, matching, metadata management, or monitoring. Businesses have a variety of options to choose from. Gartner included 15 providers in its Magic Quadrant for Data Quality Tools, seven of which became leaders. Let's explore several solutions by the best-rated vendors from Gartner's point of view.


Gartner includes 15 data quality tool providers into its Magic Quadrant

IBM InfoSphere Information Server for Data Quality: end-to-end for ongoing data monitoring and cleansing

IBM InfoSphere Information Server for Data Quality is one of four DQ products the vendor offers. It provides automated data monitoring and customizable cleansing in batches or real time. The solution detects data quality flaws and establishes a remediation plan based on metrics that are aligned with the user’s business goals. So, companies can define their own data quality rules.


The tool’s core features include:

  • Data profiling
  • DQ transformations: cleansing, matching, validation (e.g., flexible output table configuration for data validation rules, sequencing, and impact analysis)
  • Customizable data standardization (e.g., data enrichment and data cleansing)
  • Data lineage maintenance — users can see what changes were made to data during its lifecycle
  • Data integration
  • Data classification (e.g., identifying the type of data contained in a column using three dozen predefined, customizable data classes)
  • Data quality assessment and cleansing activities within a Hadoop cluster

Customers can also benefit from FlexPoint licensing, which provides flexible access to the IBM Unified Governance and Integration Platform.

The solution can be deployed on-premises or in the cloud. Pricing is available on demand. IBM offers materials (videos, interactive demos, and e-books) to help users learn about the solution's capabilities.

Informatica Data Quality: automating data quality management for machine learning and AI

Informatica Data Quality uses a metadata-driven machine-learning-based approach to data quality management. One of the features the provider emphasizes is the tool’s flexibility in terms of workloads (web services, real time, batch, and big data), user roles (various business and IT), data types (transaction, IoT, third-party, product or supplier data), and use cases. Use cases include data governance, analytics, master data management, enterprise data lakes, etc.

Other key features of Informatica Data Quality are:

  • Automation of critical DQ tasks (e.g., data discovery) with the CLAIRE engine, which uses ML and other AI techniques
  • Data profiling
  • DQ transformations: standardization, matching, enrichment, validation
  • Data integration
  • Rule builder for business analysts (building and testing without the help of IT members)
  • Pre-built, reusable common DQ business rules and accelerators (e.g., one rule can be reused from tool to tool)
  • Managing exceptions (records that don't meet data quality rule conditions)

Informatica Data Quality supports public cloud (e.g., AWS and Microsoft Azure) and on-premises deployment. One needs to contact the vendor to get pricing details.

Trillium DQ: flexible and scalable data quality platform for various use cases

Trillium DQ is a suite of enterprise-grade tools for data quality monitoring and management. It’s one of the six data quality solutions by Syncsort. It provides batch data quality but can scale to real-time and big data as well. Trillium DQ is also flexible regarding user roles, providing self-service capabilities for data stewards, business analysts, and other specialists. The platform supports numerous initiatives, such as data governance, migration, master data management, single customer view, eCommerce, fraud detection, etc.

The main Trillium DQ features include:

  • Data profiling
  • Pre-built or custom-built DQ transformations: data parsing, standardization, validation, matching, enrichment
  • Data linking
  • Data discovery (for internal and external sources)
  • Integration with custom and third-party applications via open-standards APIs
  • Integration with distributed architectures, such as Hadoop and Spark, Microsoft Dynamics, SAP, Amazon EMR service, and any hybrid environment for distributed platforms
  • Pre-built reports and scorecards

Users can choose between on-premises and cloud deployment. Pricing is available on demand.

You can always consider solutions by other leading providers like Oracle, SAS, Talend, and SAP or other vendors included in the Magic Quadrant. You can also explore listings by such peer-to-peer review sites as G2 or Capterra.

Demand for these packaged solutions is growing, especially given the tremendous body of data generated daily that must be harmonized. According to Gartner, the market for data quality software tools reached $1.61 billion in 2017, 11.6 percent more than in 2016.

Final word

There is a popular notion among experts that a data quality management strategy is a combination of people, processes, and tools. Once people understand what makes high-quality data in their specific industry and organization, what measures must be taken to ensure data can be monetized, and what tools can supplement and automate these measures and activities, the initiative will bring the desired business outcomes.

Data quality dimensions serve as the reference point for constructing data quality rules and metrics, and for defining data models and standards that all employees must follow from the moment they enter a record into a system or extract a dataset from third-party sources.

Ongoing monitoring, interpretation, and enhancement of data is another essential requirement that can turn reactive data quality management into a proactive one. Since everything goes in circles, let this be the circle of managing high-quality, valuable data.

(Video) The 7 Quality Control (QC) Tools Explained with an Example!


What is data quality management tools? ›

What Are Data Quality Management (DQM) Tools? Data quality management (DQM) tools are technologies used to identify, comprehend, and fix any flaws in data. DQM tools support business decision-making and business processes for efficient data governance.

What are data quality roles? ›

Data quality managers give advice on how quality systems are managed, as well as measuring and reporting performance against approved standards. They often set policy standards and may be a CRM manager, marketing operations professional, Salesforce administrator, or even an outside partner.

What are the four 4 key elements of quality management process? ›

When broken down, quality control management can be segmented into four key components to be effective: quality planning, quality control, quality assurance, and quality improvement.

What are the 4 types of quality management? ›

2.3. 5 Development and Certification of the QMS. Four types of quality processes are prominent in many industries: ISO 9001, AS9100, Six Sigma, and CMMI.

What are the 7 TQM tools? ›

These seven basic quality control tools, which introduced by Dr. Ishikawa, are : 1) Check sheets; 2) Graphs (Trend Analysis); 3) Histograms; 4) Pareto charts; 5) Cause-and-effect diagrams; 6) Scatter diagrams; 7) Control charts.

What are the 3 tools in data collection? ›

Tools used to gather data include case studies, checklists, interviews, occasionally observation, surveys, and questionnaires.

What are the 7 aspects of data quality? ›

How can you assess your data quality? Data quality meets six dimensions: accuracy, completeness, consistency, timeliness, validity, and uniqueness.

What are top 3 skills for data quality officer? ›

The following data quality skills will be required for this role: Data profiling. Data discovery. Information chain analysis and management.

What are the 3 key roles of data governance? ›

A good data governance program typically includes the steering committee with three main groups: data owners, data stewards, and data custodians. The three positions all work together to create the policies, process, and procedures for governing data, especially the reference data and master data elements.

What are the 5 quality procedures? ›

Quality procedures include: quality manual, procedure for the control of documents, procedure for the control of records, procedure for the performance of internal audits, procedure for the control of nonconformity, and procedure for the for integrating and controlling corrective action and preventive action.

What are the 4 basic components of a process? ›

These four critical items are: (1) Process Definition, (2) Process and Activity Roles, (3) Available Tools and (4) Training.

What are the 3 phases of quality management? ›

Quality management involves typically three phases – Quality Planning, Quality Assurance and Quality Control.

What are the 3 major project quality management processes? ›

There are three main processes of project quality management. These are Quality Planning, Quality Assurance, and Quality Control.

What are the 8 principles of quality management? ›

The 8 principles of QMS
  • Principle 1: customer focus. ...
  • Principle 2: leadership. ...
  • Principle 3: people involvement. ...
  • Principle 4: process approach. ...
  • Principle 5: systematic approach to management. ...
  • Principle 6: continual improvement. ...
  • Principle 7: factual approach to decision making. ...
  • Principle 8: mutually beneficial supplier relations.
13 Oct 2018

What are the new 7 QC tools? ›

Using 7 QC Tools For Quality Improvement and Customer Satisfaction
  • Histogram.
  • Cause and Effect Diagram.
  • Check Sheets.
  • Pareto Diagram.
  • Graphs.
  • Control Charts.
  • Scatter Diagram.
3 Aug 2022

Why are 7 quality tools used for? ›

The 7 QC tools are basic graphical representations of your data. They can be used to answer a wide variety of questions about your data and your process.

What are data tools? ›

Data Tools means Software which, among other features, configures and runs Data Pipelines and collects Configuration Data. As of the Effective Date, Data Tools include StreamSets Data Collector and Transformer.

What are the 4 methods of data collection? ›

Data may be grouped into four main types based on methods for collection: observational, experimental, simulation, and derived.

What are the 10 characteristics of data quality? ›

Terms in this set (10)
  • Accuracy. The Data is Correct. ...
  • Accessibility. The Data is easily obtained. ...
  • Consistency. The data is reliable. ...
  • Comprehensiveness. The required data is included. ...
  • Currency. The data is up to date. ...
  • Definition. The data and information in the health record are clearly defined. ...
  • Granularity. ...
  • Relevancy.

What are the 8 dimensions of data quality? ›

Garvin has developed a framework encompassing eight dimensions of quality: performance, features, reliability, conformance, durability, serviceability, aesthetics, and perceived quality (Garvin, 1988).

What are the 4 domains of the data quality model? ›

Application: The purpose for the data collection. Collection: The processes by which data elements are accumulated. Warehousing: Processes and systems used to archive data. Analysis: The process of translating data into meaningful information.

What are the 5 roles of a quality supervisor? ›

Quality Supervisor Job Duties

Conducting regular audits of production lines or departments to ensure compliance with company standards. Monitoring employee attendance and taking disciplinary action if necessary. Managing projects involving a team of employees working on specific tasks towards common goals.

What is the primary role of a data quality manager? ›

Job purpose

The Data Quality Manager will be the in-house data expert, comfortable working with data and business processes to enable business opportunities and enhance the member and customer experience.

What tools do data analysts use? ›

What Tools Do Data Analysts Use?
  • Business intelligence tools. BI tools are one of the most represented means of performing data analysis.
  • Statistical analysis tools.
  • General-purpose programming languages.
  • SQL consoles.
  • Standalone predictive analytics tools.
  • Data modeling tools.
  • ETL tools.
  • Automation tools.

What are the 4 pillars of data governance? ›

There are four pillars to the data governance framework to enable organizations to get the most out of their data.
  • Identify distinct use cases. ...
  • Quantify value. ...
  • Improve data capabilities. ...
  • Develop a scalable delivery model.

What is a data management role? ›

A data manager is responsible for developing, overseeing, organizing, storing, and analyzing data and data systems. A data manager ensures that all of this is always done with the utmost security and confidentiality, and in a timely manner.

What are the different roles in data? ›

Three major roles commonly found on a data team are the data engineer, the data analyst, and the data scientist.

What are 6 mandatory quality procedures? ›

The six mandatory quality procedures are:
  • Control of Documents
  • Control of Records
  • Internal Audit
  • Corrective Action
  • Preventive Action
  • Control of Non-Conforming Products

What are the 6 types of quality control? ›

There are many different approaches to QC; some of the more popular methodologies are statistical quality control (SQC), statistical process control (SPC), total quality management (TQM), Six Sigma, and Lean Six Sigma. Every approach, however, is based on the regular inspection of goods.
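The core of SPC can be sketched in a few lines: a Shewhart-style control chart places limits at the process mean ± 3 standard deviations and flags any point outside them. A minimal illustration, with an invented sample of measurements:

```python
import statistics

# Hypothetical sample of a measured process characteristic (e.g., fill weight in grams).
samples = [50.2, 49.8, 50.1, 50.4, 49.9, 50.0, 49.7, 50.3, 50.1, 49.95]

mean = statistics.mean(samples)
sigma = statistics.stdev(samples)  # sample standard deviation

ucl = mean + 3 * sigma  # upper control limit
lcl = mean - 3 * sigma  # lower control limit

# Points outside the control limits signal a process that needs investigation.
out_of_control = [x for x in samples if not lcl <= x <= ucl]
print(f"center line = {mean:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
print(f"points out of control: {out_of_control}")
```

In practice the limits are computed from a stable baseline period and then held fixed while new measurements are charted against them.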

What are the five pillars of quality? ›

These five pillars of quality are (i) customer focus, (ii) total involvement of employees (iii) measurement, (iv) systematic support, and (v) continuous improvement. These five pillars provide the organization with the quality advantage.

What are 4 main elements of quality? ›

It has four main components: quality planning, quality assurance, quality control and quality improvement.

What are the 4 types of processes? ›

The main manufacturing process types are project, jobbing, batch, line and continuous. Project processes produce products of high variety and low volume. A feature of a project process is that the location of the product is stationary.

What are the 7 QC tools explain any one in brief? ›

There are seven basic quality tools that can assist an organization with problem solving and process improvement: check sheets, histograms, Pareto analysis, fishbone (cause-and-effect) diagrams, scatter diagrams, flowcharts, and control charts.

What are the types of quality process? ›

There are four types of quality control: process control, acceptance sampling, control charts, and product quality control.


Why do we need data quality tools? ›

Data Quality tools can help to make data more trustworthy and more manageable. Inaccurate data promotes poor decision-making, missed opportunities, and lower profits. As use of the cloud continues to grow and become more complex, Data Quality has become a critical issue.
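One way such tools make data trustworthy is by applying explicit validation rules to every record and reporting which rules fail. A small sketch of the idea; the rows, rule names, and thresholds below are all hypothetical:

```python
import re

# Hypothetical customer rows; the fields and values are illustrative only.
rows = [
    {"customer_id": "C001", "email": "ann@example.com", "age": 34},
    {"customer_id": "",     "email": "bob@example",     "age": -2},
]

# Each rule maps a human-readable name to a predicate over one row.
rules = {
    "customer_id is present": lambda r: bool(r["customer_id"]),
    "email looks valid":      lambda r: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", r["email"]) is not None,
    "age is plausible":       lambda r: 0 <= r["age"] <= 120,
}

def audit(rows, rules):
    """Return, per row, the list of rule names that the row violates."""
    return [[name for name, check in rules.items() if not check(r)] for r in rows]

for row, failures in zip(rows, audit(rows, rules)):
    status = "OK" if not failures else f"FAILED: {failures}"
    print(row["customer_id"] or "<missing id>", status)
```

Commercial tools add rule libraries, profiling, and remediation workflows on top of this basic pattern.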

What is the best quality management tool? ›

Popular quality management systems (QMS) include:
  • Greenlight Guru.
  • Qualio.
  • MasterControl.
  • SAP.
  • Arena PLM & QMS.
  • EASE.
  • ETQ Reliance QMS.
  • Intellect.

What are the quality management tools and techniques explain? ›

There are seven basic quality tools identified as appropriate for use in both the quality management plan and control quality processes. They are known as Ishikawa's seven basic tools of quality: cause-and-effect diagrams, flowcharting, check sheets, Pareto diagrams, control charts, histograms and scatter diagrams.

What are data tools? ›

Data Tools means Software which, among other features, configures and runs Data Pipelines and collects Configuration Data. As of the Effective Date, Data Tools include StreamSets Data Collector and Transformer.

Why are the 7 quality tools important? ›

Utilizing the 7 QC tools in six sigma or quality management process helps in taking a systematic approach to identify and understand the risk, assess the risk, control fluctuation of product quality and accordingly provide solutions to avoid future defects.

What is the purpose of basic 7 tools of quality? ›

The seven basic tools of quality are a fixed set of visual exercises identified as being most helpful in troubleshooting issues related to quality.

What is data quality Framework? ›

The Data Quality Framework (DQF) provides an industry-developed best practices guide for the improvement of data quality and allows companies to better leverage their data quality programmes and to ensure a continuously-improving cycle for the generation of master data.

Which of the following are 3 basic tools of quality? ›

These seven basic quality control tools, introduced by Dr. Ishikawa, are: 1) check sheets; 2) graphs (trend analysis); 3) histograms; 4) Pareto charts; 5) cause-and-effect diagrams; 6) scatter diagrams; 7) control charts.

What are 2 tools you can use to manage quality? ›

7 Management Tools For Quality Control
  • Flowchart. Most of us are familiar with flowcharts.
  • Check sheet. A check sheet is a basic quality tool that is used to collect data.
  • Cause-and-effect (fishbone) diagram.
  • Pareto chart.
  • Control charts.
  • Histograms.
  • Scatter diagrams.
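Of these tools, the Pareto chart is pure arithmetic: rank causes by frequency and accumulate their percentage share, so the "vital few" causes stand out. A short illustrative sketch with made-up defect counts:

```python
# Hypothetical defect tallies; a Pareto analysis ranks causes and accumulates their share.
defects = {"scratches": 42, "misaligned label": 21, "dents": 9, "wrong color": 5, "other": 3}

total = sum(defects.values())
cumulative = 0.0
pareto = []
for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += 100.0 * count / total
    pareto.append((cause, count, round(cumulative, 1)))

# The top rows typically account for most of the defects (the 80/20 rule).
for cause, count, cum in pareto:
    print(f"{cause:<18} {count:>3}  {cum:>5}% cumulative")
```

Here the top two causes alone cover roughly 79% of all defects, which is exactly the signal a Pareto chart is meant to surface.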


What are the three processes of quality management? ›

There are three main processes of project quality management. These are Quality Planning, Quality Assurance, and Quality Control.


What are the 5 types of data analytics? ›

Four main types of data analytics:
  • Predictive data analytics. Predictive analytics may be the most commonly used category of data analytics.
  • Prescriptive data analytics.
  • Diagnostic data analytics.
  • Descriptive data analytics.


Article information

Author: Fredrick Kertzmann

Last Updated: 03/08/2023

