Using Quality Checks to Improve Your Survey Data

Reading time: 4 minutes

A survey is only as good as its respondents’ answers. Respondents can lose interest partway through and give poor answers simply to get to the end. That’s why it’s a smart practice to build quality checks into your survey.

Quality checks are measures written or programmed into a survey that flag a respondent’s inputs for further review. A quality check can be as simple as a flag indicating that someone has completed the survey too quickly, or more complex, like a programmed variable that notifies you when a respondent has provided inconsistent responses. These checks help confirm that respondents remain engaged, spend an appropriate amount of time on each question, and answer consistently.

A best practice is to follow the “baseball rule” (three strikes and you’re out) when analyzing respondents who have tripped quality-check flags: a respondent who fails any three quality checks should automatically be removed from the data set, while a respondent who fails only one or two should receive additional review.
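For teams that score their data programmatically, the rule is straightforward to apply. Below is a minimal sketch, assuming each respondent’s quality-check results are stored as booleans; the function and field names are illustrative, not part of any particular survey platform.

```python
# Minimal sketch of the "baseball rule": respondents who trip three or more
# quality-check flags are removed; those with one or two are set aside for
# manual review. Field names are illustrative, not a specific platform's schema.

def triage_respondent(flags: dict[str, bool]) -> str:
    """Classify a respondent based on how many quality checks they failed."""
    failures = sum(flags.values())
    if failures >= 3:
        return "remove"   # automatic removal from the data set
    if failures >= 1:
        return "review"   # one or two flags: needs additional review
    return "keep"

# Example: a respondent who sped through the survey and straight-lined a grid
print(triage_respondent({"speeder": True, "straight_lining": True, "red_herring": False}))
# -> "review"
```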

The number of quality checks to use will depend on the length of your survey. A survey with a median length of interview (LOI) of about 10 minutes should have only around three quality checks. If your survey is closer to a 20-minute LOI, then we recommend five to seven quality checks to confirm your respondents are still engaged as they get deeper into the survey.

While quality checks are useful in protecting the quality of your data, they should not be the only line of defense. A tight screening section will likely lead to fewer quality-check flags. A sample provider that has a relationship with its panelists can also follow up with flagged respondents to get clarity on their responses.

What Are the Main Types of Quality Checks?

Quality checks are simple to incorporate into your survey and help safeguard your data quality, giving you higher confidence in your final data set.

  • Knowledge Checks: These help ensure each respondent is familiar with and educated on a topic. Content knowledge checks can include asking a respondent to pick correct definitions, define various acronyms, or provide other information. These should be included in a screening section.
  • Speeder Flag: Completing a survey too quickly can mean a respondent is not reading and thoughtfully answering the questions. Speeder checks flag any respondent who finishes the survey within a chosen time frame; the industry standard is to flag any completion time of less than one-third of the survey’s median length (a minimal sketch of this and other programmable flags appears after this list).
  • Conflicting-Answer Flag: You may want to check important data points in your survey for accuracy. Checking for conflicting answers can mean asking the same question twice or asking similar questions and comparing the responses for consistency. However, respondents don’t always remember the exact answers they’ve already provided, so it may be helpful to reference their previous answer in the later question.
  • Red Herrings: These are similar to content knowledge checks and are used to confirm a respondent’s familiarity with a subject or industry. You can test for this by inserting a red herring option in your question (e.g., adding a fake vendor to a list of real companies). Be careful not to make your fake options sound too close to a real one.
  • Straight-Lining Flag: This flag is triggered when a respondent selects a majority of options in a multiselect question or repeatedly selects the same answer down a grid. Straight-lining can be a result of fatigue, so we strongly encourage designing the survey to minimize the opportunity for it to occur.
  • Open-End Validation: This check looks for duplicate answers, frequent misspellings, irrelevant answers, and gibberish entered in open-ended fields. While these checks don’t evaluate the quality of the responses, they can alert you to respondents who are not mindfully answering open-ended questions.
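Several of these flags lend themselves to simple programmatic checks. The sketch below shows one plausible way to compute a speeder flag, a straight-lining flag, and a basic open-end validation, assuming you already have each respondent’s completion time, grid answers, and open-end text; the thresholds and function names are illustrative assumptions, not an industry-standard implementation.

```python
# Illustrative sketch of three quality-check flags. Thresholds (one-third of
# median LOI, minimum open-end length, vowel check) are assumptions you would
# tune for your own survey, not fixed standards.

import re

def speeder_flag(completion_seconds: float, median_loi_seconds: float) -> bool:
    """Flag respondents who finish in under one-third of the median LOI."""
    return completion_seconds < median_loi_seconds / 3

def straight_lining_flag(grid_answers: list[str]) -> bool:
    """Flag grids where the respondent chose the same option on every row."""
    return len(grid_answers) > 1 and len(set(grid_answers)) == 1

def open_end_flag(text: str, min_length: int = 5) -> bool:
    """Flag open-end answers that are too short or contain no real words."""
    stripped = text.strip()
    if len(stripped) < min_length:
        return True
    # Crude gibberish heuristic: an answer with no vowels is likely keyboard mashing
    return not re.search(r"[aeiou]", stripped, flags=re.IGNORECASE)

# Example usage for one respondent
flags = {
    "speeder": speeder_flag(completion_seconds=170, median_loi_seconds=600),
    "straight_lining": straight_lining_flag(["Agree", "Agree", "Agree", "Agree"]),
    "open_end": open_end_flag("dfkjhsdf"),
}
print(flags)  # {'speeder': True, 'straight_lining': True, 'open_end': True}
```

In practice, the output of checks like these would feed the triage step shown earlier, with borderline respondents routed to manual review rather than dropped outright.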

Building flags like these into a survey helps ensure that the time and money you spend conducting it produce useful, actionable results. Combined with well-written screener questions and an expert panel that matches your needs, a quantitative survey will deliver powerful insights.


GLG Surveys builds quality checks into every survey and our Quality Review Team assesses respondents and responses to identify potentially unqualified answers that could skew your data. What would you like to learn from your panel? Get in touch to start surveying for insight now.
