In a standard report, why would the number of rankings exceed the actual number of responses to a question, and how can we fix it?


We can tell by looking at individual responses that 8 people responded to a particular question.  However, the standard report indicates there were 9 rankings of the answers to that question.  If only 8 people responded, how did the report show that 9 people selected a particular answer?  This has happened more than once.  Is it something we are fundamentally misunderstanding about how Survey Gizmo extrapolates the information for the report?  It’s skewing our results.  How do I fix this?

jobs answered

    It is hard to tell without being able to see your survey, so I have to ask some questions.

    Is it possible that the extra response is a “test” response?  The default setting for standard reports is to include TEST responses, so you might need to change that.

    Are you using a checkbox type question for this particular question?  That would allow a respondent to select more than one answer.  A radio-button question, which allows only one selection, would be the proper choice here. 

    Have you considered exporting your response data to an Excel spreadsheet?  This will provide you with a better way to find problems in your response data.
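    Once the data is in a spreadsheet, one quick sanity check is to compare the number of rankings recorded for each answer against the number of respondents. The sketch below is a minimal, hypothetical example — the column layout (one row per respondent, one ranking column per answer option) is an assumption for illustration, not SurveyGizmo's actual export format.

```python
# Hedged sketch: flag answer columns whose ranking count exceeds the
# respondent count. Column layout is assumed, not SurveyGizmo's actual
# export format. A cell may hold comma-separated values if a question
# was somehow answered twice, which inflates the ranking count.

def count_rankings(rows, ranking_columns):
    """Return (respondent count, rankings counted per answer column)."""
    counts = {}
    for col in ranking_columns:
        counts[col] = sum(
            len([v for v in row.get(col, "").split(",") if v.strip()])
            for row in rows
        )
    return len(rows), counts

def flag_discrepancies(rows, ranking_columns):
    """Return answer columns whose ranking count exceeds the respondent count."""
    respondents, counts = count_rankings(rows, ranking_columns)
    return {col: n for col, n in counts.items() if n > respondents}

# Made-up data: 3 respondents, but "Answer A" carries 4 rankings because
# one cell holds two comma-separated values.
rows = [
    {"Answer A": "1", "Answer B": "2"},
    {"Answer A": "3", "Answer B": ""},
    {"Answer A": "2,5", "Answer B": "1"},
]
print(flag_discrepancies(rows, ["Answer A", "Answer B"]))  # {'Answer A': 4}
```

    Any answer this flags is worth inspecting by hand in the individual responses.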

    Jim W (Moderator) edited answer
      jobs replied

      All the questions were in a “drag and drop” format.  The respondents were supposed to rank the answers from most (1) to least (10) important.  Most of the respondents ranked all ten answers, but some didn’t; they may have ranked only their top five.  That’s actually okay, but that’s where the discrepancy appears between how many people ranked the answers and how many rankings the survey results show.  We know from the individual responses that eight people answered one particular question, but the survey results show nine rankings for one of its answers.  

      I checked the survey results.  Under “Filtered” on the Individual Responses tab, the test responses option is unchecked.  Also, I had deleted the test responses before running the report.  We had 12 respondents, and that checks out with most of the report.  

      We have put the information in a spreadsheet.  That’s how we noticed the discrepancy.  We’re just not sure why the discrepancy happened, or whether it means that at least some of our results are skewed.  

      Thanks for your help.

      Jim W (Moderator) commented
        • My guess is that the discrepancy was caused by a respondent using the Back button of their browser to return to a page and answer the question again. Because they have bypassed the survey’s Back button, their second response is posted in addition to their first. This is easy to catch in radio-button type questions – when you look at the export data you can see two responses separated by a comma within a single cell. In other question types it is just recorded as two different responses.
          Unfortunately there is no way to stop the respondent from doing this, but in my experience it is rare.


