Posted by: bbannan | March 7, 2012

Strategies for Ensuring that You’re Asking the Right Survey Questions – Sarah

This semester, my ISD Team (Group 5) knew immediately that we wanted to conduct a survey as part of our user research plan to refine our prototype. We had wanted to administer a survey last semester, but didn’t have enough time. With EDIT 752 focusing on user research, we were excited to finally have the opportunity. We started by brainstorming some research questions that we wanted our survey to answer and then drafted some potential survey questions. Our survey questions all seemed to be appropriate for our target audience and would provide valuable information that we could use to revise our prototype. However, they weren’t clearly (beyond a shadow of a doubt) in line with our initially proposed research questions. This made me wonder whether a survey was really the best method for answering our research questions, or whether we were simply planning one because we just really wanted to do a survey.

As an ISD, I want all elements of my designs to fit in pretty little, clearly labeled packages. Obviously, I want my evaluation methods to align with my learning objectives, no matter how informal those objectives may be. At my day job, my customers become somewhat defensive when I ask them to explain why the learner needs to know particular content that has “always” been included in a course that I am redesigning. By asking the customer to explain this to me, I am not suggesting that the learner doesn’t really need to know the information in question. On the contrary, I need to understand the why so that I can determine the most effective how—that is, how to help the learner… well, learn.

After reading Chapter 11 of Kuniavsky’s (2003) Observing the User Experience: A Practitioner’s Guide to User Research, I realize that designing an effective survey is a lot like designing effective instruction. Surveys can be an extremely efficient and effective way to collect information from a large group of users in order to refine your user profile. That being said, insufficient planning can be disastrous. As Kuniavsky points out, poorly designed surveys “can ask the wrong people the wrong questions, producing results that are inaccurate, inconclusive, or at worst, deceptive” (p. 304). As ISDs, we witness all too often how insufficient planning and analysis produce ineffective training.

In terms of planning our user survey, Group 5 started off on the right track by first identifying our research goals and associated research questions. But how can we ensure that the survey questions we initially came up with are aligned with these goals and objectives?

Hypothetical Tables

After you’ve identified the survey questions you would like to ask, Kuniavsky suggests creating a grid to map each survey question to the instructions you intend to provide to the respondent, possible answers, and the reason for asking the question (p. 311). The latter is of utmost importance because researchers need to be able to justify the inclusion of every single question in their survey. In other words, researchers need to be able to express exactly why they are asking each question and what they intend to do with the resulting data. This includes identifying how the researchers intend to analyze the data.

Brace (2008) warns researchers to include only those “questions that are relevant to the objectives and not [to] be tempted to ask questions of areas that might be of interest but not relevant to the objectives. To do so is to waste resources in terms of the time of everyone involved, including the respondents, and to spend money unnecessarily.” Hypothetical tables force the researcher to identify each potential survey question as either “need to know” or “nice to know.” Salant and Dillman (1994) assert, “‘Need to know’ is a critical criterion for every item on a questionnaire.”
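Because we already keep our draft questions in a spreadsheet, I found it helpful to picture the hypothetical table as structured data that can be checked automatically. The sketch below is only a rough illustration of that idea; the column names, the sample question, and the review function are my own assumptions, not taken from Kuniavsky or from our actual instrument.

```python
# A minimal sketch of a hypothetical-table row, plus a check that every
# question can justify its own inclusion. The fields and sample content
# are illustrative assumptions, not Kuniavsky's exact format.

hypothetical_table = [
    {
        "question": "How many hours per week do you use the current training portal?",
        "instructions": "Choose the range that best matches a typical week.",
        "possible_answers": ["0", "1-2", "3-5", "6+"],
        "reason": "Estimates baseline usage so we can size the redesign effort.",
        "analysis_plan": "Report a frequency distribution; compare across job roles.",
        "need_to_know": True,  # as opposed to merely "nice to know"
    },
]

def review(table):
    """Flag any question that can't justify its own inclusion."""
    for row in table:
        problems = []
        if not row.get("reason"):
            problems.append("no stated reason for asking")
        if not row.get("analysis_plan"):
            problems.append("no plan for analyzing the answers")
        if not row.get("need_to_know", False):
            problems.append("marked 'nice to know' -- consider cutting it")
        status = "; ".join(problems) if problems else "OK"
        print(f"- {row['question']}\n  {status}")

review(hypothetical_table)
```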

Mock-Up Reports

Kuniavsky also recommends writing a mock-up survey report prior to administering the survey. The report should include the research goals, methodology, design description, sampling information, response rate, and data analysis with appropriate tables (p. 323). Clearly, it is impossible to include the actual data at this point, but including hypothesized results can help researchers ensure that they are asking respondents the appropriate survey questions to answer their research questions (p. 323). 
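To make the mock-up report feel less abstract, I sketched it as a simple template whose sections mirror the elements Kuniavsky lists. Every value below is a hypothesized placeholder invented for illustration, which is exactly the point of drafting the report before any data exist.

```python
# A rough template for a mock-up survey report. Section names follow the
# elements Kuniavsky lists (goals, methodology, design, sampling, response
# rate, analysis); the values are hypothesized placeholders, not real data.

mock_report = {
    "research_goals": "Refine the learner profile used for our prototype revisions.",
    "methodology": "Self-administered web survey of closed-ended questions.",
    "design_description": "Single page, roughly five minutes to complete, anonymous.",
    "sampling": "All learners who completed the current course in the past year.",
    "response_rate": "Hypothesized: about 40 percent.",
    "data_analysis": "Hypothesized table: reported portal use broken out by job role.",
}

for section, content in mock_report.items():
    print(section.replace("_", " ").title())
    print(f"  {content}\n")
```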

Decision Trees

Salant and Dillman (1994) advise researchers to think about surveys as a means of solving a problem. Once you identify the problem that needs to be solved, the next step is to determine what new information you need to solve it. Salant and Dillman encourage researchers to consider whether they really need to obtain new data in order to solve their problem. Re-evaluating the problem may help them see that a “survey won’t help them solve the problem they are trying to address.” Perhaps a different research strategy would better address the problem, or perhaps researchers really need to look at existing data in new ways.

Halteman (2011) reminds survey writers, “you owe it to your respondents to only ask questions from which the resulting data will be used to take action or make a decision.” Halteman explains, “The two most common types of unnecessary questions are asking about something that has already been decided and asking about things over which you have no control.”  In other words, if you don’t plan to do something with the data, or if you don’t have the authority to act on it, then it is a waste of resources (both yours and the respondents’) to ask the question.

Like Salant and Dillman, Vanek (2007) advises researchers to resist the urge to include “nice to know” questions in their surveys. Although “Learning for the sake of learning is admirable,” Vanek stresses that “learning something with the intent to act on it is far more practical.” To ensure that your survey questions are actionable, Vanek suggests developing a decision tree to “outline what actions will be most effective based on your data.” Having a well-thought-out action plan prior to administering the survey will also enable you to act more quickly and confidently once you’ve obtained your data.
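Here is a toy version of what such a decision tree might look like for our project, written as a function that maps each possible result to a pre-agreed action. The metric, thresholds, and actions are all hypothetical; the point is only that every branch ends in something we actually have the authority to do.

```python
# A toy decision tree in the spirit Vanek describes: each possible survey
# outcome maps to an action the team could actually take. The metric,
# thresholds, and actions are invented for illustration.

def plan_action(share_reporting_low_portal_use: float) -> str:
    """Map a hypothetical survey result to the action agreed on in advance."""
    if share_reporting_low_portal_use >= 0.60:
        return "Redesign the portal navigation before adding any new content."
    elif share_reporting_low_portal_use >= 0.30:
        return "Run follow-up interviews to find out why usage is so uneven."
    else:
        return "Keep the current portal; focus our revisions on course content."

# If no branch would change what we actually do, the question isn't worth asking.
for result in (0.75, 0.45, 0.10):
    print(f"{result:.0%} of respondents -> {plan_action(result)}")
```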

Some Final Thoughts

After reading Kuniavsky’s Chapter 11, I’m pretty sure that Group 5 isn’t designing a survey simply for the sake of designing a survey. We have spent several months analyzing our target audience and our customer’s missions and goals, communicating with stakeholders, and conducting comparative analyses, and our instincts tell us that a survey is the way to go. Even so, we have decided to take a step back to re-evaluate our survey goals and perhaps attempt some of the strategies described here. We know that instincts alone aren’t enough—that we need to be able to articulate precisely what problem our survey data will help us solve, how each survey question ties back to our research questions, and what we plan to do with the data.

Those customers who become defensive when asked to explain why their learners need to know specific content usually have pretty good instincts, too. There’s usually a very good reason why that content has “always” been included in their course. The ISD’s job is to help the customer express the why so that the how produces the desired learning outcomes.

References

Brace, I. (2008). Questionnaire design: How to plan, structure and write survey material for effective market research (2nd ed.) [Books24x7 version]. Retrieved from http://common.books24x7.com/toc.aspx?bookid=28480

Halteman, E. (2011, November 10). 10 common mistakes made when writing surveys—Part 2. SurveyGizmo. Retrieved March 5, 2012, from http://www.surveygizmo.com/survey-blog/10-common-survey-mistakes-part-2/

Kuniavsky, M. (2003). Observing the user experience: A practitioner’s guide to user research. San Francisco: Morgan Kaufmann.

Salant, P., & Dillman, D. A. (1994). How to conduct your own survey [Books24x7 version]. Retrieved from http://common.books24x7.com/toc.aspx?bookid=4863

Vanek, C. (2007, May 11). What is a successful survey project? (Hint: It’s not just the data). SurveyGizmo. Retrieved March 5, 2012, from http://www.surveygizmo.com/survey-blog/what-is-a-successful-survey-project-hint-it%e2%80%99s-not-just-the-data/

 


Responses

  1. Great post, Sarah. I found the comment from Vanek about creating a decision tree especially useful. Not only will knowing what you will do with the data help save time when you get it, it may help you realize that you don’t need it in the first place. We’ve all been bombarded with surveys throughout our lives, so, naturally, surveys are probably one of the first types of data gathering techniques that come to mind when developing a research plan. Furthermore, free online software has made it incredibly easy to administer surveys. However, they should be approached like any other part of the research plan, answering the question, “will it answer relevant questions?” Unfortunately, most surveys are probably administered quickly, poorly, and unnecessarily.

  2. Oops, the above post is from Ryan Gibbens.

  3. Sarah – As our group is really digging into round 2, we’re putting together our survey questions and your post was a great reminder of how valuable surveys can be. It is incredibly easy to throw a survey together, but when time is spent crafting questions that really get to the heart of the matter, the survey results uncover data that can really improve the user experience. Thank you for your post.

