GUEST POST by LAUREL NORRIS: Robust Responses to Open-Ended Questions: Good Surveys Prime Respondents to Think Critically

This is a guest post by Laurel Norris (https://twitter.com/neutrinosky).

Laurel is a Training Specialist at Widen Enterprises, where she is involved in developing and delivering training, focusing on data, reporting, and strategy.

--------------------------------------------------

Robust Responses to Open-Ended Questions: Good Surveys Prime Respondents to Think Critically

By Laurel Norris

--------

I’ve always been a fan of evaluation. It’s a way to better understand the effectiveness of programs, determine if learning objectives are being met, and reveal ways to improve web workshops and live trainings.

Or so I thought.

It turns out that most evaluations don’t do those things. Performance-Focused Smile Sheets (the book is available at http://SmileSheets.com) taught me that, and when I implemented the book’s recommendations, I discovered something interesting. Using Dr. Thalheimer’s method improved the quality and usefulness of my survey data – and provided me with much more robust responses to open-ended questions.

By more robust, I mean responses that revealed what was helpful and why, described the challenges respondents expected to face in trying the techniques themselves, discussed which areas could use more emphasis, and noted where respondents would have appreciated more examples. In short, they provided a huge amount of useful information.


Before using Dr. Thalheimer’s method, only a few open-ended responses were helpful. Most were along the lines of “Thanks!”, “Good webinar”, or “Well presented”. While those kinds of answers make me feel good, they don’t help me improve trainings.

I’m convinced that the improved survey primed people to be more engaged with the evaluation process and enabled them to easily provide useful information to me.

So what did I do differently? I’ll use real examples from web workshops I conducted. Both workshops ran around 45 minutes and had 30 responses to the end-of-workshop survey. They did differ in style, something I will discuss towards the end of this article.

 

The Old Method

Let’s talk about the old method, what Dr. Thalheimer might call a traditional smile sheet. It was (luckily) short, with three multiple-choice questions and two open-ended ones. The multiple-choice questions included:

  • How satisfied are you with the content of this web workshop?
  • How satisfied are you with the presenter's style?
  • How closely did this web workshop align with your expectations?

Participants answered the questions with options on Likert-like scales ranging from “Very Unsatisfied” to “Very Satisfied” or “Not at all Closely” to “Very Closely”. Of course, in true smile-sheet style, the multiple-choice questions yielded no useful information. People were satisfied with the content of the webinar at a 4.1 level, “data” that did not enable me to make any useful changes to the information I provided.
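To see why an average like 4.1 says so little, consider a quick illustration (a minimal sketch in Python, using made-up ratings rather than my actual survey data): two very different audiences can produce exactly the same mean on a 1-to-5 satisfaction scale.

    # Two hypothetical sets of 1-5 satisfaction ratings that share a 4.1 mean.
    from statistics import mean
    from collections import Counter

    mostly_content = [4, 4, 4, 4, 4, 4, 4, 4, 4, 5]  # nearly everyone mildly satisfied
    polarized = [5, 5, 5, 5, 5, 5, 5, 2, 2, 2]       # delighted majority, unhappy minority

    for label, ratings in [("mostly content", mostly_content), ("polarized", polarized)]:
        print(f"{label}: mean = {mean(ratings):.1f}, counts = {dict(Counter(ratings))}")

Both groups come out at 4.1, yet they call for very different follow-up actions, and the mean alone cannot tell you which situation you are in.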

Open-ended questions invited people to “Share your ideas for web workshop topics” and offer “Additional Comments”. Of the thirteen open-ended responses I got, five provided useful information. The other eight were either a thank-you or some form of praise.

 

The New Method

Respondents were asked four multiple-choice questions that judged the effectiveness of the web workshop, how much the concepts would help them improve work outcomes, how well they understood the concepts taught, and whether they would use the skills they learned in the workshop at their job.

The web workshop was about user engagement – in particular, how administrators can increase engagement with the systems they manage. The questions were:

  • In regard to user engagement, how able are you to put what you’ve learned into practice on the job?
  • From your perspective, how valuable are the concepts taught in the workshop? How much will they help improve engagement with your site?
  • How well do you feel you understand user engagement?
  • How motivated will you be to utilize these user engagement skills at your work?

The answer options were specific and adapted from Dr. Thalheimer’s book. For example, here were the response options to the question “In regard to user engagement, how able are you to put what you’ve learned into practice on the job?”:

  • I'm not at all able to put the concepts into practice.
  • I have general awareness of the concepts taught, but I will need more training or practice to complete user engagement projects.
  • I am able to work on user engagement projects, but I'll need more hands-on experience to be fully competent in using the concepts taught.
  • I am able to complete user engagement projects at a fully competent level in using the concepts taught.
  • I am able to complete user engagement projects at an expert level in using the concepts taught.

All four multiple-choice questions had similarly complete options to choose from. From those responses, I was able to more appropriately determine the effectiveness of the workshop and whether my training content was performing as expected.

The open-ended question was relatively bland. I asked “What else would you like to share about your experience during the webinar today?” and received twelve specific, illuminating responses, such as:

“Loved the examples shown from other sites. Highly useful!”

“It validated some of the meetings I have had with my manager about user engagement and communication about our new site structure. It will be valuable for upcoming projects about asset distribution throughout the company.”

“I think the emphasis on planning the plan is helpful. I think I lack confidence in designing desk drops for Design teams. Also - I'm heavily engaged with my users now as it is - I am reached out to multiple times per day...but I think some of these suggestions will be valuable for more precision in those engagements.”

Even responses that didn’t give me direct feedback on the workshop, like “Still implementing our site, so a lot of today's content isn't yet relevant”, gave me information about my audience.

 

Conclusion

Clearly, I’m thrilled with the kind of information I am getting from using Dr. Thalheimer’s methods. I get useful, rich data from respondents that helps me better evaluate my content and understand my audience.

There is one aspect of using the new method that might have positively skewed the data. I designed the second web workshop after I read the book, and Dr. Thalheimer’s Training Effectiveness Taxonomy influenced the design. I thought more about the goals for the workshop, provided cognitive supports, repeated key messages, and did some situation-action triggering.

Based on those changes, the second web workshop was probably better than the first and it’s possible that the high-quality, engaging workshop contributed to the robust responses to open-ended questions I saw.

Either way, my evaluations (and learner experiences) have been revolutionized. Has anyone seen a similar improvement in open-ended responses since implementing performance-focused smile sheets?

 

Another Reason to Learn About Performance-Focused Smile Sheets

This has been a great year for the Performance-Focused Smile Sheet approach. Not only did the book, Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form, win a prestigious Award of Excellence from the International Society for Performance Improvement, but people are flocking to workshops, conference sessions, and webinars to learn about this revolutionary new method of gathering learner feedback.

Now there's even more reason to learn about this method. The July 2017 issue of TD (Talent Development) reported that, according to a Human Capital Institute (HCI) report, measurement/evaluation is the top skill needed by learning and development professionals!

Go to SmileSheets.com to get the book.

Using Performance-Focused Smile Sheet Questions — Even One!

Mark Jenkins, a long-time technical trainer and forward-thinking learning professional who now provides concierge service on learning technologies, audio-video, and OneNote at inforivers, recently used a performance-focused smile-sheet question at the end of one of his public training sessions. He used just one question! And one follow-up open-ended question.

Mark loves the results he's getting. Now he gets very clear feedback on whether his workshop helps people actually do what he wants them to be able to do. And he is able to judge their interest in another session on the same subject matter, all on a one-page smile sheet.

"I was surprised the liberation I felt not being shackled down by Likert scales, while still getting good and easily understandable analytics. The results are less ambiguous than a Likert scale. It also helps me to figure out how to follow up with each person that took the smile sheet."

His feedback form, shared with his permission:

[Image: Mark Jenkins's smile sheet]

Free Online Diagnostic for Your Organization’s Smile Sheet (Learner-Feedback Questions)


In honor of April as "Smile-Sheet Awareness Month," I am releasing a brand new smile-sheet diagnostic.

Available by clicking here:  
http://smilesheets.com/smile-sheet-diagnostic-survey/

This diagnostic is based on wisdom from my award-winning book, Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form, plus the experience I've gained helping top companies implement new measurement practices.

The diagnostic is free and asks you 20 questions about your organization's current practices. It then provides instant feedback.

 

 

Want to Diagnose Your Organization’s Smile Sheets for FREE?

We are coming up to the one-year anniversary of my book, Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form. To celebrate, I've created a diagnostic to help you and your organization give yourselves a robust Smile-Sheet Checkup.

To access the diagnostic instrument, go to the diagnostic page on the book's website.


Providing Feedback on Quiz Questions — Yes or No?

Today I was asked the following question by a learning professional at a large company:

It will come as no surprise that we create a great deal of mandatory/regulatory-required eLearning here. All of these eLearning interventions have a final assessment that the learner must pass at 80% to be marked as complete, in addition to viewing all the course content. The question is about feedback for those assessment questions.

  • One faction says no feedback at all, just a score at the end and the opportunity to revisit any section of the course before retaking the assessment.

  • Another faction says to tell them correct or incorrect after they submit their answer for each question.

  • And a third faction argues that we should give them detailed feedback beyond just correct/incorrect for each question. 

Which approach do you recommend? 



Here is what I wrote in response:

It all depends on what you’re trying to accomplish…

If this is a high-stakes assessment you may want to protect the integrity of your questions. In such a case, you’d have a large pool of questions and you’d protect the answer choices by not divulging them. You may even have proctored assessments, for example, having the respondent turn on their web camera and submit their video image along with the test results. Also, you wouldn’t give feedback because you’d be concerned that students would share the questions and answers.
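To make the question-pool idea concrete, here is a minimal sketch in Python (my illustration, with a hypothetical pool size and form length, not a prescription from any particular testing platform): each attempt draws a random subset of the pool, so no single attempt exposes more than a fraction of the items.

    # Serve each test attempt a random form drawn from a larger question pool.
    import random

    QUESTION_POOL = [f"scenario question {i}" for i in range(1, 101)]  # hypothetical 100-item pool
    FORM_LENGTH = 20  # each attempt exposes only 20 of the 100 items

    def build_form():
        """Return a random subset of the pool for one test attempt."""
        return random.sample(QUESTION_POOL, FORM_LENGTH)

    print(build_form()[:3])  # a peek at three of the items served on this attempt

The larger the pool relative to the form, the less any one learner (or any shared answer key) can compromise the assessment.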

If this is largely a test to give feedback to the learners—and to support them in remembering and performance—you’d not only give them detailed feedback, but you’d retest them after a few days or more to reinforce their learning. You might even follow up to see how well they’ve been able to apply what they’ve learned on the job.

We can imagine a continuum between these two points where you might seek a balance between a focus on learning and a focus on assessment.

This may be a question for the lawyers, not just for us as learning professionals. If these courses are being provided to meet certain legal requirements, it may be most important to consider what might happen in the legal domain. Personally, I think the law may be behind learning science. Based on talking with clients over many years, it seems that lawyers and regulators often recommend learning designs and assessments that do NOT make sense from a learning standpoint. For example, lawyers tell companies that teaching a compliance topic once a year will be sufficient -- when we know that people forget and may need to be reminded.

In the learning-assessment domain, lawyers and regulators may say that it is acceptable to provide a quiz with no feedback. They are focused on having a defensible assessment. This may be the advice you should follow given current laws and regulations. However, this seems ultimately indefensible from a learning standpoint. Couldn't a litigant argue that the organization did NOT do everything it could to support the employee in learning -- if the organization didn't provide feedback on quiz questions? This seems a pretty straightforward argument -- and one that I would testify to in a court of law (if I were asked).

By the way, how do you know 80% is the right cutoff point? Most people use an arbitrary cutoff point, but then you don’t really know what it means.

Also, are your questions good questions? Do they ask people to make decisions set in realistic scenarios? Do they provide plausible answer choices (even for incorrect choices)? Are they focused on high-priority information?

Do the questions and the cutoff point truly differentiate between competence and lack of competence?

Are the questions asked after a substantial delay -- so that you know you are measuring the learners' ability to remember?

Bottom line: Decision-making around learning assessments is more complicated than it looks.

Note: I am available to help organizations sort this out... yet, as one may ascertain from my answer here, there are no clear recipes. It comes down to judgment and goals.

If your goal is learning, you probably should provide feedback and provide a delayed follow-up test. You should also use realistic scenario-based questions, not low-level knowledge questions.

If your goal is assessment, you probably should create a large pool of questions, proctor the testing, and withhold feedback.

 

Is My Book Award Worthy?

Is my book, Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form, award worthy?

I think so, but I'm hugely biased! SMILE.


Here's what I wrote today on an award-submission application:

Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form is a book, published in February 2016, written by Will Thalheimer, PhD, President of Work-Learning Research, Inc.

The book reviews research on smile sheets (learner feedback forms), demonstrates the limitations of traditional smile sheets, and provides a completely new formulation on how to design and deploy smile sheets.

The ideas in the book -- and the example questions provided -- help learning professionals focus on "learning effectiveness" in supporting post-learning performance. Where traditional smile sheets focus on learner satisfaction and the credibility of training, Performance-Focused Smile Sheets can also focus on science-of-learning factors that matter. Smile sheets can be transformed by focusing on learner comprehension, factors that influence long-term remembering, learner motivation to apply what they've learned, and after-learning supports for learning transfer and application of learning to real-world job tasks.

Smile sheets can also be transformed by looking beyond Likert-like responses and numerical averages that dumb down our metrics and lead to bias and paralysis. We can go beyond meaningless averages ("My course is a 4.1!") and provide substantive information to ourselves and our stakeholders.

The book reviews research showing that so-called "learner-centric" formulations are filled with dangers -- learners don't always know how they learn best. Smile-sheet questions must support learners in making smile-sheet decisions, not introduce biases that warp the data.

For decades our industry has been mired in the dishonest and disempowering practice of traditional smile sheets. Thankfully, a new approach is available to us.

Sure! I'd love to see my work honored. More importantly, I'd love to see the ideas from my book applied wisely, improved, and adopted for training evaluation, student evaluations, conference evaluations, etc. 

You can help by sharing, by piloting, by persuading, by critiquing and improving! That will be my greatest award!

Smile Sheet Questions — New Examples July 2016

The response to the book, Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form, has been tremendous! Since February, when it was published, I've received hundreds of thank-yous from folks the world over who are thrilled to have a new tool -- and to finally have a way to get meaningful data from learner surveys. At the ATD conference where I spoke recently, the book was so popular it sold out! If you want to buy the book, the best place is still SmileSheets.com, the book's website.

Since publication, I've begun a research effort to learn how companies are utilizing the new smile-sheet approach -- and to learn what's working, what the roadblocks are, and what new questions they've developed. As I said in the book in the chapter that offers 26 candidate questions, I hope that folks tailor questions, improve them, and develop new questions. This is happening, and I couldn't be more thrilled. If your company is interested in being part of my research efforts, please contact me by clicking here. Likewise, if you've got new questions to offer, let me know as well.

Avoiding Issues of Traditional Smile Sheets

Traditional smile sheets tend to focus on learners' satisfaction and learners' assessments of the value of the learning experience. Scientific research shows us that such learner surveys are not likely to be correlated with learning results. Performance-focused smile sheets offer several process improvements:

  1. Avoid Likert-like scales and numerical scales, which create a garbage-in, garbage-out problem, don't offer clear delineations between answer choices, don't support respondent decision-making, and leave responses open to bias.
  2. Instead, utilize concrete answer choices, which give respondents more granularity and enable much more meaningful results (see the sketch after this list).
  3. In addition to, or instead of, focusing on factors related to satisfaction and perceived value, target factors that are related to learning effectiveness.
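To make improvement 2 concrete, here is a hedged sketch in Python (the answer labels and the 70% standard are hypothetical, not taken from the book): report the percentage of respondents choosing each concrete answer option, then compare the share at "competent or better" against a standard agreed on in advance.

    # Tally concrete answer choices and compare against a pre-set standard.
    from collections import Counter

    responses = [  # one hypothetical answer choice per respondent
        "needs more training", "needs more experience", "fully competent",
        "fully competent", "fully competent", "expert", "needs more experience",
        "fully competent", "expert", "fully competent",
    ]

    counts = Counter(responses)
    n = len(responses)
    options = ["not at all able", "needs more training",
               "needs more experience", "fully competent", "expert"]
    for option in options:
        print(f"{option:>22}: {counts[option] / n:6.1%}")  # Counter returns 0 for unseen options

    STANDARD = 0.70  # hypothetical acceptability threshold, set before the course ran
    competent_or_better = (counts["fully competent"] + counts["expert"]) / n
    print("Meets standard" if competent_or_better >= STANDARD else "Below standard")

Reporting per-option percentages against a pre-set standard keeps the conversation on what the results mean, rather than on whether a 4.1 beats a 3.9.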

 

New Example Questions

As new questions come to my attention, I'll share them here on my blog and elsewhere. You can sign up for my email newsletter if you want to increase the likelihood that you'll see new smile-sheet questions (and for other learning-research related information as well).

Please keep in mind that there are no perfect assessment items, no perfect learning metrics, and no perfect smile-sheet questions. I've been making improvements to my own workshop smile sheets for years, and every time I update them, I find improvements to make. If you see something you don't like in the questions below, that's wonderful! When evaluating an assessment item, it's useful to ask whether the item (1) targets something important and (2) is better than other items that we could use or that we've used in the past.

 

Question Example -- A Question for Learners' Managers

My first example comes from a world-renowned data and media company. They decided to take one of the book's candidate questions, which was designed for learners to answer, and modify it to ask learners' managers instead. Their reasoning: the training is strategically important to their business, and they wanted to go beyond self-report data. Also, they wanted to send "stealth messages" to learners' managers that they, as managers, had a role to play in ensuring application of the training to the job.

Here's the question (aimed at learners' managers):

In regard to the course topics taught, HOW EFFECTIVELY WAS YOUR DIRECT REPORT ABLE to put what he/she learned into practice in order to PERFORM MORE EFFECTIVELY ON THE JOB?

 

A. He/she is NOT AT ALL ABLE to put the concepts into practice.

B. He/she has GENERAL AWARENESS of the concepts taught, but WILL NEED MORE TRAINING / GUIDANCE to put the concepts into practice.

C. He/she WILL NEED MORE HANDS-ON EXPERIENCE to be fully competent in using the concepts taught.

D. He/she is at a FULLY COMPETENT LEVEL in using the concepts taught.

E. He/she is at an EXPERT LEVEL in using the concepts taught.

 

Question Example -- Tailoring a Question to the Topic

In writing smile-sheet questions, there's a tradeoff between generalization and precision. Sometimes we need a question to be relevant to multiple courses. We want to compare courses to one another. Personally, I think we overvalue this type of comparison, even when we might be comparing apples to oranges. For example, do we really want to compare scores on courses that teach such disparate topics as sexual harassment, word processing, leadership, and advanced statistical techniques? Still, there are times when such comparisons make sense.

The downside of generalizability is that we lose precision. Learners are less able to calibrate their answers. Analyzing the results becomes less meaningful. Also, learners see the learner-survey process as less valuable when questions are generic, so they give less energy and thought to answering the questions, and our data become less valuable and more biased.

Here is a question I developed for my own workshop (on how to create better smile sheets, by the way SMILE):

How READY are you TO WRITE QUESTIONS for a Performance-Focused Smile Sheet?

CIRCLE ONE OR MORE ANSWERS

AND/OR WRITE YOUR REASONING BELOW

 

A. I’m STILL NOT SURE WHERE TO BEGIN.

B. I KNOW ENOUGH TO GET STARTED.

C. I CAN TELL A GOOD QUESTION FROM A BAD ONE.

D. I CAN WRITE MY OWN QUESTIONS, but I’d LIKE to get SOME FEEDBACK before using them.

E. I CAN WRITE MY OWN QUESTIONS, and I’m CONFIDENT they will be reasonably WELL DESIGNED.

More Thoughts? _______________________________

 

 

 

Note several things about this question. First, to restate: it is far more tailored than a generic question could be. It encourages more thoughtful responding and creates more meaningful feedback.

Second, you might wonder why all the CAPS! I advocate CAPS because (1) CAPS have been shown in research to slow reading speed. Too often, our learners burn through our smile-sheet questions. Anything we can do to make them attend more fully is worth trying. Also, (2) respondents often read the full question and then skim back over it when determining how to respond. I want them to have an easy way to parse the options. Full disclosure: to my knowledge, all CAPS has not yet been studied for smile sheets. At this point, my advocacy for all CAPS is based on my intuition about how people process smile-sheet questions. If you'd like to work with me to test this in a scientifically rigorous fashion, please contact me.

Third, notice the opportunity for learners to write clarifying comments. Open-ended questions, though not easily quantifiable, can be the most important questions on smile sheets. They can provide intimate granularity -- a real sense of the respondents' perceptions. In these questions, we're using a hybrid format: a forced-choice question followed by an open-ended opportunity for clarification. This gives us the benefits of open-ended responding while also letting us capture clarifying detail. In addition, it provides a reality check on our question design: if we notice folks responding in ways that aren't afforded by the answer choices given, we can improve our question for later versions.

 

Question Example -- Simplifying The Wording

In writing smile-sheet questions, there's another tradeoff to consider. More words add more precision, but fewer words add readability and motivation to engage the question fully. In the book, I talk about what I once called "The World's Best Smile Sheet Question." I liked it partly because the answer choices were more precise than a Likert-like scale. It did have one drawback: it used a lot of words. For some audiences this might be fine, but for others it might be entirely inappropriate.

Recently, in working with a company to improve their smile sheet, a first draft included the so-called World's Best Smile Sheet Question. But they were thinking of piloting the new smile sheet for a course to teach basic electronics to facilities professionals. Given the topic and audience, I recommended a simpler version:

How able will you be to put what you’ve learned into practice on the job?  Choose one.

A. I am NOT AT ALL ready to use the skills taught.
B. I need MORE GUIDANCE to be GOOD at using these skills.
C. I need MORE EXPERIENCE to be GOOD at using these skills.
D. I am FULLY COMPETENT in using these skills.
E. I am CAPABLE at an EXPERT LEVEL in using these skills.

This version nicely balances precision with word count.

 

Question Example -- Dealing with the Sticky Problem of "Motivation"

In the book, I advocate a fairly straightforward question asking learners about their motivation to apply what they've learned. In many organizations -- in many organizational cultures -- this will work fine. However, in others, our trainers may be put off by this. They'll say, "Hey, I can't control people's motivations." They're right, of course. They can't control learners' motivations, but they can influence them. Still, it's critical to realize that motivation is a multidimensional concept. When we speak of motivation, we could be talking simply about a tendency to take action. We could be talking about how inspired learners are, how much they believe in the value of the concepts, or how much self-efficacy they have. It's okay to ask about motivation in general, but you might generate clearer data if you ask about one of the sub-factors that make up motivation.

Here is a question I developed recently for my Smile-Sheet Workshop:

How motivated are you to IMPLEMENT PERFORMANCE-FOCUSED SMILE SHEETS in your organization?

CIRCLE ONLY ONE ANSWER. ONLY ONE!

 

A. I’m NOT INTERESTED IN WORKING TOWARD IMPLEMENTING this.

B. I will confer with my colleagues to SEE IF THERE IS INTEREST.

C. I WILL ADVOCATE FOR performance-focused smile sheet questions.

D. I WILL VIGOROUSLY CHAMPION performance-focused smile sheet questions.

E. Because I HAVE AUTHORITY, I WILL MAKE THIS HAPPEN.

More Thoughts? _______________________________

 

In this question, I'm focusing on people's predilection to act. I've sidestepped the problems of asking learners to divulge their internal motivational state, and instead focused the question on the likelihood that they will utilize their newly learned knowledge in developing, deploying, and championing performance-focused smile sheets.

 

Final Word

It's been humbling to work on smile-sheet improvements over many years. My earlier mistakes are still visible in the digital files on my hard drive. I take solace in making incremental improvements -- and in knowing that the old way of creating smile-sheet questions is simply no good at all, as it provides us with perversely irrelevant information.

As an industry -- and the learning industry is critically important to the world -- we really need to work on our learning evaluations. Smile sheets are just one tool in this. Unfortunately, poorly constructed smile sheets have become our go-to tool, and they have led us astray for decades.

I hope you find value in my book (SmileSheets.com). More importantly, I hope you'll participate along with some of the world's best-run companies and organizations in developing improved smile-sheet questions. Again, please email me with your questions, your question improvements, and, alternatively, examples of poorly crafted questions as well.

Smile-Sheet Workshop in Suffolk, VA — June 10th, 2016

OMG! The best deal ever for a full-day workshop on how to radically improve your smile-sheet designs! Sponsored by the Hampton Roads Chapter of ISPI. Free book and subscription-learning thread too!

 

Friday, June 10, 2016

Reed Integration

7007 Harbour View Blvd #117

Suffolk, VA

 

Click here to register now...

 

Performance Objectives:

By completing this workshop and the after-course subscription-learning thread, you will know how to:

  1. Avoid the three most troublesome biases in measuring learning.

  2. Persuade your stakeholders to improve your organization’s smile sheets.

  3. Create more effective smile sheet questions.

  4. Create evaluation standards for each question to avoid bias.

  5. Envision learning measurement as a bulwark for improved learning design.

 

Recommended Audience:

The content of this workshop will be suitable for those who have at least some background and experience in the training field. It will be especially valuable to those who are responsible for learning evaluation or who manage the learning function.

 

Format:

This is a full-day workshop. Participants are encouraged to bring laptops if they prefer to use a computer to write their questions.  

 

Bonus Take-Away:

Each participant will receive a copy of Dr. Thalheimer’s book, Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form.

Podcast with Connie Malamed

Connie Malamed, one of the learning industry's most insightful thinkers when it comes to the visual aspects of instructional design, has produced a series of podcasts interviewing some of the most thoughtful folks in the field. In her most recent podcast, she interviews me about my book, Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form.

Click here to listen to (or download) the podcast...


Connie is a real pro, as you can hear in her podcasts. She not only finds great people to interview, but she crafts the interview questions after having done her homework -- and she edits the podcasts herself! A labor of love, it seems. Indeed, for our interview, Connie had clearly read my book and came ready with insights of her own.

Let me recommend Connie's book, Visual Design Solutions: Principles and Creative Inspiration for Learning Professionals, which I'm in the middle of reading right now. As designers of learning, we are not maximizing our effectiveness without bringing design principles and visual aesthetics into our work.

Click here to learn more about Connie's book...