Adobe Captivate 2017: Powerful!

I am very excited to be using Adobe Captivate 2017.  What an outstanding tool!  I tried the 30-day trial a few months ago, and while I was impressed, I felt I needed more time to really consider what I could do with such a powerful tool.  After purchasing Adobe Captivate 2017, reviewing the short tutorials, and taking a course on LYNDA.com, I feel it will give me everything I need to do a great job on my next eLearning project.  The design, power, and ease of use of Adobe Captivate 2017 have prompted me to consider several new eLearning projects for the near future.  Finally, a great plus is that help, along with access to users with cutting-edge ideas, is only a click away in the Adobe Captivate community.

Converting eLearning Content: 3 Benefits And 6 Top Tips For Switching From Flash To HTML5

Isn't it about time to say goodbye to Flash players in your eLearning course design? In this article, I'll explore 3 benefits of converting eLearning content from Flash to HTML5, as well as 6 top tips for making the switch.

This post was first published on eLearning Industry.

8 Critical Questions To Consider When Building A Business Case For Learning Technology Systems

Just as every difficult decision requires a well-structured process, building a business case for learning technology systems involves drawing up a list of questions that will determine which type of LMS you choose. In this article, I’ll highlight 8 critical questions you need to consider when building a business case for learning technology systems.

This post was first published on eLearning Industry.

GUEST POST by LAUREL NORRIS: Robust Responses to Open-Ended Questions: Good Surveys Prime Respondents to Think Critically

This is a guest post by Laurel Norris (https://twitter.com/neutrinosky).

Laurel is a Training Specialist at Widen Enterprises, where she is involved in developing and delivering training, focusing on data, reporting, and strategy.

--------------------------------------------------

Robust Responses to Open-Ended Questions: Good Surveys Prime Respondents to Think Critically

By Laurel Norris

--------

I’ve always been a fan of evaluation. It’s a way to better understand the effectiveness of programs, determine if learning objectives are being met, and reveal ways to improve web workshops and live trainings.

Or so I thought.

It turns out that most evaluations don’t do those things. Performance-Focused Smile Sheets (the book is available at http://SmileSheets.com) taught me that, and when I implemented its recommendations, I discovered something interesting. Using Dr. Thalheimer’s method improved the quality and usefulness of my survey data and provided me with much more robust responses to open-ended questions.

By more robust, I mean that respondents revealed what was helpful and why, talked about the challenges they expected when trying it themselves, discussed which areas could use more emphasis, and shared where they would have appreciated more examples. In short, they provided a huge amount of useful information.


Before using Dr. Thalheimer’s method, only a few open-ended responses were helpful. Most were along the lines of “Thanks!”, “Good webinar”, or “Well presented”. While those kinds of answers make me feel good, they don’t help me improve trainings.

I’m convinced that the improved survey primed people to be more engaged with the evaluation process and enabled them to easily provide useful information to me.

So what did I do differently? I’ll use real examples from web workshops I conducted. Both workshops ran around 45 minutes, and each had 30 responses to the end-of-workshop survey. They did differ in style, something I will discuss towards the end of this article.

 

The Old Method

Let’s talk about the old method, what Dr. Thalheimer might call a traditional smile sheet. It was (luckily) short, with three multiple-choice questions and two open-ended ones. The multiple-choice questions included:

  • How satisfied are you with the content of this web workshop?
  • How satisfied are you with the presenter's style?
  • How closely did this web workshop align with your expectations?

Participants answered the questions with options on Likert-like scales ranging from “Very Unsatisfied” to “Very Satisfied” or “Not at all Closely” to “Very Closely”. Of course, in true smile-sheet style, the multiple-choice questions yielded no useful information. On average, people were “4.1” satisfied with the content of the webinar, “data” which did not enable me to make any useful changes to the information I provided.
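As a quick illustration of why a lone average is so hard to act on, here is a minimal sketch with made-up response counts (not my actual survey data): two very different groups of respondents can land on exactly the same “4.1”.

```python
def likert_mean(counts):
    """counts[i] = number of respondents choosing option i+1 on a 1-to-5 scale."""
    total = sum(counts)
    return sum((i + 1) * n for i, n in enumerate(counts)) / total

# Hypothetical distributions, 30 respondents each (illustrative only).
mostly_satisfied = [0, 1, 3, 18, 8]   # clustered around "Satisfied"
polarized        = [2, 0, 2, 15, 11]  # a few clearly unhappy respondents

print(likert_mean(mostly_satisfied))  # 4.1
print(likert_mean(polarized))         # 4.1 as well, despite the unhappy minority
```

Both groups report the same mean, but only one of them contains respondents I would want to follow up with.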

Open-ended questions invited people to “Share your ideas for web workshop topics” and offer “Additional Comments”. Of the thirteen open-ended responses I got, five provided useful information. The rest were either a thank you or some form of praise.

 

The New Method

Respondents were asked four multiple-choice questions that gauged the effectiveness of the web workshop, how much the concepts would help them improve work outcomes, how well they understood the concepts taught, and whether they would use the skills they learned in the workshop at their job.

The web workshop was about user engagement, in particular how administrators can increase engagement with the systems they manage. The questions were:

  • In regard to user engagement, how able are you to put what you’ve learned into practice on the job?
  • From your perspective, how valuable are the concepts taught in the workshop? How much will they help improve engagement with your site?
  • How well do you feel you understand user engagement?
  • How motivated will you be to utilize these user engagement skills at your work?

The response options were specific and adapted from Dr. Thalheimer’s book. For example, here were the options for the question “In regard to user engagement, how able are you to put what you’ve learned into practice on the job?”

  • I'm not at all able to put the concepts into practice.
  • I have general awareness of the concepts taught, but I will need more training or practice to complete user engagement projects.
  • I am able to work on user engagement projects, but I'll need more hands-on experience to be fully competent in using the concepts taught.
  • I am able to complete user engagement projects at a fully competent level in using the concepts taught.
  • I am able to complete user engagement projects at an expert level in using the concepts taught.

All four multiple-choice questions had similarly complete options to choose from. From those responses, I was able to more accurately determine the effectiveness of the workshop and whether my training content was performing as expected.
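To show what I mean, here is a small sketch with made-up answers and a made-up decision rule (not my actual survey results, and not Dr. Thalheimer’s own analysis method): counting respondents at each labeled competence level points directly at what to change, in a way an averaged score cannot.

```python
from collections import Counter

# Shorthand labels for the five response options listed above (illustrative only).
OPTIONS = [
    "not at all able",
    "aware, needs more training or practice",
    "able, needs more hands-on experience",
    "fully competent",
    "expert",
]

# Hypothetical answers from a 30-person workshop.
answers = (
    ["aware, needs more training or practice"] * 6
    + ["able, needs more hands-on experience"] * 14
    + ["fully competent"] * 8
    + ["expert"] * 2
)

tally = Counter(answers)
for option in OPTIONS:
    print(f"{option}: {tally[option]}")

# Example decision rule: if a third or more of respondents still need training
# or practice, rework the content; if most only need hands-on experience,
# add practice activities or follow-up exercises instead.
needs_training = tally["not at all able"] + tally["aware, needs more training or practice"]
if needs_training / len(answers) >= 1 / 3:
    print("Rework the workshop content.")
elif tally["able, needs more hands-on experience"] / len(answers) >= 1 / 3:
    print("Add more practice or follow-up exercises.")
```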

The open-ended question was relatively bland. I asked “What else would you like to share about your experience during the webinar today?” and received twelve specific, illuminating responses, such as:

“Loved the examples shown from other sites. Highly useful!”

“It validated some of the meetings I have had with my manager about user engagement and communication about our new site structure. It will be valuable for upcoming projects about asset distribution throughout the company.”

“I think the emphasis on planning the plan is helpful. I think I lack confidence in designing desk drops for Design teams. Also - I'm heavily engaged with my users now as it is - I am reached out to multiple times per day...but I think some of these suggestions will be valuable for more precision in those engagements.”

Even responses that didn’t give me direct feedback on the workshop, like “Still implementing our site, so a lot of today's content isn't yet relevant”, gave me information about my audience.

 

Conclusion

Clearly, I’m thrilled with the kind of information I am getting from using Dr. Thalheimer’s methods. I get useful, rich data from respondents that helps me better evaluate my content and understand my audience.

There is one factor that might have skewed the data in the new method’s favor. I designed the second web workshop after I read the book, and Dr. Thalheimer’s Training Effectiveness Taxonomy influenced its design. I thought more about the goals for the workshop, provided cognitive supports, repeated key messages, and did some situation-action triggering.

Based on those changes, the second web workshop was probably better than the first, and it’s possible that the higher-quality, more engaging workshop contributed to the robust responses to open-ended questions I saw.

Either way, my evaluations (and learner experiences) have been revolutionized. Has anyone seen a similar improvement in open-ended response rates since implementing performance-focused smile sheets?

 

4 charts on how people around the world see education

An interesting little question from the American Pew Research Center: “… which is more important to emphasize in school: creative thinking or basic academic skills and discipline”? More people in advanced economies favor creativity and independent thinking, and among them more on the political left than on the right, with the differences largest in the US and Great Britain and smallest in Germany and Spain. And finally: “Younger people in most advanced economies are the most supportive of education that emphasizes creative and independent thinking.” For the record.
Laura Silver, Pew Research Center, 28 August 2017

Image source: Ricardo Viana (Unsplash)