Save your clients from themselves

[Image: Two National Guard team members dangle from a cable below a helicopter]

“Our job is to give the client what they want.” Sound familiar?

It’s what I was told when I started. But decades later, I’d say this instead:

“Our job is to make the client look good.”

Often this means, “Our job is to save our clients from themselves.”

Make them look good…

Which manager looks good? The one who helps staff do their work well and feel proud of it, or the one who makes everyone sit through a zombie presentation followed by a quiz that a garden gnome could pass?

If we want to make our clients look good, we can’t just give them what they think they want.

…by saving them from themselves.

“Give me a heart transplant, and use this knife to do it.” No surgeon would agree to this. It goes against their ethics.

“Make me a course, and use this tool to do it.” Like a surgeon, we should have ethics. We should at least vow to do no harm.

Creating a course and making everyone sit through it does harm when it doesn’t solve the problem. We not only waste money and time, we disrespect the client, employees, organization, and our own profession.

Our goal should be to leave our clients in better shape than when they came to us. The only way to do this is to diagnose their problem and help them solve it. They’ll look good, and we’ll be heroes.

Stalk your client

To know what will make our client look good, we need to know who they are and what challenges they’re facing. For example, we could:

  • Look at their LinkedIn profile and posts. How long have they been in their current position? What’s their background?
  • Look for internal announcements or news releases about their department.
  • Look up their company in Wikipedia or business databases. How is it performing? Any recent controversies or changes?
  • Check Glassdoor for staff reviews about their division or company.

For example, in the Jedi Mind Tricks toolkit (now available), we have a practice client called Carla. She wants an online course for managers. The course is supposed to teach them how to use the (fictional, but only barely) Soto-Baldwin personality inventory to “become more empathetic.”

You could obey and crank out the course. But if you spend a few minutes learning more about Carla, you discover that going ahead with this idea would damage an important relationship. You’d also waste everyone’s time with a dubious personality test.

Your challenge is to help Carla see this for herself.

Then, do the analysis!!!

Once we understand where the client is coming from, we can help them analyze the problem.

Even if you can only invest two hours, you’ll have a chance to steer your client in the right direction.

Toolkit now available

These first steps with the client are the focus of the Jedi Mind Tricks toolkit. It gives you the direction I wish I had when I first started in this field.

You’ll change how you talk to stakeholders so you can help them solve problems and improve lives. You’ll stop being an order taker and move toward performance consulting.

You’ll have tons of realistic practice and a unique system of real-world tasks. The 30 tasks have you improve the forms you use, write what you’ll say in meetings, practice with colleagues, and establish procedures to permanently change how you work.

You can also use the toolkit as a mega-job aid as you start a new project. In each section, you can write notes in the toolkit, recording how you’ll apply the techniques to your current client. You can download these notes as a custom PDF. Later, you can reuse the notes fields and download new PDFs as many times as you want for new projects.

Check out the toolkit here!

Discuss this post on LinkedIn.

Photo credit: New Jersey National Guard Flickr via Compfight cc

Burn your training request form

If your organization is typical, you have a training request form. Look at it now. It probably commits 10,000 sins.

[Image: Training request form on fire]

For example, it might ask the client to:

  • Identify the format and length of “the training”
  • List the content that should be included
  • Specify the date and location for “the training”
  • Identify the number of people to be “trained”

With this form, you’re saying, “My job is to produce whatever training you want, whether or not it will actually work.” It turns you into a worker in a course factory.

If you want to have a real impact and win the respect of your organization, you need to set your clients’ expectations from the start.

1. Not “training” but “development”

If you must have a form, call it something like “development request.” Make clear your job is to improve performance, not create training on demand.

2. Don’t use the forbidden words

Throughout the form, avoid terms that refer to a specific solution. There is no solution yet. You won’t decide whether training is part of the solution until you’ve analyzed the problem.

For example, don’t use these terms in the form:

  • course
  • training
  • workshop
  • content
  • blended
  • assessment

3. Ask about the problem instead

Ask about the issue that the client is seeing. You might use questions like these:

  • What change would you like to achieve with this project?
  • How will you know that the change has been achieved?
  • Who will change what they’re doing as a result of this project?
  • How will the organization benefit?
  • What has been tried in the past to bring about this change?

Your goal is to get an idea of the possible business goal and how the client currently views the problem. Both could change during your discussions with the client.

Template available soon

A development request form that you can adapt will soon be available as part of a toolkit.

Jedi mind tricks for training designers has been a popular presentation, and now it’s grown up into the first toolkit of several that I’m developing.

The toolkit is a menu-driven series of challenges, guidance, downloads, and real-world tasks that will help you start projects right and avoid creating information dumps. Learn more and sign up to be notified when it’s available.

Discuss this post on LinkedIn.

Learning objectives: Our frenemy

[Image: Frenemies]

“Never design anything without first writing the learning objectives.”

We all know this. It’s a useful rule, but only when the objectives are useful.

And there’s the problem — conventional learning objectives can work against us. They’re our friends, but not always.

What do I mean by “conventional learning objectives”? This sort of thing:

  • List the three steps of widget certification.
  • Explain the difference between widget certification and widget approval.
  • Describe the widget supply chain.

Here are three questions that will help you set boundaries with our frenemy.

1. Do people actually need to “learn” something?

Conventional learning objectives might be your friends if both of the following are true.

  1. A knowledge or skill gap clearly exists.
  2. Closing that gap will help solve the problem.

Is there really a knowledge or skill gap? Maybe the problem is mostly caused by bad tools, an inefficient process, lack of incentives, or some other environmental issue. With your client and SME, first identify what people need to do on the job, and then walk through this flowchart before agreeing to develop training.

Will closing the gap solve the problem? Maybe it’s true that people don’t know the intricacies of the supply chain, but installing that information in their brains won’t make them better widget polishers. Don’t deliver content just because someone told you to.

(In the Asia-Pacific region? This workshop on Jan. 23 will help you stand up to content-obsessed clients.)

2. Are we writing useful objectives for the formal training bits?

If our analysis shows that we really do need to design a learning experience, then, yes, we need objectives. Are the actions we wrote earlier good enough, or should we let learning objectives elbow their way into our project?

Here’s an example from my book.

Let’s say that we want firefighters to educate the public about preventing forest fires and quickly put these fires out when they occur. Our official goal is, “Fire-related losses in our region will decrease 10% by next year as firefighters prevent and extinguish brush and forest fires.”

Which of the following do you think I’d accept as actions to reach this goal?

a) Identify the techniques used to extinguish a brush fire
b) List the principal sources of combustion in a deciduous forest
c) Describe common public misconceptions about campfires
d) Quickly extinguish a brush fire in a dry mixed forest
e) Define “incendiary device”

If you said that only d, “Quickly extinguish a brush fire,” was an action by my standards, you’ve gotten my point.

An action is something you see someone do as a normal part of their job. It doesn’t take place in their head or in that abstract world I call Testland. The action should be the focus of our analysis and design, and it should be the statement we use to “sell” the material to the stakeholders and learners.

“But the other statements are good objectives!”

In the world of conventional instructional design, the other statements are also observable objectives.

For example, we can watch a firefighter write a list of the techniques used to extinguish a brush fire, and we can point at that list and say, “See? They know it.” And that’s the problem — we’re just measuring whether they know it. There’s no guarantee that the firefighter will actually apply this knowledge, which is what we really want and what we should be helping them do.

“Identify the techniques” is an enabling objective. It describes information necessary to perform the action. It goes in the information part of the map — I’d list “techniques to extinguish a brush fire” as required knowledge that’s subordinate to the action about putting out fires.

Our goal is to create realistic, contextual practice activities. We can do that only if we focus on what people need to do. If instead we let knowledge-based objectives distract us, we’ll create the usual information dump followed by a quiz, which is the approach that helps make us irrelevant.

3. Who needs to see the objectives?

The client

If you’re using action mapping, your client helped create the list of actions, so they’re already familiar with them. If you need to submit a formal document, I recommend an outline rather than a big design-everything-at-once document. (See this big interactive graphic of the action mapping workflow.)

In that outline, you can include your action map, which shows the actions and the information required by each. The actions are your main objectives, and the bits of information represent the knowledge that supports those objectives.

If your client wants to see conventional learning objectives, consider listing your actions as “performance objectives.” Then, indented and subordinate to each performance objective, list its enabling objectives.

I resist writing the enabling objectives using test language (“describe, explain, define…”) because that sets the expectation that there will be a knowledge test. Maybe some of the knowledge doesn’t need to be memorized and could instead be included in a job reference. It won’t be tested, so there’s no reason to write a test-style objective about it.

Or maybe people do need to memorize some stuff, but a separate knowledge test would be artificial. Instead, you could assess with the same type of activities you provided for practice, which would test not only whether people know the information but whether they can apply it.

The learners

Briefly tell people what they’ll be able to do as a result of the activity, and focus on what they care about. Put those over-eager learning objectives on mute because they don’t know how to sound appealing.

  • Not this: “When interacting with a dissatisfied customer, appropriately apply the five steps of the Abrams-Martinson Dissatisfied-to-Satisfied Customer Transformation Model” (which no one has heard of, but a consultant convinced us to use it)
  • This instead: “Turn an unhappy customer into a happy one in 5 quick steps”

Again, I’m not talking just about courses. This applies to activities, which could (and maybe should) be provided individually, on demand. Each activity that stands alone should quickly make clear what people will practice doing and how they’ll benefit.

For more on the distinction between an action and an enabling objective, see Why you want to focus on actions, not learning objectives.


Online workshop Jan. 23

[Image: Jedi Mind Tricks for learning designers]

In the Asia-Pacific region? Learn to control your clients’ minds (sort of!) on January 23. In this 90-minute interactive workshop, you’ll change how you talk to stakeholders and manage the initial project meetings. You’ll stop being an order taker and instead steer clients toward solutions that solve problems and improve lives.

Learning Technologies UK

I’ll present a shorter version of the Jedi Mind Tricks workshop in my Feb. 23 session at Learning Technologies UK.

 

“It’s new, so everyone needs training on it.” Nope.

“We’re introducing something new,” your client says. “So of course everyone needs to be trained on it.”

[Image: New thing requires 13 hours of training]

Hmmm. Really?

Maybe your client is thinking this: “This new thing is so bizarrely new that no adult Earthling could possibly figure it out without formal training.”

Or maybe they’re really thinking this: “This new thing is a pain in my neck and I don’t know how to introduce it. I’ll have L&D train everyone and call it a day.”

Either way, the client is expecting you to unleash an avalanche of “training” on innocent people who would rather just do their jobs.

Stop that avalanche right now with your Jedi mind tricks. (Learn those tricks online on Jan. 23 or in my session at Learning Technologies UK.)

Example

“Please train everyone on the new TPS software by June 1,” your client says.

The client expects to hear, “Sure. I’m on it!” Instead, offer an innocent “why?”

“Why are you installing new TPS software?” you ask.

“Because people were messing up their reports in the old software,” your client says.

“Why were people messing up their reports in the old software?”

“It was confusing to use,” your client says. “The new software walks people through the process a lot more clearly.”

“So the new software is easier to use?”

“Yeah, a lot easier.”

“And everyone who will be using it is already familiar with the old software?”

“Yep. They’ve all been entering TPS reports for years.”

At this point, do you agree with the client that everyone needs “training” on the new software? I hope not.

You might propose this: Give the new software to a few typical TPS report creators and watch them figure it out. Their struggles (or lack of struggle) will show what support they really need. A help screen or short reference is likely to be enough “training” in this case.

Use a goal to focus on performance, not training

If you’re using action mapping, you’ll want your client to give you a measurable business goal that justifies the expense of the project.

In our example, the client’s first goal was, “TPS software training is 100% complete by June 1.” This goal is measurable, but it doesn’t show how the organization will benefit. It also gets way ahead of itself by assuming that training is the solution.

Your innocent questions help the client see their real goal. This might be, “TPS error rates decrease 35% by June 1 as all TPS staff correctly use the new software.”

This goal doesn’t assume that training is the answer, and it justifies the expense of the project in terms the organization cares about. It also leaves room for many solutions, including job aids.

What about new products?

“We’re releasing a new product,” your client says. “Please train all employees on it.”

What are the two biggest problems with this request? I’d say:

1. The client assumes training is necessary.
2. They think “everyone” needs training. They’re planning a sheep dip.

Your (polite! helpful!) questions should steer the client to this:

  • The reason the product was created in the first place. What organizational improvement is the product supposed to achieve? For example, are we trying to snag some market share from a competitor? Help your client and SMEs focus on meeting that organizational goal, which may or may not require training.
  • A breakdown of the “trainees” by job role (sales, support, repair…)
  • The very specific tasks a typical person in each role needs to perform with the product (sell it using X technique, explain it when a customer asks Y, repair it when it does Z…)
  • The real-world barriers that might make each major task difficult, including problems with tools, time, social pressures, communications, management support…
  • The (many!) solutions that could be put in place to remove those barriers

Then, if some training does seem to be necessary, it will be far more targeted and useful.

You could use a similar approach for customer training for a new product:

  • Why was the product released? What problem does it solve for customers? What does it do for us as an organization?
  • Do we have different types of customers? How can we categorize them?
  • What does each category of customer DO with the product? What are the major tasks they perform?
  • What could make each of those tasks difficult? How could we make it easier?

Related articles

What to do if they just want “awareness”

How to design software training, part 1: Do everything except “train”

Is training really the answer? Ask the flowchart.

How to make mandatory training relevant

[Image: Compliance training sheep]

“How can we make mandatory training more than a tick box exercise?”

That’s the top topic voted by blog readers, so here’s my take.

For “mandatory training,” I’m picturing any material that says some version of “Follow these rules.”

It’s sheep-dip training. Everyone must be “exposed” to it, and a checkmark records that they have been exposed.

How can we make it more relevant?

1. Disobey

A client who says “Everyone must be trained on X” needs our resistance, not our obedience.

Help the client by asking questions, such as:

  • What problems are you seeing? Has something happened? Has someone sued?
  • Was this problem caused by one rogue employee, or is it a bigger issue? Is it limited to a group of employees, or is it really a problem that all employees are causing equally?
  • What are we currently measuring that will improve when everyone is “trained”?

If there’s really no problem, we shouldn’t create a solution. We need to focus on improving performance, not guarding against problems that experience has shown aren’t likely to occur.

2. Set a goal

If it’s clear there really is a need for “training,” or some force far outside your control insists on “training,” then put on your action mapping hat and push for a measurable goal. Here’s one model to follow.

[Image: action mapping goal template]

For details, see How to create a training goal in 2 quick steps.

3. Narrow your focus

Make sure your audience is specific. “All employees” is not specific.

If you’re required by forces beyond your control to create something for all employees, you can at least break down the audience by major job roles as described next.

4. Do the analysis. Really. DON’T SKIP THIS.

Focus on one job role in your audience. Ask your client and SME what these people need to do, in specific, observable terms, to meet the goal.

“Follow the data security policy” isn’t specific. This is specific:

  • When you must physically transfer data to another location, put the data on a BrandZ thumb drive using HakrPruf encryption and chain it to your left ankle.

Prioritize the actions. Choose a high-priority one, and ask, “What makes this one thing hard to do?” Use the flowchart.

Again, you’re doing this for a specific group of people in a specific job, and you’re focusing on specific, observable behaviors. You’re not asking this once for the entire “course,” and you’re not talking about all employees in every job everywhere.

If those forces far beyond your control insist on applying the same solution to everyone, do this analysis for the major job roles. You probably won’t have a ton of time to do this, but even two hours can save you and everyone else from a much bigger waste of time in the form of irrelevant and ignored materials.

Then, if training is part of the solution, you can have people use only the activities that apply to their job.

Don’t skip this.

If you skip this analysis, what do you have to work with? Generic rules that are guaranteed to become an information dump.

Instead, if you look closely at what people need to do and why they aren’t doing it, you get:

  • Ways to fix the problem that don’t require “training”
  • Ideas for ways to help people practice the tricky parts
  • Respect for the intelligence and experience of the people currently doing the job (notably lacking from most compliance training)

5. Base your design on job tasks, not information

Yes, people need to know stuff. But they need to know stuff in order to do stuff. Design first for what they need to do.

Provide the need-to-know information in the format it’s used on the job. Let people pull the information just like they will on the job.

Here’s a fictional example. Extraterrestrials have landed and are being incorporated into earthling families. As a result, employers have created alien leave policies. Here’s a mini-scenario for managers.

[Image: Mini-scenario for alien leave]

To answer this question, what information does the manager need? The alien leave policy. How should we provide it?

The traditional approach would be to first present a bunch of slides about the policy. Then we’d give people a chance to “apply” what they’ve “learned” by having them use their short-term memory to answer the question.

[Image: Lots of slides followed by activity]

But why design slides to present information that’s already in a policy on the intranet?

Instead, we can plunge people into the activity and let them use the policy just like they will on the job.

[Image: Instead of presentation, just an activity that links to info]

[Image: Same activity with link to policy]

And now that we aren’t developing lots of information slides, we can create more activities. Since they aren’t trapped inside an information presentation, they can travel alone. For example, we can provide them individually over time (spaced practice) as described in this post.
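
If you were building a stand-alone activity like this in a branching tool such as Twine (the post doesn’t say which tool was used, so treat this as a sketch), it might look like the following Twee passages. The situation, the placeholder intranet URL, and the feedback text are all invented for illustration; the point is that the activity links to the policy instead of presenting it first.

    :: Alien leave request
    Bleem, who joined your team last month, asks for five days of leave to molt.
    The <a href="https://intranet.example/alien-leave" target="_blank">alien leave policy</a> is one click away, just as it is on the job.

    [[Approve all five days->Approved]]
    [[Approve two days and ask Bleem to take vacation for the rest->Partial]]
    [[Decline the request->Declined]]

    :: Approved
    Bleem returns re-scaled and grateful. (In this invented policy, molting leave is capped at five days, so this was allowed.)

    :: Partial
    Bleem molts at work on day three. Housekeeping is not pleased. (The invented policy treats molting leave separately from vacation.)

    :: Declined
    Bleem files a grievance, and HR points you to the molting clause you skipped.

Each choice leads to its own consequence passage, so the activity can stand alone and be delivered whenever it’s needed.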

6. Sell it with a prototype

Create a prototype of one typical activity and show it to the stakeholders. Make clear that people will see only the activities that apply to their job. They’ll pull information rather than recognizing what they saw three slides ago, and they’ll learn from the consequences of their choices.

Letting the stakeholders see for themselves how you plan to provide the “training” puts you in a good position to respond to the following common concerns.

“But everyone must be exposed to all the information!”

Give each option unique feedback. In that feedback, first show the consequence of the choice — continue the story.

Then show the snippet of information they should have looked at, as described in How to really involve learners. Do this for all consequences, not just the poor ones.

See more ideas and examples in Scenario mistakes to avoid: Eager-beaver feedback.

If you have a stakeholder who’s determined to expose everyone, you can point out that they are now exposed. They’re just exposed after making a relevant decision, rather than in a forgettable presentation.

By not presenting information first, you’re helping people see their own knowledge gaps. They’re not pulling stuff out of short-term memory, because you haven’t put anything there. They have to rummage around in their existing knowledge, look at the policy just like they would in real life, make a choice, and learn from the consequences. They get deeper learning, plus they’re dutifully “exposed” to the correct information.

“But they have to prove that they know it!”

Which approach is more likely to avoid lawsuits about misuse of the alien leave policy?

A. Present the policy over several slides. Then require a knowledge test to see if people can recognize a bit of information that they saw 5 minutes ago. If they can, they “pass.” If they can’t, they must put those same slides back in their short-term memory and try again.

B. Present challenges in which people need to make the same decisions they make on the job. Provide the information in the same format that people will have it on the job. Start with easy-ish decisions and increase the challenge. If people make good decisions in enough activities, they’re free to go. If they make not-good decisions, they get more activities and optional help until they make good decisions.

My point

Don’t design for “They should know the rules.” Design for “They should correctly apply the rules on the job.”

For lots more, see my book and just about everything in this blog, especially the following posts.

Credits

Photo of Jorge: David Boyle in DC via Compfight cc

All other images: Cathy Moore

Two examples of interactive job aids

I talk a lot about using Twine for branching scenarios, but it’s also useful for creating interactive job aids. Here are two examples.

Diagnostic tool: Is this a gnome or what?

Want to help people diagnose a problem or identify the best person to contact? Be inspired by this fun example created by Krishan Coupland in Twine: A Primer on the Capture and Identification of the Little Folk of Myth and Legend.

This is basically a text-based flowchart, sending you down paths depending on the characteristics of the creature you’re trying to identify.

This type of interaction has a lot of potential uses in business, where the little folk might be replaced by types of tools, people to contact, troubleshooting steps to follow, or any other type of flowchart-y decision.

I’d prefer a quicker, visual flowchart when possible, but this text-based approach lets you include more detail.
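
To make the structure concrete, here’s a minimal sketch of that kind of diagnostic in Twine’s Twee notation. The scenario (a troubleshooting guide), the passage names, and the advice are invented for illustration; the point is just the branching link pattern.

    :: Start
    A report won't submit. What do you see?

    [[An access-denied message->Permissions]]
    [[The page times out->Network]]
    [[Something else->Help desk]]

    :: Permissions
    Ask your manager to request the submitter role for you.
    Already have it? Go to the [[Help desk]].

    :: Network
    Try the wired connection at your desk. Still stuck? Go to the [[Help desk]].

    :: Help desk
    Email support with a screenshot of the error message.

Each passage either gives an answer or sends the reader further down the path, which is all a text-based flowchart needs.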

Custom advice made scalable

[Image: Will action mapping work for my project?]

You can make Twine handle more complex decisions if you use variables.

One of the most common questions I get is, “Will action mapping work for my project?”

To relieve the pressure on my inbox, I created a Twine interaction that answers that question. I used variables to keep track of answers.

For example, you might say that your client is a non-profit organization ($org is “nonprofit”) and their goal is to make people feel confident or engaged ($goal is “feelings”). As a result, you’ll see advice tailored for non-profit organizations and feelings goals.

You might want to try the interaction before reading more, or the rest of the post won’t make much sense.

The variables are set when you answer questions at each decision point. Here’s how one decision point looks.

[Image: Question about type of organization]

Earlier, the user identified whether their client was external, internal, or themselves. Now they’re being asked what type of organization the client works for.

If they answer “Business,” a variable that tracks the action mapping score ($am) gets 2 more points. This score will help decide whether action mapping is appropriate.

If they answer “Non-profit or government,” the action mapping score increases by only 1, due to the goal-setting issues that often plague those types of organizations.

If they answer “University or school,” the action mapping score doesn’t increase, because it’s likely that action mapping won’t be appropriate. That will get decided in the next question, which asks who the audience is.
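
Here’s a hedged sketch of how a decision point like this can be written in Twine’s default Harlowe format, shown as Twee. The passage names and wording are invented, and the 2/1/0 scoring simply follows the description above; this isn’t the actual source of the tool.

    :: Start
    (set: $am to 0)
    [[Begin->Organization type]]

    :: Organization type
    What type of organization does the client work for?

    (link: "Business")[(set: $am to it + 2)(set: $org to "business")(go-to: "Audience")]
    (link: "Non-profit or government")[(set: $am to it + 1)(set: $org to "nonprofit")(go-to: "Audience")]
    (link: "University or school")[(set: $org to "school")(go-to: "Audience")]

Each link adjusts the running score, records the answer, and moves on to the next (hypothetical) “Audience” passage.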

The questions continue assigning variables and changing the action-mapping score. Users who want to prepare people for a knowledge test will be told early on that action mapping won’t be appropriate. Others will continue until they see the final advice screen.

The final screen uses the variables that have accumulated to display text tailored to each answer. Here’s a snippet.

[Image: Results screen showing variable use]

In the above excerpt, if you said that your goal was to have people do something differently, you get a confirmation that action mapping will help. However, if you said that your goal was for people to be aware of something, you get some advice on how to change that so you can use the model more successfully.

Several additional paragraphs of text appear on the advice screen, all based on the answers you gave earlier and the variables you were assigned as a result.
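
As a rough illustration of how the advice screen can assemble itself, here’s a minimal Harlowe sketch. The variable values, the score threshold, and the wording are placeholders based on the description above, not the real tool’s text.

    :: Results
    (if: $goal is "do")[Your goal is about what people will do differently, which is exactly what action mapping is designed for.]
    (if: $goal is "feelings")[A goal about feelings is hard to observe. Restate it as something people will do differently and action mapping will fit better.]
    (if: $org is "nonprofit")[Non-profit and government goals often take extra work to make measurable, so budget time for that discussion.]
    (if: $am >= 4)[Overall, action mapping looks like a good match for this project.]
    (if: $am < 4)[Overall, this project may not be a good fit for action mapping.]

Because each paragraph sits in its own (if:) hook, readers see only the advice that matches their earlier answers.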

This took some time to develop, but it has also saved a lot of time by reducing the number of questions I get. This kind of tool can reduce the need for training and relieve the pressure on help desks by providing instant answers tailored to each situation.


 

Scenario design: There are still some seats available

The June session of the live, online scenario design course still has some seats available. Learn to design scenarios by designing scenarios, with my personal feedback. There are sessions for people in the Australia-Pacific region as well as the Americas and Europe. Check it out.