Save your clients from themselves

“Our job is to give the client what they want.” Sound familiar?

It’s what I was told when I started. But decades later, I’d say this instead:

“Our job is to make the client look good.”

Often this means, “Our job is to save our clients from themselves.”

Make them look good…

Which manager looks good? The one who helps staff do their work well and feel proud of it, or the one who makes everyone sit through a zombie presentation followed by a quiz that a garden gnome could pass?

If we want to make our clients look good, we can’t just give them what they think they want.

…by saving them from themselves.

“Give me a heart transplant, and use this knife to do it.” No surgeon would agree to this. It goes against their ethics.

“Make me a course, and use this tool to do it.” Like a surgeon, we should have ethics. We should at least vow to do no harm.

Creating a course and making everyone sit through it does harm when it doesn’t solve the problem. We not only waste money and time; we also disrespect the client, employees, organization, and our own profession.

Our goal should be to leave our clients in better shape than when they came to us. The only way to do this is to diagnose their problem and help them solve it. They’ll look good, and we’ll be heroes.

Stalk your client

To know what will make our client look good, we need to know who they are and what challenges they’re facing. For example, we could:

  • Look at their LinkedIn profile and posts. How long have they been in their current position? What’s their background?
  • Look for internal announcements or news releases about their department.
  • Look up their company in Wikipedia or business databases. How is it performing? Any recent controversies or changes?
  • Check Glassdoor for staff reviews about their division or company.

For example, in the Jedi Mind Tricks toolkit (now available), we have a practice client called Carla. She wants an online course for managers. The course is supposed to teach them how to use the (fictional, but only barely) Soto-Baldwin personality inventory to “become more empathetic.”

You could obey and crank out the course. But if you spend a few minutes learning more about Carla, you discover that going ahead with this idea would damage an important relationship. You’d also waste everyone’s time with a dubious personality test.

Your challenge is to help Carla see this for herself.

Then, do the analysis!

Once we understand where the client is coming from, we can help them analyze the problem.

Even if you can only invest two hours, you’ll have a chance to steer your client in the right direction.

Toolkit now available

These first steps with the client are the focus of the Jedi Mind Tricks toolkit. It gives you the direction I wish I had when I first started in this field.

You’ll change how you talk to stakeholders so you can help them solve problems and improve lives. You’ll stop being an order taker and move toward performance consulting.

You’ll have tons of realistic practice and a unique system of real-world tasks. The 30 tasks have you improve the forms you use, write what you’ll say in meetings, practice with colleagues, and establish procedures to permanently change how you work.

You can also use the toolkit as a mega-job aid as you start a new project. In each section, you can write notes in the toolkit, recording how you’ll apply the techniques to your current client. You can download these notes as a custom PDF. Later, you can reuse the notes fields and download new PDFs as many times as you want for new projects.

Check out the toolkit here!

Discuss this post on LinkedIn.

Photo credit: New Jersey National Guard Flickr via Compfight cc

Burn your training request form

If your organization is typical, you have a training request form. Look at it now. It probably commits 10,000 sins.

For example, it might ask the client to:

  • Identify the format and length of “the training”
  • List the content that should be included
  • Specify the date and location for “the training”
  • Identify the number of people to be “trained”

With this form, you’re saying, “My job is to produce whatever training you want, whether or not it will actually work.” It turns you into a worker in a course factory.

If you want to have a real impact and win the respect of your organization, you need to set your clients’ expectations from the start.

1. Not “training” but “development”

If you must have a form, call it something like “development request.” Make clear your job is to improve performance, not create training on demand.

2. Don’t use the forbidden words

Throughout the form, avoid terms that refer to a specific solution. There is no solution yet. You won’t decide whether training is part of the solution until you’ve analyzed the problem.

For example, don’t use these terms in the form:

  • course
  • training
  • workshop
  • content
  • blended
  • assessment

3. Ask about the problem instead

Ask about the issue that the client is seeing. You might use questions like these:

  • What change would you like to achieve with this project?
  • How will you know that the change has been achieved?
  • Who will change what they’re doing as a result of this project?
  • How will the organization benefit?
  • What has been tried in the past to bring about this change?

Your goal is to get an idea of the possible business goal and how the client currently views the problem. Both could change during your discussions with the client.

Template available soon

A development request form that you can adapt will soon be available as part of a toolkit.

Jedi mind tricks for training designers has been a popular presentation, and now it’s grown up into the first toolkit of several that I’m developing.

The toolkit is a menu-driven series of challenges, guidance, downloads, and real-world tasks that will help you start projects right and avoid creating information dumps. Learn more and sign up to be notified when it’s available.

Discuss this post on LinkedIn.

Learning objectives: Our frenemy

“Never design anything without first writing the learning objectives.”

We all know this. It’s a useful rule, but only when the objectives are useful.

And there’s the problem — conventional learning objectives can work against us. They’re our friends, but not always.

What do I mean by “conventional learning objectives”? This sort of thing:

  • List the three steps of widget certification.
  • Explain the difference between widget certification and widget approval.
  • Describe the widget supply chain.

Here are three questions that will help you set boundaries with our frenemy.

1. Do people actually need to “learn” something?

Conventional learning objectives might be your friends if both of the following are true.

  1. A knowledge or skill gap clearly exists.
  2. Closing that gap will help solve the problem.

Is there really a knowledge or skill gap? Maybe the problem is mostly caused by bad tools, an inefficient process, lack of incentives, or some other environmental issue. With your client and SME, first identify what people need to do on the job, and then walk through this flowchart before agreeing to develop training.

Will closing the gap solve the problem? Maybe it’s true that people don’t know the intricacies of the supply chain, but installing that information in their brains won’t make them better widget polishers. Don’t deliver content just because someone told you to.

(In the Asia-Pacific region? This workshop on Jan. 23 will help you stand up to content-obsessed clients.)

2. Are we writing useful objectives for the formal training bits?

If our analysis shows that we really do need to design a learning experience, then, yes, we need objectives. Are the actions we wrote earlier good enough, or should we let learning objectives elbow their way into our project?

Here’s an example from my book.

Let’s say that we want firefighters to educate the public about preventing forest fires and quickly put these fires out when they occur. Our official goal is, “Fire-related losses in our region will decrease 10% by next year as firefighters prevent and extinguish brush and forest fires.”

Which of the following do you think I’d accept as actions to reach this goal?

a) Identify the techniques used to extinguish a brush fire
b) List the principal sources of combustion in a deciduous forest
c) Describe common public misconceptions about campfires
d) Quickly extinguish a brush fire in a dry mixed forest
e) Define “incendiary device”

If you said that only d, “Quickly extinguish a brush fire,” was an action by my standards, you’ve gotten my point.

An action is something you see someone do as a normal part of their job. It doesn’t take place in their head or in that abstract world I call Testland. The action should be the focus of our analysis and design, and it should be the statement we use to “sell” the material to the stakeholders and learners.

“But the other statements are good objectives!”

In the world of conventional instructional design, the other statements are also observable objectives.

For example, we can watch a firefighter write a list of the techniques used to extinguish a brush fire, and we can point at that list and say, “See? They know it.” And that’s the problem — we’re just measuring whether they know it. There’s no guarantee that the firefighter will actually apply this knowledge, which is what we really want and what we should be helping them do.

“Identify the techniques” is an enabling objective. It describes information necessary to perform the action. It goes in the information part of the map — I’d list “techniques to extinguish a brush fire” as required knowledge that’s subordinate to the action about putting out fires.

Our goal is to create realistic, contextual practice activities. We can do that only if we focus on what people need to do. If instead we let knowledge-based objectives distract us, we’ll create the usual information dump followed by a quiz, which is the approach that helps make us irrelevant.

3. Who needs to see the objectives?

The client

If you’re using action mapping, your client helped create the list of actions, so they’re already familiar with them. If you need to submit a formal document, I recommend an outline rather than a big design-everything-at-once document. (See this big interactive graphic of the action mapping workflow.)

In that outline, you can include your action map, which shows the actions and the information required by each. The actions are your main objectives, and the bits of information represent the knowledge that supports those objectives.

If your client wants to see conventional learning objectives, consider listing your actions as “performance objectives.” Then, indented and subordinate to each performance objective, list its enabling objectives.

I resist writing the enabling objectives using test language (“describe, explain, define…”) because that sets the expectation that there will be a knowledge test. Maybe some of the knowledge doesn’t need to be memorized and could instead be included in a job reference. It won’t be tested, so there’s no reason to write a test-style objective about it.

Or maybe people do need to memorize some stuff, but a separate knowledge test would be artificial. Instead, you could assess with the same type of activities you provided for practice, which would test not only whether people know the information but whether they can apply it.

The learners

Briefly tell people what they’ll be able to do as a result of the activity, and focus on what they care about. Put those over-eager learning objectives on mute because they don’t know how to sound appealing.

  • Not this: “When interacting with a dissatisfied customer, appropriately apply the five steps of the Abrams-Martinson Dissatisfied-to-Satisfied Customer Transformation Model” that no one has heard of but a consultant convinced us to use
  • This instead: “Turn an unhappy customer into a happy one in 5 quick steps”

Again, I’m not talking just about courses. This applies to activities, which could (and maybe should) be provided individually, on demand. Each activity that stands alone should quickly make clear what people will practice doing and how they’ll benefit.

For more on the distinction between an action and an enabling objective, see Why you want to focus on actions, not learning objectives.


Online workshop Jan. 23

In the Asia-Pacific region? Learn to control your clients’ minds (sort of!) on January 23. In this 90-minute interactive workshop, you’ll change how you talk to stakeholders and manage the initial project meetings. You’ll stop being an order taker and instead steer clients toward solutions that solve problems and improve lives.

Learning Technologies UK

I’ll present a shorter version of the Jedi Mind Tricks workshop in my Feb. 23 session at Learning Technologies UK.

 


“It’s new, so everyone needs training on it.” Nope.

“We’re introducing something new,” your client says. “So of course everyone needs to be trained on it.”

Hmmm. Really?

Maybe your client is thinking this: “This new thing is so bizarrely new that no adult Earthling could possibly figure it out without formal training.”

Or maybe they’re really thinking this: “This new thing is a pain in my neck and I don’t know how to introduce it. I’ll have L&D train everyone and call it a day.”

Either way, the client is expecting you to unleash an avalanche of “training” on innocent people who would rather just do their jobs.

Stop that avalanche right now with your Jedi mind tricks. (Learn those tricks online on Jan. 23 or in my session at Learning Technologies UK.)

Example

“Please train everyone on the new TPS software by June 1,” your client says.

The client expects to hear, “Sure. I’m on it!” Instead, offer an innocent “why?”

“Why are you installing new TPS software?” you ask.

“Because people were messing up their reports in the old software,” your client says.

“Why were people messing up their reports in the old software?”

“It was confusing to use,” your client says. “The new software walks people through the process a lot more clearly.”

“So the new software is easier to use?”

“Yeah, a lot easier.”

“And everyone who will be using it is already familiar with the old software?”

“Yep. They’ve all been entering TPS reports for years.”

At this point, do you agree with the client that everyone needs “training” on the new software? I hope not.

You might propose this: Give the new software to a few typical TPS report creators and watch them figure it out. Their struggles (or lack of struggle) will show what support they really need. A help screen or short reference is likely to be enough “training” in this case.

Use a goal to focus on performance, not training

If you’re using action mapping, you’ll want your client to give you a measurable business goal that justifies the expense of the project.

In our example, the client’s first goal was, “TPS software training is 100% complete by June 1.” This goal is measurable, but it doesn’t show how the organization will benefit. It also gets way ahead of itself by assuming that training is the solution.

Your innocent questions help the client see their real goal. This might be, “TPS error rates decrease 35% by June 1 as all TPS staff correctly use the new software.”

This goal doesn’t assume that training is the answer, and it justifies the expense of the project in terms the organization cares about. It also leaves room for many solutions, including job aids.

What about new products?

“We’re releasing a new product,” your client says. “Please train all employees on it.”

What are the two biggest problems with this request? I’d say:

1. The client assumes training is necessary.
2. They think “everyone” needs training. They’re planning a sheep dip.

Your (polite! helpful!) questions should steer the client to this:

  • The reason the product was created in the first place. What organizational improvement is the product supposed to achieve? For example, are we trying to snag some market share from a competitor? Help your client and SMEs focus on meeting that organizational goal, which may or may not require training.
  • A breakdown of the “trainees” by job role (sales, support, repair…)
  • The very specific tasks a typical person in each role needs to perform with the product (sell it using X technique, explain it when a customer asks Y, repair it when it does Z…)
  • The real-world barriers that might make each major task difficult, including problems with tools, time, social pressures, communications, management support…
  • The (many!) solutions that could be put in place to remove those barriers

Then, if some training does seem to be necessary, it will be far more targeted and useful.

You could use a similar approach for customer training for a new product:

  • Why was the product released? What problem does it solve for customers? What does it do for us as an organization?
  • Do we have different types of customers? How can we categorize them?
  • What does each category of customer DO with the product? What are the major tasks they perform?
  • What could make each of those tasks difficult? How could we make it easier?

Related articles

What to do if they just want “awareness”

How to design software training, part 1: Do everything except “train”

Is training really the answer? Ask the flowchart.


How to make mandatory training relevant

“How can we make mandatory training more than a tick box exercise?”

That’s the top topic voted by blog readers, so here’s my take.

For “mandatory training,” I’m picturing any material that says some version of “Follow these rules.”

It’s sheep-dip training. Everyone must be “exposed” to it, and a checkmark records that they have been exposed.

How can we make it more relevant?

1. Disobey

A client who says “Everyone must be trained on X” needs our resistance, not our obedience.

Help the client by asking questions, such as:

  • What problems are you seeing? Has something happened? Has someone sued?
  • Was this problem caused by one rogue employee, or is it a bigger issue? Is it limited to a group of employees, or is it really a problem that all employees are causing equally?
  • What are we currently measuring that will improve once everyone is “trained”?

If there’s really no problem, we shouldn’t create a solution. We need to focus on improving performance, not guarding against problems that experience has shown aren’t likely to occur.

2. Set a goal

If it’s clear there really is a need for “training,” or some force far outside your control insists on “training,” then put on your action mapping hat and push for a measurable goal. Here’s one model to follow.

action mapping goal template

For details, see How to create a training goal in 2 quick steps.

3. Narrow your focus

Make sure your audience is specific. “All employees” is not specific.

If you’re required by forces beyond your control to create something for all employees, you can at least break down the audience by major job roles as described next.

4. Do the analysis. Really. DON’T SKIP THIS.

Focus on one job role in your audience. Ask your client and SME what these people need to do, in specific, observable terms, to meet the goal.

“Follow the data security policy” isn’t specific. This is specific:

  • When you must physically transfer data to another location, put the data on a BrandZ thumb drive using HakrPruf encryption and chain it to your left ankle.

Prioritize the actions. Choose a high-priority one, and ask, “What makes this one thing hard to do?” Use the flowchart.

Again, you’re doing this for a specific group of people in a specific job, and you’re focusing on specific, observable behaviors. You’re not asking this once for the entire “course,” and you’re not talking about all employees in every job everywhere.

If those forces far beyond your control insist on applying the same solution to everyone, do this analysis for the major job roles. You probably won’t have a ton of time to do this, but even two hours can save you and everyone else from a much bigger waste of time in the form of irrelevant and ignored materials.

Then, if training is part of the solution, you can have people use only the activities that apply to their job.

Don’t skip this.

If you skip this analysis, what do you have to work with? Generic rules that are guaranteed to become an information dump.

Instead, if you look closely at what people need to do and why they aren’t doing it, you get:

  • Ways to fix the problem that don’t require “training”
  • Ideas for ways to help people practice the tricky parts
  • Respect for the intelligence and experience of the people currently doing the job (notably lacking from most compliance training)

5. Base your design on job tasks, not information

Yes, people need to know stuff. But they need to know stuff in order to do stuff. Design first for what they need to do.

Provide the need-to-know information in the format it’s used on the job. Let people pull the information just like they will on the job.

Here’s a fictional example. Extraterrestrials have landed and are being incorporated into earthling families. As a result, employers have created alien leave policies. Here’s a mini-scenario for managers.

Mini-scenario for alien leave

To answer this question, what information does the manager need? The alien leave policy. How should we provide it?

The traditional approach would be to first present a bunch of slides about the policy. Then we’d give people a chance to “apply” what they’ve “learned” by having them use their short-term memory to answer the question.

Lots of slides followed by activity

But why design slides to present information that’s already in a policy on the intranet?

Instead, we can plunge people into the activity and let them use the policy just like they will on the job.

Instead of presentation, just an activity that links to info

Same activity with link to policy

And now that we aren’t developing lots of information slides, we can create more activities. Since the activities aren’t trapped inside an information presentation, they can travel alone. For example, we can provide them individually over time (spaced practice) as described in this post.

6. Sell it with a prototype

Create a prototype of one typical activity and show it to the stakeholders. Make clear that people will see only the activities that apply to their job. They’ll pull information rather than recognizing what they saw three slides ago, and they’ll learn from the consequences of their choices.

Letting the stakeholders see for themselves how you plan to provide the “training” puts you in a good position to respond to the following common concerns.

“But everyone must be exposed to all the information!”

Give each option unique feedback. In that feedback, first show the consequence of the choice — continue the story.

Then show the snippet of information they should have looked at, as described in How to really involve learners. Do this for all consequences, not just the poor ones.

See more ideas and examples in Scenario mistakes to avoid: Eager-beaver feedback.

If you have a stakeholder who’s determined to expose everyone, you can point out that they are now exposed. They’re just exposed after making a relevant decision, rather than in a forgettable presentation.

By not presenting information first, you’re helping people see their own knowledge gaps. They’re not pulling stuff out of short-term memory, because you haven’t put anything there. They have to rummage around in their existing knowledge, look at the policy just like they would in real life, make a choice, and learn from the consequences. They get deeper learning, plus they’re dutifully “exposed” to the correct information.
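For developers who build this kind of activity in code rather than in an authoring tool, the structure described above can be sketched as data: each option carries its own consequence (the story continues) plus a pointer to the policy snippet the learner should have consulted. This is purely an illustrative sketch; the names, URLs, and policy details are all invented.

```python
# Hypothetical sketch of a pull-based mini-scenario. Every option, good or
# poor, gets a consequence first and the relevant policy snippet second.
# All names, URLs, and policy details are invented for illustration.

from dataclasses import dataclass

@dataclass
class Option:
    text: str          # the choice shown to the learner
    consequence: str   # what happens next in the story
    policy_ref: str    # anchor for the relevant snippet of the policy
    good: bool         # whether this was a good decision

@dataclass
class MiniScenario:
    prompt: str
    policy_url: str    # learners can open this while deciding, as on the job
    options: list

scenario = MiniScenario(
    prompt="Zorg's host employee asks for three weeks of leave to help "
           "Zorg adjust. What do you tell her?",
    policy_url="https://intranet.example.com/policies/alien-leave",
    options=[
        Option("Approve the full three weeks.",
               "Her teammates expect the same, and HR flags the approval: "
               "the policy caps adjustment leave at ten days.",
               "#adjustment-leave", good=False),
        Option("Check the adjustment-leave clause, then approve ten days.",
               "She's disappointed but understands, and the decision holds "
               "up when HR reviews it.",
               "#adjustment-leave", good=True),
    ],
)

def feedback(scenario, choice):
    """Consequence first (continue the story), then the policy snippet,
    for every option, not just the poor ones."""
    opt = scenario.options[choice]
    return f"{opt.consequence}\nSee the policy: {scenario.policy_url}{opt.policy_ref}"
```

Because the information lives in the policy link rather than in slides, adding a new option means writing one more consequence, not another presentation.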

“But they have to prove that they know it!”

Which approach is more likely to avoid lawsuits about misuse of the alien leave policy?

A. Present the policy over several slides. Then require a knowledge test to see if people can recognize a bit of information that they saw 5 minutes ago. If they can, they “pass.” If they can’t, they must put those same slides back in their short-term memory and try again.

B. Present challenges in which people need to make the same decisions they make on the job. Provide the information in the same format that people will have it on the job. Start with easy-ish decisions and increase the challenge. If people make good decisions in enough activities, they’re free to go. If they make not-good decisions, they get more activities and optional help until they make good decisions.
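The “free to go after enough good decisions” logic in approach B is simple enough to sketch in code. This is a hypothetical illustration only; the mastery threshold, difficulty scores, and activity names are invented, and a real implementation would live in your LMS or activity player.

```python
# Hypothetical sketch of approach B: decisions ordered from easier to harder,
# and the learner is free to go after enough consecutive good decisions.
# Threshold and difficulty values are invented for illustration.

def practice_until_mastery(activities, decide, required_streak=3, max_rounds=20):
    """Cycle through activities (easiest first) until the learner makes
    `required_streak` good decisions in a row, or rounds run out.
    `decide(activity)` returns True for a good decision."""
    ordered = sorted(activities, key=lambda a: a["difficulty"])
    streak, rounds, i = 0, 0, 0
    while rounds < max_rounds:
        activity = ordered[i % len(ordered)]
        rounds += 1
        if decide(activity):
            streak += 1
            if streak >= required_streak:
                return {"mastered": True, "rounds": rounds}
        else:
            streak = 0  # a poor decision means more practice and optional help
        i += 1
    return {"mastered": False, "rounds": rounds}

activities = [{"name": "easy case", "difficulty": 1},
              {"name": "ambiguous case", "difficulty": 2},
              {"name": "edge case", "difficulty": 3}]

# A simulated learner who always decides well finishes in exactly three rounds.
result = practice_until_mastery(activities, decide=lambda a: True)
```

The point of the sketch is the reset: unlike a one-shot quiz, a poor decision doesn't fail anyone. It just routes them into more practice.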

My point

Don’t design for “They should know the rules.” Design for “They should correctly apply the rules on the job.”

For lots more, see my book and just about everything in this blog, especially the following posts.

Credits

Photo of Jorge: David Boyle in DC via Compfight cc

All other images: Cathy Moore

How to design software training, part 1: Do everything except “train”

“How can I design good training for new software? What’s the right balance between help screens, job aids, and training?”

That’s the top question in our idea collector, so let’s go at it.

Here’s part 1, in which I say, “Try everything but training.” We’ll cover needs analysis, job aids, help screens, and the radical idea of making the software easier to use.

Later, I’ll publish part 2. In that part, we’ll look at how to design practice activities for times when the everything-but-training approach isn’t enough.

 

1. Justify your work with a measurable goal

Action mapping begins with identifying how the organization will benefit from the project. The goal justifies the existence of the “training” (and of your job).

Here’s a template:
action mapping goal template

To identify the measure for new software, you might ask, “Why did we buy this software? What problem will it solve? How will we know it worked?”

For example, if the organization is installing a new customer relationship management (CRM) system, why? Have potential new customers been slipping through the cracks? If so, maybe your goal is this:

Sales to new customers will increase 5% by [date] as all sales reps correctly use NewCRM to identify new prospects and build relationships with them.

If your stakeholders refuse to commit to a business-performance goal, you might try a weaker but still useful type, such as measuring the strain that new software can put on productivity. For example, if the new software is already in place and causing confusion, you could aim to reduce the number of calls to the help desk.

If you doubt the usefulness of this kind of goal, imagine the alternative. A too-common “goal” is “All sales reps will be trained on NewCRM by [date].” This says, “We don’t care if the training actually works. We’ll just take attendance. Now give us money.”

For more on setting goals, see:

 

2. Ask, “What do they use the software to DO?”

List the most common tasks that people will use the software to complete. Also consider what might make each task difficult.

For example, is it obvious how to complete the task in the software? Are people working under time constraints?

This flowchart helps you consider all the angles.

For more on this, see the discussion in “Technical training: What do they need to DO?”

 

3. View training with suspicion

Many people assume that every new system must be introduced with formal training. But is that always necessary?

“New” doesn’t mean “requires training”

Just because the software is new doesn’t mean that people need to be trained on it. The likelihood that training will be necessary depends on a lot of things, such as:

  • How different is the software from what they’re using now?
  • How tech savvy are the users?
  • How complex are the tasks that the software is used to complete?
  • How horrible is the outcome if someone screws up when using the software?
  • How clumsy is the software interface?
  • How much help is built into the software?

“Hard to use” doesn’t mean “requires training”

If the software is clumsy and its help system is unhelpful, that doesn’t mean that you have to develop training. It means the software should be made easier to use.

L&D staff are often surprised to discover that they can request changes to software — but they have to ask. Don’t assume that it’s too late to change anything.

If the software is from a third party, making it easier to use would help their sales. If the software was developed internally, there’s no excuse for refusing to make it easier. Clumsy software hurts performance.

Make a list of changes that will reduce the need for training. Take screenshots and scribble on them. Write the help blurbs that are missing. Point out where there are too many steps.

“They won’t change it” doesn’t mean “requires training”

If the software is hard to use and the developers have rejected your requested changes, that still doesn’t mean that formal training is your only hope. How about some job aids?

 

4. Try some job aids

A job aid is a reference that gives you just enough information. It can be a piece of paper, a page on the intranet, a short how-to video, a help screen, or anything else.

We can use Moore’s Machete of Awkward Oversimplification to divide software job aids into two groups.

Type 1: Task-based job aids

These are handy guides that quickly tell you how to complete job tasks using the software. Some examples:

  • A short article in a knowledgebase shows you how to record a partial refund in the accounting software.
  • A quick video shows you how to create a template for your marketing emails.
  • The built-in help system highlights the commands you need to use to escalate a customer complaint in the CRM.
  • An extensive, structured document helps you install WordPress, as deconstructed by Dave Ferguson in his handy site about job aids.

These are all aids that help you complete tasks. They aren’t the painfully ubiquitous tour of the menus or alphabetical list of all commands.

Type 2: Code and command references

If users need to type in codes or non-intuitive commands, group the most common ones in a quick reference. As above, group the commands by what they achieve rather than listing them alphabetically.
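As a purely hypothetical illustration (the post doesn't name any particular software), here's a sketch of a quick reference stored as tasks mapped to commands, then rendered group by group instead of alphabetically. The commands shown (git) and the groupings are invented examples.

```python
# Hypothetical sketch: a quick reference organized by what the commands
# achieve, not by alphabetical order. All commands and groupings are
# invented examples.

QUICK_REFERENCE = {
    "Undo a mistake": [
        ("git restore notes.txt", "discard uncommitted changes to one file"),
        ("git revert <commit>", "safely undo a commit that was already pushed"),
    ],
    "See what changed": [
        ("git diff", "show unstaged changes"),
        ("git log --oneline -10", "show the last ten commits, one per line"),
    ],
}

def render(reference):
    """Render the reference as plain text, one task group at a time."""
    lines = []
    for task, commands in reference.items():
        lines.append(task.upper())
        for command, purpose in commands:
            lines.append(f"  {command:<22} # {purpose}")
        lines.append("")
    return "\n".join(lines)

print(render(QUICK_REFERENCE))
```

Grouping by task means a user who thinks “I need to undo this” finds the command without knowing its name, which is the whole point of a job aid.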

Let people choose how much information they need

Don’t force-feed everyone with information in your job aids. Let experts skip ahead while novices read deeply. Here’s an example from Dave Ferguson of a job aid designed to satisfy both groups.

Help screen or job aid?

A common practice is to use a help screen to give a quick overview of how to complete a small task. For longer tasks or more detail, the help screen could contain a link to a video, knowledgebase article, or other reference.

People should be able to use the software and view the reference at the same time. For example, a reference that opens to the side is way more useful than one that opens in the same window as the software and blocks it.

For a lot more about job aids from real job aid experts, see Job Aids and Performance Support by Allison Rossett and Lisa Schafer.

 

5. Test your job aids before designing training

Test your job aids on a sample of future users and tweak them as necessary.

If it looks like the job aids alone will get people up to speed, release them into the wild. Tell everyone where they are, make them super-easy to find for the people who missed the memo, and provide a quick feedback mechanism so users can tell you how to improve them.

Let the job aids do their thing for a while, and then check the measurement in your goal. Has it improved? Also, have managers reported better use of the software? Has the help desk seen a decrease in the number of calls? If so, you might be done. You can make sure you’re done by using Robert Brinkerhoff’s Success Case Method.

In part 2, we’ll look at what you might consider if you decide formal training is necessary.


 

Listen to me rant

Upcoming talks and interviews

March 7, online: Join the discussion about action mapping, scenarios, and who knows what else at TLDCast. 11 AM US Eastern time / 4 PM GMT.

May 16-18, Auckland, New Zealand: I’ll be presenting at the NZATD Conference along with many interesting colleagues. In an informal post-conference session, we’ll discuss ninja mind tricks that will help us understand and change our clients’ mindsets and move us out of the order-taking role.

Listen to the recording

Interview: Jo Cook recently asked me good questions in this five-part interview. I ranted a bit about instructional design degree programs and pointed out that spaced learning gives advantages to trainers as well as learners, in addition to covering some action mapping concepts.

Interview: Connie Malamed also asked good questions about action mapping and the state of the instructional design world for her podcast. We considered whether instructional designers should be responsible for “solving all workplace problems,” the advantages of branching scenarios, the many ways to deliver stand-alone activities, and when action mapping isn’t appropriate.

Webinar with lively chat: Many blog readers and I recently talked about three ways to motivate learners in a chat-filled webinar. View the recording and try some sample activities.