Day 30: Showcase your #EdTech journey

The last day of the #JuneEdTechChallenge, and a chance for us all to be a little proud, put ourselves on a pedestal, and shout about our achievements.

Many colleagues and friends in these roles are not naturally good at shouting about their own achievements, so it will be interesting to see how people complete this part of the challenge. As will seeing what people remember of their own achievements and LT journey, and what others remember of it too. Indeed, what I remember will no doubt be different from what you remember, so please tell me your highlights of our time together, dear reader.

For me, there are a few stand-out moments from my 15+ years in learning technology, learning design, and a few more senior roles too.

Each day has been different and a challenge for many, many different reasons. One constant, however, has been both my need to grow and learn, and the network on Twitter and LinkedIn that has helped this journey – directly and indirectly. The list of those who have been on this journey with me continues to grow to this very day, and I thank you all!

Photo by davide ragusa on Unsplash

Change takes time

This week the UK has relaxed the lockdown restrictions, meaning we can hug family members and go to the pub. Actually in the pub. Should we want to.

Students can return to campus, and the talk now is all about the future of the university experience and the question of what we have learned about working, studying, or learning in a fully remote environment. We’ve had an unexpected insight into what our fully remote students have been dealing with for years.

Conversations abound about what we can learn from this to make campus better for students, both the physical campus and the virtual, and the spaces where they mix – see my notes from last week’s AulaCon for some ideas. Changing the infrastructure and systems/platforms to make the most of this learning and experience will not be easy to get right, and we certainly won’t get everything lined up as well as it is at the moment.

Here’s the crunch though … this kind of fundamental change takes time. Universities are not known for the speedy adoption of change, and certainly not at this level. Individuals, yes. Maybe even some dynamic or progressive departments, but not whole institutions. How an institution adapts to and adopts the change needed to survive in the changing market will determine how the current students view them, and how well they are positioned for new students (and not just the undergraduate market either, it’s all about lifelong learning now!).

I am looking forward to seeing how this will play out across the HE and FE sectors. Bring on the changes.

Photo by Ross Findon on Unsplash

52 things I learned in 2020

Inspired by Tom Whitwell’s annual collection of things learned, here are my ’52 things I learned in 2020′.

The list is usually presented with the caveat that there is ‘no explanation or context of what it is about the article I learned, just a title and link of something that was important to me personally or professionally in [year]’, but this year is different. So very different from previous years.

Clearly, it’s important for anyone reading this in years to come that 2020 saw the Covid-19 pandemic, the culmination of the US election, and Brexit in the UK. Many of the articles deliberately focus on my work and personal development; they concentrate on how we work and learn during a global health crisis, in and out of lockdown, etc. If nothing else, this is a great way to see how we thought, and what we thought about, at the different stages of how the pandemic developed and progressed (deliberately avoiding most things Brexit or election-related, for the sake of my sanity and yours):

  1. ‘Global apathy toward the fires in Australia is a scary portent for the future‘ [David Wallace-Wells]
  2. ‘Kindness doesn’t just make you healthier, It can actually slow aging’ [Jessica Stillman]
  3. ‘How digital activists around the world are trying to change the tone of social media’ [Douglas Quan]
  4. ‘Today I learned that not everyone has an internal monologue and it has ruined my day.’ [Ryan Langdon]
  5. ‘I’ve worked in a ‘virtual office’ for 3 years – here’s what I’ve learned’ [Mitch Robinson]
  6. ‘Could micro-credentials compete with traditional degrees?’ [Anisa Purbasari Horton]
  7. ‘Why Amazon knows so much about you’ [Leo Kelion]
  8. ‘What should universities do to prepare for COVID-19 coronavirus?’ [Doug Clow]
  9. ‘Is our relationship with digital technology true love or an unhealthy obsession?’ [Rachel Drinkwater]
  10. ‘Surviving to thriving’ [Doug Belshaw]
  11. ‘Covid-19 could cause a permanent shift towards home working’ [Alex Hern]
  12. ‘Coaching is even more important in a time of crisis’ [Ian Day]
  13. ‘The Critical Points: The upsides of quarantine’ [Richard Kerr]
  14. ‘“I can’t get motivated”: the students struggling with online learning’ [Rachel Hall and David Batty]
  15. ‘But somehow the vital connection is made’ [WonkHE]
  16. ‘A Whole New (Remote) World’ [Diane Gaa]
  17. ‘Coronavirus could revolutionize work opportunities for people with disabilities’ [Lisa Shur and Douglas Kruse]
  18. ‘The COVID-19 pandemic has changed education forever. This is how’ [Cathy Li and Farah Lalani]
  19. ‘Why ‘Let Me Know How I Can Help’ Doesn’t Work for Introverts’ [Bret Serbin]
  20. ‘This is how music helps us get through difficult times’ [Emily Ansari]
  21. ‘Students should be partners not passengers in the Covid community recovery’ [Ben Vulliamy]
  22. ‘Is Work-From-Home Productivity A Mirage?’ [Peter Bendor-Samuel]
  23. ‘Sorry Not Sorry: Online Teaching Is Here to Stay’ [Flower Darby]
  24. ‘The Disease of More’ [Mark Manson]
  25. ‘These are the 10 most discussed tech topics during COVID-19’ [Stephen Robnett and Trey Sexton]
  26. ‘Not desking is the horrendous new hot-desking hell that awaits us all’ [Bruce Daisley]
  27. ‘Fixing education during the pandemic means fixing an uneasy relationship with technology’ [Robert Pianta and Bart Epstein]
  28. ’15 historical predictions on what life would be like in 2020′ [Claudia Lyman]
  29. ‘Screw finding your passion’ [Mark Manson]
  30. ‘More than 100 scientific journals have disappeared from the Internet’ [Diana Kwon]
  31. ‘#100DaysToOffload’ [Kev Quirk]
  32. ‘How to (Actually) Save Time When You’re Working Remotely’ [Lauren Howe, Ashley Whillans, and Jochen Menges]
  33. ‘Connectivity and Dis-junction in the Post-Pandemic University: Preliminary Thoughts’ [Fadia Dakka]
  34. ‘Students don’t know what’s best for their own learning’ [Arthur Poropat]
  35. ‘Content and Design Are Inseparable Work Partners’ [Jared Spool]
  36. ‘Why our ocean could hold the best solutions to climate change’ [Emily Kelly and Elena Perez]
  37. ‘Learning Is a Learned Behavior. Here’s How to Get Better at It.’ [Ulrich Boser]
  38. ‘The race to find and stop viruses that could cause the next pandemic’ [David Adam]
  39. ‘Why Memorizing Stuff Can Be Good For You’ [Natalie Wexler]
  40. ‘Wicked problems: are universities really prepared to grow in the next decade?’ [Mark Corver, Debbie McVitty, and Tim Blackman]
  41. ‘Is time on the side of learning?’ [Neil Mosley]
  42. ‘Working from home was the dream but is it turning into a nightmare?’ [John Naughton]
  43. ‘Netflix’s Unlimited Vacation Policy Took Years to Get Right. It’s a Lesson in Emotional Intelligence’ [Justin Bariso]
  44. ‘A hybrid education format is sticking around. Here’s how we can improve the model’ [Anant Agarwal]
  45. ‘These 5 Rules Will Help You Work More Productively at Home’ [Nicole Avery]
  46. ‘University leaders come together for new digital strategy framework’ [JISC]
  47. ‘How People With Disabilities Help The Economy Grow And Thrive’ [Robyn Shulman]
  48. ‘Gaming might actually be good for your wellbeing, study suggests’ [Amy Barrett]
  49. ‘2020 has tested our humanity. Where do we go from here?’ [Phillip Morris]
  50. ‘How animals choose their leaders, from brute force to democracy’ [Brian Handwerk]
  51. ’25 moments in tech that defined the past 25 years’ [Fast Company Staff]
  52. ‘(Over)working from home: Why we need to get better at switching off when in remote mode’ [Rhodri Marsden]
  53. BONUS – ‘NASA’s Hubble Telescope Captures a Rare Metal Asteroid Worth 70,000 Times the Global Economy’ [Rachel Cormack]

Photo by Lucija Ros on Unsplash

What is wrong with (higher) education?

I don’t think I’ve linked to Clark Quinn here for a while. In this post he raises three points on which, in his view, most universities need to improve:

“… three pillars I think create a valid learning offer:
– a killer learning experience,
– being a partner in your success,
– and developing you as an individual.”

Here’s a short teaser hinting at what Clark Quinn has in mind with the ‘killer learning experience’: “My short (and admittedly cheeky) statement about education is that they’re wrong on two things, the curriculum and pedagogy, other than that they’re fine. Most universities aren’t doing a good job of curriculum, focusing on knowledge instead of skills.”
Clark Quinn, Learnlets, 20 October 2020

Image source: Nathan Dumlao (Unsplash)

Recommended Reading Summary: A Chapter from “How People Learn: Brain, Mind, Experience, and School”

I recently posted some recommended reading relating to a virtual class I taught on gamification.  (Here is the recording.)

This is my own summary of the first chapter on the list.  I highly recommend the entire book, which is available for free from the National Academies Press.  It was written in 2000 but it contains some great foundational information.

Chapter 1: “Learning: From Speculation to Science,” from How People Learn: Brain, Mind, Experience, and School, by Bransford, Brown, and Cocking.

The current methods we use to deliver learning have been shaped by research within the field of education, as well as related fields.  In recent decades, teachers and researchers have discovered approaches that assist the learner in understanding and retaining new information.  Learning professionals now design curricula from a perspective that is more focused on the learner’s needs.  Research related to child development, cognitive psychology, and neuroscience has molded the current approach to early education, and has influenced how emerging technology is incorporated into the learning experience.

In the past, there was less focus on the teaching of critical thinking skills, as well as the abilities to express concepts persuasively, and solve problems requiring complex thought.  Learning experiences were focused on developing basic literacy in fields such as reading and mathematics.  Today, humanity’s knowledge is increasing at a faster rate due to globalization and rapid development of technology.  It is still important that learners develop fundamental understanding of certain subjects, but that is not enough.  Learners must be taught to self-sustain, meaning they must learn on their own by asking meaningful questions.  Using new teaching methods will help instructors connect with those who were once considered “difficult” students.  New teaching methods will also provide a deeper knowledge of complex subjects to the majority of learners.

There has been extensive research into how to teach traditional subjects, such as writing skills, using non-traditional approaches.  These research efforts date back to the nineteenth century and fed into the school of behaviorism, which in turn changed how psychological research is performed.

Under the behaviorist view, learning is a process of forming connections between stimuli and responses.  For instance, hunger may drive an animal or person to learn the tasks or skills necessary to relieve it.  Even if complex trial and error is required to learn a skill, we will perform whatever process is necessary, as long as the reward we seek is desirable enough to warrant the effort.

Cognitive science approaches the study of learning in a multi-disciplinary fashion, incorporating research from many fields and using many tools and methodologies to further research.  Qualitative research methods complement and expand earlier experimental research efforts.  An important objective within this research is to better understand what it means to understand a topic.  Traditionally, the learner’s ability to memorize is assessed in order to determine competency.  While knowledge is necessary in order to solve problems, facts must be connected to each other in order for the learner to draw conclusions.  An organized framework of concepts and ideas will give the learner the context necessary to solve problems and establish long-term retention.

Our prior knowledge, skills, beliefs, and concepts influence how we organize and interpret new information.  We exist in an environment of competing stimuli, and we must choose which stimuli to focus on based on what has been important or meaningful to us in the past.  Therefore, it’s important that our foundational knowledge be accurate.  Incomplete and inaccurate thinking needs to be challenged and corrected early so that the learner doesn’t build upon what is essentially a weak foundation of knowledge.  For example, it’s common to believe our personal experience of physical or biological phenomena represents a complete and correct knowledge of those phenomena, when in fact we need more information in order to understand what we’ve experienced.

It’s important that learners have some control over their learning process so they have the opportunity to gauge their own understanding of the topics being taught.  The ability to self-assess and reflect on areas of improvement leads to metacognition, which is the ability of a person to predict their own performance on various tasks and monitor current levels of mastery and understanding.  Learning can be reinforced through internal dialog, meaning a learner may choose to compare new information with old information, explain information to themselves, and look for areas where they fail to comprehend what has been taught.  Teaching a learner how to monitor their own learning is therefore a worthwhile investment in the building of deep knowledge.  An active learner is more able to transfer skills to new problems and challenges.

The difference between a novice and an expert within a subject matter is the depth of knowledge commanded by the expert.  Depth of knowledge allows a person to recognize patterns, relationships, and discrepancies that a less experienced or knowledgeable person might miss.  An expert has a better conceptual framework, and is able to better analyze what information they need to draw forward in their memory to solve a problem.  Understanding what information is relevant to a problem is key, because it allows a person to focus only on the information they need at that moment.  This makes the problem less complex.

In order to build understanding within a subject, a teacher may provide in-depth understanding of a few specific topics, rather than giving a superficial overview of many topics.  This allows learners to better digest defining concepts.  Assessments must reinforce this model by providing instructors with an understanding of the learner’s thought processes and testing in-depth, rather than superficial, knowledge.

Learners should be encouraged to reflect on what has been learned before going on to additional topics in order to support metacognition.  Teachers should be encouraged to consider the many tools and methodologies available to present new information, and select what is best for the learner and topic.  Building a community of learners who work together and accept failure will allow individuals to take risks and challenge themselves in the classroom.  There is no one “right” way to design a classroom environment – but there are ways that are more effective than others depending on the learner’s culture and expectations, and how competence is defined.



Learning Technologists as Project Managers too

As I work my way through job boards and role profiles in my effort to recover from my recent redundancy and stave off the impending doom of an empty bank account (yes, really), I have found a lot of roles advertised with headline-grabbing titles and/or impressive requirements. What I’ve also found is a certain narrowness in thinking, from both employers and agencies, in that people can and should be pigeon-holed into a role because of the title. If your title is one thing (LT?) then that means you can’t be considered for a role as an ID. Yes, there are differences, but there are also similarities which can be greatly enhanced by crossing disciplines, and this crossover can benefit both individual and employer with fresh ideas, fresh perspective, and fresh enthusiasm.

What I’ve also seen, and this is the reason for this post, is that Learning Technologists* (LTs) also make very effective project managers. Here’s why. The quotes below are taken from project manager jobs advertised today at engineering and finance companies:

“As a project manager it is your responsibility to deliver projects on time and in budget, by planning and organising resources and people.”

Obviously, yes. An LT is required to work with multiple teams across academic, administrative, and IT functions. The estates teams are often involved if new kit needs installing, as are legal and HR if contracts need signing. Not to mention what happens when you need to dig into the data the system collects, where it’s stored, and the data protection (and GDPR) issues that follow. Sometimes the LT is at the heart of all this, making sure the work is done and everyone involved has the necessary information to hand in a timely manner.

The thing is, we LTs often don’t know about the budgets or wider timelines involved, other than start of term or assessment dates. But this doesn’t stop us working to deadlines and strategies that have defined and immovable timelines. Damn, we’re good!

“Select, lead and motivate your project team from both internal and external stakeholder organisations.”

Sometimes the ‘team’ may just be you and the academic colleague who wants to do something they’ve never done before. Sometimes you may be experienced at this task, or it’s new to you too. The stakeholders here may be other staff who need mentoring or training on something new, they could also be students who need guidance on new assessment criteria or group working parameters. Again, it’s up to you to manage, “lead and motivate”.

Unleash your inner project manager

“Planning and setting goals, defining roles and producing schedules of tasks.”

The timeline could include a new cohort of students, the NSS survey, release of module/unit materials for online learning, scheduled meetings, fixed reports, the annual budget review, etc. Whatever the actual purpose of the goal, role, or schedule of tasks, the LT is at the centre, working with others to ensure nothing slips and everything works.

“Report regularly to management and the client.”

Whether the report is verbal over a coffee, written via email or whatever other channel is used, or a formal document presented to a board or committee, the ‘client’ will hear from the LT on the status and progress of the work. A good (or great) LT, like a good project manager, will also make sure delays and timeline slippage are reported well in advance and any impacts are taken into account.

“… first point of contact for any issue or discrepancy arising from within the project before the problem escalates to higher authorities.”

As above, the LT is this point of contact for any work they undertake. Whether the work is considered small or ‘incidental’, or a full-on VLE review with institutional impact, the LT is fully aware of the impact on themselves and those involved.

Project management is defined as “the application of processes, methods, knowledge, skills and experience to achieve the project objectives” (APM), and the purpose of a project is “typically to offer a product, change a process or to solve a problem in order to benefit the organization” (Project Insight).

Working on implementing a new VLE or LMS for your department or institution? Chances are you’ll be working with a dedicated project manager, or someone acting in that role. Initiating some training on new tools, design or assessment criteria, or rules around lecture capture? Chances are you’ll again need to plan ahead for the delivery of the training, resources to support it, room bookings, or webinar time/space. See … you’ll need to employ project management techniques to make sure it happens when you want it, how you want it, and where you want it.

Sound familiar? It sounds like the work I’ve been engaged in for years now. I just didn’t know I could add ‘project manager’ to my list of skills too!

* Note: When I say Learning Technologists, I also mean Educational / Instructional Designers too.

If you’re interested, I’ve found this series of 15 journals (free download) from Product Focus a really useful introduction to project and product management. You’ll have to read your own skills and projects into the words, but it’s all there if you want it.

Image source: Judith Doyle (CC BY-ND-2.0)

Change the title, change the work?

Have I had it wrong all these years? Has it never been about me being a Learning Technologist (LT); have I actually been an Instructional Designer (ID) instead? Bear with me here …

I’ve been looking at opportunities on job boards (more on this another time), particularly at the requirements and roles for Instructional Designers. There are more of these around than LT or senior LT roles. Based on the role profiles and job descriptions, it got me thinking: “Well, that’s what I’ve been doing, isn’t it?” Here are some of the descriptors and requirements asked for in ID positions, and how they mirror the work I’ve been doing as an LT:

“This role will be creating high quality new learning programmes for [name here], being the designer of the blended, engaging and interactive learning programmes to address specific business needs.”

“Creative, direct and concise. Good with technology. Great communicator, especially with clients.”

“Analyse base content and current study materials to identify the best way to present the content online.”

“Consider the range of instructional media available: video (face to face, voice-over PowerPoint), interactions and questions to recommend the most suitable for each instructional need.”

All the above come from ID roles currently being advertised. All of this is precisely what most LTs I know are doing, and what I’ve done many times before too. Yet you can be compartmentalised into a role by title, not by merit?

Let’s contrast this with similar descriptors from LT roles currently being advertised …

“Design and Development of e-learning content.”

“Undertake a range of activities to advocate for digital learning and its associated technologies.”

“The LT is expected to work proactively to identify potential resources for the [name here] and to plan and manage the development of varied e-learning material, including video, webinars, self-paced interactive resources, and [VLE here] activities.”

“Provide leadership and support for the development of innovative and effective teaching and learning practices using information technology.”

Learning Technologist or Instructional Designer … or both? #edtech

Do you see the similarities here? The only difference is that the ID requirements are for commercial/corporate employers, and the LT ones for universities. Same role, often similar responsibilities and management duties (team and self), but different ‘sectors’. Of course, there are many differences between the roles that warrant the distinct titles, and that’s fine – LTs may be more limited in what they deal with and how, and LTs may look after a tool (VLE, lecture capture, etc.) rather than a department, programme, or academic group.

But, for myself and those LTs I know and have worked with, we are much, much more than this. We engage, advise, collaborate, curate, anticipate, lead, mentor, showcase, develop, design, implement, consult, etc. All of these are appropriate terms for both LT and ID roles. Yes? Perhaps it’s more to do with context … in my more recent roles and work I am so much more than an LT … I am now manager of an entire organisation’s learning platform: how it works, why it works, and who it works for (internal and external). I ‘manage’ all aspects of the relationships between organisational parties with an interest in the training, as well as all external stakeholders, whether they are course participants, suppliers, accrediting bodies, or potential clients.

According to the definitions in the ID role profiles above, I have more of an ID background and approach than an LT one, and have had since my first day in an LT-titled role – since I learned my craft, stopped blindly following the conventions of the (enforced) VLE module structure, and thought about making the learning more engaging and inclusive. It’s not about using the tools provided, it’s not even about finding new tools; it’s about using appropriate tools at an appropriate time for an appropriate motive to further the learning opportunities.

ID or LT? It's about using appropriate tools at an appropriate time for an appropriate motive to further the learning opportunities #edtech

So, are you an Instructional Designer or a Learning Technologist? Does the title given to your role even matter? Perhaps the difference here is time … what were once two distinct roles have now merged in outlook and intention and can be seen as the same, depending on which title the organisation prefers.

Image source: Olle & Agen (CC BY-SA 2.0)

The University of tomorrow is …?

I’ve just read this article and wanted to share a couple of thoughts I had while I was reading it: “It’s the end of the university as we know it”

The title is clearly clickbait, testing your resolve to read beyond the tweeted headline, knowing full well that ‘the end of the university’ will get people interested (or enraged that this kind of talk is still going on … MOOCs, anyone?). That the URL is not the same as the title suggests they might change the title at a later stage … “/the-future-of-the-university-is-in-the-air-and-in-the-cloud/”?

Here are some soundbites from the article:

“Shocking as it might seem, there is one catch-all answer that could be the remedy to many of these concerns: Cut the campus loose. Axe the physical constraints. The library? Classrooms? Professors? Take it all away. The future of the university is up in the air.”

Another, when looking at the history of how and why universities are set up like they are:

“It is untenable for universities to continue existing as sanctums for a small group of elite students, taught by top scholars. Technology isn’t only refashioning the ways in which we live and work, but also changing what we need to learn for these new schemes of existence: It’s returning us to a need for specialized learning, for individualized education that is custom-tailored to one’s needs. A world in which most of our learning is more self-directed and practical is, in many ways, a return to an apprenticeship model that existed before industrialization.”

Predictions on the future of learning, at universities at any rate:

“Online ‘cloud’ teaching is cheaper; universities can offer such online-based (or majority-online) degrees at the lowest rate—making for a cheap(ish) degree, available to everyone with access to the internet, and taking place completely digitally. Meanwhile, other students will pay a premium to interact with professors and have more of a traditional campus experience. At the highest end, the richest or most elite students may get the full Oxford tutorial experience, brushing elbows with the best of scholars; they’ll just have to pay through the nose for it”

Read the article and let me know what you think – do you agree or disagree with the article’s premise that this is the end of the university?

Image source: Dave Herholz (CC BY-SA 2.0)

Caveat Auditor: The Role Of Critical Thinking In Modern Business Training

Critical thinking is a skill whose absence contributes to recent phenomena like runaway fake news stories and the hacking of government and corporate computers, and costs companies dearly in lawsuits, fines, penalties, and failed projects. When employers are asked what they are looking for in new hires, critical thinking is often described as one of the most desirable skills (Hart Research Associates, 2013). Employers lament the inability of new employees to think critically and solve problems creatively, placing the blame firmly on the high school or college training of the incoming employees. In their article “Eight Habits of Effective Critical Thinkers”, Guinn & Williamson identify several qualities that make clear why critical thinking skills are so desirable (Guinn & Williamson, 2017). Among the behaviors identified are that critical thinkers are ‘more concerned about getting it right, than being right’ and ‘avoid the rush to judgment’. The trick to cultivating these admirable traits in employees lies in training and encouraging a solid foundation in critical thinking. Learning how to think more effectively leads to better decision making and job performance.

In this article, I’ll explore why critical thinking skills are so important to modern business, and in a series of activities that follow I’ll explore some methods that might prove useful for expanding the critical thinking skills of incoming employees. For the purpose of this discussion, I’m defining critical thinking as the skill(s) required to validate information and ideas based on verifiable evidence and sound logic. Critical thinking involves a well-organized thought process that is focused on solving problems, analyzes relevant research, is willing to challenge assumptions, is open to new possibilities and approaches, is aware of the limitations and scope of analysis, and is reflective and transparent.

The Difference Between Critical Thinking And Non-Critical Thinking

One interesting example of a critical thinking failure in modern business is when employees fall victim to scams based on logical fallacies, or fail to identify deceptive business practices. Consider common problems with information security: one of the most frequent is falling victim to phishing scams in email. When employees are unable to discern fact from fiction, or learn to ‘trust their gut’ rather than validate facts, companies generally pay the price. Often the difference between critical thinking and non-critical thinking comes down to the difference between making decisions based on facts and logic, and making decisions based on intuition and emotion.

Logic vs Emotion

Sometimes the problem is not so obviously attributable to emotion. It simply feels natural, and therefore logical, to the person making the decision. Just as we make habits of behaviors like brushing our teeth or pouring a glass of tea, we can and do habitually perform many tasks based solely on past experience. You might habitually open a door for others to enter a room, for example. You might also habitually sit with your legs sprawled open, even to the discomfort of another passenger on a plane, without considering that you are unfairly consuming the space. You might habitually move out of the path of an oncoming pedestrian because they are male and you are female. Note that there is very little logic in these examples, but they are derived from past patterns.

We often assume that when two events are related in one way, they must also be related by cause. We confuse coincidence with cause. Just because two things connect or coincide, doesn’t mean that one caused the other.
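A quick back-of-the-envelope calculation shows why such coincidences are so common. The numbers below are purely illustrative (my own, not from any cited source): if you observe enough unrelated pairs of yes/no outcomes, some of them will line up perfectly by chance alone.

```python
# Illustrative arithmetic (hypothetical numbers): how often do two
# unrelated yes/no signals agree purely by chance?

p_single_match = 0.5                   # chance two independent fair yes/no outcomes agree once
p_all_match = p_single_match ** 4      # chance they agree on all 4 observed occasions
expected_matches = 100 * p_all_match   # expected "perfect coincidences" across 100 unrelated pairs

print(p_all_match)        # 0.0625
print(expected_matches)   # 6.25
```

In other words, among 100 pairs of completely unrelated signals, we should expect about six to coincide perfectly over four observations, with no causation involved at all.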

The Gambler's Fallacy

At other times you might find that emotion plays a subtle role. Most of us use phrases like “my luck ran out” and “I’m due for a win” when there is usually no relationship between the odds of success in the current venture and those of the prior one. We call this the gambler’s fallacy. One example of the gambler’s fallacy interfering with decision making was documented in a recent working paper from the National Bureau of Economic Research (Chen, Moskowitz, & Shue, 2016). The paper identifies a pattern of behavior among professionals making a series of topically or conceptually similar decisions on independent cases: they are more likely to invert a positive or negative recommendation in the wake of a series of prior recommendations. If loan officers made a series of positive loan recommendations, the likelihood of a negative recommendation increased, even when a negative recommendation was not warranted by the data. The team found the same pattern among baseball umpires calling strikes and immigration judges recommending asylum. It is the same false belief that leads people to expect tails after flipping heads twice in a row. That, of course, is the whole point of the gambler’s fallacy: the odds remain 50/50 no matter how many times you flip the coin, yet the fallacy is deeply ingrained in the beliefs and emotional reality of most people. The researchers also found that more experienced employees were less likely to suffer the effects of the gambler’s fallacy, suggesting that less experienced employees were more likely to include emotion and instinct in their decision making (Chen, Moskowitz, & Shue, 2016).
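The 50/50 claim can be checked by brute force rather than intuition. The sketch below (my own illustration, not part of the cited study) enumerates every equally likely sequence of fair coin flips and confirms that the odds of the next flip ignore any prior streak:

```python
from itertools import product

def prob_tails_after(streak):
    """Among all equally likely fair-coin sequences beginning with
    `streak` (a string of 'H'/'T'), return the fraction whose next
    flip is tails."""
    n = len(streak) + 1
    # Enumerate every length-n sequence consistent with the observed streak.
    sequences = [seq for seq in product("HT", repeat=n)
                 if seq[:len(streak)] == tuple(streak)]
    tails = [seq for seq in sequences if seq[-1] == "T"]
    return len(tails) / len(sequences)

# No matter how long the run of heads, the next flip is still 50/50.
for prior in ["H", "HH", "HHHH"]:
    print(prior, prob_tails_after(prior))   # each prints 0.5
```

However strongly “I’m due for tails” feels true, the enumeration shows the feeling has no basis: each flip is independent of the ones before it.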

Learning, Behavior, And Precaution

I recently attended a conference on behavior and learning. A major theme of the conference was the role of emotion and visceral reactions in the decision-making process of learners. Michelle Segar (Segar, 2016) argued persuasively that recent research from the Journal of Consumer Research (Chang & Pham, 2013), the British Journal of Health Psychology (Sirriyeh, Lawton, & Ward, 2010), and the Annual Review of Psychology (Lerner, Li, Valdesolo, & Kassam, 2015) all provides evidence that emotion is more influential than logic in decision making.

It is an argument that rings true for me. I remember an experience in a classroom as a young boy that echoed this life-lesson. Our fifth grade teacher used a simulation game to teach a critical thinking concept. Essentially we were to assume the role of town leaders, making decisions about whether or not to allow a new factory to be built within our imagined city.

Some students argued that a factory would provide jobs, and some that it would encourage the economy. Some argued that it simply felt like the right thing to do, struggling to give any reasons for their inclination. Others said that it would generate more tax revenue and encourage growth. I was concerned that there must be a catch. Life had already taught me that people would often present limited information or simply lie to make an ‘offer’ seem more beneficial than it actually was. The proposal sounded ‘too good to be true’ to my cynical ears. Eventually we learned that the project was in fact rife with problems. Pollution, competing tax break incentives, competition for local resources, and a host of other issues made the project a huge net loss for the imagined community. But the group, predictably, voted to support the project. In fact, no amount of logic could persuade them that the proposal was ‘too good to be true’.


This simple aphorism ‘too good to be true’ dates back at least to the 1800s and is similar to other warnings that essentially suggest the same idea. People throughout history have always used deception and misdirection to manipulate other people.

A similar phrase, caveat emptor, is well known in legal circles. Essentially this means ‘buyer beware’. (Caveat translates to “may he beware,” and emptor refers to the buyer in the expression.) The same form has been applied to the consumption of any message: caveat auditor. You’ll recognize the phrase from the title of this article. “Let the receiver of any message beware” could easily be a modern battle cry for those inundated with advertisements, mass marketing, internet memes, deliberately deceptive fake news reports, the growing practice of social manipulation through push polling, and similar forms of agenda promotion via online discussion and sharing communities.

It’s worth noting that to a marketer, many of these techniques are tools – often employed to inspire emotional attachment to goods and services. Learning to spot these techniques, and in some cases to utilize them, plays a tremendous role in modern business.

While these warnings serve as a reminder of the universality of attempts to persuade individuals or systems, they also exist as aphorisms because the art of deception and manipulation in pursuit of a goal has continued to grow at an alarming rate. Our improved methods of communication have democratized mass communication, making it cheaper and easier than ever for anyone to reach the masses with well-targeted messages. While it once required enormous sums of money and elaborate infrastructure to push messages (both accurate and inaccurate) to the masses, the promotion of an agenda, false idea, con, or other manipulation of truth now requires little more than a clever meme and an internet-connected computer. It is also worth noting that the promotion of incorrect information doesn’t have to be deliberately devious to be destructive or problematic.

So Why Does This Matter To My Training Department?

A growing number of businesses are relying more and more on publicly available resources to inform, educate, and update their personnel. You will see this trend reflected in articles touting the ‘end of organizational training’ or the ‘end of the Learning Management System’. Why not? Using a search engine at the point of need has become the most common first step in learning virtually anything (Wang-Audia & Tauber, 2014). As research, education, and training move rapidly to a self-serve delivery model, the onus falls ever more heavily on the individual to apply a solid background in critical thinking in order to discern facts, uncover undisclosed specifics, verify sources of information, and clarify the details of a given proposal or narrative.

If we analyze the problem further, it is possible to see that the underlying logic, specifically the ability to recognize logical fallacies, is very often one source of problems in critical thinking. Spotting logical fallacies like the appeal to emotion, and reacting appropriately, can prevent a significant proportion of security-related problems like the email phishing scams discussed earlier. It can also improve performance among managers and decision makers. Ensuring a solid foundation in logical thinking, especially the identification of logical fallacies, can dramatically improve employees’ abilities to make good decisions and contribute more effectively to organizational and project goals.

In practice this means that people in a business are able to discern good ideas from ones that are likely to fail. They are much more likely to spot fraud and to quickly identify innovative approaches with a real likelihood of success. Critical thinking skills are essential to the success of both individuals and organizations. Without adequate critical thinking skills it will be easy for your employees to fall into traps. They will be easy targets for anyone who wants to game the system and cheat them. And if they can be cheated, your organization can be cheated.

Of course, fundamental training in how to think critically remains essential, but for an audience with some foundation in critical thought, augmenting that training with an overview of logical fallacies could play an important role in reducing problems. Personnel trained in the fundamentals of critical thinking and logical reasoning will be much better prepared to handle the rapid-fire influx of new ideas and concepts migrating into your organization from a myriad of outside sources.

During his closing keynote for the 2016 Adobe Learning Summit in Las Vegas, Tridib Roy Chowdhury described these sources of learning. Whether it’s YouTube, Twitter, Facebook, or most commonly Google, people are increasingly likely to consult public sources for answers about the tasks that they perform in your organization. As Chowdhury explained, this ‘Google first’ modality of learning is already the norm, and our most common support architectures simply do not compete with the search ability and completeness of Google’s latest snapshot of the sum of human knowledge.

Unfortunately, that snapshot is increasingly likely to produce results that have virtually no information for the consumer about the critical qualifications of the content on the other end of a search. At this point, the individual learner is generally the only line of defense between your organization and false information, an attempted fraud, or any other source of faulty information which can potentially cost your organization time, money, and resources.

How Can Organizational Training Address This Need?

When I speak with employers, one of the most common concerns expressed about new hires is a general lack of the ability to think critically and to solve problems with creativity and innovation. Our education systems generally do not include much training in these areas. Historically, one of the few strong instructional approaches tied to critical thinking skills has been training in information literacy. That skill is most often relegated to librarians, and as modern libraries have been digitized, the opportunities for training in information literacy have rapidly declined.

For the most part, educators use lecture and other didactic methods of instruction. This approach can be effective in some instances, but in general, approaches that encourage active use of information and questioning of ideas and concepts are more lasting and offer far more potential for meaningful use of the content. We see such methods employed in the sciences, where the scientific method inherently aligns better with critical thinking. Perhaps the oldest method of teaching critical thinking is the use of Socratic questioning rather than lecture and drill-based training. In this approach, students are challenged with questions rather than presented with information. Questions encourage the learners to consider a problem in much greater detail, to compare and contextualize new information with things they already know, and to consider the veracity of new ideas. These are all skills that employers highly value, and all methods that have been employed by effective classroom teachers for centuries.

Using a Socratic approach to deliver content, learners do not passively consume information. They are challenged to answer questions about a topic, and guided by those questions to examine all of the related information, ideas, and concepts. This approach encourages learners to question sources, debate the merit and logic of responses, and form their own critical conclusions about the content. The trainer is able to evaluate the results based on their accuracy, specificity, complexity, and relevance. But a Socratic approach is inherently expensive. It requires substantial investment in 1:1 training, or very effective group training. If training as a remedy for poor critical thinking skills is going to make an impact, it must scale well and be relatively inexpensive.

Educational institutions should recognize the deficiencies in their educational methods and begin significantly improving them in order to encourage substantially better critical thinking skills in young adults. But there is no reason to expect that educational institutions will accept or adopt that challenge. It is therefore necessary for organizations to provide supplemental education in critical thinking and logic in order to prevent the losses caused by the accidental introduction of false, misrepresented, fraudulent, or deceptive information propagated within their organizations.

Final Thoughts

The world around us is rapidly changing. The availability of information and immediacy of communication enables us to adapt and learn at unprecedented speeds. But those who will excel in this environment of constant innovation will be the ones who are capable of discerning quickly the difference between sound information and ideas and those which are fraudulent, deceptive, or simply not founded in solid evidence. Likewise companies that thrive will be those that recognize, celebrate and cultivate critical thinkers – capable of questioning convention, exploring new ideas critically, and ready to uncover facts to guide their projects and decisions.

In order to explore some of these ideas more fully, I have created the first of several learning activities focused on rapidly improving the identification and rejection of logical fallacies. The first one focuses on the fallacy commonly known as the gambler’s fallacy. This is an activity designed to explain the fallacy, and encourage the trainee to learn to identify examples of the fallacy in work related settings.

I demonstrate how the sample above was created in Adobe Captivate 9 in an eSeminar on Gamification and Adobe Captivate. (NOTE: The seminar contains the download files for the source of the Captivate project shown in this article.) Or, if gamification seems a bit advanced, you are also invited to join me for an eSeminar for Captivate beginners, during which I’ll introduce the basic principles of course development.

One of the most common problems that HR and eLearning professionals face is implementing modern, engaging learning solutions. This is exactly the sort of thing that critical thinking can help solve. Whether you are trying to better understand your training needs or evaluating a Learning Management System, like Adobe’s amazing new LMS, the problem will be much easier to solve with a critical thinking process in place.

Still hungry for more? Consider reading ‘Engage the Fox’ by Jennifer Lawrence and Larry Chester. It is an absolutely enjoyable and engaging read on a subject that you might expect to be terribly dry.


  • Chang, H. H., & Pham, M. T. (2013). Affect as a Decision-Making System of the Present. Journal of Consumer Research, 40: 42-46.
  • Chen, D., Moskowitz, T. J., & Shue, K. (2016). Decision-Making Under The Gambler’s Fallacy. Cambridge, MA: NBER Working Paper Series.
  • Guinn, S. L., & Williamson, G. A. (2017, January 6). Eight Habits of Effective Critical Thinkers. Retrieved from American Management Association.
  • Hart Research Associates. (2013). It takes more than a major: Employer Priorities for College Learning and Student Success. Washington, DC: American Association of Colleges & Universities & Hart Research Associates.
  • Lerner, J. S., Li, Y., Valdesolo, P., & Kassam, K. S. (2015). Emotion and Decision Making. Annual Review of Psychology, 799-823.
  • Segar, M. (2016). The Surprising Science Behind Creating Sustainable Behavior. Opening keynote, Science of Behavior Change Summit. Online: eLearning Guild.
  • Sirriyeh, R., Lawton, R., & Ward, J. (2010). Physical activity and adolescents: An exploratory randomized controlled trial investigating the influence of affective and instrumental text messages. British Journal of Health Psychology, 825-840.
  • Wang-Audia, W., & Tauber, T. (2014). Meet the Modern Learner: Engaging the Overwhelmed, Distracted, and Impatient Employee. Bersin by Deloitte.
