This post was inspired (and populated) by a tweet from Cathy Moore (perhaps the Empress of Instructional Design; Founder, Leader and Driver of the Action Mapping Forces).

How to Plug in a Plug

The How to Series: Lessons in How to Do Things from SlapClap on Vimeo.


Week 1, and I have already found #ocTEL quite challenging!

This is a good thing, as this is the reason for my taking the course. It begins with the Week 1 Webinar (on Strategies for Learning Technology), which I would recommend for anyone in learning technology or instructional design. Kyriaki Anagnostopoulou (University of Bath) and James Little (University of Leeds) gave an elegant account of developing and implementing strategy, while also signposting the way to more discovery for learners. It was a bit of a masterclass in using a webinar format to inform and spark further exploration. However, I digress, when I should be getting back to the start.

To start: my current role is very much in developing eLearning content – multimedia, quizzes and assessments – for didactic courses (we are “teaching”, as opposed to guiding/encouraging exploration, which seemed to be the approach favoured in the webinar). This is important to point out, as it is where I was coming from when I watched the Week 1 webinar.

My Contribution to Current eLearning Strategy

Currently, I work on two projects, developing interactive eLearning content (developed as courses composed of individual modules; each module is based on a presentation, notes and some in-house research). The audience is composed of junior doctors/surgeons who are seeking to improve their understanding of the theories, concepts and development of surgical interventions.  While we can explain surgical steps, they really learn to perform those steps in the hospitals or clinics where they are working and learning. However, what we provide helps them to understand why certain steps might be taken, what could go wrong (and what to do if something does go wrong), so that better knowledge will (we hope) lead to greater confidence when they begin undertaking these tasks in real life.

I developed the current strategy we are using (this also answers the question on impact of my practice – as it is what I do). I did not do it alone, and could not have done it alone. There was input from me (as an instructional designer, with experience in multimedia and developing assessment) in terms of the type of resources we could provide. This was guided by more senior members of the company I work for, who had great insight into the needs of the audience and what was realistic in terms of managing a project. Once we had put together some ideas (described in more detail below), we took it to some senior surgeons, who helped to further refine it.

Is the Main Focus Learning Technology?

This is where it gets a little controversial. Most people I work with would answer “Yes” to this question (as the learning technology demarcates us from the huge number of videos, articles and other online resources available). For me, the answer is “No”. We are using multimedia development tools to better engage learners – who are, by default, working hard and tired. The purpose is not to “provide multimedia content” but to enhance their professional development by helping to improve not only their knowledge but to deepen their understanding. We work with very experienced Subject Matter Experts (SMEs), who are rich in wisdom, but poor in time. We are trying to connect the audience to the SME. For me, the main focus is the learner, and making the most of the time they can afford (at the time they can afford to spend it). Time is at a premium: for learners and SMEs, today’s schedule is tomorrow’s adjustment. Therefore, preparing content in advance for SMEs to review and edit means they can still reach this audience. For the audience, they will get opinions from surgeons who are at the top of their game, and at the head of their field.

Multimedia content, as a strategy, was initially sparked by the person who employed me. The intention then – as now – was to provide something more engaging, that was assessed and accredited (so adding value to taking these courses). They were looking for something quite involved. This is where I come in.

My role was to define the type of content we could use to better engage these doctors – using not just video (by seeing a video, and hearing an explanation of what is going on), and not just reading about it. The idea was to let them better conceptualise what happens during an operation – perhaps (not always) even beyond what can be seen (using animations, diagrams, images) and get them to really think about it (using questions, quizzes and assessment). A deeper understanding prepares them not only for the tasks they perform, and what to do if anything goes wrong, but also (I hope) helps them to think further about the possibilities of the concepts, steps, etc. they are learning about.

I am quite proud of what we have achieved so far. But the world keeps turning: standing still means you will surely be passed. This is something that crossed my mind as I watched the Week 1 Webinar. While I couldn’t see changes to the specifics of what we are currently doing, I did start thinking about how we might expand the offering. I’m starting to answer the next question. So time for some Bold font and a carriage return.

How often is it reviewed and is it flexible enough to adapt as things change?

The content we develop is updated every 2 years, and we change the content according to any major changes in industry, academia or clinical practice. However, to date, the overall strategy remains the same. This is primarily because we are still working through it (developing courses). The intention is to go on from developing multimedia courses to providing spaces for communities of practice to develop, helping learners connect to each other (but this is a place we have not arrived at yet).

The Week 1 Webinar got me thinking more deeply about this expansion.

  • How can we start providing activities and opportunities for those beyond this career level?
  • Those who may have some experience and want to explore different issues in more detail?
  • Perhaps helping people from different places connect to build knowledge with each other?

Again, I’m getting ahead of myself.

Finally, if you were to provide input to a new version, what, if any, changes would you make to it?

This for me has been the key learning point and key action I can take from the webinar. As I mentioned already, we develop didactic multimedia content; but one of its aims is to get people thinking further.

The strategy now should look to how we take these small sparks (of thinking further) and create something of a fire with them (sorry for the mangled metaphor). Hopefully, those managers, senior surgeons and I can work together again to define ways in which we can:

  • Empower learners to follow their interests but also find new ways of learning and managing their learning
  • Encourage greater conversation between learners so they can find like-minded travellers
  • Enhance collaborative opportunities to allow them space to act upon these

What I learned from the Week 1 Webinar is that this is not a case of: provide bookmarking services, forums and wikis. The next step is not to provide technology, but to determine (along with the other main stakeholders) where our learners should be going and how they want to get there, so that we can provide the right learning technology to help them get there.

I signed up for ocTEL last year, but time ran away from me. After it was over, I thought “Damn, I wish I’d made more time for that”. 

In the meantime, I have continued in my job (developing eLearning for medical specialists), and every so often think of my learners, “Damn, I wish they’d make more time for this”. Now that ocTEL is back, so am I. Hopefully with a resolve which will sustain. It is only 7 weeks after all.

Why am I doing ocTEL?

Working in eLearning (or Technology Enhanced Learning), I find a real strain of the “shoemaker’s children”. While we strive to make professional development accessible, useful and applicable for our learners, we rarely take the time to reflect on our own professional development.

One issue I am painfully aware of myself is my own career trajectory in this regard. Like many I know in the eLearning space, it has been quite a circuitous route.  I’m going to talk about this, but between dashed lines, so that you can skip it if you want. 

– – – – – – – – – – – – – – – – – – – –
I started out working for a large eLearning “shop” (as I called it). I was an instructional designer writing scripts for multimedia content, which was developed into Flash Learning objects and deployed to telecoms engineers to help them learn about and use telecoms hardware and software. In these early days, I was quite content writing up my scripts, suggesting graphical illustrations and interactivities (all of which had been pre-defined) and developing quizzes based on manuals we received from the Company.  I was about a year into the job (and enjoying it, and doing quite well with it) before I actually saw the finished product – the fruits of my labours as it were. How I managed so long, I do not know; but it is an indication of how foreign the development/design side of things was to me then. I saw my job primarily as a writer of scripts. 

During my time there, we went through the dot com crash and some ripples as a result of it. I actually decided to return to college, to take an MA in American Literature. To make money while doing this, I worked as a technical writer on a part time basis. I did OK with this, but my content did lean toward the didactic, rather than the purely informational. I was even then trying to teach through what should have been a manual. So, I took another part time job, but this time as an instructional designer (the first time I had actually heard the term used). This was much more satisfactory. There were two instructional designers, working directly with Subject Matter Experts, and two graphic designers/programmers (who were also working part time, on work experience from their degree courses). We were creating multimedia-based learning intended to explain the mechanisms of various industrial automation bits and pieces. This was a good time, and my colleague and a line manager were very supportive but also quite excited about learning technology. It was this experience that really sparked me into thinking: This should be my career.

Never one to make things easy for myself, I next ended up at a search marketing company, writing copy for online ads and reviewing the copy of ads submitted for online display. The project I had been working on came to a natural end, and at the time, I found it difficult to get work as an instructional designer. But then, the job titles used at that time ran the gamut from “Technical writer” to “Multimedia scripting editor for educational content” to “Instructional designer”, so I probably missed a few opportunities by not understanding what these titles meant.

Feeling I had done my stint in search marketing, I sought a role as a web editor. The job description demanded 3 references. When I reached out to someone who I had worked with in my first job for a reference, he offered me an interview: he and a friend of his from that first company I worked for had set up their own company. It was a sort of boutique content development company, specialising in eLearning (but also taking care of marketing and other messaging). This role really formed me. I learned about web standards (which at the time seemed to shift every couple of months), learning content, real multimedia development (when I wrote design notes in scripts, people came back! They asked: Is this what you meant? I couldn’t believe it). While in this role, I had experience with Rapid eLearning tools (specifically Articulate, but also others), as well as podcasting, video development, online resource development, and collaborative learning. At the time all these things were in their infancy as learning technologies, and it was great to be working in a company with the imagination to really try to use them and use them well. The key lesson I learned from the guys I worked for here was: everything you do should have a purpose. I carry this with me to this day.

Unfortunately, 2008 rolled around, and it was like someone slammed on the brakes. With the inertia built up, I was flung through the windscreen of my job and landed on the side of the road, where I did some contract work to keep the bills paid. 

Then, about 4 years ago, I saw an ad for the job I am currently in. A medical society were seeking someone to manage the development of eLearning. I thought “I can manage the development of eLearning. Can’t I?” Really, it turned out they needed someone to just create the eLearning, and they wanted multimedia content. Well, with my experience using Rapid eLearning tools and Moodle, I said “Yes, I *can* manage the development of eLearning”.

– – – – – – – – – – – – – – – – – – – –
For the past 4 years, I have been developing learning content for medical specialists. We develop using Articulate tools, and presentations given by medical specialists. Some research is also required, but the idea is to create highly interactive, multimedia content that is laden down with quizzes (medical graduates like being tested. A lot). Our content is designed specifically to prepare medical specialists to undertake more hands-on learning (i.e. to give them the background, theory, explanations of what they are doing and why). I am interested in the “Serious eLearning Manifesto”, which was launched recently, but note that our content would not meet its exacting standards (specifically, making the learning more practical – we can’t do this, as most people are not willing to be operated upon by someone who has learned a procedure from eLearning). However, our feedback indicates that the content we are developing is helping people to better understand the various interventions they are learning to do in practice, why they must be done and how various pitfalls arise. My understanding is that our learners use the content we build as a complement to textbooks (which don’t test them) and lectures they attend (which may be recorded, but don’t ask too much of them in terms of engagement). They enjoy the engagement, the challenges of quizzes, the “fun” of interactivities and/or animations that explain various mechanisms.

As I have concentrated on this sphere of multimedia-based eLearning development, I note from reading blogs and articles that the eLearning world is moving in different directions:

In terms of what we do with technology, some talk about instructional design moving into a more “curation” based area (gathering, assessing and providing digital assets such as podcasts, videos, PDFs, etc.); others see eLearning as becoming communities of practice, sharing information and experiences and gathering the same when required. Many still see multimedia as having a role.  Mostly, you hear of a mix of all of these things.

In terms of how we do what we do – this (it seems to me) has become much more fractured. Should we be teaching didactically? (I know I have to right now.) Should we encourage exploration? Should we mix these things? Should we just provide tools and allow learners to find their own path (as ocTEL does)? What is best?

In terms of how we measure our outcomes – the debate here ranges widely, because people forget that (ideally) content will be tailored to specific audiences. So there are calls that all eLearning should be measured by workplace improvement. Sounds great, unless you develop eLearning for 6 year olds. Or “All eLearning should be measured by standardised tests”, also good, unless you are trying to get people to think critically – then you need a human to determine how well that critical thinking has occurred. “No eLearning should be tested based on facts: only on application of information”, again, good, unless the specific sphere calls for an understanding of specific facts.

For me, what is best is answered quite simply as “What is best for the learners”. However, this is not so simple, when learners may not be aware of the opportunities available to them. One thing I like to reference here is stupid Facebook memes. I saw two beside each other the other day. One was basically “What should I do? Let them tell you!” and the other was Henry Ford saying that if he had asked his customers what they wanted, they would have said “A faster horse”. I think we may be at this crossroads in Technology Enhanced Learning: determining the best way to provide learning experiences (or to direct learners to them), while the boundaries of possibility are broken around us every day.

So, why AM I doing ocTEL?

I want to catch up with the developments in other areas.  I need to better understand other approaches other people are taking to engage their learners, and better understand their experiences (as learners and those providing the learning experience).

The Experience (Tin Can) API has really ignited my imagination about what learning can be into the future, and I want to be fully prepared to exploit the potential of learning anywhere/everywhere. I should add, my fundamental starting position is that we are all learning, all the time. For me pedagogy is about designing learning experiences that can capture and enhance this natural instinct. 
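To make the “learning anywhere/everywhere” idea a little more concrete, here is a minimal sketch of the actor–verb–object statement at the heart of xAPI. The learner, course URL and email address are hypothetical examples of my own (not from any real system); a real statement would be sent to a Learning Record Store, which I haven’t shown here.

```python
import json

# A minimal Experience (Tin Can) API statement, built as a plain dict.
# All names and IDs below are hypothetical illustrations; in practice the
# statement would be POSTed to a Learning Record Store (LRS).
def build_statement(actor_email, verb_id, verb_name, activity_id, activity_name):
    """Assemble the actor-verb-object triple every xAPI statement contains."""
    return {
        "actor": {"objectType": "Agent", "mbox": "mailto:" + actor_email},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "objectType": "Activity",
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

statement = build_statement(
    "learner@example.com",
    "http://adlnet.gov/expapi/verbs/completed", "completed",
    "http://example.com/courses/suturing-basics", "Suturing Basics module",
)
print(json.dumps(statement, indent=2))
```

The appeal, for me, is that the same simple “someone did something” record could describe finishing a module, watching a video on a ward, or attending a workshop – learning captured wherever it happens.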

By taking part in ocTEL this year, I want to engage myself further in the benefits, challenges and drawbacks of different methods of Technology Enhanced Learning. 

The one question I am asked the most as an instructional designer: “How do you do that? How can you know about that?”

The question is asked in varying tones, from an undeserved derision (who are you to teach these people?) to undeserved praise (who are you to teach these people?).

The simple fact is, instructional design is not about knowing the details of the content being developed (or, to the layperson – the subject you are teaching), but knowing how to develop the content (or how to communicate the information to the people who need to know it). Dealing with content is the skill. What does that mean? It means many things to many people. I like to go back to first principles. In short: you have a problem. 

Someone (or some population) needs to know X. They do not know it now. Your job is not just to tell them “X”. Your job is to ensure that when they need to use this information (they may be on the phone, they may be in a hospital, they may be flying a plane), X is there, in their head and ready to be used.

So, how do you get X inside their head? I can’t tell you here, because you’ve already formed an idea. An idea about who this learner is and what X is. If I don’t know who they are or what X is, I cannot possibly suggest how you get it inside their head. I can’t even pack its bags for the journey. All I know is you have to communicate it to them (not their boss, not the experts I talk about later, but to that person).

In most cases, I am dealing with technology. This is a function of the environment. People call on an instructional designer when they want something delivered via technology. If they wanted something delivered in a classroom, they would have called a trainer. This does not mean they are right; it is merely a function of how the roles of “trainer” (traditional, instructor-led training and coaching) and instructional designer (new-media, technology assisted training) are viewed. I believe the two will converge at some point. This will offer better results, as trainers and instructional designers will be in a position to better exploit both the instructor-led and technology assisted methods of communicating that are currently somewhat “shut off” from them (because they are currently called upon to provide either instructor led sessions or technology assisted communication).

Then, dealing with people. I work (as all instructional designers do) with Subject Matter Experts, who often have more important things to do with their time. I am blessed in my current job (I know from experience!), in that the SMEs I work with now are willing to help, but are busy (I have worked with those who are not busy, and not willing to help). Dealing with getting the most from the small slice of their schedule is the skill. 

You also need to be able to explain to the Subject Matter Expert, and to whoever manages the person you are communicating with, that this communication is not for them. It is for the audience. This may seem obvious, but in my experience can be the most difficult communication to get across. Managers want to see content in one shape, SMEs want to see it in another. However, to do your job properly, the only people who matter are the audience. The content needs to be in a shape that allows them to consume it in the most efficient way that ensures it remains durable.

Once you’ve cut through who you are talking to, how best to communicate with them and explained to their boss and the subject matter expert what it is you intend to do, it is time to deal with the information. This is the bit I like.

You consider what people need to know, why they need to know it and how they will use this information. This allows you (in discussion with their manager and the SME) to determine learning objectives. These determine what it is someone should be able to do after you have finished talking with them.

With the objectives, you have an end point. Progress! 

So, next you need to know your starting point. Again, the SME should give you an idea of where people are starting from (what they should already know/what they can do). So, armed with this information, you start to create a “story” that begins where the audience currently is, and leads them to where they need to be (i.e. achieving their learning objectives).

Ah, but sometimes, management and coordination is the skill. Many instructional designers will work with programmers and designers to make this story compelling and engaging. This is usually achieved using interaction, quizzes and assessment. If you aren’t using a Rapid eLearning Tool (this is a subject for another day, but in short means you’re doing it all yourself), you have to request this content and keep an eye on its delivery to make sure everything comes together at the right time. Yes, you probably have a project manager who organises all this, but you still need a mechanism to determine what you need, how you get it and when you’ll get it back.

At the end of the road, it is always good to look over the journey. So you create a final assessment (the intermediate quizzes and assessment help people to gauge their learning as they go, so they can go back over anything that the quiz has shown they didn’t understand).  

This final quiz will serve two purposes. The first is testing – to prove to the person you are talking to (and whoever else needs to know) that they now know X, and can hopefully use it. A second purpose is to “activate” the information within their mind. In short: they may well know the information you have communicated to them. But it is latent – sitting in their brain, perhaps having a cocktail and enjoying the view. By quizzing them on this information, you force it to get up and head upstairs to the consciousness, so the brain knows where to find it. This helps to develop and protect their understanding of the information. Which is good, because if they need to know X while they are flying a plane, you want X arriving promptly, not delayed at the gate or snoozing on a lilo, while the brain frantically calls its name over some kind of synapse intercom.

The next skill is one you learned as a child. It is the “spot the difference” skill. What was planned, what was built and how do they compare? Are there differences? Do these differences need fixing? If not, perhaps you need to alter your plan to reflect any changes that were required in the course of the project. Because when you come back in a year or so, you may not be able to remember that last Tuesday Jo Murphy came in and said the TX3720 was going to launch with a specialist module. While the specialist module isn’t the “main event”, it still needs to be mentioned.

After you’re happy, the content needs to go to the SME to make sure it is correct. If it is not correct, terrible things may happen. And many instructional designers feel an SME review is a terrible thing, happening. But this final review ensures that even though you are not an expert, that at least what you are communicating is correct.

After all that, it’s probably time for a pint. 

What this post is about

A simple reflection on the importance of “book learning” and more traditional teaching modes, within a world where highly interactive and engaging learning content is becoming more widespread.
What this post is not about

This is not a detailed discussion on technical capabilities or creative development.
Here is a summary

This post is part 1 of a 2-part posting about Apple’s iBooks. I had intended to write solely about iBooks (under the title “Textbooks and eBooks and iBooks, Oh My!”). However, I got quite far down the page without even mentioning iBooks. So I have left this first part as a reflection on the importance of some more traditional modes of study.

For most – perhaps all – instructional designers, Apple’s iBooks appears to be a welcome step forward in the development of interactive educational content.  However, for me, a key question arises: are we all getting too familiar with – or dependent on – highly interactive content? Is there not still a place for the discipline of traditional book learning? And if you agree that there is, how do we maintain this discipline in an age of Google (search for anything), iBooks (easy reference and education) and the ever-greater need to create on-demand content?

Here Begins the Post

A long time ago –  a good few years ago – a colleague and I declared that we had entered the age where “Education was no longer about memory. It is purely about application.” Please accept my apologies – I did not have a blog at the time, and so could not send out the memo.

I think at the time, we had got ourselves some killer 2GB USB sticks.  I felt you needed to remember nothing. Just store everything, then look it up. Google’s search technology meant that your digital content could be chucked anywhere and retrieved with nothing more than a keyword.

I can’t remember exactly the chronology, but around this time Gmail arrived, as did iPods, (then) podcasting, Wikipedia, and a huge range of online resource sites – like the OED, and various others (including the idea that companies would host their own knowledge-retention systems).

Memory was handled by technology – it was no more than storage. I had decided at the time that no one really needed to remember much – they had to learn the skills required to find information. In my innocence, I also believed that learning skills were all about knowing the course of action to take and the information required to undertake that action.  Book learning – I thought – was on its last legs.

How wrong I was.

The landscape of eLearning has changed dramatically in this time. Many previous certainties have fallen by the wayside. Flash content wanes, while on-demand content (e.g. reference material, podcasts, vodcasts, etc.) has increased in popularity. Learning is considered to be everywhere.

With the rise of rapid eLearning (still quite dependent on Flash – because of the form-based authoring that made it so popular – but not as much as it used to be), we have developed ever more inventive ways to get people to engage with content – focus their attention, interact with information – in order to better “internalise” (I hate this word, but accept it) and use it.

The instructional designer is often faced with quite exciting choices for any new project they begin. What kind of media is available? How can this be deployed? How can we make it so that learners can explore this information – so that rather than “going through the motions”, they are creating their own path – and therefore engaging more and more deeply with the content? Furthermore, how can we organise this information so that learners can access what is required – but what is most important for them?

Why was I wrong?

Because every day, I go into work and what concerns me most about any and all aspects of development – whether we are planning, scripting, testing or evaluating – are learning objectives.

Don’t get me wrong. I love the tech stuff. I hope to explore it a little more in the next post.

But my fundamental concern in developing learning is: What should learners be able to do when they have done whatever we create? Furthermore, what is it that we should create? What is it that will enable learners to do these things that we (or someone) has decided they should be able to do?

While engaging learning is certainly the way to go for higher-level concepts (working with information –  taking actions, making decisions, considering options, etc.), sometimes the basics are best learned by rote. An unpopular view, but I still remember my multiplication tables, rules of grammar (even if I don’t always adhere to them) and much of the poetry I learned in school. Ironically, a lot of the good poetry that I loved in university is missing words, phrases, often lines. Sometimes, I can remember a snippet and Google can help me to find the rest. Other poems that I worked with in university, I learned by rote. By reading them over and over until they were welded to my brain (a curious phrase I have adopted recently).  I should mention here that learning by rote still had some more engaging aspects – my father used to help us with multiplication and alphabet by using a distinct rhythm; this rhythm certainly helped the drilling of the information.

I am not saying this kind of information could not be learned through more exploratory means – it most certainly can. But would it be learned as quickly? Furthermore, by knowing my multiplication tables, I believe my understanding of multiplication was deepened (when it was properly explained to me in terms of sets, how multiplication can be used, squaring and cubing numbers, etc.), as I had to hand all the examples I needed – whether I wanted to apply 2×2 or 12×12 to the concepts I was learning.

These examples of ‘basics’ are somewhat extreme – as far as ‘basics’ go. Basics might also include content that lawyers, doctors, accountants and IT professionals might need to have – solid knowledge, not at their fingertips, but within their mind.

As very simple (perhaps over simple) examples, consider:

If you went to a lawyer, explained a complaint you might have against someone and the lawyer – suitably outraged – explained to you that this was wrong and they would set it right; only to later tell you that there is nothing that can be done through the legal system – how would you feel about that lawyer?

While we are all very aware of the marvellous alchemy of accountants in the past decade, what if you were to go to an accountant who didn’t know whether the purchase of a new piece of kit was an asset to your business or a liability?

What if you had a fall, hurt your arm and went to the doctor? If the doctor had to consult a book (or website) about pains in the arm to suspect that a heart attack might be in the offing, how much confidence would you have in that doctor?

In all these cases, a fundamental level of knowledge is missing – one that hard study is probably best suited to remedying. (I’m using these examples as I know what these people do. I would venture that for almost any professional – builders, plumbers, web designers, web developers, etc. – there is a foundation of knowledge that is required to work competently, and upon which they can build to develop their own knowledge, experience and career.)

While interactive content can easily deal with broad stroke concepts and decisions that need to be made, there are often a wide range of situations that have occurred before. Knowing how these have been approached, and the outcome from the approach(es) taken can help someone make better decisions, more quickly.

Also required is a teacher/trainer/mentor who can test the knowledge gained from this study – both the type of knowledge gained and its application – by asking random, real-life and highly complex questions (perhaps questions that lead to further questions, based on the specific answer provided by the learner).

While we would love to cater for this in eLearning, it might never be possible to develop something that worked as efficiently as one-on-one Socratic questioning.

I am an instructional designer. I make my living from, and have dedicated over ten years of my life to, the development of learning content for a range of professionals and purposes. I am not arguing that eLearning is useless, or only partially useful. I am arguing that there are certain situations in which older-style “book learning” and “teaching” are best suited to developing and testing knowledge.

This will be further developed and applied in the next post – on the iBook and its applications (in a few days or perhaps next week); but for now, I shall say thank you for reading.

Happy new year! And yes, along with a long line of others, one of my resolutions is to pay more attention to this blog. It has been difficult to keep a regular presence; sadly, I seem to spend more time on my hobby writing blog. Anyway, this time, I’m going to talk about the whys and wherefores of teaching concepts using eLearning. Often considered the most boring content to develop and consume, I think some simple tips can help to make your content less page-turning and more page-turner.

An addendum. I need to acknowledge Kieron Strong, trainer extraordinaire, as the inspiration for – and a source of information used in – this post. Much of what I have written here has been drawn from conversations we have had about training and development. However, Kieron did not contribute to the planning or writing of this post (and indeed had no knowledge of my writing it). The opinions are my own alone. He has his own opinions, which I am sure he would be happy to share with you. Anything you disagree with in this post – you disagree with me, not with Kieron.

What this post is about

I’m going to write about some cases in which it is important that concepts be taught in eLearning. Also, how concepts can be taught in a more engaging fashion. I have often found that the immediate impetus is to provide conceptual information (guidelines, facts, “information” etc.) in a page-turning fashion, or simply as resources to be consulted in order to complete task-based content with a more practical focus.

What this post is not about

I’m certainly not arguing that teaching concepts has to change everywhere and for all projects. I am not belittling the work, effect and impact of more practically-focused online training.

Here is a summary

Task-based learning has perhaps saved online learning from itself (especially during the 1990s, when it was “Page-page-question; Page-page-rollover; Page-drilldown (essentially more pages)-rollover”); however, it sometimes feels that the balance has swung the other way – that the concepts that inform tasks (the reasons certain steps are taken, and why they are taken in a specific order) are suddenly less important. This balance needs to be redressed. Concept-focused learning needn’t be boring or gimmicky; it can also provide essential understanding that helps learners apply other knowledge more effectively.

Here Begins the Post

When an instructional designer (ID) begins a new project, there is an air of magic about it. Especially where the project is entirely new (i.e. the content to be developed is a suite on its own, or a specific one-off; in both cases freeing the ID from any standardised approaches that may have been defined for the wider project).

The opportunities are endless (sometimes, this can be daunting).

Of particular interest to most IDs is the opportunity to create simulation- or scenario-based content that will guide the learner through a series of steps to complete some complex task. All “excess” or “residual” information can be provided as resources that the learner consults in order to complete the task. The whole thing is immersive, exploratory and altogether kicks ass – for the ID as designer/developer, and for the prospective learner, who will have a much more engaging experience as a result.

However, sometimes, the concept needs to be given greater focus. The concept is not always “excessive” or “residual”. Knowing why one takes a specific step can be as important as knowing the step that needs to be taken.

For example – a technician may need to run a series of diagnostic steps to determine the cause of a fault. Each step they take will lead them to one conclusion or another. By understanding what these conclusions mean, the technician might determine the cause of the fault more efficiently – or indeed identify a wider problem (something causing several specific faults).

For example, consider me – even though I’m no technician – at home. I may notice that the lamp has gone out. I could check the bulb, but to no avail. I might then check the plug fuse; still to no avail. At this point, I might realise that perhaps there has been a power cut, or that I have blown a fuse at the fuse box. So, I might check the clock on the oven, or just check the fuse box. These steps make sense – going from the simplest options to something bigger or more complex. If it’s night-time, and the lamp, my laptop, TV, stereo, etc. all cut out, I’ll probably head straight for the fuse box, using the illumination of my phone display as a torch. I don’t go through each simple step in turn, as I know immediately that there is probably a bigger problem at hand. Furthermore, I have never learned how to use a mobile as a torch; but I do know that pressing the screen randomly will keep it illuminated, and that I can use that illumination to light my way.
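A fault-finding routine like this amounts to an ordered list of checks, escalating from the simplest explanation outward, with a shortcut when the pattern of symptoms points to a bigger cause. As a minimal sketch (all the check functions and their results are invented for illustration):

```python
# Hypothetical sketch (checks and results invented): ordered
# diagnostic steps, running from the simplest explanation outward.

def check_bulb():
    return False          # stand-in: the bulb turned out to be fine

def check_plug_fuse():
    return True           # stand-in: the plug fuse had blown

def check_fuse_box():
    return True           # stand-in: a tripped breaker at the fuse box

CHECKS = [
    ("bulb", check_bulb),
    ("plug fuse", check_plug_fuse),
    ("fuse box", check_fuse_box),
]

def diagnose(everything_is_out=False):
    """Return the likely cause of the outage.

    If several appliances died at once, the pattern points to a
    bigger problem, so skip straight to the fuse box."""
    checks = CHECKS[-1:] if everything_is_out else CHECKS
    for name, check in checks:
        if check():
            return name
    return "unknown - possible power cut"

print(diagnose())                        # step-by-step: "plug fuse"
print(diagnose(everything_is_out=True))  # pattern shortcut: "fuse box"
```

The point is the ordering and the shortcut, not the checks themselves: understanding what each result means is what lets the routine skip steps safely.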

Another example. Consider the dreaded position of the Customer Service Rep, manning the phones. In just about every corporate environment, a prepared script is provided, through which the CSR must work. By following their training (and indeed job specification) to the letter, they could make a bad situation worse. Imagine a customer phones after something has gone wrong.

They are frustrated with their situation, and perhaps with the company. They aren’t necessarily angry at the CSR – but they will be taking their frustration out on the CSR.  If the CSR misreads this situation, they may either:

  • Respond personally – over-defensively, taking umbrage at the caller, who may be rude to them
  • Try to ignore the emotion of the caller, and proceed through the prescribed script until they can reach a resolution – which, even if it deals with the customer’s primary problem, may well leave the caller frustrated that their anger/emotional state has not been recognised/acknowledged

Both of these responses create an unsatisfactory (or at the least, not satisfactory enough – given most call centres aim for “excellent customer experiences”) outcome for the customer, which may have knock-on effects.

A final example. In medical cases, trainee doctors may be faced with several options for dealing with a case. The best option will depend on previous medical history, patient allergies to medications/treatments, and a range of other factors that are (frankly) beyond my understanding. Traditionally, understanding these options – and the factors that should contribute to decision-making – is gained through late nights of reading books and following a senior doctor on the rounds. However, I believe eLearning could usefully inform trainee doctors of the various and diverse contexts and situations in which decisions must be made, and of the type of information they require to make the best decision.

So how does teaching concepts directly help in these cases?

  • By learning to recognise and deal with multiple causes of power outages – and perhaps the patterns that indicate them – I might be able to find the quickest set of steps required to deal with my faulty lamp/TV/stereo, depending on the circumstances I face.
  • By learning to recognise customer frustration, the CSR may be able to deploy the right kind of tone/interaction that allows them to calm the customer and reassure them that they are trying to resolve the problem at hand.
  • By learning to recognise the most common allergies, issues with specific treatments and drug interactions (although it is fair to say, not every single one could be learned – knowing the most common is probably useful), the trainee doctor can be better informed, which will help them to select the best course of action for their patient.

So how can concepts be taught in a more engaging way online?

Here are some tips that might be worth considering:

Set the scene/purpose convincingly. This might mean using a scenario (I know this goes against what I mentioned previously – but what I mean is to set up a scene from which the concept(s) you want to teach can be elicited). However, if you do use a scenario, do not make it too specific. The problem with teaching general concepts through very specific scenarios is that the learner may interpret the content as saying “in this very specific situation, you will need to know this information”, when in fact you are trying to get them to understand that they need to know it in almost all situations. Alternatively, setting the scene might mean writing more engaging content to draw the learner in – making them scared (by showing the negative consequences of not knowing the information provided) or more motivated (by showing the positive consequences of knowing it).

Include lots of questions – not just as quizzes (though do include quizzes: I would rather have a quiz and no copy than copy and no quiz), but in the body of your content. I have found it useful to ask questions directly preceding content to focus the learner’s attention, or to use their previous knowledge to introduce a new concept. Questions wake learners up a bit, and therefore “tune” their minds to the information you are asking them to take in.

Include good graphs/animations/visual content – this is fairly obvious and covered in every ID training programme. If you have to teach the relationship between things, or the inner workings of something, use graphical content as much as possible. Whatever about a picture being worth a thousand words, graphs can also be reproduced as downloads, which make excellent job aids.

For very complex concepts, take small steps and ask lots of questions – this has proven particularly useful on some projects I have worked on. For one extremely complicated concept, we provided one sentence and one question per screen. The sentence would describe a simple aspect (illustrated with an animated graphic); then, based on this, the learner would be asked a question about the consequence of that simple aspect. In most cases (but not all), the SME agreed that, given the single sentence and graphical aid, the learner should be able to make an informed guess (I hate blind guesses; I believe informed guesses are eLearning gold dust) at the next sentence. Obviously, in these cases the question is prompting the next sentence/screen, but it is useful nonetheless to take a learner through a concept step by step, asking questions to help build the cognitive bridges that lead to better understanding. It also meant we weren’t dumping a pile of text and a load of images on one page/screen and asking the learner to read through it before moving on to something more interesting (the learner reads: “This is something you don’t want to do – but hold on, because there’s something you do want to do next!”).
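The screen-by-screen pattern described above can be sketched as simple data – one statement, one question, and the bridging answer that anticipates the next screen (the content here is invented for illustration):

```python
# Hypothetical sketch of the one-sentence, one-question screen pattern.
# Each screen pairs a simple statement (with its illustration) with a
# question whose answer bridges to the next screen's statement.

screens = [
    {"sentence": "Current flowing through a wire generates heat.",
     "question": "What might happen if too much current flows?",
     "bridge": "The wire overheats."},
    {"sentence": "An overheating wire can melt its insulation.",
     "question": "How might a circuit protect itself from overheating?",
     "bridge": "By breaking the circuit - for example, with a fuse."},
]

for number, screen in enumerate(screens, start=1):
    print(f"Screen {number}: {screen['sentence']}")
    print("  Q:", screen["question"])   # prompts the learner's informed guess
    print("  A:", screen["bridge"])     # feedback leads into the next screen
```

Laying the content out like this also makes it easy to check, before development, that each bridge really does set up the sentence that follows it.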

Answer the Question the Learner is Asking – namely “Why can’t I get on with the good stuff – why should I care about all this?” Provide context for your information, explain where it might come in handy. As with the scenarios mentioned above, do not make this too specific – for fear that your learner believes the examples you give are the only situations in which the information is used.

Here Ends The Post

Thanks for reading. As always, comments and conversation are welcome. What do you think?

Hi there. Sorry I’m so late. I got caught in traffic. From now on, I’m going to try to get the train – which means posting once every month. Let’s see how that goes.

What this post is about: This post discusses some pitfalls I have come across when trying to create effective assessment or interactivity using quizzes/Q&As (and some that I have noticed in other people’s learning content).

What this post is not about: This is not an in-depth discussion on the various types and methods of assessment. Rather, it talks about what you might try to do (but really should avoid) when developing quizzes to test for knowledge retention and understanding.

Here is a summary: The last post talked about 5 things to do with quizzes, which I hope you found useful. There are many temptations to over-reach when asking questions – leading to questions which you might think are more engaging and exercising, but which are in fact irritating for learners, or add nothing to the educational value of your content.

Here begins the post

Quizzes can be the hardest part of developing online learning. What’s a good question? The content seems so thin, how can I ever get X questions from it? How can I grab the learner’s attention with this? How can I avoid boring them?

You’re looking at a blank sheet of paper, wondering what you should be asking…

…then you get “creative.” You try to ask the same questions in different ways. This is not always a bad thing – especially where the same information might be applied in different contexts.  However, problem questions come from over-reaching. Here are 5 ways I’ve seen people over-reach, which (for me) just don’t hit the mark.

5 Things to Avoid

1. Avoid dealing in “semantics” and using “trick” questions

Both of these points relate to the language you use when asking questions. The following therefore does not necessarily apply where your content addresses comprehension, definitions, grammar or parts of language.

Using “semantics” means doing things like:

  • asking a question using a double negative
  • playing on words/puns or subtler meanings to ‘throw’ a learner
  • using options where a similarly-spelled word is used
  • using difficult sentence structures, especially where cause/effect or timelines are being asked about

This sort of tactic is just not fair to learners.  You aren’t testing them on their language skills; you’re testing them on the knowledge they are meant to be gaining as a result of your content.

Many argue that asking questions like this is a good way to “keep learners on their toes”, or “make sure they’re paying attention”. I’m not so sure.

  • First, keeping learners on their toes can be achieved by asking pop-questions within your content, or by asking about subtle differences in the application of knowledge – not by seeing whether they notice a minor aberration in your grammar, punctuation or spelling.
  • Furthermore, the quiz itself is a good way to make sure learners are paying attention; asking that they notice very subtle linguistic issues when they are concentrating on the concepts, theories or techniques you have been describing is off-putting.

Trick questions I have more difficulty with (not just in terms of their use, but also in describing them). These are questions where you actively try to lead a learner astray (i.e. away from the right answer).

I think quiz questions – whether for assessment or self-reflection – should be fairly unbiased. For example, multiple choice questions should present the learner with a series of equally likely (or unlikely) options. If it’s skewed one way, the question might be too easy (which doesn’t really exercise them); skewed the other, it will feel to the learner like you are trying to trick them (which you are!). The problem with tricking a learner is that you annoy them or (worse) they begin to question the value of your content.

However, there is a subtlety in the “trickiness” here. Sometimes, you may want to ask about the differences between certain concepts or objects. Or, you may want to ask when a specific piece of information might be applied to a situation. In these cases, subtle differences can be a very useful way to tease out such differences (in fact, I’m a particular fan of “True/False” questions, where you ask whether a piece of information can be applied in the wrong context. The answer should be False, and the feedback can be used to explain the importance of context and information).

Be fair to your learners: They are there to learn from your content – not have their comprehension/concentration tested.

2. Avoid writing ‘to content’

You have objectives that you want your learner to achieve. Often, these objectives will be put to good use in formal assessment quizzes, but forgotten when asking about the content a learner has just encountered. What often happens is that a sentence is lifted from the content and reworked as a question. Quite often, this will work for you (after all, your content is trying to drive your objectives).

However, this approach can lead to problems. If the sentence lifted from the content is ancillary (in that it explains something not directly addressing an objective, but supporting other information that directly addresses the objective), the learner may think this piece of information is more important than it actually is.

Another problem is where the sentence or piece of content you are basing your question on sits within a particular context in the learning content. Taken out of that context, the information may lose meaning or importance. It may even become confusing, rendering any question based on it meaningless.

Finally, asking questions that speak to context can lead to asking unimportant questions. This is particularly the case where you have used case studies or examples in your content. Asking about a scenario you outlined, or the actions taken by some ‘pretend’ actor, leads the learner completely astray from the point of your learning content. Of course, asking about the theory, concepts or steps that lie behind a scenario is fine. But all too often, I find learners being asked “How much did Mary spend in the shop?” rather than “In a shop, an apple costs 50c, a coke €1 and a sandwich €5. If a customer buys 2 sandwiches, 3 cokes and 4 apples, how much do they spend?”
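For what it’s worth, the better-framed version of that question works out like this:

```python
# Working the shop question through: 2 sandwiches, 3 cokes, 4 apples.
prices = {"apple": 0.50, "coke": 1.00, "sandwich": 5.00}
basket = {"sandwich": 2, "coke": 3, "apple": 4}

total = sum(prices[item] * quantity for item, quantity in basket.items())
print(f"Total: €{total:.2f}")  # 2*5.00 + 3*1.00 + 4*0.50 = €15.00
```

The second framing tests the pricing concept itself; the first only tests whether the learner remembers Mary.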

Remember: your IDD (instructional design document) should map the course goal to objectives, and those objectives to content. Both robust assessments and lighter self-reflective quizzes need to step back to the objectives in order to be really useful for learners.

Use your objectives to guide your questions, rather than simply reviewing a topic for easy content to turn into questions.

3. Avoid developing tests as a long series of simple multiple choice questions

This is the hardest “avoid” measure for me to defend. House style, SME opinions, certification demands and client requirements could make this point redundant. However, I shall argue on.

Being asked the same type of question over and over is pretty boring for learners. Click here, click there. One option, two options. Etc. Et cetera.  If you can at all, avoid doing this.

Most LMSs and authoring tools now offer a range of question types that you can use. Try them out – don’t tie yourself to multiple choice simply because it seems to be the most robust type of question. All types of questions can be used to ask about the truth of a statement or a definition. You could also consider:

  • Fill-in-the-blanks for cause/effect, timelines, steps, definitions and conditions
  • Matching for the same as above, plus relationships
  • True/false for subtle differences arising from context, use of information, etc.
  • Graphical questions for relationships, screen-based information, etc.
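To illustrate how a mixed quiz might hang together, here is a hypothetical sketch of a tiny data model covering a few of these question types (nothing here reflects any particular LMS or authoring tool):

```python
# Hypothetical sketch: one small data model covering several question
# types, so a quiz need not be all multiple choice.
from dataclasses import dataclass
from typing import Any

@dataclass
class Question:
    kind: str      # "true_false", "fill_blank", "matching", ...
    stem: str
    answer: Any    # the answer's shape varies with the kind

quiz = [
    Question("true_false",
             "A plug fuse protects the whole house circuit.", False),
    Question("fill_blank",
             "Check the ____ before checking the fuse box.", "plug fuse"),
    Question("matching",
             "Match each symptom to its likely cause.",
             {"one lamp out": "bulb", "whole room out": "fuse box"}),
]

def grade(question: Question, response: Any) -> bool:
    """Mark a response against the stored answer, whatever its shape."""
    return response == question.answer

print(grade(quiz[0], False))        # the statement is indeed false
print(grade(quiz[1], "fuse box"))   # wrong blank, so marked incorrect
```

The same `grade` check works across types, which is one reason varying the question format costs less effort than it might seem.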

All these question types can also be used to provide scenarios or contexts within which you can really test your learners’ understanding. There are of course many more contexts in which these questions can be used – perhaps the subject for a series of posts – but I guess my advice is:

Avoid doing the same thing over and over again, try to be creative and consider new ways to ask about the information

A corollary of this is the way you frame your questions when you are stuck with multiple choice alone. If the stem of your question is exactly the same over 20 consecutive questions, your learners’ minds will start to stray. The ways to frame a question are bound only by your imagination. I also know that this advice could be seen to contradict my first point – but it shouldn’t: reframing a question does not mean making it more obscure. Instead consider:

  • If you are asking about information, asking learners to identify a definition for something
  • Also with information, asking when information might be used
  • Also, providing a scenario and asking what information might be used
  • If you are asking about steps, providing one of the steps and asking the learner what happens previously or next
  • Also with steps, listing the steps in a process, then asking what was missing
  • Also, asking why a step is important

4. Avoid getting ‘flashy’ for the sake of it

The exact opposite of the preceding point. Trying to show off your content-creation talents with wildly differing question types, formats and approaches could leave your learners with their heads spinning.

It is tempting to try to “spice things up” or even “stir things up” using the vast selection of questions and content development tools out there. However, you should keep your learner in mind – they should be concentrating on getting to the right answers, rather than the question itself. Drawing too much attention to the format or bells and whistles attached to your question is something akin to writing a bad postmodern novel – those that appreciate the aesthetic of your great efforts may well miss the point of your making them.

Furthermore, great swings in the delivery of content or the way in which learners answer questions could lead to confusion on their part, and lead them astray from what is important in your learning content.

I’m not saying you should avoid being ambitious – rather, the manner in which you ask your question should suit the information you want the learner to work with. Sometimes the information you are testing on may require complex questions – but then you should make sure all the questions in your quiz work together in some way.

You want learners to be exercised by quizzes, but you don’t want them to be worn out by them.

5. Avoid overcomplicating your questions

Again, related to all of these points: keep it simple. Of course, some content is necessarily complex. But complexity is the addition of several simple pieces.

With the proliferation of available question types, you may be tempted to flex your muscles and shoe-horn a question into an ill-fitting question type. Alternatively, you may be tempted to create a large, complex scenario-based quiz that learners may not be able to follow. In short, these approaches add up to the same thing: making the question difficult to comprehend. The longer a learner has to spend working out what a question means, the less effort they will put into actually thinking about the right answer.

This brings us back to the beginning – you’re not testing their language ability – you are testing their comprehension of the information you provide in your content.

Ask direct, pointed questions – so the learner concentrates on calling up the information, rather than working on understanding what the question is asking of them.

Here ends the post.

Talk to you next month! As always, please do leave a comment below.

Also, I’ve been toying with the idea of porting this whole project to a Tumblr blog, which might make it easier to start and engage in conversations. Please do let me know your thoughts on this.