What this post is about

A simple reflection on the importance of “book learning” and more traditional teaching modes, within a world where highly interactive and engaging learning content is becoming more widespread.
What this post is not about

This is not a detailed discussion on technical capabilities or creative development.
Here is a summary

This post is part 1 of a 2-part posting about Apple’s iBooks. I had intended to write solely about iBooks (under the title “Textbooks and eBooks and iBooks, Oh My!”). However, I got quite far down the page without even mentioning iBooks. So I have left this first part as a reflection on the importance of some more traditional modes of study.

For most – perhaps all – instructional designers, Apple’s iBooks appears to be a welcome step forward in the development of interactive educational content.  However, for me, a key question arises: are we all getting too familiar with – or dependent on – highly interactive content? Is there not still a place for the discipline of traditional book learning? And if you agree that there is, how do we maintain this discipline in an age of Google (search for anything), iBooks (easy reference and education) and the ever-greater need to create on-demand content?

Here Begins the Post

A long time ago –  a good few years ago – a colleague and I declared that we had entered the age where “Education was no longer about memory. It is purely about application.” Please accept my apologies – I did not have a blog at the time, and so could not send out the memo.

I think at the time, we had got ourselves some killer 2GB USB sticks.  I felt you needed to remember nothing. Just store everything, then look it up. Google’s search technology meant that your digital content could be chucked anywhere and retrieved with nothing more than a keyword.

I can’t remember the exact chronology, but around this time Gmail arrived, as did iPods (and then podcasting), Wikipedia, and a huge range of online resource sites – like the OED, dictionary.com, reference.com and various others (including the idea that companies would host their own knowledge-retention systems).

Memory was handled by technology – it was no more than storage. I had decided at the time that no one really needed to remember much – they had to learn the skills required to find information. In my innocence, I also believed that learning skills were all about knowing the course of action to take and the information required to undertake that action.  Book learning – I thought – was on its last legs.

How wrong I was.

The landscape of eLearning has changed dramatically in this time. Many previous certainties have fallen by the wayside. Flash content has waned, while on-demand content (e.g. reference material, podcasts, vodcasts, etc.) has grown in popularity. Learning is considered to be everywhere.

With the rise of rapid eLearning (still quite dependent on Flash – because of the form-based authoring that made it so popular – but not as much as it used to be), we have developed ever more inventive ways to get people to engage with content – focus their attention, interact with information – in order to better “internalise” (I hate this word, but accept it) and use it.

The instructional designer is often faced with quite exciting choices for any new project they begin. What kind of media is available? How can this be deployed? How can we make it so that learners can explore this information – so that rather than “going through the motions”, they are creating their own path, and therefore engaging more deeply with the content? Furthermore, how can we organise this information so that learners can access not only what is required, but what is most important for them?

Why was I wrong?

Because every day, I go into work and what concerns me most about any and all aspects of development – whether we are planning, scripting, testing or evaluating – is learning objectives.

Don’t get me wrong. I love the tech stuff. I hope to explore it a little more in the next post.

But my fundamental concern in developing learning is: what should learners be able to do when they have done whatever we create? Furthermore, what is it that we should create? What is it that will enable learners to do these things that we (or someone) have decided they should be able to do?

While engaging learning is certainly the way to go for higher-level concepts (working with information – taking actions, making decisions, considering options, etc.), sometimes the basics are best learned by rote. An unpopular view, but I still remember my multiplication tables, rules of grammar (even if I don’t always adhere to them) and much of the poetry I learned in school. Ironically, a lot of the good poetry that I loved in university is now missing words, phrases and often whole lines from my memory. Sometimes I can remember a snippet and Google can help me to find the rest. Other poems that I worked with in university I learned by rote, by reading them over and over until they were welded to my brain (a curious phrase I have adopted recently). I should mention here that learning by rote still had some more engaging aspects – my father used to help us with multiplication and the alphabet by using a distinct rhythm; this rhythm certainly helped the drilling of the information.

I am not saying this kind of information could not be learned through more exploratory means – it most certainly can. But would it be learned as quickly? Furthermore, I believe that knowing my multiplication tables deepened my understanding of multiplication (when it was properly explained to me in terms of sets, the uses of multiplication, squaring and cubing numbers, etc.), as I had to hand all the examples I needed – whether I wanted to apply 2×2 or 12×12 to the concepts I was learning.

These examples of ‘basics’ are somewhat extreme – as far as ‘basics’ go. Basics might also include content that lawyers, doctors, accountants and IT professionals might need to have – solid knowledge, not at their fingertips, but within their mind.

As very simple (perhaps over simple) examples, consider:

If you went to a lawyer, explained a complaint you had against someone, and the lawyer – suitably outraged – told you that this was wrong and they would set it right, only to later tell you that there is nothing that can be done through the legal system – how would you feel about that lawyer?

While we are all very aware of the marvellous alchemy of accountants in the past decade, what if you were to go to an accountant who didn’t know whether the purchase of a new piece of kit was an asset to your business or a liability?

What if you had a fall and hurt your arm, so you went to the doctor? If the doctor consulted a book (or website) about pains in the arm and suspected a heart attack might be in the offing, how much confidence would you have in that doctor?

In all these cases, a fundamental level of knowledge is missing – knowledge that hard study is probably best suited to providing. (I’m using these examples because I know what these people do. I would venture that for almost any professional – builders, plumbers, web designers, web developers, etc. – there is a foundation of knowledge that is required to work competently, and upon which they can build to develop their own knowledge, experience and career.)

While interactive content can easily deal with broad-stroke concepts and the decisions that need to be made, there is often a wide range of situations that have occurred before. Knowing how these have been approached, and the outcomes of the approach(es) taken, can help someone make better decisions more quickly.

Also required is a teacher/trainer/mentor who can test the knowledge gained from these books – both the type of knowledge gained and its application – by asking random, real-life and highly complex questions (perhaps that lead to further questions based on the specific answer provided by the learner).

While we would love to cater for this in eLearning, it might never be possible to develop something that worked as efficiently as one-on-one Socratic questioning.

I am an instructional designer. I make my money from, and have dedicated over ten years of my life to, the development of learning content for a range of professionals and purposes. I am not arguing that eLearning is useless, or only partially useful. I am arguing that there are certain situations in which older-style “book learning” and “teaching” are best suited to developing and testing knowledge.

This will be further developed and applied in the next post – on the iBook and its applications (in a few days or perhaps next week); but for now, I shall say thank you for reading.


Happy new year! And yes, along with a long line of others, one of my resolutions is to pay more attention to this blog. It has been difficult to keep a regular presence; sadly, I seem to spend more time on my hobby writing blog. Anyway, this time, I’m going to talk about the whys and wherefores of teaching concepts using eLearning. Often considered the most boring content to develop and consume, I think some simple tips can help to make your content less page-turning and more page-turner.

An addendum. I need to acknowledge Kieron Strong, trainer extraordinaire, for the inspiration and information used in this post. Much of what I have written here has been drawn from conversations we have had about training and development. However, Kieron did not contribute to the planning or writing of this post (and indeed had no knowledge of my writing it). The opinions are my own alone. He has his own opinions, which I am sure he would be happy to share with you. Anything you disagree with in this post – you disagree with me, not with Kieron.

What this post is about

I’m going to write about some cases in which it is important that concepts be taught in eLearning, and how concepts can be taught in a more engaging fashion. I have often found that the immediate impetus is to provide conceptual information (guidelines, facts, “information”, etc.) in a page-turning fashion, or simply as resources to be consulted in order to complete task-based content with a more practical focus.

What this post is not about

I’m certainly not arguing that teaching concepts has to change everywhere and for all projects. I am not belittling the work, effect and impact of more practically-focused online training.

Here is a summary

Task-based learning has perhaps saved online learning from itself (especially during the 90s, when it was “Page-page-question; Page-page-rollover; Page-drilldown (essentially more pages)-rollover”); however, it sometimes feels that the balance has swung the other way – that the concepts that inform tasks, the reasons certain steps are taken and why they are taken in a specific order, are suddenly less important. This balance needs to be redressed. Concept-focused learning needn’t be boring or gimmicky; it can also provide essential understanding to help learners apply other knowledge in a more effective fashion.

Here Begins the Post

When an instructional designer (ID) begins a new project, there is an air of magic about it. Especially where the project is entirely new (i.e. the content to be developed is a suite on its own, or a specific one-off; in both cases freeing the ID from any standardised approaches that may have been defined for the wider project).

The opportunities are endless (sometimes, this can be daunting).

Of particular interest to most IDs is the opportunity to create simulation- or scenario-based content that will guide the learner through a series of steps to complete some complex task. All “excess” or “residual” information can be provided as resources that the learner consults in order to complete the task. The whole thing is immersive, exploratory and altogether kicks ass – for the ID as the designer/developer and for the perceived learner, who will have a much more engaging experience as a result.

However, sometimes, the concept needs to be given greater focus. The concept is not always “excessive” or “residual”. Knowing why one takes a specific step can be as important as knowing the step that needs to be taken.

For example – a technician may need to run a series of diagnostic steps to determine the cause of a fault. Each step they take will lead them to some conclusion or another. By understanding what these conclusions mean, the technician might determine the cause of the fault more efficiently – or indeed identify a wider problem (something causing several  specific faults).

For example, consider me – even though I’m no technician – at home. I may notice that the lamp has gone out. I could check the bulb, but to no avail. I might then check the plug fuse; still to no avail. At this point, I might realise that perhaps there has been a power cut, or I have blown a fuse at the fuse box. So, I might check the clock on the oven, or just check the fusebox. These steps make sense – going from the most simple options to something bigger or more complex. If it’s night-time, and the lamp, my laptop, TV, stereo, &c all cut out, I’ll probably head straight for the fusebox, using the illumination of my phone display as a torch. I don’t go through each simple step in turn, as I know immediately that there is probably a bigger problem at hand. Furthermore, I have never learned how to use a mobile as a torch; but I do know that pressing the screen randomly will keep it illuminated, and that I can use that illumination to light my way.
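The troubleshooting reasoning above – escalate through the simple checks one by one, unless the pattern of failures points straight to a bigger cause – can be sketched as a tiny decision procedure. This is purely illustrative: the function name, the checks and their wording are my own assumptions, not anything from a real diagnostic guide.

```python
# Illustrative sketch of the fault-finding logic described above; the
# checks and their ordering are assumptions for the sake of the example.

def next_check(dead_devices):
    """Suggest the first diagnostic step, given the set of devices
    that have stopped working."""
    if len(dead_devices) > 1:
        # Several devices out at once suggests a wider problem:
        # skip the simple checks and go straight to the fuse box.
        return "check the fuse box"
    # A single failure: start with the simplest possible cause,
    # then escalate step by step.
    return "check the bulb, then the plug fuse, then the fuse box"
```

Knowing the concept – what each failure pattern means – is what lets the troubleshooter jump straight to the right branch, rather than walking every step in order.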

Another example. Consider the dreaded position of the Customer Service Rep (CSR), manning the phones. In just about every corporate environment, a prepared script is provided, through which the CSR must work. By following their training (and indeed job specification) to the letter, they could make a bad situation worse. Imagine a customer phones who has had something go wrong.

They are frustrated with their situation, and perhaps with the company. They aren’t necessarily angry at the CSR – but they will be taking their frustration out on the CSR.  If the CSR misreads this situation, they may either:

  • Respond personally – over-defensively, taking umbrage at the caller, who may be rude to them
  • Try to ignore the emotion of the caller, and proceed through the prescribed script until they can reach a resolution – which, even if it deals with the customer’s primary problem, may well leave the caller frustrated that their anger/emotional state has not been recognised/acknowledged

Both of these responses create an unsatisfactory (or at the least, not satisfactory enough – given most call centres focus on “excellent customer experiences”) outcome for the customer, which may have knock-on effects.

A final example. In medical cases, trainee doctors may be faced with several options for dealing with a case. The best option to take will depend on previous medical history, patient allergies to medications/treatments, and a range of other factors that are (frankly) beyond my understanding. Traditionally, understanding these options and the factors that should contribute to decision making are learned by late nights of reading books and following a senior doctor on the rounds.  However, I believe eLearning could be adequately used to inform trainee doctors of these various and diverse contexts and situations in which decisions must be made, and the type of information they require to make the best decision.

So how does teaching concepts directly help in these cases?

  • By learning to recognise and deal with multiple causes of power outages – and perhaps the patterns that indicate them – I might be able to find the quickest set of steps required to deal with my faulty lamp/TV/stereo, depending on the circumstances I face.
  • By learning to recognise customer frustration, the CSR may be able to deploy the right kind of tone/interaction that allows them to calm the customer and reassure them that they are trying to resolve the problem at hand.
  • By learning to recognise the most common allergies, issues with specific treatments and drug interactions (although it is fair to say, not every single one could be learned – knowing the most common is probably useful), the trainee doctor can be better informed, which will help them to select the best course of action for their patient.

So how can concepts be taught in a more engaging way online?

Here are some tips that might be worth considering:

Set the scene/purpose convincingly. This might mean using a scenario (I know this goes against what I mentioned previously – but what I mean is to set up a scene, in which the concept(s) you want to teach can be elicited). However, if you do use a scenario – do not make it too specific. The problem with teaching general concepts with very specific scenarios is that the learner may interpret the content as saying “In this – very specific situation – you will need to know this information”; in fact, you are trying to get them to understand that they need to know this information in almost all situations. This might otherwise mean writing more engaging content to draw the learner in – make them scared (i.e. the negative consequences of not knowing the information provided), or more motivated (i.e. the positive consequences of knowing the information)

Include lots of questions – not just as quizzes (but do include quizzes – I would rather have a quiz and no copy than copy and no quiz), but in the body of your content. I have found it useful to ask questions directly preceding content to focus the learner’s attention, or use their previous knowledge to introduce a new concept. Questions wake learners up a bit, and therefore “tune” their mind to the information you are asking them to take in.

Include good graphs/animations/visual content – this is fairly obvious and covered in every ID training programme. If you have to teach the relationship between things, or the inner workings of something, use graphical content as much as possible. Whatever about a picture being worth a thousand words, graphs can also be reproduced as downloads, which make excellent job aids.

For very complex concepts, take steps and ask lots of questions – this has proven particularly useful in some projects I have worked on. From an extremely complicated concept, we provided one sentence and one question per screen. The sentence would describe a simple aspect (which was illustrated with an animated graphic), then – based on this – the learner would be asked a question about the consequence of this simple aspect. In most cases (but not all), the SME agreed that given the single sentence and graphical aid, the learner should be able to make an informed guess (I hate blind guesses, I believe informed guesses are eLearning gold dust), about the next sentence, given the question. Obviously, in these cases the question is prompting the next sentence/screen, but it is useful nonetheless to take a learner through a concept – step by step – and ask questions to help build those cognitive bridges that lead to better understanding. It also meant we weren’t dumping a pile of text and load of images on one page/screen and asking the learner to read through it before moving onto something more interesting (the learner reads: “This is something you don’t want to do – but hold on, because there’s something you do want to do next!”)
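As a rough illustration of the “one sentence and one question per screen” approach described above, here is how such content might be modelled. The field names and sample content are invented for the example; they are not drawn from any real project.

```python
# Hypothetical model of the screen-by-screen structure described above:
# each screen carries one simple statement and one question whose answer
# is the statement on the following screen.

screens = [
    {"sentence": "Heating a sealed container raises the pressure inside it.",
     "question": "What might happen if the heating continues?"},
    {"sentence": "If heating continues, the container may eventually rupture.",
     "question": "How could a design guard against this?"},
]

def render(screens):
    """Yield one display string per screen, pairing statement and prompt."""
    for number, screen in enumerate(screens, start=1):
        yield f"Screen {number}: {screen['sentence']} ({screen['question']})"
```

Because each question is answered by the next screen's sentence, the learner is always making an informed guess – building the cognitive bridges the post mentions – rather than reading a single wall of text.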

Answer the Question the Learner is Asking – namely “Why can’t I get on with the good stuff – why should I care about all this?” Provide context for your information, explain where it might come in handy. As with the scenarios mentioned above, do not make this too specific – for fear that your learner believes the examples you give are the only situations in which the information is used.

Here Ends The Post

Thanks for reading. As always, comments and conversation are welcome. What do you think?

Can We Fix Jobbridge?

September 7, 2011

IRISH READERS: Please skip to the next paragraph. For non-Irish readers, Jobbridge is a National Internship Scheme, which has been developed to help those who have lost their jobs to gain experience and/or upskill. The programme has been subject to some criticism as there is a belief employers are exploiting it – and that furthermore, there is nothing to stop unscrupulous employers from exploiting those who are out of work.

The past couple of weeks have thrown up more bizarre positions on the Jobbridge website. Many of the “internships” being advertised certainly appear to be entry-level (or higher) roles – jobs that people should be getting paid to do. This is causing deep concern, and rightly so. Not just concern that employers are getting free labour, subsidised by the government. Some have pointed out that they have lost income as contractors – because work they would have pitched for is now being done by “interns” for free. Some interns have pointed out that they are the only person performing a specific function within the companies where they are working.

In fact, a Tumblr blog is now listing some of the worst offenders.

What this post is about:  In light of all this, I would like to consider some ways in which the scheme might be made to work as a  proper internship programme, offering real value for those who are enrolled.

What this post is not about: I’m avoiding critiquing specific roles (I did that in my last post) or the range of positions advertised on Jobbridge, as I believe they are adequately dealt with elsewhere. (The Tumblr blog I’ve linked to collects them, alternatively, search #jobbridge on Twitter, and you’ll see plenty of complaints or positions that appear to be improperly advertised)

Here is a summary: Jobbridge lacks three vital ingredients which must be addressed to improve its reputation and usefulness:

  • Employer effort (not just the effort of taking on staff – but the effort of contributing to a workplace learning and development programme)
  • Defined training paths, which prove that the internship is – in fact – an internship and not a free-labour scheme
  • Accreditation and certification, which benefit both candidates and organisations by defining a skills range and level and benchmarking it so that employers and employees have a common reference point when discussing ability

In short, Jobbridge might be fixed by applying a structured training and development framework to it. In this post, I’ll map out my thinking in reaching this conclusion and list some ways in which these vital ingredients can be included in the mix.

Here begins the post

Jobbridge is meant to be about internships.

An internship should be a position specifically created to train a young (usually professional) person in applying learned knowledge to a role, or set of roles (this is my own broad description).

The point of an internship is that the candidate can learn how to bring together various and diverse knowledge, skills or approaches in order to perform some role. For example, as an instructional designer, one pulls together knowledge of learning styles and learning design, computer skills, some insight, and certain approaches to develop learning content. All of these specific skills/knowledge may have been learned elsewhere or in different contexts (e.g. studying educational psychology, computer science,  browsing the Internet and taking an interest, etc.). An internship can take this knowledge and the candidate’s “raw talent” and shape it (by training them how to apply these skills, or providing some new skills) into a form that is useful for business/productivity. This benefits the intern, as their new skills are now marketable. It benefits the business (and business in general), as it means that candidates are better skilled to enter the workforce and become more productive in a shorter timeframe.

The US Dept of Labor asserts that an internship must meet 6 criteria (these 6 criteria separate an “unpaid internship” from “unpaid work”):


  1. The internship, even though it includes actual operation of the facilities of the employer, is similar to training which would be given in an educational environment;
  2. The internship experience is for the benefit of the intern;
  3. The intern does not displace regular employees, but works under close supervision of existing staff;
  4. The employer that provides the training derives no immediate advantage from the activities of the intern; and on occasion its operations may actually be impeded;
  5. The intern is not necessarily entitled to a job at the conclusion of the internship; and
  6. The employer and the intern understand that the intern is not entitled to wages for the time spent in the internship

US Dept of Labor Fact Sheet #71 http://www.dol.gov/whd/regs/compliance/whdfs71.pdf

How Does This Apply to Jobbridge?

What Jobbridge offers – despite calling them internships – is “experience”. This is intended to break “the cycle where jobseekers are unable to get a job without experience, either as new entrants to the labour market after education or training or as unemployed workers wishing to learn new skills.” As long as this remains the case, Jobbridge is wide open to exploitation by unscrupulous business practices. It is easy to offer anyone “experience”, as all you need to do is tell them to do something. And that something may well profit you.

It is also easy to make exploiters out of those who may have good intentions. In a range of comments and tweets, I’ve seen people defend Jobbridge listings, claiming that “it’s a legitimate opportunity”. This may well be a firmly held belief – not everyone is a shyster. I can understand that some without an understanding of professional development or training may well feel that they are offering people a real chance to apply their knowledge and practise skills. This is not to say that all employers are struck by such a feeling of benevolence. It is simply to point out that many may not see the problem that others see. Which is what I want to talk about next.

What is the Problem?

The immediate and obvious problem is everything written thus far in the post – the possibility of exploitation (and the assertions that this exploitation is not only occurring, but is rampant). A less obvious, but perhaps more damaging, problem is that there is no proof of learning here. Without such proof of learning, the whole exercise is empty and pointless.

Empty and pointless because:

  • Businesses/organisations cannot describe what it is that they will teach interns (which leads to suspicion of their motives)
  • Interns have no objective evidence that they have done anything at all (which doesn’t help them when searching for actual, paid work)
  • Not necessarily essential – but Ireland is also missing the opportunity to draw up and develop a real skills register – i.e. a list of the skills used in businesses and organisations across the country
It is probably useful at this point to restate Jobbridge’s “aim”:

The aim of the National Internship Scheme is to assist in breaking the cycle where jobseekers are unable to get a job without experience, either as new entrants to the labour market after education or training or as unemployed workers wishing to learn new skills. The scheme will also give people a real opportunity to gain valuable experience to bridge the gap between study and the beginning of their working lives.

There is just about nothing here that describes education or training. (Another irony here is that Jobbridge comes under the aegis of FAS (soon to be SOLAS), the national work skills training and development agency. FAS is Irish for “grow”, but there is little proof here that any growth will occur, offering much less by way of solace for those who are out of work and seeking opportunities.)
So, How Can We Fix It?

Simply put (and it is much easier said than done), a proper training and development framework needs to be applied to the Jobbridge programme. This would:

  • Define the prerequisite level of education and skills required to undertake the internship
  • Identify the training that will take place during the internship, and describe the manner in which this training is provided
  • (The hardest part) Plot the skills to be learned/developed/applied according to their complexity and level of expertise. While this is difficult, we do have a National Framework of Qualifications, which plots – in the abstract – just this kind of information. The NFQ is used to decide whether a course leads to a certificate, diploma, degree, masters degree, etc.
  • Describe how the intern’s learned skills can be tested and applied in future
  • Provide a form of accreditation and certification, which gives the intern objective proof of their efforts, and identifies both their skillset and mastery for potential employers
So, How in Hell Do We Do All This?

This is a blog post, not a white paper. But forms of such internships exist (accountancy and law have used some form of learning/workplace development for generations, with new graduates undertaking “office work” while studying for examinations – often sponsored by their employers). However, I can think of 3 methods that could at least be investigated:

The medical internship model, in which educated people (i.e. those with a degree in medicine) are trained to perform certain tasks and encouraged to develop their skill. The traditional method is See One, Do One, Teach One – in which the trainee will see a task being done, then – under supervision – perform that task, then teach the task to another, or explain it back to their supervisor (it is often easiest to prove to yourself and others that you know how to do something when you explain it to someone else in a clear manner). Of course, there are other learning methods employed (and indeed, different medical disciplines have their own training methods). However, I suppose my point here is that if we don’t want to reinvent the wheel, we could always look at a tried and tested method for learning and developing professional skills, which can eventually be accredited and certified.

Another model might be to take a breakdown of the skills that an intern will gain while working for the ‘host’ organisation. These skills could be plotted against existing courses and training efforts already certified by FAS. Final testing for accreditation/certification could be developed. The benefit of this approach is that many of the courses exist, and therefore will have standard curricula, syllabi and testing methods. These would need to be tweaked to be applied to workplace learning. This is something for some kind of national skills training and development agency. Where ‘holes’ exist (i.e. no courses currently exist), courses could be developed to bridge the gap – which then means Ireland (and Ireland’s training and development organisations) can better respond to the needs of organisations in the “real” economy.

Finally, and perhaps most difficult, is to develop skills accreditation on the fly. This would be absolutely meaningless without reference to something like the NFQ – it would require the development of a framework within which working skills are described in terms of effort, mental dexterity, expertise and complexity. These separate aspects would then need to be configured in such a way as to provide an abstract, objective description of the types of skills one has learned – but within definable categories (these categories being along the lines of “IT”, “Hospitality”, “Design”, etc. and then “Novice”, “Intermediate”, “Expert”, etc.). This might be considered impossible by some, but many corporate employers have something like this in place – because it allows them to gauge their own talent pool. Such a system would have to work on a national level. Skills descriptions would have to be defined by organisations working with objective experts in the training and development field. The skills descriptions would then need to be plotted against something like the NFQ.
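To make that third option concrete, the category/level framework sketched above might look something like this in code. The categories and levels are the examples from the paragraph; the function name and validation logic are my own assumptions, offered purely as an illustration of the shape of the idea.

```python
# Minimal sketch of a skills register built on definable categories and
# levels, as described above. Real frameworks (e.g. the NFQ) are far
# richer; this only illustrates the structure.

CATEGORIES = {"IT", "Hospitality", "Design"}
LEVELS = {"Novice", "Intermediate", "Expert"}

def record_skill(register, intern, category, level):
    """Plot an intern's accredited skill against the agreed framework."""
    if category not in CATEGORIES or level not in LEVELS:
        # A skill that doesn't map onto the framework is meaningless
        # as an objective, comparable credential.
        raise ValueError("skill must map onto the agreed framework")
    register.setdefault(intern, []).append((category, level))
    return register
```

The point of the validation is the post's own argument: a recorded skill only has value to employers and interns if it sits inside a shared, objective frame of reference.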
In all cases, there are 3 important aspects that are missing from most Jobbridge listings:
  • Employer effort (not just the effort of taking on staff – but the effort of contributing to a workplace learning and development programme)
  • Defined training paths, which prove that the internship is – in fact – an internship and not a free-labour scheme
  • Accreditation and certification, which benefit both candidates and organisations by defining a skills range and level, and benchmarking it so that employers and employees have a common reference point when discussing ability

In a country where we consistently discuss the “Knowledge Economy”, some form of knowledge development scheme should not be beyond us. However, whether the appetite for the challenge and the effort is there is another question.

Hello again, and thanks for dropping by. I appear to be blogging on the theme of questions in eLearning. This month, I’ll be talking about True or False questions – and what they can – and cannot – do. If you’d prefer to see more about developing content, downloads, supporting materials, integration for blended models, etc. please do let me know. I’m on questions at the moment because they appear to be poorly represented in eLearning blogs in general. Anyway, let’s get on.

What this post is about

The use, proper and poor, of that old cherry – the True or False? question. In this post, I’ll provide some advice and strategies for using True or False questions in your eLearning project. I hope to show there is more to True or False questions than many people assume (essentially – stating bald facts or bald lies to identify whether your learners can distinguish between them); and hopefully inspire some new thinking in the use of this question type.

What it is not about

This is not a post intended to give specific true or false questions.

Here is a Summary

Here begins the post

Recently, I was discussing True or False questions – from in-real-life (IRL) tests, actually – and the person I was talking to mentioned a third person who claims:
“True or false questions are useless! All one has to do is answer true – and you will pass. Because the statements used ALWAYS tend toward truth.” 

There may well be truth in this statement (or falsity!), but it is too much to deal with in one post – there are too many issues. Here are just some:

  • What is your passing grade (that allows someone to answer “True” for every question and still pass)?
  • What is your content (that the only manner in which to phrase the questions is such that they tend toward truth)?
  • What is the context (that you are using “True or False” questions as a form of assessment)?
  • What is your goal for your learners (that “True or False” questions are being used – is there a specific reason?)?
  • What is your learner’s goal (how does the use of “True or False” questions help them to achieve their aims, by applying the knowledge they are hopefully learning from your content)?
  • How is it that all your questions actually do tend toward truth (I must admit, in developing these questions I instinctively tend toward the opposite – there is no good reason for this, perhaps I am a man of falsehoods, I know not)?
So, let’s start by taking a look at the anatomy of a True or False Question.
At its heart, a True or False question is a simple multiple choice format. There is a statement, from which the learner selects an option.
Going a little deeper, a True or False question is a statement that either reflects a true state of events, or is a bald-faced lie.
That’s it. In practice, it will look like this:
Statement: Some state of existence in the world.
  • True?
  • False?
The learner reads the statement, then selects True or False as a response, to indicate whether they believe that the statement does actually reflect some state of existence in the world, or does not.
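Because the format is this simple, it can be captured in a few lines of data. Here is a minimal sketch in Python – purely illustrative, with field names of my own invention rather than from any particular authoring tool or LMS:

```python
from dataclasses import dataclass

@dataclass
class TrueFalseItem:
    statement: str  # some state of existence in the world
    answer: bool    # True if the statement reflects that state, False if not

def check(item: TrueFalseItem, response: bool) -> bool:
    """Return whether the learner's response matches the answer key."""
    return response == item.answer

# The learner reads the statement and selects True or False
item = TrueFalseItem("Peter has a shovel.", answer=False)
print(check(item, False))  # True – answering False is correct here
```

The point of the sketch is simply that everything hinges on one boolean: all the craft goes into writing the statement, not the mechanics.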
So what’s wrong with it?
In many cases, it’s seen as far too simplistic (and because it is seen as simplistic, it is used simplistically). This is often true. Here’s an example. Imagine you read the Peter and Jane story to your learner (this is a story for children, along the lines of:
This is Peter. This is Jane. Peter and Jane are at the beach. Peter and Jane will build a sandcastle. Peter has a bucket. Jane has a shovel.)
Quite often – too often – the True or false question will be used to test the learner’s ability to remember a bald fact. For example:
True or false? Peter has a shovel.
Of course, we know Peter does not have a shovel.  The learner, irritated, clicks False and if they receive feedback, it annoys them that they are being told they were right (God help us all if they were wrong), which adds a delay (read: frustration) to their learning. I know not all statements of bald fact are this easy.
However, another issue creeps in: recognition (as opposed to understanding). When you make a statement that repeats (whether exactly to elicit “True” or incorrectly to elicit “False”), you run the risk of the learner recognising the statement. This is a memory of the language – not of the actual content, or meaning. The learner will not remember the importance of Peter’s bucket – they will simply recognise that the sentence they read did not say that Peter had a shovel. This is an important distinction, because in learning, we aim for retention of knowledge and understanding – not recognition of phrasing.
So, what’s right with it?
The True or False question can be a great way of asking a learner to demonstrate their understanding of a complex situation. This may appear to be a far cry from Peter and Jane, but it applies quite well. (I use the term “complex” here to mean a series or combination of simple statements, that combine to become complex). The True or False question can be used to take disparate concepts and join them together. Some applications include:
  • Between topics within a learning unit
  • As a pre-test to determine whether a learner can connect two objectives (i.e. apply knowledge they should have in a relevant context)
  • After a complex topic (i.e. where several concepts have been discussed) to determine whether a learner can put the concepts together, and understand their outcome
(These really are just a few. There are many more applications, which I would tease out if there were a book – as opposed to a blog – to be written.)
So what do these options mean in real life? 
Between topics within a unit – as part of a quiz, the True or False question can be used to link some concept in the previous topic to a concept in the next topic (e.g. In the last topic, we learned that Peter had a bucket and Jane had a shovel. We might also have learned that they intend to build a sandcastle – the mechanics of which will be dealt with in the next topic. So we might ask: True or False? A typical sandcastle cannot be built using a bucket and shovel.). If you’re using a True or False question in this way, it cannot be an assessment item. You can’t assess someone based on such a question without presenting the information first – it’s simply not fair. However, if you aren’t assessing learners, this is an excellent way to focus their attention on what is coming next and to identify the key information from the previous topic that will be employed in the next topic. Furthermore, if you are using feedback immediately after the question, it can be used as a learning event in itself (if you are comfortable with this, and your course is organised in such a way as to allow the learner to learn from the feedback). In short, you can ask about two disparate concepts, and use the feedback to demonstrate to the learner how they are connected. A note of caution – be careful with this approach. The learner should – based on the preceding content – have some idea of the connection between the concepts. If something new is introduced out of the blue, the learner will lose faith in you. However, if the concepts have been dealt with thoroughly, a True or False question that challenges learners to connect them can be really useful in focusing their attention or making them think about the concepts in a new way.
As a pre-test, True or False questions can be used in a similar fashion – by asking about 2 seemingly disparate aspects of a complex procedure or body of knowledge to see if learners can connect them. For example: True or False? If Peter uses a shovel and Jane uses a bucket, they can build a sandcastle. In this example, it is true. We know from the story that this isn’t the way things worked out – but the learner should be able to extrapolate from the 2 actors and the 2 tools that the outcome is achievable. This has been simplified – it’s not always the case that the statement will be true (see my final advice), but in this specific context, we are on safe ground. However, if one were training in health and safety, it might be the case that Peter and Jane need certificates to use the equipment they have. In such a situation, the very same question could be false – because only Peter has the certificate for use of a bucket and only Jane has the certificate for use of the shovel. Suddenly, something very simple has become quite complex in itself.
This leads me to the third point – after a complex subject, True or False questions can be really effective. To build them, you need to phrase the statement very carefully, then ask whether it is true or false. Taking our previous example:
True or False? If Peter uses the bucket and Jane uses another bucket, they can build a sandcastle without a shovel at all.
Well, is this true or false? If we have only stated the facts as listed above, we cannot mark a learner down for saying it is false. All they know is that Peter and Jane have a bucket and a shovel and with these tools, they will build a sandcastle.
However, we may have explained the purpose of the bucket and shovel – to collect (shovel) and hold (bucket) sand. We may have gone further and explained the characteristics of a bucket and shovel, which means the learner should be able to identify that one bucket can be used to pick up sand – and the other can be used to hold it. If the learner is expected to be at a level whereby they can understand all this, then they should get this question right. We are testing their ability to recognise that one of the buckets can be used as a shovel, because it has characteristics that allow this. To get even more subtle – if the only way to build a sandcastle were to use a bucket and upend it into the sand, then the reverse case is false: we cannot use 2 shovels to build a sandcastle, because a shovel (by its characteristics) cannot collect the amount of sand we require to be upended to create our sandcastle.
I hope these examples have provided some food for thought, and will extend the use of the True or False question type.
Some Final Advice/Thoughts. When using True or False questions:
The statement you write should contain all the information learners need to answer true or false – based on their defined level of expertise and experience. As you may – or may not – have noticed, I hate “trick” questions. The truth (or falsity) of the statement you make should rely on itself alone. You must remove any possibility of a learner thinking “Well, in this context it might be true, but in general it is false”. The learner is attending to the question, and so must be able to answer it on the merits of the question as provided (this may not be the case with all question types – but with True or False, I believe it is).
Your statement must be quite finely honed. As an extension of the above, you must be careful to remove all uncertainty from the statement you make. Ensuring this will usually require a second pair of eyes, to make sure there is nothing in your statement that might confuse learners. This is really an issue for the editorial process, but it is important to point out.
Mix it up. Some statements are true, some are false. So, make some of your own statements false. My experience would echo the statement at the start of this post – that most True or False questions tend toward truth. However, if you do decide to tend toward false:
Don’t play with complex constructions that make the statement false (i.e. double or triple negatives)
Don’t go with simple, bald false statements (see the earlier point about recognition vs understanding)
Make sure the false statement appears plausible, but don’t try to trick the learner in this regard. Questions can stand apart from the learner and the general content; a True or False question needn’t attempt to ‘catch a learner out’ by being especially inviting in any way. Remember – you have taught the content – the question can now be used to connect, combine or otherwise construct knowledge based on the individual parts.
If you use feedback – immediate feedback – True or False questions can be an opportune way to tease out differences. For example, if you have covered two disparate concepts that can be brought together in one True or False question, the feedback to the question can be used to provide further insight (the method or other technologies that Peter and Jane could use to build a sandcastle; how sandcastles can be seen as a metaphor for something else; etc.)
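To make the immediate-feedback idea concrete, here is one way a True or False item with feedback might be represented. This is a hypothetical sketch in Python – real LMSs and authoring tools each have their own formats, and the field names here are mine:

```python
from dataclasses import dataclass

@dataclass
class TrueFalseWithFeedback:
    statement: str
    answer: bool
    feedback_correct: str    # reinforce the connection between the concepts
    feedback_incorrect: str  # explain how the concepts actually connect

def respond(item: TrueFalseWithFeedback, response: bool) -> str:
    """Return the feedback the learner sees immediately after answering."""
    if response == item.answer:
        return item.feedback_correct
    return item.feedback_incorrect

q = TrueFalseWithFeedback(
    statement="A typical sandcastle cannot be built using a bucket and shovel.",
    answer=False,
    feedback_correct="Right – the shovel collects sand and the bucket holds and moulds it.",
    feedback_incorrect="In fact it can: the shovel collects sand, and the bucket holds and moulds it.",
)
print(respond(q, False))
```

Either way the learner answers, the feedback does some teaching – which is the point: the question becomes a learning event, not just a checkpoint.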
Here ends the post.
Thanks for reading. Please do leave a comment or email me if you’d like to talk about this more.

Hi there. Sorry I’m so late. I got caught in traffic. From now on, I’m going to try and get the train – this means posting once every month. Let’s see how that goes.

What this post is about: This post discusses some pitfalls I have come across when trying to create effective assessment or interactivity using quizzes/Q&As (and some that I have noticed in other people’s learning content).

What this post is not about: This is not an in-depth discussion on the various types and methods of assessment. Rather, it talks about what you might try to do (but really should avoid) when developing quizzes to test for knowledge retention and understanding.

Here is a summary: The last post talked about 5 things to do with quizzes, which I hope you found useful. This post looks at the other side. There are many temptations to over-reach when asking questions – leading to questions that you might think are more engaging and exercising, but are in fact irritating for learners, or do not add to the educational value of your content.

Here begins the post

Quizzes can be the hardest part of developing online learning. What’s a good question? The content seems so thin, how can I ever get X questions from it? How can I grab the learner’s attention with this? How can I avoid boring them?

You’re looking at a blank sheet of paper, wondering what you should be asking…

…then you get “creative.” You try to ask the same questions in different ways. This is not always a bad thing – especially where the same information might be applied in different contexts.  However, problem questions come from over-reaching. Here are 5 ways I’ve seen people over-reach, which (for me) just don’t hit the mark.

5 Things to Avoid

1. Avoid dealing in “semantics” and using “trick” questions

Both of these points relate to the language you use when asking questions.  Therefore the following is not necessarily true where your content is addressing comprehension, definitions, grammar or parts of language.

Using “semantics” means doing things like:

  • asking a question using a double negative
  • playing on words/puns or subtler meanings to ‘throw’ a learner
  • using options where a similarly-spelled word is used
  • using difficult sentence structures, especially where cause/effect or timelines are being asked about

This sort of tactic is just not fair to learners.  You aren’t testing them on their language skills; you’re testing them on the knowledge they are meant to be gaining as a result of your content.

Many argue that asking questions like this is a good way to “keep learners on their toes”, or “make sure they’re paying attention”. I’m not so sure.

  • First, keeping learners on their toes can be achieved by asking pop-questions within your content, or asking about subtle differences in application of knowledge – not by seeing whether they notice a minor aberration in your grammar, punctuation or spelling.
  • Furthermore, the quiz itself is a good way to make sure learners are paying attention; asking that they notice very subtle linguistic issues when they are concentrating on the concepts, theories or techniques you have been describing is off-putting.

Trick questions I have more difficulty with (not just in terms of their use, but also in describing them). These are questions where you actively try to lead a learner astray (i.e. away from the right answer).

I think quiz questions – whether for assessment or self-reflection – should be fairly unbiased. For example, multiple choice questions should present to the learner a series of equally likely (or unlikely) options. If it’s skewed one way, it might be too easy (which doesn’t really exercise them); skewed another, it will feel to the learner like you are trying to trick them (which you are!). The problem with tricking a learner is that you annoy them or (worse) they question the value of your content.

However, there is a subtlety in the “trickiness” here. Sometimes, you may want to ask about the differences between certain concepts or objects. Or, you may want to ask when a specific piece of information might be applied to a situation. In these cases, subtle differences can be a very useful way to tease out such differences (in fact, I’m a particular fan of “True/False” questions, where you ask whether a piece of information can be applied in the wrong context. The answer should be False, and the feedback can be used to explain the importance of context and information).

Be fair to your learners: They are there to learn from your content – not have their comprehension/concentration tested.

2. Avoid writing ‘to content’

You have objectives that you want your learner to achieve. Often, these objectives will be put to good use in formal assessment quizzes, but forgotten when trying to ask about the content a learner has just encountered. What often happens is that a sentence is lifted from the content to be reworked as a question. Quite often, this will work for you (after all, your content is trying to drive your objectives).

However, this approach can lead to problems. If the sentence lifted from the content is ancillary (in that it explains something not directly addressing an objective, but supporting other information that directly addresses the objective), the learner may think this piece of information is more important than it actually is.

Another problem is where the sentence or piece of content you are basing your question on is within a context in the learning content. If the information is taken out of its context, it may lose meaning or importance. It may also make the information confusing, and therefore any question based on that information meaningless.

Finally, asking questions that speak to context can lead to asking unimportant questions. This is particularly the case where you have used case studies or examples in your content. Asking about a scenario you outlined, or the actions taken by some ‘pretend’ actor, leads the learner completely astray from the point of your learning content. Of course, asking about the theory, concepts or steps that lie behind a scenario is fine. But all too often, I find people being asked “How much did Mary spend in the shop?” rather than “In a shop, an apple costs 50c, a coke €1, a sandwich €5. If a customer buys 2 sandwiches, 3 cokes and 4 apples, how much do they spend?”
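The second phrasing works because it tests the underlying arithmetic rather than recall of the scenario. For the record, it has a single computable answer – a quick check in Python (illustrative only):

```python
# Prices from the example above, in euro
prices = {"apple": 0.50, "coke": 1.00, "sandwich": 5.00}
order = {"sandwich": 2, "coke": 3, "apple": 4}

# Total spend: 2x5.00 + 3x1.00 + 4x0.50
total = sum(prices[item] * qty for item, qty in order.items())
print(f"€{total:.2f}")  # €15.00
```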

Remember: your IDD should map the course goal into objectives, and those objectives to content. Both robust assessment and lighter self-reflective quizzes need to step back to the objectives in order to be really useful for learners.

Use your objectives to guide your questions, rather than simply reviewing a topic for easy content to turn into questions.

3. Developing tests as a long series of simple multiple choice questions

This is the hardest “avoid” measure for me to defend. House style, SME opinions, certification demands and client requirements could make this point redundant. However, I shall argue on.

Being asked the same type of question over and over is pretty boring for learners. Click here, click there. One option, two options. Etc. Et cetera.  If you can at all, avoid doing this.

Most LMS and authoring tools now offer a range of question types that you can use. Try them out – don’t tie yourself to multiple choice simply because it seems to be the most robust type of question.  All types of questions can be used to ask about the truth of a statement or a definition. You could also consider:

  • Fill in the blanks for cause/effect, timeline, steps, definitions, conditions
  • Matching for the same as above, but also relationships
  • True/false for subtle differences arising from context, use of information, etc
  • Graphical questions for relationships, screen-based information, etc

All these question types can also be used to provide scenarios or contexts within which you can really test your learners’ understanding. There are of course many more contexts in which these questions are used – perhaps the subject for a series of posts; but I guess my advice is:

Avoid doing the same thing over and over again, try to be creative and consider new ways to ask about the information

A corollary of this is the way you frame your questions, where you are stuck with multiple choice alone. If the stem of your question is exactly the same over the course of 20 consecutive questions, your learners’ minds will start to stray. The ways you can frame your questions are bound only by your imagination. I also know that this advice could be seen to contradict my first point – but it shouldn’t. Reframing a question does not mean making it more obscure. Instead consider:

  • If you are asking about information, asking learners to identify a definition for something
  • Also with information, asking when information might be used
  • Also, providing a scenario and asking what information might be used
  • If you are asking about steps, providing one of the steps and asking the learner what happens previously or next
  • Also with steps, listing the steps in a process, then asking what was missing
  • Also, asking why a step is important

4. Getting ‘flashy’ for the sake of it

The exact opposite of the preceding point. Trying to show off your content creation talents with wildly differing question types, formats and approaches could leave your learners with their heads spinning.

It is tempting to try to “spice things up” or even “stir things up” using the vast selection of questions and content development tools out there. However, you should keep your learner in mind – they should be concentrating on getting to the right answers, rather than the question itself. Drawing too much attention to the format or bells and whistles attached to your question is something akin to writing a bad postmodern novel – those that appreciate the aesthetic of your great efforts may well miss the point of your making them.

Furthermore, great swings in the delivery of content or the way in which learners answer questions could lead to confusion on their part, and lead them astray from what is important in your learning content.

I’m not saying you should avoid being ambitious – rather, the manner in which you ask your question should suit the information you want the learner to work with. Sometimes the information you are testing on may require complex questions – but then you should make sure all the questions in your quiz work together in some way.

You want learners to be exercised by quizzes, but you don’t want them to be worn out by them.

5. Overcomplicating your questions

Again, related to all of these points: keep it simple. Of course, some content is necessarily complex. But complexity is the addition of several simple pieces.

With the proliferation of available question types, you may be tempted to flex your muscles and shoe-horn a question into an ill-fitting question type. Alternatively, you may be tempted to create a large, complex scenario-based quiz that learners may not be able to follow. In short, these approaches add up to the same thing: making the question difficult to comprehend. The longer a learner has to consider what a question means, the less effort they will put into actually thinking about the right answer.

This brings us back to the beginning – you’re not testing their language ability – you are testing their comprehension of the information you provide in your content.

Ask direct, pointed questions – so the learner concentrates on calling up the information, rather than working on understanding what the question is asking of them.

Here ends the post.

Talk to you next month! As always, please do leave a comment below.

Also, I’ve been toying with the idea of porting this whole project to a Tumblr blog, which might make it easier to start and engage in conversations. Please do let me know your thoughts on this.

What this post is about: This post provides tips on developing quiz assessments for instructional media.

What this post is not about: This is not an in-depth discussion on the various types and methods of assessment. Rather, it provides advice on what you need to consider when developing quizzes to test for knowledge retention and understanding.

Here is a summary: Quizzes are not the only method of assessment available. However, they are the most popular, as they are easy to develop and manage. Unfortunately, quizzes are often seen as an add-on to learning content, rather than being central to its purpose. In this post, I discuss this in more detail, and provide 5 useful tips that can help you develop better quizzes that really assess learner knowledge retention and understanding.

Here begins the post

Quizzes are probably the most popular form of assessment for online/digital learning materials. They are easy enough to write, develop and deploy. From a learner administration point of view, they are also easy to track and measure. They seldom require intervention, as they run from an LMS, assessment engine or other automated system.  Once learners have taken a quiz, the (reliable) results are provided, letting administrators know which learners have done well, and which have not.

However, many quizzes I have seen are poorly implemented. They are an afterthought, created as a piece that slots in toward the end of the project: an administrative effort, required by the Subject Matter Expert, Curriculum Developer, Client or the Project Manager.

This is the wrong approach.  Quizzes – or any assessment, for that matter – should be much more central to your learning product. It is the assessment that indicates how much knowledge learners have retained and how well they have understood it. The questions you pose also offer the opportunity to test how well learners will be able to apply the knowledge they have learned.

5 Things to do

Test to your objectives (verbs, purpose, etc.)

It is crucial that your quiz questions relate back to the course objectives you have set. As obvious as this seems, it is often not implemented. Instructional Designers will much more often work off their scripts or content samples, tying quiz questions to specific content in the learning product. The logic is that “This is what they have learned, so it is safe to test on it”. While this point makes some sense, it loses sight of the purpose of both the quiz and the learning content that has been developed.

Learning content is fundamentally about improving people’s personal or professional ability. The specific improvement to be achieved is broken down into constituent objectives. For most subjects and learning content, it would be impossible to test whether the learner has improved their skill by asking them to display it. Therefore, it is the objectives that are tested, with the fair assumption that if the learner can achieve those actions described by the objectives, they should have improved their skill.

This means considering:

  1. The context of each objective (are there specific conditions under which the action described in an objective should be performed?)
  2. The verb of the objective – this is really important. I’ll elaborate more on objective verbs in another post, but for now, take it as read that “describe”, “identify” and “demonstrate” are completely different things. While this is obvious when written like this, you might be surprised to learn that they often become interchangeable when being developed in learning content
  3. The subjects and objects of the objectives. What should the learner be able to do, and will they do it with an object, a tool, or a piece of software?

Answering these questions about your objectives will go some distance to helping you develop really good quiz assessments.

Decide on quiz items when developing an IDD

An important way to decide on the design for your quizzes is to outline them as part of your initial Instructional Design Document. I know this is standard in most companies. I also know it’s standard to treat assessment as little more than an afterthought, or something of an onerous task in the push to get the IDD out and signed off. However, carefully considering your quiz assessments, based on the course objectives, at this point will actually provide a couple of benefits:

  1. They will give you a better idea of how best to assess learners’ retention of knowledge and/or understanding. This in turn can help to suggest ways in which the content itself should be developed to improve retention and understanding
  2. If you consider your quiz questions ‘blind’ (i.e. without looking at the content itself), then look back on how you will develop your content, you can check that the content does address all the objectives. If you have questions that ask something not covered by the content – you need to include it

Ask questions as closely relevant to the job/task/skill you are training on

Again, this may seem obvious, but it is advice that is often overlooked, rather than ignored. In an attempt to question objectives, instructional designers will often overlook the real purpose of the objective. Should someone know the definition of a term, or what that definition actually implies when using a tool, software or piece of information in their job? Usually, the latter is the case.

Here is an example: Pressing the Shift key will display letters in uppercase. So, what is the Shift key for? Well, two possible answers are immediately apparent:

  • displaying letters in upper case
  • capitalising words (at the start of a sentence or mid-sentence)

The first option here is very literal, but the second provides a useful context in which the information is used. From my viewpoint, this makes the second option more ‘correct’ than the first.

Of course, the treatment you provide will depend on the objectives, which is (yet) another reason why your objectives should be carefully drawn out. However, in most cases, the application of information is more important to test than its abstract truth. This is especially true when developing and assessing content for specific work tasks.

Consider the assessment as an item in itself – not just a string of questions!

Quizzes in particular are often seen as a string of questions. Instructional Designers bash them out and send them off for review. However, a much more interesting way to develop your quiz assessments is to try and consider the whole piece. Consider the following:

  • In what you are testing, is there a beginning, middle and end that relates to your objectives? Could you use this to provide a ‘development’ for your quiz?
  • Does the information you provide in the learning content tie together in an interesting way? Can you ask one question to assess fundamentals, then go on to ask other questions that assess the ability to apply the information?
  • Is it possible to ask questions in a structured fashion that relates to how learners will use information, tools or software in their jobs?
  • Is there a typical scenario you could use to develop questions from? Ask the SME!

Add some colour!

If I had a penny for every time I’ve seen a quiz assessment made up entirely of multiple choice questions… Most VLEs and assessment engines now offer an excellent range of question types for quizzes, and you should try to exploit this. Don’t allow learners to passively click their way from one end of a quiz to the other – even assessments are learning opportunities. When considering question types, it is also important to ensure the type of question you are asking relates to the type of thing you are assessing. Here are some examples:

  • Fill-in-the-blank questions – or a series of them – can be a useful way to test steps in a process
  • Matching questions are a helpful way of checking relationships between concepts, or where one might find a menu command or piece of information, or perform an action
  • Graphic-based click-to-answer questions are great for testing people’s understanding of situations, scenarios or the actions they need to take in software applications to perform tasks
  • True or false – often considered the sledgehammer of quizzes – can be used to test understanding of conditions, situations or differences in functions or features. As long as your question is fair, you can ask learners to identify subtle implications of information outlined in the learning content.
  • Just about every question type can be used to ask learners about definitions
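
To make the “add some colour” point concrete: if you think of each question type as sharing a common interface but carrying a different kind of response, mixing types becomes straightforward. This is a purely illustrative sketch – the class names, prompts and grading logic are mine, not any particular assessment engine’s:

```python
# Hypothetical sketch: a quiz mixing question types, rather than defaulting
# to multiple choice everywhere. All names here are illustrative only.
from dataclasses import dataclass


@dataclass
class MultipleChoice:
    prompt: str
    options: list
    answer: str

    def grade(self, response):
        return response == self.answer


@dataclass
class Matching:
    prompt: str
    pairs: dict  # concept -> matching item (menu, location, action...)

    def grade(self, response):
        return response == self.pairs


@dataclass
class TrueFalse:
    prompt: str
    answer: bool

    def grade(self, response):
        return response is self.answer


quiz = [
    TrueFalse("Shift only capitalises letters at the start of a sentence.", False),
    MultipleChoice("What is the Shift key for?",
                   ["displaying letters in upper case", "capitalising words"],
                   "capitalising words"),
]
responses = [False, "capitalising words"]
score = sum(q.grade(r) for q, r in zip(quiz, responses))
print(score)  # 2
```

The point is not the code itself but the shape: because every type can be graded the same way, there is no structural excuse for a quiz to be multiple choice from start to finish.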

If you’d like to discuss this, please feel free to leave a comment below. Alternatively, you can email me about this and other opinions at brendan dot strong at gmail dot com.

Tune in next time for 5 things you should avoid when developing quiz assessments.

What this post is about: Considerations you need to make when creating a strategy to write, develop and/or edit instructional copy. It looks broadly at structure and approach.

What this post is not about: Grammar, punctuation or sentence structure. These will be covered elsewhere.

Here is a summary: Learning communication is full of paradigms but short on silly metaphors. My Horse and Cart approach aims to rectify this. Much instructional copy is poorly implemented because writers or designers don’t consider their objectives, voice, audience or the impact of their copy. This post aims to provide suggestions to help instructional designers overcome this.

Here begins the post

This post uses a silly metaphor (horses and carts) to provide some guidance on how you should develop your instructional copy. It deals primarily with strategies for developing learning content.

The first thing to consider is the road to be travelled. Your horse and cart have to be prepared so that they can deal with the obstacles that may arise along the way. This translates to ensuring your learning content will apply to ‘real world’ scenarios. Consider:

  • What it is people should be able to do
  • Why they should be able to do it (knowing this will help you identify where knowledge can be applied elsewhere, or where learners may need to consider how best to apply the knowledge they learn)
  • What the outcomes for the learner, their organisation and/or industry should be (in this order)

Next, make sure your horse is right for the cart. This translates to ensuring your learning content is being driven by the right objectives. You should consider:

  • That the objectives all contribute to the desired outcomes of your learning project (as outlined by the Road you need to travel)
  • The objective verbs are correct. This may seem pedantic, but the difference between “understand”, “identify” and “describe” is a series of lacunae something like the Great Lakes. You need to ensure the objectives you map out for the project will take the learning in the right direction and ensure learners get to where they need to be (as defined, again, by the road)
  • That your learning content directly addresses your well-defined objectives. Anything off the point should be provided as an add-on (e.g. drill-down, download or “See also…” information). Keep the copy as sharp and focused as it can be to derive optimal effect and get learners from start to finish quickly and easily

Now, the cart itself. This is the media you are using to carry the goods. Think about carts here carefully – you don’t try to shape the goods to fit the cart (although you can do); instead, you should ensure the cart will carry the goods. You don’t carry milk in a wooden flatbed. Neither do you use a discarded oil tanker.  You also need to consider how you prevent the goods from falling out or spilling. This translates to the media you use to deliver the knowledge from you (or the trainer, facilitator, etc.) to the learner. You should consider:

  • The best format to assist the learner in retaining the knowledge. This can be a tough decision, but generally boils down to: Print, Online, Multimedia: consider permanence, access and ability or need to update, as well as how engaging the media needs to be (will learners have sufficient motivation to stick with something that is not very engaging?)
  • The best format to express the information you want to deliver.  For example, complex relationships should always be represented graphically. Steps are often better implemented in text, as you can then provide download checklists. However, some steps might benefit from a screencast.
  • The best way to structure and use that format. Never expect anyone to sit down for 60 minutes to ‘learn’. Even if you’re using video or Flash, learners just will not concentrate for that length of time in one sitting. You need to structure your learning so that it contributes to achieving your objectives, but is also delivered in manageable, engaging sections that learners can take one at a time
  • Assessment and other retention techniques and checks for understanding. Ensure that however you deliver your training, you check that learners can remember and use the information being provided. It is next to tragic to consider well-structured and well-delivered learning content that people forget immediately after leaving the classroom. More and more often, I’m finding assessment and checks for understanding are being considered more important than the content itself. This may be a foolish turn of events, but the point is that assessment/retention and checks for understanding are the only way to be sure that the knowledge people take on board can actually be applied to the workplace or situations they are training for. You can look at your content and say “Yes, this applies to that objective”, but without assessment you can never be sure how effective it is in doing so.

Finally, consider the wheels on your cart. This relates closely to ensuring the goods don’t fall out the back. What it entails is looking at the road, the goods being carried, the weight of it all together and ensuring the wheels will be able to support the cart, and move it along the road. Consider:

  • How well your learning will stand up to the demands of a modern workplace. Do you ask people to take time out from work, or can they take the training in hour-long bites?
  • How often does the information you are providing change? Is it about software, which may update regularly? How do you intend to deal with such changes so that 1) the learning content stays relevant and 2) learners themselves can be updated?
  • Can the learner use what they have learned in the content and easily/directly apply it to their workplace, or tasks they are learning to perform? What is the gap between the Learning environment and the Real World environment, and how do you intend to bridge it?

Finally, keep an eye on your cart as the horse draws it. I wanted to include this last point with the Wheels, but it doesn’t really belong there, no matter how silly the metaphor can be made. Evaluation is often overlooked, but it is always a key driver in improving all future learning content (or indeed existing content that can be updated). Monitor every step of your instructional copy, from design and development to implementation. This will allow you to evaluate what worked best (improving your own systems), and what was most effective for learners (improving future learner engagement and success).

I hope this has been useful to you, or at the very least of interest. If you’d like to hear more about my opinions, please leave a comment or email me at brendan-dot-strong at gmail-dot-com. If you’d like to share a link to this post with someone else, please do so!

Hello, I'm Brendan

My name is Brendan. I am what they call a wordsmith; what I call a writer. From a background in Instructional Design (scripting and developing eLearning and blended learning solutions), I have worked on a range of projects that include:

  • Video & multimedia scripting
  • User education & direction for products and support sites
  • Help files & other support resources
  • Job aids and performance support using web based tools
  • Technical writing and documentation
  • Marketing & copywriting for print and web
  • Mobile content
  • Writing ads for sponsored search

It’s a varied list, but I’m quite good, you see. I’m currently working as an eLearning coordinator, instructional designer and general know-it-all, and am seeking work on a contract basis (long or short). What I would like to do is help you communicate more clearly and concisely, and without the type of ‘industry’ cliché found in this sentence (unless, of course, you like them). What does this mean for you?

For training and education:

  • Developing training solutions that address business objectives
  • Writing engaging content that keeps people interested
  • Creating assessments that test learners’ understanding

For multimedia, software or other products:

  • Creating help files and user guides so people can get the answers they are looking for
  • Developing task oriented tutorials so people know how to use your product
  • Explaining how everything fits together so people can make more of it and use it in their own way

For sales and marketing:

  • Explaining your product or service to potential customers
  • Outlining the benefits you offer
  • Showing them how to use your product or service

For other business communications:

  • Writing text that is clear and easy to understand – both in house and for clients or customers
  • Simplifying complex information so that it’s easier to digest
  • Making it easy for people to act on the information you provide

I’ll be using this blog to share some of my thoughts and experiences in the written and multimedia communications trade. I have a CV and will be putting it up here soon.

In the meantime, please add a comment below or email me (brendan dot strong at gmail dot com) if you have any comments, queries or would like to offer me some work.