This tweet gave me a good laugh, for I feel the same:

I don’t hate video tutorials, and have made many. The point is, if it’s not the right solution, we end up with tweets like this. Spraypainting a cow is also an expensive proposition that raises many logistical problems (especially if on-the-job content is being provided).

It’s useful to have someone with the knowledge and expertise to help you determine, plan and execute the best solution to your problem.


Week 2 of #ocTEL has found me flagging somewhat, due to work commitments. However, it has been a great week of reading.

The key question of learning approaches really interests me. Different to the ever-controversial learning “styles”, learning approaches are defined in three categories, as listed on the #ocTEL Week 2 page:

Page detailing the three approaches to learning

Approaches to Learning


My initial reaction to the question of which approach is best was “Well, deep learning of course, because it is the most ‘complete’”. This is unsurprising, as I am a Philosophy and English graduate:

A little learning is a dangerous thing;
Drink deep, or taste not the Pierian spring

(Alexander Pope)

However, thinking a little further, I considered – could these approaches actually be a continuum? A learner starts with facts, and as those facts accumulate, they get a bit smarter about what they are learning; they get strategic, targeting the knowledge they need. At some point, if the subject is interesting enough, they will seek to learn and think more deeply about it. This theory made sense to me, and fitted with many experiences (my own, those of people I know, of people I have created learning content for but never met, etc.).

What I find interesting about “Surface” learning is that it corresponds somewhat to learning by rote, which many now consider an awful travesty. However, in my own experience, it was worth learning some things by rote. For example (and this is just one), my multiplication tables(!). I use these just about every day. While there are many ways times tables could be made more interesting, I think they survived better taught without a specific application, as their abstract nature made them easy to apply to many things: hours, time, money, etc. Although, I must confess, my father taught me times tables using a very rhythmic metre, such that they felt more like poems in recital – so this probably helped immensely.

Then, I read #ocTEL activity week 2: approaches to learning, and my thinking got pushed a little further. The well-constructed argument is that strategic learning is perhaps what we should all be aiming for in community situations, as deep learning could mean a whole group having to follow the interests of deep learners (or indeed switching off when the deeper levels do not interest them). You should read that post; rather than re-write all the arguments here, I’m just going to refer to them.

The “strategic” learner is pretty much my audience, and has been for my whole career. Working in eLearning development, our learners are often those seeking “just in time” skills-gap content. They want to get in and out quite quickly, and they want a sense that they have learned enough to do what is required (as a result, one develops towards stated learning objectives, talks about or refers to specific tasks as much as possible, and loads in assessments that run the gamut from True or False questions to detailed scenario-based simulations). In my own learning, I notice many situations where I am “strategic”: a simple example might be looking up HTML code – not necessarily understanding its complete ins and outs, but knowing it will do what I need to get done. An interesting development in the industry is the announcement of the Serious eLearning Manifesto, something I am very interested in, but don’t necessarily agree with fully (I feel it concentrates too much on the strategic – while good for professionals, it may not meet the needs of others who need factual or deeper learning, which I feel technology can be developed to provide, or at least assist with).

Many would argue eLearning has no place for those seeking “deep” learning, but I would disagree. I see eLearning (or technology enhanced learning) as a set of tools that can be deployed according to pedagogical principles – the question is not whether, but how. That is not to say it is easy to do. The danger (as outlined in Week 1 of #ocTEL) is that you provide the technology and expect learning to occur. This is why I am doing #ocTEL – to better understand how others, from different disciplines or backgrounds, are using technology to provide that deeper learning. All this said, I still agree with the post I linked to a paragraph back: strategic may indeed be the real goal, with “deep” being a bonus.

In my current role, my audience is somewhere between surface, strategic and deep – depending on the specific context and content. Some will just want to know facts (that can be applied elsewhere, or used to inform understanding of something else). The same person can also be strategic in a different context – learning what needs to be known to perform a specific task. In another context, that same person may feel the content we have developed, while informative, hasn’t taken them far enough – so they will go and read articles referenced in the content, and seek out forums and/or wikis to further discuss ideas.

In conclusion, I think we are all working within some kind of dynamic strategy. When I say “we”, I mean “learning designers/facilitators” and “learners”.

As learners (and we are all learners really), we sometimes seek facts to support other information – or just to know. We can also be strategic, outside of the classroom/course setting, learning specific things we need (taking facts, and applying them to tasks). But as we learn facts and assemble them strategically, we start to see connections, similarities or the rationale, which leads to a deeper learning.

As learning designers, we’re trying to assist the deepest required learning for people. This could be provision of facts (if that’s what they want to come and get, we can certainly hang it out, ready to be taken), or something more strategic (which as I think I’ve said is perhaps what we should aim for: an understanding that allows for action), as well as providing a gateway for deeper learning.

What this post is about: A continuation from yesterday’s “On Putting it Online…”, but with a bit more structure in terms of when you might do what.

What this post is not about: As with yesterday’s post, this is not a project map or detailed instructions on developing online content. This provides general principles.

Here is a summary: After some flattering and some not-so-flattering emails with regard to yesterday’s post, I want to provide a more coherent structure by outlining the steps you take to put learning content online. While I agree yesterday’s post was certainly not my finest moment in written content, I still stand by the thrust of my argument. This post merely reflects that I had not truly considered my audience (those seeking advice on creating online content) – something I believe strongly that one should do. Won’t somebody think of the shoemaker’s children?

Here begins the post

So, as I was saying, you need to develop content that is assessable, applicable and appropriate. But what steps do you take to achieve this?

First Steps…

Identify the boss. Someone will need to manage the whole project, lest it spin out of control and you end up with hours and hours of unnecessary or incoherent content. You will also need to mind your budget (this may be preset, or you may have some time to determine what it will be, in discussion with your instructional designer and SMEs).

Assemble your Subject Matter Experts (SMEs). These may be paid or volunteer, depending on your set-up (perhaps, if they are employed or contract trainers, SME activities can come under an existing contract). When considering people for this role, consider also your subject matter. You want to ensure you have expertise to cover all areas of your course. This may be as few as one person, or as many as the number of subjects you want to cover. I often advocate that more is better (as it reduces the load on each individual SME); however, one must be careful not to employ too many chefs, lest your broth become a lumpy stew of expert opinion.

Get an instructional designer. I really do believe the outlay for a learning professional will provide the best results. Seek out references, a portfolio of work, or at the very least ask them to describe projects they have worked on, problems they faced and how they overcame them. Prise specific information from them. I hate to say this, but there are some charlatans out there. However, sites like LinkedIn can be useful for identifying people they have worked with in the past whom you may know. If you are starting from scratch, I also advocate going for experience over energy. If systems are in place, energy is your friend – but where you are trying to get something off the ground, experience will really come into its own. You need someone who can identify solutions and deal with the problems that may arise, either by avoiding them or by handling them efficiently.

Next steps…

Get your instructional designer and your SMEs together. If you are managing this project, you may need to be there too, because this is the first point when things may get a bit hairy. Your instructional designer will know little or nothing of the content to be covered. Your SMEs know everything. However, your instructional designer should be able to interview your SMEs to determine an overall goal for the course and perhaps a structure (although this is not always the first thing to happen). They should also walk away with contact details, an overall goal, perhaps objectives and – preferably – source material.

(Next – this rarely happens, but I like to add it: work on a learner survey to determine learners’ needs, how they feel they might use an online service for learning, and what they might like to see covered. If you have a mature audience, you could even seek out learner advocates to help you design the survey and better understand their needs and demands. This could be a group project between the instructional designer and the SMEs, but again it needs to be managed, lest you inadvertently offer the sun, moon and stars (raising expectations unreasonably), or the answers to your questions come back contradicting one another. Take the results of this survey, maybe break out some learner profiles (in case broad differences arise among your learners), and figure out how much time they can spend online and how they might effectively work with the content.)

Depending on your situation, you may also need to consider your assessment principles. While your SMEs and instructional designer will have their own ideas in this regard, you need to also consider whether certification or accreditation of any kind will be sought for your course – and what these mean for your assessment. Assessment may be Multiple Choice questions, more varied questions, portfolio work (if your SMEs have time to grade this), you may use peer-assessment (if your learners can be trusted with this), the list goes on…

More steps…

Your instructional designer will then go away for a week or so (depending on how much content is there).

They should then return with an overall structure and approach for your content. Going back to yesterday’s post, this may be:

  • Highly engaging activities in Flash or HTML5, with assessments
  • Forum/social media-based discussion groups, with portfolios (for assessment)
  • Online web pages, with case studies or activities
  • Or a mix of all these
  • Or none of these (again, your specific content, audience and intended outcomes will dictate the solution)

Now, you need something to put all this on. Website, Learning Management System (LMS), Wiki, what? Well, your instructional designer (again) should have ideas based on their proposed solution. You may also need to consider whether you need:

  • Tracking to see that learners have covered specific content
  • Forums?
  • Social media integration?
  • Private messaging/mailing system?
  • Portfolio building as proof-of-activity?
  • Certification/completion monitoring?
  • Calendar?
  • Calendar with “locking” of activities?
  • Storage for grades and learner outcomes
  • A “dashboard” for learners (and SMEs, if acting as “teachers”) so everyone can monitor what is going on
  • Support for iOS devices? (I know this is very specific – but the iPad is so popular now, it is important to consider whether people can use it to take part in your online course. “Bring the learning to the learner” is my motto)
  • etc (also please note, this list is disparate on purpose – there really is a multitude of things to consider, and they will all depend on your content, approach and learner profile)
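Purely as an illustration of how disparate requirements like these can still be handled systematically (the feature names and platform data below are invented, not recommendations), a checklist can be turned into structured data so candidate platforms are compared against your must-haves:

```python
# Illustrative sketch: checking hypothetical platforms against a
# requirements checklist. Feature names and platform data are made up.
required = {"tracking", "forums", "grade_storage", "ios_support"}

platforms = {
    "Platform A": {"tracking", "forums", "grade_storage",
                   "ios_support", "calendar"},
    "Platform B": {"tracking", "grade_storage"},
}

for name, features in platforms.items():
    missing = required - features          # set difference: unmet must-haves
    if missing:
        print(f"{name}: missing {sorted(missing)}")
    else:
        print(f"{name}: meets all listed requirements")
```

Even a simple table like this forces you to state which items are genuine must-haves and which are nice-to-haves before you commit to a platform.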

Even more steps…

Then, it’s time to start building. In a perfect world, you will hear nothing except for any regular reporting updates you have asked for. And all the raindrops will be gum drops and lemon drops.

Your SMEs and instructional designer will enter a strange relationship, whereby they love and hate each other. In short, there is a tug-of-war: the SMEs want to put everything in, often in the same way they lecture (or learned, or developed) the content. Your instructional designer will want to highlight key information, dealing with ancillary information in other ways. They will both be right and wrong at some point. They will probably come to you also.

Generally, the instructional designer will work alongside the SME to develop an outline for activities/quizzes/content. They will then design and build it (where you work with a company, this will be an instructional designer, graphic designer and developer – however, they will usually be managed by a project manager who will be your point of contact). This will be sent back to the SME (as a script, as preliminary content, or as very close to complete content), who will provide corrections. This is not a “maybe”, this is a “definite”. There will always be edits to improve the accuracy of content, to limit misleading information, or to make information more “complete” so that it is coherent for learners. This is an iterative process, but hopefully your instructional designer will have strategies to minimise the number of iterations required.

See if you can put Anadin into your budget.

It is not all bad news. They will also both respect each other (in my experience). If you have a good instructional designer, your SMEs should be happy with what comes out, despite how they felt as it was being developed.

Nearly there? Nope…

Now, you’re not finished. Not by a long way. Once you have content developed, it will need to be tested. Does it work the way it is meant to? Is it easy to use? Is it accessible (not just in terms of devices, but to assistive software – you may have learners who live with disabilities)? Does the course do what it said it would do in the script? Does everything “hang together”? (Keep in mind, the usual process is to develop in modules, or separate pieces, then put everything together. When it gets put together, is it coherent? Are there gaps?)

End as you mean to start again…

You are still not finished. It is time to let the learners bask in the fruits of your labour. And compliment you (oh, really, you are too kind!) and complain (oh, you really are too cruel!). Get as much feedback as possible about your course. Gather it also from SMEs (a tip in this respect: a Project Manager I once worked with kept SME feedback all the way through the project, rather than just the feedback specifically requested at the end. This gave them a “live snapshot” of SME opinion and feeling throughout course development – not just at the end, when SMEs are generally happy with the outcome).

Then, you need to feed all this back into your course design:

  • What worked for learners and SMEs?
  • What did not work for learners and SMEs?
  • What worked in terms of project development?
  • Where did things go wrong? How did they go wrong? Can this be avoided in future?
  • How did the subject matter go down? Do you need to speed it up, or slow it down?
  • Did learners pass their assessments? How about those with less or more interaction in the course – was there a difference in how they performed?
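That last question lends itself to very simple analysis. As a purely illustrative sketch (every figure and the interaction threshold below are invented for the example, not real course data), comparing pass rates between more- and less-interactive learners might look like:

```python
# Illustrative sketch: comparing pass rates for learners with high vs low
# interaction in a course. All records and the threshold of 5 are invented.
results = [
    # (number of interactions, passed assessment?)
    (12, True), (9, True), (2, False), (15, True),
    (1, False), (4, True), (0, False), (11, True),
]

def pass_rate(records):
    """Fraction of records where the learner passed."""
    return sum(passed for _, passed in records) / len(records)

high = [r for r in results if r[0] >= 5]   # more-interactive learners
low = [r for r in results if r[0] < 5]     # less-interactive learners

print(f"high-interaction pass rate: {pass_rate(high):.0%}")
print(f"low-interaction pass rate:  {pass_rate(low):.0%}")
```

With real data you would, of course, want far more learners and a less arbitrary split before drawing conclusions.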

How does this work? It depends on the instructional designer, but people will usually take this information and re-work the original design to improve it for the next cohort of learners (actually, it is becoming quite popular to update courses for current learners also, so this may happen after a defined period, or as the course runs).


I hope this post has helped to better explain the sort of process you may need to go through in order to develop some online learning for your organisation. While it was inspired by feedback from yesterday’s post, I believe both posts are complementary: yesterday’s looked at the issues you need to consider to ensure your learning suits your learners; today’s looks at the kind of steps you need to take to put it all together.

What this post is about: Some thoughts on making the decision to “go online” with training interventions, and how you might go about doing that.

What this post is not about: This is not a specific, tailor-made guide that can help you determine whether and how to go online. This is informational. Always speak to your instructional design specialist, and only follow the instructions on the label if you are sure it will provide the outcome you are seeking.

Here is a summary: I have been following the ocTEL course provided by ALT. While I have been lax in activities, I have been thinking a lot about the discussions and resources provided. In one of (hopefully) many posts inspired by the ocTEL course, I consider how one should make the decision to provide online learning content.

Here begins the post.

So, you’re going online. Genius move. There are training and marketing dollars in that. You are the future, ever stretching and all-knowledge providing. What next?

First, I’m assuming you know you need training (and that you really do). If not, I suggest reading this blog by Cathy Moore on whether you even need to provide a training intervention.

So, let’s assume you do. What next?

A few days ago from my (PLUG WARNING) new @BrenLearning Twitter account (which is reserved for learning and development thoughts and tweets), I tweeted the following:

“Having studied theories and styles, plus reflection, I’m going with ‘People are #learning, everywhere and all the time, in different ways’.”

“(2/2) My job in #learning is to provide opportunities for learning that are appropriate, applicable and assessable, for learner and subject.”


This encapsulates my current thought on learning, and what you need to do. Current is the key word here, as this may change, because I am 1. quite ambivalent about everything, in general and 2. constantly learning, so may well refine, rework or reverse my opinion.

So, now you’re done.

Except, how do we turn a pithy tweet into something workable? Let’s break it down.

(MC Hammer Dance)

Making it Assessable for Learner and Subject

What? This was the last point in my pithy “rule of three”-structured comment. Why is it first?!


You need to consider the outcome first. How will you know your online learning has worked? And how does that translate to the workplace, to skills applied, and so on? (In most cases, “assessment” is really a proxy result confirming that a learner knows something – how this affects behaviour may require further follow-up after the training intervention.)

What is it you are trying to achieve for your learners? And will that work for them? What you’re trying to achieve can range from depth and breadth of knowledge to proving an ability to apply that knowledge to – in some cases – actually testing the application of knowledge (although this can be limited in the online space – see the first point).

In terms of what will work for learners: do they need to learn this, do they want to learn this, and is “online” an appropriate medium for teaching it?

Once you know where you want to get to, then you can start mapping a path to it.

Making it Appropriate for Learner and Subject

In terms of going online – think about your intended learners and ask yourself: Why?

  • What is the value to learners? (This is key. Many, many people think of the value to the institution, in terms of improved service or esteem, which can leave learners cold. While your institution may well gain in terms of esteem – though that is for you and your marketing department to figure out – if you’re going online, think of the learners.)
  • Will they see that value themselves? (if the answer is “No”, some marketing may be required. Just because someone doesn’t know whether something will work for them does not mean it won’t. My 5 year old who will not touch carrots but loves broccoli confirms this for me)
  • What is the most effective way to get the information into their heads (This will include what learners will be able to access, will have time to access and will not be too difficult to get through)?
  • How are they used to learning, and is there any way to leverage that (e.g. could you record “live” events (lectures, presentations, round tables, etc.) and put these online, along with assessment? Or would this bore them? Or do you need to provide more applicable content – see the Applicable section below)?

In terms of the subject matter or content you want to teach, consider:

  • What do you teach now?
  • What is taught in a “live” event?
  • What can you put online (e.g. again, can you record lectures? Or would it make more sense to provide case studies and activities online? Or would it make sense to provide both?)?
  • What is worth putting online (Just because it can go online, doesn’t always mean it should. I give you LOLCATS and many, many Tumblrs as examples. Don’t assume that if you build it, they will come – it must be worth coming for)?
  • Is there a split between what can be taught effectively using live events, and what can be taught effectively using online (self-directed) methods? (Commonly known as blended learning, but more often now known simply as learning, as more institutions and providers actually design their learning output to work in this way)

Another aspect to developing learning content that is appropriate to learners and the subject is what it actually looks like. It might be:

High in interactivity, using Flash or HTML5. (Here is a really interesting issue right now: Flash is not supported on iOS devices. Yet, despite the death notices, it is still the main technology used for highly interactive and video-based content for learning. This probably will not last, as so many new products are being released that publish to HTML5, offering similar types of interactive content, and indeed products that publish to apps for iOS or Android phones and tablets. A full discussion on this is beyond the point of this post.) To go back to the initial consideration: high interactivity using Flash or HTML5 is expensive and labour-intensive, but can offer really excellent results. An experience can be developed, rather than a resource.

High in social interactivity, using forums or social media. The appropriateness of this approach again depends on your content. If you are trying to be quite didactic (i.e. this is the information you need to learn), then careful monitoring, mentoring and moderation will be required to ensure incorrect information is not circulated through your learning community. However, if the purpose is more exploratory (i.e. through discussion, the community will build up their own knowledge base), less effort is required.

Low in interactivity, but rich in information: This is a common approach because it is fairly easy to create. However, it can be difficult for learners to engage with – so it is a trade-off. In short, you provide an indexed, searchable body of information, with supporting assessments and some kind of curriculum outline (perhaps with supporting activities as well). Then launch it. Learners can access information as they need, and go back to it as required. The difficulty with such an approach is that it can get boring for learners, who may use it in 20-minute bursts (or something similar), rather than sitting down to a course. (By the way, a 20-minute burst is the recommended timeframe for any specific “chunk” of information you want people to learn.)
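To make that chunking idea concrete, here is a back-of-the-envelope sketch (the three-hour seat-time figure is purely illustrative, not from any real course):

```python
# Rough sketch: splitting an estimated seat-time into ~20-minute chunks.
# The three-hour total is an invented example figure.
import math

total_minutes = 3 * 60   # estimated seat time for the whole course
chunk_minutes = 20       # recommended upper size for one "chunk"

chunks = math.ceil(total_minutes / chunk_minutes)
print(f"{total_minutes} minutes of content -> about {chunks} chunks")
```

In other words, three hours of material implies something like nine discrete chunks, each of which needs its own entry point and, ideally, its own small assessment.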

And so forth – one could go on forever, but really the approach you take should depend on your objectives, and what will work for learners to help them better learn and understand the information. Again, I’ll reiterate: You should consult a learning specialist to help you determine what will meet these requirements – and within your budget.

There are some ways to help with this: surveys of potential learners; looking at case studies from others in your industry, or from an association if you are part of one; looking at other case studies which may be applicable to the type of information you are teaching. In these cases, the more information the better – build up profiles of your learners, and through those profiles, your online service.

Making it Applicable for Learner and Subject

So you know what you want them to do after the training, and you’ve figured out a way to provide online content that will work for them. Now, what will that content be? Some more questions to ask yourself:

What should they be able to do (see assessment), and how can the content align to that?

What the hell does this mean? Here are some examples!

If you’re teaching software, consider developing activities where they use specific functions in a targeted way. Don’t just say “Use the file menu to open a new document and save it with a title…”. Instead, go with: “You have received a support call from a client, who has asked for a breakdown of their current account information to be posted to them. Prepare this document and include a cover note to explain what it contains and how the customer can read it. Remember to give it an appropriate file name, because it will be going to them!” Or something similar. The point is you should be considering real-life application at all times.

In a project I worked on, we used animated diagrams and click events (click here to see what happens) to explain the inner workings of machines. This helped learners to “visualise” the mechanisms involved and (hopefully) to consider how they might apply such mechanisms in other situations (i.e. when on the job, to consider what mechanical applications could be applied to specific problems they needed to solve).

But wait! That’s too advanced! I need to roll back a bit…

OK. Well, here you need to consider:

  • What it is they have to learn
  • How they will use this information

Going back to our software, you could:

  1. Explain a typical workflow, which includes how the software “fits into it”
  2. Break out the specific information you will be talking about, and what it means
  3. Run through the specific information and its place/function within the software
  4. Provide one or two examples
  5. etc.

For more theoretical information, consider the use of diagrams that relate how different information works together. For example:

  • Specific information to be input to a process (data, a situation, etc.)
  • The process itself (steps taken, tasks)
  • The output (result of using the information through the steps)

Such a diagram could also be used as a download or job aid, which the learner could print out and keep, so they have a ready-reference when they need it.

You also need to consider your expertise, with regard to making your content applicable. Subject Matter Experts (SMEs) are fundamental to making your online learning work. Without them, you have nothing. These are the people who can tell you what needs to be in the course. What learners need to know, be able to do and the relationships between what they know and what they do.

  • How will you engage SMEs? Who will they be?
  • Will they be paid? Can they work voluntarily?
  • Could they (with some initial training) develop the online content themselves? Or, if they are time-poor, how can they be involved efficiently, so that your project does not take up too much of their time?

Another tough question is how your SME(s) might be managed. They are often senior figures, who can think of better things to do with their time. You are better off finding people who are interested in learning (have a “vocation” for want of a better word). What happens if they miss deadlines? Can you exert any pressure on them?

Of course, all of this will be done for free, so that’s the end of this post.


Oh, yes. Budget.

This is why I suggest talking to an instructional designer or eLearning professional. They may have some idea of costs for you (time for instructional design, technical costs and what you may need to pay out initially and on an ongoing basis).

However, they may also (drawing on their experience) have some creative ideas on maximising your budget. For example – as mentioned earlier, you could engage an instructional designer to teach your SMEs to develop content over a set period. During that time, they could develop templates, themes, manage installation of technical infrastructure, etc. Then perhaps come back once a year or so to look at refreshing the look and feel of your content and provide pointers on improving what has been developed.

If you have loads of money, you could look at a “big house” company, where teams of instructional designers, graphic designers and programmers will work to create content.

In many cases, you’ll probably fall in between these stools, and perhaps require someone on an ongoing basis (but part time) or full time as an employee.

In Summary

I hope this has been a helpful post for those thinking of providing online learning content, and that it at least gives you an idea of how you should be thinking. As I say, none of this should be taken as specific advice, as there is no guarantee any of it is appropriate, applicable and assessable for your specific needs (as I hope is made clear by the post). You should engage an instructional designer or eLearning professional, at least to provide advice in the early days, and perhaps for a longer and more involved engagement. Always read the label, but only ever follow the instructions when they will specifically deal with the problem you are facing.

What This Post Is About

Some strategies for developing effective quizzes to help engage learners and assess understanding/knowledge at the end of your lesson/content.

What This Post is Not About

This is not a detailed discussion on using frameworks (e.g. Bloom’s Taxonomy) to develop questions, although this is mentioned in passing.

Here is a Summary

Developing quizzes seems to be a perennial pain for instructional designers. Recently, I have seen some comments and posts bemoaning a sort of “writer’s block” when developing quizzes. Quite often, this occurs because IDs get very excited about developing the content, and forget that really the quiz is the only thing that can tell us whether a learner has achieved their objectives.

Here Begins the Post

Nearly every instructional designer I know has at some time bemoaned the writing of quiz questions for learning content. Often, quizzes are seen as a “come-down” after the dizzying heights of scripting according to learning objectives and some overall content strategy (which determines the level and mode of interactivity to be used). The race is often to get content that covers the objectives. Quiz questions are (at best) left as prompts intended to elicit some sentence from the middle of the content (indeed, that sentence works hard for its existence, often serving as the feedback as well).

So, here are some strategies I employ for writing quiz questions…

Goals, Objectives, Content. Base your questions on these, in this order:

Goals – the overall goal should be a clear (although it may be complex) statement of some task the learner should be able to perform, or piece of information they should know. This is really the most important thing to test. But how do you test it, using Multiple Choice Questions, True or False, or Matching questions?

Refer to Bloom’s (or another) taxonomy for help in determining the type of question you should be asking. Of course, you should already have done this to define your objective – so let’s assume you have.

Consider the verb:

  • What action is it that the learner must be able to perform?
  • Do you need them to recognise a statement of truth?
  • Do you need them to interpret data and identify the best statement to describe that?
  • Do you need them to interpret information (data/facts/statement) and identify a statement describing the consequence of that?
  • Do you need them to identify how certain concepts relate together?

Consider the conditions and information/tools to be used:

  • Do they need to know these, specifically?
  • Do they need to identify how conditions and information/tools change or should be considered with a different verb?
  • Do they need to identify how the conditions themselves will change how the information/tools should be used?
  • Do they need to know the mechanism by which the information/tools should work?

Objectives are the simplified statements that make up the overall (complex) goal for a course.  Often (and it is certainly how I learned), it is the objectives that people tell you to quiz on. If you have well-defined objectives, your questions are simple: rewrite your objective so that a question mark can sit at the end. However, you should think about:

How different objectives combine to make up the overall goal – can you ask questions that relate to 2 or 3 objectives, and also attend to the overall goal? Isn’t this a better indication of whether the learner understands your content?

In what context are your objectives useful (again, you should be looking at the verb, conditions and information/tools required)? Your quizzes should test that a learner understands how to apply information (or use tools) in the right context. Quite often, an instructional designer will chase the ghost of a detailed, complex description of the operation of a tool, or the content of an idea. However, this can be as useless as knowing the innards of a car’s engine when what you actually need to know is when to change gear.

How can you “string” or “combine” questions, such that a series of questions takes a learner through the steps or separate parts of a complex idea? This can be a useful route to developing scenarios, which bring us closer to developing content, which makes the whole exercise more enjoyable for the instructional designer. Think about:

  • Whether there is a series of steps to be performed
  • What it might mean if a step is omitted – perhaps this can be a good way to highlight the result of an omission?
  • Whether it is important for the learner to understand why each step is important (and remember – sometimes it is not – for the purpose of completing a task, sometimes you only want a learner to know the requisite steps)

What about complex ideas? These might include interpreting an interface, alerts or other information that a learner may encounter. In these cases it is worth considering whether they need to know how different pieces of information relate to each other (e.g. if you get an alert here – do you need to check something else? How will inputting information at this point impact on some other system? What do you need to tell a customer about a process before you initiate it?).
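To make the idea of “stringing” questions concrete, here is a minimal sketch (purely illustrative – the structure and names are my own, not any authoring tool’s format) of a question sequence that walks a learner through ordered steps and surfaces the consequence of an omission:

```python
# A hypothetical "strung" quiz: each question covers one step of a procedure,
# and fumbling or omitting a step is explained in its feedback.
steps = [
    {"question": "What is the first step in building a sandcastle?",
     "answer": "fill the bucket with sand",
     "if_missed": "Without sand in the bucket, there is nothing to shape."},
    {"question": "What do you do once the bucket is full?",
     "answer": "upend the bucket",
     "if_missed": "If the bucket is never upended, the sand never takes its shape."},
]

def run_sequence(responses):
    """Check each response against its step; report the omission feedback otherwise."""
    results = []
    for step, response in zip(steps, responses):
        if response.strip().lower() == step["answer"]:
            results.append(True)
        else:
            results.append(step["if_missed"])
    return results

# A learner who gets step 1 right but fumbles step 2:
print(run_sequence(["Fill the bucket with sand", "dig a moat"]))
```

The point of the sketch is only that each step carries its own “what if this is skipped” feedback, so a string of simple questions can add up to testing a whole procedure.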

Content – this is the heartbreaking work of staggering genius that you have devised to communicate important information to learners. Usually, I rail against asking questions based on content alone (do always tell the learner you are asking questions based on the content; but from an ID point of view, to cohere to your syllabus and curriculum, you need to return to the objectives to base your questions on).

However. (Always a however). The content can be used to guide how you put together questions and how the questions you write might relate to each other within the context of a quiz.

Using your content as a framework can help you to build the bridge between the objectives (as very specific pieces of information you need to ask about to confirm learning/understanding) and the goals (as the overall task or complex operation that the learner should be able to perform). This is where you can find the context to ask questions about how information or tools should be applied. Also, how you might string some questions together to cover more complex tasks, steps or information.

Also consider writing your questions backwards. Seriously, this works. Sometimes, having written your learning content, you go back to the beginning and can stare at the content and the objectives and wonder “What is it I was going to say?” On the other hand, if you start at the end, and move backwards toward the start, you can see new ways of considering the information and its importance. The information will also be fresher in your mind.

Well, sin é (Irish for “that’s it”). I hope some of these tips help. Please do leave a comment, question or query and I’ll answer as soon as possible.

What this post is about

Some (very late) initial thoughts on the uses and applications of iBooks and the idea of iBook development for instructional designers.

What this post is not about

This is not a detailed discussion on technical capabilities or creative development using iBooks.

Here is a summary

This post is part 2 of a 2-part posting about Apple’s iBooks. In this post, I want to record my initial reaction (as a learning design professional) to Apple’s iBook technology.

Here begins the post

Well, here they are. The much-rumoured iBook Textbooks: Apple’s latest addition to education using technology. Together with podcasting, iTunes University and other initiatives, these will change the shape of education, and the use of technology in education.

What are iBooks? iBooks are a software implementation for the iPad that allows developers to create interactive, rich-media textbooks. Apple have released the tools for developing iBooks for free, but of course development and use of the final product require an Apple product.

Why do I make such grand claims about their dominance? From a technical point of view, the key to successful instructional design is to get people to work with/explore/use and process information. This post (Pulse Learning) says it very efficiently, so I link to it for economy’s sake. Quite often, there are constraints on using technology to teach anything. You want to make the experience as rich, effective and efficient as possible. Quite often, information may need to be truncated – to keep it to the point. Perhaps all the information is provided within the scope of a course, but within a specific topic it may be isolated, modular, away from its natural environment (that being related concepts or important things to consider).

The iBook offers various methods for deploying highly interactive content, while also providing a full account of the information. Learners don’t just read text.

They can be encouraged to interact with the concepts being communicated:

  • Watch engaging/entertaining video (that might take 2 minutes to consume), rather than read dense and complex text about the relationships between things
  • Play with interactive graphs to see how adjusting X will affect Y
  • Take and keep notes directly from the text, which are then available as note cards, which help with things like revision (for students) or quick “just in time” support/reminders (for those learning work skills).
  • 3D images will also help to make text books more interesting to read.

All of this contributes to a better learning experience and the possibility of maintaining learner attention for longer than the normal 20-30 minutes of self study.

Is it all good? Well, I do have some issues…
One key market they are targeting is schoolchildren. This makes sense when one considers the size/weight of the average school bag. However, as a parent I can tell you there are several issues arising from the prospect:

  • It is an unequal world. Will children whose parents cannot afford an iPad be left behind?
  • It is an imperfect world. I wouldn’t trust my child to look after a €5 (or $5 or £5) radio and not break it. What about a €400+ device? If they break their “school iPad” (and my kids will), what happens? Am I spending €400+ every time they break one? Get insurance? Sure – but then they break their iPad so often, I’m probably paying for another iPad a year anyway.

I have heard the arguments that text books are as expensive, but I don’t know whether they stretch to this cost. Furthermore, printed books can be handed down to younger children, bought/sold second hand, etc. In short, there are various factors that will mitigate the cost of text books, but these are not so easy to find for the iPad.
It is an impractical world. (Update – please see comment below.) Not every school will have a PC/Mac to load these iPads with content. Will parents need a computer and WiFi? What about those who don’t quite understand the requirements of such technology? (I know of someone who got an iPod, but didn’t realise they needed a computer to load content onto it.)

Why Am I So Down On This All of a sudden?
I’m not down on it at all. The first thing that strikes me is that it is a shame such a technology could not be provided in an open-source model (or perhaps even the One Laptop Per Child model). Using cheap but effective technology and open-source software could bring down the cost of developing and purchasing the technology. Furthermore, an OLPC model could also see tablet computers with iBooks sent to developing countries. In short: Apple cannot be blamed for doing a good job. It is a shame that it cannot be more open and available, but Apple cannot be blamed for not being a charity.

On balance, iBooks are a definite step forward in education and the use of technology in education.

For instructional designers, I offer a tentative SWOT analysis of the use of iBook technology.

There are many strengths in the iBook model.

  • The deployment of rich-media, engaging learning content makes everyone happy
  • One device (rather than many books) is very compelling. I worked on projects in the past where people took eLearning courses to learn about a technology, but then brought manuals to work sites where they might need them. They would need a specific workbook for a specific worksite, depending on the technology installed. An iBook textbook means all of this can be kept in one portable device.
  • More so, they can become a one-stop shop for learning and reference. (Imagine reading Ulysses with a guide/dictionary/notes all built in, so you aren’t grappling for the back of the book or another book – you simply tap to bring up the information you want immediately.) For an adult-learner example, imagine a technician having the full manual, as well as troubleshooting guides and interactive guides explaining the concepts behind a technology, all together. They can find and use the specific information they require in seconds. This could help speed up processes, especially for the rarer problems people face. Similar arguments could be made in medicine (reference guides, diagnostic practices, photos of symptoms could all be provided in one place, on one sleek device) and sales (product references, user guides, application guides, price points, etc.).
  • (Possibly) Automatic revision? I am unsure of this, but if iBook Textbooks are built on the app model, a publisher could keep the content in their texts up-to-date in a much quicker, much more effective method, pushing new updates so their learners/users will always be confident they have the most up-to-date information.  This could completely disrupt the textbook model, with purchasers taking a “Subscription” to a text book for core content and updates.

From an instructional design point of view, some educational technology is missing from iBook textbooks.

  • Ability to network/use forums or social media
  • Quizzes to help learners monitor their progress through a subject
  • the development of interactive scenarios

These are all regularly used to enrich the learning experience at all levels (from schoolchildren to young and even more advanced adult learners).

Perhaps this is on the horizon? Is it conceivable that someone else has already thought of this and could be developing an Android equivalent?
For instructional designers, this may mean we cannot extend/develop our learning content to the full extent that we might want. (From my own personal point of view, scenario/quiz-based learning is very important.) On the other hand, there could be great challenges in using the core functionality to mock up things such as quizzes and scenarios (for example, if there is a function to jump to a specific page/point). However, without a dedicated quiz engine, any workaround would still lack key functionality, or make that functionality too clumsy to mock up (consider a question with several options, individual feedback for each, as part of a quiz of several questions, with feedback at the end of the quiz… that’s going to be complex).
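To give a sense of why mocking this up without a dedicated quiz engine gets clumsy, here is a rough sketch (an entirely hypothetical structure of my own) of the data just one multiple-choice question with per-option feedback carries; multiply this by several questions plus end-of-quiz feedback, and the branching a page-jump workaround would have to reproduce grows very quickly:

```python
# One question with individual feedback per option, plus quiz-level summary feedback.
quiz = {
    "questions": [
        {
            "prompt": "Which tool does Peter have?",
            "options": [
                {"text": "A bucket", "correct": True,
                 "feedback": "Right - Peter carries the bucket."},
                {"text": "A shovel", "correct": False,
                 "feedback": "No - the shovel is Jane's."},
                {"text": "A rake", "correct": False,
                 "feedback": "No rake appears in the story."},
            ],
        },
    ],
    "summary_feedback": {
        "pass": "Well done - you followed the story closely.",
        "fail": "Revisit the story and note who holds which tool.",
    },
}

def score(quiz, choices):
    """Return (number correct, per-choice feedback, end-of-quiz summary)."""
    feedback, correct = [], 0
    for q, choice in zip(quiz["questions"], choices):
        opt = q["options"][choice]
        feedback.append(opt["feedback"])
        correct += opt["correct"]
    key = "pass" if correct == len(quiz["questions"]) else "fail"
    return correct, feedback, quiz["summary_feedback"][key]

print(score(quiz, [0]))
```

Every branch here (each option’s feedback, plus the pass/fail summary) would need its own page in a page-jump mock-up, which is exactly where the clumsiness comes from.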


There are obvious opportunities immediately available. Apple have hooked up with some publishing companies to provide a massive library of content. Their success with iTunes and the music industry suggests that the development roadmap will be full for quite some time, and libraries of content will be released in time.
For instructional designers, this could mean opportunities within more traditional publishing houses to help them develop/redevelop a huge amount of texts into more engaging and interactive content. Whether publishing houses would go for this is anyone’s guess. I would imagine there will be a critical mass – once X number of publishers are on board, the rest may have to follow to stay relevant.

I think the biggest threat could be the model itself. Again, price is going to be a problem for many people. This might mean someone else comes along to develop an Android equivalent using cheaper hardware and OS. But then will instructional designers have to deal with development for 2 different operating systems (and hardware setups)?

As it is, one will need to buy a Mac to develop iBook Textbooks – which is a costly prospect to say the least.

With any technology for eLearning development, there is also the threat of instructional designers becoming lazy. You could get away with a lot of content that looks very good, but is instructionally poor if you don’t keep in mind the fundamentals of your profession. There is no inherent design/development process – this is the value you add as an instructional designer. You will still need to deal with SMEs, designers, project managers and clients. You will still be responsible for ensuring that learners using the finished content will learn – will achieve the objectives set out for that content.
Another threat could be that many people decide they *only* want iBook Textbook content, disregarding a lot of other content that could be more engaging/useful for learners. Without quiz/scenario-based learning, this would degrade the quality of your learning product even further. Unless you could somehow hook the textbook up to an LMS, where learners could go for testing/scenarios. While this could work, it seems quite clumsy given that you’re using such a sleek model to deploy your content in the first place.


I have no doubt that the iBook Textbook is going to make serious waves – not just in school/college education, but in further education, CPD and ongoing requirements for those who work in industries where information is constantly being updated, or where typical responsibilities will often require a small library of content for reference. I myself am looking to save for a Mac for the express purpose of being ready, should (and when) the revolution hits full throttle.

Hello again, and thanks for dropping by. I appear to be blogging on the theme of questions in eLearning. This month, I’ll be talking about True or False questions – and what they can – and cannot – do. If you’d prefer to see more about developing content, downloads, supporting materials, integration for blended models, etc. please do let me know. I’m on questions at the moment because they appear to be poorly represented in eLearning blogs in general. Anyway, let’s get on.

What this post is about

The use, proper and poor, of that old cherry – the True or False? question. In this post, I’ll provide some advice and strategies for using True or False questions in your eLearning project. I hope to show there is more to True or False questions than many people assume (essentially – stating bald facts or bald lies to identify whether your learners can distinguish between them); and hopefully inspire some new thinking in the use of this question type.

What it is not about

This is not a post intended to give specific true or false questions.

Here is a Summary

True or False questions are often dismissed as simplistic, but written carefully they can test whether a learner can connect concepts and apply knowledge in the right context.

Here begins the post

Recently, I was discussing True or False questions – from IRL (in-real-life) tests, actually – when the person I was talking to mentioned a third person who claims:
“True or False questions are useless! All one has to do is answer True – and you will pass. Because the statements used ALWAYS tend toward truth.”

There may well be truth in this statement (or falsity!), but it is too much to deal with in one post – there are too many issues. Here are just some:

  • What is your passing grade (that allows someone to answer “True” for every question and still pass)?
  • What is your content (that the only manner in which to phrase the questions is such that they tend toward truth)?
  • What is the context (that you are using “True or False” questions as a form of assessment)?
  • What is your goal for your learners (that “True or False” questions are being used – is there a specific reason?)?
  • What is your learner’s goal (how does the use of “True or False” questions help them to achieve their aims, by applying the knowledge they are hopefully learning from your content)?
  • How is it that all your questions actually do tend toward truth (I must admit, in developing these questions I instinctively tend toward the opposite – there is no good reason for this, perhaps I am a man of falsehoods, I know not)?
So, let’s start by taking a look at the anatomy of a True or False Question.
At its heart, a True or False question is a simple multiple choice format. There is a statement, from which the learner selects an option.
Going a little deeper, a True or False question is a statement that either reflects a true state of events, or is a bald faced lie.
That’s it. In practice, it will look like this:
Statement: Some state of existence in the world.
  • True?
  • False?
The learner reads the statement, then selects True or False as a response, to indicate whether they believe that the statement does actually reflect some state of existence in the world, or does not.
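In code terms (a minimal sketch in Python, assuming nothing about any particular quiz engine’s format), that anatomy amounts to a statement, a correct boolean answer, and a simple comparison:

```python
from dataclasses import dataclass

@dataclass
class TrueFalseQuestion:
    """A statement the learner judges to reflect a real state of events, or not."""
    statement: str
    answer: bool        # True if the statement reflects a real state of events
    feedback: str = ""  # optional explanation shown after the response

    def check(self, response: bool) -> bool:
        """Return whether the learner's response matches the correct answer."""
        return response == self.answer

# An example drawn from the Peter and Jane story discussed below:
q = TrueFalseQuestion(
    statement="Peter has a shovel.",
    answer=False,
    feedback="No - Peter has the bucket; the shovel is Jane's.",
)
print(q.check(False))  # → True: answering False is correct
```

That is the entire mechanism – which is exactly why everything interesting about the question type lies in how the statement is phrased, not in the machinery.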
So what’s wrong with it?
In many cases, it’s seen as far too simplistic (and because it is seen as simplistic, it is used simplistically). This is often true. Here’s an example. Imagine you read the Peter and Jane story to your learner (this is a story for children, along the lines of:
This is Peter. This is Jane. Peter and Jane are at the beach. Peter and Jane will build a sandcastle. Peter has a bucket. Jane has a shovel.)
Quite often – too often – the True or False question will be used to test the learner’s ability to remember a bald fact. For example:
True or false? Peter has a shovel.
Of course, we know Peter does not have a shovel.  The learner, irritated, clicks False and if they receive feedback, it annoys them that they are being told they were right (God help us all if they were wrong), which adds a delay (read: frustration) to their learning. I know not all statements of bald fact are this easy.
However, another issue creeps in: recognition (as opposed to understanding). When you make a statement that repeats (whether exactly to elicit “True” or incorrectly to elicit “False”), you run the risk of the learner recognising the statement. This is a memory of the language – not of the actual content, or meaning. The learner will not remember the importance of Peter’s bucket – they will simply recognise that the sentence they read did not say that Peter had a shovel. This is an important distinction, because in learning, we aim for retention of knowledge and understanding – not recognition of phrasing.
So, what’s right with it?
The True or False question can be a great way of asking a learner to demonstrate their understanding of a complex situation. This may appear to be a far cry from Peter and Jane, but it applies quite well. (I use the term “complex” here to mean a series or combination of simple statements, that combine to become complex). The True or False question can be used to take disparate concepts and join them together. Some applications include:
  • Between topics within a learning unit
  • As a pre-test to determine whether a learner can connect two objectives (i.e. apply knowledge they should have in a relevant context)
  • After a complex topic (i.e. where several concepts have been discussed) to determine whether a learner can put the concepts together, and understand their outcome
(This really is, just a few. There are many more applications, which I would tease out if there were a book as opposed to a blog to be written).
So what do these options mean in real life? 
Between topics within a unit – as part of a quiz, the True or False question can be used to link some concept in the previous topic to a concept in the next topic (e.g. in the last topic, we learned that Peter had a bucket and Jane had a shovel. We might also have learned that they intend to build a sandcastle – the mechanics of which will be dealt with in the next topic. So we might ask: True or False? A typical sandcastle cannot be built using a bucket and shovel.). If you’re using a True or False question in this way, it cannot be an assessment item. You can’t assess someone based on such a question without presenting the information first; it’s simply not fair.
However, if you aren’t assessing learners, this is an excellent way to focus their attention on what is coming next and to identify the key information from the previous topic that will be employed in the next topic. Furthermore, if you are using feedback immediately after the question, it can be used as a learning event in itself (if you are comfortable with this, and your course is organised in such a way as to allow the learner to learn from the feedback). In short, you can ask about two disparate concepts, and use the feedback to demonstrate to the learner how they are connected.
A note of caution – be careful with this approach. The learner should – based on the preceding content – have some idea of the connection between the concepts. If something new is introduced out of the blue, the learner will lose faith in you. However, if the concepts have been dealt with thoroughly, a True or False question that challenges learners to connect them can be really useful in focusing their attention or making them think about the concepts in a new way.
As a pre-test, True or False questions can be used in a similar fashion – by asking about 2 seemingly disparate aspects of a complex procedure or body of knowledge to see if learners can connect them. For example: True or False? If Peter uses a shovel and Jane uses a bucket, they can build a sandcastle. In this example, it is true. We know from the story that this isn’t the way things worked out – but the learner should be able to extrapolate from the 2 actors and the 2 tools that the outcome is achievable. This has been simplified – it’s not always the case that the statement will be true (see my final advice), but in this specific context, we are on safe ground. However, if one were training in health and safety, it might be the case that Peter and Jane need certificates to use the equipment they have. In such a situation, the very same question could be false – because only Peter has the certificate for use of a bucket and only Jane has the certificate for use of the shovel. Suddenly, something very simple has become quite complex in itself.
This leads me to the third point – after a complex subject, True or False questions can be really effective. To build them, you need to phrase the statement very carefully, then ask whether it is true or false. Taking our previous example:
True or False? If Peter uses the bucket and Jane uses another bucket, they can build a sandcastle without a shovel at all.
Well, is this true or false? If we have only stated the facts as listed above, we cannot mark a learner down for saying it is false. All they know is that Peter and Jane have a bucket and a shovel and with these tools, they will build a sandcastle.
However, we may have explained the purpose of the bucket and shovel – to collect (shovel) and hold (bucket) sand. We may have gone further and explained the characteristics of a bucket and shovel, which means the learner should be able to identify that one bucket can be used to pick up sand – and the other can be used to hold it. If the learner is expected to be at a level whereby they can understand all this – then they should get this question right. We are testing their ability to recognise that one of the buckets can be used as a shovel, because it has characteristics that allow this. To get even more subtle – if the only way to build a sandcastle was to use a bucket and upend it into the sand – then the vice versa case is false – we cannot use 2 shovels to build a sandcastle, because a shovel (by its characteristics) cannot collect the amount of sand we require to be upended to create our sandcastle.
I hope these examples have provided some food for thought, and will extend the use of the True or False question type.
Some Final Advice/Thoughts. When using True or False questions:
The statement you write should contain all the information the learner needs to answer true or false – based on their defined level of expertise and experience. As you may – or may not – have noticed, I hate “trick” questions. The truth (or falsity) of the statement you make should rely on itself alone. You must remove any possibility of a learner thinking “Well, in this context, it might be true, but in general it is false”. The learner is attending to the question, and so must be able to answer it based on the merits of the question provided (this may not be the case with all question types – but with True or False, I believe it is).
Your statement must be quite finely honed. As an extension of the above – you must be careful to remove all uncertainty from the statement you make. Ensuring this will usually require a second pair of eyes to make sure there is nothing in your statement that might confuse learners. This is really an issue for the editorial process, but it is important to point out.
Mix it up. Some statements are true, some are false. So, make some of your own statements false. My experience would echo the statement at the start of this post – that most True or False questions tend toward truth. However, if you do decide to tend toward false:
  • Don’t play with complex constructions that make the statement false (i.e. double, triple, negative falsehoods)
  • Don’t go with simple, bald false statements (see the earlier point about recognition vs understanding)
  • Make sure the false statement appears to be likely, but don’t try to trick the learner in this regard. Questions can be aloof from the learner and the general content; a True or False question needn’t attempt to ‘catch a learner out’ by being especially inviting in any way. Remember – you have taught the content – the question can now be used to connect, combine or otherwise construct knowledge based on the individual parts.
If you use feedback – immediate feedback – True or False questions can be an opportune way to tease out differences. For example, if you have covered two disparate concepts that can be brought together in one True or False question, the feedback to the question can be used to provide further insight (the method or other technologies that Peter and Jane could use to build a sandcastle; how sandcastles can be seen as a metaphor for something else; etc.)
Here ends the post.
Thanks for reading. Please do leave a comment or email me if you’d like to talk about this more.

Hi there. Sorry I’m so late. I got caught in traffic. From now on, I’m going to try and get the train – this means posting once every month. Let’s see how that goes.

What this post is about: This post discusses some pitfalls I have come across when trying to create effective assessment or interactivity using quizzes/Q&As (and some that I have noticed in other people’s learning content).

What this post is not about: This is not an in-depth discussion on the various types and methods of assessment. Rather, it talks about what you might try to do (but really should avoid) when developing quizzes to test for knowledge retention and understanding.

Here is a summary: The last post talked about 5 things to do with quizzes, which I hope you found useful.  There are many temptations to over-reach when asking questions – leading to questions which you might think are more engaging and exercising, but in fact are irritating for learners, or do not add to the educational value of your content.

Here begins the post

Quizzes can be the hardest part of developing online learning. What’s a good question? The content seems so thin, how can I ever get X questions from it? How can I grab the learner’s attention with this? How can I avoid boring them?

You’re looking at a blank sheet of paper, wondering what you should be asking…

…then you get “creative.” You try to ask the same questions in different ways. This is not always a bad thing – especially where the same information might be applied in different contexts.  However, problem questions come from over-reaching. Here are 5 ways I’ve seen people over-reach, which (for me) just don’t hit the mark.

5 Things to Avoid

1. Avoid dealing in “semantics” and using “trick” questions

Both of these points relate to the language you use when asking questions.  Therefore the following is not necessarily true where your content is addressing comprehension, definitions, grammar or parts of language.

Using “semantics” means doing things like:

  • asking a question using a double negative
  • playing on words/puns or subtler meanings to ‘throw’ a learner
  • using options where a similarly-spelled word is used
  • using difficult sentence structures, especially where cause/effect or timelines are being asked about

This sort of tactic is just not fair to learners.  You aren’t testing them on their language skills; you’re testing them on the knowledge they are meant to be gaining as a result of your content.

Many argue that asking questions like this is a good way to “keep learners on their toes”, or “make sure they’re paying attention”. I’m not so sure.

  • First, keeping learners on their toes can be achieved by asking pop-questions within your content, or asking about subtle differences in application of knowledge – not by seeing whether they notice a minor aberration in your grammar, punctuation or spelling.
  • Furthermore, the quiz itself is a good way to make sure learners are paying attention; asking that they notice very subtle linguistic issues when they are concentrating on the concepts, theories or techniques you have been describing is off-putting.

Trick questions I have more difficulty with (not just in terms of their use, but also in describing them). These are questions where you actively try to lead a learner astray (i.e. away from the right answer).

I think quiz questions – whether for assessment or self-reflection – should be fairly unbiased. For example, multiple choice questions should present the learner with a series of equally likely (or unlikely) options. If the question is skewed one way, it might be too easy (which doesn’t really exercise learners); skewed another, it will feel to the learner like you are trying to trick them (which you are!). The problem with tricking a learner is that you annoy them or, worse, they start to question the value of your content.

However, there is a subtlety in the “trickiness” here. Sometimes, you may want to ask about the differences between certain concepts or objects. Or, you may want to ask when a specific piece of information might be applied to a situation. In these cases, subtle differences in wording can be a very useful way to tease out those distinctions. (In fact, I’m a particular fan of “True/False” questions where you ask whether a piece of information can be applied in the wrong context. The answer should be False, and the feedback can be used to explain the importance of context to the information.)

Be fair to your learners: They are there to learn from your content – not have their comprehension/concentration tested.

2. Avoid writing ‘to content’

You have objectives that you want your learner to achieve. Often, these objectives are put to good use in formal assessment quizzes, but forgotten when writing questions about content a learner has just encountered. What often happens is that a sentence is lifted from the content and reworked as a question. Quite often, this will work for you (after all, your content is trying to drive your objectives).

However, this approach can lead to problems. If the sentence lifted from the content is ancillary (in that it explains something not directly addressing an objective, but supporting other information that directly addresses the objective), the learner may think this piece of information is more important than it actually is.

Another problem arises where the sentence or piece of content you are basing your question on sits within a particular context in the learning content. If the information is taken out of that context, it may lose meaning or importance. It may also become confusing, making any question based on it meaningless.

Finally, asking questions that speak to context can lead to asking unimportant questions. This is particularly the case where you have used case studies or examples in your content. Asking about a scenario you outlined, or the actions taken by some ‘pretend’ actor, leads the learner completely astray from the point of your learning content. Of course, asking about the theory, concepts or steps that lie behind a scenario is fine. But all too often, I find learners being asked “How much did Mary spend in the shop?” rather than “In a shop, an apple costs 50c, a coke €1 and a sandwich €5. If a customer buys 2 sandwiches, 3 cokes and 4 apples, how much do they spend?”
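As an aside, the arithmetic behind the better question can be checked in a few lines of Python (the prices and quantities are taken straight from the example question; the code simply totals them):

```python
# Prices from the example question, in euro
prices = {"apple": 0.50, "coke": 1.00, "sandwich": 5.00}

# The customer's basket: 2 sandwiches, 3 cokes, 4 apples
basket = {"sandwich": 2, "coke": 3, "apple": 4}

total = sum(prices[item] * qty for item, qty in basket.items())
print(f"€{total:.2f}")  # €15.00
```

The question works precisely because it asks the learner to perform this general calculation, rather than recall a detail about Mary.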

Remember: your IDD should map the course goal into objectives, and those objectives to content. Both robust assessment and lighter self-reflective quizzes need to step back to the objectives in order to be really useful for learners.

Use your objectives to guide your questions, rather than simply combing a topic for easy content to turn into questions.

3. Avoid developing tests as a long series of simple multiple choice questions

This is the hardest “avoid” measure for me to defend. House style, SME opinions, certification demands and client requirements could make this point redundant. However, I shall argue on.

Being asked the same type of question over and over is pretty boring for learners. Click here, click there. One option, two options. Et cetera. If you can at all, avoid doing this.

Most LMS and authoring tools now offer a range of question types that you can use. Try them out – don’t tie yourself to multiple choice simply because it seems to be the most robust type of question.  All types of questions can be used to ask about the truth of a statement or a definition. You could also consider:

  • Fill in the blanks for cause/effect, timeline, steps, definitions, conditions
  • Matching for the same as above, but also relationships
  • True/false for subtle differences arising from context, use of information, etc
  • Graphical questions for relationships, screen-based information, etc

All these question types can also be used to provide scenarios or contexts within which you can really test your learners’ understanding. There are of course many more contexts in which these questions can be used – perhaps the subject for a series of posts; but I guess my advice is:

Avoid doing the same thing over and over again; try to be creative and consider new ways to ask about the information.

A corollary of this is the way you frame your questions, where you are stuck with multiple choice alone. If the stem of your question is exactly the same over the course of 20 consecutive questions, your learners’ minds will start to stray. The ways to frame your questions are bound only by your imagination. I also know that this advice could be seen to contradict my first point – but it shouldn’t. Reframing a question does not mean making it more obscure. Instead consider:

  • If you are asking about information, asking learners to identify a definition for something
  • Also with information, asking when information might be used
  • Also, providing a scenario and asking what information might be used
  • If you are asking about steps, providing one of the steps and asking the learner what happens before or after it
  • Also with steps, listing the steps in a process, then asking what was missing
  • Also, asking why a step is important

4. Avoid getting ‘flashy’ for the sake of it

The exact opposite of the preceding point. Trying to show off your content creation talents with wildly differing question types, formats and approaches could leave your learners with their heads spinning.

It is tempting to try to “spice things up” or even “stir things up” using the vast selection of questions and content development tools out there. However, you should keep your learner in mind – they should be concentrating on getting to the right answers, rather than the question itself. Drawing too much attention to the format or bells and whistles attached to your question is something akin to writing a bad postmodern novel – those that appreciate the aesthetic of your great efforts may well miss the point of your making them.

Furthermore, great swings in the delivery of content, or in the way learners answer questions, could confuse them and draw them away from what is important in your learning content.

I’m not saying you should avoid being ambitious – rather, the manner in which you ask your question should suit the information you want the learner to work with. Sometimes the information you are testing on may require complex questions – but then you should make sure all the questions in your quiz work together in some way.

You want learners to be exercised by quizzes, but you don’t want them to be worn out by them.

5. Avoid overcomplicating your questions

Again, related to all of these points: keep it simple. Of course, some content is necessarily complex. But complexity is best built up from several simple pieces.

With the proliferation of available question types, you may be tempted to flex your muscles and shoe-horn a question into an ill-fitting question type. Alternatively, you may be tempted to create a large, complex scenario-based quiz that learners may not be able to follow. In short, these approaches add up to the same thing: making the question difficult to comprehend. The longer a learner has to spend working out what a question means, the less effort they will put into actually thinking about the right answer.

This brings us back to the beginning – you’re not testing their language ability – you are testing their comprehension of the information you provide in your content.

Ask direct, pointed questions – so the learner concentrates on calling up the information, rather than working on understanding what the question is asking of them.

Here ends the post.

Talk to you next month! As always, please do leave a comment below.

Also, I’ve been toying with the idea of porting this whole project to a Tumblr blog, which might make it easier to start and engage in conversations. Please do let me know your thoughts on this.

What this post is about: This post provides tips on developing quiz assessments for instructional media.

What this post is not about: This is not an in-depth discussion on the various types and methods of assessment. Rather, it provides advice on what you need to consider when developing quizzes to test for knowledge retention and understanding.

Here is a summary: Quizzes are not the only method of assessment available. However, they are the most popular, as they are easy to develop and manage. Unfortunately, quizzes are often seen as an add-on to learning content, rather than being central to its purpose. In this post, I discuss this in more detail, and provide 5 useful tips that can help you develop better quizzes that really assess learner knowledge retention and understanding.

Here begins the post

Quizzes are probably the most popular form of assessment for online/digital learning materials. They are easy enough to write, develop and deploy. From a learner administration point of view, they are also easy to track and measure. They seldom require intervention, as they run from an LMS, assessment engine or other automated system.  Once learners have taken a quiz, the (reliable) results are provided, letting administrators know which learners have done well, and which have not.

However, many quizzes I have seen are poorly implemented. They are an afterthought, created as a piece that slots in toward the end of the project: an administrative effort, required by the Subject Matter Expert, Curriculum Developer, Client or the Project Manager.

This is the wrong approach.  Quizzes – or any assessment, for that matter – should be much more central to your learning product. It is the assessment that indicates how much knowledge learners have retained and how well they have understood it. The questions you pose also offer the opportunity to test how well learners will be able to apply the knowledge they have learned.

5 Things to do

Test to your objectives (verbs, purpose, etc.)

It is crucial that your quiz questions relate back to the course objectives you have set. As obvious as this seems, it is often not done. Instructional Designers will much more often work from their scripts or content samples, tying quiz questions to specific content in the learning product. The logic is: “This is what they have learned, so it is safe to test on it”. While this makes some sense, it loses sight of the purpose of both the quiz and the learning content that has been developed.

Learning content is fundamentally about improving people’s personal or professional ability. The specific improvement to be achieved is broken down into constituent objectives. For most subjects and learning content, it would be impossible to test whether the learner has improved their skill by asking them to display it. Therefore, it is the objectives that are tested, with the fair assumption that if the learner can achieve those actions described by the objectives, they should have improved their skill.

This means considering:

  1. The context of each objective (are there specific conditions under which the action described in an objective should be performed?)
  2. The verb of the objective – this is really important. I’ll elaborate more on objective verbs in another post, but for now, take it as read that “describe”, “identify” and “demonstrate” are completely different things. While this is obvious when written like this, you might be surprised how often they become interchangeable when learning content is being developed
  3. The subjects and objects of the objectives. What should the learner be able to do? Will they do this with an object, a tool or a piece of software?

Answering these questions about your objectives will go some distance to helping you develop really good quiz assessments.
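As a rough sketch, the three considerations above could be captured in a small record before any questions are written. The field names here are my own invention, not part of any instructional design standard:

```python
from dataclasses import dataclass

@dataclass
class Objective:
    verb: str     # e.g. "describe", "identify", "demonstrate" - not interchangeable!
    subject: str  # what the learner acts on: an object, tool, software or information
    context: str  # the conditions under which the action should be performed

# A hypothetical objective for a basic word-processing course
shift_key = Objective(
    verb="demonstrate",
    subject="the Shift key",
    context="capitalising words while typing a document",
)
```

Writing objectives down this explicitly makes it hard for the verb to quietly drift from “demonstrate” to “describe” during development.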

Decide on quiz items when developing an IDD

An important way to decide on the design of your quizzes is to outline them as part of your initial Instructional Design Document. I know this is standard in most companies. I also know it’s standard to treat assessment as little more than an afterthought, or something of an onerous task in the push to get the IDD out and signed off. However, carefully considering your quiz assessments, based on the course objectives, at this point will actually provide a couple of benefits:

  1. They will give you a better idea of how best to assess learners’ retention of knowledge and/or understanding. This in turn can suggest ways in which the content itself should be developed to improve retention and understanding
  2. If you consider your quiz questions ‘blind’ (i.e. without looking at the content itself), then look back on how you will develop your content, you can check that the content does address all the objectives. If you have questions that ask something not covered by the content – you need to include it

Ask questions as closely relevant as possible to the job/task/skill you are training on

Again, this may seem obvious, but it is advice that is often overlooked rather than deliberately ignored. In the effort to turn objectives into questions, instructional designers will often lose sight of the real purpose of the objective. Should someone know the definition of a term, or what that definition actually implies when using a tool, software or piece of information in their job? Usually, the latter is the case.

Here is an example: Pressing the Shift key will display letters in uppercase. So, what is the Shift key for? Well, two possible answers are immediately apparent:

  • displaying letters in upper case
  • capitalising words (whether at the start of a sentence or mid-sentence)

The first option here is very literal, but the second provides a useful context in which the information is used. From my viewpoint, this makes the second option more ‘correct’ than the first.

Of course, the treatment you provide will depend on the objectives, which is (yet) another reason why your objectives should be carefully drawn out. However, in most cases, the application of information is more important to test than its abstract truth. This is especially true when developing and assessing content for specific work tasks.

Consider the assessment as an item in itself – not just a string of questions!

Quizzes in particular are often seen as a string of questions. Instructional Designers bash them out and send them off for review. However, a much more interesting way to develop your quiz assessments is to try to consider the whole piece. Consider the following:

  • In what you are testing, is there a beginning, middle and end that relates to your objectives? Could you use this to provide a ‘development’ for your quiz?
  • Does the information you provide in the learning content tie together in an interesting way? Can you ask one question to assess fundamentals, then go on to ask other questions that assess the ability to apply the information?
  • Is it possible to ask questions in a structured fashion that relates to how learners will use information, tools or software in their jobs?
  • Is there a typical scenario you could use to develop questions from? Ask the SME!

Add some colour!

If I had a penny for every time I’ve seen a quiz assessment made up entirely of multiple choice questions… Most VLEs and assessment engines now offer an excellent range of question types for quizzes, and you should try to exploit this. Don’t allow learners to passively click their way from one end of a quiz to the other – even assessments are learning opportunities. When considering question types, it is also important to ensure the type of question you are asking relates to the type of thing you are assessing. Here are some examples:

  • Fill in the blanks (or a series of them) can be a useful way to test steps in a process
  • Matching questions are a helpful way of checking relationships between concepts, or where one might find a menu command or piece of information, or perform an action
  • Graphic-based click-answer questions are great for testing people’s understanding of situations, scenarios or the actions they need to take in software applications to perform tasks
  • True or false – often considered the sledgehammer of quizzes – can be used to test understanding of conditions, situations or differences in functions or features. As long as your question is fair, you can ask learners to identify subtle implications of information outlined in the learning content.
  • Just about every question type can be used to ask learners about definitions
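To show how little extra effort this variety costs, here is a minimal sketch in Python of three of these question types sharing one checking interface. The class names and sample questions are invented for illustration, and are not tied to any particular VLE or assessment engine:

```python
# A minimal sketch of mixed question types sharing one "check" interface.
# The classes and sample items are invented for illustration only.

class MultipleChoice:
    def __init__(self, stem, options, correct_index):
        self.stem, self.options, self.correct_index = stem, options, correct_index

    def check(self, answer):
        # answer is the index of the chosen option
        return answer == self.correct_index

class TrueFalse:
    def __init__(self, statement, answer, feedback=""):
        self.statement, self.answer, self.feedback = statement, answer, feedback

    def check(self, answer):
        return answer == self.answer

class FillInTheBlank:
    def __init__(self, stem, blank):
        self.stem, self.blank = stem, blank

    def check(self, answer):
        # Case-insensitive match on the missing word
        return answer.strip().lower() == self.blank.lower()

quiz = [
    MultipleChoice("What is the Shift key for?",
                   ["displaying letters in upper case",
                    "capitalising words while typing"], 1),
    TrueFalse("The Shift key capitalises words only at the start of a sentence.",
              False, "Shift capitalises wherever it is pressed."),
    FillInTheBlank("Pressing the ____ key displays letters in uppercase.", "Shift"),
]

# A learner's answers, in order
score = sum(q.check(a) for q, a in zip(quiz, [1, False, "shift"]))
print(f"{score}/{len(quiz)}")  # 3/3
```

Because every question type exposes the same `check` method, mixing them in one quiz costs nothing in the marking logic; the variety is purely for the learner’s benefit.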

If you’d like to discuss this, please feel free to leave a comment below. Alternatively, you can email me about this and other opinions at brendan dot strong at gmail dot com.

Tune in next time for 5 things you should avoid when developing quiz assessments.

What this post is about: Considerations you need to make when creating a strategy to write, develop and/or edit instructional copy. It looks broadly at structure and approach.

What this post is not about: Grammar, punctuation or sentence structure. These will be covered elsewhere.

Here is a summary: Learning communication is full of paradigms, short on silly metaphors. My Horse and Cart approach aims to rectify this. Much instructional copy is poorly implemented, as writers or designers don’t consider their objectives, voice, audience or the impact of their copy. This post aims to provide suggestions to help instructional designers overcome this.

Here begins the post

This post uses a silly metaphor (horses and carts) to provide some guidance on how you should develop your instructional copy. It deals primarily with strategies for developing learning content.

The first thing to consider is the road to be travelled. Your horse and cart have to be prepared so that they can deal with the obstacles that may arise along the way. This translates to ensuring your learning content will apply to ‘real world’ scenarios. Consider:

  • What it is people should be able to do
  • Why they should be able to do it (knowing this will help you identify where knowledge can be applied elsewhere, or where learners may need to consider how best to apply the knowledge they learn)
  • What the outcomes for the learner, their organisation and/or industry should be (in this order)

Next, make sure your horse is right for the cart. This translates to ensuring your learning content is being driven by the right objectives. You should consider:

  • That the objectives all contribute to the desired outcomes of your learning project (as outlined by the Road you need to travel)
  • The objective verbs are correct. This may seem pedantic, but the difference between “understand”, “identify” and “describe” is a series of lacunae something like the Great Lakes. You need to ensure the objectives you map out for the project will take the learning in the right direction and ensure learners get to where they need to be (as defined, again, by the road)
  • That your learning content directly addresses your well-defined objectives. Anything off the point should be provided as an add-on (e.g. drill-down, download or “See also…” types of information). Keep the copy as sharp and focused as it can be to achieve the best effect and get learners from start to finish quickly and easily

Now, the cart itself. This is the media you are using to carry the goods. Think about carts here carefully – you don’t try to shape the goods to fit the cart (although you can do); instead, you should ensure the cart will carry the goods. You don’t carry milk in a wooden flatbed. Neither do you use a discarded oil tanker.  You also need to consider how you prevent the goods from falling out or spilling. This translates to the media you use to deliver the knowledge from you (or the trainer, facilitator, etc.) to the learner. You should consider:

  • The best format to assist the learner in retaining the knowledge. This can be a tough decision, but it generally boils down to a choice between print, online and multimedia. Consider permanence, access and the ability or need to update, as well as how engaging the media needs to be (will learners have sufficient motivation to stick with something that is not very engaging?)
  • The best format to express the information you want to deliver.  For example, complex relationships should always be represented graphically. Steps are often better implemented in text, as you can then provide download checklists. However, some steps might benefit from a screencast.
  • The best way to structure and use that format. Never expect anyone to sit down for 60 minutes to ‘learn’. Even if you’re using video or Flash, learners just will not concentrate for that length of time in one sitting. You need to structure your learning so that it contributes to achieving your objectives, but is also delivered in manageable, engaging sections that learners can take one at a time
  • Assessment and other retention techniques and checks for understanding. Ensure that however you deliver your training, you check that learners can remember and use the information being provided. It is next to tragic to see well-structured and well-delivered learning content that people forget immediately after leaving the classroom. More and more often, I find assessment and checks for understanding being considered more important than the content itself. This may be a foolish turn of events, but the point stands: assessment, retention techniques and checks for understanding are the only way to be sure that the knowledge people take on board can actually be applied to the workplace or situations they are training for. You can look at your content and say “Yes, this applies to that objective”, but without assessment you can never be sure how effective it is in doing so.

Finally, consider the wheels on your cart. This relates closely to ensuring the goods don’t fall out the back. What it entails is looking at the road, the goods being carried, the weight of it all together and ensuring the wheels will be able to support the cart, and move it along the road. Consider:

  • How well your learning content will stand up to the demands of a modern workplace. Do you ask people to take time out from work, or can they take the training in hour-long bites?
  • How often does the information you are providing change? Is it about software, which may update regularly? How do you intend to deal with such changes so that (1) the learning content stays relevant and (2) learners themselves can be kept up to date?
  • Can the learner use what they have learned in the content and easily/directly apply it to their workplace, or tasks they are learning to perform? What is the gap between the Learning environment and the Real World environment, and how do you intend to bridge it?

Finally, keep an eye on your cart as the horse draws it. I wanted to include this last point with the Wheels, but it doesn’t really belong there, no matter how silly the metaphor can be made. Evaluation is often overlooked, but it is always a key driver in improving all future learning content (or indeed existing content that can be updated). Monitor every step of your instructional copy, from design and development to implementation. This will allow you to evaluate what worked best (improving your own processes) and what was most effective for learners (improving future learner engagement and success).

I hope this has been useful to you, or at the very least of interest. If you’d like to hear more about my opinions, please do leave a comment, email me at brendan-dot-strong at gmail-dot-com. If you’d like to share a link to this post with someone else, please do so!