This tweet gave me a good laugh, for I feel the same:

I don’t hate video tutorials, and have made many. The point is, if it’s not the right solution, we end up with tweets like this. Spray-painting a cow is also an expensive proposition that raises many logistical problems (especially if on-the-job content is being provided).

It’s useful to have someone with the knowledge and expertise to help you determine, plan and execute the best solution to your problem.

What this post is about: A continuation from yesterday’s “On Putting it Online…”, but with a bit more structure in terms of when you might do what.

What this post is not about: As with yesterday’s post, this is not a project map or detailed instructions on developing online content. This provides general principles.

Here is a summary: After some flattering and some not-so-flattering emails about yesterday’s post, I want to provide a more coherent structure by outlining the steps you take to put learning content online. While I agree yesterday’s post was certainly not my finest moment in written content, I still stand by the thrust of my argument. This post merely reflects that I had not truly considered my audience (those seeking advice on creating online content), which is something I strongly believe one should do. Won’t somebody think of the shoemaker’s children?

Here begins the post

So, as I was saying, you need to develop content that is assessable, applicable and appropriate. But what steps do you take to achieve this?

First Steps…

Identify the boss. Someone will need to manage the whole project, lest it spin out of control and you end up with hours and hours of unnecessary or incoherent content. You will also need to mind your budget (this may be preset, or you may have some time to determine it in discussion with your instructional designer and SMEs).

Assemble your Subject Matter Experts (SMEs). These may be paid or volunteer, depending on your setup (perhaps, if they are employed or contract trainers, SME activities can come under an existing contract). When considering people for this role, consider also your subject matter. You want to ensure you have expertise to cover all areas of your course. This may be as few as one person, or as many as the number of subjects you want to cover. I often advocate that more is better (as it reduces the load on each individual SME); however, one must be careful not to employ too many chefs, lest your broth become a lumpy stew of expert opinion.

Get an instructional designer. I really do believe the outlay for a learning professional will provide the best results. Seek out references and a portfolio of work, or at the very least ask them to describe projects they have worked on, problems they faced and how they overcame them. Prise specific information from them. I hate to say this, but there are some charlatans out there. However, sites like LinkedIn can be useful for identifying people they have worked with in the past whom you may know. If you are starting from scratch, I also advocate going for experience over energy. If systems are in place, energy is your friend – but where you are trying to get something off the ground, experience will really come into its own. You need someone who can identify solutions and deal with the problems that may arise, either by avoiding them or by handling them efficiently.

Next steps…

Get your instructional designer and your SMEs together. If you are managing this project, you may need to be there too, because this is the first point at which things may get a bit hairy. Your instructional designer will know little or nothing of the content to be covered. Your SMEs know everything. However, your instructional designer should be able to interview your SMEs to determine an overall goal for the course and perhaps a structure (although this is not always the first thing to happen). They should also walk away with contact details, an overall goal, perhaps objectives and – preferably – source material.

(Next – this rarely happens, but I like to add it: work on a learner survey to determine learners’ needs, how they feel they might use an online service for learning and what they might like to see covered. If you have a mature audience, you could even seek out learner advocates to help you design the survey and better understand their needs and demands. This could be a group project between the instructional designer and the SMEs, but again it needs to be managed, lest you inadvertently offer the sun, moon and stars (raising expectations unreasonably), or the answers to your questions come back contradicting one another. Take the results of this survey, maybe break out some learner profiles (in case broad differences arise among your learners), and figure out how much time they can spend online and how they might effectively work with the content.)

Depending on your situation, you may also need to consider your assessment principles. While your SMEs and instructional designer will have their own ideas in this regard, you also need to consider whether certification or accreditation of any kind will be sought for your course – and what this means for your assessment. Assessment may be multiple-choice questions, more varied questions, portfolio work (if your SMEs have time to grade it) or peer assessment (if your learners can be trusted with it); the list goes on…

More steps…

Your instructional designer will then go away for a week or so (depending on how much content there is).

They should then return with an overall structure and approach for your content. Going back to yesterday’s post, this may be:

  • Highly engaging activities in Flash or HTML5, with assessments
  • Forum/social media-based discussion groups, with portfolios (for assessment)
  • Online web pages, with case studies or activities
  • Or a mix of all these
  • Or none of these (again, your specific content, audience and intended outcomes will dictate the solution)

Now, you need something to put all this on. Website, Learning Management System (LMS), Wiki, what? Well, your instructional designer (again) should have ideas based on their proposed solution. You may also need to consider whether you need:

  • Tracking to see that learners have covered specific content
  • Forums
  • Social media integration
  • A private messaging/mailing system
  • Portfolio building as proof of activity
  • Certification/completion monitoring
  • A calendar
  • A calendar with “locking” of activities
  • Storage for grades and learner outcomes
  • A “dashboard” for learners (and SMEs, if acting as “teachers”) so everyone can monitor what is going on
  • Support for iOS devices (I know this is very specific – but the iPad is so popular now, it is important to consider whether people can use it to take part in your online course. “Bring the learning to the learner” is my motto)
  • etc. (please note, this list is disparate on purpose – there really is a multitude of things to consider, and they will all depend on your content, approach and learner profile. One way to keep track of it all is sketched below)
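If it helps, the whole checklist can be captured as data and each candidate platform scored against it. Here is a minimal sketch in TypeScript (much of this world is web-based, so it seemed a natural choice). The feature names, the Requirement type and the scoring are all my own illustration, not any standard:

```typescript
// A hypothetical requirements checklist for comparing candidate platforms.
// Feature names and the must-have flags are illustrative; yours will come
// from your content, approach and learner profile.
type Requirement = {
  feature: string;
  mustHave: boolean; // a missing must-have disqualifies a platform
};

const requirements: Requirement[] = [
  { feature: "completion tracking", mustHave: true },
  { feature: "gradebook", mustHave: true },
  { feature: "iOS support", mustHave: true },
  { feature: "forums", mustHave: false },
  { feature: "calendar with activity locking", mustHave: false },
];

// Each candidate platform simply lists the features it supports.
type Platform = { name: string; features: Set<string> };

function evaluate(platform: Platform): { name: string; ok: boolean; score: number } {
  const missingMustHaves = requirements.filter(
    (r) => r.mustHave && !platform.features.has(r.feature)
  );
  // Count every requirement the platform meets, must-have or not.
  const score = requirements.filter((r) => platform.features.has(r.feature)).length;
  return { name: platform.name, ok: missingMustHaves.length === 0, score };
}

const candidate: Platform = {
  name: "ExampleLMS", // a made-up platform
  features: new Set(["completion tracking", "gradebook", "iOS support", "forums"]),
};

console.log(evaluate(candidate)); // { name: "ExampleLMS", ok: true, score: 4 }
```

Must-have features disqualify a platform outright; everything else just feeds a comparison score you can weigh against cost.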

Even more steps…

Then it’s time to start building. In a perfect world, you will hear nothing except for any regular reporting updates you have asked for. And all the raindrops will be gumdrops and lemon drops. Your SMEs and instructional designer will enter a strange relationship, whereby they love and hate each other. In short, there is a tug-of-war: the SMEs want to put everything in, often in the same way they lecture (or learned, or developed) the content for learners. Your instructional designer will want to highlight key information, dealing with ancillary information in other ways. They will both be right and wrong at some point. They will probably come to you also.

Generally, the instructional designer will work alongside the SME to develop an outline for activities/quizzes/content. They will then design and build it (in cases where you work with a company, this will be an instructional designer, a graphic designer and a developer – however, they will usually be managed by a project manager who will be your point of contact). This will be sent back to the SME (as a script, as preliminary content, or as something very close to complete content), who will provide corrections. This is not a “maybe”, this is a “definite”. There will always be edits to improve the accuracy of content, to limit misleading information or to make information more “complete” so that it is coherent for learners. This is an iterative process, but hopefully your instructional designer will have strategies to minimise the number of iterations required.

See if you can put Anadin into your budget.

It is not all bad news. They will also both respect each other (in my experience). If you have a good instructional designer, your SMEs should be happy with what comes out, despite how they felt as it was being developed.

Nearly there? Nope…

Now, you’re not finished. Not by a long way. Once you have content developed, it will need to be tested. Does it work the way it is meant to? Is it easy to use? Is it accessible (not just in terms of devices, but to assistive software – you may have learners who live with disabilities)? Does the course do what it said it would do in the script? Does everything “hang together”? (Keep in mind, the usual process is to develop in modules, or separate pieces, then put everything together. When it gets put together, is it coherent? Are there gaps?)

End as you mean to start again…

You are still not finished. It is time to let the learners bask in the fruits of your labour. And compliment you (oh, really, you are too kind!) and complain (oh, you really are too cruel!). Get as much feedback as possible about your course. Gather it also from SMEs (a tip in this respect: a Project Manager I once worked with kept SME feedback all the way through the project, rather than just that specifically requested at the end. This gave them a “live snapshot” of SME opinion and feeling throughout course development – not just at the end, when SMEs are generally happy with the outcome).

Then, you need to feed all this back into your course design:

  • What worked for learners and SMEs?
  • What did not work for learners and SMEs?
  • What worked in terms of project development?
  • Where did things go wrong? How did they go wrong? Can this be avoided in future?
  • How did the subject matter go down? Do you need to speed it up, or slow it down?
  • Did learners pass their assessments? How about those with less or more interaction in the course – was there a difference in how they performed?

How does this work? It depends on the instructional designer, but people will usually take this information and re-work the original design to improve it for the next cohort of learners (actually, it is becoming quite popular to update courses for current learners also, so this may happen after a defined period, or as the course runs).

Summary

I hope this post has helped to better explain the sort of process you may need to go through in order to develop some online learning for your organisation. While it was inspired by feedback on yesterday’s post, I believe both posts are complementary: yesterday’s looked at the issues you need to consider to ensure your learning suits your learners; today’s looks at the kind of steps you need to take to put it all together.

The one question I am asked the most as an instructional designer: “How do you do that? How can you know about that?”

The question is asked in varying tones, from undeserved derision (who are you to teach these people?) to undeserved praise (who are you to teach these people?).

The simple fact is, instructional design is not about knowing the details of the content being developed (or, to the layperson – the subject you are teaching), but knowing how to develop the content (or how to communicate the information to the people who need to know it). Dealing with content is the skill. What does that mean? It means many things to many people. I like to go back to first principles. In short: you have a problem. 

Someone (or some population) needs to know X. They do not know it now. Your job is not just to tell them “X”. Your job is to ensure that when they need to use this information (they may be on the phone, they may be in a hospital, they may be flying a plane), X is there, in their head and ready to be used.

So, how do you get X inside their head? I can’t tell you here, because you’ve already formed an idea. An idea about who this learner is and what X is. If I don’t know who they are or what X is, I cannot possibly suggest how you get it inside their head. I can’t even pack its bags for the journey. All I know is you have to communicate it to them (not their boss, not the experts I talk about later, but to that person).

In most cases, I am dealing with technology. This is a function of the environment. People call on an instructional designer when they want something delivered via technology. If they wanted something delivered in a classroom, they would have called a trainer. This does not mean they are right; it is merely a function of how the roles of “trainer” (traditional, instructor-led training and coaching) and “instructional designer” (new-media, technology-assisted training) are viewed. I believe the two will converge at some point. This will offer better results, as trainers and instructional designers will be in a position to better exploit both the instructor-led and technology-assisted methods of communicating that are currently somewhat “shut off” from them (because they are currently called upon to provide either instructor-led sessions or technology-assisted communication).

Then there is dealing with people. I work (as all instructional designers do) with Subject Matter Experts, who often have more important things to do with their time. I am blessed in my current job (I know from experience!), in that the SMEs I work with now are willing to help, but busy (I have worked with those who are neither busy nor willing to help). Getting the most from the small slice of their schedule you are given is the skill.

You also need to be able to explain to the Subject Matter Expert, and to whoever manages the person you are communicating with, that this communication is not for them. It is for the audience. This may seem obvious, but in my experience it can be the most difficult message to get across. Managers want to see content in one shape; SMEs want to see it in another. However, to do your job properly, the only people who matter are the audience. The content needs to be in a shape that allows them to consume it in the most efficient way, while ensuring it remains durable.

Once you’ve cut through who you are talking to, how best to communicate with them and explained to their boss and the subject matter expert what it is you intend to do, it is time to deal with the information. This is the bit I like.

You consider what people need to know, why they need to know it and how they will use this information. This allows you (in discussion with their manager and the SME) to determine learning objectives. These describe what someone should be able to do after you have finished talking with them.

With the objectives, you have an end point. Progress! 

So, next you need to know your starting point. Again, the SME should give you an idea of where people are starting from (what they should already know/what they can do). So, armed with this information, you start to create a “story” that begins where the audience currently is, and leads them to where they need to be (i.e. achieving their learning objectives).

Ah, but sometimes, management and coordination is the skill. Many instructional designers will work with programmers and designers to make this story compelling and engaging. This is usually achieved using interaction, quizzes and assessment. If you aren’t using a Rapid eLearning Tool (a subject for another day, but in short it means you’re doing it all yourself), you have to request this content and keep an eye on its delivery to make sure everything comes together at the right time. Yes, you probably have a project manager who organises all this, but you still need a mechanism to determine what you need, how you will get it and when you’ll get it back.
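For what it’s worth, that mechanism can be as simple as a log of requests, each with an owner, a due date and a status. A minimal sketch, with entirely hypothetical fields and data:

```typescript
// A minimal asset-request log. Field names, statuses and dates are
// illustrative only; any spreadsheet with the same columns would do.
type AssetStatus = "requested" | "in-progress" | "delivered" | "approved";

interface AssetRequest {
  id: string;
  description: string;
  owner: string; // who is producing it (graphic designer, developer...)
  dueDate: Date;
  status: AssetStatus;
}

const requests: AssetRequest[] = [
  {
    id: "A-01",
    description: "Module 2 intro animation",
    owner: "graphic designer",
    dueDate: new Date("2012-06-01"),
    status: "in-progress",
  },
  {
    id: "A-02",
    description: "Quiz interaction for module 3",
    owner: "developer",
    dueDate: new Date("2012-06-08"),
    status: "requested",
  },
];

// Which requests are overdue as of a given date?
function overdue(log: AssetRequest[], today: Date): AssetRequest[] {
  return log.filter(
    (r) =>
      r.status !== "delivered" &&
      r.status !== "approved" &&
      r.dueDate.getTime() < today.getTime()
  );
}

console.log(overdue(requests, new Date("2012-06-05")).map((r) => r.id)); // ["A-01"]
```

The point is not the tooling: it is that every request has one place where its status is visible, so nothing falls through the cracks while you are busy writing.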

At the end of the road, it is always good to look over the journey. So you create a final assessment (the intermediate quizzes and assessment help people to gauge their learning as they go, so they can go back over anything that the quiz has shown they didn’t understand).  

This final quiz serves two purposes. The first is testing – to prove to the person you are talking to (and whoever else needs to know) that they now know X, and can hopefully use it. The second purpose is to “activate” the information within their mind. In short: they may well know the information you have communicated to them. But it is latent – sitting in their brain, perhaps having a cocktail and enjoying the view. By quizzing them on this information, you force it to get up and head upstairs to the consciousness, so the brain knows where to find it. This helps to develop and protect their understanding of the information. Which is good, because if they need to know X while they are flying a plane, you want X arriving promptly, not delayed at the gate or snoozing on a lilo while the brain frantically calls its name over some kind of synapse intercom.

The next skill is one you learned as a child. It is the “spot the difference” skill. What was planned, what was built and how do they compare? Are there differences? Do these differences need fixing? If not, perhaps you need to alter your plan to reflect any changes that were required in the course of the project. Because when you come back in a year or so, you may not be able to remember that last Tuesday Jo Murphy came in and said the TX3720 was going to launch with a specialist module. While the specialist module isn’t the “main event”, it still needs to be mentioned.

After you’re happy, the content needs to go to the SME to make sure it is correct. If it is not correct, terrible things may happen. And many instructional designers feel an SME review is a terrible thing, happening. But this final review ensures that, even though you are not an expert, what you are communicating is correct.

After all that, it’s probably time for a pint. 

What this post is about: This post provides tips on developing quiz assessments for instructional media.

What this post is not about: This is not an in-depth discussion on the various types and methods of assessment. Rather, it provides advice on what you need to consider when developing quizzes to test for knowledge retention and understanding.

Here is a summary: Quizzes are not the only method of assessment available. However, they are the most popular, as they are easy to develop and manage. Unfortunately, quizzes are often seen as an add-on to learning content, rather than being central to its purpose. In this post, I discuss this in more detail, and provide 5 useful tips that can help you develop better quizzes that really assess learner knowledge retention and understanding.

Here begins the post

Quizzes are probably the most popular form of assessment for online/digital learning materials. They are easy enough to write, develop and deploy. From a learner-administration point of view, they are also easy to track and measure. They seldom require intervention, as they run from an LMS, assessment engine or other automated system. Once learners have taken a quiz, the (reliable) results are provided, letting administrators know which learners have done well, and which have not.
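For context, here is roughly what that automation looks like. With a SCORM 1.2-conformant LMS, for example, the course content reports quiz results through a JavaScript API object the LMS exposes. The sketch below (written as TypeScript) uses the real SCORM 1.2 call names, but the wiring and error handling are simplified, and the pass-mark logic is my own illustration:

```typescript
// Minimal sketch of reporting a quiz score to a SCORM 1.2 LMS.
// LMSInitialize/LMSSetValue/LMSCommit/LMSFinish are the standard SCORM 1.2
// API calls; in practice they are usually called once per session, and the
// API object is discovered by searching window.parent frames.
interface Scorm12Api {
  LMSInitialize(arg: ""): string; // returns "true" or "false"
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: ""): string;
  LMSFinish(arg: ""): string;
}

declare const API: Scorm12Api; // assumed to be provided by the LMS

function reportQuizResult(rawScore: number, passMark: number): void {
  API.LMSInitialize("");
  // SCORM 1.2 data model values are passed as strings.
  API.LMSSetValue("cmi.core.score.raw", String(rawScore));
  API.LMSSetValue(
    "cmi.core.lesson_status",
    rawScore >= passMark ? "passed" : "failed"
  );
  API.LMSCommit(""); // ask the LMS to persist the data
  API.LMSFinish("");
}

reportQuizResult(85, 70); // learner scored 85%, pass mark 70%
```

This is why quizzes are so administratively cheap: once the content makes these calls, tracking, storage and reporting all come free with the LMS.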

However, many quizzes I have seen are poorly implemented. They are an afterthought, created as a piece that slots in toward the end of the project: an administrative effort, required by the Subject Matter Expert, Curriculum Developer, Client or the Project Manager.

This is the wrong approach.  Quizzes – or any assessment, for that matter – should be much more central to your learning product. It is the assessment that indicates how much knowledge learners have retained and how well they have understood it. The questions you pose also offer the opportunity to test how well learners will be able to apply the knowledge they have learned.

5 Things to do

Test to your objectives (verbs, purpose, etc.)

It is crucial that your quiz questions relate back to the course objectives you have set. As obvious as this seems, it is often not done. Instructional Designers will much more often work off their scripts or content samples, tying quiz questions to specific content in the learning product. The logic is: “this is what they have learned, so it is safe to test on it”. While this makes some sense, it loses sight of the purpose of both the quiz and the learning content that has been developed.

Learning content is fundamentally about improving people’s personal or professional ability. The specific improvement to be achieved is broken down into constituent objectives. For most subjects and learning content, it would be impossible to test whether the learner has improved their skill by asking them to display it. Therefore, it is the objectives that are tested, with the fair assumption that if the learner can achieve those actions described by the objectives, they should have improved their skill.

This means considering:

  1. The context of each objective (are there specific conditions under which the action described in an objective should be performed?)
  2. The verb of the objective – this is really important. I’ll elaborate on objective verbs in another post, but for now, take it as read that “describe”, “identify” and “demonstrate” are completely different things. While this is obvious when written like this, you might be surprised to learn how often they become interchangeable when learning content is being developed
  3. The subjects and objects of the objectives. What should the learner be able to do, and will they do it with an object, a tool or a piece of software?

Answering these questions about your objectives will go some distance to helping you develop really good quiz assessments.
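One way to force yourself to answer them is to record each objective as structured data, with its condition, verb and object spelled out, and to tag every quiz question with the objective it tests. A hypothetical sketch; the field names and verb list are mine, not any standard:

```typescript
// An objective broken into condition, verb and object, plus a quiz question
// tagged with the objective it tests. Structure and names are illustrative.
interface Objective {
  id: string;
  condition?: string; // context: when/where should the action be performed?
  verb: "describe" | "identify" | "demonstrate" | "list" | "perform";
  object: string; // what the action is performed on
}

interface QuizQuestion {
  id: string;
  objectiveId: string; // which objective this question tests
  stem: string;
}

const objective: Objective = {
  id: "OBJ-3",
  condition: "given a customer complaint email",
  verb: "identify",
  object: "the correct escalation path",
};

// The stem should exercise the objective's verb: an "identify" objective
// calls for selecting the right option, not writing out a description.
const question: QuizQuestion = {
  id: "Q-7",
  objectiveId: "OBJ-3",
  stem: "Read the email below. Which escalation path applies?",
};
```

Writing objectives down this way makes it much harder for “describe” to quietly become “identify” somewhere between the IDD and the finished quiz.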

Decide on quiz items when developing an IDD

An important way to decide on the design of your quizzes is to outline them as part of your initial Instructional Design Document (IDD). I know this is standard in most companies. I also know it’s standard to treat assessment as little more than an afterthought, or as an onerous task in the push to get the IDD out and signed off. However, carefully considering your quiz assessments at this point, based on the course objectives, will provide a couple of benefits:

  1. It will give you a better idea of how best to assess learners’ retention of knowledge and/or understanding. This in turn can help to suggest ways in which the content itself should be developed to improve retention and understanding
  2. If you consider your quiz questions “blind” (i.e. without looking at the content itself), then look back at how you will develop your content, you can check that the content addresses all the objectives. If you have questions that ask about something not covered by the content, you need to include it (a sketch of this check follows below)
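Once questions carry objective ids, as in the earlier sketch, that “blind” check can even be automated: list your objectives, list your questions, and flag any objective no question touches. A minimal sketch, again with invented ids:

```typescript
// Flag objectives that no quiz question tests. Ids are illustrative.
function uncoveredObjectives(
  objectiveIds: string[],
  questions: { id: string; objectiveId: string }[]
): string[] {
  const covered = new Set(questions.map((q) => q.objectiveId));
  return objectiveIds.filter((id) => !covered.has(id));
}

const gaps = uncoveredObjectives(
  ["OBJ-1", "OBJ-2", "OBJ-3"],
  [
    { id: "Q-1", objectiveId: "OBJ-1" },
    { id: "Q-2", objectiveId: "OBJ-3" },
  ]
);
// ["OBJ-2"]: either write a question for it, or ask whether the content
// (and the objective) really belong in the course.
console.log(gaps);
```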

Ask questions closely relevant to the job/task/skill you are training for

Again, this may seem obvious, but it is advice that is often overlooked, rather than ignored. In an attempt to question objectives, instructional designers will often overlook the real purpose of the objective. Should someone know the definition of a term, or what that definition actually implies when using a tool, software or piece of information in their job? Usually, the latter is the case.

Here is an example: Pressing the Shift key will display letters in uppercase. So, what is the Shift key for? Well, two possible answers are immediately apparent:

  • displaying letters in upper case
  • capitalising words (whether at the start of a sentence or mid-sentence)

The first option here is very literal, but the second provides a useful context in which the information is used. From my viewpoint, this makes the second option more ‘correct’ than the first.

Of course, the treatment you provide will depend on the objectives, which is (yet) another reason why your objectives should be carefully drawn out. However, in most cases, the application of information is more important to test than its abstract truth. This is especially true when developing and assessing content for specific work tasks.

Consider the assessment as an item in itself – not just a string of questions!

Quizzes in particular are often seen as a string of questions. Instructional Designers bash them out and send them off for review. However, a much more interesting way to develop your quiz assessments is to try and consider the whole piece. Consider the following:

  • In what you are testing, is there a beginning, middle and end that relates to your objectives? Could you use this to provide a ‘development’ for your quiz?
  • Does the information you provide in the learning content tie together in an interesting way? Can you ask one question to assess fundamentals, then go on to ask other questions that assess the ability to apply the information?
  • Is it possible to ask questions in a structured fashion that relates to how learners will use information, tools or software in their jobs?
  • Is there a typical scenario you could use to develop questions from? Ask the SME!

Add some colour!

If I had a penny for every time I’ve seen a quiz assessment made up entirely of multiple-choice questions… Most VLEs and assessment engines now offer an excellent range of question types for quizzes, and you should try to exploit this. Don’t allow learners to passively click their way from one end of a quiz to the other – even assessments are learning opportunities. When considering question types, it is also important to ensure the type of question you are asking relates to the type of thing you are assessing. Here are some examples (with a sketch of how you might model them after the list):

  • Fill-in-the-blank questions, or a series of them, can be a useful way to test the steps in a process
  • Matching questions are a helpful way of checking relationships between concepts, or where one might find a menu command or piece of information, or perform an action
  • Graphic-based click-answer questions are great for testing people’s understanding of situations, scenarios or the actions they need to take in software applications to perform tasks
  • True or false – often considered the sledgehammer of quizzes – can be used to test understanding of conditions, situations or differences in functions or features. As long as your question is fair, you can ask learners to identify subtle implications of information outlined in the learning content.
  • Just about every question type can be used to ask learners about definitions
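To make that variety concrete, here is how a handful of these types might be modelled as a discriminated union. The shapes are a deliberate simplification of mine, not the data format of any particular VLE or assessment engine:

```typescript
// A simplified model of several question types as a discriminated union.
// The shapes are illustrative, not the format of any particular VLE.
type Question =
  | { kind: "multipleChoice"; stem: string; options: string[]; correct: number }
  | { kind: "fillInBlank"; stem: string; blanks: string[] } // steps in a process
  | { kind: "matching"; pairs: [left: string, right: string][] } // concept relationships
  | { kind: "trueFalse"; statement: string; answer: boolean };

// Grading the simple cases; matching and fill-in-blank need richer
// response types than a single number or boolean.
function isCorrect(q: Question, response: number | boolean): boolean {
  switch (q.kind) {
    case "multipleChoice":
      return response === q.correct; // index of the chosen option
    case "trueFalse":
      return response === q.answer;
    default:
      return false;
  }
}

const q: Question = {
  kind: "trueFalse",
  statement: "The Shift key only affects letter keys.",
  answer: false, // it also modifies number and symbol keys
};
console.log(isCorrect(q, false)); // true
```

Note how the true/false example above tests an implication (what Shift does beyond capitalising letters) rather than a bare definition, in keeping with the earlier advice.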

If you’d like to discuss this, please feel free to leave a comment below. Alternatively, you can email me about this and other opinions at brendan dot strong at gmail dot com.

Tune in next time for 5 things you should avoid when developing quiz assessments.