Instructional Design requires time

I think I'll add this piece to the list of required reading for prospective clients.  And yes, I purposely used the word "required". I already harp on this whenever we propose X number of design hours on a project, and almost reflexively the first thing a client wants to take a scalpel to is the design time. Anyone else have this experience?

If you want the CliffsNotes version: the needs of the students at San Jose State were put on the back burner in favor of the wants of the business of education and of one company, Udacity.

After reading about the school's experiment, one quote stood out in my mind: "The courses were also put together in a rush." I hope, for the sake of the students, that the university makes the most of the pause in this relationship and takes a long, hard look at its business decisions as well as some of its instructional design choices in this matter.

The fact that the Udacity students fared significantly worse than their in-class peers is a red flag of sloppy and/or rushed instructional design. Indeed, digging deeper into the piece, you'll find the following gems: "faculty were building the courses on the fly... faculty did not have a lot of time to watch how students were doing in the courses because the faculty were busy trying to finish them."

In other words, a rush job without much formative evaluation of the course before the final rollout. I recall that when I was learning to design instruction, great attention was paid to conducting a formative evaluation of a course at every stage of development. This step is crucial, especially after instructional media, online interactions, and the instructional strategy are baked into a draft course. For online courses, you sit a sample of likely members of your target audience in front of the draft and evaluate everything from their ability to navigate the course to how well they can perform the desired skills after completing it. The whole purpose of the exercise is to test the effectiveness of the instructional strategy, the course, and any associated materials.

For all of Collabor8's client projects, we ask a few test participants to go through our courses and provide us with invaluable feedback using a punch list. Many times, we create a simple spreadsheet in Google Drive and share it with the folks who will be performing the testing. It doesn't need to be complicated; in fact, click here to see the format that we use.

Using this sheet, students can go in and provide us with feedback. We also use the sheet to note observations that become useful edits during our own internal quality reviews.
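If you're building a punch list from scratch, a handful of columns is plenty. Here's a minimal sketch in Python that generates one as a CSV; the column names and sample rows are illustrative placeholders, not the actual format linked above:

```python
import csv
import io

# Illustrative punch-list columns and sample feedback rows.
# These are placeholders, not Collabor8's actual sheet format.
columns = ["Screen/Page", "Issue type", "Description", "Reported by", "Status"]
rows = [
    ["Module 2, slide 5", "Navigation",
     "Next button hidden on small screens", "Tester A", "Open"],
    ["Quiz 1, question 3", "Content",
     "Answer key marks the wrong option", "Tester B", "Fixed"],
]

# Write the sheet to an in-memory buffer (swap in a file for real use).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(columns)
writer.writerows(rows)

print(buf.getvalue())
```

Testers fill in one row per issue, and the Status column doubles as a tracking mechanism while you work through revisions.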


To avoid a fiasco like this, keep your learners' interests and needs above all else. And do yourself and your clients a favor: don't skip conducting a formative evaluation of your courses. Remember, too, that online training courses are fantastic supplements to more traditional instructor-led training sessions. Investing a little evaluation time during development might have alerted administrators to the fact that some of the students enrolled in the online courses did not have reliable access to computers.

Additional source: Online Education Start-Up Gets 'F' From University

Alex Santos

Alex is a co-founder and Managing Member of Collabor8 Learning, LLC, an instructional design and performance management consultancy. His firm collaborates with organizations to enhance the way they develop and train their people. To learn more about Collabor8 Learning, click here.

Alex can be reached at 786-512-1069, alex@collabor8learning.com, or via Twitter @collabor8alex.

 

5 takeaways from the Target job-aid lawsuit


I read this story this morning about the lawsuit filed against Target for what appears to be an ill-designed job aid or memo.  Here are my thoughts, for what it’s worth.

1. The company was quick to distance itself from the training document, stating that the instructional guide "wasn't part of any formal or company-wide training." Here's a takeaway for you: ANY instructional materials used at the distribution center SHOULD have been reviewed and approved by the training department. At one former employer, any document that went out bearing the company's logos, trademarks, or the like was first reviewed and approved for distribution by the brand management team. If marketing is afforded this opportunity to police appropriate uses of a company's logo, training should be afforded an equal opportunity to ensure that any job aids communicating appropriate behavior (a key component of your culture) are aligned with the company's values.

2. One of the Plaintiffs claims that he complained to human resources and that his supervisors retaliated for it. Said Plaintiff claims his manager "began using more racial epithets" and made attempts to humiliate him in front of his colleagues. HR needs to conduct a solid investigation into this incident and this manager, should PROACTIVELY evaluate practices at its distribution centers, and, if it hasn't already done so, should provide diversity training to its management ranks.

3. There appears to be some dissonance between the company's values and its policies and procedures. This is both a cultural issue and a risk management issue. In this era of social media and transparency, what a company and its agents DO carries much more weight than any values written in a memo or displayed somewhere. Organizations struggle daily with developing their cultures, and incidents like these (if confirmed) set those efforts back immensely.

4. If the manager’s actions in this case are confirmed, HR at Target should re-evaluate their procedures for preventing retaliation.  

5. Finally, instructional designers everywhere should rejoice, for none of us would ever instruct a learner to “…note differences among Hispanic employees” in a job aid or training course as alleged in this complaint.  How do I know?  It’s not a measurable behavior!


 

How the xAPI can improve your training ROI calculations

How many times have you been asked to calculate the value of, or the return on investment (ROI) from, your training programs? Unless your client is okay with back-of-the-napkin calculations, you'll need to demonstrate some objective metrics and be prepared to defend them. It's time you got better at showing your worth, and the new Experience API (a.k.a. the xAPI, or Tin Can API) can assist, if used properly. Prior to the xAPI's existence, your data collection was limited to what you could track inside your learning management system (LMS), or, if you did not yet have an LMS, to whatever test scores or attendance records you were keeping in a spreadsheet.

Enter the xAPI: now you can include all sorts of offline activities, such as conference attendance, coaching and mentoring sessions, and pretty much anything you can condense into the xAPI statement structure of actor, verb, and object. Even successful activities performed on a CPR dummy, or carrying a heavy hose out of a simulated burning building, can now be tracked as development activities. In other words, your universe of learning activities contributing to a positive ROI just expanded enormously, toward what big data folks call N=ALL. You can now include ALL learning activities in your analysis, online or off, whether your learners are learning in front of a computer, an iPad, a crash test dummy, or an instructor!
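For a sense of how simple these statements are, here's a minimal sketch of one in Python, rendered as the JSON a learning record store would receive. The learner's mailbox, the activity ID, and the timestamp are hypothetical values; the verb IRI is one of the standard ADL verbs:

```python
import json

# A minimal xAPI statement: actor, verb, object, plus an optional timestamp.
# The mailbox, name, activity ID, and timestamp are illustrative placeholders.
statement = {
    "actor": {
        "mbox": "mailto:learner@example.com",
        "name": "Jane Learner",
    },
    "verb": {
        # A standard ADL verb IRI, with a human-readable display value.
        "id": "http://adlnet.gov/expapi/verbs/attended",
        "display": {"en-US": "attended"},
    },
    "object": {
        "id": "http://example.com/activities/toastmasters-meeting",
        "definition": {"name": {"en-US": "Local Toastmasters meeting"}},
    },
    "timestamp": "2014-05-01T18:30:00Z",
}

# Serialize to the JSON payload an LRS would store.
print(json.dumps(statement, indent=2))
```

Each tracked experience, whether a Toastmasters meeting or a CPR drill, becomes one such actor-verb-object record in your learning record store.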

Let me give you an example of how this benefits you. A year ago, you would have sent an employee needing to develop his or her presentation skills to a class, or offered coaching by a consultant. You had no way to track that employee's attendance at local Toastmasters meetings, reading of a book by a noted author on the subject, or delivery of prepared remarks at a local public school or charity as part of his or her developmental experiences. Now you can track all of these experiences and include them in your analysis of what led to the employee's improved public speaking skills. In the case of an employee delivering prepared remarks at a local public school, for example, your cost for the experience was $0, yet surely you can estimate some benefit derived from it in your ROI calculations.


You see, to improve the accuracy of any ROI measurement, you need to develop as complete a picture as possible of the learning and development activities that led to the improved performance. The more experiences and activities you include, the stronger the inference you can make as to their value. And if you and your team are enabling these experiences, you can estimate their worth in your ROI calculations!
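To make the arithmetic concrete, here's a back-of-the-napkin sketch in Python using the classic training ROI formula, net benefit divided by cost. Every dollar figure below is a hypothetical placeholder, not real program data:

```python
# Back-of-the-napkin training ROI sketch.
# Each activity: (name, cost, estimated benefit) - all figures hypothetical.
activities = [
    ("Presentation skills class", 1200.00, 2000.00),
    ("Coaching by a consultant", 800.00, 1500.00),
    ("Toastmasters meetings (xAPI-tracked)", 0.00, 600.00),
    ("Speech at a local school (xAPI-tracked)", 0.00, 400.00),
]

total_cost = sum(cost for _, cost, _ in activities)
total_benefit = sum(benefit for _, _, benefit in activities)

# Classic training ROI formula: net benefit over cost, as a percentage.
roi_percent = (total_benefit - total_cost) / total_cost * 100

print(f"Cost: ${total_cost:,.2f}  "
      f"Benefit: ${total_benefit:,.2f}  "
      f"ROI: {roi_percent:.0f}%")
```

Notice that the two zero-cost, xAPI-tracked experiences add estimated benefit without adding cost; that's exactly how a more complete picture of learning activities lifts the calculated ROI.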

The xAPI provides improved visibility and transparency into the activities of your learners, which in turn can improve performance across individuals and teams and positively impact your ROI calculations.

 
Additional resources you may want to review when calculating the ROI of your training initiatives:
1. Summary Process for measuring ROI of Training
2. The Direct Path to Training ROI

Have a resource for measuring the ROI of training that is not listed here? Please feel free to include it in the comments; the above are provided to get the conversation started!


 

10 Clues you might just be designing a job aid, not e-learning


10. Your client hands over a 150-slide PowerPoint deck riddled with more bullet points than Bonnie & Clyde's Ford V8 and says, "put this online for me."

9. You’re asked to “convert” an “employee manual” to e-learning, with no access to a subject-matter expert or a clearly articulated business goal.

8. You ask to analyze an exemplary performer to observe the desired skills and are greeted by blank stares.

7. Your budget for the development of the project is about the price of a couple of iStock images, a foot-long meatball sandwich at Subway, and a 12 oz. Coke.

6. There is no time for a front-end needs analysis, “just build us the e-learning”.

5. There are no meaningful examples of the behaviors or skills to develop, much less non-examples.

4. The objectives for the "e-learning" contain the words understand, know, become aware of, realize, familiarize yourself, etc.

3. Your client says, “No, a skills check or assessment after the learners complete the e-learning module won’t be required.”

2. There is no baseline performance data against which to measure the results of any e-learning intervention.

And the number one clue you just might be designing a job aid is...

1. For source material/content, your client asked you to "go buy a book" on the subject matter!


 

Is it time to Reboot eLearning, or simply boot poor clients?

I read this article upon waking up today, and I'm not so sure our industry needs a reboot. As instructional designers at heart, we work closely with our clients to educate them on the potential uses for their content in an online environment. I've had that conversation dozens of times where, using the sleight-of-hand techniques we learned back in Performance Consulting, we shift the conversation away from all the facts our client wants his or her learners to "learn" (more like "memorize" or "keep in mind" while performing) toward the outcomes of desired performance. You know the one. It typically starts with, "we have these PowerPoint files." The author, Carol Leaman, does recognize this when she states:

"...especially businesses implementing eLearning all need to ask, not 'what are we doing?' or even 'why we're doing this?' but 'how are we doing it?'"

However, just two sentences later she falls into the all-too-familiar trap of asking "How do you deliver specifically what an employee needs to know...", which is completely the wrong question. Many of us in the field continue to hammer the point that what an employee needs to know should be the last thing you ask. First and foremost, we should be asking:

1. What is the business goal?

2. What behaviors must learners perform to help us meet our business goal?

3. Why aren't learners performing this way?

4. What learning activities can we design that fall within the client's budget that will allow learners to practice these behaviors in an online environment, and receive feedback on their performance?

5. And lastly, the author's question: How do you deliver specifically what an employee needs to know?


In summary, I know I'm not alone in having these conversations with prospective clients. I speak to other designers on a daily basis, many outside of our sphere of influence, and some in other countries as well. I had one know-it-all former attorney tell me that "lawyers learn via bullet points." The challenge I see is not to completely reboot our industry because of prospects or clients who are relics from a previous era, but to boot prospects who insist on merely putting content online.

It reminds me of the old proverb: you can lead a horse to water, but you can't make it drink. I say boot these poor prospects and work with those who will allow you to make a lasting difference and add value to their efforts. Just like dating, you should have standards and choose clients wisely.

"The only way to do great work is to love what you do. If you haven't found it yet, keep looking. Don't settle. As with all matters of the heart, you'll know when you find it."

-Steve Jobs

Let's talk about e-learning

