How can data be used in eLearning design?
Learning and development (L&D) professionals need to be able to create impactful training for their organisation. But what’s the best way to go about this?
To create results-driven learning, you need data to confirm that the training you're proposing is actually needed in the first place. Then, assuming it is, your data should also confirm that it will be effective at creating the required change. In an L&D context, data can be used to:
- identify the most effective methods
- diagnose training needs
- ensure training and learning are focused on performance
- evaluate effectiveness
- adapt and improve solutions.
Let’s look at the types of data that can be used in eLearning design.
Training needs analysis
Identifying skills and knowledge gaps in the workforce is the first step in commissioning training. This comes in the form of a training needs analysis (TNA). The data you collect for a TNA can come in many forms. For example:
- Records of formal interviews/focus groups with stakeholders
- Records of informal discussions with stakeholders (such as regular team meeting chat or one-to-one discussions)
- Surveys of all stakeholders
- Performance data (covering output, general performance management, and data across the organisation)
- Skills test results
- Documentation (strategic plans, aims, job descriptions and person specifications)
- Industry statistics (to create benchmarks for your workforce’s performance)
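As a rough illustration of the benchmarking idea in the last point, the sketch below compares skills test scores against industry benchmarks to flag potential gaps. All figures, skill names, and the five-point threshold are invented for the example; they are not taken from any particular TNA tool.

```python
# Illustrative sketch: comparing workforce skills test scores against
# industry benchmarks to flag potential training needs. All values
# and skill names here are hypothetical.

# Hypothetical skills test results: skill -> average workforce score (%)
workforce_scores = {
    "data analysis": 58,
    "customer communication": 82,
    "compliance procedures": 64,
}

# Hypothetical industry benchmarks for the same skills (%)
industry_benchmarks = {
    "data analysis": 75,
    "customer communication": 80,
    "compliance procedures": 70,
}

# Flag any skill where the workforce trails the benchmark by 5+ points
for skill, score in workforce_scores.items():
    gap = industry_benchmarks[skill] - score
    if gap >= 5:
        print(f"Possible training need: {skill} (gap of {gap} points)")
```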
Learning management system reports
Your learning management system (LMS) will contain a wealth of information about how your workforce is engaging with training. For example:
- Times training has been accessed
- Whether training has been started, is in progress, or has been completed
- Time taken to complete the training
- Type of device used to access the learning (this can influence the type of interactions or tools used in creating eLearning)
- Any other relevant information about your audience you wish to collect (this can be set up during user registration and include things like location, division, role, years with the organisation, and so on)
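To give a feel for how this reporting data might be analysed, here is a short sketch using pandas on a hypothetical LMS export. The file name and column names ("course", "status", "minutes_taken", "device") are assumptions for the example, not any particular LMS's schema.

```python
# Sketch of summarising a hypothetical LMS report export with pandas.
# The file name and column names are illustrative assumptions.
import pandas as pd

report = pd.read_csv("lms_export.csv")

# Completion status per course: not started / in progress / completed
status_counts = report.groupby(["course", "status"]).size().unstack(fill_value=0)
print(status_counts)

# Average completion time, only for learners who finished
completed = report[report["status"] == "completed"]
print(completed.groupby("course")["minutes_taken"].mean())

# Device mix, which can inform the interactions you design for
print(report["device"].value_counts(normalize=True))
```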
Workforce management solutions, such as eNetEnterprise, can also facilitate social learning: for example, forums where learners engage with each other and share their learning. With this kind of platform, you could measure engagement using a points system that awards points for different types of interaction, allowing you to see who is engaging and in what ways. This information could also be used to tailor the content available to them.
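A points system like that might work as a simple weighted tally. The sketch below is purely illustrative; the interaction types and point values are invented for the example and are not eNetEnterprise's actual scoring rules.

```python
# Illustrative points system for social learning engagement. The
# interaction types and point values are invented for this sketch.
from collections import Counter

POINTS = {"post": 5, "reply": 3, "like": 1, "resource_shared": 8}

# Hypothetical activity log: (learner, interaction type)
activity = [
    ("amira", "post"), ("amira", "reply"), ("ben", "like"),
    ("ben", "resource_shared"), ("amira", "like"), ("ben", "reply"),
]

scores = Counter()
for learner, interaction in activity:
    scores[learner] += POINTS[interaction]

# Rank learners by engagement to see who is engaging, and how much
for learner, total in scores.most_common():
    print(f"{learner}: {total} points")
```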
Feedback on existing training
It’s essential to know whether your training is having the impact you anticipated, and collecting feedback on your courses is a great way to help you make continuous improvements. Feedback normally comes in the form of learner surveys; however, creating a meaningful survey requires more thought about question design than you might expect.
Taking a performance-focused approach can really strengthen the data you receive, allowing you to make more informed decisions when designing training. For example, when measuring training effectiveness, a basic Likert scale (like the one below) leaves the learner to decide what is meant by “effective”, which could vary considerably between people.
“How effective did you find this course?
- Extremely effective
- Fairly effective
- Not very effective
- Not at all effective.”
To get a better idea of how well the learner feels they understood the concepts, you need to make your questions more specific. The following exemplar survey question is taken from Will Thalheimer’s ‘Performance-Focused Learner Surveys’ book (2022):
“Now that you’ve completed the learning experience, how well do you feel you understand the concepts taught? CHOOSE ONE.
- I am still at least SOMEWHAT CONFUSED about the concepts.
- I am now SOMEWHAT FAMILIAR WITH the concepts.
- I have a SOLID UNDERSTANDING of the concepts.
- I AM FULLY READY TO USE the concepts in my work.
- I have an EXPERT-LEVEL ABILITY to use the concepts.”
It is important to have marking criteria set up (invisible to the learner) that indicate what the results mean to your L&D team. For example, Thalheimer notes that for the above example, if a learner answers (a), this is “alarming”, (b) is “unacceptable”, (c) is “acceptable”, (d) is “superior”, and (e) can be “superior” or indicate “overconfidence”. You may decide that “alarming” answers require immediate action to improve the training, prioritising it above other workloads, and that “unacceptable” answers may also require intervention.
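To make that scheme concrete, here is a minimal sketch that encodes the five answer options against Thalheimer's labels from the text. The suggested follow-up actions are our own illustrative assumptions about how an L&D team might respond, not part of the book.

```python
# Sketch of a hidden marking scheme for the survey question above,
# using Thalheimer's labels. The suggested actions are illustrative
# assumptions, not prescriptions from the book.
MARKING = {
    "a": ("alarming", "act immediately: prioritise fixing this training"),
    "b": ("unacceptable", "intervene: review and improve the content"),
    "c": ("acceptable", "no action needed"),
    "d": ("superior", "no action needed"),
    "e": ("superior or overconfidence", "check against performance data"),
}

def interpret(answer: str) -> str:
    label, action = MARKING[answer.lower()]
    return f"Answer ({answer}): {label} -> {action}"

print(interpret("a"))  # Answer (a): alarming -> act immediately: ...
```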
Improving learning design
Feedback surveys, when well designed, can be very useful in learning design. During a recent client project, three eLearning modules were distributed to learners in a pilot phase to gain feedback. Steering groups completed surveys and participated in discussion groups, giving feedback on the effectiveness of the modules. Overall, feedback was very positive:
The majority of learners reflected that each e-module had improved their knowledge and increased their confidence in talking about the core topics. The survey was designed to understand the impact these modules had on understanding, attitudes, and how people intended to change their behaviour at work. This allowed the client to make updates to the modules with eCom and plan their next steps.
xAPI
A growing trend in eLearning is the use of the Experience API (xAPI). xAPI allows you to collect data on anything within an eLearning course. For example, you can find out:
- The number of times optional content was accessed
- Learners’ answers to knowledge check questions
- How many times pages, courses or individual interactions were revisited
- Time spent on the course, on individual interactions, or in a specific area
- Whether the piece of learning was completed in one sitting or across multiple sessions
- Anything you can think of that could be useful in your learning design!
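Under the hood, xAPI records each activity as an “actor verb object” statement sent to a learning record store (LRS). Here is a minimal sketch of building and posting one such statement in Python; the LRS URL and credentials are placeholders, and the verb ID comes from the standard ADL vocabulary.

```python
# Minimal xAPI sketch: build an "actor verb object" statement and send
# it to a learning record store (LRS). The LRS URL and credentials are
# placeholders; the verb ID is a standard ADL vocabulary entry.
import requests

statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://example.com/courses/data-in-elearning",
        "definition": {"name": {"en-US": "Data in eLearning"}},
    },
}

response = requests.post(
    "https://lrs.example.com/xapi/statements",  # placeholder LRS endpoint
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_user", "lrs_password"),  # placeholder credentials
)
response.raise_for_status()
```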
These can all be used to help you focus on the aspects of your course where more instructional design work may be required. Are you noticing that learners keep returning to a specific piece of information? Can you analyse your data to work out whether this is because they found it especially engaging, or because something about the content or design was unclear? Likewise with the knowledge checks: are learners getting anything consistently wrong? Does this indicate a failure to meet a specific learning outcome?
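As a small illustration of that kind of analysis, the sketch below aggregates knowledge-check responses and flags questions most learners get wrong. The records and the 50% threshold are invented for the example.

```python
# Sketch: spotting consistently-missed knowledge checks from collected
# response data. The records and the 50% threshold are illustrative.
from collections import defaultdict

# Hypothetical records: (question id, answered correctly?)
responses = [
    ("q1", True), ("q1", True), ("q1", False),
    ("q2", False), ("q2", False), ("q2", True),
]

tallies = defaultdict(lambda: [0, 0])  # question -> [correct, total]
for question, correct in responses:
    tallies[question][0] += int(correct)
    tallies[question][1] += 1

for question, (correct, total) in tallies.items():
    rate = correct / total
    if rate < 0.5:  # flag questions most learners get wrong
        print(f"{question}: only {rate:.0%} correct - review the related "
              "learning outcome and content")
```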
xAPI can collect a vast amount of data, particularly about peaks of activity within courses. However, it's important not to lose sight of why you're collecting the data in the first place. Focusing on the right kind of data, for specific reasons, will help ensure your people are getting the support they need to drive the success of your organisation.