Getting learners to fill in a feedback form after a training programme is the norm. But how much of what is gathered is actually useful? What can you do with the data?
If you cannot derive any actionable insights from the responses, you ought to challenge the effectiveness of the questions, because the purpose of soliciting feedback is to evaluate and improve, with the goal of achieving better results in the future.
There are two aspects of the form to take note of to get you closer to your goal:
- The objectives of the questions help you collect the ‘right’ data
- The types of questions help you obtain better data
“Asking the right questions is as important as answering them”
– Benoit Mandelbrot
1. Objectives of the questions
Most survey questions I’ve seen are centred around learning experience and engagement: questions about the (i) trainer, (ii) format, (iii) duration, and (iv) learning environment.
While experience is an important aspect of learning, it is only a means to an end. It is the application of skills that makes every dollar you invest in your learners worthwhile. The effectiveness of a training programme can be evaluated based on the following components:
Relevance and usefulness
Q: List the top 3 learnings that are the most relevant and useful to your work.
This question helps HR identify course content that contributed to the success of the workshop. If there is a need to tweak the programme outline in the future, these insights will guide the trainer’s decision on which parts to retain. You can also consider inserting a question on the least relevant areas of the course.
Q: How have the course materials helped you in your learning? What can be improved?
This uncovers how learners interact with the course materials to aid their learning. The responses may also reveal their learning habits and preferences. From these questions, the trainer can also assess if the case studies are useful in helping the learners internalise the course content or if new ones have to be created.
Q: On a scale of 1-10, how would you rate your proficiency in [skill] before you attended this programme? And after?
Q: How has the programme contributed to the difference in rating?
These questions identify the learners who benefited most, i.e. those who saw a big shift in rating. For example, did the technical folks benefit most from the course while the marketing folks benefited least? If so, HR can consider organising future sessions that target a specific learner profile in order to maximise results.
If all learners move up by at least one point, HR can use this to validate that the programme was indeed helpful in upskilling. It also allows HR to see who in the class is on the lowest end of the spectrum and who might need further support. If the rating before and after the programme remains unchanged, the trainer can follow up with the learner to identify why.
You can consider asking the following questions to ensure that learning has turned into skills, and that those skills are applied:
Q: What needs to happen for you to turn these learnings into skills and eventually master them?
The answers inform HR on what additional support to provide so learners can sustain their learning and long-term transformation can take place. For example, if the majority of learners indicate “practice”, HR can explore ideas on how to create the conditions for them to do so. This might warrant another in-depth survey to formulate a quality solution.
Q: What would be an obstacle for you to gain mastery?
We cannot assume that once learners are equipped with new tools, they will apply them. This question helps HR uncover possible reasons why knowledge might not be transferred into skills: a lack of time, resources, or support, the culture, or working styles. Some learners have shared that while they would like to improve, their bosses are not aligned with the best practices; they throw in the towel after multiple attempts to improve have been shot down. In such cases, HR can consider ways to get buy-in from these bosses.
Q: Given the same amount of content and learning, which mode of learning would suit you best, and why?
This helps to invalidate the assumption that one mode of learning suits all. It also uncovers ways to experiment with new learning modalities that can maximise outcomes. For example, during Covid there was no option but to conduct learning remotely and digitally. HR may have doubted the effectiveness of virtual learning at first, but as it turned out, given a choice, 80-90% of learners preferred blended learning. When asked why, many appreciated the ability to juggle work and learning, having more time to internalise self-study modules at their own pace, while still benefiting from their peers’ sharing during live online discussions.
Q: What other skills would you like to develop? Why?
This helps to identify new learning opportunities that HR might not have considered. Such questions also encourage learners to pause and reflect on their learning needs and goals. When the goals are clear, they become more enthusiastic about filling the gaps.
2. Types of questions
“Data by itself is useless. Data is only useful if you apply it.”
– Todd Park
There are different types of questions, such as:
|Type|Example|
|---|---|
|Closed-ended|Has this course met your learning objectives?|
|Scaling|On a scale of 1-10, how would you rate the effectiveness of the workshop? 1 being not effective, 10 being very effective.|
|Open-ended|How would you define the effectiveness of the workshop?|
|Probing|Did this course meet your objectives? Why or why not?|
|Reflective|What are the top 3 learnings that are the most useful in your work?|
Closed-ended and scaling questions are best at gathering quantitative data, which removes ambiguity and subjectivity. However, many questionnaires place too much weight on such questions, with only one open-ended question like “Any suggestions for improvement?”. This might stem from the intention of creating questionnaires that are easy and fast to complete.
The problem with these questionnaires is that they only provide quantitative results; you do not gain insights into why learners gave a low or high rating. Take, for example, the statement “The course has been relevant”, rated on a scale of 1-5 from disagree to agree. What can we do with that information? If 90% of the class rated 5 and 10% rated 2, it indicates that there is room for improvement, but we don’t know specifically where. Without actionable insights, the trainer and HR cannot pinpoint the specific aspects to improve.
Hence, it is advisable to supplement closed-ended questions with a probing question such as “Why?”. This allows trainers to take corrective actions that may help to increase the score in future runs.
To avoid vague or one-word answers that are subject to interpretation, such as “practice”, you can frame questions with a more direct intent. Here’s an example of a question that aims to gather specific insights to meet a goal:
Instead of this:
Any suggestions for improvement?
Try this:
How can this course be improved to help you get better at [skill]? Please elaborate with details so we can better understand how to help you.
You will be amazed at how much you can uncover with thoughtfully crafted questions.
To sum it up
- Identify the purpose of the survey
- Design questions to meet specific objectives
- Collect both quantitative and qualitative data
- Frame questions to gain specific and actionable insights