Analysing Feedback on Coursecheck

9th August 2019

We're all busy. And in business that means if we can't easily get our hands on the information we need, at best we won't bother - and at worst we'll make assumptions and guess! To be useful, information needs to be:

  • Instantly accessible
  • Presented in a visual format that's easy to understand
  • Accurate
  • Up to date
  • Easily shareable with others

In addition to being able to "pull" information on demand, we might need some reports "pushed" to us on a regular basis. And then there's information that we want pushed to us on an ad hoc basis, depending on certain criteria: negative feedback, for example, where we only want to know about it if there's something that needs our attention.

At Coursecheck, we understand how training companies work and have done our best to design the reporting with the needs of the training manager in mind, not forgetting that there are other stakeholders too. Trainers are likely to want visibility of the feedback from events they've delivered; marketing people will be interested in looking at the comment summaries; and if you're running private courses for particular customers, then those customers are likely to want reports too. Here's a taster of what you can expect if you use Coursecheck to manage your Learner feedback.


Coursecheck Dashboard

As soon as you log in to Coursecheck, you'll be presented with the Coursecheck dashboard showing you, at a glance, how the company has been performing in terms of overall star-rating satisfaction and the more critical measure: your Net Promoter Score (NPS).



Depending on your role, you may then need to use any or all of the three reporting components on Coursecheck. These are:

  1. Analytics, designed to give you the big picture and help you see the trends. See how your trainers compare to each other, and whether some courses always get better feedback than others. You can also analyse responses to specific questions, or compare feedback by venue to see how that's affecting overall satisfaction.
  2. Event Reports, designed to give you all the detail about one or more events. Highlight negative feedback and produce PDF reports to share internally or send to customers.
  3. Automated reports and alerts. Coursecheck User accounts can be configured to receive automated comment summary reports and/or negative feedback alerts.

If you need to produce a report not catered for out of the box in Coursecheck, then there's also an option to export all data to CSV.


Net Promoter Score

Tracking your Net Promoter Score (NPS) is a well-established method of measuring customer satisfaction. If you're unfamiliar with how NPS works, you can read about it here.



Filters allow you to check your NPS by course, course category or trainer, so you can see what's pulling the score up and what's pulling it down. And date filters let you see the trends and take corrective action where necessary.

You can do the same thing with your overall 1-5 star rating, although it can be harder to work with because scores are often bunched up between 4 and 5, whereas NPS scores work over a scale from -100 to +100.



Analyse performance by trainer, course, venue or by question. For example, if you want to check who your top performers are and who needs help, then Coursecheck Trainer Analytics makes it easy to do just that.



But the devil is in the detail and context is important, so it's always worth considering other factors that might be skewing the results. For example, you may have an instructor who scores highly on some courses and not on others. But before jumping to conclusions, it's worth checking things like:

  • whether other trainers are also underperforming on those same courses. And if they are, then you have your answer and it's probably the course you need to be focusing on, rather than the trainer.
  • whether a trainer might be running a course for the first time, in which case they may well not be firing on all cylinders yet.
  • whether the volume of feedback is large enough to be statistically significant. For example, if there were only two people on a course and one gave it much better feedback than the other, then you can't tell much from that.

Course, Venue and Question Analytics all work in much the same way, giving you instant visibility at the high level, with the ability to drill down and/or filter your analysis as needed.


Event Reports

Although Analytics are important for seeing the big picture, training managers also need to have detailed real-time visibility on recent events so they can react promptly when they need to. Trainers too, often want to see the feedback for an event they've just delivered. And if it's a private course, then the customer may also want to have a report summarising the feedback about the event. Coursecheck addresses these needs with Event Reports.



Event reports can be annotated to highlight negative feedback, and are also the place to go to respond to customer comments with comments of your own which, for training companies, are made public alongside the original comment. Responding to negative feedback is particularly important as it can have a dramatic effect (good or bad) on how you're perceived.

Subject to permissions, trainers logging in to Coursecheck can see event reports for the courses they've taught, and may also be able to see feedback from courses delivered by their colleagues. And you can produce PDF reports on one or more events to share internally or with customers.


Automated Reports and Alerts

Unless you're using Coursecheck on a day-to-day basis, you'll probably prefer to have information pushed to you by email rather than having to remember to log in and look for what you need. So, Coursecheck accounts can also be set up to receive automated reports. The two key reports available in this way are:

  1. An automated comment summary report with in-built links that take you straight into Coursecheck if you want to get to the detail. You can configure these to be sent daily, weekly or monthly, depending on your role and reason for wanting the report. For those tasked with responding to comments, it's a great way to see at a glance what people are saying, and a prompt to respond if you want to.
  2. Negative feedback alerts. These are only sent out if scores are below a threshold defined by you. As with the comment summary report, you can choose the frequency but of course you won't receive an alert if all the feedback is positive!



In designing our reporting capability, we've tried to think of all the typical questions you might want to ask, and ensured that you can quickly get the answers you need. But there are always exceptions, which is why we also have a data export facility that lets you extract all response data within a date range to a CSV file, from where you can manipulate it in Excel as you wish. The CSV extract also includes all the personal information supplied by Learners when they leave their feedback, which can be useful for following up with people whose contact details you may not otherwise have.
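As well as Excel, a CSV export like this can be analysed with a short script. The sketch below is purely illustrative and not part of Coursecheck: the column names ("Trainer", "Overall rating") are assumptions, so check the header row of your own export and adjust accordingly.

```python
import csv
from collections import defaultdict

def average_rating_by_trainer(path):
    """Average the 1-5 overall star rating per trainer from a CSV export.

    The column names "Trainer" and "Overall rating" are illustrative
    assumptions, not Coursecheck's actual export format.
    """
    totals = defaultdict(lambda: [0.0, 0])  # trainer -> [sum of ratings, count]
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            rating = row.get("Overall rating", "").strip()
            if rating:  # skip responses with no rating given
                totals[row["Trainer"]][0] += float(rating)
                totals[row["Trainer"]][1] += 1
    return {t: round(s / n, 2) for t, (s, n) in totals.items()}
```

The same pattern extends easily to grouping by course or venue, simply by keying the dictionary on a different column.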

Notes for Instructors using Coursecheck

7th August 2019

What's Coursecheck?

Coursecheck is an online course evaluation system. It’s designed to be used in the classroom as an alternative to paper feedback forms with delegates leaving feedback using their smartphones, tablets or laptops. By getting everyone to submit their feedback together in this way, you can expect close to a 100% response rate.

Why are we using it?

By collecting feedback digitally, the results can be easily analysed, which helps us look for ways in which our training services can be improved. And the Coursecheck Marketing module (with the Learner's agreement) allows their overall rating and general comments about the course to be made public on the Coursecheck website. This in turn helps us to promote our courses and provide “social proof” of the quality of our training.

How does it work?

Each event is identified by a unique six-character code. This may be given to you by your Coursecheck administrator, or if you have a Coursecheck account, you can log in and find it under the “My schedule” menu option. Learners use the code to gain access to the course evaluation form and leave their feedback.

What do I need to do?

  1. Before the last day of the course, make sure you know the event code for your course.
  2. As part of your introduction on the last day of the course, make the delegates aware that you’ll be asking them for their feedback at the end.
  3. When the time comes, introduce the feedback as the final exercise of the day and reassure them that it will only take a few minutes. Explain that instead of paper forms, you use an independent online survey system and they can leave feedback using their mobile phone, tablet or laptop.
  4. Ask them to go to the Coursecheck website and click on the “Leave Feedback” button at the top of the home page. While they’re doing that, write the event code on the whiteboard or flip chart.
  5. Whilst Learners are leaving their feedback:
    - Make the point that the feedback they provide is genuinely important to the business and that you appreciate them taking the time to provide it.
    - Emphasise that you want them to be open and honest as it’s only by getting detailed feedback that the company knows what’s working well and what could be better.
    - If you're using the Coursecheck marketing module, explain that, with their permission, their general comments and overall rating will be visible on the Coursecheck website, but all other feedback will only be visible to us, the training provider.
    - Reassure them that the system is fully GDPR compliant and that the information they provide will not be misused.
    - Thank them.
  6. If you want to see the feedback that’s been submitted about your course, then ask your Coursecheck administrator for a report, or log in to your Coursecheck account.

Frequently Asked Questions (by Learners)

Will all my feedback be made public?
No. With your permission, your general comments and overall rating will be made public but your responses to other questions will only be visible to your training provider.

Will I be identifiable?
If you give permission for your review to be made public, you will be identified only by your first name and initial. Your full name, if you provide it, will only be visible to your training provider. Depending on how the training provider has configured Coursecheck, you may also have an option to remain anonymous.

Is Coursecheck GDPR-compliant?
Yes. Coursecheck is registered under the Data Protection Act with licence No. ZA056850.

How will the information I provide be used?
Your feedback and any personal information you provide will be shared with your training provider and never with other third parties. Your training provider also commits never to share the information you provide with third parties.

What's a good Net Promoter Score?

2nd July 2019

If you're unfamiliar with Net Promoter Score (NPS), it's an established way of analysing responses to the question: "On a scale of 0-10, how likely are you to recommend us?". The way it works is that scores of 6 or less are treated as negative. These are your detractors; scores of 9 or 10 are your fans; and scores of 7 or 8 are neutral. NPS is calculated by taking your percentage of fans and subtracting the percentage of detractors. Neutral scores are ignored. So NPS can vary between -100 and +100, and the rule of thumb is that you want your NPS to be positive, meaning you have more fans than detractors.
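The arithmetic above can be sketched in a few lines of Python. This is purely illustrative and not part of Coursecheck:

```python
def net_promoter_score(scores):
    """Compute NPS from a list of 0-10 'how likely to recommend' responses.

    Fans (promoters) score 9-10, detractors score 0-6, and neutral
    scores of 7-8 are ignored. The result ranges from -100 to +100.
    """
    if not scores:
        raise ValueError("need at least one response")
    fans = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (fans - detractors) / len(scores))

# e.g. 6 fans, 2 neutrals and 2 detractors out of 10 responses:
# 60% fans minus 20% detractors gives an NPS of +40
```

Note that the two neutral responses still count towards the total number of responses; they're only excluded from the fans and detractors tallies.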

Although NPS is a popular way of measuring customer satisfaction, it can be difficult to know what score is a good score. So we decided to crunch the numbers on Coursecheck and see how NPS varied based on company size, subject matter and the price of the training. In total, we looked at around 40,000 survey responses, but it's worth bearing in mind that Coursecheck attracts training providers who are serious about quality and customer feedback and so when we talk about averages, these are likely to be higher than for the industry overall. With that caveat, here's what we learnt:

Overall NPS analysis

The average NPS across all training providers on Coursecheck over the last 12 months was +71, but as you can see from the chart, this masks some big variations. The most commonly achieved NPS was between +65 and +75, but around half of all companies were below that level. At the other end of the scale, the numbers fall away more sharply, with only 17% of companies managing to achieve a score of over +75.

NPS by company size

When we looked at average NPS by company size, we noticed a clear correlation, with smaller companies, on average, outperforming their larger counterparts. We think this is because:

  • Smaller companies, often owner-managed, are typically extremely passionate about and driven by what they do.
  • Customer service in smaller companies is generally much more personalised which gives them an advantage over their larger peers.
  • Larger training companies tend to have more trainers and it's always going to be harder to ensure that every single trainer is right on top of their game.

NPS by price of training

When we looked at NPS based on the typical price of training, again we found a correlation. Here, there was relatively greater satisfaction with lower-value short training courses than there was with longer, more expensive ones. As to why this might be, we concluded that ultimately, customer satisfaction is about the extent to which customers' expectations are met. Someone flying first class might be unhappy about a tiny thing not being perfect, whereas in economy, expectations are much lower. So even though the first-class passenger had the superior experience, it could well result in a lower satisfaction score.


NPS by subject area

Finally, we looked at NPS based on the subject matter being taught. This proved less conclusive, although it was significant that the subject area getting the highest satisfaction was Health & Welfare. This includes a number of companies offering training in topics such as Mental Health First Aid (MHFA), Physical First Aid, Autism and Suicide awareness. Instructors teaching these subjects tend to be extremely passionate about their chosen field and often work in the front line. So it's perhaps not surprising that these subject areas scored so highly.



The bottom line is that if you're achieving an NPS in excess of +75, then that's good by any definition. But if you want a better idea of what you should be aiming for based on your company size and what you typically charge for training, then you can use the table to see the average NPS for companies like yours. As you can see, if you're a small company offering low-cost training, then the bar is higher, at +82, than it is for a large company offering expensive courses, where anything above +40 is something to be proud of.



Installing the Coursecheck widget

1st July 2019

Implementing the Coursecheck widget is a straightforward process, which should take only a few minutes to do. By installing it on your website, prospective customers can see how many reviews you have, together with your average rating, with the figures updated in real time.

Company or course-level widget?

There are two widget types: one displays your overall company ratings; and the other is specific to a particular course. We recommend starting with the company-level widget and introducing course-level widgets after you’ve built up a reasonable number of reviews about a particular course. You’re free to use both types and if you’re using course-level widgets, you don’t need to display one for every course.

Deciding where to display the widget

Widgets can be placed anywhere on your website and you can include them on as many pages as you like. But we recommend that you only place a widget where it's strictly relevant, as placing it elsewhere can actually have a negative effect on SEO.

Implementation instructions

To implement the widget, you need to insert an HTML snippet into the relevant page/template of your website. This will be provided to you by the Coursecheck support team. The widget consists of a panel 160 pixels wide by 60 pixels high, with the text/image right-justified within it. Widgets are designed to be displayed on a white background, so if your background is anything other than white, you’ll first need to create a white area (but not an iframe) on which to place the widget. We suggest that you encapsulate the widget with some descriptive text, styling the heading and text to be consistent with your own website.

How to make your feedback matter

1st August 2019

It’s all very well collecting customer feedback but unless it can be put to good use, it’s something of a pointless exercise.

Compared to other industries, most training companies collect huge quantities of feedback. For them, the challenge has always been not so much how to collect feedback, but what to do with it all! The key thing is to start with the end in mind. If you're serious about continuous improvement, then you need to start by deciding how you're going to measure your success. This could be a simple average star rating for your courses, but a more useful measure is Net Promoter Score (NPS).

Net Promoter Score (NPS)

If you're unfamiliar with NPS, it's an established method of analysing responses to a question that's often found on feedback forms: "On a scale of 0-10, how likely are you to recommend us?". The way it works is that scores of 6 or less are treated as negative. These are your detractors; scores of 9 or 10 are your fans; and scores of 7 or 8 are neutral. NPS is calculated by taking your overall percentage of fans and subtracting the overall percentage of detractors. Neutral scores are ignored. So your NPS can vary between -100 and +100, and the rule of thumb is that you want your NPS to be positive, meaning that you have more fans than detractors. For more about what NPS score you should be looking to achieve, see here.

Set targets and report against them

Whatever metric you choose to use, if you want people to care about quality, then you need to make a lot of noise about it within your company. And do so on a regular basis. There's nothing worse than a quality initiative that fizzles out. If that happens, there's a real danger that people take a cynical view that it was just a passing fad and that quality actually drops as a result.  Keeping a quality initiative going can be hard work which is why it's best not to be too ambitious to start with - you can always expand it later.

Lead and Lag indicators

Although NPS is a good way to report on the quality of the service you're providing, if it's low or starts to fall, it tells you nothing about what the underlying problem is, let alone what to do about it. And because whatever has caused the drop has already happened, it's too late to do anything about it.  For this reason, NPS is referred to as a Lag indicator.

By contrast, Lead indicators are about measuring underlying behaviours exhibited by your team that you believe will lead to a good outcome. For example, you might take the view that if your instructors were to spend time sitting in on each other's courses, they would be able to swap tips and learn from each other, and that the result would be more satisfied customers. This would be a Lead indicator. The way to test the theory would be to introduce the new behaviour, measure how much time was being spent in this way, and then see whether this led to a higher NPS in the following weeks and months. If it made a difference, then you could encourage more of the same; and if it made no difference, you could consider what else might make a difference and track it in the same way.

Share your results

Having set targets and measured results against them, you now need to share your results.  There's evidence to support the notion that even if you do nothing else, simply publishing your performance actually causes positive behavioural changes. But regardless of that, being transparent about performance and sharing it widely whether good or bad, is something to be encouraged.  It allows you to learn from mistakes, celebrate success and get everyone aligned on initiatives to improve even more.

Make it matter

The ultimate way to make feedback matter is to introduce a carrot-and-stick approach to get people to behave in the way that you think will be most beneficial for the business. One Coursecheck customer goes as far as bonusing every member of staff - not just the trainers - on the NPS score for the month. And guess what? Everyone is extremely interested in what that score is, and surprise, surprise, it's invariably very good indeed.

How to get high quality customer feedback

1st August 2019

When it comes to collecting feedback, training companies are very fortunate. Whereas a typical consumer business would be delighted to get feedback from even 5% of their customers, for training companies, it’s in the DNA that they should strive for 100%; and if they're running classroom training, then there is the potential to achieve this. But whilst response rates matter, what matters more is the quality of the feedback and this is something that's often overlooked. In our view, the key reasons for collecting feedback should be:

1. To gain insight about the quality of the training services being offered

If we're going to make changes to the way we do things, we need to base our decisions on sound evidence. In practice, this means knowing that the feedback we've got is representative. And that's more about the sheer quantity of feedback than the response rate. A decision to change something based on a hundred survey responses is always going to be more reliable than one based on ten, so if you're training a lot of people, then response rates are relatively less important than for a small company where every response matters.

2. To know whether there are any individuals who should be followed up with on a one-to-one basis

When it comes to responding to feedback, then of course we can only do that if we have something to respond to. And that comes down to encouraging people to be honest and to tell you about even the small things that could be improved. The good news is that if someone has something very positive or negative to say, they're naturally more inclined to want to leave feedback so it's those in the middle that need the most encouragement.

3. For training companies, customer feedback can be a valuable marketing asset

If you're using feedback for marketing purposes, then it's the overall quantity that matters, but only up to a point. That's because although there's a big difference between having ten reviews and a hundred, there's only a relatively small additional benefit between having a hundred and a thousand. Prospective customers simply need to believe that the reviews they're reading are representative, and as long as there are a reasonable number of them, and some are recent, then the marketing objective is achieved.

In conclusion, it's worth doing everything you can to maximise both the quantity and quality of feedback you collect. The way to achieve this depends to some extent on the type of training you offer. If you're running classroom training, then we strongly recommend collecting feedback before the Learners leave the room. As well as guaranteeing a high response rate, it also saves you the trouble of sending them a survey link. Either way, your instructors are key to successful feedback collection and we recommend that they:

  • Mention at the start of the course, that you're going to be asking for feedback
  • Make it a group activity
  • Be enthusiastic and appreciative
  • Explain the benefits and how the information they provide, will be used
  • Reassure them about any privacy concerns they might have
  • Reassure them that the whole process will only take a few minutes

If you're sending out post-event feedback request emails, then it's even more important that the trainer makes the case for Learners not to ignore the email that's coming their way.

Feedback request emails

When sending out post-course requests for feedback, we recommend that:

  • The email is sent out within 24 hours of the completion of the course. Any longer than that will result in a significantly lower response rate. Better still, send it out in advance so that it's already in their inbox by the end of the course.
  • You use a subject line that is likely to get the learner to open the email, e.g. "Your recent training with [your company name]". It doesn’t need to mention the word “feedback", which may put them off!
  • In the body of your email, reiterate what your trainers have said in the classroom: explain why you are asking for their feedback, tell them how much you value it, reassure them that the information they provide will not be misused, and emphasise that it’s a short survey form that will only take them a few minutes to complete.

What makes a good feedback form?

27th July 2019

When designing a feedback form, the temptation is to dive straight in and start writing questions. But by first thinking about your objectives and what you really want to know, you'll make the design process much more structured. Do you simply want to run a health check and confirm that everything is in good working order, or are you looking for ways to improve? If you're serious about quality, then feedback forms should be all about improvement. Remember that your questions don't have to remain set in stone forever. So if you want to solicit feedback about a particular point, there's no reason not to ask the question until you've got the information you need, and then replace it with something else. Quality control is a journey, not a destination, and to be effective, you need to regularly review your feedback form and make sure it's fit for purpose.

Question types

Score-based questions are fine as a way of measuring your ability to delight your customers but if you want to get some insight as to how you could delight them even more, then you need more open question types in the mix.  There's no substitute for asking Learners to suggest how a course might be improved; or if you already have some ideas of your own, then you can use radio button or multiple choice type questions to find out whether your customers agree.

Mandatory questions

When using paper forms, you can't force your Learners to answer your questions, but when switching to a digital approach, it can be very tempting to set up mandatory questions and force your Learners to give you their views. Our advice is: don't! No one likes to be bullied, and the danger is that at best you'll get little insight, and at worst you may get no feedback at all. Much better that, prior to collecting feedback, your instructors explain to Learners why their feedback is so important and let them give it of their own free will.

Answerable questions

It may sound obvious, but don't ask questions that Learners may struggle to answer accurately. For example, it might be invaluable to know how much time a Learner expects to save armed with their newly acquired skills, but will they really know that when they've only just completed the course? Probably not. Far better to save questions such as this for a follow-up survey to be completed, say, a month after the training, by which time your Learners should be able to give you a much more informed response.

How many questions?

When it comes to survey forms, the old adage "less is more" holds true. We recommend no more than ten questions on a form, and there's no reason why it can't be even fewer. One way to reduce the number of questions is to consider whether you have redundancy. In other words, if two questions are really asking the same thing in different ways, then see if you can merge them into one. For each question, you should also consider why you are asking that particular question and how you will use the responses to improve your business. If your reason for asking is that it would be "interesting to know", then it's probably one you can remove.

Who wants to know?

Consider how the feedback you collect will be shared amongst the stakeholders. Your instructors are likely to want to know how their training was perceived, especially if you're transitioning away from paper, when they would have been the first to know. Your training and marketing managers will also be interested in the feedback; and last but not least, there are your customers. This is especially important if your main focus is on running closed events. In short, when designing your feedback form, it's worth getting input from all of these stakeholders and ensuring that all their needs are met.

Further information

For a free diagnostic check-up on your existing feedback form, visit the site run by renowned feedback expert Will Thalheimer.

Getting started with Coursecheck

27th July 2019

At Coursecheck, we appreciate that whilst a trial might be free, it’s still going to take up your valuable time. And because time is money, we like to do everything we can to minimise the effort required on your part.

When you sign up for a free trial, we’ll immediately get in touch to make sure we understand exactly what your objectives are. We’ll then get to work doing the initial set up for you. If you’re a training provider, we’ll do this using information from your website.  Typically, this will include:

  • A company profile, complete with your company logo, company description, video content, accreditations, contact details, social media links and unique selling points. 
  • The locations where you run your open courses
  • A dedicated page for each of your course outlines, including backlinks to your website for people to find out more information, and an enquiry form through which people can get in touch with you directly.

Everything we set up is maintainable by you through the Coursecheck portal but by doing the heavy lifting for you at the outset, it means you can start collecting feedback almost straight away.

The initial set up is usually completed within 48 hours and we’ll then schedule a web demonstration to show you how to:

  • Fine tune the company and course profiles we’ve built for you
  • Configure your survey form with whatever questions you want to ask
  • Set up User accounts for managers, administrators and trainers
  • Set up the schedule of events for the trial

We’ll also send you some briefing notes that you can give to your trainers so that they know exactly what to do and how to answer any questions that your customers might have.

During the trial

During the trial, we’ll support you by phone and email, and show you how to use the analysis module to compare feedback by course, trainer or venue; and how to respond to comments left by your customers.


If the trial is successful and you subsequently sign up to one of our flexible subscription models, we’ll be pleased to help you with:

  • Adding the Coursecheck widget to your website to make it easy for people to read the feedback you’re collecting
  • Optimising your course outlines for SEO purposes
  • Best practice for responding to feedback
  • Options for integrating Coursecheck with your Training Administration or CRM systems

Note for in-house training departments

If you’re an in-house training department, then you won’t have a public profile on Coursecheck but the rest of the support we provide is very similar to that described above.

Using accessplanit to manage your training?

5th June 2019

If you use accessplanit to manage your training business, our off-the-shelf integration capability enables your course schedule in accessplanit to be automatically synchronised into Coursecheck, saving valuable time and effort, and eliminating the risk of inconsistent data. As an alternative to collecting feedback in the classroom, you can also use accessplanit to automatically send out Coursecheck feedback requests to learners upon completion of their courses.

Key benefits for accessplanit customers

  1. An interactive dashboard and automated reports, delivered to your inbox, let you see the big picture without ever losing sight of the details.
  2. Reduced time spent on feedback administration.
  3. Attendees prefer to leave feedback digitally, so you’ll find it more insightful than feedback collected on paper.
  4. By showcasing customer feedback on Coursecheck and on your website, you can “prove” how good your courses are.
  5. Google values reviews, so showcasing yours can give your search engine rankings a real boost.
  6. Integration with social media makes it easy for people to spread the word about your courses.
  7. Show your customers you’re listening by responding to their feedback with comments of your own.

About accessplanit

accessplanit is a software house specialising in training management software. Training providers can manage their course bookings, resources and delegates, automate all their customer communications, and take online bookings and payments, all within the accessplanit system. Customers love what the system can do for their businesses.

accessplanit are committed to the success of their clients’ training businesses and have a proven process for guiding them to success. They have worked with hundreds of training companies and truly understand the industry.

Originally a training company themselves, accessplanit saw a need for better training management software, developed their first system in 2001, and haven’t looked back since.

accessplanit aim to build life-long relationships with their clients, offering award-winning customer support for small and large companies alike. They are an ISO 27001 accredited and G-Cloud 11 approved company with smart hosting capabilities, so you know you can trust them to keep your data safe and your business online.

For more about accessplanit, visit

Why should training businesses go digital?

1st December 2018

In an age when just about every industry has been digitised, not least training provision, why is paper still the “go-to” tool of choice for collecting training feedback? Here are five problems with this approach:

  1. Anonymity — It’s difficult for delegates to be entirely truthful when submitting feedback on paper, especially if comments relate to the trainer. Digital forms, which can be completed in the classroom or sent to delegates at a later date, allow students to provide considered and honest responses.
  2. Accuracy — For providers that transpose data from paper forms onto a spreadsheet/database, there is the potential for human error, and often it’s simply not possible to log every piece of information, especially if responses include lengthy comments.
  3. Less is more — Whilst online feedback forms can sometimes have lower response rates, experience shows that the additional insight from digital feedback delivers a far deeper (and more useful) level of understanding. Additionally, comment fields, which are mostly ignored on paper forms (unless the experience was particularly positive or negative), are readily available for review.
  4. Timing is everything — Instant visibility is critical when a learning experience calls for an immediate response. With paper feedback forms, days or even weeks may pass before delegate reviews are analysed, and the opportunity to react is missed.
  5. Analytics — Digital systems give you access to powerful analytics via management dashboards. At the touch of a button you can monitor performance across course type and trainer, in real time.

20/20 Vision

Digital feedback systems allow your delegates to submit their feedback, in confidence, and you can gather ALL the information you need to evaluate the performance of courses you have commissioned, and of the educators who have delivered the training.

Aside from the obvious environmental benefit, paper forms have to be printed, collated and distributed, which takes time and costs money. Online forms accurately collect and collate every piece of information the student shares, with pre-built analysis available for you to view as soon as the feedback has been uploaded. You can respond to the good, the bad and the ugly in a timely manner.

Online feedback systems enable you to track, in real time, the effectiveness of your learning programmes whilst reducing the time and money spent understanding the delegate experience, and adding to your green credentials. Training feedback definitely benefits from a digital touch.