Touchpoint is a short survey that can be offered to students partway through the delivery of a topic, to gauge their thoughts about their learning experiences to date. Although this can be incredibly useful, it is important to sort out some key design and analysis considerations from the start to avoid common pitfalls. Although not exhaustive, here are some important considerations…
1. Be aware of the technology and use it well
Within FLO, you can use the Feedback tool to create surveys for your students to complete. You can include many different types of questions, with multiple choice, short response and long response questions being commonly used. The best way to become familiar with the available options is to create a Feedback activity and investigate its functions. You may have previously come across more sophisticated survey question types, such as matrix tables and visual analogue scales. Even though these more complex options are not available within FLO, you can still get good outcomes with the tools available.
2. Use sound survey design principles
To develop an effective Touchpoint survey, apply sound survey development principles. Some examples include:
- Minimising the number of questions that you ask – longer surveys can have lower completion rates
- Having a clear purpose so that you are seeking specific information rather than mining for random data
- Double checking the wording of your questions to make sure they aren’t ambiguous
- Minimising the number of times you ask your students to complete your surveys – asking too often can also reduce completion rates. If you are seeking feedback partway through the semester, consider putting a single Touchpoint survey in at the midway point. If you want to determine whether perceptions change over the semester, consider including one Touchpoint survey a few weeks into the teaching semester and another a few weeks before it ends.
3. Think about time
One of the great things about the Feedback tool in FLO is that you can generate automatic reports on your data. To do this, click on the ‘Analysis’ tab. However, this function is only available for questions where students select responses from pre-populated options. In other words, you can’t use it for open-ended questions such as long and short responses. Where possible, it’s best to aim for closed-type questions in your Touchpoint surveys. In addition to the efficiency factor around analysis, think about buy-in from the students. Many students love having the opportunity to say what they want, so it is a good idea to have at least one open-ended question that allows this (e.g. ‘Do you have any other comments?’ as a final question). However, many students also do not provide any comments for these types of questions, so having lots of open-ended questions might mean that it will take you a large amount of time to analyse a small amount of data.
4. Don’t promise the world
On your FLO site, you will need to introduce the Touchpoint survey, invite students to participate, and explain its purpose. Ideally, the information you gain partway through delivery of a topic will enable you to make minor adjustments to teaching for the rest of the semester. Sometimes you won't be able to make these adjustments in real time, so manage expectations by using wording that leaves students informed about possibilities but not disappointed by unkept promises.
5. De-identify your participants
When you look at your survey data, you may come across responses represented by a small sample size. For example, if you ask students in a lecture to complete a survey that includes identifying their age range, you might see results like those in Table 1:
| Age range | No. responses | Age range | No. responses |
| --- | --- | --- | --- |
| 17–19 | 34 | 45–49 | 1 |
| 20–24 | 52 | 50–54 | 0 |
| 25–29 | 21 | 55–59 | 0 |
| 30–34 | 10 | 60–64 | 0 |
| 35–39 | 1 | 65 and over | 1 |
| 40–44 | 0 | | |
Table 1: Age distribution of students enrolled in SURV101
In this situation, most of the respondents are under the age of 35, with the remaining three distributed across a variety of age ranges. Think about the person who has nominated that they are 65 or over. They may not want others to know how old they are, yet from this data it could be possible to work out that a particular student is at least 65 years of age. The solution to this scenario is to work according to the principle of three: with a minimum of three people in a group, it isn't possible to tell who has responded to what. In other words, respondents have been de-identified. In this situation, the data would be presented as follows (Table 2):
| Age range | No. responses | Age range | No. responses |
| --- | --- | --- | --- |
| 17–19 | 34 | 30–34 | 10 |
| 20–24 | 52 | 35 and over | 3 |
| 25–29 | 21 | | |
Table 2: Age distribution of students enrolled in SURV101
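The grouping step above can be sketched in code. The following Python snippet is illustrative only (FLO has no such function, and the bin labels and helper name are assumptions): it merges the oldest age bins into a single "and over" group until that group holds at least three responses, reproducing Table 2 from the Table 1 data.

```python
def apply_principle_of_three(bins):
    """Apply the principle of three to an ordered list of (label, count)
    age bins: absorb bins from the oldest end into one merged group until
    the group holds at least three responses, so no one is identifiable.
    Illustrative sketch only; it handles small bins at the tail of the
    distribution, as in the Table 1 example.
    """
    kept = []
    merged_total = 0
    merged_from = None  # youngest bin label absorbed into the merged group
    for label, count in reversed(bins):
        if merged_total < 3:
            merged_total += count
            merged_from = label
        else:
            kept.append((label, count))
    kept.reverse()
    if merged_from is not None:
        # Label the merged group by the lower bound of its youngest bin,
        # e.g. "35–39" becomes "35 and over".
        lower = merged_from.split("–")[0].replace(" and over", "").strip()
        kept.append((f"{lower} and over", merged_total))
    return kept

# Table 1 data (counts as shown in the example above)
table1 = [("17–19", 34), ("20–24", 52), ("25–29", 21), ("30–34", 10),
          ("35–39", 1), ("40–44", 0), ("45–49", 1), ("50–54", 0),
          ("55–59", 0), ("60–64", 0), ("65 and over", 1)]

for label, count in apply_principle_of_three(table1):
    print(f"{label}: {count}")
```

Running this on the Table 1 counts collapses the seven sparsely populated bins from 35–39 upwards into a single "35 and over" group of three, matching Table 2.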
Always make sure that anyone who participates in your surveys remains de-identified, so that confidentiality is maintained.
Although not exhaustive, these tips should give you a good starting point. Some excellent resources are available to give you further guidance around survey design. Some are quite sophisticated, but if you are after some easily digestible summaries, you may be interested in the following resources:
- Tip sheet on question wording (Harvard University)
- Things to think about before designing a survey (UCL)
- Survey design: getting the results you need (University of Virginia)
Also remember that your academic development support and eLearning teams are available to help.
Contributed by Dr Cheryl Schelbach
Learning Designer – CILT