On 4th June 2014 at the Birmingham Rep Theatre, I felt a bit as if I had gone to Cannes for a sneak preview. If Cannes is one of the events to be at for film, then for surveys and student evaluation, the HEA Surveys for Enhancement conference is THE one to be at. Admittedly without the red carpet, though!
This year’s hot topic was the review of the 10-year-old NSS (National Student Survey), introduced in the opening keynote by Professor Janet Beer, current chair of the steering group for the National Student Survey (HEPISG).
From Professor Beer, we learnt the current state of the NSS review: the NSS is staying, with some rewording and modifications (it has been 10 years since the NSS started!), and consideration is being given to including questions on student engagement and academic experience. The three purposes of the NSS remain valid (quality enhancement, quality assurance and public information). As Professor Beer observed, there was ‘limited appetite for change and support for retaining most questions’; the timeline of changes is as follows:
- Autumn 2014 – cognitive testing of the new NSS survey
- Spring 2015 – piloting of the new NSS survey
- 2017 – first run of the new NSS survey
Prof Beer also offered a teasing taster of a data analysis tool that will cover the 10-year data set of the NSS: the full data will be available for analysis (though not by institution) – but we have to wait until early 2015 for this! Nevertheless, it will be a customisable tool, which caused much excitement.
Taking a holistic and more critical approach, Dr Camille Kandiko Howson’s keynote was on “Metrics that Matter: Student Expectations and Effective Educational Practices”, pressing the point that it is the process of student evaluation – the ‘who does what’ and ‘for what purpose’ with the data – that is more crucial than ‘what data’ is collected. I loved Camille’s analogy of deep/surface learning applied to the way institutions deal with survey data. She argued that many institutions take a surface approach to their survey data – trying to climb the league tables – rather than the deep approach of really engaging with the data and thinking about how it could lead to enhancement.
My highlights from the parallel sessions were:
* Hearing about an interesting initiative from next door: LJMU‘s Dr Elena Zaitseva talked about how they have made their module surveys much more concise – only 4 quantitative and 4 free-text questions, focusing on student learning and development. This led to a discernible shift in the free-text responses, from satisfaction towards student reflection and learning.
* I didn’t see the session by Dr Sarah Townley at the Liverpool School of Tropical Medicine on their use of focus groups for student feedback, but it resonates with our Vet School’s student feedback system based on focus group methods, which has been very successful and which Ellen Singer will talk about at this year’s Learning and Teaching Conference. One to get the slides from!
* I also picked up a skill from Dr Metaxia Pavlakou, Oxford Brookes University, who led a workshop session on cognitive interviewing to test survey questions. It is something I have done before when piloting a survey, but it was nice to consider and learn about the ‘official methodology’. A good book recommendation was Cognitive Interviewing by Willis (2005).
Another initiative I have been keeping an eye on is the UKES (UK Engagement Survey) pilot. UKES is the UK adaptation of the NSSE (National Survey of Student Engagement, US), which is currently being piloted at 36 institutions across the UK. Once piloted, it will be made available to all institutions by the HEA from 2015. The purpose of UKES is quality enhancement – its data are never made public; they are for the institutions to make use of.
The panel discussion on the ‘Future of surveys’ was a stimulating debate involving the keynote speakers, students, academics and HEA staff. At times I felt opinions waged a battle between a Hollywood blockbuster aimed at satisfaction (the NSS) and a thoughtful independent movie (NSSE/UKES, aimed at student engagement). But it was not that simple. There was a range of views, from commitment to the NSS as a tool for amplifying the student voice through to experiences of it being used as a ‘stick to be beaten with’. The audience was pointedly asked: “How many of you know the departments with the worst NSS scores?” “And how many of you know those with the best scores?” (Has anyone got any praise?). Views also highlighted that the NSS should never be the only data collected: “Triangulate!”. The importance of the process of student evaluation (what happens with the data) was stressed again. Audience questions also suggested that institutions have many other means at their service – focus groups, well-functioning student rep systems etc. – of eliciting student feedback and responding to it, with systems and processes built on constructive dialogue between staff and students at their centre.
Tünde Varga-Atkins, eLearning Unit
- Presentations and resources from the Conference
- More information about the UKES pilot
- If you want to hear more about what went on at the conference, check out the Twitter back-chat on the conference feed #s4e2014 (Storify)
- The HEA also highlighted another recent report that explored the academic experience of students, especially in the context of the new fees regime: the HEPI-HEA student academic experience survey 2014 report.