Black Sabbath to Busted – Report on Blackboard Enhanced Assessment and Feedback Event

Leaving home under an early-morning starry sky and up over the misty, snowy Pennines to Sheffield, I wasn’t sure what to expect from the Blackboard Enhanced Assessment and Feedback day to which I was travelling. I had a sketched outline of the themes for the day ahead but not much more, namely that we would engage in some way with:

Assessment and feedback – an institutional perspective

  • Examination of key drivers and challenges (reputations, quality of process, quality of data, legal, business efficiencies, risk etc.)
  • Placing use cases on a confidence/effectiveness model
  • Highlighting relevant Blackboard solutions

Assessment and feedback – innovation and organisational change

  • Identification of drivers for change
  • Gap analysis of practice and stakeholder experiences
  • Prioritisation of opportunities for change
  • Highlighting relevant Blackboard solutions

This kind of workshoppy day from Blackboard was something I hadn’t experienced before, so I was propelled as much by curiosity as by its relevance to the work our team is currently leading on e-submission and e-feedback at Liverpool. What I got was a useful day of frank discussion and sharing of experiences, ideas and commonalities with colleagues from other institutions, in the kind of depth you don’t often get, and it was mostly reassuring. This is especially useful for a sense of the bigger picture in HE: we could talk about the differently badged or described but largely similar activities, structures and strategies that are top of our agendas at the moment, e-submission and e-marking being one of Liverpool’s current strategic TEL focuses. One universal and rapidly-emerging area of concern that became evident on the day was the need for a variety of programme-level views of assessment activity in the VLE for academic and administrative staff and students. This is a long-requested feature from Blackboard user groups whose time has come with the recent adoption of e-submission and e-marking policies across the sector, and I hope this was the main takeaway message for the Blackboard team.

The event was run by our regional Blackboard Customer Success Team, in partnership with the Bb North user group, recognising a need for a more extensive exploration of particular issues that get raised at user group meetings, where the format doesn’t allow fuller discussion. Whilst advertised as intended for senior leaders, learning technologists, TEL managers and academic staff, the majority attending were learning technologist types. It was instructive to hear that on the previous day at Edinburgh a couple of PVCs had attended, sending some very positive signals about the depth of an institution’s engagement and intent with the actual tools that students, academic and professional service staff use as a part of their everyday life at the University.

Music, sweet music…

Our first activity was to introduce ourselves by telling the room the first piece of music we had ever bought. An astonishing array of formats and first loves was paraded, from Now compilation tapes to Avril Lavigne downloads, from Osmonds vinyl to Busted CDs. Blackboard’s Alan Masson and Gillian Fielding both admitted to their first purchases, but I’ll spare their blushes here. Top-trumping all these, however, was Blackboard’s Steve Hoole, whose overnight Novotel stay featured a vinyl deck (remember these, kids?) and a selection of Sheffield synth-heaven albums to spin the night away.

Structure of the sessions

The morning and the afternoon were structured in a similar way: we would first ‘brainstorm’ our thoughts in groups on a set of e-assessment themes, then work together on some specific ideas from those and bring something interesting back to the room. We’d finally end with a discussion of the potentially useful tools in Blackboard that could be a part of the thinking for some of these. I initially thought this last element was going to be a sales pitch, but it was pleasingly nuanced in that the Blackboard team wanted to hear stories of how people are using these tools, where they’re working well and what the gaps are. A very clear point made more than once was that Blackboard’s recent work developing the assignment tool to offer dual marking, moderation and anonymous marking had been excellent, but the focus now seemed to have moved on while the toolset still needs work; there had only been one iteration of the process. It was also felt that this process of close consultation should be repeated constantly for other areas, not just assessment, as our needs are always evolving.

Morning Session – what’s needed to enhance assessment and feedback practice?

For the morning, we’d thought about quality, processes and workflows and where the opportunities for enhancement lay. We chucked all of our ideas at the first Padlet below. You’ll find all of the issues that we have encountered in the course of our work as learning technology developers and as part of the University’s EMA project, from how to handle video submission and feedback, to combining some functionality of Blackboard assignments (group tools, double marking, staged release of feedback, etc) with that of Turnitin assignments (Originality Checking, GradeMark, offline marking). In fact, this was another big ask on the day, that Turnitin and Blackboard align/integrate their products in ways that will help us, as you will gather from some of the posts in this Padlet.

[Padlet embed]

What leapt out at me was an issue so far un-encountered at Liverpool: the need for read-only external examiner access to modules. In some institutions administrative staff are packaging up content and assignments into a special section of a module, which only the external has access to, and making the rest of the module inaccessible to them. This is to meet an anxiety around externals potentially changing grades and altering content, but it costs hours of administrative time, essentially duplicating what’s already in the module, and so re-introduces at a later stage of the assessment cycle the serious administrative burden that electronic submission had originally taken away from the front end. What’s needed is a read-only enrolment level, which is another development idea for the Blackboard team to add to the suggestions box.

Moving on to the next activity in groups again, we were tasked with listing and describing up to five assessment and feedback enhancements that would have significant benefits for the listed stakeholders and the degree to which it would require resource. This photo shows our effort…

[Photo: our group’s list of five enhancements]

…and for those who aren’t adept at reading the handwriting of people who spend their whole day attached to a keyboard, our five enhancements (all pretty standard) were:

  1. External Examiner Access – read-only access for external examiners or a similar idea.
  2. Student Assessment Journey – programme-level views of student assessment activity for students and staff.
  3. Flexible innovative assessment – making the assessment and feedback tools available at all points in Blackboard, not just for assignments, so that you can start thinking about using any tool for assessment purposes.
  4. Double marking – further work on the current functionality to take it to a robust, fully-usable level.
  5. Programme Level Assessment – looking at assessment practice across entire programmes and thinking about programme-level learning outcomes.

Hearing back from the rest of the room, we discussed in more depth some of the ideas already described above (external examiner access, programme-level views of assessment), and the Blackboard team promised to send round some case study examples of good practice for external examiner processes using Blackboard tools. In a discussion around whether and how institutions were using the Delegated Grading functionality, which was designed for UK HEIs, the Blackboard team again said they would gather together some case studies of where it is being used well. The feeling from the room was that this kind of functionality should be available across all assessment tools rather than locked to a single tool.

A few other interesting discussion points ended the morning session, among them that many institutions are thinking at programme level about replacing traditional assignment assessment. Video assignments and feedback are rapidly on the rise, but are also causing headaches as infrastructure and policy aren’t keeping up.

Afternoon Session

As I said, the afternoon session followed the same structure. So our post-lunch digestif activity was another Padlet, this time thinking about innovation and new practices that would enhance assessment and feedback in our institutions. I’ll let the Padlet do the talking, so scroll around to see the ideas. I was interested in things like students being able to select the kinds of assessment they wanted to do, and learners and academics developing assessment literacies through feedback dialogue and feed-forward as a continuous process.

[Padlet embed]

As in the morning, next was a group task, where we were asked to think “aspirationally” about how we imagined assessment could look, if we had a free rein. What change or innovation in assessment and/or feedback would have significant impact and how would it benefit learners, tutors, courses and institutions? Essentially we were encouraged to go wild in the aisles of transformative assessment practice.

Our group went Back to Basics and offered the transformative potential of programmes where learning outcomes were mapped to assessment.  Well, someone had to. Other groups had some tidy, Tomorrow’s World ideas including:

  • An assessment wizard which built the kind of assessment you wanted, with one view for staff and students (no more multiple systems, or at least they’d be hidden from you).
  • A tool that surfaced programme level assessment data.
  • A tool for personalised feedback and assessment routes – feedback raises flags on further help students can get and other staff can see that in later assessments.

And as per the morning session, the Blackboard team led a discussion on how their products could help with some of these things. One thing they brought back to my attention was the Goals and Outcomes system, which has a new dashboard view of the data. I think it would be opportune to review this in the light of the programme development work heading our team’s way, as it could present an opportunity for offering programme-level views of progress through modules.

The end

So, not a sales day and not your regular product roadmap/roadshow day: this was a deeper dive into Electronic Management of Assessment, including the Blackboard tools that can be a part of that environment. The Blackboard team wanted the day to be about sharing practice, raising awareness of the Blackboard tools we already have, and encouraging us to get the best we can out of them, and I think this was more than achieved. Thanks to the team and to the Bb North user group for arranging and hosting. I had some very useful conversations (pretty much usual for Bb North user group meetings) and plenty from all of the above to take back to the project board overseeing the implementation of an e-submission policy at Liverpool.

What music did I first buy? The Muppet Show album. And I’ve never needed any other in my life…

Dan

Report: e-Learning Network Meeting – January 2017

We were delighted to welcome Professor Helen O’Sullivan, APVC Online Learning, as speaker at the first e-Learning Network meeting of 2017. Helen spoke to us about the University’s new Education Strategy, giving the network an overview of the structures, leadership teams and immediate priorities. The recording of that talk is linked-to below. Helen then led a discussion workshop on what an institutional Digital Education vision might look like (this part of the session is not recorded). We also managed to make time for a couple of extra items: a first look at Turnitin Feedback Studio, the new design for GradeMark we will be moving to in July; and an introduction to the Go Mobile user group that began meeting this academic year. A pretty busy lunchtime for the forty staff members who came together for this valued networking event.

Professor Helen O’Sullivan

So much is going on at the University at the moment that it was a welcome opportunity to spend some time thinking through and discussing how current strategies relate to our own area of interest, and Helen did a great job of this, even in the sweltering conditions of our meeting room. The Education Strategy’s core values, the ‘Liverpool Hallmarks’ of ‘research-connected teaching, active learning and authentic assessment’, are immediately appealing to anyone interested in learning and teaching, and learning technologies can play a critical role in all three. I won’t go into micro-detail, but what I found really useful was an update on the top priorities for the coming year, including the setting-up of a new Programmes Development Team, a media technical support team, continued work on the Electronic Management of Assessment project, and also hearing about less familiar things including the focus on the London Campus portfolio and degree apprenticeships. Click the image below for the (Stream Capture) recording, about 32 minutes long.

[Image: link to the recorded talk]
Click the image above to watch the recorded talk by Prof O’Sullivan (32 minutes)

 

We then moved to some group discussions to consider a Liverpool take on David White’s digital leadership framework which is designed to help high-level discussion and decision-making about all things digital, giving some coherence for thinking about the whole organisation and how decisions can affect all of these layers. The framework diagram below is taken from David’s blog post (click the diagram to read) and was the starting point for the activity. In my group we focussed quite a bit on the Digital Service layer, which possibly reflected the areas we work in but which we felt was the bedrock of an organisation’s culture and medium.

Turnitin Feedback Studio – Dan Roberts

There was also a bit of time for a couple of extra items. First up was a look at the new design for Turnitin GradeMark, called Turnitin Feedback Studio. This was an out-of-the-box walkthrough and we were only examining the feedback environment. Essentially the desktop version has been rebuilt and the design is very similar to the current iPad app version, but now you will be able to use it on any device. This video below maps the key differences between our current version of GradeMark and what we will see after this summer’s upgrade. You can also try out a live, online demo if you follow this link.

No horses seemed to be startled by this new look. From a design point of view I think it is a much-improved, cleaner system, tidying away a lot of the distracting array of menus and buttons we are used to, and instead putting the most commonly-used feedback tools directly in front of you whilst marking work; no more hunting around for different comment types, for instance. The rebuild has also focussed on making GradeMark fully accessible, which is great. When I asked what kinds of things people would be interested to test in the lead-in to the summer upgrade, long-standing functionality/workflow requirements such as double marking were top of the list. Looking through the release notes whilst writing this post, I can see that there is a Beta version of the multiple markers facility for which Turnitin are looking for some testers, so we will organise this through the network and the e-submission/EMA project board. Get in touch directly if you want to be a part of this testing.

Go Mobile Usergroup – Alex Spiers

We rounded off chatting about the new user group that Alex has set up for anyone interested in anything mobile, which has already met a couple of times this academic year. It is as wide-ranging as that sounds, so we’ve looked at apps, devices like the iPad Pro and pen, and the kinds of things staff and students from all parts of the University are doing with mobile technologies for learning and teaching. Look out for the next meeting, which we hope will be this side of Easter; we’ll release details ASAP, or keep up with #LIVUNIGO.

[Image: Go Mobile user group]
Next meeting – April 27th

Many thanks to Helen for the valuable and engaging insight into the strategic thinking and work going on for the University’s Education Strategy, and the role that Technology-Enhanced Learning has to play as it moves into its implementation phase. It was also a great opportunity to have a first say on some emergent ideas around a Digital Vision for the University of Liverpool. This is an ongoing process and Helen would welcome more comments and feedback on anything covered in the presentation or discussion.

The next e-Learning Network meeting is scheduled for Thursday 27th April, 12:30 – 2pm. The network lunch is intended primarily as a sharing event, so if you have an idea for one of our meetings, or something you have been doing with TEL that you want to share and get some feedback and discussion on from the group, please let us know.

Dan

Winter School 2017 – Week 2 Workshops – 16th – 20th January

The Winter School began today with a great session on engaging students visually in lectures, with loads of ideas for presentation materials. There are a number of other sessions in the first week, which you can read about here, and you can still book the last few places here.

This year we are also running a second week of sessions (Monday 16th to Friday 20th January) looking at technologies for learning and teaching beyond VITAL, a number of which are designed to tie in with the online Bring Your Own Device for Learning 2017 events. All sessions are listed below and bookable at: https://www.liverpool.ac.uk/cll/booking/

  • Monday 16th Jan – An introduction to Twitter (2 – 4pm)
  • Tuesday 17th Jan – GoMobile user group meeting (1 – 3pm)
  • Wednesday 18th Jan – e-submission and e-marking workshops
    • An Introduction To Electronic Submission Of Coursework (9:30 – 11am)
    • The Turnitin Assignment Tool For e-submission (Part 1) And GradeMark For Feedback (Part 2) (12 – 2pm)
    • The Blackboard Assignment Tool For e-submission (Part 1) And Feedback (Part 2) (2:30 – 4pm)
  • Thursday 19th Jan – Students as co-creators (11am – 1pm)
  • Friday 20th Jan – Advanced Twitter (2 – 4pm)

Finally, after these two weeks there are a couple of events which might interest you.

  • Wednesday 25th Jan – Building good VITAL modules – a practical session looking at ways of building on the VITAL Baseline to create well-designed modules.
  • Thursday 26th Jan – eLearning Network meeting. In this meeting we will be getting a first look at the new Turnitin Feedback Studio.

All sessions bookable at: https://www.liverpool.ac.uk/cll/booking/

[Image: Winter School poster]

VLE minimum standards evaluation – looking around

This second post to stem from our VITAL Baseline evaluation work this year tells some of the story of looking outwards to the sector, seeking to understand best practice around minimum standards for VLEs.

As a part of our original work developing the VITAL Baseline, the working group thought it would be useful to see what, if anything, was happening with VLE minimum standards across the HE sector at that point in time. With a pressing timescale and a clear directive that a) we had to have a standard and b) what it should consist of this time round, the most we could do was look at who had some kind of minimum standard or expectation for their VLE and what was in it, to get some simple benchmarks and reassure ourselves that we hadn’t missed anything we could include while we had this opportunity to shape our own set of standards at the development stage. As a low-cost way (for us) of gathering some rough and ready data, a call went out to the ALT mailing list asking people to contribute to a Google Sheet saying whether their institution had any kind of minimum standard or expectation, or was looking to develop one, and what was in it. As a group we didn’t engage at any further level with this work (although one of us, Peter Reed, wrote up and presented some of his own analysis of the original data at a few conferences and user groups, to which you can find an entry point here). I had always hoped to return to this work with some questions I had about the sector-wide landscape as a part of any evaluation, and hoped that what I learnt from the exercise would also inform any recommendations we would make for future development of the standard. The main areas I now want to examine are:

  • In terms of the original data gathered, we should re-examine what we can understand from it. I was anxious about how crowdsourced data stands up and what the implications of this approach were for an evaluation, so I would like to review this methodology in some detail as well.
  • I wondered what results we would get if we more closely defined the group(s) of institutions against which we wanted to benchmark and actively approached them. I felt at the time that the original data might encourage a view that minimum standards policies were a standard feature of the topography of Technology Enhanced Learning strategies and policies in HEIs, and I wanted to test this.
  • Then, as well as asking who has a standard and what is in it, the more interesting questions would include: why does one exist for that institution; how is its rationale contextualised (e.g. is it linked to a learning and teaching strategy?); what kinds of processes were involved in its development; is it compulsory; and has any evaluation taken place?
  • For those institutions that don’t have one, what is the story here, if there is one?
  • Beyond approaching institutions, we would look at any literature, conference proceedings, blog posts and mailing list discussions around minimum standards, as a part of the evaluation work generally but specifically for information about why institutions do or do not have a VLE baseline.

In picking this work up again for our Baseline evaluation project, I thought an obvious, although simplistic, initial sampling strategy would be to survey the rest of the Russell Group, which would make for a manageable, quick desk-based web search. I looked at the websites of the Russell Group institutions for any indication of an institutional VLE standard in place and what further detail was available publicly. A first-glance review found that in December 2015:

  • Six definitely have an institutional standard with compulsory/required/expected elements.
  • Three offer institutionally recommended good practice guidance.
  • Fourteen seemed to have no institutional standard requirement or expectation.
  • For one institution I couldn’t find any detail one way or another.

I’ve collated a list with details of what each standard or recommendation above consists of, but one immediate thought is that there might well be standards at faculty or school/departmental level where there is no institutional requirement. It would be useful to uncover this information, as my own feeling is that handing standards over to more local levels is probably the way to move in the future, which we’ll discuss in a later post on any recommendations that emerge from our evaluation work.

The next step is to contact institutions directly to confirm whether what we have found accurately reflects their situation, and to get a little more detail on the whys and hows listed above (we’ll also first need to look into ethical approval if we intend to publish our results), as well as to ask whether there might be more local standards that aren’t found on central websites. I’ll make this list available when it is as complete as possible.

Another question this simple snapshot of Russell Group institutions raises for me is whether Liverpool is leading in this area, as we appear to be in the minority, or whether we were late arrivals to the debate around minimum standards and have taken a different direction to the majority. Obviously the snapshot view is unconfirmed and looking at a bigger group of institutions would give better data, but as I began to discuss in the first Baseline evaluation post, we want to look at the evidence that informed the decision to make this one of the first initiatives in our institutional TEL strategy, and to assess its strengths and limits. Does this snapshot tell a contrasting story to that of the other sources of evidence used to develop the TEL strategy and the Baseline originally? Are there any similarities between Liverpool and the other Russell Group institutions that have a standard in place? I’d be interested to discover the extent to which minimum standards policies feature in Technology Enhanced Learning strategies and policies in HEIs when measured in a larger sample group. Thinking further about the sample group, I wondered whether we should follow this 2014 UCISA report and look at all pre-92 institutions for a greater mix of institution types, and in time post-92 institutions too. More work, but valuable for developing our understanding.

In a similar fashion to the original Baseline development, which represented an opportunity to look at internal evidence of the ways the VLE is being used within the institution, this outward-looking work also offered the opportunity to richly inform our thinking and strategic approach by looking to the sector and the experience and evidence there. We want to assess the extent to which these sources of evidence were exploited in the original development, and how they could be in the future, as part of our recommendations for any future institutional TEL initiatives.

As an almost tangential sign-off: when thinking about what I’d write for this post, I realised that I have been using the terms ‘benchmark’ and ‘benchmarking’ unthinkingly. Then I panicked – what are they, and why would you want to do this? The panic didn’t last long, as I found a useful-looking JISC resource, ‘What is benchmarking?’. If anyone knows more about this guide we’d appreciate your thoughts and advice, but I think I am going to use it for writing up this sector scene-setting aspect of our evaluation work.

Dan

VLE minimum standards & consistencies – introducing the VITAL Baseline evaluation

This is the first in a series of posts on the evaluation work the eLearning Unit is conducting this academic year, investigating the institutional impact of our VLE module standard, the VITAL Baseline, amongst staff and students. We’ll begin by introducing some of the background and some thinking about what we want to evaluate. As Christmas is close at the time of writing, I’m unashamedly also going to try and stir up a mildly seasonal metaphor – cake mixtures and consistencies – which might rise and fall over the course of the post. This is not entirely awkwardly folded in, because a key idea that crops up repeatedly is the notion of consistency of provision. The eLearning Unit played a significant role in the development, implementation and promotion (weighing, mixing and baking?) of the Baseline, and so one year into its full roll-out we are keen to find out how it has been received.

The VITAL Baseline is the University of Liverpool’s standard for all modules in our VLE (VITAL), launched for the academic year 2014-15. It specifies six key areas of content and information students would most like to see in all modules. You can read about it in more detail here, get a ‘How to’ guide here, and watch our students describe what it is and why they value it here. The initial impetus came from the student body and concerns that use of the VLE was uneven across the institution, so that some modules were highly developed whilst others might have very little or nothing in them. The Guild of Students’ report ‘Make the Most of I.T.’ (2013) made the following recommendation, based on survey and focus group data:

Policy Recommendation 1: The University of Liverpool pass and implement a policy requiring all academic modules to have a presence on VITAL with agreed material available through it. This material should remain accessible for students throughout the course of their degree. At the most basic level this should include:

  i. A presence of all modules on VITAL
  ii. Module specification uploaded to VITAL
  iii. Lecture notes uploaded to VITAL
  iv. Reading lists uploaded to VITAL
  v. Past exam papers uploaded to VITAL where appropriate

It would have been useful at this point to have made an institutional evaluation of practice around the VLE. This would have provided benchmarking detail on the ways and the actual extent to which staff were using VITAL modules before the Baseline was introduced, and also where there appeared to be no or little engagement with this online space, why not, whether staff were using other tools and why. We might also have gained valuable information on interesting and scalable good practice and this kind of benchmarking is definitely something to recommend for future TEL initiatives at the institutional level. Unfortunately we do not have the resource to tackle this retrospectively although we are thinking about at least a sampling strategy to see what was happening in modules before the Baseline arrived. It might also be a question for our focus groups/interviews whether any staff felt that they had examples of their own good practice overlooked at the time which could have been part of the thinking about the Baseline mix.

In response to the Guild report, the VITAL Baseline was one of the first major steps for the University’s then-new (2013) Technology Enhanced Learning strategy, and probably the most visible so far. Interestingly, VITAL modules have had a default template since the introduction of the VLE ten years ago. This suggested a basic structure and content to include, not vastly dissimilar to the new standard, but the accompanying rationale was one of relative freedom in the way the VLE was used (within the limits of the systems), and the suggested structure was kept as open as possible to reflect that freedom. A policy focus on formalising a standard basic structure to be met in all modules, and communicating this expectation to the institution, was a positive, useful and timely process. One interesting piece of work here is to map the previous template and its rationale against the new Baseline standard and its rationale, examine the points of difference, and ask whether these signal a shift in the way the VLE is viewed institutionally, and further whether such a shift (if it exists) trickles down to staff and students or meets the pre-existing attitudes of both groups. The current University TEL strategy (2013) states:

“As we develop new and innovative approaches to structuring technology enhanced learning, it is clear that students would value consistent use of VITAL across all modules that they study and they need to feel that they are getting a similar experience to their friends studying on other programmes. This could be seen as part of the contract that the University is developing with current and potential students, describing minimum standards in the use of VITAL.”

I think we need to look carefully at what might be the underlying assumptions about what the space in a VLE means in both rationales. Is there a view that the VLE is a functional, administrative space in which student-defined expectations can and should be met, and is there a view that the VLE is just one tool with innovative or useful pedagogic potential among a range that teaching staff can deploy as they see fit? Are these views opposed, or even false extrapolations from the stated rationales? I would want to find out from staff whether they felt that a Baseline came down on one side or another, or if this simply isn’t an issue.

Is it the case that something like a Baseline was always waiting to burst out of the specific VLE system that we use, given the structures and activities it appears most obviously to support? That is, has it always encouraged a particular way of thinking about teaching with learning technology through its design and its toolset? For staff who have not previously engaged with the VLE, for whatever reasons, now that there is an expectation of a certain level of engagement, is there an underlying or implicit pedagogical model determined by a standard like this, one which fixes their view of how learning technologies can be designed into their learning and teaching practice? If there is an implied underlying model, does it in some subtle way set a course for the institution’s view of learning and teaching more generally? There have been attempts to frame such standards alternatively in customer satisfaction terms, but is this to undersell or miss the pedagogic implications, and can we find any evidence of this? We also need to look at the individual elements of the Baseline and what they represent, in part and in whole, in terms of a way of thinking and acting in the VLE space.

I’ve drifted waywardly, but I hope not uninterestingly, from the starting point of this post, and abandoned any metaphorical acrobatics, perhaps mercifully. I’ve tried to begin rummaging around the notion of “consistency”, given its prominence in the rationale underpinning the VITAL Baseline. What is this particular kind of consistency, and what is its value and desirability? Is it confined to comprising certain information components, or does it encourage a consistency of thinking about and designing learning with technologies, including the VLE? These questions are one small part of the evaluation work we hope to carry out, but in many ways they will be the most critical when considering what future shape any Baseline should take and when trying to unearth the full impact of this new approach to the VLE.

Dan

The VITAL Baseline

A standard for modules in VITAL

First introduced for the academic year 2014-15, the VITAL Baseline specifies some simple, key information and content that students most want to see in all VITAL modules and is aimed at improving the consistency of provision across modules when using VITAL.

It was developed in consultation with the Faculties, CSD, Library, the Guild and the eLearning Unit. It is based on the findings of the LGoS student survey and report ‘Making the most of IT’ and is a core element of the University’s Technology-Enhanced Learning (TEL) Strategy which has been developed by the University’s TEL Working Group (TELWG).

Many modules already exceed these requirements; the VITAL Baseline is not intended to replace or limit existing good practice or creative, innovative use of VITAL, and this should be borne in mind when applying it. It is anticipated that the Baseline will evolve over time to respond to feedback from staff and students, and also to include institutional strategic initiatives.

In summary the VITAL Baseline is currently:

  1. Module staff details: All staff teaching on the module should be listed in the Module Staff section, including name, contact email, office location and office hours (where appropriate); an image is recommended.
  2. Module Overview page: Every module includes by default a link to the new, automated Module Overview page. Module specifications need to be accurate, as the information for this page is taken from them.
  3. Reading Lists @ Liverpool link: Every module should include a reading list. Reading Lists @ Liverpool is a tool for creating online reading lists. A link can be made from each module menu to its Reading Lists @ Liverpool list.
  4. Learning Resources: Modules should include resources for lectures and teaching where appropriate and which exist in an electronic format, such as lecture PowerPoints, in a suitable, easily-navigable structure.
  5. Exam Resources: Modules should contain appropriate resources, preparation and advice for students on any exam element of the module. Every module has by default a section called ‘Exam Resources’ which can be used for this purpose and can include but is not restricted to: past exam papers, samples of MCQs, types of question that can be expected, sample answers, marking criteria. (It should NOT contain any exam timetabling information or other exam information that is held by Orbit).
  6. General coursework and exam feedback: An overall perspective of the cohort’s performance in exams and in coursework should be offered through the relevant VITAL module.

Your school or department may have already developed a standard module template that meets this Baseline and you should check this first.

Full, centrally-provided guidance on meeting the VITAL Baseline is available and includes:

  • A short ‘How To’ guide describing the Baseline and the simplest ways to meet it: ‘How to’ PDF guide to meeting the Baseline
  • A self-directed module, ‘VITAL Baseline and guidance’, on which all staff are enrolled in VITAL and which you will find in your list of modules. This is a more detailed, step-by-step guide to implementing the VITAL Baseline.

If you have any questions about the VITAL Baseline please contact:

CSD Helpdesk: for any VITAL technical problems – error messages etc.

eLearning Unit: for advice on implementing the Baseline in your modules and training opportunities.

There are also some key contacts in the faculties who can signpost the best places to get relevant support and answers to your queries. These are:

S&E – Nick Greeves

HSS – contact being organised

HLS – Peter Reed

The Americans are coming – again!

Circa 2000, I remember attending a conference on online distance learning at the University of Salford where the CEO of an American internet development company stood up (classic alpha-male stuff!) and shouted at all the academics in the audience, ‘the Americans are coming!’ Basically he was warning that US HEIs, with Californian venture-capital backing and the like, would soon take over global higher education as new learners flocked to cheaper, more flexible online degrees. ‘Change or die’ was the basic message! Over a decade later, not much has changed except the emergence of for-profit online distance teaching providers (our partnership with Laureate Online Education, for example), the development of open educational resources (MIT’s OpenCourseWare etc.) and the steady expansion of blended learning using technologies within residential universities. Arguably more of a steady evolution than the technology-led revolution predicted.

The recent press announcement of a new partnership between Harvard and MIT to form edX, an online education initiative, suggests the Americans might be at it again! The press conference on the website is full of revolutionary fervour! Is this nothing new, or potentially a subtle but important evolution in how we learn? My initial reaction was ‘here we go again!’ But watching the press release video (once you get through the hyperbole) and other media reports, possibly there is something new emerging here:

  • The first pilot module they ran attracted 120,000 registrations worldwide.
  • Free open courses – you can’t get a degree but you can get an assessed certificate.
  • They have created a clever online research environment to gather a lot of data about how people learn online. Essentially a large global experiment. (Learning analytics.)
  • They have created new, agile open-source software which can be developed further from this emerging pedagogical research. Interestingly, they are not using existing VLE or Web 2.0 technologies.
  • It builds on their experiences with open courseware, and particularly their students’ use of online resources such as the Khan Academy to supplement their campus-based learning.
  • A big driver is to support the development of campus-based learning – a sort of flipped-classroom model on a global scale! Online courses taught alongside campus-based classes.
  • Part altruistic strategy to spread learning globally, part marketing opportunity.
  • It potentially fills a gap between formal and informal online education – a clear benefit for non-formal, CPD-type professional education: people who already have a masters etc. but want to study specific topics within a large global community.
  • Employers and companies can be part of these learning communities and get to know students.

Other major US HEIs are developing similar initiatives – Coursera (Princeton, Stanford, Michigan & Pennsylvania) and Udacity (Stanford computer science), for example. Coursera looks like an interesting online learning model with a well-defined pedagogical research foundation. I’ve signed up for a course on gamification over the summer, so I’ll let you know more about this online pedagogy!

Nick Bunyan