Report: e-Learning Network Meeting – January 2017

We were delighted to welcome Professor Helen O’Sullivan, APVC Online Learning, as speaker at the first e-Learning Network meeting of 2017. Helen spoke to us about the University’s new Education Strategy, giving the network an overview of the structures, leadership teams and immediate priorities. The recording of that talk is linked to below. Helen then led a discussion workshop on what an institutional Digital Education vision might look like (this part of the session was not recorded). We also managed to make time for a couple of extra items: a first look at Turnitin Feedback Studio, the new design for GradeMark that we will be moving to in July; and an introduction to the Go Mobile user group that began meeting this academic year. A pretty busy lunchtime for the forty staff members who came together for this valued networking event.

Professor Helen O’Sullivan

With so much going on at the University at the moment, it was a welcome opportunity to spend some time thinking through and discussing how current strategies relate to our own area of interest, and Helen did a great job of this, even in the sweltering conditions of our meeting room. The Education Strategy’s core values, the ‘Liverpool Hallmarks’ of ‘research-connected teaching, active learning and authentic assessment’, are immediately appealing to anyone interested in learning and teaching, and learning technologies can play a critical role in all three. I won’t go into micro-detail, but what I found really useful was an update on the top priorities for the coming year, including the setting up of a new Programmes Development Team, a media technical support team and continued work on the Electronic Management of Assessment project, as well as hearing about less familiar things, including the focus on the London Campus portfolio and degree apprenticeships. Click the image below for the (Stream Capture) recording, which is about 32 minutes long.

Click the image above to watch the recorded talk by Prof O’Sullivan (32 minutes)


We then moved into group discussions to consider a Liverpool take on David White’s digital leadership framework, which is designed to support high-level discussion and decision-making about all things digital, giving some coherence to thinking about the whole organisation and how decisions can affect all of its layers. The framework diagram below is taken from David’s blog post (click the diagram to read it) and was the starting point for the activity. My group focussed quite a bit on the Digital Service layer, which possibly reflected the areas we work in, but which we felt was the bedrock of an organisation’s culture and medium.

Turnitin Feedback Studio – Dan Roberts

There was also a bit of time for a couple of extra items. First up was a look at the new design for Turnitin GradeMark, called Turnitin Feedback Studio. This was an out-of-the-box walkthrough and we were only examining the feedback environment. Essentially the desktop version has been rebuilt; the design is very similar to the current iPad app version, but now you will be able to use it on any device. The video below maps the key differences between our current version of GradeMark and what we will see after this summer’s upgrade. You can also try out a live, online demo if you follow this link.

No horses seemed to be startled by this new look. From a design point of view I think it is a much-improved, cleaner system, tidying away a lot of the distracting array of menus and buttons we are used to and instead putting the most commonly used feedback tools directly in front of you whilst marking work; no more hunting around for different comment types, for instance. The rebuild has also focussed on making GradeMark fully accessible, which is great. When we asked what kinds of things people would be interested to test in the lead-in to the summer upgrade, long-standing functionality/workflow requirements such as double marking were top of the list. Looking through the release notes whilst writing this post, I can see that there is a beta version of the multiple-markers facility for which Turnitin are looking for testers, so we will organise this through the network and the e-submission/EMA project board. Get in touch directly if you want to be a part of this testing.

Go Mobile User Group – Alex Spiers

We rounded off by chatting about the new user group Alex has set up for anyone interested in anything mobile, which has already met a couple of times this academic year. It is as wide-ranging as that sounds, so we’ve looked at apps, devices like the iPad Pro and pen, and the kinds of things staff and students from all parts of the University are doing with mobile technologies for learning and teaching. Look out for the next meeting, which we hope will be this side of Easter; we’ll release details as soon as possible, or keep up with #LIVUNIGO.

Next meeting – April 27th

Many thanks to Helen for the valuable and engaging insight into the strategic thinking and work going on for the University’s Education Strategy as it moves into its implementation phase, and the role that Technology-Enhanced Learning has to play in it. It was also a great opportunity to have a first say on some emergent ideas around a Digital Vision for the University of Liverpool. This is an ongoing process and Helen would welcome further comments and feedback on anything covered in the presentation or discussion.

The next e-Learning Network meeting is scheduled for Thursday 27th April, 12:30 – 2pm. The network lunch is intended primarily as a sharing event, so if you have an idea for one of our meetings, or anything you have been doing with TEL that you would like to share and get some feedback and discussion on from the group, then please let us know.


Winter School 2017 – Week 2 Workshops – 16th – 20th January

The Winter School began today with a great session on engaging students visually in lectures, with loads of ideas for presentation materials. There are a number of other sessions in the first week, which you can read about here, and you can still book the last few places here.

This year we are also running a second week of sessions (Monday 16th to Friday 20th January) looking at technologies for learning and teaching beyond VITAL, a number of which are designed to tie in with the online Bring Your Own Device for Learning 2017 events. All sessions are listed below and bookable at:

  • Monday 16th Jan – An introduction to Twitter (2 – 4pm)
  • Tuesday 17th Jan – GoMobile user group meeting (1 – 3pm)
  • Wednesday 18th Jan – e-submission and e-marking workshops
    • An Introduction To Electronic Submission Of Coursework (9:30 – 11am)
    • The Turnitin Assignment Tool For e-submission (Part 1) And GradeMark For Feedback (Part 2) (12 – 2pm)
    • The Blackboard Assignment Tool For e-submission (Part 1) And Feedback (Part 2) (2:30 – 4pm)
  • Thursday 19th Jan – Students as co-creators (11am – 1pm)
  • Friday 20th Jan – Advanced Twitter (2 – 4pm)

Finally, after these two weeks there are a couple of events which might interest you.

  • Wednesday 25th Jan – Building good VITAL modules – a practical session looking at ways of building on the VITAL Baseline to create well-designed modules.
  • Thursday 26th Jan – eLearning Network meeting. In this meeting we will be getting a first look at the new Turnitin Feedback Studio.

All sessions bookable at:


VLE minimum standards evaluation – looking around

This second post to stem from our VITAL Baseline evaluation work this year tells some of the story of looking outwards to the sector for policy, seeking to understand best practice around minimum standards for VLEs.

As part of our original work developing the VITAL Baseline, the working group thought it would be useful to see what, if anything, was happening with VLE minimum standards across the HE sector at that point in time. With a pressing timescale and a clear directive that a) we had to have a standard and b) what it should consist of this time round, the most we could do was look at who had some kind of minimum standard or expectation for their VLE and what was in it. This gave us some simple benchmarks and reassured us that we hadn’t missed anything we could include whilst we had the opportunity to shape our own set of standards at the development stage. As a low-cost (for us) way of gathering some rough-and-ready data, a call went out to the ALT mailing list asking people to contribute to a Google Sheet, saying whether their institution had any kind of minimum standard or expectation, or was looking to develop one, and what was in it. As a group we didn’t engage with this work at any further level (although one of us, Peter Reed, wrote up and presented some of his own analysis of the original data at a few conferences and user groups, an entry point to which you can find here). I had always hoped to return to this work as part of any evaluation, with some questions I had about the sector-wide landscape, and hoped that what I learnt from the exercise would also inform any recommendations we would make for future development of the standard. The main areas I now want to examine are:

  • We should re-examine what we can understand from the original data gathered. I was anxious about how crowdsourced data stands up and what the implications of this approach were for an evaluation, so I would like to review this methodology in some detail as well.
  • I wondered what results we would get if we more closely defined the group(s) of institutions against which we wanted to benchmark and actively approached these. I felt at the time that the original data might encourage a view that minimum standards policies were a standard feature of the topography of Technology Enhanced Learning strategies and policies in HEIs, and I wanted to test this.
  • As well as asking who has a standard and what is in it, the more interesting questions would include why one exists for that institution, how its rationale is contextualised (e.g. is it linked to a learning and teaching strategy?), what kinds of processes were involved in its development, whether it is compulsory, and whether any evaluation has taken place.
  • For those institutions that don’t have one, what, if any, is the story here?
  • Beyond approaching institutions, we would look at any literature, conference proceedings, blog posts and mailing list discussions around minimum standards as part of the evaluation work more generally, but specifically for this information about why institutions do or do not have a VLE baseline.

In picking this work up again for our Baseline evaluation project, I thought an obvious, although simplistic, initial sampling strategy would be to survey the rest of the Russell Group, which would make for a manageable, quick desk-based web search. I looked at what could be discovered from the websites of the Russell Group institutions for any indication of an institutional VLE standard in place, and what further detail was available publicly. A first-glance review found that in December 2015:

  • Six definitely have an institutional standard with compulsory/required/expected elements.
  • Three offer institutionally recommended good practice guidance.
  • Fourteen seemed to have no institutional standard requirement or expectation.
  • For one institution I couldn’t find any detail one way or another.

I’ve collated a list with details of what each standard or recommendation above consists of, but one immediate thought on the above is that there might well be standards at faculty or school/departmental level where there is no institutional requirement. It would be useful to uncover this information, as my own feeling is that handing standards over to more local levels is probably the way to move in the future; we’ll discuss this in a later post on any recommendations that emerge from our evaluation work.

What we would want to do next is contact institutions directly to try to confirm whether what we have found accurately reflects their situation, to get a little more detail on the whys and hows listed above, and to find out whether there are more local standards that aren’t found on central websites (we’ll also first need to look into ethical approval around this if we intend to publish our results). I’ll make the list available when it is as complete as possible.

Another question this simple snapshot of Russell Group institutions raises for me is whether Liverpool is leading in this area, as we appear to be in the minority here, or alternatively whether we were late arrivals to the debate around minimum standards and have taken a different direction to the majority. Obviously the snapshot view is unconfirmed, and looking at a bigger group of institutions would give better data, but as I began to discuss in the first Baseline evaluation post, we want to look at the evidence that informed the decision to make this one of the first initiatives in our institutional TEL strategy, to assess its strengths and its limits. Does this snapshot tell a contrasting story to that of other sources of evidence used to develop the TEL strategy and the Baseline originally? Are there any similarities between Liverpool and the other Russell Group institutions that have a standard in place? I’d be interested to discover the extent to which minimum standards policies feature in Technology Enhanced Learning strategies and policies in HEIs when measured in a larger sample group. Thinking further about the sample group, I wondered whether we should follow this 2014 UCISA report and look at all pre-92 institutions for a greater mix of institution types and, with time, post-92 institutions too. More work, but valuable for developing our understanding.

Just as the original Baseline development represented an opportunity to look at internal evidence of the ways in which the VLE is being used within the institution, this outward-looking work offered the opportunity to richly inform our thinking and strategic approach by looking to the sector and the experience and evidence there. We want to assess the extent to which these sources of evidence were exploited in the original development, and how they could be in the future, as part of our recommendations for any future institutional TEL initiatives.

As an almost tangential sign-off: when thinking about what I’d write for this post, I realised that I have been using the terms ‘benchmark’ and ‘benchmarking’ unthinkingly. Then I panicked: what are they, and why would you want to do this? The panic didn’t last long, as I found this useful-looking JISC resource, ‘What is benchmarking?’. If anyone knows more about this guide we’d appreciate your thoughts and advice, but I think I am going to use it for writing up this sector scene-setting aspect of our evaluation work.


VLE minimum standards & consistencies – introducing the VITAL Baseline evaluation

This is the first in a series of posts on the evaluation work the eLearning Unit is conducting this academic year, investigating the institutional impact of our VLE module standard, the VITAL Baseline, amongst staff and students. We’ll begin by introducing some of the background and some thinking about what we want to evaluate. As Christmas is close at the time of writing, I’m unashamedly going to try to stir up a mildly seasonal metaphor – cake mixtures and consistencies – which might rise and fall over the course of the post. This is not entirely awkwardly folded in, because a key idea that crops up repeatedly is the notion of consistency of provision. The eLearning Unit played a significant role in the development, implementation and promotion (weighing, mixing and baking?) of the Baseline, and so, one year into its full roll-out, we are keen to find out how it has been received.

The VITAL Baseline is the University of Liverpool’s standard for all modules in our VLE (VITAL), which we launched for the academic year 2014-15. The VITAL Baseline specifies six key areas of content and information that students would most like to see in all modules. You can read about it in more detail here, get a ‘How to’ guide here, and watch our students describe what it is and why they value it here. The initial impetus came from the student body and concerns that there was uneven use of the VLE across the institution, so that some modules were highly developed whilst others might have very little or nothing in them. The Guild of Students’ report ‘Make the Most of I.T.’ (2013) made the following recommendation, based on survey and focus group data:

Policy Recommendation 1: The University of Liverpool pass and implement a policy requiring all academic modules to have a presence on VITAL with agreed material available through it. This material should remain accessible for students throughout the course of their degree. At the most basic level this should include:
  i. A presence of all modules on VITAL
  ii. Module specification uploaded to VITAL
  iii. Lecture notes uploaded to VITAL
  iv. Reading lists uploaded to VITAL
  v. Past exam papers uploaded to VITAL where appropriate

It would have been useful at this point to have made an institutional evaluation of practice around the VLE. This would have provided benchmarking detail on the ways, and the actual extent to which, staff were using VITAL modules before the Baseline was introduced, and also, where there appeared to be little or no engagement with this online space, why not, whether staff were using other tools, and why. We might also have gained valuable information on interesting and scalable good practice, and this kind of benchmarking is definitely something to recommend for future TEL initiatives at the institutional level. Unfortunately we do not have the resource to tackle this retrospectively, although we are thinking about at least a sampling strategy to see what was happening in modules before the Baseline arrived. It might also be a question for our focus groups/interviews whether any staff felt that examples of their own good practice were overlooked at the time, which could have been part of the thinking about the Baseline mix.

In response to the Guild report, the VITAL Baseline was one of the first major steps for the University’s then-new (2013) Technology Enhanced Learning strategy, and probably the most visible so far. Interestingly, VITAL modules have had a default template since the introduction of the VLE ten years ago, suggesting a basic structure and content to include that is not vastly dissimilar to the new standard. The accompanying rationale was a relative freedom in the way the VLE could be used (within the limits of the systems), and the suggested structure was as open as possible to reflect that freedom. A policy focus on formalising a standard basic structure to be met in all modules, and communicating this expectation to the institution, was a positive, useful and timely process. One interesting piece of work here is to map the previous template and its rationale against the new Baseline standard and its rationale, examine the points of difference, and ask whether these signal a shift in the ways that the VLE is viewed institutionally, and further, whether such a shift (if it exists) trickles down to staff and students or meets the pre-existing attitudes of both groups. The current University TEL strategy (2013) states:

“As we develop new and innovative approaches to structuring technology enhanced learning, it is clear that students would value consistent use of VITAL across all modules that they study and they need to feel that they are getting a similar experience to their friends studying on other programmes. This could be seen as part of the contract that the University is developing with current and potential students, describing minimum standards in the use of VITAL.”

I think we need to look carefully at the underlying assumptions about what the space in a VLE means in both rationales. Is there a view of the VLE as a functional, administrative space in which student-defined expectations can and should be met, and is there a view of the VLE as just one tool, with innovative or useful pedagogic potential, from a range which teaching staff can deploy as suits their requirements? Are these views opposed, or even false extrapolations from the stated rationales? I would want to find out from staff whether they felt that a Baseline came down on one side or the other, or if this simply isn’t an issue. Is it the case that something like a Baseline was always waiting to burst out of the specific VLE system that we use, given the structures and activities it appears most obviously to support? That is, has it always encouraged a particular way of thinking about teaching with learning technology through its design and its toolset? For staff who have not previously engaged with the VLE, for whatever reasons, now that there is an expectation to fulfil a certain level of engagement with it, is there an underlying or implicit pedagogical model determined by a standard like this which fixes their view of how learning technologies can be designed into their learning and teaching practice? If there is an implied underlying model, does it in some subtle way set a course for the institution’s view of learning and teaching more generally? There have been attempts to frame such standards alternatively in customer-satisfaction terms, but does this undersell or miss the pedagogic implications, and can we find any evidence of this? We also need to look at the individual elements of the Baseline and what they represent, in part and in whole, in terms of a way of thinking and acting in the VLE space.

I’ve drifted waywardly, though I hope not uninterestingly, from the starting point of this post, and abandoned any metaphorical acrobatics, perhaps mercifully. I’ve tried to begin rummaging around the notion of “consistency”, given its prominence in the rationale underpinning the VITAL Baseline. What is this particular kind of consistency, and what is its value and desirability? Is it constrained to comprising certain information components, or is it encouraging a consistency of thinking about and designing learning with technologies, including the VLE? These questions are one small part of the evaluation work we hope to carry out, but in many ways they will be the most critical when considering what future shape any Baseline should take, and for unearthing the full impact of this new approach to the VLE.


The VITAL Baseline

A standard for modules in VITAL

First introduced for the academic year 2014-15, the VITAL Baseline specifies some simple, key information and content that students most want to see in all VITAL modules and is aimed at improving the consistency of provision across modules when using VITAL.

It was developed in consultation with the Faculties, CSD, Library, the Guild and the eLearning Unit. It is based on the findings of the LGoS student survey and report ‘Making the most of IT’ and is a core element of the University’s Technology-Enhanced Learning (TEL) Strategy which has been developed by the University’s TEL Working Group (TELWG).

Many modules already exceed these requirements, and the VITAL Baseline is not intended to replace or limit existing good practice and creative, innovative use of VITAL; this should be considered when applying the Baseline. It is anticipated that the Baseline will evolve over time to respond to feedback from staff and students, and also to include institutional strategic initiatives.

In summary the VITAL Baseline is currently:

  1. Module staff details: All staff teaching on the module should be listed in the Module Staff section, including name, contact email, office location and office hours (where appropriate); an image is recommended.
  2. Module Overview page: Every module includes by default a link to the new, automated Module Overview page. Module specifications need to be accurate as information is taken from here for this page.
  3. Reading Lists @ Liverpool link: Every module should include a reading list. Reading Lists @ Liverpool is a tool for creating online reading lists. A link can be made from each module menu to its Reading Lists @ Liverpool list.
  4. Learning Resources: Modules should include resources for lectures and teaching where appropriate and which exist in an electronic format, such as lecture PowerPoints, in a suitable, easily-navigable structure.
  5. Exam Resources: Modules should contain appropriate resources, preparation and advice for students on any exam element of the module. Every module has by default a section called ‘Exam Resources’ which can be used for this purpose and can include but is not restricted to: past exam papers, samples of MCQs, types of question that can be expected, sample answers, marking criteria. (It should NOT contain any exam timetabling information or other exam information that is held by Orbit).
  6. General coursework and exam feedback: An overall perspective of the cohort’s performance in exams and in coursework should be offered through the relevant VITAL module.

Your school or department may have already developed a standard module template that meets this Baseline and you should check this first.

Full, centrally-provided guidance on meeting the VITAL Baseline is available and includes:

  • This short ‘How to’ guide describes the Baseline and the simplest ways to meet it: ‘How to’ PDF guide to meeting the Baseline
  • A self-directed module, ‘VITAL Baseline and guidance’, on which all staff are enrolled in VITAL and which you will find in your list of modules. This is a more detailed, step-by-step guide to implementing the VITAL Baseline.

If you have any questions about the VITAL Baseline please contact:

CSD Helpdesk: for any VITAL technical problems – error messages etc.

eLearning Unit: for advice on implementing the Baseline in your modules and training opportunities.

There are also some key contacts in the faculties who can signpost the best places to get relevant support and answers to your queries. These are:

S&E – Nick Greeves

HSS – contact being organised

HLS – Peter Reed

The Americans are coming – again!

Circa 2000, I remember attending a conference on online distance learning at the University of Salford where the CEO of an American internet development company stood up (classic alpha-male stuff!) and shouted at all the academics in the audience, ‘The Americans are coming!’ Basically he was warning that US HEIs, with Californian venture capital backing and the like, would soon be taking over global higher education as new learners flocked to take cheaper, more flexible online degrees. ‘Change or die’ was the basic message! Over a decade later, not much has changed except the emergence of for-profit online distance teaching providers (our partnership with Laureate Online Education, for example), the development of open educational resources (MIT’s OpenCourseWare etc.) and the steady expansion of blended learning using technologies within residential universities. Arguably more of a steady evolution than the technology-led revolution predicted.


The recent press announcement of a new partnership between Harvard and MIT to form edX online education suggests the Americans might be at it again! The press conference on the website is full of revolutionary fervour! Is this nothing new, or potentially a subtle but important evolution in how we learn? My initial reaction was ‘here we go again!’, but watching the press release video (once you get through the hyperbole) and other media reports, possibly there is something new emerging here:

  • The first pilot module they ran attracted 120,000 registrations worldwide.
  • Free open courses – you can’t get a degree, but you can get an assessed certificate.
  • They have created a clever online research environment to gather a lot of data about how people learn online – essentially a large global experiment (learning analytics).
  • They have created new, agile open-source software which can be developed from this emerging pedagogical research. It is interesting that they are not using existing VLE or Web 2.0 technologies.
  • It builds on their experiences with open courseware, and particularly their students’ use of online resources such as the Khan Academy to supplement their campus-based learning.
  • A big driver is to support the development of campus-based learning – a sort of flipped-classroom model on a global scale! Online cases taught alongside campus-based classes.
  • It is part altruistic strategy to spread learning globally and part marketing opportunity.
  • It potentially fills a gap between formal and informal online education – a clear benefit for non-formal, CPD-type professional education: people who already have masters degrees etc. but want to study specific topics within a large global community.
  • Employers and companies can be part of these learning communities and get to know students.


Other major US HEIs are developing similar initiatives – Coursera (Princeton, Stanford, Michigan & Pennsylvania) and Udacity (Stanford computer science), for example. Coursera looks an interesting online learning model with a well-defined pedagogical research foundation. I’ve signed up for a course on gamification over the summer, so I’ll let you know more about this online pedagogy!

Nick Bunyan

Report on the Heads of e-Learning Forum meeting 7th March 2012

I attended the HeLF (Heads of eLearning Forum) meeting on 7th March at Glasgow Caledonian University (who came to the rescue and provided a venue after a fire at the original location of University of Strathclyde meant we couldn’t meet there). HeLF is a national group with representation from 120 Higher Education institutions.

The focus of the meeting was on ‘Driving External Change’ and contributed to HeLF’s theme for 2011/12, ‘Leading our institutions through change: change in external and internal environments’. How do we work with students and get their involvement?

The meeting began with a presentation by Professor Philippa Levy, Deputy Chief Executive (Academic) of the Higher Education Academy. Prof Levy described the HEA’s new Strategic Plan, how they are focusing on putting students at the centre as ‘producers’ rather than consumers, and how students are becoming agents of change. They also recognise that there are ‘e-challenges’ with the role of digital technologies, e.g. how to engage more staff in digital activities, that students may have differential access to technology, and how to encourage a culture of shared learning design and content. The HEA’s terminology for e-learning is now shifting to ‘flexible learning’, with a recognition that technology and e-learning are (or should be) embedded in the L&T experience rather than something separate.

The second presentation was from Paul Bailey, programme manager in the JISC e-Learning team. Paul gave an overview of how JISC will change after the Wilson review to become a ‘company limited by guarantee’ from 1st August 2012. JISC are working on a new strategy, but the five strategic objectives are likely to be the same as before. Paul described how JISC are also focusing on students as change agents, similarly to the HEA.

David Beards from the Scottish Funding Council talked about their approach to e-learning and how the changes to JISC will affect them.

Andy Ramsden, who is on the Steering Group of MELSIG (the Media Enhanced Learning Special Interest Group), asked us to think about how MELSIG could be useful in our own institution. This is an excellent SIG (Dan Roberts blogged about the MELSIG event he recently attended) and I think the general feeling around the group was that it is worth continuing.

Dr Neil Ringan from MMU gave an overview of an e-reader pilot project that started in September 2011 in the Department of English. Thirty-five members of staff were given a Kindle e-reader (including the current Poet Laureate), though some members of staff already owned an iPad, so it was possible to compare to some extent what staff thought of each. The aim of the pilot was to look at how the e-reader could be used

  • to support professional practice as producers and consumers of creative materials (using the e-reader as an e-reader) and
  • to support academic practice, particularly in relation to assessment and feedback (using the e-reader for reading & annotating assignments).

Positive aspects of the Kindle were its size, quality and battery life, that it is easy to store lots of text, its accessibility, the price, and quick access to the book store. Negative aspects were that PDF documents were not intuitive, that creating e-books is clunky, that the annotation tools are too slow and primitive, the proprietary Amazon e-book format lock-in, the limited internet capability, and the speed and refresh rate.

Positive aspects of the iPad were the ease of importing PDF documents, PDF reading and annotation, full internet access, that it is a more viable netbook replacement, and the battery life. The negative aspects were the size and weight (in comparison to the Kindle), the Apple and iTunes proprietary issues, the price, and that there was no physical transfer.

In summary:

  • The Kindle was considered to be an excellent e-reader, but not much more.
  • The iPad is a good e-reader (not as good as the Kindle), but it is also a lot more besides.
  • Staff were enthusiastic about using the Kindle as a reading tool, but there was no enthusiasm for using it for assessment and feedback purposes.
  • The view across the department is that tablets, rather than netbooks, are the best way forward. Staff wanted the pros of the iPad but cheaper, and the pros of the Kindle but with more flexibility in what it can do and greater ease of use (this evaluation pre-dates Apple’s latest developments with iBooks). It may be worth looking at the Kindle Fire.

The rest of the meeting covered more general HeLF business. These are useful meetings as it is important to find out how colleagues in other institutions are addressing similar issues.

Debbie Prescott