Reflections on the 'Using tablets in FE and HE assessment' event

This blog post is an example of how a blog can be used for reflective practice. I am going to reflect on a training session that I recently ran, and use this reflection to help me improve the session for the future and to help with my planning of other similar sessions.


On Friday 28th March 2014 I ran the first FE/HE session for The Tablet Academy at Loughborough University. The session was designed to look at the use of tablet devices in assessment, and attracted 8 people from across the country.

There are lots of different people, companies and organisations offering tablet training at the moment, of varying quality and varying price (with no direct correlation between the two), so I was keen to offer something different: something more than the very easy "look at me and how clever I am with an iPad" type of session, which can be inspiring but often doesn't give people a chance to unpick the bigger issues.

Therefore the main focus of the day was to get the attendees to think openly about the use of tablet devices in assessment, including the issues that may arise, and not just the positives. We also made the decision to focus on BYOD (Bring Your Own Device) rather than specifying a particular platform (e.g. iPads).

The structure of the day was:

  1. Introduction presentation.
  2. Exploring options of how tablets can be used in education.
  3. Designing, completing, then assessing an assessed activity with tablets.
  4. Bringing it all together.

Here is my reflection on each part:

Introduction presentation

I created a short presentation designed to set the scene, and to establish the mindset of not using tablets to replace laptops, but of looking at what tablets can do that laptops cannot, and how this can be used beneficially in assessment. On reflection, I should have spent less time on this point and more time demonstrating what is possible with tablets. I didn't want to spoil the next activity by giving the attendees the answers to the tasks I was about to set them, but some of the attendees would have benefited from such examples to make the rest of the day less abstract.

Exploring options of how tablets can be used in education

We had allocated a significant amount of time (over 2 hours) to this part of the day, which was an opportunity for attendees to explore different options for how tablet devices can be used in assessment. Because each person would have different organisational needs, roles and devices, I created 7 separate tasks (challenges) for them to look at. Each task had some background information, then a set of questions/activities to work through, which would hopefully guide them through an exploration of that topic, with me facilitating them to unpick some of the issues. I estimated each topic would take about 30 minutes, so I hoped that people could explore 4 or 5 of the options in the allotted time. As it turned out, the tasks took longer than I anticipated, so attendees only managed to cover 2 of them, which was a shame. For future events, there are various options:

  1. I could shorten each topic, but then have a section for each topic titled "further exploration" or similar – so attendees could continue exploring at a later date.
  2. I could have sent attendees some pre-course information so they could look at the options and possibly start some of the tasks (even if they only downloaded any required apps and created accounts where required – which would have saved time).
  3. I could have been more draconian with the timekeeping, forcing them to change topics if they spent more than, say, 40 minutes on any one topic. I am not a big fan of this idea, as I need attendees to be comfortable with their explorations, and it is more beneficial for them to unpick a smaller number of options well rather than more options badly.
  4. I could increase the length of the introduction to include a quick demo of each idea, which would save attendees time when exploring the options.

Having jotted down these possible options, I think points 1, 2 and 4 above could be used together to improve this part of the session, and that is what I will do next time.

Designing, completing, then assessing an assessed activity with tablets

This part of the day worked really well. Working in pairs, each pair had 30 minutes to design a tablet-enabled assessable activity. They then shared this with a different pair. Each pair then had 30 minutes to complete the task set by someone else, after which they returned the work to the original pair, who had a further 30 minutes to assess it and give feedback.
For this activity I was very strict with the timings, using a countdown timer on an iPad to keep me and the group focused. Setting a very specific and challenging time limit helped to keep people on task, and stopped the afternoon from 'drifting'. Attendees very much got into the spirit of this activity: they had a lot of fun (which is good), they were imaginative (which was the intention), and they uncovered a few problems with the logistics of actually getting the task to the other people. The main reflection (which was also echoed in the participants' feedback) was that they didn't need 30 minutes to assess the work, so this could easily be shortened to 15 or even 10 minutes; apart from that I would keep this part of the day the same.

Bringing it all together

As with any good training session, it is important that there is a chance to reflect and regroup at the end, and some form of identified action for people to take next. For this I asked people to work through the 5 Ws – Why, What, When, How, and Who – to identify a small step that they were going to take to move their organisation forward in using tablet devices in the process of assessment. After the slightly pressurised previous 90 minutes racing against the clock, this made for a useful reflecting and refocusing activity. We then had some general discussions and used Socrative as a tool to reflect on the day in general.

Looking at the feedback provided by the attendees, all bar one were very satisfied with the day, with some highlighting a few of the points I have made above. A few said they would have preferred the day to be platform-specific (e.g. just iPad, or just Microsoft) rather than BYOD, and a few wanted to see more examples of good practice.

From my perspective, I wanted the day to be a chance for people to unpick the real issues around using tablets for assessment, so I knew that people would encounter certain problems during the day. This was good, as it is better for them to encounter those problems here rather than with real assessments; however, these few problems may have been perceived as outweighing the benefits of using tablet devices, which wasn't my intention. The biggest problem people encountered was the BYOD issue of finding ideas that work across all platforms. Many thought Google Docs would be a good option, but discovered that they didn't work well on iPads and that sharing them was more complicated than expected. I tried to reinforce the point that in reality you use tablet devices where they are most appropriate and use something else where they are not; e.g. it is perfectly acceptable for a member of staff to use a computer to create the assessment, the student to use a tablet to complete it, and the tutor to use a computer to mark it.

My thanks go to the staff at Loughborough University for their support, in particular Charles Shields and Farzana Khandia.

 


Using Google Apps to create a fast feedback tracking system

I have been very lucky to work with Loughborough College in recent months on an LSIS-funded LIT project, which looked at using the functionality of Google Apps (which the college uses) to create a mechanism to speed up the feedback process for students once they have submitted their BTEC assessments.

In order to achieve this, the services of Martin Hawksey were enlisted as the mentor, as his knowledge and skill in using Google Apps Script were going to be required. The project was originally led by Patrick Lander, but he left the college to return to New Zealand, so after he had done all the hard work I was able to take over, finish things off, and get the pleasure of seeing the project through to fruition.

The project is currently in the testing stage, to check that it works fully before being rolled out on a wide scale (which will now have to be next academic year), but the early indicators are good.

The project has been a very interesting one, not just from the technical side of things, but also in recognising that teaching staff work in very different ways, and for this to work we had to accommodate these different ways of working – which proved a huge challenge, but one which I think we have overcome. The main issue to overcome was to reduce the need for the tutor to record information in 2 separate locations (e.g. on the feedback sheet for the learner, and in a central recording and tracking location). The starting point was the idea of using a spreadsheet grid which then effectively mail merged the information into feedback sheets that the learners received – however some staff don't like filling out information in grids, and you are very limited with the formatting options for the feedback. So the solution is (in over-simplistic terms): information can be entered into a spreadsheet grid if the tutor wants to, or there is a more user-friendly interface which then fills in the grid for the tutor. They can then produce feedback sheets for their students, to which they can individually add formatting or additional information, and once done they can 'release' the feedback, which puts a copy into the student's area and sends them an email notification. Although this may not sound it, when in action it speeds the process up significantly, and because the actual grade data isn't being entered twice there is less chance of data copying errors.
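
To make the 'release' step less abstract, here is a minimal Google Apps Script sketch of the kind of thing the system does at that point: copying a feedback document into the student's area and emailing them a notification. This is purely an illustration of the approach, not the project's actual code; the sheet name, column layout and folder IDs below are hypothetical, and the real system (documented in Martin Hawksey's blog post linked below) does considerably more.

  /**
   * Hypothetical sketch only – not the project's actual code.
   * Assumes a sheet named 'Tracking' with columns:
   * A = student email, B = feedback document ID, C = student folder ID, D = released flag.
   */
  function releaseFeedback() {
    var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Tracking');
    var rows = sheet.getDataRange().getValues();

    for (var i = 1; i < rows.length; i++) {                  // skip the header row
      var email = rows[i][0];
      var docId = rows[i][1];
      var folderId = rows[i][2];
      var released = rows[i][3];
      if (released === 'Yes' || !email || !docId) continue;  // already released, or incomplete row

      // Put a copy of the feedback into the student's area and share it with them
      var feedbackFile = DriveApp.getFileById(docId);
      var studentFolder = DriveApp.getFolderById(folderId);
      var copy = feedbackFile.makeCopy(feedbackFile.getName() + ' (released)', studentFolder);
      copy.addViewer(email);

      // Notify the student by email
      MailApp.sendEmail(email, 'Your assessment feedback is ready',
          'Your feedback has been released: ' + copy.getUrl());

      // Mark the row as released so the grade data never has to be entered twice
      sheet.getRange(i + 1, 4).setValue('Yes');
    }
  }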

All of the outputs from the project have been released to the wider community, in the hope that other people will see the benefits and develop this further. Full details of how to do this can be found in Martin Hawksey's blog post at http://mashe.hawksey.info/2012/05/hacking-stuff-together-with-google-spreadsheets-fast-tracking-student-feedback-system/?utm_source=dlvr.it&utm_medium=twitter&utm_campaign=jisccetis. Personally I think an organisation would need to have implemented Google Apps for this to work (it could work without it, but I think it would be very difficult, as each student would need a Google account and the tutor would need to know their email address).

As the project goes through testing, further updates will be made to the project blog at http://thehub.loucoll.ac.uk/elearning/category/lsis-lit-project/

Devolving the empire of the ‘learning technologist’

Most colleges and universities have some sort of centralised area of expertise regarding the use of learning technology. For some this will be one person, covering it alongside many other roles that they fulfil; at bigger organisations it will be a team of individuals dedicated to this area of work.

One problem with this model is that an unhealthy over-reliance on those individuals can develop, so if they leave or are off work for a period of time, things stop happening. Also, the gap between the abilities of the learning technologists and the average lecturing staff is widening, due to the speed of change in this area of work, and only the technologists have the capacity to keep up.

I was lucky enough over the last year to work on an LSIS-funded project with Loughborough College looking at a different model, where the expertise for certain technologies is devolved to teaching staff. This has the effect of building confidence, spreading the expertise, and putting it into the teaching staff room.

The experiment involved the 'learning technologists' within the sports team mentoring lecturing staff as they explored and used a chosen technology, with the idea that they became the 'expert' in that technology and the first port of call for other staff wanting help. The outputs from this project can be found at http://thehub.loucoll.ac.uk/lsistechnologyproject/ and include at least 2 screencasts for each technology, plus a simple user guide for that technology.

The next question is: was the project a success? The short-term indicators showed that staff were more confident as a result, there was a noticeable knowledge shift away from the learning technologists towards the teaching staff (which was positive), and staff were certainly using a wider range of technologies and using them better. The real test, though, will be in another 12-18 months' time: to see if the teaching staff who became these experts are still 'experts' and are still influencing others around them.

From what I have seen, having worked through the process with the college, I think there is definitely merit in this approach. It is challenging for many, as it does involve the technologists letting go slightly, and some may see this as a threat to their future employment. I see it as an opportunity: I think there will always be a need for the technologist, it is just that their role may change to include more mentoring and facilitating rather than just doing. This model would also require careful strategic planning and direction, and in the short term it would take longer than sticking with the current model, which is a real threat to its adoption elsewhere; but if people can see the benefits of this model, then it certainly has potential.