Do we need a term for ‘Learning Technology’?

Over the years, many terms have been used to describe the use of technology within education. If I go back to my early teaching days in the late 1990s, schools referred to it as ‘ICT’, colleges called it ‘ILT’ and universities called it ‘eLearning’. Since then we have also used (amongst others) the terms ‘Learning Technology’ and ‘Technology Enhanced Learning’.

The terms ICT, ILT, eLearning and Learning Technology make no quality judgement about the use of technology (i.e. the use could be good, bad or indifferent), whereas the term Technology Enhanced Learning does make a judgement (i.e. it refers only to the uses that actually enhance the learning) – and this was discussed in my last blog post – Does Technology Enhanced Learning (TEL) actually enhance learning?

So if using the term TEL has, in essence, filtered out the bad and indifferent uses of technology – could we go one step further and remove the term altogether, and simply refer to this as teaching and learning? This is an ideological and philosophical question that is often asked – and the short answer is that yes, at some point in the future, hopefully the use of learning technology will be so well embedded that it won’t need its own name or definition; it will be just part and parcel of teaching and learning. However, we are not there yet, and the rest of this post will explore why.

When I first became the ILT coordinator at a college, one of the first decisions I made was to scrap the College’s ILT strategy (the very document that had created my post). My argument was that for ILT to be successful it has to be embedded fully and not seen as a ‘bolt-on’, so I scrapped the strategy and subsumed the useful bits of it into other college strategies – mainly the teaching, learning and assessment strategy, but also the IT strategy and a few others. This proved to be very useful, and I think it was a key factor in the college’s successful progression in this area of work. We did have problems at the time, in that we were often bidding for pots of money for projects from the myriad of external agencies, who often required that we submit a copy of our ILT strategy as part of the bid. I would send in the teaching and learning strategy, which was often rejected, so we had to recreate a sort of ILT strategy, just so we had something to submit as part of the bid process. It annoyed me somewhat that the external agencies were forcing us to take a step backwards.

Square root of 2 triangle

A few years later I was attending a conference, where the night before there was a pre-conference dinner with guest speakers and an ‘ask the experts’ panel, and the question of whether we need a term to describe this area of work was raised there. Most of the panel members (who clearly struggled with the question) waffled on a bit about something and nothing, and concluded that we could get rid of the term – until the last person spoke. His background was actually the history of mathematics, and he used the analogy of irrational numbers (e.g. the square root of 2, pi, and various others), which, although first identified in ancient Greek times, weren’t given a name until much later. People who understand mathematics to a high level* understand the concept of irrational numbers, and therefore don’t need a name to conceptualise or use it – but most people don’t understand it, and partly because it is an abstract concept, they have difficulty conceptualising something that doesn’t have a name and therefore (in their eyes) ‘doesn’t exist’. The introduction of the term irrational numbers wasn’t for the benefit of mathematicians, but for the benefit of the average user.
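For any readers curious about the mathematics behind the analogy (this is my addition, not something the panellist presented), the classic proof that the square root of 2 is irrational – i.e. cannot be written as a fraction of whole numbers – goes roughly like this:

```latex
\textbf{Claim: } \sqrt{2} \text{ is irrational.}

\textbf{Proof sketch: } \text{Suppose } \sqrt{2} = \tfrac{p}{q} \text{ with } p, q \text{ whole numbers in lowest terms.}

\text{Squaring both sides: } 2 = \frac{p^2}{q^2} \;\Rightarrow\; p^2 = 2q^2,
\text{ so } p^2 \text{ is even, hence } p \text{ is even; write } p = 2k.

\text{Then } (2k)^2 = 2q^2 \;\Rightarrow\; q^2 = 2k^2,
\text{ so } q \text{ is even too — contradicting ``lowest terms''.}

\text{Therefore no such fraction exists, and } \sqrt{2} \text{ is irrational.} \qquad \square
```

The Greeks could follow this argument perfectly well; what took far longer was agreeing on a name for the whole class of such numbers.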

He argued that the main purpose of terms to describe ‘learning technology’ is to make the concept more real and less abstract, which is essential for the understanding of the average person, and essential for its adoption and propagation within education.

This answer from the mathematician resonated with me – I had often ideologically wanted to drop the term, but realised that we aren’t ready yet for such a move.

So – my argument is that there is a need for a term – and I don’t really care what the term is, as long as it is universally understood within the organisation or situation. It could be ILT, Learning Technology, eLearning, TEL, or Geoffrey – it doesn’t matter, as long as its use is consistent. What is important is that active steps are taken to make sure that over time this area of work is truly embedded into practice, and doesn’t become further detached from the core business of teaching and learning (and assessment). Something that slightly concerns me at the moment is that this area of work has expanded rapidly in recent years, with many organisations now having dedicated ‘learning technology’ teams, and there is now a recognised career path to become a learning technologist – and there is a risk that this could actually move the work further away from core practice, rather than closer to it.

*Please note that I am not a mathematician by background, so apologies in advance to any mathematicians if my analogy hasn’t been articulated 100% accurately.

Can we stop using the term ’21st Century Learning’?

In the late 1990s, as I was starting my career in teaching and my work in learning technologies, I often used the term ‘21st century learning’ as a reference to the future, which I think worked well as we were approaching the excitement of a new millennium – something I saw as an opportunity to challenge the norms and look at the potential that technology offered. However, when Big Ben struck midnight on 31st December 1999, and the millennium bug turned out not to be the issue many feared, I instantly stopped using the term ‘21st century learning’, as that was no longer the future but the present.

Big Ben (b&w)

It slightly worries me that in 2012, people (and this includes lots of high-profile conference titles) are still using the term as prevalently as they are. We are more than 12 years into this century – roughly an eighth of the way through. Things are very different now from what they were 12 years ago, and nobody knows what the next 80-odd years will look like (realistically, no-one knows what the next 5 years will bring), so how can we sensibly use a term that covers 100 years and refers to the past, present, and future all at the same time?

In general I am not interested in working in the past; I am interested in the present and the future (and roughly only the next 5 years of it) – so why can’t we use a term that reflects this, rather than the blanket ‘21st century learning’?

After playing around with various combinations I would like to propose the term ‘teaching and learning’ as an alternative.

I expect a few people to disagree with me here, but thought this post would be good food for thought!