Data, personal learning and learning analytics

This week’s topic for Stephen Downes’ E-Learning 3.0 MOOC is Data. From the synopsis that Stephen provides for the week we read that…

…. there are two conceptual challenges associated with this topic: first, the shift in our understanding of content from documents to data; and second, the shift in our understanding of data from centralized to decentralized.

The first shift allows us to think of content – and hence, our knowledge – as dynamic, as being updated and adapted in the light of changes and events. The second allows us to think of data – and hence, of our record of that knowledge – as distributed, as being copied and shared and circulated as and when needed around the world.

To try and make sense of this topic I have watched three videos this week.

Personal Learning vs Personalized Learning: What Needs to Happen Oct 24, 2018 Online Learning 2018, Toronto, Ontario, Contact North. This special briefing explores personal learning as the future of learning, explores why it’s important, the tools which enable personal learning and the significant potential of personal learning as a key to life-long learning and the skills agenda. URL: https://www.youtube.com/watch?v=mVnjet3cKfU

This was the video that most resonated with me and related most to my personal interests. What I like about Stephen’s work is that he doesn’t forget to ask the question ‘why’, i.e. the ‘why’ of learning analytics for learners, rather than just the ‘what’ and ‘how’. In this video Stephen tells us that there are two approaches to learning, personalized (formal learning, which accounts for about 20% of our learning) and personal (informal learning, which accounts for the rest). This slide (7) from his presentation ( https://www.downes.ca/cgi-bin/page.cgi?presentation=497 ) provides a clear overview of the differences.

Stephen then considered how we can support an approach which promotes personal learning through discussion of three major themes: choice, ownership and community. In this video Stephen says of learning analytics that it should be for learners so that they can track and understand their own progress. This would mean, in terms of the three major themes, that we can choose what to work on (create our own learning paths), where to store our data and what data to store; that we own all our data and have control over how it is used; and that we are free to work openly and create our own learning communities with whom we can share our data and from whom we draw support. Learning analytics will help us to keep track of our data (which will be distributed over various locations on the web) and self-monitor our personal progress. Personalized learning, whilst still useful and necessary in certain contexts, does not allow for the autonomy necessary for personal learning. The big question raised by Stephen was ‘how can we make this happen?’ i.e. how can personal learning be promoted and recognised in today’s education contexts.

AI in Education Symposium – Introduction: Oct 24, 2018 Artificial Intelligence and 21st Century Education in Ottawa, my brief introduction and posing of a problem. URL: https://www.youtube.com/watch?v=WENb9N2gnpQ

In this 6-minute video, Stephen introduces the AI in Education Symposium in Ottawa. He asks whether AI can solve the problems of society, now that society has become too complex for its problems to be solved by a few elite, privileged groups. He says that as society gets more complex it becomes increasingly difficult to govern. In the future we will need to teach each other and govern ourselves as a society. We will have to move from a society based on identity, nationalism, religion and language to a society based on consensus and collaborative decision making. The question posed was – Does AI offer us lessons into how to do this? I can see how this is related to the themes developed in the ‘Personal Learning vs Personalized Learning: What Needs to Happen’ video.

Conversation with Shelly Blake-Plock Oct 24, 2018 Week 1 of E-Learning 3.0 with Shelly Blake-Plock, Co-Founder, President and CEO – Yet Analytics. URL: https://www.youtube.com/watch?v=dsmdwnUwKkA

This third video was the E-Learning 3.0 MOOC course video for the week. In this conversation Shelly Blake-Plock described his work at Yet Analytics, a company which focuses exclusively on learning analytics and works with the K-12, corporate and military sectors in the US, to help improve learning content and instruction, and improve the management of data. Their work builds on the Experience API (xAPI), an open specification for tracking learning experiences and performance. Shelly claimed that this approach goes beyond how a traditional LMS is able to analyse content and activity. xAPI is able to pull data from the physical world (sensors etc.), mobile devices, games, etc. This data is stored in a secure Learning Record Store, which can then provide automated data visualisations to support learners in understanding their progress.
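To make the xAPI data model concrete: each learning event is recorded as an “actor – verb – object” statement in JSON, which is then sent to a Learning Record Store. Below is a minimal sketch of such a statement; the learner name, email address and LRS endpoint are hypothetical, though the verb URI comes from the real ADL verb registry.

```python
import json

# A minimal xAPI statement: every learning event is an
# "actor - verb - object" triple, serialised as JSON.
# The learner identity below is illustrative, not real data.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://www.youtube.com/watch?v=dsmdwnUwKkA",
        "definition": {
            "name": {"en-US": "Conversation with Shelly Blake-Plock"}
        },
    },
}

# In practice this would be POSTed to an LRS endpoint such as
# https://lrs.example.com/xapi/statements (hypothetical), with the
# X-Experience-API-Version header set; here we just print it.
print(json.dumps(statement, indent=2))
```

Because the format is an open specification rather than tied to any one vendor’s LMS, any tool – a sensor, a game, a mobile app – can emit statements like this, which is what allows activity from outside a traditional LMS to be aggregated in one record store.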

In watching this third video, it seemed to me that there is a mismatch between Stephen’s aspirations for learner autonomy and the learning analytics systems being developed by Yet Analytics. Questions asked by Stephen and others on the course were:

  • How would this work with distributed data (remembering that distributed data allows for choice, ownership and community, as well as greater security)?
  • Who owns the data/records?
  • What are the ethical implications of these developments?
  • What are the privacy and governance issues?
  • How will the data tell us what learners have learned/understood, as opposed to what they have ‘done’, in terms of number of views, clicks on documents etc.?

These are important questions for Yet Analytics to answer if they are really going to provide a system that goes well beyond what a traditional LMS can do and recognises a ‘personal’ learning approach to education.

Finally, as a result of watching these videos and thinking about learning analytics this week, I have wondered what might be the implications of measuring and monitoring everything we do. Is there a danger that it could be taken to excess, such that we treat our bodies like machines, become super-competitive, self-centred and self-absorbed?

Update 31-10-2018

Shelly Blake-Plock has pointed out that there are some errors in what I have written about his work, and has responded to the questions listed above. Please see his comment below.

Related blog posts

There have been some interesting posts from other course participants related to all this. See for example:

Geoff Cain – Week 0: Seimens and Downes on AI – http://geoffcain.com/blog/ai/week-0-seimens-and-downes-on-ai/

Roland Legrand – An Experience API for learning everywhere (also in virtual worlds) – https://www.mixedrealities.com/2018/10/25/an-experience-api-for-learning-everywhere-also-in-virtual-worlds/

Matthias Melcher – #EL30 Alien Intelligence AI – https://x28newblog.wordpress.com/2018/10/24/el30-alien-intelligence-ai/

Laura Ritchie – #el30 Notes Week 1 – https://www.lauraritchie.com/2018/10/25/el30-notes-week-1/

Embodied Learning

At what point do we forget, or cease to think about, intelligence as being embodied, and think of it only in terms of our brains and minds? Little children naturally use their bodies for learning. Most children in nursery schools do not read and write. They learn through their bodies in the sand and water tray, on the climbing equipment, with bricks, Lego and so on. They learn by doing and acting on their environment using all their senses.

There was a fascinating Horizon programme on BBC2 last night – The Hunt for AI – which explored the relationship between mind and body and the extent to which the body is ‘intelligent’.

The question asked by the programme was whether it is possible to build a machine (robot), which can mimic human intelligence. Is it possible to make a machine that can think?  Is it possible to capture the wonderful versatility of the human brain, human imagination and creativity?

The programme showed that there have been amazing developments in artificial intelligence and robotics since the days of Alan Turing’s work on breaking the Enigma machine. Most recently among these developments there has been the realization that in order to mimic the nature of human intelligence, robots or machines would need to have embodied intelligence, intelligence conditioned by the body. So artificial intelligence research is now trying to replicate not just the human brain but the human body.

Once you start thinking about the role of the body in learning – as we did in the paper we have just submitted to the Stirling Conference in June (see Abstract here – Theorising Education 2012 Abstract), then it is possible to think of many examples of embodied learning.

The example in the BBC2 programme was of the presenter trying to learn how to walk the tightrope. His instructor tells him to ‘turn his brain off and let his muscles do it’; she tells him that he is ‘thinking too much’.

As he said, learning to walk on a tightrope is similar to how a child learns to walk. It is instinctive. Ultimately everything we learn to do becomes automatic with practice.

But perhaps what was most intriguing about this example was that the presenter only learned to walk the tightrope when his instructor suggested that he sing at the same time. This very much relates to the discussion we have in our paper about multi-modal and cross-modal ways of working.

This was a fascinating programme and couldn’t have been more timely in relation to the work I have been doing with Roy Williams and Simone Gumtau on our paper – Synaesthesia and Embodied Learning.