Wednesday, July 17, 2013

Learning Analytics

Greetings from the Australian National University in Canberra, where a workshop on "Learning analytics: Building evidence based practice" by Dr Shane Dawson, UniSA, is being hosted. Learning analytics is the analysis of data about students in order to improve courses. It has come to prominence with Learning Management Systems, as these record detailed information about what students do and when they do it. However, there are risks in misinterpreting the statistics, in particular confusing a correlation with a causal relationship. As an example, there is a well known correlation between a student's participation in on-line course forums and their final result. But it does not necessarily follow that forcing students to participate will improve their results. The other question this raises is what teachers did in the past: just hope that the way they educated students worked?
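To make the correlation-versus-causation point concrete, here is a minimal sketch in Python of the kind of check involved. The forum-post counts and final marks are invented for illustration, not results from any real course.

```python
# A minimal sketch: the correlation between forum activity and final
# marks. The data here is invented for illustration (Python 3.10+).
from statistics import correlation

forum_posts = [2, 15, 8, 0, 22, 11, 5, 18]      # posts per student (hypothetical)
final_marks = [55, 82, 70, 48, 90, 75, 60, 85]  # final course marks (hypothetical)

r = correlation(forum_posts, final_marks)
print(f"Pearson r = {r:.2f}")

# A high r only says the two quantities move together. It does not show
# that posting causes better marks: stronger students may simply post more.
```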

One of the hidden agendas with learning analytics is that it works better with large amounts of data, from large courses. If you are teaching only a few dozen, or a few hundred, students, then some statistical techniques do not work well. There is then an impetus to have thousands, hundreds of thousands, or millions of students.
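As a rough illustration of why small classes are a problem, the sketch below uses the standard Fisher transformation to show how wide a 95% confidence interval around a correlation of 0.5 would be for classes of different sizes. The class sizes are arbitrary examples.

```python
# Illustration: confidence intervals around a correlation shrink as
# class size grows. Uses the standard Fisher z-transformation.
import math

def correlation_ci(r, n, z_crit=1.96):
    """Approximate 95% confidence interval for Pearson r from n students."""
    z = math.atanh(r)              # Fisher transform of r
    se = 1 / math.sqrt(n - 3)      # standard error in z-space
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

for n in (30, 300, 3000):          # arbitrary example class sizes
    lo, hi = correlation_ci(0.5, n)
    print(f"n={n:5d}: r=0.50, 95% CI ({lo:.2f}, {hi:.2f})")
```

With 30 students the interval runs from roughly 0.17 to 0.73, too wide to support much of a conclusion; with 3,000 it is tight.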

One question I have which learning analytics could help with is the differences between classroom and on-line courses. I did a quick check to see if the students who do my on-line course get different results from their classroom courses: they don't (there is about a 0.8 correlation between the on-line and classroom results of the same students). But are the students who undertake on-line courses different in some way? Why do some students withdraw early on? Is there something I can do to reduce the withdrawal rate? It might be sufficient to know what program the student is enrolled in. But the central university database has more student details, such as whether they are a domestic or international student, their previous results, and where they previously studied.
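If that enrolment data were available, a first pass might look something like the sketch below: grouping withdrawal rates by program and student type. The file name, column names and pandas-based approach are my assumptions for illustration, not a description of any actual university system.

```python
# Hypothetical sketch: withdrawal rate by program and student type,
# assuming a student-database extract as a CSV with columns
# program, student_type (domestic/international) and withdrew (0 or 1).
import pandas as pd

students = pd.read_csv("enrolments.csv")  # hypothetical extract

rates = (students
         .groupby(["program", "student_type"])["withdrew"]
         .agg(rate="mean", enrolled="count")
         .sort_values("rate", ascending=False))
print(rates)
```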

Some of this is just a matter of good course design, assessment and normal teaching. As an example, the Research School of Computer Science built itself a database years ago, with all student results in it. This is used in the end-of-semester examiners' meeting, where the results of each course are compared. If necessary, the results from previous years can also be compared.
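A results database like that need not be elaborate. Below is a sketch of the sort of query an examiners' meeting might run, assuming a simple SQLite table of (course, year, student_id, mark); the table, column names and course code are all illustrative.

```python
# Sketch: comparing a course's results with previous years, assuming a
# SQLite table results(course, year, student_id, mark). All names here
# are illustrative, not an actual school database schema.
import sqlite3

conn = sqlite3.connect("results.db")  # hypothetical database file
for year, avg, n in conn.execute(
        """SELECT year, AVG(mark), COUNT(*)
           FROM results
           WHERE course = ?
           GROUP BY year
           ORDER BY year""", ("COMP1000",)):
    print(f"{year}: mean mark {avg:.1f} over {n} students")
```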

One thing which struck me about the discussion of learning analytics is the emphasis on student performance, but what about teacher performance? The same analysis of what students do and their results can also be applied to staff: what they do and how this affects outcomes.

Shane's paper "Informing Pedagogical Action: Aligning Learning Analytics With Learning Design" has more detail. Shane's workshop is also being hosted at the University of Tasmania on 19 July.


This workshop, sponsored by the Higher Education Research and Development Society of Australasia and ANU Online as part of an Office for Learning and Teaching project, aims to:
  • Provide an overview of the current state of learning analytics
  • Illustrate how learning analytics can provide direct evidence of student learning
  • Explore the diversity of tools and methods associated with learning analytics
  • Provide practical ideas to apply learning analytics for evaluating and improving teaching practice.
Participants are expected to bring a laptop to enable hands-on involvement.
The high growth in adoption of education technologies such as learning management systems (LMS) across the education sector has resulted in alternate and more accessible data on learning and teaching practice. As with most online systems, student interactions with course activities are captured and stored. These digital footprints can be ‘mined’ and analysed to establish patterns of learning behaviour and teaching practice, a process described as learning analytics. Tracking the patterns of student interactions can provide detailed insight into the learning process and allows for rapid evaluation of the impact of specific learning activities. In essence, learning analytics empowers both instructor and student to make informed decisions about their learning and teaching processes, through the interpretation of educational data from both learner and teacher orientations.
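As a concrete, if simplified, illustration of "mining" those digital footprints, the sketch below tallies LMS events per student per week from an activity log. The CSV log format is an assumption for illustration; real LMS exports differ.

```python
# Simplified illustration of mining LMS 'digital footprints': count
# events per student per ISO week from a hypothetical CSV activity log
# with columns timestamp,student_id,event (real LMS exports differ).
import csv
from collections import Counter
from datetime import datetime

activity = Counter()
with open("lms_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        week = datetime.fromisoformat(row["timestamp"]).isocalendar()[1]
        activity[(row["student_id"], week)] += 1

for (student, week), events in sorted(activity.items()):
    print(f"student {student}, week {week}: {events} events")
```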
This workshop provides an initial overview of the current state of learning analytics and illustrates how the implementation of multi-analytic lenses can provide direct evidence of student learning. In so doing, Dr Dawson will discuss the value of learning analytics and in particular social network analysis (SNA) as a methodology for visualizing curriculum and peer networks that are established through course relationships such as student engagement and course progression. Following this overview, participants will explore the diversity of data sources at their disposal drawing on analytic tools and dashboards such as SNAPP and LMS sources.
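For the social network analysis part, a tool such as SNAPP builds a network from forum "who replied to whom" data. A minimal sketch of the same idea, using the networkx library and invented reply pairs, follows.

```python
# Minimal sketch of the SNA idea behind tools like SNAPP: build a
# 'who replied to whom' network from forum data and find central students.
# The reply pairs here are invented for illustration.
import networkx as nx

replies = [("alice", "bob"), ("carol", "bob"), ("bob", "alice"),
           ("dave", "carol"), ("erin", "bob")]  # (replier, original poster)

g = nx.DiGraph()
g.add_edges_from(replies)

# Students whose posts attract the most replies are candidate hubs;
# students with no connections at all may need support.
for student, score in sorted(nx.in_degree_centrality(g).items(),
                             key=lambda kv: -kv[1]):
    print(f"{student}: in-degree centrality {score:.2f}")
```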
Dr. Shane Dawson is the Deputy Director of the Learning and Teaching Unit, and Associate Professor of Technology Enhanced Learning at the University of South Australia. His research activities focus on learning analytics and social networks to inform teaching and learning theory and practice. Shane’s research has demonstrated the use of learner interaction and network data to provide lead indicators of student sense of community, academic success and course satisfaction. Shane has also been involved in developing pedagogical models for enhancing creative capacity in undergraduate students. He is a co-founder and executive member of the Society for Learning Analytics Research and was co-chair of the 2012 Learning Analytics and Knowledge conference in Vancouver, Canada. ...

Funded by HERDSA and the Australian Government Office for Learning and Teaching.
