Since the first learning analytics conference, held in Canada in 2011,1 the field has been developing rapidly. Some of the trending areas in research are teacher and learner dashboards, predictive analytics and automatic feedback. There is also rising use of natural language processing technologies and multimodal learning analytics. E-Learn interviewed experts from North America, Latin America and Europe in order to understand these trends and to find out how they are experimenting with novel uses of data in education, as well as their perspective on the future of learning analytics.
Learning analytics is an emerging field, and its adoption by higher education institutions around the world is very uneven. The United States, United Kingdom, Canada and Australia are considered leaders in this domain, while regions such as Latin America are still at an initial stage of exploring its possibilities.
In the United States, there is great awareness of and interest in learning analytics among higher education institutions, says Alyssa Wise, associate professor of learning sciences & educational technology and director of New York University’s Learning Analytics Research Network (NYU-LEARN).
“If you talk to leaders at almost any higher education institution in the country, you will see that they know about learning analytics, and they are excited about the idea of using data to help make their students’ educational experiences better, but there is not as much action yet as one might hope,” Wise explains. “I think everybody feels a little behind the curve of what is possible. They think everyone else is doing these spectacular things, but (with a few notable exceptions) most universities are all working through the same set of issues. The reason is that there are a variety of barriers that need to be overcome to put robust learning analytics systems into place,” she adds.
These barriers include technical and infrastructural challenges, and important questions around data stewardship and access to data. “One of the biggest challenges we face with learning analytics projects is that we have a great idea for a project and all of the stakeholders are on board, but nobody knows who is able to give permission to look at the different data sources we need. This has not been something that universities have had to deal with before because nobody has asked,” explains Wise.
In Europe, according to Dragan Gasevic, professor and chair in learning analytics and informatics at The University of Edinburgh, the majority of institutions are aware of learning analytics, and there are some small experiments taking place.
Possible barriers to adoption include a shortage of capacity or knowledge to understand how analytics can benefit institutions, and a lack of strategic capability, that is, the absence of institutional leaders who can identify critical issues and see how learning analytics can help address them. A third barrier relates to privacy protection and ethics, which is a central concern in certain countries.
“Countries such as Germany have had, historically, through World War II, a huge abuse of their private data, and thus people are much more concerned about ethics and privacy,” explains Gasevic. “What we are seeing, through a major project we are running called SHEILA,2 is that students are aware of the ethical and privacy protection questions and have high expectations that institutions will protect their privacy, but at the same time, they also have high expectations that their data will be used.”
In Latin America, a recent study revealed what researchers already knew: adoption of learning analytics in Latin American countries is still at very initial stages, according to Xavier Ochoa, professor and director of the Teaching and Learning Technologies Research Group at the Escuela Superior Politécnica del Litoral (ESPOL), in Guayaquil, Ecuador.
“There are few institutions that are committed to implementing learning analytics in their core processes, but there’s a lot of interest. I haven’t talked with any administrator that is not very willing to try learning analytics at their institution,” says Ochoa. According to him, the lack of researchers and education professionals with expertise in learning analytics is the main barrier for adoption in the region.
Below are some of the most interesting experiments taking place at institutions around the world.
Predictive Analytics: An Upward Trend
Alyssa Wise, from New York University, is involved in several different projects that work with proactive support for students through predictive modeling. In predictive analytics, a computational model is created based on students who have previously taken a course, and then applied to students who are beginning the course in order to identify those who are likely to have difficulties.
“The accuracy of the model we have is quite good. We can identify as early as the first day of class who are the students likely to have trouble in a course. The challenging thing is: what do we do about it?” she says.
It is evident through this example that being able to predict difficulties isn’t enough. Researchers and education professionals still need to figure out how to best use this information to support teaching and learning.
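The kind of predictive model Wise describes can be sketched in a highly simplified form. The sketch below is illustrative only, not NYU's actual system: it trains a small logistic model on hypothetical data from a past cohort (the feature names, values, and threshold are all assumptions) and then applies it on day one of a new offering to flag students likely to need support.

```python
import math

# Hypothetical training data from a past cohort: each row is
# (placement_test_score, prior_gpa), label 1 = struggled in the course.
# Feature names and values are illustrative, not from any real model.
past_cohort = [
    ((0.35, 2.1), 1),
    ((0.40, 2.4), 1),
    ((0.55, 2.9), 0),
    ((0.70, 3.2), 0),
    ((0.30, 2.0), 1),
    ((0.80, 3.6), 0),
    ((0.60, 3.0), 0),
    ((0.45, 2.2), 1),
]

def predict(weights, bias, x):
    """Logistic model: probability that the student will struggle."""
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.5, epochs=2000):
    """Fit weights by plain stochastic gradient descent on log loss."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            err = predict(weights, bias, x) - y
            bias -= lr * err
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
    return weights, bias

weights, bias = train(past_cohort)

# Applied on day one of a new course offering: flag students whose
# predicted risk exceeds a threshold so support can be offered early.
new_students = {"student_a": (0.38, 2.2), "student_b": (0.75, 3.4)}
for name, features in new_students.items():
    risk = predict(weights, bias, features)
    if risk > 0.5:
        print(f"{name}: likely to need support (risk {risk:.2f})")
```

In a real deployment the features would come from institutional records and the model would be validated carefully before use; the hard part, as Wise notes, is deciding what to do with the flag, not producing it.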
In one of the projects, Wise explains, the model shows that student difficulties in introductory college mathematics courses are linked to inadequate prior preparation. “In this particular model the path toward action is actually straightforward. We can offer these students learning resources and mastery driven practice sets to help them get up to speed on the skills and concepts that they may not be strong enough in at the start,” she says.
However, in other cases, the model can predict who will have trouble, but the factors used to predict are not very clearly actionable. “In this case, we know which students are going to have challenges, but we do not know why or how. So, it’s a much more difficult challenge to figure out what to do with the model. Do we show it to the instructors? How do the instructors contact the students and what do they say?” says Wise.
“We have to be careful because you do not want to tell students at the start of a course that you are expecting them to do poorly. This has the potential to be demotivating, and in some cases induce stereotype threat, and thus become a self-fulfilling prophecy,” adds Wise. “Instead, it is important to stay positive and focus on what actions they need to take to be successful.”
According to the expert, NYU has started to build teacher dashboards that give instructors an overview of their students’ digital activities. This can be used both to evaluate particular class elements and identify students that may need particular attention. “The next step is to study how instructors use these new sources of information to inform their teaching and how we can better support them in this process of data-informed pedagogical decision-making,” Wise describes.
This illustrates why a great deal of attention is now being paid to how institutions are putting data and models into practice in ways that are ethical and useful to support teaching and learning.
Feedback Loops: Empowering Students and Teachers
Establishing or improving feedback loops between teachers and learners is the main gain of learning analytics, according to Dragan Gasevic of The University of Edinburgh. This can be especially helpful in large classes, sometimes with hundreds of students, where it is impossible for a teacher to interact with every student at a personalized level.
“Learning analytics helps us get the insight into the learning patterns of every one of our students, in order to understand, for example, their learning progression, their time management and their misconceptions,” explains Gasevic.
This empowers teachers so they can get much deeper insight into what is happening in their classrooms, especially in blended or flipped models. More importantly, instructors can also provide personalized individual feedback at scale. A project in which Gasevic is involved aims to accomplish exactly that. Project OnTask began in 2016, led by The University of Sydney in collaboration with University of Technology Sydney, University of New South Wales, University of South Australia, University of Texas at Arlington, and The University of Edinburgh.
“With learning analytics and the support of the OnTask software, instructors are able to provide personalized feedback that scales up dramatically. For the amount of work needed to support 3 to 5 students, they are able to personalize feedback for hundreds or thousands of students,” Gasevic explains. The project’s results show increases in student performance and satisfaction, and improvements in the learning process.
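The mechanism that makes this kind of feedback scale can be sketched as instructor-written rules paired with message fragments, applied to each student's activity data. This is an illustrative sketch, not the actual OnTask implementation; the data fields, thresholds, and wording are assumptions.

```python
# Illustrative rule-based personalized feedback: one set of
# instructor-authored rules scales to any number of students.
students = [
    {"name": "Ana",  "videos_watched": 2,  "quiz_score": 45},
    {"name": "Ben",  "videos_watched": 10, "quiz_score": 92},
    {"name": "Caro", "videos_watched": 9,  "quiz_score": 58},
]

# Each rule: (condition on the student's activity data, message fragment).
rules = [
    (lambda s: s["videos_watched"] < 5,
     "You have watched few of the lecture videos; catching up before "
     "next week's topic will help."),
    (lambda s: s["quiz_score"] < 60,
     "Your quiz score suggests revisiting this week's practice set."),
    (lambda s: s["quiz_score"] >= 90,
     "Excellent quiz result; consider trying the extension problems."),
]

def feedback_message(student):
    """Assemble a personalized message from whichever rules fire."""
    fragments = [msg for cond, msg in rules if cond(student)]
    if not fragments:
        fragments = ["You are on track; keep up the steady work."]
    return f"Hi {student['name']}, " + " ".join(fragments)

for s in students:
    print(feedback_message(s))
```

Writing the rules costs roughly the same effort whether the class has five students or five thousand, which is the "scales up" property Gasevic describes.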
The connection between learning analytics and learning design is another important topic in the field. Learning analytics can help teachers to reflect on the effectiveness of their learning design.
“For example, when I am teaching a course, I can reflect on what is working in my course, or what is not working, and based on that I can make certain changes in my design and, in little time, make that orchestration much more effective,” explains Gasevic.
Gasevic and his team developed an award-winning software called LOCO-Analyst, a tool that aims at providing teachers with feedback on the relevant aspects of the learning process taking place in a web-based learning environment.
“When we developed LOCO-Analyst we were trying to analyze different types of digital traces generated by learners as they were using and interacting with digital resources on the web and inside of their learning environments, such as Blackboard Learn,” he explains.
According to the expert, one of the functionalities most appreciated by teachers offers insight not only at the level of particular resources, but also at the conceptual level: the way students are actually constructing knowledge around certain concepts. Teachers also value the insights into the social interactions that happen among students.
Gasevic sees a fairly big disconnect in terms of what is actually needed by teachers and students, and what is currently out there. “What is happening in reality is that most teachers and students do not understand the content in their dashboards. I attribute that to major problems, and the first problem is related to the lack of proper ethnographic studies trying to understand the teachers and the students in the way they can embed and use analytics inside of their specific tasks,” says Gasevic.
Beyond the LMS: Multimodal Learning Analytics
Initially, learning analytics worked exclusively with data from online tools, such as learning management systems or online games. But what about the learning that happens outside the computer? How can data from the real world, such as in classrooms, student groups, or even when students are doing their homework, be captured and measured? This branch is called multimodal learning analytics, and it is Xavier Ochoa’s personal field of research.
“Multimodal learning analytics arise naturally out of the need to understand learning where it is happening. For example, there is a lot of learning that happens when students work together, trying to solve a problem or an exercise in a course. If there is no computer there, there aren’t traces of these activities. Maybe you have the end results of what they did, but a great deal of information is lost if you only analyze the end result,” he explains.
Multimodal learning analytics exploit audio and video recordings, what students write, look at, and say, and all this information is used to get a clear picture of the learning process.
“Humans are multimodal in nature, we capture information through all our senses, so we are trying to make the computer do something similar,” explains Ochoa.
According to him, as technology has become more accessible in recent years, multimodality has become a practical reality. Two factors in particular have enabled the field: the development of artificial intelligence and the availability of very cheap sensors.
“Imagine that you can have a device that costs almost nothing with a camera and a microphone,” describes Ochoa. “With this type of sensor and the right software, it’s possible to analyze, for example, student posture, create transcripts of what the student is saying, or capture emotions in the student’s voice. So, now, computers are able to see, are able to hear, and we are trying to exploit those capabilities to better understand the learning process.”
Ochoa estimates that 30% of current learning analytics studies include some kind of multimodality. That could be an indication that institutions are realizing they need to look at learning processes more holistically.
One example of a multimodal analytics project developed by ESPOL is a tutor environment where students can practice giving presentations. Students enter a room, close the door, and then they are greeted by a virtual audience to whom they can present their slides.
“The system will analyze their posture, their gaze, the volume of their voice, if they are stammering, and it will analyze their slides, if they have too much text or the font is too small, and will give an automatic feedback on the presentation,” describes Ochoa.
In that same area, ESPOL is using multimodal learning analytics to understand what differentiates experts from non-experts while they solve a problem. In this case, multimodal learning analytics is used to provide feedback not to the student, but to the instructor.
“We can try to understand if a student has reached mastery in a skill just by looking at him or her. That’s exactly what a professor would do, observe student behavior,” says Ochoa.
▪ Technical and theoretical advances are making it possible to collect better and richer data.
▪ This can be done not only through online tools, but also through clickers, sensors and motion detectors that collect data from the real world. That’s multimodal learning analytics.
▪ Providing automatic, personalized feedback based on analysis of student behavior is becoming more common, especially in large classes.
▪ Natural language processing technologies are being used to analyze student-created products. This allows instructors to go beyond multiple-choice questions and respond to students’ creative works as well.
▪ Researchers also want to discover how analytics can help understand students’ emotions, not just their cognitive state.
▪ A challenge that researchers currently face is that tools have to be designed to work in specific learning contexts, but at the same time they need to be general enough to be used across different courses and even institutions.
▪ Learning analytics will be increasingly used to support decision making for students, instructors, administrators, institutions and even governments.
Xavier Ochoa, Full Professor, Director of the Information Technologies Center, Escuela Superior Politécnica del Litoral (ESPOL), Ecuador Photo: Rodrigo Buendia
Dragan Gasevic, Professor and Sir Tim O’Shea Chair in Learning Analytics and Informatics and Co-Director, Centre for Research in Digital Education, The University of Edinburgh, Scotland Photo: Andy Buchanan
Alyssa Wise, Associate Professor of Learning Sciences & Educational Technology and Director, NYU Learning Analytics Research Network (NYU-LEARN), New York University, United States Photo: AFP Dominick Reuter
1 LAK11. (2011). 1st International Conference on Learning Analytics and Knowledge 2011. Retrieved November 15, 2017, from https://tekri.athabascau.ca/analytics/program.
2 SHEILA. (n.d.). Using data wisely for education futures. Retrieved November 16, 2017, from http://sheilaproject.eu.