Glen Fruin, Scotland, United Kingdom
Educational institutions that implement a learning analytics strategy can collect significant student data and create reports based on student activity. But is it ethical to use students’ data? Is communicating to students how their data is being used an ethical responsibility of educators? And are institutions taking proper care of privacy and student rights? Current discussions about learning analytics and ethics in the education community have been examining the implications of handling such data. This is a complex issue with a variety of concerns to consider, especially as innovation in learning analytics continues at a rapid pace.
In this interview with E-Learn, consultant Niall Sclater explains that learning analytics faces a combination of ethical, legal, and logistical issues, and talks about some of the ethical concerns that surround student data and learning analytics. Sclater has been working with innovation in online and distance education for more than 25 years. He has also worked on many collaborative national and international projects related to the use of technology to enhance learning and teaching, and is the author of the book “Learning Analytics Explained.”
Main Ethical Issues Related to Learning Analytics
Poor Quality or Insufficient Data
“Trying to do analytics when you do not have much data, or it is not good quality, is a really difficult and pointless exercise. A lot of work needs to be done in cleaning the data set and compensating for that,” says Sclater. One example of this is something called ‘enmeshed identities’, where students are working together online and the data cannot differentiate between the person who is authenticated and the other members of the group.
“In learning analytics, the two main data sources are generally the learning management system (LMS) and the student information system (SIS), which capture only a tiny bit of the learning that takes place and some of the contextual information. Ideally, we need more data points as well as accurate data.”
Not only does the data need to be high quality, the analytics also need to be valid. “One of the main rationales behind predictive learning analytics is that there is a relationship between a student’s engagement in their learning activities and their subsequent success. Generally, the more students engage in the learning process, the more likely they are to complete their course and get a better grade. But engagement is not the same as success, and quite often people confuse the concepts of causation and correlation”, Sclater explains.
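The distinction Sclater draws between correlation and causation can be illustrated in a few lines of Python. In this hypothetical sketch, both LMS logins and grades are driven by a hidden third factor (prior preparation), so the two correlate perfectly even though one does not cause the other:

```python
# Illustrative sketch with hypothetical data: engagement can correlate
# strongly with grades without engagement *causing* the grades -- a hidden
# confounder such as prior preparation may drive both.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical confounder: hours of prior preparation per student.
prep = [1, 2, 3, 4, 5, 6, 7, 8]
logins = [p * 3 + 2 for p in prep]    # engagement driven by preparation
grades = [p * 10 + 15 for p in prep]  # grades driven by preparation too

print(round(pearson(logins, grades), 2))  # → 1.0, yet logins do not cause grades
```

A predictive model trained on such data would flag low-login students as at risk, which can be useful for targeting support, but it would be a mistake to conclude that simply forcing more logins would raise grades.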
Loss of Autonomy in Decision Making
This is a frequently discussed ethical issue, particularly with adaptive learning systems, which continually adjust the learning based on how students are performing. “Some people worry that it might ‘spoon-feed’ the students too much with automated suggestions, thereby making the learning process less demanding.”
Another issue arises when students change their behavior, consciously or unconsciously, because they know their activities are being continuously monitored. “If the institution is monitoring the e-book the student is reading, is that going to change his or her behavior? That might result in the student improving their learning, but it also might increase stress levels. Some students might even decide not to participate in some activities at all because they feel uneasy.”
Gaming the System
Students may do something to try to improve their scores or their engagement. In one institution, for example, Sclater heard about a student inserting their ID card into the library entry system a number of times, trying to improve their library engagement score. “This is an example of unforeseen outcomes that you can have when students are aware that you are measuring things.”
Obligation to Act
Finally, Sclater points out another ethical matter he considers important: whether there is an obligation on the institution to act on the basis of the analytics. “If you have a lot of data on the students, is there an ethical obligation to do something with it? If it is possible to stop a student from dropping out, should you not be obliged to use that data?” he asks.
Data Should Be Used Carefully
Sclater clearly states that students should be aware of the type of data that is being collected about them, and what is being done with it. He mentions that the law in some countries requires that, if a user asks what the institution is storing about them, the institution must be able to tell the student exactly what it is and what is being done with it.
“I do not think there are many universities and colleges yet that are in a position to do that,” he affirms. “The only way forward is to make sure you understand all the data that is being collected about individual students, and be in a position to gather it together and provide it quickly to any student that asks for it,” Sclater suggests.
Is There an Ideal Way to Carry Out Interventions With Students?
Learning analytics is only constructive when accompanied by interventions to change student behavior or improve the course. The decision about when to make an intervention varies according to different institutional processes. Sclater believes that institutions should define what he calls “a trigger,” meaning what precipitates that intervention. “In some institutions, it might be the student has not logged into the system for two weeks, for example. In others, it is when they have not submitted an assignment on time. And, in others, there are regular points in which personal tutors or student advisers will look at where the students are at, four times per semester, for example.”
In addition, institutions should define the different types of intervention, such as an automatic reminder sent to the student, a question, a prompt, an invitation to meet with a tutor, or even a supportive message. “All these measures can be included in an intervention plan,” he says. Institutions should also think about the frequency and timing of interventions for them to be effective.
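The trigger-and-intervention plan Sclater describes could be sketched as a small rule set. In this illustrative example (all thresholds and intervention names are hypothetical, not from any real product), inactivity and a missed assignment each map to a different action:

```python
# Minimal sketch of a trigger -> intervention mapping. All thresholds and
# intervention names below are illustrative assumptions, not a real system.
from datetime import date, timedelta

def pick_interventions(last_login, assignment_overdue, today):
    """Return the interventions an institution's plan might trigger."""
    actions = []
    if today - last_login >= timedelta(weeks=2):
        actions.append("automated reminder email")      # trigger: inactivity
    if assignment_overdue:
        actions.append("invitation to meet a tutor")    # trigger: missed deadline
    if not actions:
        actions.append("supportive progress message")   # default: encouragement
    return actions

# A student inactive for 19 days with an overdue assignment gets both actions.
print(pick_interventions(date(2024, 3, 1), True, date(2024, 3, 20)))
# → ['automated reminder email', 'invitation to meet a tutor']
```

In a real deployment the rules, their frequency, and their timing would come from the institution's agreed intervention plan, and, as Sclater notes, each rule's effectiveness should be evaluated.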
One of the biggest issues for institutions, however, is how to get staff to change their working practices and to understand that analytics can either help their daily activities or make an impact on the students. Lastly, Sclater points out: “There is not much point in carrying out interventions unless you are going to evaluate the success of them.”
In Learning Analytics, Students Are Individuals, Not Numbers
It is very easy to look at a dashboard or some other kind of analytics and detect a failing student without considering that he or she might have all sorts of potential reasons for that prediction, such as facing an illness or other personal issues, finding the concepts too difficult to understand, or having a demanding job and not having enough time to study properly. “The only way to find that out may be to bring a human into the process,” Sclater says.
This means treating students like humans, not numbers. “Many universities are using analytics as a way of identifying the student who might need a subsequent conversation with a person.” Richer data sources can help to find out more about the student and tailor the intervention to their circumstances. “But you are never going to get the whole picture about a student. For many of them, meeting a person to chat through their issues is the only way forward,” Sclater warns.
Educators Should Focus on the Positive
One of the main concerns for educators is that, if analytics shows students their percentile position in the class, it could potentially demotivate some of them. According to Sclater, while many students want to see that they are doing well compared to their peers, being told or seeing that they are likely to fail could demotivate others. “It is important to tell students early in the course to get into the habit of checking these analytics,” he says. “And then maybe if they are not doing well, the appropriate thing may be for that student to pull out of the course and enroll in a different one. I do not think that the possibility of demotivating a student is a good reason not to show students how they are predicted to perform.”
For Sclater, the best way to provide this information to students is still a matter for investigation. “If we can get the computer systems to understand more about what motivates the individual person, then we may be able to tailor the messages, and the kind of information we present to the students more appropriately.” He is excited by the possibilities that personalization presents. “We are only at the beginning of a journey here. I think the potential is huge for this.”
How to Manage Student Data
The legal requirements concerning students’ data vary from country to country. According to Sclater, some of the matters institutions should pay attention to are: being clear about what data is being collected and why; the capacity to anonymize the data; ensuring that students have the right to have their data erased on request; and developing careful access controls.
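Two of these measures can be sketched in a few lines. This illustrative example (the salt, IDs, and record shapes are hypothetical) pseudonymizes student IDs with a salted hash so analysts never handle raw identities, and honors an erasure request by dropping everything keyed to a student:

```python
# Illustrative sketch of two data-handling measures: pseudonymization and
# right-to-erasure. The salt, IDs, and record contents are hypothetical.
# Note: a salted hash is pseudonymization, not full anonymization -- the
# institution can still re-link records while it holds the salt.
import hashlib

SALT = b"institution-secret-salt"  # in practice, stored and rotated securely

def pseudonymize(student_id: str) -> str:
    """One-way pseudonym so analysts never see the raw student ID."""
    return hashlib.sha256(SALT + student_id.encode()).hexdigest()[:12]

records = {
    pseudonymize("s1001"): {"logins": 14},
    pseudonymize("s1002"): {"logins": 3},
}

def erase(student_id: str) -> None:
    """Right-to-erasure: drop everything held about this student."""
    records.pop(pseudonymize(student_id), None)

erase("s1002")
print(len(records))  # → 1 record remains after the erasure request
```

Access controls and a documented purpose for each data field would sit alongside measures like these in a full data-protection setup.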
Developing a Learning Analytics Policy
Sclater says that educational institutions should develop an institutional policy agreed by the relevant stakeholders in the institution, including students. He suggests some of the main topics the policy should include:
- What kind of data is being collected
- Who is responsible for the overall initiative
- How the institution is dealing with transparency – making it clear what you are doing and what kind of consent you are gathering from students
- What is being done about confidentiality and securing the data
- How to ensure the data and the analytics are valid
- How students will have access to their personal data
- How interventions will be carried out
- How adverse impacts on the students will be avoided or minimized
He also recommends developing a student guide to learning analytics that answers the questions students might have (he has developed a model student guide that can be downloaded here), as well as a more in-depth, technical document that explains how the analytics actually work. Algorithmic transparency, he suggests, is going to be increasingly important.
6 Key Factors to Check if Your Institution is Ready to Build a Learning Analytics Strategy
Niall Sclater gives some tips for universities and colleges to make sure they are ready to develop a learning analytics project and to deal with the relevant issues:
Leadership: Someone in the institution has to lead and promote the initiative. The person should be senior enough to give it credibility and should have a high-level knowledge of learning analytics.
Investment: The institution should budget for the resources analytics will require, whether for buying software, linking up data sources into the analytics software, and so on, and also account for the costs of implementing learning analytics, such as staff costs.
Culture: Institutions need to build a vision for the initiative around their institutional vision, making learning analytics a part of their strategic plan.
Structure and governance: “Learning analytics projects cut across existing roles and power structures in an organization, as different people need to be involved, and individuals can be quite concerned that their perceived ownership of data or influence is threatened somehow by another project coming in,” says Niall.
IT: Besides the technology that is going to be adopted, there are different data sources that need to be cleaned and put into the right format. There is also the need to make sure staff have skills in handling, interpreting, and visualizing the data.
Prepare teachers and learners: They need to understand some of the processes of adopting learning analytics and be aware of its impacts and advantages.
Niall Sclater, Consultant and Director at Sclater Digital.