Learning Analytics can be described as the “measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs”. The field of Learning Analytics is essentially a “bricolage field, incorporating methods and techniques from a broad range of feeder fields: social network analysis (SNA), machine learning, statistics, intelligent tutors, learning sciences, and others”.
Learning Analytics applies techniques from information science, sociology, psychology, statistics, machine learning, and data mining to analyze data collected during education administration and services, teaching, and learning. Learning Analytics creates applications that directly influence educational practice. For example, the OU Analyse project deploys machine-learning techniques for the early identification of students at risk of failing a course. Additionally, OU Analyse features a personalised Activity Recommender advising students how to improve their performance in the course.
With Learning Analytics it is possible to obtain valuable information about how learners interact with the FORGE courseware, in addition to their own judgments provided via questionnaires. In particular, we are collecting data generated from recording the interactions of learners with the FORGE widgets. We are tracking learner activities, which consist of interactions between a subject (learner), an object (FORGE widget) and are bounded with a verb (action performed). We are using the Tin Can API (also known as xAPI) to express and exchange statements about learner activities, as well as the open source Learning Locker LRS (Learning Record Store) to store and visualise the learner activities.
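As an illustration of this subject–verb–object structure, the following Python sketch builds a minimal xAPI statement for a learner interacting with a FORGE widget. The home page and widget URIs are placeholder assumptions (not the actual FORGE endpoints); the verb IRI is the existing ADL xAPI "initialized" verb.

```python
import json
import uuid
from datetime import datetime, timezone

def build_statement(learner_id, verb_id, verb_name, widget_id, widget_name):
    """Build a minimal xAPI statement: subject (learner) + verb (action)
    + object (FORGE widget)."""
    return {
        "id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": {
            "objectType": "Agent",
            # Anonymised learner id, expressed as an account on the platform
            # (homePage is a placeholder, not the real FORGE URL).
            "account": {"homePage": "https://forge.example.org",
                        "name": learner_id},
        },
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "objectType": "Activity",
            "id": widget_id,  # illustrative widget URI
            "definition": {"name": {"en-US": widget_name}},
        },
    }

statement = build_statement(
    learner_id="anon-42",  # anonymised learner id
    verb_id="http://adlnet.gov/expapi/verbs/initialized",
    verb_name="initialized",
    widget_id="https://forge.example.org/widgets/pt-anywhere",
    widget_name="PT Anywhere",
)
print(json.dumps(statement, indent=2))
```

A statement like this would then be POSTed to the Learning Locker LRS over its xAPI statements endpoint.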
Figure 1 depicts the widget-based architecture adopted in FORGE. The FORGE widgets use LTI 2.0 for their integration within a Learning Management System (LMS). The FIRE Adapters function as a middleware between the FORGE widgets and the FIRE facilities (testbeds), while the FORGEBox layer offers a seamless experience while learners are performing a course, reading content and interacting with FIRE facilities. All the interactions performed by users on the course content and the widgets are recorded and stored in the Learning Locker LRS using the xAPI.
Figure 1: The widget-based FORGE architecture for Learning Analytics.
Learner activities on the FORGE widgets typically include the initialisation of an experiment, setting the parameters of the experiment and, finally, completing the experiment. Therefore, the learner activities captured by the FORGE widgets use the following types of xAPI verbs:
Initialized: Formally indicates the beginning of analytics tracking, triggered by a learner “viewing” a web page or widget. It contains the (anonymised) learner id and the exercise/widget that was initialized.
Interacted: Triggered when an experiment is started by the learner, containing the learner id, the exercise and possible parameters chosen by the learner. These parameters are stored in serialized JSON form using the result object, as defined by the xAPI.
Completed: The final verb, signalling completion of an exercise by the learner. We can also include the duration that a learner took to perform the experiment, formatted using the ISO 8601 duration syntax following the xAPI specifications.
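The "Interacted" and "Completed" statements might look roughly as follows. This is a hedged sketch: the exercise URI, account home page, and experiment parameters are illustrative assumptions, while the verb IRIs are the existing ADL xAPI verbs and the duration uses the ISO 8601 syntax required by the specification.

```python
import json

actor = {
    # Placeholder anonymised learner account.
    "account": {"homePage": "https://forge.example.org", "name": "anon-42"}
}
exercise = "https://forge.example.org/exercises/ping-test"  # illustrative URI

# Experiment started: chosen parameters serialised as JSON in the result object.
interacted = {
    "actor": actor,
    "verb": {"id": "http://adlnet.gov/expapi/verbs/interacted",
             "display": {"en-US": "interacted"}},
    "object": {"id": exercise},
    "result": {"response": json.dumps({"packet_size": 64, "count": 4})},
}

# Experiment finished: time taken expressed as an ISO 8601 duration
# (here, 2 minutes 30 seconds).
completed = {
    "actor": actor,
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": exercise},
    "result": {"duration": "PT2M30S"},
}
```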
More specialised learner activities are also recorded by the FORGE widgets depending on the functionalities offered by each widget. For example, the PT Anywhere widget that offers a network simulation environment records the following types of activities, reusing already defined vocabulary:
Device creation, update and removal: We use the verbs “create”, “delete” and “update” from “http://activitystrea.ms/schema/1.0/”.
Link creation and removal (i.e., connecting and disconnecting two devices): Link creation and removal are expressed as a user creating (or deleting) a link whose two endpoints are recorded as contextual information. An alternative would have been to define new connect/disconnect verbs to express that a user connects one device to another (with the second device added as contextual information). However, we chose the first alternative because it reuses already existing verbs.
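Under the chosen alternative, a link-creation statement could be sketched as follows. The link URI and the context extension key are assumptions introduced for illustration; the verb IRI is the existing Activity Streams "create" verb referenced above.

```python
link_created = {
    # Placeholder anonymised learner account.
    "actor": {"account": {"homePage": "https://forge.example.org",
                          "name": "anon-42"}},
    # Reusing the existing Activity Streams verb rather than minting
    # a new "connect" verb.
    "verb": {"id": "http://activitystrea.ms/schema/1.0/create",
             "display": {"en-US": "created"}},
    # The object is the link itself (illustrative URI)...
    "object": {"id": "https://forge.example.org/sim/links/link-7"},
    # ...and its two endpoints travel as contextual information
    # (the extension key is a hypothetical example).
    "context": {
        "extensions": {
            "https://forge.example.org/xapi/endpoints": ["pc-1", "router-1"]
        }
    },
}
```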
These statements are collected in the Learning Locker, which features a simple but effective dashboard, giving a quick overview of the activities over time, as well as the most active users and activities, as shown in Figure 2.
Figure 2: A screenshot of the FORGE LRS visualising learner activities.
FORGE provides learners with Learning Analytics dashboards in order to raise their awareness of their learning activities, giving an overview of their progress and of social structures in the course context. Learners are offered detailed records of their learning activities, allowing them to monitor their progress and compare it with that of their fellow learners. Additionally, the Learning Analytics dashboards targeted at educators provide an in-depth overview of the activities taking place within their courses, making educators aware of how their courses and experimentation facilities are being used by their students.
In order to improve the ways we facilitate awareness and reflection for learners and educators, we are developing further ways of analysing and visualising the captured Learning Analytics data. Our goal is to help educators better understand the use of experimentation facilities by their students, as well as to allow learners to compare their use of the experimentation facilities with that of other learners. Towards this goal, we are developing graph models in order to visualise the different sequences of steps carried out by learners when conducting an experiment via the FORGE widgets. The following widget displays a model of the different sessions recorded by the PT Anywhere widget. This model is customised by the learner or the educator, who specifies the different levels to visualise, i.e. the number of steps or actions to be displayed. In this particular model, the different states for each level apply to a network device, which is part of a network simulation experiment, and refer to its creation (ADD), removal (DEL), update (UPD), connection (CONN) and disconnection (DISCONN). Additionally, a NOOP state is used to represent the lack of action in sessions with fewer actions recorded than levels shown.
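A minimal sketch of how such a level-based model might be assembled, assuming each recorded session has already been reduced to an ordered list of action labels. The function and sample data are illustrative, not the FORGE implementation: it counts the transitions between states at consecutive levels, padding short sessions with NOOP as described above.

```python
from collections import defaultdict

def session_graph(sessions, levels):
    """Count transitions between per-level states across sessions.

    sessions: list of sessions, each an ordered list of action labels
              (e.g. "ADD", "DEL", "UPD", "CONN", "DISCONN").
    levels:   number of steps to visualise; shorter sessions are
              padded with "NOOP".
    Returns a dict mapping (level, state, next_state) -> count.
    """
    counts = defaultdict(int)
    for actions in sessions:
        # Truncate to the requested depth, then pad with NOOP.
        padded = list(actions[:levels]) + ["NOOP"] * max(0, levels - len(actions))
        for level in range(levels - 1):
            counts[(level, padded[level], padded[level + 1])] += 1
    return dict(counts)

sessions = [
    ["ADD", "ADD", "CONN"],
    ["ADD", "CONN"],                  # shorter session: padded with NOOP
    ["ADD", "ADD", "CONN", "DEL"],    # longer session: truncated to 3 levels
]
print(session_graph(sessions, levels=3))
```

The resulting transition counts can be rendered as a layered graph, with edge weights showing how many learners followed each path.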
Models such as the one featured in the widget below allow educators to get a more detailed view of how learners conduct experiments using the FORGE widgets. Learners can also use these models to replay their sequence of interactions with the FORGE widgets, as well as to view the interaction sequences of other learners. Beyond raising awareness, these models enable learners to reflect on their learning process, for example by comparing their own sequences of interactions, and their experimentation results, with those of their peers. Additionally, educators can reflect on the design of the experimentation facilities and the associated learning materials by studying usage patterns that can reveal common difficulties learners face when conducting experiments. Educators can also provide suggested sequences of interactions to their students as a means of scaffolding their experimentation tasks.
The following code of use has been adapted from The Open University’s Policy on Ethical use of Student Data for Learning Analytics.
FORGE courses and tools collect and analyse student data as a means of providing information relating to student support and retention.
In the context of the FORGE courses and tools, learning analytics is the use of raw and analysed student data to proactively identify interventions which aim to support students in achieving their study goals. Such interventions may be designed to support individual students and/or the entire cohort.
Different organizations are contributing to FORGE with their own educational materials. Some of these organizations (like those participating in the Open Call) are external to the FORGE consortium. Other organizations might reuse existing materials with their own students. Therefore, there is a need to establish common guiding principles which help provide a clear framework for the ethical application of learning analytics.
All data captured as a result of the interaction with the student has the potential to provide evidence for learning analytics. Data will, however, only be used for learning analytics where there is likely to be an expected benefit (which will be evaluated) to students’ learning.
The techniques used in learning analytics are based on standard statistical methods, but typically involve the development of complex models, the full working of which will only be apparent to those familiar with the data and with the statistical methods employed. It is likely, however, that users will want to understand how the models produce the outcomes which they then deploy. Students will want to understand why they have been selected for an intervention and, in some cases, may want to challenge the basis for their selection. A potential conflict exists therefore between creating models which provide the most reliable outcomes and those which work in ways that can be made transparent to users and subjects.
Learning analytics can be applied to individual students as well as to defined groups of students (as a result of identifying a student via combinations of characteristics and/or study behaviours), and to whole cohorts of students (as a result of amending the assessment regime on a module following observed behaviours and/or results, for example). The policy and principles created apply in all cases.
Any use made of data regarding individual students must be compliant with the Data Protection principles and policies of the institution or institutions running each course.
The following definitions are intended to provide clarity about terms used throughout the Policy.
The FORGE associate partners use and apply information strategically (through specified indicators) to retain students and help them progress towards completing their study goals. This is done at two levels:
In the future, the use of learning analytics may be extended to personalised learning paths, adaptive learning, personalised feedback, visualisations of study journey, intelligent e-tutoring, intelligent peer support, etc. Furthermore, new technological innovations might allow for more targeted, measured approaches.
The following categories of data might be captured by the associate partners as part of their interaction with students, and may be available as individual or combined data sets for use in learning analytics:
Out of scope
In its adoption of a learning analytics approach to provide student support, the associate partners do not intend to use the following types of data. This list is subject to review.
Ethical issues relating to the use of student data for academic research
Applications to use student data for the purposes of research will need to be made in accordance with the standard processes in place currently in each associate partner (e.g., ethics committees). Bodies considering applications for research using learning analytics should assess if the projects comply with this policy. The bodies, within the remit of their own terms of reference, may approve research proposals that test the boundaries of this policy. If the outcomes of that research may then be applied to operationally targeting individuals or groups of students, further alignment with this policy will be required.
This policy aims to set out how associate partners should use student data in an ethical way in order to shape the student support provided. The document and accompanying guidelines are not regulatory in nature but are intended to inform and guide the ethical use of student data.
The policy is based around eight key principles discussed in more detail in the policy statement below.
Each of the above principles is linked to particular aspects of learning analytics.
Purposes and boundaries
Principles 1 and 2 make clearer why the associate partners adopt learning analytics as one of many means of providing effective and targeted student support, whilst recognising that appropriate student support is driven by students as real and diverse individuals, rather than by data or information.
Principle 1: Students should not be wholly defined by their visible data or our interpretation of that data.
Principle 2: The purpose and the boundaries regarding the use of learning analytics should be well defined and visible.
Engaging students in the use of their data
Principles 3 and 4 reflect the shared responsibility of both the student and the associate partner for student learning.
Principle 3: The associate partners are transparent regarding data collection, and will provide students with the opportunity to update their own data and consent agreements at regular intervals.
Principle 4: Students should be engaged as active agents in the implementation of learning analytics (e.g. informed consent, personalised learning paths, interventions).
Ensuring that data is used wisely
The final principle supporting the policy relates to the need to ensure that any interpretation or manipulation of data to extract meaning is based on sound techniques that are subject to expert peer review and, if necessary, to advice and mentoring by those more experienced in techniques of quantitative data analysis.
Principle 5: Modelling and interventions based on analysis of data should be sound and free from bias.