Commit 16dc76e2 authored by Malte Neugebauer's avatar Malte Neugebauer
Added dashboard tool script and related papers (preprints).

# Math Digital Mentoring
This repository offers material to enrich mathematical exercises in the Learning Management Systems (LMS) Moodle and ILIAS with interactive elements, e.g., gamification or adaptive learning, and the necessary materials to analyze their effects. This lightweight solution can be implemented by lecturers in their courses using the on-board tools of the learning management system. No installation of an additional plugin is needed.
There are several ways to get started.
- Take a look at some working examples in the [Live Demos Section](#live-demos).
- Load a stable version of the material in your LMS and adapt it to your needs: [LMS-readable Packages Section](#lms-readable-packages).
- Copy and paste the code directly into questions in your LMS: check the [Insertable Code Section](#insertable-code).
- Once applied at your university, the resulting effects of the tested versions can be measured with the analytics tools offered in the [Effect Measurement Section](#effect-measurement). There you will find a lightweight dashboard tool that can also be implemented without installation. For deeper analysis, Python code is provided as well.
- Check the [Paper Section](#papers) for findings from research related to this approach.
Apart from that, the repository also contains [Screenshots](#screenhots) and information about [Troubleshooting](#troubleshooting), [Exercise Generation](#exercise-generation) and the [License](#license) (MIT).
@@ -135,7 +135,26 @@ The question files for the different versions are automatically parsed with the
[//]: # (There are various ways to address the effects of this approach on learners. We propose an experimental setting and test at least two versions: One serves as a control condition and at least one other serves as the treatment condition.)
[//]: # (<REPLACE < with open and > with closed paranthesis>The LMS offers various learning data to lecturers <e. g. grade statistics> by default that may address the question, what effect this approach has on learners. We propose to focus on learners' usage patterns among different designs. The following excerpt from the file [overview.pdf]<> visualizes the data from the datasets [alquiz-analysis-control.csv]<>, [alquiz-analysis-test_its.csv]<> and [alquiz-analysis-test_rpg.csv]<> from the folders [university1]<./analysis/university_1> and [university2]<./analysis/university_1>. The latter files include each movement of several users within the three different designs <normal, pedagogical agent <instant tutoring> and fantasy game>. These files can be generated by lecturers by running the scripts [alquiz-analyisis-control.js]<> or [alquiz-analyisis-test.js]<> respectively for either the control design or the pa/fantasy designs in the LMS attempt overview page. The Python script [visualize_patterns.py]<> generates several Latex files from that, that are included in the [overview.tex]<> file. For further information about the measurement of effects with this approach please refer to the [Papers Section]<#papers>)
Choose the [Lightweight Dashboard Tool](#lightweight-dashboard-tool) for a first overview of your learners' data. For a deeper analysis you may want to check the [Python code](#python-code) section.
## Lightweight Dashboard Tool
1. Open the file [lightweight_la.js](./analysis/lightweight_la.js) with a text editor. Select all the code and copy it. Open the browser of your choice and navigate to the developer console. This can be done in different browsers as follows:
- For Google Chrome, open the Chrome Menu in the upper-right-hand corner of the browser window and select More Tools > Developer Tools. You can also use Option + ⌘ + J (on macOS), or Shift + CTRL + J (on Windows/Linux).
- For Firefox, click on the Firefox Menu in the upper-right-hand corner of the browser and select More Tools > Browser Console. You can also use the shortcut Shift + ⌘ + J (on macOS) or Shift + CTRL + J (on Windows/Linux).
- For Microsoft Edge, open the Edge Menu in the upper-right-hand corner of the browser window and select More Tools > Developer Tools. You can also press CTRL + Shift + I to open it.
- For other browsers, please refer to their documentation.
2. Click on the console's input and paste the code you copied in the previous step.
3. Run the code. This can usually be done by pressing Enter.
4. An icon should appear on the screen. Click on it to open the dashboard tool. Choose your analysis method and follow the instructions in the dashboard.
## Python Code
The LMS offers lecturers various learning data (e.g., grade statistics) by default that may address the question of what effect this approach has on learners. To gain a deeper understanding, learners' usage patterns can additionally be visualized as shown in the screenshot by following the steps below. The following excerpt from the file [overview.pdf](./analysis/tex/overview.pdf) visualizes the data from the exemplary dataset [hops.csv](./analysis/hops.csv).
![excerpt from the analysis overview](./img/screenshot_analysis.png)
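For a quick first look at such a dataset outside the provided LaTeX pipeline, a few lines of pandas and matplotlib are enough. The following is only a minimal sketch and not part of the repository's scripts; it assumes that hops.csv contains one row per recorded movement and a column named `user`, so adapt these names to the actual header of the file.

```python
# Minimal sketch for a first custom look at the raw data, independent of the
# repository's own analysis scripts. Assumes hops.csv has one row per recorded
# movement ("hop") and a column named "user" -- adapt the names to the file.
import pandas as pd
import matplotlib.pyplot as plt

# Load the exemplary dataset shipped with the repository.
hops = pd.read_csv("./analysis/hops.csv")

# Count how many recorded movements each user produced.
hops_per_user = hops.groupby("user").size().sort_values(ascending=False)

# Plot the counts as a simple bar chart and save the figure.
ax = hops_per_user.plot(kind="bar", figsize=(8, 4))
ax.set_xlabel("User")
ax.set_ylabel("Number of recorded movements")
plt.tight_layout()
plt.savefig("hops_per_user.png", dpi=150)
```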
@@ -173,6 +192,8 @@ If you chose to implement the code directly as described in the [Insertable Code
[//]: # (Since the approach is based on Javascript, the regulations for using JavaScript inside the LMS are of high importance for the functionality of the approach.)
## Papers
- Neugebauer, M.; Erlebach, R.; Kaufmann, C.; Mohr, J.; Frochte, J. (accepted 2024). *Efficient Learning Processes By Design: Analysis of Usage Patterns in Differently Designed Digital Self-Learning Environments.* In: Proceedings of the 16th International Conference on Computer Supported Education. [Preprint](./preprints/Efficient%20Learning%20Processes%20By%20Design.pdf)
- Neugebauer, M. (accepted 2024). *Lightweight Learning Analytics Dashboard for Analyzing the Impact of Feedback & Design on Learning in Mathematical E-Learning.* In: Proceedings 18. Workshop Mathematik in ingenieurwissenschaftlichen Studiengängen. [Preprint](./preprints/Lightweight%20Learning%20Analytics%20Dashboard%20for%20Mathematics.pdf)
- Neugebauer, M.; Tousside, B.; Frochte, J. (2023). *Success Factors for Mathematical E-Learning Exercises Focusing First-Year Students.* In: Proceedings of the 15th International Conference on Computer Supported Education (CSEDU). https://doi.org/10.5220/0011858400003470
- Neugebauer, M.; Frochte, J. (2023). *Steigerung von Lernerfolg und Motivation durch gamifizierte Mathematik-Aufgaben in Lernmanagementsystemen.* In: 21. Fachtagung Bildungstechnologien (DELFI). https://doi.org/10.18420/delfi2023-39