Please use this identifier to cite or link to this item:
https://hdl.handle.net/1889/5152
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Zanichelli, Francesco | - |
dc.contributor.author | Thabet, Baha Mohammad Ghalib Khader | - |
dc.date.accessioned | 2022-11-03T07:12:17Z | - |
dc.date.available | 2022-11-03T07:12:17Z | - |
dc.date.issued | 2022 | - |
dc.identifier.uri | https://hdl.handle.net/1889/5152 | - |
dc.description.abstract | Combining Deep Knowledge Tracing (DKT) and Transformer-based recommendation with serious games can establish an intelligent model of players’ knowledge state across missions that automatically generates help text. This model allows players to look one or more steps ahead by predicting their performance in the next missions of gameplay. Then, if needed, the model generates proactive recommendations in the form of flashcards so that players can complete the next mission successfully. In this research, we introduce a novel Intelligent Serious Games (ISG) model that integrates the state-of-the-art DKT method with a fine-tuned Transformer-based recommendation component to improve players’ programming skills in C++, one of the most common programming languages in first-year computer science and engineering bachelor programs. We propose novel hybrid prediction models for DKT, a novel Transformer-based recommender architecture, and a novel Transformer-based framework tailored to three generation tasks: producing flashcards in the form of supporting paragraphs, questions, and answers. In addition, we fine-tune GPT-2, GPT-Neo, BART and T5 models on four new programming-skills datasets. Flashcards are the main tool of the Spaced Repetition memorization method, yet they are often unavailable for many topics because of the high effort required to create them; Transformer-based models simplify this process. Our findings show the effectiveness of integrating the DKT method and the Transformer-based recommender with serious games. The results show that the fine-tuned Transformer-based framework models can generate coherent, semantically inspected C++ paragraphs, questions, and answers with code examples in a fully automated process from a single input string. Moreover, the proposed hybrid prediction models with a multi-layer learning approach for DKT achieved the best prediction performance among the compared models, and assessment of the proposed DKT-Missing Sequence Padding (MSP) recursive method demonstrated its effectiveness in predicting several steps ahead when sequences contain missing values. Finally, a novel approach to evaluating the DKT method on each fixed-length sequence enabled us to trace and investigate each knowledge state. | en_US |
dc.language.iso | English | en_US |
dc.publisher | Università degli studi di Parma. Dipartimento di Ingegneria e architettura | en_US |
dc.relation.ispartofseries | Dottorato di ricerca in Tecnologie dell' Informazione | en_US |
dc.rights | © Baha Thabet, 2022 | en_US |
dc.rights | Attribution - NonCommercial - ShareAlike 4.0 International | en_US |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-sa/4.0/ | * |
dc.subject | intelligent serious games | en_US |
dc.subject | deep knowledge tracing (DKT) | en_US |
dc.subject | LSTM | en_US |
dc.subject | CNN | en_US |
dc.subject | hybrid prediction | en_US |
dc.subject | programming skills | en_US |
dc.subject | missing sequence padding | en_US |
dc.subject | Transformer | en_US |
dc.subject | GPT-2 | en_US |
dc.subject | question and answer generation | en_US |
dc.subject | BART | en_US |
dc.subject | T5 | en_US |
dc.title | Towards intelligent serious games: integrating deep knowledge tracing and transformer-based recommendation | en_US |
dc.type | Doctoral thesis | en_US |
dc.subject.miur | INF/01 | en_US |
Appears in Collections: | Tecnologie dell'informazione. Tesi di dottorato |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
THABET_BAHA_Thesis_07-2022.pdf | Thesis_07_2022 | 3.24 MB | Adobe PDF | View/Open |
BT_PhD_Activity_Report_07-2022.pdf Restricted Access | Activity_Report_pdf | 121.38 kB | Adobe PDF | View/Open Request a copy |
This item is licensed under a Creative Commons License