Universiti Teknologi Malaysia Institutional Repository

Arabic automatic question generation using transformer model.

Alhashedi, Saleh and Mohd. Suaib, Norhaida and Bakri, Aryati (2022) Arabic automatic question generation using transformer model. In: 4th International Conference on Green Engineering and Technology 2022, IConGETech 2022, 17 November 2022 - 18 November 2022, Seoul, South Korea.

Full text not available from this repository.

Official URL: http://dx.doi.org/10.1063/5.0199032

Abstract

Students of all ages benefit greatly from the use of questions in the evaluation process and in the improvement of their overall educational outcomes. As the educational process adapts, shifts to online learning, and educational content on the internet grows rapidly, institutions, schools, and academic organisations struggle to generate exam questions in a timely manner using outdated manual methods. Exam question preparation is a complex and time-consuming activity that calls for an in-depth familiarity with the subject matter and the skill to construct the questions, both of which grow more challenging as text size increases. The goal of automatic question generation (AQG) is to generate natural, relevant, and answerable questions from a variety of text inputs. Arabic has seen only a small number of contributions to this problem. Many existing works rely on rule-based methods and manually constructed question templates, using input text from children's books, stories, or textbooks. These approaches lack linguistic diversity, and the task becomes increasingly difficult and time-consuming as the quantity of text increases. The Transformer is one of the most flexible deep-learning architectures for natural language processing (NLP). In this research, we propose a fully automated Arabic AQG model built on the Transformer architecture, which can take a single Arabic document of arbitrary length and generate N questions from it. These questions can be used in educational contexts. Our model achieves 19.12 BLEU, 23.00 METEOR, and 51.99 ROUGE-L on the mMARCO dataset.
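The abstract describes a Transformer-based pipeline that accepts an Arabic document of arbitrary length and produces N questions from it. The following is a minimal sketch of how such a pipeline could be assembled with the Hugging Face Transformers library; the model name, chunking parameters, and prompt handling are illustrative assumptions, not the authors' implementation.

```python
# Sketch of Transformer-based Arabic question generation over a long document.
# Assumption: a seq2seq checkpoint fine-tuned for Arabic question generation is
# available; "UBC-NLP/AraT5-base" is used here only as a placeholder backbone.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "UBC-NLP/AraT5-base"  # placeholder; swap in a QG-fine-tuned checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def chunk_text(text: str, max_words: int = 120) -> list[str]:
    """Split an arbitrarily long document into passage-sized chunks."""
    words = text.split()
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), max_words)]

def generate_questions(document: str, questions_per_chunk: int = 2) -> list[str]:
    """Generate questions from an Arabic document of unbounded length."""
    questions = []
    for passage in chunk_text(document):
        inputs = tokenizer(passage, return_tensors="pt",
                           truncation=True, max_length=512)
        outputs = model.generate(
            **inputs,
            num_beams=4,
            num_return_sequences=questions_per_chunk,
            max_length=64,
        )
        questions.extend(tokenizer.batch_decode(outputs, skip_special_tokens=True))
    return questions
```

Generated questions would then typically be scored against reference questions with BLEU, METEOR, and ROUGE-L, the metrics reported in the abstract.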

Item Type: Conference or Workshop Item (Paper)
Uncontrolled Keywords: Transformer, Natural language processing, Learning models, Students, Educational
Subjects: T Technology > T Technology (General)
T Technology > T Technology (General) > T58.6-58.62 Management information systems
Divisions: Centre of Information and Communication Technology
ID Code: 108795
Deposited By: Muhamad Idham Sulong
Deposited On: 09 Dec 2024 06:23
Last Modified: 09 Dec 2024 06:23
