Using of Transformers Models for Text Classification to Mobile Educational Applications

Authors

Keywords:

Natural Language Processing, Multiclass Text Classification, Transformers, Bidirectional Encoder Representations from Transformers.

Abstract

In Q2 2022, educational apps were the second most popular category on the Google Play Store, accounting for 10.47% of the apps available worldwide. This work explores the application of five BERT-based pre-trained models built on the Transformer architecture to classify mobile educational applications by knowledge field: bert-base-cased, bert-base-uncased, roberta-base, albert-base-v2 and distilbert-base-uncased. The study uses a dataset of educational apps from Google Play, which was enriched with each app's description and category because it lacked this information. For every model, a tokenizer was applied and fine-tuning was performed to train it on the classification task. After training, the testing phase was carried out; each model went through four training epochs to obtain better results: roberta-base achieved 81% accuracy, bert-base-cased 80%, bert-base-uncased 79%, albert-base-v2 78% and distilbert-base-uncased 76%.
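The fine-tuning workflow the abstract describes can be sketched with the Hugging Face Transformers library. This is a minimal illustration, not the authors' code: the label set and the tiny randomly initialized configuration below are assumptions chosen so the sketch runs without downloading pretrained weights; the paper itself fine-tunes the full pretrained checkpoints.

```python
import torch
from transformers import BertConfig, BertForSequenceClassification

# Hypothetical label set: knowledge-field categories for educational apps.
LABELS = ["mathematics", "languages", "science", "arts"]

# Tiny randomly initialized BERT so the sketch runs offline; in practice one
# would load a pretrained checkpoint instead, e.g.
# BertForSequenceClassification.from_pretrained("bert-base-uncased",
#                                               num_labels=len(LABELS)).
config = BertConfig(
    vocab_size=1000,
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
    num_labels=len(LABELS),
)
model = BertForSequenceClassification(config)

# Stand-in for tokenized app descriptions: a batch of 8 sequences of 16 ids.
input_ids = torch.randint(0, config.vocab_size, (8, 16))
attention_mask = torch.ones_like(input_ids)
labels = torch.randint(0, len(LABELS), (8,))

# One fine-tuning step: forward pass, cross-entropy loss, gradient update.
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
outputs = model(input_ids=input_ids, attention_mask=attention_mask, labels=labels)
outputs.loss.backward()
optimizer.step()

print(outputs.logits.shape)  # one logit per category for each description
```

Repeating such steps over the enriched dataset for four epochs, with the tokenizer of each checkpoint, corresponds to the training procedure summarized in the abstract.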


Author Biographies

Anabel Pilicita, Universidad Politécnica de Madrid, 28040 Madrid, Spain

Anabel Pilicita obtained a Master's degree in Network and Telematic Services Engineering from the Universidad Politécnica de Madrid (UPM) in 2016, where she is currently pursuing a Ph.D. in Telematic Services Engineering. Her research interests include the application of natural language processing and new artificial intelligence models.

Enrique Barra, Universidad Politécnica de Madrid, 28040 Madrid, Spain

Enrique Barra received the Ph.D. degree in telematics engineering with a minor in multimedia and technology enhanced learning from the Universidad Politécnica de Madrid (UPM). He has participated in many European projects, such as GLOBAL, FIWARE, and C@R. He is currently involved in several projects contributing to the generation and distribution of educational content in TEL environments. His research interests include videoconferencing, games in education, and social networks in education.


Published

2023-06-20

How to Cite

Pilicita, A., & Barra, E. (2023). Using of Transformers Models for Text Classification to Mobile Educational Applications. IEEE Latin America Transactions, 21(6), 730–736. Retrieved from https://latamt.ieeer9.org/index.php/transactions/article/view/7645