Please use this identifier to cite or link to this item: https://repositori.mypolycc.edu.my/jspui/handle/123456789/9469
Full metadata record
DC Field                      Value                                                             Language
dc.contributor.author         Hong, Younhee                                                     -
dc.date.accessioned           2026-04-15T04:51:26Z                                              -
dc.date.available             2026-04-15T04:51:26Z                                              -
dc.date.issued                2025-09-29                                                        -
dc.identifier.issn            doi.org/10.3390/electronics14193860                               -
dc.identifier.uri             https://repositori.mypolycc.edu.my/jspui/handle/123456789/9469    -
dc.description.abstract       This study proposes OP-LLM-SA, a lightweight model based on knowledge distillation, for building an on-premise AI system for public documents, and evaluates its performance on 80 public documents. Token accuracy was 92.36% and the complete-sentence rate was 97.19%, meaningful results relative to the original documents. Inference required only about 4.5 GB of GPU memory, indicating that the model can run on general office computers, and the Korean-language variant of Llama-3.2 performed best among the evaluated LLMs. The study is significant in proposing a system that can efficiently process public documents in an on-premise environment; in particular, it is expected to ease the burden on teachers who process public documents. Future work will extend this text-mining approach to school administration and to other administrative document-processing environments that handle public documents and personal information.    ms_IN
dc.language.iso               en                                                                ms_IN
dc.publisher                  MDPI                                                              ms_IN
dc.relation.ispartofseries    Electronics; 2025, 14, 3860                                       -
dc.subject                    Knowledge distillation                                            ms_IN
dc.subject                    Model compression                                                 ms_IN
dc.subject                    Administrative document processing                                ms_IN
dc.subject                    On-premise AI                                                     ms_IN
dc.subject                    Natural language processing (NLP)                                 ms_IN
dc.title                      DESIGN AND EVALUATION OF KNOWLEDGE-DISTILLED LLM FOR IMPROVING THE EFFICIENCY OF SCHOOL ADMINISTRATIVE DOCUMENT PROCESSING    ms_IN
dc.type                       Article                                                           ms_IN
Appears in Collections: JABATAN KEJURUTERAAN ELEKTRIK

Files in This Item:
File                                                         Description    Size         Format
Design and Evaluation of Knowledge-Distilled LLM for.pdf                    929.74 kB    Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.