
Please use this identifier to cite or link to this item:
https://repositori.mypolycc.edu.my/jspui/handle/123456789/9469

| Title: | DESIGN AND EVALUATION OF KNOWLEDGE-DISTILLED LLM FOR IMPROVING THE EFFICIENCY OF SCHOOL ADMINISTRATIVE DOCUMENT PROCESSING |
| Author: | Hong, Younhee |
| Keywords: | Knowledge distillation; Model compression; Administrative document processing; On-premise AI; Natural language processing (NLP) |
| Issue Date: | 29-Sep-2025 |
| Publisher: | MDPI |
| Series/Report no.: | Electronics; 2025, 14, 3860 |
| Abstract: | This study proposed OP-LLM-SA, a knowledge distillation-based lightweight model for building an on-premise AI system for public documents, and evaluated its performance on 80 public documents. The token accuracy was 92.36% and the complete-sentence rate was 97.19%, showing meaningful results compared to the original documents. Inference required only about 4.5 GB of GPU memory, indicating that the model can run on general office computers, and the Korean-language-support variant of Llama-3.2 showed the best performance among the LLMs tested. This study is significant in that it proposes a system that can efficiently process public documents in an on-premise environment; in particular, it is expected to help teachers who are burdened with processing public documents. In the future, we plan to conduct research to expand the application of text mining technology beyond school administration to various administrative document-processing environments that handle public documents and personal information. |
| URI: | https://repositori.mypolycc.edu.my/jspui/handle/123456789/9469 |
| DOI: | https://doi.org/10.3390/electronics14193860 |
| Appears in Collections: | JABATAN KEJURUTERAAN ELEKTRIK |
| File | Description | Size | Format | |
|---|---|---|---|---|
| Design and Evaluation of Knowledge-Distilled LLM for.pdf | | 929.74 kB | Adobe PDF | View/Open |
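The abstract describes OP-LLM-SA as a knowledge-distillation-based lightweight model, but the record does not show the training details. As a hedged illustration only, the standard distillation objective (a temperature-scaled KL divergence between teacher and student output distributions, per Hinton et al.) can be sketched in plain NumPy; all names and values below are illustrative and not taken from the paper:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in standard knowledge distillation."""
    p = softmax(teacher_logits, T)  # soft targets from the (large) teacher
    q = softmax(student_logits, T)  # predictions from the (small) student
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return float(np.mean(kl) * T * T)

# Identical logits yield zero loss; diverging logits yield a positive loss.
same = distillation_loss(np.array([[1.0, 2.0]]), np.array([[1.0, 2.0]]))
diff = distillation_loss(np.array([[1.0, 2.0]]), np.array([[2.0, 1.0]]))
```

In practice this soft-target term is usually combined with the ordinary cross-entropy loss on hard labels; the higher temperature exposes the teacher's relative ranking of wrong answers, which is much of what the smaller student learns from.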
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
