TY - JOUR
AU - Sirojul Alam
AU - Jaka Abdul Jabar
AU - Fauzi Abdurrachman
AU - Bambang Suharjo
AU - H.A Danang Rimbawa
PY - 2024/11/09
Y2 - 2025/04/03
TI - Improving Large Language Model’s Ability to Find the Words Relationship
JF - Jurnal Bumigora Information Technology (BITe)
JA - BITe
VL - 6
IS - 2
SE - Articles
DO - 10.30812/bite.v6i2.4127
UR - https://journal.universitasbumigora.ac.id/index.php/bite/article/view/4127
AB - Background: The capabilities of popular and widely used large language models (LLMs) such as the Generative Pre-trained Transformer (GPT) can still be enhanced. One approach is the Retrieval-Augmented Generation (RAG) architecture, which incorporates external data into the model to improve LLM capabilities. Objective: This research aims to show that RAG can help LLMs respond with greater precision and better reasoning. Method: The method used in this work utilizes the Huggingface Application Programming Interface (API) for word embedding and for storing words and finding the relationships between them. Result: The results demonstrate RAG’s effectiveness, as shown by the clearly rendered relationship graph. The knowledge obtained is logical and understandable; for example, Logistic Regression is related to accuracy and F1 score and is identified as a simple model that outperforms the Naïve Bayes and Support Vector Machine (SVM) models. Conclusion: RAG effectively improves LLM capabilities.
ER -