
Soft Knowledge Prompt: Help External Knowledge Become a Better Teacher to Instruct LLM in Knowledge-based VQA

  • Qunbo Wang
  • Ruyi Ji
  • Tianhao Peng
  • Wenjun Wu
  • Zechao Li
  • Jing Liu*
  • *Corresponding author for this work
  • CAS - Institute of Automation
  • Beihang University
  • Nanjing University

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

LLMs have achieved impressive performance on multi-modal tasks, which have received ever-increasing research attention. Recent research focuses on improving prediction performance and reliability (e.g., addressing the hallucination problem), often by prepending relevant external knowledge to the input text as an extra prompt. However, such methods are affected by noise in the knowledge and by the context-length limitation of the LLM. In this work, we focus on making better use of external knowledge and propose a method that actively extracts the valuable information in the knowledge to produce a latent vector as a soft prompt, which is then fused with the image embedding to form a knowledge-enhanced context to instruct the LLM. Experimental results on knowledge-based VQA benchmarks show that the proposed method makes better use of external knowledge and helps the model achieve better performance.
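The pipeline the abstract describes — extract the useful part of retrieved knowledge into one latent vector, then fuse it with the image embedding as a soft prompt — can be sketched roughly as below. All names, the attention-pooling extractor, the concatenation-based fusion, and the dimensions are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

D = 8  # shared embedding dimension (illustrative)
rng = np.random.default_rng(0)

def encode_knowledge(passages, d=D):
    """Stand-in knowledge encoder: map each retrieved passage to a vector.
    A real system would use a trained text encoder here."""
    return rng.standard_normal((len(passages), d))

def soft_knowledge_prompt(passage_vecs, query_vec):
    """'Actively extract' useful information: attention-weighted pooling of
    passage vectors against the question vector, so noisy passages receive
    low weight; the result is a single latent soft-prompt vector."""
    scores = passage_vecs @ query_vec            # relevance score per passage
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax over passages
    return weights @ passage_vecs                # (d,) latent soft prompt

def build_context(image_embedding, soft_prompt):
    """Fuse the soft prompt with the image embedding to form the
    knowledge-enhanced context fed to the LLM (here: simple concatenation
    of one extra prompt token in front of the visual tokens)."""
    return np.concatenate([soft_prompt[None, :], image_embedding], axis=0)

# Toy usage
passages = ["passage about landmarks", "noisy passage", "passage about cuisine"]
question_vec = rng.standard_normal(D)
image_emb = rng.standard_normal((4, D))          # 4 visual tokens (illustrative)

prompt = soft_knowledge_prompt(encode_knowledge(passages), question_vec)
context = build_context(image_emb, prompt)
print(context.shape)  # (5, 8): one soft-prompt token + four image tokens
```

A key design point the abstract highlights: because the knowledge enters as one latent vector rather than prepended text, the context consumes a single token slot regardless of how much raw knowledge was retrieved, sidestepping the LLM's context-length limit.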

Original language: English
Title of host publication: Long Papers
Editors: Lun-Wei Ku, Andre F. T. Martins, Vivek Srikumar
Publisher: Association for Computational Linguistics (ACL)
Pages: 6132-6143
Number of pages: 12
ISBN (Electronic): 9798891760943
DOI
Publication status: Published - 2024
Event: 62nd Annual Meeting of the Association for Computational Linguistics, ACL 2024 - Bangkok, Thailand
Duration: 11 Aug 2024 → 16 Aug 2024

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
Volume: 1
ISSN (Print): 0736-587X

Conference

Conference: 62nd Annual Meeting of the Association for Computational Linguistics, ACL 2024
Country/Territory: Thailand
City: Bangkok
Period: 11/08/24 → 16/08/24
