Bibliographic Details
Title: |
Federated Prompting and Chain-of-Thought Reasoning for Improving LLMs Answering |
Authors: |
Liu, Xiangyang, Pang, Tianqi, Fan, Chenyou |
Publication Year: |
2023 |
Collection: |
Computer Science |
Subject Terms: |
Computer Science - Artificial Intelligence |
Description: |
We investigate how to enhance answer precision for frequently asked questions posed by distributed users using cloud-based Large Language Models (LLMs). Our study focuses on a typical situation in which users ask similar queries that involve identical mathematical reasoning steps and problem-solving procedures. Because LLMs' zero-shot prompting on standalone questions yields unsatisfactory accuracy, we propose to improve answers to distributed synonymous questions using Self-Consistency (SC) and Chain-of-Thought (CoT) techniques. Specifically, we first retrieve synonymous questions from a crowd-sourced database and create a federated question pool. We call these federated synonymous questions with the same or different parameters SP-questions or DP-questions, respectively. We refer to our methods as Fed-SP-SC and Fed-DP-CoT, which can generate significantly more accurate answers for all user queries without requiring sophisticated model tuning. Through extensive experiments, we demonstrate that our proposed methods can significantly enhance question-answering accuracy by fully exploiting the synonymous nature of the questions and the consistency of the answers. |
Document Type: |
Working Paper |
Access URL: |
http://arxiv.org/abs/2304.13911 |
Accession Number: |
edsarx.2304.13911 |
Database: |
arXiv |