GPTs and Hallucination.

Bibliographic Details
Title: GPTs and Hallucination.
Authors: Waldo, Jim (jim_waldo@harvard.edu); Boussard, Soline (soline_boussard@harvard.edu)
Source: Communications of the ACM. Jan 2025, Vol. 68, Issue 1, p40-45. 6p.
Subject Terms: Crowdsourcing; Language models; Hallucinations (Artificial intelligence); ChatGPT; Machine learning; Human-artificial intelligence interaction
Abstract: The article discusses various aspects of large language models (LLMs) such as ChatGPT and the reasons why LLMs often produce nonfactual and inaccurate responses, also known as hallucinations. Human-artificial intelligence (AI) relations are assessed, along with machine learning, LLM training sets, and the concept of epistemic trust. Crowdsourcing is also addressed.
Database: Business Source Index
Description
ISSN: 0001-0782
DOI: 10.1145/3703757