Target Audience: Tech Supplier | Publication Date: March 2024 | Document Type: Market Perspective | Document Number: US51964924

Understanding and Mitigating Large Language Model Hallucinations

By: 

  • Alan Webber
  • David Schubmehl

Table of Contents


  • Executive Snapshot

    • Figure: Executive Snapshot: Understanding the Issues and Opportunities in Mitigating Large Language Model Hallucinations

  • New Market Developments and Dynamics

    • Types of LLM Hallucinations

    • Causes of LLM Hallucinations

    • Model Training Issues

    • Data Issues

    • Ways to Mitigate LLM Hallucinations

    • Model Training and Behavior-Focused Mitigation

    • Data-Focused Mitigation

    • People-Focused Mitigation Efforts

    • Model-Oriented Mitigation Efforts

    • LLM Self-Refinement

    • Employing RAG as a Mitigation Tool

    • Interesting Vendor Efforts to Mitigate LLM Hallucinations

  • Advice for the Technology Supplier

  • Learn More

    • Related Research

    • Synopsis