Tags: Research, GPT-4, patient education, dermatology, reference hallucination
GPT-4 Evaluates Patient Education for Keloids
Relevance Score: 7.1
Researchers systematically evaluated GPT-4 (2025–2026) as a patient-education tool for scars and keloids by prompting it with 354 Reddit questions and 49 queries from medical websites. GPT-4 produced generally reliable answers (75.5% understandability; DISCERN-AI 26.3/35; global quality 4.28/5) but showed only moderate readability (Flesch Reading Ease 50.13, roughly 12th-grade level) and hallucinated 11.8% of cited references, indicating a need for simpler language and citation validation.
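The readability figure above comes from the standard Flesch Reading Ease formula, where higher scores mean easier text (60–70 is roughly plain English; 50.13 corresponds to college-entry difficulty). A minimal sketch of the metric, assuming the counts (words, sentences, syllables) have already been extracted from the text by some tokenizer:

```python
def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    """Flesch Reading Ease score.

    Standard formula: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    The word, sentence, and syllable counts are assumed to be supplied by an
    upstream tokenizer; syllable counting in particular is heuristic in practice.
    """
    if sentences == 0 or words == 0:
        raise ValueError("text must contain at least one word and one sentence")
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)


# Example: 100 words over 5 sentences with 150 syllables
score = flesch_reading_ease(words=100, sentences=5, syllables=150)
print(round(score, 3))  # → 59.635
```

A score near 50, as reported for GPT-4's answers here, signals text that most patient-education guidelines would consider too difficult, which is why the study recommends simplification.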

