45.3 Outputs are confident but wrong
Symptom
The model states something with certainty, but it is factually wrong. This is hallucination: fluent, confident output with no factual grounding.
Fixes
// Fix 1: Force uncertainty acknowledgment
const uncertaintyPrompt = `
If you are not sure about a fact, prefix it with "I believe" or "Possibly".
If you cannot answer, say "I don't have enough information to answer this."
Never state facts you're uncertain about as definite.
`;
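To apply it, prepend the instruction to the user's question. This sketch reuses the `model.generate({ prompt, temperature })` call shape from Fix 3 below; `userQuestion` is an illustrative placeholder, not part of the recipe.

// Usage sketch: `model` and `userQuestion` are assumed placeholders.
const response = await model.generate({
  prompt: `${uncertaintyPrompt}\n\nQuestion: ${userQuestion}`,
  temperature: 0,
});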
// Fix 2: Use RAG with citations
// `sources` is an array of retrieved text chunks (see the sketch below).
const ragPrompt = `
Answer ONLY using the provided sources.
For each claim, cite the source: [1], [2], etc.
If the sources don't contain the answer, say "Not found in sources."
Sources:
${sources.map((s, i) => `[${i+1}] ${s}`).join('\n')}
`;
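Note that `sources` must be populated before the template above is built. A minimal retrieval sketch, assuming a hypothetical `retriever.search` that returns text chunks; swap in your vector store's actual API:

// Hypothetical retriever; replace with your vector store or search API.
const sources: string[] = await retriever.search(userQuestion, { topK: 5 });
const grounded = await model.generate({ prompt: ragPrompt, temperature: 0 });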
// Fix 3: Self-consistency (ask multiple times)
async function selfConsistency(prompt: string, n = 3) {
  const responses = await Promise.all(
    Array(n).fill(null).map(() =>
      model.generate({ prompt, temperature: 0.7 })
    )
  );
  // Only trust the result if all answers agree
  const unique = new Set(responses.map(r => r.answer));
  if (unique.size === 1) return responses[0];
  return { answer: null, note: "Answers disagreed, need verification" };
}
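Calling it looks like this. The nonzero temperature is what lets the samples differ, and a null answer is the signal to escalate rather than guess (`model` is the same assumed client as above):

// Usage: treat disagreement as "don't know", not as a tie to break.
const result = await selfConsistency("What year was the transistor invented?");
if (result.answer === null) {
  // Fall back to retrieval, a stronger model, or human review.
  console.log(result.note);
}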
// Fix 4: Add a verification step
const draft = await model.generate({ prompt: question });
const verification = await model.generate({
  prompt: `
Is this statement true? Give evidence.
Statement: ${draft.answer}
`,
});
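The recipe above stops at generating evidence; to act on it you still need a verdict. A minimal gating sketch, assuming (this protocol is not part of the original recipe) that the model is asked to reply with the literal token VERIFIED or UNVERIFIED:

// Assumed protocol: force a one-word verdict, then gate the draft on it.
const verdict = await model.generate({
  prompt: `Based on this evidence, reply VERIFIED or UNVERIFIED only.\n${verification.answer}`,
});
const final = verdict.answer.trim() === "VERIFIED"
  ? draft.answer
  : "I don't have enough information to answer this.";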