Building AI that 'doesn’t lie' is a myth because of fundamental limits in how these systems are built. Training data is incomplete and biased, and those flaws carry through into inaccurate outputs. AI can also produce confidently stated but factually incorrect answers, known as 'hallucinations,' because it has no genuine understanding of the content it generates. It lacks real-world comprehension, operates on statistical correlations rather than verified facts, and cannot independently discern a universal truth.
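To make the "operates on correlations, not truth" point concrete, here is a minimal, hypothetical sketch: a toy completion model that extends a prompt using only word-co-occurrence counts from its training text. The training sentences, function names, and the deliberately skewed "data" are all illustrative assumptions, not any real system's implementation; the point is that frequency, not factual accuracy, decides the output.

```python
# Hypothetical illustration: a toy "language model" that completes prompts
# purely from word-co-occurrence counts in its training text. It has no
# notion of truth -- it simply echoes whichever correlation is most frequent.
from collections import Counter, defaultdict

# Skewed training text (illustrative): the misconception appears more often
# than the true fact, so the model learns the misconception.
training_text = (
    "the capital of australia is sydney . "
    "the capital of australia is sydney . "
    "the capital of australia is canberra . "
)

# Count which word most often follows each preceding word.
bigram_counts = defaultdict(Counter)
tokens = training_text.split()
for prev, nxt in zip(tokens, tokens[1:]):
    bigram_counts[prev][nxt] += 1

def complete(prompt: str, max_new_tokens: int = 5) -> str:
    """Greedily extend the prompt with the statistically most likely next word."""
    words = prompt.split()
    for _ in range(max_new_tokens):
        last = words[-1]
        if last not in bigram_counts:
            break
        # Pick the most frequent continuation -- frequency, not accuracy.
        nxt = bigram_counts[last].most_common(1)[0][0]
        words.append(nxt)
        if nxt == ".":
            break
    return " ".join(words)

# The toy model confidently outputs the majority answer from its skewed data,
# i.e. it "hallucinates" the wrong capital because that correlation dominates.
print(complete("the capital of australia is"))  # -> the capital of australia is sydney .
```

Real models are vastly larger and more sophisticated, but the underlying issue sketched here is the same: the output reflects patterns in the training data, and nothing in the mechanism checks those patterns against a ground truth.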