Hacker News

Okay, but it's weird there is a "don't lie to me" button.





The "don't lie to me" button for a human is asking them, "where did you learn that fact?"

Grounding isn't very different from that.
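To make the analogy concrete, here is a minimal sketch of what grounding looks like in code: the model is asked to cite a source id for every claim, and the citation can then be checked against the source text. All names here (`SOURCES`, `build_grounded_prompt`, `verify_citation`) are illustrative, not any vendor's actual API.

```python
# Toy illustration of grounding: every claim must point back to the
# passage it came from, and that pointer can be verified mechanically.

SOURCES = {
    "doc1": "The Eiffel Tower is 330 metres tall.",
    "doc2": "The Eiffel Tower was completed in 1889.",
}

def build_grounded_prompt(question: str) -> str:
    """Assemble a prompt that asks for an answer citing a source id
    for each claim (the 'where did you learn that?' constraint)."""
    context = "\n".join(f"[{doc_id}] {text}" for doc_id, text in SOURCES.items())
    return (
        "Answer using ONLY the sources below, citing the source id "
        f"for each claim.\n\n{context}\n\nQuestion: {question}"
    )

def verify_citation(claim: str, doc_id: str) -> bool:
    """Check that the cited source actually contains the claimed text."""
    return claim in SOURCES.get(doc_id, "")

# A grounded claim passes; an uncited invention fails.
print(verify_citation("completed in 1889", "doc2"))  # True
print(verify_citation("is painted gold", "doc1"))    # False
```

The verification step is the "don't lie to me" button: the model can still be wrong, but it can no longer assert something without an inspectable provenance trail.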


How would that ever work? The only real lever is to keep refining higher-quality training datasets. Hallucination rates only trend downward on the high-end models as they improve across the board.



