• 0 Posts
  • 1 Comment
Joined 3 months ago
Cake day: December 8, 2025

  • Hallucination isn’t nearly as big a problem as it used to be. Newer models aren’t perfect but they’re better.

    The problem addressed by this isn’t hallucination, it’s the training to avoid failure states. Instead of guessing (which is different from hallucinating), the system is forced to give a negative response. That’s easy, and any company, big or small, could do it; the big companies just like the bullshit.
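    A minimal sketch of that idea, assuming the model exposes some confidence score alongside its answer (the `StubModel` class and its `generate_with_confidence` method here are hypothetical, just to illustrate forcing a negative response instead of a guess):

    ```python
    class StubModel:
        """Hypothetical model that returns an answer plus a confidence score."""

        def generate_with_confidence(self, question):
            # Toy heuristic: pretend the model is only confident about one question.
            if question == "What is the capital of France?":
                return "Paris", 0.95
            return "Lyon", 0.30  # a low-confidence guess

    def answer_or_abstain(question, model, threshold=0.8):
        """Return the model's answer only if its confidence clears the threshold;
        otherwise force a negative response instead of letting it guess."""
        answer, confidence = model.generate_with_confidence(question)
        if confidence < threshold:
            return "I don't know."
        return answer
    ```

    The point is that the abstention logic itself is trivial; the hard (and commercially unpopular) part is training the model to produce honest confidence in the first place.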