Author: Scott Boehmer

  • Information literacy and chatbots as search

If someone uses an LLM as a replacement for search, and the output they get is correct, this is just by chance. Furthermore, a system that is right 95% of the time is arguably more dangerous than one that is right 50% of the time. People will be more likely to trust the output, and likely less able to fact-check the 5%.

    But even if the chatbots on offer were built around something other than LLMs, something that could reliably get the right answer, they’d still be a terrible technology for information access.

    Professor Emily Bender

    Information literacy and chatbots as search (Mystery AI Hype Theater 3000)

  • American Reality

    The shining possibility of an America living up to its ideals feels washed away by the dark reality of the America that is.