Scott Boehmer


BS Machines Broken By BS

Written by Scott Boehmer in Technology. Tagged: AI, Artificial Intelligence, Chatbots.

You can trick AI chatbots like ChatGPT or Gemini into teaching you how to make a bomb or hack an ATM if you make the question complicated, full of academic jargon, and cite sources that do not exist.


Researchers Jailbreak AI by Flooding It With Bullshit Jargon (404 Media)
