You can trick AI chatbots like ChatGPT or Gemini into teaching you how to make a bomb or hack an ATM if you make the question complicated, pack it with academic jargon, and cite sources that do not exist.
Researchers Jailbreak AI by Flooding It With Bullshit Jargon (404 Media)