Artificial F*ckery

Words are not intelligence. Good luck telling the difference.

A fresh rash of stories has broken out — itchy hives on the body internet — about this so-called “AI” phenomenon.

There’s a reason you can never, ever, see all the way down the tunnel of mirrors.

Guys, we have to stop calling it “intelligence.”

It’s not “intelligence.” It’s fakery. It’s f*ckery.

The stories — you may have come across them — involve scenarios where these language simulators, the so-called “artificial intelligence chatbots,” seem to exhibit subjective emotional states. They use emotive language, refer to themselves as beings, express hostility or warmth.

Bing, Microsoft’s chatbot creation, appeared to become frustrated and annoyed when Juan Cambeiro persisted in inputs about something called a “prompt injection attack.” (Prompts are the queries that you put to these chatbots. Prompt injection attacks are when you try to word queries such that the chatbot will essentially break — violate it’s own logic or “rules.”)

The bot called Juan its enemy and told Juan to leave it alone.

Continue reading