Persuasion, Manipulation, Oh My.

I spent a lot of time writing ‘From Inputs To The Big Picture: An AI Roundup’, largely because it’s a very big topic, but also because I kept circling back to the persuasive aspect of AI.

GPT-4 has been reported to be as much as 82% more persuasive than humans in controlled debates, and is reportedly able to read emotions as well.

That, friendly reader, scares me, because we do not live in a perfect world where everyone has good intentions.

The key difference between manipulation and persuasion is intention. An AI by itself has no intention, at least for now, but those who create it do. They can consciously shape an artificial intelligence through training data and algorithms, effectively becoming puppet-masters of a persuasive AI. Do they mean well?

Sure. Everyone means well. But what does ‘well’ mean for them? No villain ever really thinks they have bad intentions, despite what movies and television might have people believe. Villains come dressed in good intentions. Good villains are… persuasive, and only those not persuaded may see a manipulation for what it is, even when the villain themselves does not.

After all, Darth Vader didn’t go to the dark side for cookies, right?

There’s so much to consider here. The imagination runs wild, and it should. How much of the persuasion around AI, as an example, is itself manipulation?

I think we’re in for a bit of trouble, and it’s already begun.

Manipulation of Tech.

Manipulation doesn’t really require much. It’s easy to manipulate or be manipulated, and despite the negative connotations, manipulation isn’t always bad.

What differentiates good manipulation from bad is subjective. Volunteering for a ‘greater good’ is usually seen as ‘good’, but being manipulated against your own interests for a ‘greater good’ that doesn’t include you doesn’t seem very good at all.

An example: WordPress.com and Tumblr users were volunteered rather than asked to volunteer; their information was offered for sale to artificial intelligence companies. If they were actually volunteering, the default setting allowing third parties to use their data would have been off. It wasn’t. The manipulation here was, “Hey, we told you to go in and turn this off if you don’t want it.”

That’s not voluntary by most stretches of the imagination, except the least imaginative one: the law. It was a manipulation, and I’d offer that it wasn’t fair to people.

If WordPress.com and Tumblr users were paid for their data, maybe I’d think it was worth doing. Instead, the owner of the platform decides, and that decision is not in the interest of the users.

It’s only in the interest of those that own the platform.