

Don’t know if I’d call that an intention of the machine rather than of its creator. Hate to be that kind of person, but it’s similar to the whole thing of “guns don’t kill, people do.”
LLMs aren’t people. They’re not self-aware and don’t have any inner complexities like, say, a dog or a sheep has. There’s no drive or motivation. It’s just maths.
If you tie someone to a train track and a train comes along and kills them, it’s not like the train or the track intended to kill the person. That was the intent of you, the one who “programmed” the scenario.
Similar to guns, strict control is what will be needed to fix these kinds of things. Letting megalomaniac billionaires who see people as nothing but numbers run amok with narcissistic manipulator systems isn’t a recipe for anything good.


Don’t know if they do LLMs, but they’ve had machine learning features for years. They added the ability to extract subjects from images and turn them into stickers back in 2023.