Outside of sci-fi, I have no reason to believe that real AI (as opposed to an LLM) would have any motive, or any ability, to wipe out humanity.
The only exception to that is if some military application of AI gets “out of control,” but if we ever hand humanity-ending weapons to an AI, then…
To me, that thought experiment feels the same as how sci-fi treats the idea.
Why would a paperclip machine (which for some reason is an AI) be given so much power over its environment, and no limit on how many paperclips it makes, that it would decide it needs to turn organic matter into paperclips?
That’s always what sci-fi goes with too: humans might turn it off, so destroy all humans. I don’t find it compelling in real life, and it falls under what I meant in my first comment.
(Admittedly, some of my disagreement falls apart given that companies like Microsoft will put “AI” into shit like Notepad; I can only imagine what they’d do with real AI.)