Refusing to reduce complex reality into slogans and clichés since 19XX

  • 6 Posts
  • 470 Comments
Joined 3 months ago
Cake day: February 5th, 2026

  • Thank you.

    I think the issue is that when people hear “AI,” their minds immediately jump to the sci-fi AI systems depicted as smart or smarter than humans. They then see the stupid mistakes LLMs make and reasonably conclude these systems are nothing alike, so LLMs don’t count as AI in their minds.

    However, the AI systems in sci-fi aren’t just intelligent - they’re generally intelligent. That’s what LLMs lack.

    The way I see it, there are levels to intelligence. A chess bot is a narrowly intelligent system. It’s great at one thing but can’t do anything else. Then there’s Artificial General Intelligence (AGI), which is basically human-level intelligence. The next step up is Artificial Superintelligence (ASI) - a generally intelligent system that’s superhuman across the entire field of intelligence, unlike a chess bot that’s only “superhuman” at chess.

    I’d say LLMs are somewhere between narrow intelligence and AGI. They can clearly do more than just generate language, but not to the extent humans can, so I wouldn’t call them generally intelligent. At least not yet.

    And yeah, I don’t think sentience necessarily needs to come along for the ride. It might, but it’s not obvious to me that one couldn’t exist without the other. One can imagine a system that’s superintelligent, yet for which it doesn’t feel like anything to be that system.


  • But we don’t have an agreed-upon definition for intelligence either:

    • The ability to acquire, understand, and use knowledge.
    • The ability to learn or understand, or to deal with new or trying situations.
    • The ability to apply knowledge to manipulate one’s environment, or to think abstractly as measured by objective criteria (such as tests).
    • The act of understanding.
    • The ability to learn, understand, and make judgments or have opinions that are based on reason.
    • The ability to perceive or infer information, and to retain it as knowledge to be applied to adaptive behaviors within an environment or context.

    I see AI as a term similar to “plants.” When I hear this complaint it sounds to me like someone asking how strawberries and sequoia trees can both be plants when they couldn’t be further apart. Well yeah, but that’s why we have more specific terms when we’re referring to a particular plant - just like with AI. Plants and AI are both parent categories that cover a wide range of subcategories.



  • I stopped doing Brazilian Jiu-Jitsu after I got injured like 3 times over 6 months, whereas I’ve been lifting weights for 15 years and haven’t gotten injured once. It was fun, but I don’t want to do it at the expense of my health. Also, there’s always the one guy who takes it too seriously and ruins it for everyone else.

    Gaming is another one. I get into it for a month about once a year but I’m just not particularly drawn to it anymore.


  • Marketing only calls everything AI because that’s the only term people recognize. ChatGPT is AI, yes, but to be specific it’s a Large Language Model. Dall-E is also AI, but the more accurate term is Diffusion Model. There’s just no point in using these terms in marketing because 90% of people would have no idea what you’re talking about.

    When people say that LLMs are not AI, they usually mean that LLMs are not generally intelligent (AGI), which is true, but they still count as AI.