

That’s the messing around part.
Refusing to reduce complex reality into slogans and clichés since 19XX




I’d rather get canceled by the online mob than by the North Sentinel tribe.


I’d bring weed and snacks.


Thank you.
I think the issue is that when people hear “AI,” their minds immediately jump to the sci-fi AI systems depicted as being as smart as or smarter than humans. They then see the stupid mistakes LLMs make and reasonably conclude these systems are nothing alike, so LLMs don’t count as AI in their minds.
However, the AI systems in sci-fi aren’t just intelligent - they’re generally intelligent. That’s what LLMs lack.
The way I see it, there are levels to intelligence. A chess bot is a narrowly intelligent system. It’s great at one thing but can’t do anything else. Then there’s Artificial General Intelligence (AGI), which is basically human-level intelligence. The next step up is Artificial Superintelligence (ASI) - a generally intelligent system that’s superhuman across the entire field of intelligence, unlike a chess bot that’s only “superhuman” at chess.
I’d say LLMs are somewhere between narrow intelligence and AGI. They can clearly do more than just generate language, but not to the extent humans can, so I wouldn’t call them generally intelligent. At least not yet.
And yeah, I don’t think sentience necessarily needs to come along for the ride. It might, but it’s not obvious to me that one couldn’t exist without the other. It’s conceivable that a system could be superintelligent while it doesn’t feel like anything to be that system.


But we don’t have an agreed-upon definition for intelligence either:
I see AI as a term similar to “plants.” When I hear this complaint it sounds to me like someone asking how strawberries and sequoia trees can both be plants when they couldn’t be further apart. Well yeah, but that’s why we have more specific terms when we’re referring to a particular plant - just like with AI. Plants and AI are both parent categories that cover a wide range of subcategories.


I genuinely have no clue what you’re even disagreeing with.


There are only two reasons to pull the emergency brake in a moving vehicle: either you’re messing around and making the car fishtail was the desired outcome, or your regular brakes failed and you need to stop fast instead of just coasting to a halt. In the latter case, the vast majority of people will lose control of the car immediately, and it will be an absolutely terrifying experience.


Go pull a handbrake at highway speed, simulating an emergency situation, and then report back. I’ll wait.


By pulling the handbrake if it’s a FWD car, or dumping the clutch if it’s RWD.
And even if they’re not, this is the worst they’ll ever be.


In rock climbing circles that would be called super-good-enough.



Eliminating the brake fluid also reduces the residual torque and the drag between the pads and the discs, which improves the efficiency and durability of the braking system.
Source (What an awful website)


I don’t think this is the case. At least I can’t find any source to back that up.
I really liked the larping aspect of airsoft. I might consider getting back into it if there was a group of only adult players but the average age in the local events is probably like 13 and I can’t handle that.
I stopped doing Brazilian Jiu-Jitsu after I got injured three times over six months, whereas I’ve been lifting weights for 15 years and haven’t gotten injured once. It was fun, but I don’t want to do it at the expense of my health. Also, there’s always the one guy who takes it too seriously and ruins it for everyone else.
Gaming is another one. I get into it for a month about once a year but I’m just not particularly drawn to it anymore.


This does still have brake pads and rotors. The brake lines just get replaced with wires.


Pulling the handbrake on a moving vehicle is, generally speaking, a really bad idea. It’ll stop, yeah, but it’ll be really scary for a moment before that.
Outside.


Marketing only calls everything AI because that’s the only term people recognize. ChatGPT is AI, yes, but it’s a Large Language Model to be specific. Dall-E is also AI, but the more accurate term is Diffusion Model. There’s just no point in using these terms in marketing because 90% of people would have no idea what you’re talking about.
When people say that LLMs are not AI, they usually mean that LLMs are not generally intelligent (AGI), which is true, but they do still count as AI.
Yeah, I’m not a native English speaker, so I’m not sure about the correct terminology. Oversteer is probably the better term here.