One time at work I was trying to work out a least-squares fit using linear algebra.
I have no background in linear algebra, so it felt like drawing a pentagram on the floor and chanting in backwards Latin.
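For what it's worth, the "ritual" is less scary than it looks. A minimal sketch of a least-squares fit via linear algebra, using NumPy (the data here is made up for illustration):

```python
import numpy as np

# Hypothetical data: fit y ≈ m*x + b. These points lie exactly on y = 2x + 1.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])

# Design matrix A: one column for the slope (x), one for the intercept (1s),
# so that A @ [m, b] approximates y.
A = np.column_stack([x, np.ones_like(x)])

# Solve the least-squares problem min ||A p - y||^2.
coeffs, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
m, b = coeffs
print(m, b)  # ≈ 2.0 and 1.0 for this data
```

No chanting required: build a matrix of your inputs, stack on a column of ones for the intercept, and let the solver minimize the squared error.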
It’s like looking at a stonetoss comic and trying to argue it’s a coincidence that all the [others] do [harmful stereotype]


Both are standard; it's just a question of whose standard.
Every instance of every digit of pi in order: 000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000…
I’ll work on it later.
Making a pan-Indian beef burger would be complicated at best.
The more common example would be the misuse of Native American war bonnets at music festivals and the like.


Never use AI for something you don’t understand, especially in a niche subject! AI is very good at connecting words that are statistically likely to go together, and it’s also biased toward producing answers that sound confident.
The end result is an answer that appears correct, and if you don’t know the subject well enough to spot an “apparently correct but factually wrong” answer, you’ll just believe it.
How’s the bedbug colony treating you?
I think the golden rule with LLMs is “never trust the output.” If it’s a task you can 100% verify or has virtually no associated risk, then go right ahead.
It’s just so deeply frustrating to keep seeing people look at LLM results and treat them as truthful instead of truthy.
When’s waking and breakfast, though?
Thought this post was going in VERY different direction from the first few sentences!
Or rather, nothing exists until it is perceived?
Counterpoint: affectionate acts form a nearly-continuous spectrum from chaste to extreme. You could draw a line virtually anywhere and still make the argument that the two adjacent acts are basically the same.
Both horrifying and completely non sequitur
Realistically I think they just need to put more work into getting men pregnant.


Anybody can code an application; it takes a software engineer to barely code it.
Because if you acknowledge that he’s autistic, then a LOT of the series becomes “laugh at a nasty caricature of a disabled person.”


Lol. HOAs are on a strict boot-based diet. Any well-reasoned argument will just make them install MORE boots.
Also: I really want to introduce the neologism caligavore. Can we all make that a thing?
They already listed cucumber sauce