

Even Wozniak has said that while Jobs wasn’t a good engineer, he did know enough to be strategically savvy.
Edited in hindsight for clarity.




Finally, the bean counter steps down so that an engineer can take the reins again.
I mean, at least he didn’t fuck things up like so many other bean counters have. But he was only ever a bean counter.


It was a different era. Only the poorest women wore nothing but flats; a large number of women wore heels everywhere except on soft ground, basically from the moment they stepped out of bed in the morning until they went back to bed at night. Granted, they may not have been particularly high heels, but they were stereotypical heels all the same.


Congee. It’s a Chinese rice dish. It doesn’t taste bad, but it has all the looks and texture of regular boiled rice mixed with equal parts runny snot. It matches the “runny snot” impression even more closely if it’s correctly salted.
Pork Shumai. Also Chinese, but actually delish AF. But like so many other Dim Sum foods, it also looks - if made correctly - like it’s dripping with gelatinous snot.
Chicken feet. Also Chinese, these are braised in soya sauce until they look like little clawed grave markers of an incompletely-buried avian body. You’re not Chinese if you don’t eat these with absolute gusto. Meanwhile I’m thinking of the barnyard poop indelibly buried in each crevice of that clawed monstrosity.
Sauce: married to a first-gen ethnic Chinese spouse for the last twenty years. I’ve been exposed to a lot of Chinese foods, especially those from the Canton region.


As a Canadian: fuck, no.
Poutine isn’t visually unpalatable in the least. It’s just fries with curds and gravy. Unless the kitchen did a total hash of the dish and fucked it up six ways to Sunday, there ain’t no way it looks bad.
It’s even better with extras in it, like pulled pork, wiggly bacon chunks, or chopped onion greens.


Sheep’s head, or sheep’s brain?
The former is fine.
The latter screams spongiform encephalopathy.
Came here to say this. All three of the major desktop operating systems have built-in controls to ignore the closing of the lid. Being forced to keep the lid open to keep a laptop on is a rookie workaround.
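A sketch of those built-in controls. The command names are real, but exact flags and config paths vary by OS release, and the long-running command at the end is just a placeholder:

```shell
# Linux (systemd): ignore lid close in /etc/systemd/logind.conf, then restart logind:
#   HandleLidSwitch=ignore
#   HandleLidSwitchExternalPower=ignore
sudo systemctl restart systemd-logind

# Windows: set the "lid close action" to Do Nothing (0) for the active power
# scheme, on AC power, then re-apply the scheme.
powercfg /setacvalueindex SCHEME_CURRENT SUB_BUTTONS LIDACTION 0
powercfg /setactive SCHEME_CURRENT

# macOS: there's no official always-on toggle, but `caffeinate` keeps the
# machine awake for the duration of a command (closed-lid "clamshell" use
# normally requires external power and a display).
caffeinate -i ./some_long_running_task
```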


And I see hierarchy as essential and required for anything beyond a small, isolated community of 50-200 people.
The difference being, through technology we can make despot-proof hierarchies that self-prune away those who hunger for power and influence.
For example, direct-participatory democracy is literally political communism, and totally eliminates all politicians. What remains is a network of functionaries and bureaucrats (invariably in meritocratically elected boards of limited duration) whose sole employed purpose is to carry out the will of the populace in whatever ministry they occupy. There is literally no single person in any position who can seize any kind of control, and powerful checks and balances exist throughout the system to permit an effective and efficient but subservient state that can deal with issues at scales that small communities cannot.
The downside being that truly effective direct-participatory democracy requires three foundations to be in place:
Once these three are solidly in place, direct-participatory democracy can be implemented, and only after it has been does communism have any chance of surviving.


Real communism has a massive flaw in that it is too idealistic and fails to account for human corruption and the pursuit of power. Especially since communism is all about equalizing power among the people. Which is also how it has always been co-opted and destroyed from within shortly after it has been implemented.
This is why I fight against calling any current country “communist”, because those countries so severely violate everything that makes a state communist. These are authoritarian kleptocracies, nothing more. They use “communism” as a thin veneer of legitimacy over a fetid, rotting carcass of dictatorship that violently oppresses the people.


Which is why these were never communist states, any more than North Korea is democratic, or the old East Germany was a republic.
Just because these states wore the word “communism” like a thin veneer of legitimacy, does not a communist state make.


Communism can very much be decentralized, and in fact a correct implementation tends to be exactly that.
Because that’s where “communism” the term comes from - community, communal, etc…


I could get behind that.
But wealth is power, and power does not corrupt so much as it attracts the corruptible. You would need to work with all manner of sociopaths and malignant narcissists. And these are people who have the least justification for existing in a polite society.
Plus, they would also continue to be parasites on civilization, and continue to pathologically hoard more wealth than they could possibly spend in a million lifetimes.
Honestly, a guillotine is a lot simpler and a lot faster. Take out the top 0.01% of civilization, and the remaining members of the Parasite Class will not fight when you implement 99% top-tier tax rates, close all of the high-wealth loopholes, and build proper social frameworks that benefit everyone.
And this starts with the political system, with a high-tech direct-participation democracy which eliminates all politicians in favour of letting everyone vote on all issues. This requires a foundation with a population that is well educated in critical thinking and bullshit detection (which would destroy all conservatism in the first place), and an economic system (even modified capitalism) that meets everyone’s needs so everyone would have the headspace to deal with societal questions without being forced to always focus on economic survival. Without this political framework, socialism/communism of any form would continue to be corrupted and co-opted by strongmen and tyrants.
Because when you look at any attempt to implement communism in the past, it never survived beyond a few months to maybe a year or so. Sure, Russia had its revolution in 1917, but by 1918 Russian communism was effectively dead; taken over by an authoritarian kleptocracy no different than a feudal system.


Eternally youthful but mortal life.
I don’t mind dying. Death is what makes life have meaning. Let me live long enough and at some point I would be eager to wrap up my concerns and shuffle off this mortal coil.
But I would prefer to die on my own terms, at a time of my own choosing, and in the meantime exist with all the physical and mental vigour of someone between the ages of 25 and 45.
And the key is not being immortal, as I would not want to always survive grievous injuries. I would want to be mortal on purpose – if an accident would kill a normal human despite immediate medical attention of the highest modern quality, I would want to die just the same. I would not want to continue existing as bloody paste painting the interior hull of an airliner that smacked into a mountain.
But barring accidents, I would love to loiter and observe the next few centuries in great health and youthful vigour. Doing what, I don’t know. That’s for the future to determine. But it would be interesting.


As I pointed out in another root comment, the average - depending on the model being tested - tends to sit between 60% and 80%. But this is with no restriction on source materials; the LLMs are essentially pulling from world+dog in that case.
So this opens up an interesting option for users, in that hallucinations/inaccuracies can be controlled for and potentially reduced by as much as ⅔ simply by restricting the model to those documents/resources that the user is absolutely certain contain the correct answer.
I mean, 25% is still stupidly high. In any prior era, even 2.5% would have been an unacceptably high error rate for a business to stomach. But source-restriction seems to be a somewhat promising guardrail to use for the average user doing personal work.
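The rough arithmetic behind that ⅔ reduction claim, as a sketch. The 60-80% “in the wild” figures and the reduction factor come from the comment above; the function name is mine:

```python
def grounded_rate(wild_rate: float, reduction: float = 2 / 3) -> float:
    """Estimated error rate after restricting the model to trusted documents."""
    return wild_rate * (1 - reduction)

# Applying the up-to-2/3 reduction to both ends of the in-the-wild range:
for wild in (0.60, 0.80):
    print(f"in-the-wild {wild:.0%} -> grounded ~{grounded_rate(wild):.0%}")
```

Which lands the grounded rate in the low-to-mid 20s - still high, as noted, but a big step down from the unrestricted figures.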


How much do large language models actually hallucinate when answering questions grounded in provided documents?
Okay, this is looking promising, at least in terms of the most important qualifications being plainly stated in the opening line.
Because the amount of hallucinations/inaccuracies “in the wild” - depending on the model being tested - runs about 60-80%. But then again, this would be average use on generalized data sets, not questions focusing on specific documentation. So of course the “in the wild” questions will see a higher rate.
This also helps users, as it shows that hallucinations/inaccuracies can be reduced by as much as ⅔ by simply limiting LLMs to specific documentation that the user is certain contains the desired information, rather than letting them trawl world+dog.
Very interesting!


That may be the case, but the most irritating thing is that they fill all available spots with the lowest-capacity chips that meet the requested provisioning spec, instead of using the fewest higher-capacity chips needed to meet it. The latter, at least, would leave spots open for an authorized repair location to manually solder on more approved chips of compatible spec.


Read it again. It occurs even with a full system wipe and re-install from Microsoft-direct media, or even a full hard drive swap. It is wholly independent of what is on the hard drive; the only restriction is that it can only successfully run when injected into Windows.


One example of many.
You must be new to tech not to remember this. It wasn’t all that long ago.


If you have the money and want simplicity, reliability, and interoperability, go for a Mac. Just clench your sphincter and maximize the RAM; 32GB is the minimum appropriate for a 7-8yr lifespan of basic duties. And FFS, go for storage of 2.5× what your current data uses, or 1TB, whichever is larger (there are vital performance reasons for that). Don’t get the smallest storage unless third-party upgrade options exist, as they do for the Mac Mini M4. And remember: all RAM and a lot of storage are integrated these days, which is why you should always max them out; there is no upgrade path except wholesale replacement of the machine. CPU is largely immaterial unless you are doing truly heavy lifting like video editing or AI, so that can often be the lowest choice.
If you want freedom and truly unconstrained system, some form of Linux/BSD on a Framework system is the way to go. Or if a desktop, hand-assemble it yourself.
If you are going to stick with Windows, go for a business-class Dell. Trust me, it’ll be almost as $$$$ painful as a Mac, but these little f**kers are built to last. At least you can upgrade the RAM and on-board storage, although I honestly recommend not going under 32GB for anything other than basic tasks. It’ll be a lot zippier with 32GB, even if you spend the first week tearing all the AI and built-in spyware out of Windows.
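The storage rule of thumb above (current data × 2.5, or 1TB, whichever is larger) as a sketch. The multiplier and the floor come from the comment; the function name is mine:

```python
def recommended_storage_gb(current_data_gb: float) -> float:
    """Storage to configure at purchase time, since integrated storage
    usually can't be upgraded later."""
    # 2.5x headroom over current data, with a 1 TB (~1000 GB) floor.
    return max(current_data_gb * 2.5, 1000)

print(recommended_storage_gb(300))  # 750 GB of headroom, so the 1 TB floor wins
print(recommended_storage_gb(600))  # 2.5x rule wins: 1500 GB
```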


When young people face a system explicitly designed to extract as much wealth out of them as possible, nerfing their economic potential well into adulthood via crushing debt, is such a response really that unexpected?