• 0 Posts
  • 8 Comments
Joined 7 months ago
Cake day: October 22nd, 2025

  • No, sales are going down because prices are going up. If you have a fixed inventory and sales go down, you lower prices to increase demand, move the product, and keep your revenue stream. But in this case, manufacturers are moving supply away from this market (consumer hardware) to a different market (AI data centers). So supply is going down against (previously) fixed demand, driving prices up. The “motherboard sales are collapsing” headline comes from looking only at the consumer hardware slice of the computing hardware market. If you look at total sales from each manufacturer, including AI data center sales, they’re having no trouble moving inventory or keeping up their overall revenue stream.
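    To make the mechanism concrete, here is a minimal sketch with made-up, purely illustrative numbers: demand is a fixed linear curve, and the manufacturer chooses how much supply to allocate to the consumer market. Shrinking that allocation raises the market-clearing price even though demand hasn’t moved.

    ```python
    # Hypothetical linear demand curve: price = a - b * quantity.
    # The numbers are illustrative, not real motherboard-market data.

    def clearing_price(quantity_supplied, a=500.0, b=2.0):
        """Price at which the fixed demand curve absorbs the supplied quantity."""
        return a - b * quantity_supplied

    # Supply allocated to the consumer market before and after
    # capacity is diverted to AI data centers.
    price_before = clearing_price(150)  # larger consumer allocation
    price_after = clearing_price(100)   # smaller consumer allocation

    print(price_before, price_after)  # price rises as supply shrinks
    ```

    Same demand curve in both cases; only the supplied quantity changes, and the clearing price goes up, so fewer units sold at higher prices is consistent with steady overall revenue.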



  • Two weeks ago, we wrote about Palantir going mask-off for fascism, specifically about CEO Alex Karp’s company posting a 22-point manifesto that included some genuinely ugly stuff about how “certain cultures” are “regressive and harmful” and how pluralism is a “shallow temptation.” I argued that this kind of public ideological positioning was both morally bankrupt and strategically suicidal. The moral bankruptcy part should be obvious (if it’s not, go do some soul-searching). But doing so at a time when American-style fascism is historically unpopular basically everywhere, including within the US, just seems like you’ve bet on the losing team at a time when it’s clear they have no chance of coming back to win.

    I keep seeing this logic that:

    1. If a movement is unpopular it will fail in short order
    2. In the US, the current fascist movement is unpopular
    3. Therefore it will fail soon

    That may be true of a lot of movements, but fascism doesn’t work like that. It doesn’t need popularity; it just needs control over the levers of power. The Heritage Foundation and many, many other conservative groups have been working for decades, some since the 1950s, to seize control of those levers.

    Palantir aligning with this fascism is not nearly the clearly failing strategy the author believes it to be. There’s a very real chance the company succeeds for years or even decades by aligning with the current fascist regime. That regime has a lot of momentum, and I haven’t seen good evidence that the momentum is reliably ebbing. It’s hitting speed bumps, but I haven’t seen any kind of turning point. I really hope the midterm elections are that turning point: either conservatives lose Congress, or the public realizes it can’t stop this by working within the system anymore.



  • For enthusiasts, AI promises to usher in something that socialists have long dreamed of: a world without scarcity in which human beings can move finally from the realm of necessity to the realm of freedom.

    Like many problems techbros try to solve, this is a problem of politics and social organization, not technology. We have had the technology to free the entire human population from several fundamental scarcities for decades (food and housing most prominently, but also many diseases), but the groups with the resources to do so actively choose not to solve those problems. Mostly because they are antisocial psychopathic billionaires.



  • The result of all this may be catastrophic. Should a worst-case scenario ever occur — a cyberattack, a natural disaster, an internet outage — there may be no human workers left with the skills that once kept food on the shelves.

    Very nerdy of me, but this reminds me of the Stargate SG-1 episode “The Sentinel.” The team travels to a planet whose civilization relies on fully automated technology. The people don’t have to operate or maintain it (normally), so their society has completely forgotten how. In the episode, one set of antagonists comes in and sabotages the planet’s defense system, and another set sees the opportunity and invades. The protagonists then have to figure out the defense system and fix it themselves.

    We don’t live in a TV series. There aren’t benevolent outsiders who will swoop down and save our systems in the nick of time when they break down. We’re headed in a bad direction.


  • We’re about to face a crisis nobody’s talking about. In 10 years, who’s going to mentor the next generation? The developers who’ve been using AI since day one won’t have the architectural understanding to teach. The product managers who’ve always relied on AI for decisions won’t have the judgment to pass on. The leaders who’ve abdicated to algorithms won’t have the wisdom to share.

    Except we are talking about that, and the tech bro response is “in 10 years we’ll have AGI, and it will do all of these things, permanently.” In their roadmap, there won’t be a next generation of software developers, product managers, or mid-level leaders, because AGI will do all those jobs faster and better than humans. There will just be CEOs, the capital they control, and AI.

    What’s most absurd is that, if that were all true, that would lead to a crisis much larger than just a generational knowledge problem in a specific industry. It would cut regular workers entirely out of the economy, and regular workers form the foundation of the economy, so the entire economy would collapse.

    “Yes, the planet got destroyed. But for a beautiful moment in time we created a lot of value for shareholders.”