• 0 Posts
  • 38 Comments
Joined 3 years ago
Cake day: June 19th, 2023


  • In that context, it comes from “crunchy granola”, aka the sort of hippie/hipster that tends to embrace that kind of trendy natural diet. Granola entered hippie circles back in the sixties, and the terminology came around somewhere in the mid seventies to early eighties.

    By the time I was in my teens, it was in use for those retro-hippie sorts that also tended to the more idealized parts of hippiedom. Those kids are now the uncles and aunts (and sometimes even grandparents or great aunts/uncles) referred to.

    In that specific context, adding in the aunt/uncle part, it is a positive. However, it’s also a tad condescending, if unintentionally so. Slapping an extra label on those aunts and uncles to lump them into an outside group is often done because the person doing the lumping thinks they’re better.

    Truth? Some of the crunchy granola sorts are dipshits. They’ll eat “natural” even if it isn’t actually healthy. They’ll rattle on about stuff that nobody else present cares about, not because that’s who they are, but because they’re evangelizing. So some of the reputation for being hippie-dippie if you’re also crunchy granola is deserved. Same with the skeevy stoner hippie reputation that often gets linked to the other labels.


  • None of my fucking business.

    See, that’s the thing about people being grown-ass adults. They get to decide what does and doesn’t work for them.

    And, despite people who want to knee-jerk the matter, there’s less difference between those two ages than there is between a 21 and a 25 year old.

    Personal development is heavily front loaded. By the mid to late twenties, most people are who they’ll always be. Friendship, romance, whatever. The only real barriers to age gaps are cultural touchstones and a handful of probable experiences (like job stuff, kids, etc) that aren’t even guaranteed to be absent anyway.

    Folks just get all het up over it because they’re morons that can’t look outside of themselves long enough to realize that their motivations and concepts towards other people aren’t actually universal.

    Two consenting adults are just fine, and nobody else has an opinion about them that matters


  • I dunno, even after we got a team close enough to count as home, my choices have always been arbitrary. Mind you, I don’t really get into team sports, but it’s always fun to have a team for casual conversation.

    When I was a kid, it was the Jets, the Eagles, the Vikings or the Bengals, depending on the year and my mood. US football, obviously. All based on their uniforms or logos. As an adult, it’s the Ravens. I just like their logo better.

    Celtics for basketball, just because I liked Larry Bird back in the day.

    Hockey, it’s the Rangers for no reason other than the name sticks in my head

    Football/soccer, it’s Arsenal. Because it’s easy to remember.

    Baseball, I tend to go with whatever Sox team pops in my head, or the Dodgers.

    Past that, I couldn’t name a team for other sports tbh


  • Wrong, no, absolutely not.

    Not giving a shit isn’t inherently wrong. Can’t be because nobody has enough inner resources to give a shit about everything. I don’t give a shit about large swathes of the human population by dint of my give-a-shitter being defective from the beginning and having worn down significantly over the years.

    Like, I don’t want bad to happen to anyone unless it’s them being hoist by their own petard. You know, fuck around and find out. I have enough schadenfreude in me to appreciate someone fucking their own shit up by being an asshole.

    But even “innocent” people getting wiped out by floods or whatever, I just don’t have the fucking spoons.

    So, even the best cop in the world, 1000% what a cop should be instead of what most of them are, it just ain’t a big deal. Soldiers, or any other job where there’s danger inherent to it as well. That actually goes for my fellow medical folks too tbh. There’s some things where the assumption of risk is there, and when the risk eventually happens, it sucks, but I lack the energy to be upset when there’s just so much fuckery in the world that people don’t agree to.

    Unless there’s malice involved in why you don’t give a shit, who cares? Hell, even if there is malice, as long as you don’t try to make it happen, who cares?

    Empathy is not an infinite well. How it gets spent is only partially voluntary. So if there’s some segment of the world that just isn’t on your concern list, it’s fine


  • Well, as always, it depends on who’s telling the story.

    Generally, one’s “true name” is the same thing as whatever they were named at birth, if you’re a real human in our world. There are exceptions though! And yes, one of those exceptions is trans people. However, in real world terms, if a person has never internalized their birth name for any reason, then their true name would be up for grabs, in that they could choose or otherwise acquire a true name of their own.

    Now, that is largely based on real world occult practices. But real world occult practices differ, sometimes significantly. There are some branches of ceremonial magic that hold the birth name to be the true name. Others hold that it’s a chosen magical name that is never told to anyone, and a third, “public” magical name is used for rites and rituals. So it isn’t like there’s only one way of thinking about this stuff in those belief systems.

    But what about the fey/sidhe/fairies or other supernatural entities in fiction? (And yes, there’s a difference between literary fiction, mythology, and religious or spiritual beliefs in the real world. IDGAF if any given person wants to partake in those beliefs or not; they are different from fiction in several important ways that are off topic beyond saying that.)

    Well, that’s when it comes down to the author/writer.

    A lot of writers of things set in our world, even when it’s a fictionalized version that works in a different way, tend to base their choices off of whatever mythology they’ve run across, then adapt that to their writing. So you run into a couple of different answers.

    Mostly, they go with the birth name being the true name. There’s exceptions, but it’s the dominant trope. In some cases, the birth name isn’t enough, it’s the exact way a person says and thinks their name that makes it true. Iirc, that’s part of the Dresden series if you want a specific example.

    That idea is where words carry more than just definitions, and names are more than just some random phonemes. The process of how we (or the characters) take our name and etch it into ourselves through thought and speech is what makes it a true name. It’s a really neat idea tbh. Have you ever noticed that sometimes a mom or dad will say their kid’s name slightly differently? Minor changes in inflection or emphasis.

    Plus, everyone sometimes says a name different. Your mom using your full name with stress in her voice hits harder than your friend introducing you to someone. And that’s different from the DMV (or equivalent) person calling your name as next. The words are the same, but the energy is different.

    So, the way we think of our name, inside, is ours in a way that nobody else can ever be exactly right unless we teach it to them. The way we speak it is different as well, but a supernatural entity in a story can overhear that part easy enough.

    Which all means that there’s no single answer to your question, but it’s still a damn fun one :)


  • There’s no set limit, not in days. Weeks, yeah. Generally, you’ll start noticing it tastes off after about five to ten days. It won’t necessarily be spoiled, but once opened, everything starts losing freshness relatively fast. Safety-wise, a week is what most people say is the cautious limit. But the truth is that if your fridge is stable, the jar didn’t have anything dipped into it, and it isn’t being opened frequently, it could potentially last a month or more without being dangerous to health.

    However, it could also grow shit overnight that would make your toilet very unhappy.

    So there’s a good bit of sense in not risking going past a week.

    The nose is a decent enough detector for most things, but in this case, the problematic bacteria can grow well before anything that smells bad does. Botulism ain’t stank, and that’s the one you have to worry about the most. Mind you, the risk is low. A lot of sauces are acidic enough that botulism isn’t going to grow well. I’d even say most. The other two risks are usually going to be less likely, because they have to get in by cross contamination of some kind, whereas botulinum can come in through the air (not likely, just possible).

    Me? I’d chuck it out after a week unless I had a specific plan to use it soon after. I hate wasting food, but the truth is that pizza sauce is cheap and common. Nobody is losing out if a half a jar gets trashed. But that’s me, I don’t make pizza often. If you’re using it regularly, and only pouring out into a bowl or onto dough without dipping things in, you’ll use it up before it could hurt you, even if it’s a few weeks.

    No bullshit, commercial canning is very good at killing nasties off. It’s only when we stick things into the container that risks start climbing to “oh shit” territory in a fridge.


  • Restaurant waste, which is what your post body starts with, can’t be recycled like that. It’s an unsafe practice due to the contamination gained at the table combined with time out of the temperature safe zone. Even if you killed all the pathogens there, the risk from the toxins left behind by those pathogens is problematic. That, and trying to kill them to a reasonable degree would ruin the food.

    Now, back in the kitchen, you could do what some restaurants do and donate the prepared but unserved food to local distribution centers (often focused on homeless charities or government outlets). But it wouldn’t make sense to turn it into some kind of “nutrition loaf”. Seriously, look up that term and be prepared to hate the prison system more than you do currently.

    And that is why even if the process could be perfectly safe, it would still suck. Nobody should have to eat the horrible crap that it would turn into. It would be cheaper, safer, and more humane to just make sure everyone has good food to eat in the first place.

    The only application for the kind of bricks you’d get from the process is feeding people that don’t have access to good, healthy food in the first place.



  • No bullshit, I’ve yet to run into jank. My bank is a credit union that isn’t a bunch of assholes, so their app works fine.

    I don’t need any authentication apps, so no worries there. If I ever do, there are some known to work with graphene.

    I’m happier with graphene than I have been in years with android overall. Last time I was really happy with android, we were still in single digit versions. The ever increasing limitations Google was applying broke my joy of it as an enthusiast.

    But graphene at least gives me back the ability to use my device without the layers of Google bullshit unless I just want to.

    So, no frustrations at all, just easy to use handheld computing.

    I’ve had this phone since early last year, I think? Might have been June? Damned if I can remember without digging up old messages lol.

    When I got it, my plan was to use the pixel for my second line in case I couldn’t make the transition. I switched sims out two days later and haven’t looked back since. If I could put graphene on my second phone, I would.



  • Stock? Nah.

    I have something like a dozen tablets and phones stacked on my desk. I get new ones, but the old ones have enough life in them that I don’t just count them as ewaste and wash my hands of them. Only two of those have current lineage available, and I can’t be arsed to update what amounts to a picture frame that isn’t connected to Wi-Fi. The rest get used as security cameras for very short term use.

    Most of them still have the os they came with as, again, I can’t be arsed to fiddle with the ones that I could dig up a rom for, or they couldn’t be unlocked to do it in the first place. But none of them were ever stock Android. Since I favored Samsung and LG tablets when I got them, the ui was highly altered from regular AOSP.

    Now, my main phone? My absolutely amazing friend gifted me a pixel with graphene ready to go as soon as it reached me. But I do still use some play store apps on it, when I can’t find something good enough that isn’t (nothing touches poweramp, and I haven’t had the budget to put towards a licence for it from the dev yet. Higher priorities).

    Never touched a pi unless it was a pie being shoved down my throat.

    Ngl though, if I wasn’t lazy as fuck, I’d likely swap to lineage on my older oneplus that’s my backup phone. Just don’t feel like dealing with the time it would take. So it’s as stock as it was when I got it a few years ago. I doubt I’ll ever do it unless I get a newer graphene device and it gets retired to the desk for infrequent uses. That’s how I ended up with a still working Galaxy Tab 2 lol. Barely still working tbh.


  • Mad rant props!

    For real though, flatpak exists partially for exactly your use case. Simple to use, won’t break shit, and pretty much available everywhere.

    You’re kinda lucky in a way. Linux in all its flavors has steadily improved over the years. Even when win10 came out and I jumped ship for all but a few niche uses, the learning curve was higher, and the switch came with much disappointment in what I couldn’t do that I had been able to on win7 (which was my favorite version of Windows overall).

    Now, while I still have my win7 drive for the two things I can’t get working on linux reliably, I can do everything else. I also have a win10 partition on my laptop for one single piece of software, because it’s easier to just keep it for the rare usage than try to figure out how to get it working (it’s Amazon’s shitty kindle author program, and since I only crank out a book every three years or so [and only one that I’ve felt like selling there], it just isn’t worth fucking with for that tiny amount of extra space).

    Linux, right now, is the best it’s ever been. It’s also on par with windows. Enough so that I can’t see myself ever going back. At some point, win7 won’t work on new hardware, and I’ll have to jank a musicbee install on linux, and tackle the character sheet generator that I use for my absurdly over-crunchy homebrew TTRPG, which I’ve yet to find a replacement for that isn’t a compromise.

    Anyway, I suspect that in a year or two, you’ll be in a similar space. You’ll have figured out the bullshit, abandoned windows habits, and actually be satisfied with your distro of choice.

    Truth? If I had spent as much time on linux back in the nineties, I would likely have had equal difficulty adapting to windows if things had been in reverse.


  • Ignoring private schools, it really depends on locale. Most schools are run by a combination of local and state guidelines. So each state has its own minimum standards, which are then implemented on a district level.

    However, in some districts, the budget isn’t equal between all schools.

    So you can have varying quality within the same school system, and even more between different systems.

    The good thing about school meals is that they aren’t usually super expensive, don’t require packing only foods that won’t spoil or be gross by lunch time, and there’s usually some kind of budget for free or reduced-cost lunches (sometimes breakfast too) for those in need. It makes sense that most students will choke down even the bad options instead of packing their own.

    Some schools do damn well though. The bulk is usually going to be supplied by one of the industrial food distributors, but most of that is similar to, or the same as, what you’d get at chain restaurants in terms of ingredient quality.

    So the staff of the cafeteria can make a huge difference in quality right there. Knowing how to turn fairly meh ingredients into something tasty is a great thing.

    When schools supplement with fresh produce, it can be damn good food. Local farmers out in rural areas often contribute. Some high schools have agriculture programs where they grow stuff that gets used in their own school, and may be distributed to others. Our closest high school supplements their own cafeteria, plus the elementary schools, and part of the jr high schools (some of those have their own gardens, so they tend to handle their own). My kid was very happy with the high school’s food, unlike the food at their jr high in another state that they hated.

    I ate at the high school a couple of times. Waaaaaay better than when I was a student there, and the agriculture program was starting up back then. Mind you, the lady that ran the cafeteria was doing a great job with what she had. The supplies were just crap back then. All canned shit for veggies if it wasn’t grown local, mandated recipes on a schedule set by the county, so you could only do so much to improve things. She ran a damn good kitchen though, so even when the food was bad, we knew the cooks were doing their best.

    And that’s pretty much the problem with school food. It just isn’t a nationwide priority.


  • I’d hold off another few years. There’s just enough published info pointing towards risks being higher before the mid twenties. The brain never stops developing, but the first twenty years in particular are easier to disturb since the development is more significant.

    At 18, you’re still putting the polish on some key centers, so fucking with that for a little hedonism seems unwise until there’s more and better research available.

    Me? I think I’d hold off until at least 21-23. Honestly, I’d do the same with any mind altering stuff. It’s not long to wait, and going at it too early can have long term consequences. There’s plenty of time to experiment and enjoy all sorts of recreational pharmaceuticals.

    Iirc, I had my first (of three lol) drunk at 17, and while it did no harm as a one-off (not much will tbh), I also didn’t really sink into the experience the same as later experiments with mind altering. Mind you, I’ve never liked being drunk, and avoided any of the high addiction potential stuff, but the difference between my second drunk and my first was massive, just by dint of being 21, with a few years of added perspective on what I wanted out of it and how I approached it.

    Weed is another one where my initial exposure in my late teens (18 or 19) was just not as good as years later (late twenties).

    But, I can say that trying it once is worth it. If you end up not liking it, no big deal. I’d just wait a little longer. Weed, be it smoked, edible, or otherwise, is a very powerful experience when used infrequently. Steady use weakens the benefits of it for fun imo, but a few times a year? It’s nice. It enhances joy and good feelings.

    Just be easy with edibles. They hit slow, but hard. You can always have more if you don’t get the degree of euphoria you want after a half hour, but you can’t take it away if you get too stoned, which is absolutely possible.




  • I’m not sure if you’re mildly irritated at the compilations themselves, or that the “franchises” involved went way too long.

    I’m a fan of compilations tbh. It’s a solid way to snag the whole schmear cheap (usually). I can just choose to ignore the ones I don’t like, same as I did when a given series started going to shit.

    Spider-Man though, that’s a different kettle of fish. Comics run for decades, lifetimes in some cases. Movies about the same characters are going to be as likely to have extended production, with as many ups and downs as the comics do (and there are some horrible runs of even the best comic titles).

    I’m with you on the laziness and risk aversion that makes 9 American Pie movies happen. Or most franchises that start in a similar way. The first was a great movie, but it really didn’t need a sequel, much less multiple sequels that not only abandoned the characters and what little storyline there was, but stopped putting effort into good writing.

    Not that a successful one-off can’t spawn a decent franchise, it’s just that studios don’t put in the investment to make it happen.

    Look at the Bond series. While there have been plenty of stinkers, it was approached as a long term thing early on and has also managed to have some great movies even as it aged. No high art or anything, but still some solid escapist action.


  • Well, there’s actually been research into it.

    Since that shit is dry as hell, and there are articles available about it, this one gives a nice overview: https://www.psychologytoday.com/us/blog/fulfillment-any-age/202202/why-it-feels-so-good-confess

    So, I’d say it’s pretty realistic to say that “confession” has mental health benefits.

    That being said, true anonymity is going to be vital if you’re going to try to build something online. Not just for the people that might want to use it, but for you too. You really don’t want the legal issues if someone were to confess on your service and it became part of trial evidence. You may be thinking it’s not a big deal, that it’ll never happen, but it does happen already with social media.

    The less you’ll be able to provide, the less hassle you’ll have. So keep that in mind. Reddit, Facebook, VPNs, they all deal with legal requests regularly, but they have legal departments to handle those to keep a barrier between the people running things and the consequences of users’ actions/words.
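
    To make that data-minimization point concrete, here’s a bare-bones sketch. Python and Flask are just my arbitrary picks (not anything you mentioned), the endpoint and table names are made up, and it ignores a pile of real-world concerns. The whole design point is what never gets written down in the first place:

    ```python
    # Hypothetical sketch, not production code: a "confess" endpoint that
    # deliberately records nothing about who sent the text. Flask and
    # SQLite are placeholder choices; every name here is made up.
    import sqlite3

    from flask import Flask, request

    app = Flask(__name__)
    DB = "confessions.db"  # made-up filename

    def init_db():
        with sqlite3.connect(DB) as con:
            # One column, nothing else. No user id, no IP, no timestamp:
            # data you never store can't be demanded from you later.
            con.execute("CREATE TABLE IF NOT EXISTS confessions (body TEXT)")

    @app.post("/confess")
    def confess():
        data = request.get_json(silent=True) or {}
        body = data.get("body")
        if not isinstance(body, str) or not body.strip() or len(body) > 2000:
            return {"ok": False}, 400
        with sqlite3.connect(DB) as con:
            con.execute("INSERT INTO confessions (body) VALUES (?)", (body.strip(),))
        return {"ok": True}, 201

    if __name__ == "__main__":
        init_db()
        app.run()
    ```

    Even then, the web server and any reverse proxy in front of it keep access logs with IPs by default, so “we can’t provide it” only holds if you turn that off at every layer too.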

    Me? No fucking way I’d even confess to jaywalking online, period. And I have never done that (that’s actually true, I’ve never been in a situation where it was useful. Small towns and infrequent visits to cities ftw?). I’d also advise anyone else to never do so.

    Also, if you’re a priest/minister and your religion has a confessional seal, you have pretty robust legal protection about not having to break it, in many places. Therapists also have a degree of confidentiality that they’re legally required to maintain. Your online service has neither. So you’ll also have responsibilities above and beyond what therapists or ministers have. Well, you may, since local laws vary, and I’ve never heard of a lot of legal precedent around mandatory reporting for online services. But even if you aren’t currently required to report a range of things, not doing so might open you up to lawsuits and/or eager prosecutors looking to set a precedent.

    I guess what it comes down to is: yeah, it could help people. But better you than me


  • Gotcha :)

    Yeah, I think it comes down to what I said. No good reason to try and compete when they could scrape all the data that they would have wanted without having to build their own.

    It isn’t like any of the big social media companies wanted competition anyway. They wanted to dominate their niche. Twitter for short messages publicly transmitted. Instagram for image based posting. Facebook for mixed media sharing, etc. You find a niche, dominate it, then leverage that dominance into cash flow, usually via ads.

    If you go into the niche someone else already dominates, it’s an uphill struggle. You’re better off just waiting and either buying out the other companies, or otherwise gaining access to what they have that’s valuable.

    Hell, that’s meta’s playbook for sure, that’s what they keep doing.

    Google did try to kinda horn in on the Facebook style social media, can’t remember what it was called, but it flopped and they killed it. You’d think their greed for data for ad targeting might have made it attractive to at least try, but the fact that they eventually just paid reddit for access after a bit of a stink shows they had previously been hoovering it for free. Why invest millions or even billions when it’s already available without the investment?

    I think part of it was also that reddit didn’t start as a forum. It was digg mark 2. A link aggregator. It kept expanding its scope and turned into a forum. It was a big deal when comments were added to reddit, a major shift in how it worked. A lot of people hated it.

    That’s my take anyway.