• 4 Posts
  • 1.86K Comments
Joined 3 years ago
Cake day: June 13th, 2023



  • My previous phone was replaced when the main camera (the only one most apps could use) stopped working, and, more importantly, it frequently rebooted at random whenever Wi-Fi was on. Also, my carrier didn’t acknowledge that it supported LTE, so it was about to be booted off the network.

    Also, the OS was old enough that some apps I used were starting to drop support, and I kept hearing about others I couldn’t install at all.

    And Google Services Framework very often got into a rut where it would use 100% CPU and crash every few seconds, eventually getting bad enough that rebooting rarely helped. I wrote a long bash script to try to monitor it and reset it when this happened (by killing all of its processes at once and deleting a file that occasionally seemed to help), but it just got worse over time. I didn’t want to do a factory reset, for some reason I’ve now forgotten.
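    My original script is long gone, but the core of that watchdog idea can be sketched roughly like this. Everything here is illustrative, not the original: the package name, the CPU threshold, the poll interval, and especially the cache path are assumptions, and it presumes shell (or root) access on the device.

```shell
#!/bin/sh
# Hedged sketch of a watchdog for a runaway process (here Google
# Services Framework). The package name, threshold, and cache path
# are illustrative stand-ins, not the original script's values.

PKG="com.google.android.gsf"
THRESHOLD=90   # total %CPU above which we intervene

cpu_of() {
    # Sum %CPU across every process whose command line mentions $1.
    ps -A -o %cpu,args 2>/dev/null | awk -v pkg="$1" '
        index($0, pkg) { total += $1 }
        END { printf "%d\n", total }'
}

over_threshold() {
    # Succeed (exit 0) when $1 >= $2; awk does the numeric compare.
    awk -v c="$1" -v t="$2" 'BEGIN { exit !(c >= t) }'
}

watchdog() {
    while true; do
        if over_threshold "$(cpu_of "$PKG")" "$THRESHOLD"; then
            # Kill all of the package's processes at once, then clear
            # a cache file that occasionally seemed to help.
            am force-stop "$PKG"                 # needs shell/root
            rm -f "/data/data/$PKG/cache/"*      # hypothetical path
        fi
        sleep 5
    done
}

# Only start the loop when invoked with --run, so the helper
# functions can be sourced or tested without blocking.
[ "${1:-}" = "--run" ] && watchdog
```

    Killing everything for the package at once matters because the framework respawns its processes; stopping them one at a time just races the respawner.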




  • What I don’t understand is why there is so much resistance to the idea of having regular swap plus a separate hibernation file that is only enabled on demand. I finally got it working on my laptop, but it took a lot of fiddly, obscure manual configuration (which I wish I had documented - I don’t just mean setting the offset in grub), and it still didn’t play well with hybrid sleep, etc.

    This is the only way to have hibernation that works at any memory pressure.
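    I no longer have the exact steps, but the shape of the setup was roughly this. It's a hedged sketch: the path and size are illustrative, and the kernel additionally needs resume=<root device> and resume_offset=<value> on its command line (e.g. via GRUB_CMDLINE_LINUX_DEFAULT), which is the grub part mentioned above.

```shell
#!/bin/sh
# Hedged sketch of on-demand hibernation to a dedicated swap file.
# The /hibfile path and 16G size are illustrative.

# One-time setup (as root), on ext4 (btrfs/CoW filesystems need
# extra care):
#   fallocate -l 16G /hibfile
#   chmod 600 /hibfile && mkswap /hibfile

parse_resume_offset() {
    # Read `filefrag -v /hibfile` output on stdin and print the
    # physical start of the file's first extent -- the value the
    # kernel needs as resume_offset= on its command line.
    awk '$1 == "0:" { gsub(/\.\./, "", $4); print $4; exit }'
}

hibernate_now() {
    # Enable the file only for the hibernation itself, so it never
    # competes with the regular swap during normal use.
    swapon /hibfile || return 1
    systemctl hibernate
    swapoff /hibfile
}
```

    With something like this, regular swap stays small and fast while the hibernation file is sized for worst-case memory pressure, and it only has to exist on disk, not be active, between hibernations.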




  • davidgro@lemmy.world to Microblog Memes@lemmy.world · Yep

    The majority do actually want kids. I think almost everyone agrees with that and thinks it’s fine. The minority that don’t (for any reason, yes, including economic ones) are pretty frequently harassed about it, though, and that’s the part that is finally getting some recognition as not cool. (The harassment has always happened, of course.)





  • Even though your post was removed, I still feel like some points are worth a response.

    You said LLMs can’t lie or manipulate because they don’t have intent.

    Perhaps we don’t have good terminology to describe the thing that LLMs do all the time - even “hallucinating” attributes more mental process than these things have.

    But in the absence of more precision, “lying” is close enough: they generate text that contains false statements.
    Note also that I didn’t even use the term in my other comment: your whole reply was strawmen, which is probably why it was removed.

    On your other point: yes, crazy prompts do lead to crazy outputs - but that’s mostly because these things are designed to always cater to the user. An actual intelligence (and probably most people) would try to lead the user back to reality, suggest getting help, or just disengage.

    However, it’s also the case that non-crazy inputs all too commonly lead to crazy outputs with LLMs.