Profile pic is from Jason Box, depicting a projection of Arctic warming to the year 2100 based on current trends.

  • 1 Post
  • 3.62K Comments
Joined 2 years ago
Cake day: March 3rd, 2024

  • Rhaedas@fedia.io to Lemmy Shitpost@lemmy.world · Sad
    48 minutes ago

    Federation, and how things get pushed out, is complicated. World is pretty big (it was one of the initial landing spots during the first growth spurt from Reddit), but I don’t know the current state of their policy on material from other instances. I think it’s okay, since I see a lot of world users in my feed, but I don’t know what THEY see.

    The caveat of making other accounts is that they are separate accounts, so your posts and history live only there (there was some work on ways to transfer or share account data, but I don’t know where that stands atm). But if another instance is open to new signups, there’s nothing wrong with trying it out, seeing if things look different, and sticking with it if it feels more open.

    Go visit https://lemmyverse.net/ and see what’s out there. There are three big “types” now: Lemmy, Mbin, and Piefed. I can’t tell you which is better, as everyone has their own take on things.


  • Rhaedas@fedia.io to Lemmy Shitpost@lemmy.world · Sad
    2 hours ago

    Since there’s many different sources (federated) instead of a single one like Reddit, maybe how you’re connected and being fed things is limited, giving you an impression it’s less. It was much worse in the beginnings obviously, until moderators and instances grew and learned how to better pump things out. It still has its issues, depending on where you are.

    It’s also not equal to compare a still new network of Lemmy and others to a long established Reddit structure of niche subs for everything. At some point there were none of the subs we now see. Maybe some of them aren’t needed at this time, as setting up a new community is not hard to do (moderating it and growing it is work though).


  • Rhaedas@fedia.io to Memes@sopuli.xyz · They're just puppos
    5 hours ago

    It’s a classic meme to show a contrast between things, but we never see the other picture of the dog ancestor creeping up to the fire submissively with its tail down, begging the nice humans for a bit of meat from their food. Would I trust it as far as domesticated dogs, hell no. But they had their “cute” side at some point, otherwise we would have never formed a bond.



  • For most any other building this wouldn’t be a thing, but the White House is recognized worldwide as a symbol of the US. It’s on our currency and on many plaques; its silhouette is about as representative as the flag itself. So tearing part of it down is a direct reflection of how he’s ruining the country, and for the same reason: his ego.


  • Starting off with “you people” always sets the bar for a good discussion. Maybe part of my sarcasm was referring to those situations as well as domestic problems, or to how either or both sides could have done a lot better for decades. I wonder if you think all this is recent? I’ve been seeing this shit for decades and trying to change it… I guess it’s my fault that it hasn’t changed. This sure sounds like projection.


  • If it had any chance to succeed, it would have been in meeting the needs of the internet. It did not succeed. None of the Wikipedia entries discuss why it didn’t catch on. I think the simplest answer is that it didn’t offer anything new or better, just a different nomenclature, and it would still have had to be converted for most local purposes that used standard timekeeping. So outside of a fad and a marketing gimmick, it had no value.
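
    The comment doesn’t name the system, but assuming it’s Swatch Internet Time (.beats), here’s a minimal sketch of the conversion it would constantly require: the day is divided into 1000 beats of 86.4 seconds each, anchored to UTC+1 (“Biel Mean Time”), so any standard local time still has to be translated in and out.

    ```python
    from datetime import datetime, timezone, timedelta

    def to_beats(dt: datetime) -> float:
        """Convert an aware datetime to Swatch .beats: 1000 beats/day, anchored to UTC+1."""
        bmt = dt.astimezone(timezone(timedelta(hours=1)))  # Biel Mean Time = UTC+1
        seconds_today = bmt.hour * 3600 + bmt.minute * 60 + bmt.second
        return seconds_today / 86.4  # 86,400 seconds per day / 1000 beats

    # Everyone on the planet sees the same @beat value at the same moment,
    # but you still need this conversion to relate it to your local clock.
    print(f"@{to_beats(datetime.now(timezone.utc)):.2f}")
    ```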


  • I agree that the results will be different, and certainly a very narrowly trained LLM for conversation could have some potential if it has proper guardrails. Either way, there’s a lot of prep beforehand to make sure the boundaries are very clear. Which would work better is debatable and depends on the application. I’ve played around with plenty of fine-tuned models, and they will get off track contextually with enough data. LLMs and procedural generation have a lot in common, but with the latter it’s far easier to manage predictable outputs because of how the probability is used to create them.
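
    To make that last point concrete, here’s a toy sketch (illustrative only; the vocabulary and weights are made up): a seeded procedural generator is reproducible by construction and can never emit anything outside its vocabulary, while LLM-style temperature sampling deliberately gives low-probability tokens a chance, which is where the contextual drift comes from.

    ```python
    import random

    def procedural_line(seed: int) -> str:
        """Procedural generation: a seeded PRNG over a fixed vocabulary.
        The same seed always yields the same line, and nothing outside
        the vocabulary can ever appear."""
        rng = random.Random(seed)
        vocab = ["the", "cave", "glows", "dimly", "ahead"]
        return " ".join(rng.choice(vocab) for _ in range(5))

    def sampled_line(weights: dict, temperature: float) -> str:
        """LLM-style sampling: tokens drawn from a probability distribution.
        Raising the temperature flattens the distribution, so unlikely
        tokens ('off-track' here) appear more often."""
        tokens = list(weights)
        adjusted = [w ** (1.0 / temperature) for w in weights.values()]
        return " ".join(random.choices(tokens, weights=adjusted, k=5))

    print(procedural_line(42))  # identical every run: fully predictable
    print(sampled_line({"on-topic": 0.9, "off-track": 0.1}, temperature=1.5))
    ```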