• ArchRecord@lemm.ee · 5 hours ago (+2/−3)

    It can, but I don’t see that happen often in the places I see it used, at least by the average person. That said, I’ve deliberately insulated myself a bit from the very AI-bro type of people who use it regularly throughout their day; I mostly interact with people who use it occasionally, for research on an assignment, rewriting part of an email, etc. So I recognize that my opinion here might just be shaped by the kinds of uses I personally see.

    In my experience, when it’s used to summarize, say, 4-6 sentences of general-audience text (i.e. not a research paper in a journal) that doesn’t rely on a high level of context from the rest of the document, it seems to do pretty well. (A news article paragraph out of context would be bad, since it relies on information the model doesn’t have; instructions on how to use a tool, which are general knowledge, are fine.) It does especially well within the confines of an existing conversation about the topic, where the intent and context have already been established.

    For example, a couple months back, I was having a hard time understanding subnetting, so I decided to give an LLM a shot. By giving it a bit of context on what was tripping me up, I got it to reword and re-explain the topic in a way I could better understand, and I could then continue researching it on my own.

    Broad topic that’s definitely in the training data + doesn’t rely on lots of extra context for the specific example = reasonably good output.
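    As an aside, that kind of subnet math is easy to sanity-check afterwards with Python’s standard ipaddress module (a sketch with made-up numbers, not the actual examples from my conversation):

```python
# Sanity-checking subnet math with the stdlib.
# The 192.168.1.0 addresses here are hypothetical examples.
import ipaddress

net = ipaddress.ip_network("192.168.1.0/26")
print(net.netmask)        # 255.255.255.192
print(net.num_addresses)  # 64 addresses (62 usable hosts)

# Splitting a /24 into four /26 subnets:
for sub in ipaddress.ip_network("192.168.1.0/24").subnets(new_prefix=26):
    print(sub)            # 192.168.1.0/26, .64/26, .128/26, .192/26
```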

    But again, I don’t frequently interact with the kind of people who like having AI in everything; I’m mostly around very casual users who don’t use it for anything high-stakes or complex. I’m quite sure that anything beyond extremely simple summaries of basic information or very well-known topics would produce a lot of hallucinations.

      • ArchRecord@lemm.ee · 5 hours ago (+2/−1)

        > See, when I have 4-6 sentences to summarize, I don’t see the value-add of a machine doing the summarizing for me.

        Oh, I completely understand; I don’t often see it as useful either. I’m just saying that a lot of the people I see using LLMs occasionally are shortening their own replies, converting a text-based list of steps into a numbered list for readability, or rewording a concept because the original writer didn’t phrase it in a way their brain could process well.

        Things that don’t require a huge amount of effort on their part, but still save them a little time, which, from my conversations with them, seems valuable to them, even if only in a small way.

        • jjjalljs@ttrpg.network · 2 hours ago (+3)

          I feel like letting your skills in reading and communicating in writing atrophy is a poor choice. And skills do atrophy without use. I used to be able to read a book and write an essay critically analyzing it. If I tried to do that now, it would be a rough start.

          I don’t think people are going to just up and forget how to write, but I do think they’ll get even worse at it if they don’t do it.

          • ArchRecord@lemm.ee · 1 hour ago (+1)

            I definitely agree.

            However, I think there’s a point at which use of a tool is too infrequent to meaningfully impact your retention of a skill. When these people occasionally fire off an email, feel the tone is a bit off, and have it partially rewritten, that could even help them get better at adjusting their tone on their own in the future. So personally, I think it’s a bit of a mixed bag.

            But of course, when I look at all the people forgoing things like learning programming languages to just have ChatGPT vibe-code everything for them, then talking about how they’re gonna get a job in tech… yeah, that’s 100% past the point of skills atrophying, in my opinion.