The AI boom is screwing over Gen Z | ChatGPT is commandeering the mundane tasks that young employees have relied on to advance their careers.

ChatGPT is commandeering the tasks that young employees rely on to advance their careers. That’s going to crush Gen Z’s career path.

  • Dnn@lemmy.world

    Bullshit. Learn how to train new hires to do useful work instead of mundane bloat.

    • Lmaydev@programming.dev

      100%. If an AI can do the job just as well (or better), then there’s no reason we should be making a person do it.

      • phario@lemmy.ca

        Part of the problem with AI is that it takes significant skill to understand where it goes wrong.

        As a basic example, ask a language model like ChatGPT to edit a piece of writing. It can go badly wrong: removing the wrong words, shifting the tone, and making mistakes that an inexperienced writer won’t even recognise. I’ve had foreign students use AI to write letters or responses, and often the tone is completely off. That’s one thing, but the student doesn’t realise they’ve written a strange letter. The same goes for grammar checking.

        This sets up a dangerous scenario in which you already need a deep understanding of the language and the subject just to diagnose the output. That’s in contrast to non-AI language checkers, whose behaviour is much simpler to understand.
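
        For concreteness, a minimal sketch (plain Python, standard library only; the strings are made up and “edited” stands in for whatever the model returns) of one partial safeguard: diff the model’s edit against the original so every removed or altered word is at least visible. Someone still has to judge whether each change is acceptable.

        ```python
        # Minimal sketch: make a model's edits visible with a word-level diff.
        # The strings below are made up; "edited" stands in for a model's output.
        import difflib

        original = "I am writing to respectfully request an extension on my assignment."
        edited = "I want more time for my assignment."  # hypothetical model output

        diff = difflib.unified_diff(
            original.split(), edited.split(),
            fromfile="original", tofile="edited", lineterm="",
        )
        print("\n".join(diff))
        # The diff shows *what* changed, not whether the shift in tone is wrong --
        # that judgement still needs the deep understanding described above.
        ```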

        Moreover, as you can imagine, the danger is that the people making decisions about hiring and restructuring may not understand this issue.

        • exbot@lemmy.world

          The good news is that this means many of the jobs AI is “taking” will probably come back when people realize it isn’t actually as good as the hype implied.

          • phario@lemmy.ca

            It’s just that I fear that realisation may not filter down.

            You honestly see it a lot in industry. Companies pay $$$ for things that don’t really produce results. Or what they consider to be “results” changes. There are plenty of examples of lowering standards and lowering quality in virtually every industry. The idea that people will recognise the trap of AI and reverse course is not something I’m optimistic about.

            In many ways AI is like pseudoscience: it’s a black box. Things like machine learning don’t tell you “why” they work, and ChatGPT is just an enormous statistical model of language, not something that can explain its own reasoning.

            So the claim that “good science” prevails is patently false. We live in an era of widespread scientific education, and yet everywhere you look there is distrust of science, the scientific method, critical thinking, etc.

            Do people really think that the average Joe is going to “wake up” to the limitations of AI? I fear not.

          • burningquestion@lemmy.world

            No, companies are already accepting, and will keep accepting, reduced-quality output in exchange for a 90+% reduction in the cost of getting that task done (and 90% is a conservative estimate).

        • DogMuffins@discuss.tchncs.de

          This.

          In accounting, 10 years ago, a huge part of the job was categorising bank transactions according to their description.

          Now AI can kinda do it, but even providers that would have many billions of transactions to use as training data have a very high error rate.

          It’s very difficult for a junior to look at the output and identify which categorisations are likely to be wrong.
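
          As a toy illustration (the descriptions, labels, and threshold below are invented, and this is nothing like any real provider’s system), one common mitigation is to have the categoriser report its own confidence and route anything below a threshold to a human, rather than asking a junior to spot the bad ones after the fact:

          ```python
          # Toy sketch: a text categoriser that reports its own confidence, so
          # low-confidence guesses are routed to a human instead of trusted.
          # Descriptions, labels and the threshold are invented for illustration.
          from sklearn.feature_extraction.text import TfidfVectorizer
          from sklearn.linear_model import LogisticRegression
          from sklearn.pipeline import make_pipeline

          descriptions = [
              "STRIPE PAYOUT 8841", "AWS EMEA INVOICE",
              "UBER TRIP HELP.UBER.COM", "SHELL SERVICE STATION",
              "GITHUB INC SUBSCRIPTION", "DELTA AIR 0062",
          ]
          categories = ["revenue", "hosting", "travel", "fuel", "software", "travel"]

          model = make_pipeline(
              TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
              LogisticRegression(max_iter=1000),
          )
          model.fit(descriptions, categories)

          THRESHOLD = 0.5  # arbitrary; picking this is itself a judgement call
          for desc in ["UBER EATS 9917", "SQ *CORNER CAFE"]:
              probs = model.predict_proba([desc])[0]
              best = probs.argmax()
              label = model.classes_[best]
              if probs[best] < THRESHOLD:
                  print(f"{desc}: unsure ({label}, p={probs[best]:.2f}) -> human review")
              else:
                  print(f"{desc}: {label} (p={probs[best]:.2f})")
          ```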

        • Flying Squid@lemmy.world

          And AI is not always the best solution. One of my tasks at my job is to respond to website reviews. There is a button I can push that will generate an AI response. I’ve tested it. It works… but it’s not personal. My responses directly address the things reviewers say, especially if they have issues. The AI’s responses are things like, “Thanks for your five-star review! We really appreciate it,” blah blah blah. A full paragraph of boilerplate bullshit that never feels like the review was actually addressed.

          You would think responding to reviews properly would be a very basic function an AI could do as well as a human, but at present, no way.
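
          To make the gap concrete, here is a rough sketch (invented review text and prompt wording, not any particular vendor’s button) of the difference between a canned template and a prompt built from the reviewer’s actual words. Whether a model turns the second one into a genuinely personal reply is exactly the open question:

          ```python
          # Rough sketch of the difference being described. The review text and
          # prompt wording are invented; nothing here is a specific vendor's feature.
          def generic_reply(stars: int) -> str:
              # The canned template: never looks at what the reviewer actually wrote.
              return f"Thanks for your {stars}-star review! We really appreciate it."

          def build_prompt(review_text: str, stars: int) -> str:
              # A prompt built from the reviewer's own words, so a model at least
              # has the specific issues in front of it when drafting a reply.
              return (
                  "Write a short, personal reply to this customer review. "
                  "Address each specific issue the reviewer raises and say "
                  "what will be done about it.\n\n"
                  f"Rating: {stars}/5\nReview: {review_text}"
              )

          review = "Checkout took 20 minutes and the support line never picked up."
          print(generic_reply(2))
          print(build_prompt(review, 2))  # this string would be sent to a model
          ```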

          • NaN@lemmy.blahaj.zone

            This assumes that your company doesn’t decide the AI responses are good enough in exchange for the cost savings of removing a person from the role, and that they don’t improve in a subsequent update.

            • Flying Squid@lemmy.world

              True, although my company emphasizes human contact with customers. We really go out of our way with tech support and such. That said, I hate responding to reviews. I kind of wish it was good enough to just press the ‘respond to review with AI’ button.

    • TwilightVulpine@lemmy.world

      They don’t want to train new hires to begin with. A lot of the work that new hires have relied on to get a foothold in a job is bloat and chores that nobody else wants to do, because new hires aren’t yet trusted with more responsibility than that.

      Arguably, whole industries exist around work that isn’t strictly necessary. Does anyone feel that telemarketing is truly necessary for society? But it provides employment to a lot of people. A great deal will need to change before we can dismiss these roles entirely, but people need to eat every day.

      • CoderKat@lemm.ee

        The “not willing to train” thing is one of the biggest problems IMO. But also not a new one. It’s rampant in my field of software dev.

        Most people coming out of university aren’t very qualified. Most have no real understanding of how to program real-world software, because they’ve only ever done university coursework: the environment is nice and easy (and possibly already set up), the projects are tiny, you can actually read all the code in the project (impossible in real projects, where there’s far too much code), and the problems are kept minimal, with no red herrings or unclear legacy code.

        Needless to say, most new grads just aren’t that good at programming in a real project. Everyone in the field knows this. As a result, many companies don’t hire new grads at all: their advertised “entry level” position is actually more of a mid-level position, because they don’t want to deal with the painful training period (which eats a lot of their senior devs’ time!). But that ends up making the field painful to enter. Reddit constantly has threads from people lamenting that the field must be dying, and every time it’s a new grad or junior. IMO it’s because they face this extra barrier. By comparison, senior devs get daily emails from recruiters asking if they want a job.

        It’s very unsustainable.

      • Aceticon@lemmy.world

        Indeed: at least in knowledge-based industries, everybody starts out with a level of responsibility where the natural mistakes of someone who is still learning have limited impact.

        • afraid_of_zombies@lemmy.world

          One of my interns read the wrong voltage and it took me ten minutes to find his mistake. Ten minutes with me and multiple other senior engineers standing around.

          I congratulated him, and damn it, I meant it. This was the best possible mistake for him to make: everyone saw him do it, he knows he held everything up, and he just has to own it and move on.

    • morrowind@lemmy.ml

      I fully agree; however, doing some mundane work for a few weeks while you learn is useful. You can’t just jump straight into the deep work.

    • legion02@lemmy.world

      The real problem is going to be how many jobs are left that still have 40 hours of work to do.