I guess now we finally know why Babbage never finished building the Analytical Engine.

  • hayvan@feddit.nl · 21 hours ago

    Not exactly. Autocorrect is a closest-match or prediction device that starts from a set of known-correct words. When you type “fridsy”, it answers the question: “of all the words in this set, which is the shortest distance from fridsy?”
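
    A minimal sketch of that closest-match idea, assuming a toy word list and Levenshtein edit distance (every name here is illustrative, not any particular autocorrect implementation):

    ```python
    def edit_distance(a: str, b: str) -> int:
        """Classic Levenshtein distance via dynamic programming."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            curr = [i]
            for j, cb in enumerate(b, 1):
                curr.append(min(prev[j] + 1,                  # deletion
                                curr[j - 1] + 1,              # insertion
                                prev[j - 1] + (ca != cb)))    # substitution
            prev = curr
        return prev[-1]

    # The “correct input given beforehand”: a tiny stand-in dictionary.
    WORDS = ["friday", "fridge", "frisky", "bridge"]

    def autocorrect(typo: str) -> str:
        return min(WORDS, key=lambda w: edit_distance(typo, w))

    print(autocorrect("fridsy"))  # -> friday (distance 1)
    ```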

    • CookieOfFortune@lemmy.world · 19 hours ago

      But to the user, it corrects their input; the rest is an abstraction. My point is that there’s more to a platform than just precise calculations. Obviously the asker wasn’t thinking this far ahead, but Babbage was also rather flippant in his response.

      • Horsecook@sh.itjust.works · 19 hours ago

        Babbage was being flippant because, when questioned about his mechanical calculator, he didn’t imagine how computers might function two hundred years later?

        • CookieOfFortune@lemmy.world · 19 hours ago

          I mean, that’s hyperbole. I think there’s more depth to this question, from our modern point of view, than what’s on the surface.

          • Horsecook@sh.itjust.works · 18 hours ago

            No, not really. Calculators still don’t have autocorrect, because the concept is nonsense. Language has valid and invalid combinations of letters, and more and less probable combinations of words: coffee is a word, covfefe is not. But a calculator cannot know that when you entered 2+2 you meant to enter 2+3, because both are valid inputs and neither is more probable.
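
            A toy way to see the asymmetry (the dictionary and helper names are made up for illustration): a spellchecker can reject covfefe against a list of known words, but an arithmetic checker has no grounds to reject either sum, because both parse as legal input:

            ```python
            DICTIONARY = {"coffee", "is", "a", "word"}

            def is_valid_word(token: str) -> bool:
                # Words can be checked against known-correct inputs.
                return token in DICTIONARY

            def is_valid_sum(expr: str) -> bool:
                # Any well-formed sum is a legal input; the calculator
                # has no basis to prefer one over another.
                left, plus, right = expr.partition("+")
                return plus == "+" and left.strip().isdigit() and right.strip().isdigit()

            print(is_valid_word("coffee"), is_valid_word("covfefe"))  # True False
            print(is_valid_sum("2+2"), is_valid_sum("2+3"))           # True True
            ```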

            • CookieOfFortune@lemmy.world · 18 hours ago

              Isn’t this just dependent on the level of abstraction? At a low level, a CPU is just a calculator.

              Presumably the user has some way to enter these digits. If they’re using a touchscreen, there are plenty of algorithms working to make sure the intended touch target is triggered, even if the touch lands somewhere in between.

              There’s a lot of effort that goes into making sure the user gets the intended result even when their input is fuzzy, along the lines of the sketch below.
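
              A rough sketch of the kind of fuzzy hit-testing being described: pick the on-screen target whose center is nearest the touch point, discounted by how likely each target is. The targets, coordinates, and prior weights below are all invented, not any real touchscreen stack:

              ```python
              import math

              # Hypothetical on-screen keys: (label, center_x, center_y, prior likelihood)
              TARGETS = [
                  ("7",  40, 100, 0.2),
                  ("8", 120, 100, 0.5),
                  ("9", 200, 100, 0.3),
              ]

              def resolve_touch(x: float, y: float) -> str:
                  """Score each key by distance to the touch, discounted by its prior."""
                  def score(t):
                      label, cx, cy, prior = t
                      return math.hypot(x - cx, y - cy) / prior
                  return min(TARGETS, key=score)[0]

              # A touch landing between “7” and “8” resolves to the likelier “8”.
              print(resolve_touch(80, 102))  # -> 8
              ```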