But to the user, it can correct their inputs. The rest is an abstraction. My point is that there’s more to a platform than just precise calculations. Obviously the asker isn’t thinking this far ahead, but Babbage is also rather flippant in his response.
Babbage was being flippant because, when questioned about his mechanical calculator, he didn’t imagine how computers might function two hundred years later?
I mean, that’s hyperbole. I think there’s more depth to this question, from our point of view, than what’s on the surface.
No, not really. Calculators still don’t have autocorrect, because the concept is nonsense. With language, there are valid and invalid combinations of letters, and more and less probable combinations of words. Coffee is a word, covfefe is not. But a calculator cannot know that when you entered 2+2 you meant to enter 2+3, as both are valid inputs, and neither is more probable.
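As an aside, here’s a minimal sketch of that asymmetry, using a made-up vocabulary and invented word frequencies purely for illustration: a typo can be pulled toward a nearby, more probable word, but “2+2” has no more probable neighbour to be pulled toward.

```python
# Toy illustration: text can be "corrected" because some strings are far more
# probable than others; arithmetic input cannot, because every well-formed
# expression is equally valid. Vocabulary and frequencies below are invented.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

# Hypothetical word frequencies standing in for a language model.
VOCAB = {"coffee": 0.9, "covet": 0.05, "toffee": 0.05}

def autocorrect(token: str) -> str:
    """Prefer a nearby, frequent vocabulary word; otherwise keep the input."""
    candidates = [(edit_distance(token, w), -freq, w) for w, freq in VOCAB.items()]
    distance, _, best = min(candidates)
    # Only correct if the input is close to something we know to be probable.
    return best if distance <= 2 else token

print(autocorrect("covfee"))  # -> "coffee": close to a high-frequency word
print(autocorrect("2+2"))     # -> "2+2": nothing nearby is more probable,
                              #    so there is no basis to "fix" it to 2+3
```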
Isn’t this just dependent on the level of abstraction? At the low level a CPU is just a calculator.
Presumably the user has a way to enter these digits. If they’re using a touchscreen, there are plenty of algorithms working to make sure the intended touch target is triggered, even if the touch lands somewhere in between.
A lot of effort goes into making sure the user gets the intended result even if their input is fuzzy.
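For instance, a toy sketch of that kind of touch-target disambiguation, with an invented key layout: the raw touch point is snapped to the most plausible key, here simply the nearest key centre (real keyboards also weigh hit-area size, language models, and touch history).

```python
# Toy touch-target resolution: a tap rarely lands exactly on a key, so the
# platform resolves it to the most plausible target. Layout and coordinates
# below are invented for illustration.
from dataclasses import dataclass
import math

@dataclass
class Key:
    label: str
    x: float  # centre of the key's hit area
    y: float

KEYS = [Key("2", x=10.0, y=10.0), Key("3", x=30.0, y=10.0), Key("+", x=20.0, y=30.0)]

def resolve_touch(tx: float, ty: float) -> str:
    """Return the label of the key whose centre is closest to the touch point."""
    nearest = min(KEYS, key=lambda k: math.hypot(k.x - tx, k.y - ty))
    return nearest.label

# A touch that lands between "2" and "3" still resolves to one of them.
print(resolve_touch(18, 12))  # -> "2" (slightly closer to the 2 key)
```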
Articulate the utility of a calculator that provides the response of “5” to “2+2.”
Well, it’s propping up the US economy right now…
Are you just being purposefully dense?
Are you?
No. But there’s no point in talking to a troll.