Automated detection could work using AI to check how someone is playing, since a lot of LLMs are specialized in analyzing whether something exhibits human behaviour or not, but even then that approach isn’t 100% guaranteed. Ideally we’d use server-side anti-cheat instead of offloading it to the client side, but that costs money and suits are allergic to spending money.
You mean machine learning algorithms (or just “AI”), not Large Language Models. LLMs are just advanced word prediction machines; they’re categorically incapable of detecting cheating in a game.
But, yeah. It would totally make sense to have server-side detection for things like:
- Consistency of performance (no changes in reaction time over time, within or across sessions, for example)
- Behaviour changes depending on whether there is vs. isn’t someone around a corner, particularly in areas with long sight lines where footstep sounds wouldn’t give it away
- Inhumanly quick reaction times sustained consistently (e.g. < 100 ms reaction times; 101 ms is the world record; see the sketch after this list)
etc.
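To make the reaction-time item concrete, here’s a minimal server-side sketch (Python). The event names, `Engagement` structure, and thresholds are all made up for illustration, not taken from any real anti-cheat SDK; it just assumes the server logs when an enemy first became visible to a player and when that player landed their first shot:

```python
# Hypothetical sketch: flag players whose server-recorded reaction times are
# both inhumanly fast and inhumanly consistent. All names/thresholds are
# illustrative assumptions, not a real anti-cheat API.

from dataclasses import dataclass
from statistics import median, pstdev


@dataclass
class Engagement:
    enemy_visible_at: float  # server timestamp (s) when the enemy entered line of sight
    first_hit_at: float      # server timestamp (s) of the player's first accurate shot


HUMAN_FLOOR_MS = 100.0  # roughly the fastest recorded human reaction time
MIN_SAMPLES = 50        # don't judge a player on a handful of lucky flicks


def flag_suspicious(engagements: list[Engagement]) -> bool:
    """Return True if reaction times look faster and more uniform than a human's."""
    reaction_ms = [
        (e.first_hit_at - e.enemy_visible_at) * 1000.0
        for e in engagements
        if e.first_hit_at > e.enemy_visible_at
    ]
    if len(reaction_ms) < MIN_SAMPLES:
        return False

    typical = median(reaction_ms)
    spread = pstdev(reaction_ms)

    # Humans are slow *and* noisy; aimbots tend to be fast and eerily uniform.
    return typical < HUMAN_FLOOR_MS or spread < 10.0
```

The same pattern (aggregate server-side stats per player, then threshold on values no human produces) would extend to the consistency-over-sessions and around-the-corner checks above.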
Sure, people could still have cheats tweak inputs a bit, like “gentle” headshot aiming assistance, but it would catch the egregious cheaters.