Paper notes - Reversing Anti-Cheat's Detection-Generation Cycle With Configurable Hallucinations
Mon 03 July 2023 — download

Anti-cheat is an impossibly hard problem, meaning that the heuristics used to approximate solutions are often baroque, and sometimes clever. This is the case for the technique mentioned on Call of Duty's blog at the end of June 2023. The research is unfortunately published as a webpage, not a proper™ LaTeX paper, and was presented at GDC 2023.

The main idea is that since the server knows everything about the state of the game, it can send hallucinations to players by cloning existing player info, and place those hallucinations so that non-cheating players can't see them: outside of the field of vision, beyond maximum draw distance, directly above/below the player, behind walls, … To prevent hallucinations from appearing out of nowhere, they can emerge from actual players, so that the cheat has to take a bet on which one is the real player and which one is the decoy.
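The placement logic above can be sketched server-side: check every legitimate player's view cone and draw distance before spawning a decoy. This is a minimal 2D sketch under assumptions of mine (no occlusion/wall checks, and the `Player`, `is_visible` and `safe_spawn_point` names are hypothetical, not from the paper):

```python
import math
from dataclasses import dataclass

@dataclass
class Player:
    x: float
    y: float
    facing: float        # view direction, radians
    fov: float           # field of view, radians
    draw_distance: float

def is_visible(observer: Player, px: float, py: float) -> bool:
    """True if (px, py) is inside the observer's view cone and draw
    distance. Walls/occlusion are ignored in this sketch."""
    dx, dy = px - observer.x, py - observer.y
    if math.hypot(dx, dy) > observer.draw_distance:
        return False
    angle = math.atan2(dy, dx)
    # Smallest signed angular difference to the facing direction.
    delta = abs((angle - observer.facing + math.pi) % (2 * math.pi) - math.pi)
    return delta <= observer.fov / 2

def safe_spawn_point(legit_players, candidates):
    """Return the first candidate position no legitimate player can
    see, or None if every candidate is visible to someone."""
    for (px, py) in candidates:
        if not any(is_visible(p, px, py) for p in legit_players):
            return (px, py)
    return None
```

A decoy spawned at `safe_spawn_point(...)` is invisible to honest clients, so anything that reacts to it is, by construction, reading state it should not see.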

This is a really smart and cheap solution, allowing anti-cheat developers to tweak the hallucinations' parameters and behaviours easily, and in doing so, "reverse the detection-generation cycle", putting the churn burden on cheat developers.

An obvious way to bypass this is to enable cheating features (ESP, wallhack, radar, …) only against players that have been displayed on screen at least once, and to make them disappear again should they "fork" out of sight.
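That bypass amounts to a small piece of cheat-side state. A minimal sketch, assuming hypothetical hooks (`on_rendered`, `on_fork`) that the source does not specify:

```python
class SeenFilter:
    """Only expose entities through ESP once the game has legitimately
    drawn them on screen; distrust clones that fork out of sight."""

    def __init__(self):
        self.seen = set()

    def on_rendered(self, entity_id):
        # The game client legitimately drew this entity, so it is
        # (probably) a real player: safe to track.
        self.seen.add(entity_id)

    def on_fork(self, parent_id, clone_id):
        # A new entity split off an existing one out of sight; until
        # it is actually rendered, treat it as a potential decoy.
        self.seen.discard(clone_id)

    def esp_targets(self, entity_ids):
        return [e for e in entity_ids if e in self.seen]
```

The filter never reacts to an entity the honest client hasn't drawn, which is exactly the property the hallucinations are designed to probe.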

There are also other minor issues:

  • False positives: legitimate players can hit hallucinations by shooting through walls or throwing explosives around; detection thresholds must therefore be lenient, meaning that a sparingly used cheat wouldn't be detected.
  • Data-set poisoning: if an attacker runs a significant number of bots doing nothing but firing in the air, odds are that the model generating the hallucinations will start picking this behaviour up, allowing them to be fingerprinted.

Overall a fun and attacker-costly mitigation.