David Durst (PhD Candidate, Stanford University)
Carly Taylor (Senior Manager, Security Strategy, Activision Publishing, Inc.)
Location: Room 3005, West Hall
Date: Monday, March 20
Time: 4:40 pm - 5:10 pm
All Access Pass, Summits Pass
Vault Recording: Video
Audience Level: All
Anti-cheat and cheat developers are locked in a cat-and-mouse cycle of detection and circumvention. Anti-cheat developers struggle to find new behavioral patterns that reliably identify cheating players imitating legitimate players, while cheat developers easily change those patterns.
This talk will discuss a complementary anti-cheat technique, hallucinations, that baits cheating players into identifying themselves.
Hallucinations are fake, human-imitating enemies that legitimate players can't observe but cheating players can, because their cheat programs treat the apparitions as real players. Reconfigurable hallucinations are particularly interesting because they may flip the cycle: anti-cheat developers solve the easier problem of reconfiguring hallucination imitation behavior, while cheat developers solve the harder problem of detecting hallucination behavior patterns.
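The asymmetry above can be sketched in a few lines. This is a minimal illustration, not the talk's implementation: all names (`Player`, `is_visible_to`, the ESP loop) are assumptions, and the visibility check is a stand-in for real FPS occlusion computation, assuming the server places hallucinations where no legitimate player can see them.

```python
# Hedged sketch: why a cheat "sees" a hallucination but a legitimate player
# doesn't. All names here are illustrative, not from the talk.
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    x: float
    y: float
    is_hallucination: bool = False  # server-side flag, assumed never rendered

def is_visible_to(viewer: Player, target: Player) -> bool:
    # Stand-in for real visibility computation (occlusion, view frustum).
    # We assume the server positions hallucinations so that this check
    # fails for every legitimate viewer by construction.
    return not target.is_hallucination

def legitimate_client_view(viewer: Player, snapshot: list) -> list:
    # A normal client only shows entities that pass the visibility check.
    return [p for p in snapshot if p is not viewer and is_visible_to(viewer, p)]

def cheat_esp_view(viewer: Player, snapshot: list) -> list:
    # A wallhack/ESP reads the raw entity list and skips visibility culling,
    # so it reports every entity -- including the bait.
    return [p for p in snapshot if p is not viewer]

snapshot = [
    Player("alice", 0, 0),
    Player("bob", 10, 5),
    Player("ghost", 99, 99, is_hallucination=True),
]
alice = snapshot[0]
print([p.name for p in legitimate_client_view(alice, snapshot)])  # ['bob']
print([p.name for p in cheat_esp_view(alice, snapshot)])          # ['bob', 'ghost']
```

A cheat that then aims at or highlights `ghost` reveals itself, since no legitimate input could have been reacting to an entity the player was never shown.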
The talk will provide an economic argument for why reconfigurable bait techniques may be a long-term strategy, a description of a hallucination implementation, and evidence that popular cheat programs treat one hallucination implementation as a real player.
Attendees will learn how a hallucination implementation injects fake players into the snapshot, how that implementation can be reconfigured with different policies for imitating real player behavior, and how the implementation approach may allow hallucinations to evolve with cheat programs.
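A rough sketch of that injection-plus-policy idea, under stated assumptions: the snapshot is modeled as a plain list of entity records, and the two example policies (`patrol_policy`, `jitter_policy`) are hypothetical placeholders for the imitation behaviors the talk describes, not its actual policies.

```python
# Hedged sketch: injecting a fake player into each outgoing snapshot with a
# swappable imitation policy. Names and structures are assumptions for
# illustration, not the talk's implementation.
import math
import random
from typing import Callable, Tuple

# A policy maps the current server tick to an (x, y) position for the bait.
Policy = Callable[[int], Tuple[float, float]]

def patrol_policy(tick: int) -> Tuple[float, float]:
    # Walk back and forth along a corridor, like a player holding a lane.
    return (50.0 + 10.0 * math.sin(tick / 20.0), 30.0)

def jitter_policy(tick: int) -> Tuple[float, float]:
    # Small random strafes around one spot, imitating an idling human.
    return (50.0 + random.uniform(-1, 1), 30.0 + random.uniform(-1, 1))

def inject_hallucination(snapshot: list, tick: int, policy: Policy) -> list:
    # Append a fake player entry; to a client parsing the snapshot it is
    # indistinguishable from a real entry, so cheats will track it.
    x, y = policy(tick)
    return snapshot + [{"id": "fake-1", "x": x, "y": y}]

snap = [{"id": "real-7", "x": 12.0, "y": 4.0}]
out = inject_hallucination(snap, tick=0, policy=patrol_policy)
print(out)  # original entity plus the injected fake-1 at (50.0, 30.0)
```

The key design point is that reconfiguring the bait is just swapping the policy function, which is why the approach may let hallucinations evolve faster than cheat programs can fingerprint them.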
This is for developers and data scientists interested in anti-cheat techniques. Familiarity with multiplayer client-server architectures and FPS visibility computation techniques may be helpful but is not required.