
Moderation in games is a necessary but tricky balancing act. Deciding when a player’s behavior warrants suspension is subjective, even within tightly targeted demographics. The line between acceptable and unacceptable behavior often blurs, leaving developers in a difficult position.

Preface: The Challenges of Moderation
Hi, I’m Dylan Hunt, the original creator of Throne of Lies: Medieval Politics before its acquisition. To set the stage, imagine the moderation nightmare for a 16-player, chat-based social deduction PvP game based on lies and deceit. Whew…
We spent years navigating the treacherous waters of what to moderate, drawing from constant community feedback. However, moderation came at a price: negative reviews. Even a justified ban — for instance, against an overtly racist player — could lead to backlash. With over 200,000 reports annually, we found ourselves in a no-win situation:
- No moderation: We’d be seen as condoning bad behavior and would inadvertently allow ruined experiences.
- Strict moderation: Negative reviews, resource drain, and debates over fairness.
When the game was acquired, the new owners unbanned everyone and dropped moderation entirely. Whatever the long-term reasoning, the immediate result was backlash. It left me pondering: was there still something I could do?

The Epic Solution
There has to be a compromise, right? Let’s explore some common approaches:
- Separate queues for toxic players: Not viable for indie games with small player bases.
- Fake reports to feign action: Feels dishonest and unprofessional.
- Muting players: In a chat-based game like ours, this undermines the entire experience.
We needed something different — something subtle yet effective. That’s when we discovered the solution: silent moderation.
What is Silent Moderation?
Here’s how it worked: We allowed messages with the most extreme words or patterns to go through — but only to the sender. This one-way communication made the sender think their message was uncensored, but no one else could see it. The result? No reaction, no trolling satisfaction.
If a player asked, “Can anyone hear me?” the responses would be “Yes,” because such questions typically didn’t include extreme language. This created the illusion of normalcy while quietly neutralizing toxic behavior.
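The routing rule above can be sketched in a few lines. This is a minimal illustration, not the game's actual implementation; the pattern list, function names, and data shapes here are all hypothetical stand-ins:

```python
import re

# Hypothetical blocklist; the real game drew on years of community
# reports to curate its set of extreme words and patterns.
BLOCKED_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in (r"\bextremeword\b", r"\bslurpattern\b")
]


def deliver_chat(sender: str, message: str, all_players: list[str]) -> list[str]:
    """Return the list of players who should receive the message.

    Normal messages go to everyone. A message matching an extreme
    pattern is echoed back only to the sender, who sees it rendered
    normally and cannot tell it was suppressed for everyone else.
    """
    if any(p.search(message) for p in BLOCKED_PATTERNS):
        return [sender]   # sender's client shows the message as usual
    return all_players    # ordinary messages reach the whole lobby
```

For example, `deliver_chat("bob", "gg everyone", players)` returns the full player list, while a message containing a blocked pattern returns only `["bob"]`. The key design choice is that the check happens server-side at delivery time, so the sender's client needs no changes and gets no signal that anything was filtered.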
Why It Worked
This approach required no active moderation. It was passive but far more effective than any previous system we tried. It stripped trolls of their power without alienating other players or risking review backlash.
Lessons Learned
In hindsight, I wish we’d implemented this years earlier. Silent moderation is a game-changing solution for chat-based games, and I’m surprised even AAA studios haven’t adopted it yet. For developers struggling with moderation, take note: there is a way forward.
About Me
I launched Throne of Lies: Medieval Politics in 2017 and sold the IP in 2020, hoping the new owners could expand its potential. These days, I take on contract work building online multiplayer services and integrations, drawing on 9 years of experience.
Do you code multiplayer games/services? Join the Game Backend as a Service (GBaaS) Discord and say hi! 👋
— Dylan Hunt, Imperium42 Games