Voice chat platforms have had a reputation for toxicity since at least the days of Xbox Live, and many companies are trying to fix that. Modulate Inc., a company using voice technology to “proactively detect[] key behaviors” to keep communities safer and “manage risk,” has been awarded U.S. Pat. No. 11,996,117, which relates to speech moderation using a multi-stage machine learning process.
Claim 1 reads:
1. A toxicity moderation system, the system comprising
an input configured to receive speech from a speaker;
a multi-stage toxicity machine learning system including a first stage and a second stage, wherein the first stage is trained to analyze the received speech to determine whether a toxicity level of the speech meets a toxicity threshold,
the first stage configured to filter-through, to the second stage, speech that meets the toxicity threshold, and further configured to filter-out speech that does not meet the toxicity threshold.
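The claim describes a classic cascade: a cheap first stage gates which speech ever reaches a more expensive second stage. Here is a minimal sketch of that pattern. Everything in it — the word-list scorer, the `SpeechClip` type, the label names — is an illustrative assumption, not Modulate's actual implementation; a real first stage would operate on audio features with a trained model.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SpeechClip:
    """Hypothetical unit of received speech (text stands in for audio)."""
    speaker_id: str
    text: str

def first_stage_score(clip: SpeechClip) -> float:
    """Cheap stage-one scorer: fraction of flagged words (illustrative only)."""
    flagged = {"idiot", "trash", "loser"}
    words = clip.text.lower().split()
    return sum(w in flagged for w in words) / max(len(words), 1)

def second_stage_verdict(clip: SpeechClip) -> str:
    """Placeholder for a heavier stage-two model that classifies toxicity type."""
    return "harassment" if "loser" in clip.text.lower() else "insult"

def moderate(clips: List[SpeechClip], threshold: float = 0.2) -> List[Tuple[str, str]]:
    """Filter-through clips meeting the toxicity threshold; filter-out the rest."""
    verdicts = []
    for clip in clips:
        if first_stage_score(clip) >= threshold:
            # filter-through: only these clips incur second-stage cost
            verdicts.append((clip.speaker_id, second_stage_verdict(clip)))
        # else filter-out: the second stage never runs on this clip
    return verdicts
```

The design point the claim captures is economic: most speech is benign, so the expensive analysis only runs on the small fraction the first stage lets through.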
Modulate’s disclosure focuses heavily on the use of multiple stages to filter speech – claims from a later, currently pending patent application (U.S. Pat. App. Pub. No. 2024/0296858) suggest that one useful aspect of the multiple stages is that processing from a “subsequent speech” can train the “speech toxicity processing of an earlier stage.” Modulate’s systems appear to be in use in games such as Grand Theft Auto Online, Among Us VR, Breachers, and at least one Call of Duty game, and the company also appears to be working on a Discord plugin.
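The later application's feedback idea — later-stage results training an earlier stage — can be sketched as a simple update step. This is an assumption drawn only from the quoted claim language; the vocabulary-based first stage and the `retrain_first_stage` helper below are hypothetical names, not anything disclosed in the filings.

```python
from typing import Iterable, Set, Tuple

def retrain_first_stage(flagged_words: Set[str],
                        outcomes: Iterable[Tuple[str, bool]]) -> Set[str]:
    """Update a hypothetical stage-one word list from stage-two review.

    outcomes: (word, confirmed_toxic) pairs produced when the second
    stage processed subsequent speech. Confirmed words are added to the
    first stage's filter; refuted words are removed, so the earlier
    stage improves from later-stage processing.
    """
    updated = set(flagged_words)
    for word, confirmed_toxic in outcomes:
        if confirmed_toxic:
            updated.add(word)
        else:
            updated.discard(word)
    return updated
```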