Americans are increasingly using coded language known as “algospeak” to evade content moderation technology, especially when posting about controversial topics or content that violates platform rules. One example is the use of “camping” to discuss abortion-related issues. Social media users rely on techniques such as code words, emojis, and deliberate typos to slip past AI moderation systems, commonly to circumvent rules on hate speech, harassment, bullying, violence, and exploitation. The ever-evolving nature of algospeak presents ongoing challenges for tech companies and their content moderation contractors; a brief sketch after the signal table below illustrates why simple keyword filters struggle with it.
Signal | Change | 10-year horizon | Driving force
---|---|---|---
Increasing use of “algospeak” to evade detection by content moderation technology | From direct language to coded language | More sophisticated algospeak and increased reliance on AI moderation | Desire to bypass content moderation and express controversial opinions |
Codewords, emojis, and typos used as algospeak to avoid detection | From clear language to coded language | More creative and widespread use of algospeak | Desire to discuss sensitive or banned topics without consequences |
Rise in algospeak during polarizing events | From direct language to coded language | Increased use of algospeak during political, cultural, and global events | Desire to express opinions without censorship or punishment |
Algospeak commonly used to sidestep hate speech rules | From direct hate speech to coded language | Algospeak used to avoid hate speech detection and rules | Desire to express hateful opinions without consequences |
Challenges for tech companies in detecting and moderating algospeak | From easily detectable violative material to more subtle coded language | Increased difficulty for AI in detecting and moderating euphemisms and coded phrases | Need for improved AI algorithms and moderation techniques |
Concerns about child and human exploitation related algospeak | From overt language to coded language | Rapidly evolving algospeak related to child exploitation and human trafficking | Need to protect vulnerable individuals and combat exploitation |
Third-party contractors moderate algospeak according to platform guidelines | From independent judgment calls to platform-driven moderation | Contractors continue to defer to platform-specific guidance as algospeak evolves | Contractors act on behalf of platform owners and follow their instructions
Algospeak terms evolve and die out over time | From new and trendy terms to obsolete terms | Algospeak terms spike in popularity and then become obsolete | Natural lifecycle of algospeak terms
Algospeak used during the Ukraine-Russia war and on gaming platforms | From direct language to coded language | Algospeak used to evade AI detection when discussing sensitive topics | Desire to express political opinions without censorship
Expectation of increased algospeak during midterm elections | From minimal algospeak to increased algospeak | Anticipated rise in algospeak related to political discussions during elections | Desire to influence political discourse without consequences |
Use of misspelled words and symbols to avoid AI moderation | From clear language to misspelled and symbol-replaced words | Continued use of misspelled words and symbols in algospeak | Desire to bypass AI detection in expressing sensitive topics |
Emojis commonly used as algospeak | From literal representation to coded representation | Increased use of emojis as coded language in algospeak | Desire to express hidden meanings or bypass AI detection |
Importance of human involvement in content moderation | From AI-only moderation to human-assisted moderation | Continued importance of human involvement in content moderation | Recognition of AI limitations in detecting certain content |
Concerns about accountability and scrutiny in content moderation | From unregulated moderation to calls for scrutiny and regulations | Increased calls for accountability and regulation in content moderation | Need for ethical and transparent content moderation practices |
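To make the detection challenge concrete, here is a minimal sketch of why exact keyword matching misses algospeak, and how a simple normalization pass catches some of it. The blocklist, code-word map, and character swaps are illustrative assumptions, not any platform's actual moderation logic; only the “camping”-for-abortion code word comes from the examples above.

```python
# Minimal, illustrative sketch: exact keyword matching vs. a simple
# normalization pass. The blocklist, code-word map, and character swaps
# are hypothetical, for illustration only.

import re

BLOCKED_TERMS = {"abortion"}                 # hypothetical flagged term
CODE_WORDS = {"camping": "abortion"}         # code word cited in the article
CHAR_SWAPS = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "@": "a", "$": "s"})


def naive_filter(post: str) -> bool:
    """Flag only verbatim occurrences of blocked terms."""
    words = re.findall(r"\w+", post.lower())
    return any(w in BLOCKED_TERMS for w in words)


def normalized_filter(post: str) -> bool:
    """Undo simple symbol swaps and known code words before matching."""
    text = post.lower().translate(CHAR_SWAPS)
    for coded, plain in CODE_WORDS.items():
        text = text.replace(coded, plain)
    return any(w in BLOCKED_TERMS for w in re.findall(r"[a-z]+", text))


posts = [
    "any ab0rti0n resources nearby?",        # deliberate misspelling with symbol swaps
    "thinking about going camping soon",     # code word, but also an ordinary sentence
    "planning a hiking trip this weekend",   # benign post
]
for p in posts:
    print(f"naive={naive_filter(p)!s:5} normalized={normalized_filter(p)!s:5} {p!r}")
```

The sketch also shows the trade-off behind several signals above: the same normalization that catches the misspelled and code-word posts flags genuine camping posts too, which is one reason human involvement in moderation remains important as algospeak terms keep shifting.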