Sexicon — the new language of sex
Seggs. Corn. Mascara. Accountant. Leg booty. @N4l. These sexual euphemisms are hugely popular on TikTok and Instagram. This shift in our sexual lexicon (sexicon, if you will) is far from fun, however. These new words, also known as "algospeak," are designed to evade censorship from social media platforms, which have been clamping down on sexual content.
Users who don’t use this work-around vocabulary risk having their content removed by social media platforms, being shadowbanned (a type of censorship that doesn’t remove a post, but hides it from public view), or being permanently banned.
Dr. Carolina Are, innovation fellow at the Centre for Digital Citizens at Northumbria University, has interviewed many sex educators who have told her that, in order to have their content seen on platforms, they have to use words like “seggs” or find “silly synonyms for body parts,” which they feel dilutes their message. “There are even doctors who are having to use terms like ‘hoo ha’ or kid words for body parts without being able to refer to the medical term,” Are adds.
So, why is big tech policing people’s sexual expression? Are says this censorship has been ramping up since 2018, when the U.S. government passed FOSTA/SESTA, which created an exception to Section 230, AKA the internet’s “safe harbor” rule, under which publishers and websites aren’t held legally responsible for what people post on their platforms. FOSTA/SESTA changed that, making publishers liable if third parties post anything that facilitates sex trafficking or promotes sex work.
“The law is bad as it is because it lumps consensual sex work in with a crime — sex trafficking,” says Are. “But on top of that, because platforms work with machine learning, what they've been doing to avoid getting done for promoting sex trafficking or sex work, they’ve been over-complying with everything that’s mildly related to sex.”
Removing or hiding any content that’s even vaguely sexual further stigmatises sexuality within society, pushing educational content and sex-positive messages to the margins.