Body shaming is now banned on dating app Bumble
Bumble has vowed to provide a safer experience for online daters by banning body shaming on the app.
The women-first dating and networking app has announced it is updating its terms and conditions to explicitly ban unsolicited and derogatory comments made about someone’s appearance, body shape, size or health.
“This includes language that can be deemed fat-phobic, ableist, racist, colourist, homophobic or transphobic,” the company said in a statement.
The app will use an algorithm to flag derogatory comments about someone’s appearance, with human moderators then reviewing the offending account.
Bumble profiles that include body shaming language will be moderated, as well as any body shaming comments that are made through the app’s chat function, with repeated incidents or particularly harmful language resulting in a ban from the platform.
Additionally, the app's moderators now have the power to share resources intended to help people understand how damaging shaming behaviour can be, to help educate them about the importance of changing their actions in future.
The dating app creators say they want to make “a kinder and more respectful dating space,” and we are totally on board with it.
Online daters face enough self-analysis in choosing their own images to use, without having to fear harsh criticism from strangers online.
If recent stats are anything to go by, there’s a real need for a kinder online dating space.
According to Bumble’s own research, almost a quarter of British users (23%) have been body shamed online, either on a dating app or social media.
The vast majority of those surveyed (87%) agreed that dating is a space where you feel more physically judged than in other areas of life, which highlights why it’s so vital that measures to prevent body shaming are put in place.
The new measure isn’t Bumble’s first attempt to protect its users.
In 2019, the app developed a feature to prevent its users receiving inappropriate images.
The technology identifies inappropriate images and automatically blurs them out, while alerting the recipient to the nature of the content. Said to identify lewd images with 98% accuracy, the feature is designed to protect the safety of Bumble’s users.
Other global companies have also taken steps to tackle body shaming, with Facebook and Google expanding their use of artificial intelligence to block body shamers and remove unpleasant content.
Instagram also recently introduced an artificial intelligence tool that detects bullying language and asks users: “Are you sure you want to post this?”
We’re totally here for anything that helps tackle hate. Keep up the good work, guys.