Bumble, the women-first social networking app, has launched a new initiative, 'Stand for Safety', as part of its mission to create a safer, kinder and more respectful internet.

In response to ever-growing safety concerns, Bumble is releasing a safety guide, in partnership with Safecity, Red Dot Foundation's flagship public safety platform, to empower women in India to identify, prevent and combat rising digital abuse. 

Through this initiative, Bumble continues to demonstrate its commitment to a zero-tolerance policy for hate, aggression or bullying of any kind.

A recent nationwide survey by Bumble found that 83% of the women surveyed in India have experienced online harassment of some kind, and 1 in 3 experience it weekly.

A further 70% of women believe that cyberbullying has increased since lockdown was announced in 2020. Over half (59%) of the women surveyed said they feel unsafe, and just under half (48%) feel angry.

Bumble is committed to fostering a safe and inclusive space for its community to connect for healthy and equitable relationships. 

As a geography-specific feature for the Bumble community in India, a woman can choose to use only the first initial of her name when creating her Bumble Date profile, and can share her full name with connections when she feels ready and comfortable.

Bumble has a robust block-and-report feature within the app, making it easy for its community to block and report anyone who makes them uncomfortable or whose behaviour goes against its Community Guidelines. Bumble also has a photo verification feature to help prevent catfishing within the app.

Private Detector, another Bumble feature that leverages AI, automatically detects and blurs unsolicited nude images and alerts the recipient, leaving it to the user to choose whether to view, delete or report the image.

Bumble recently updated its terms and conditions to explicitly ban unsolicited and derogatory comments about someone's appearance, body shape, size or health, becoming one of the first social networking apps to ban body shaming.

The app uses automated safeguards to detect comments and images that go against its guidelines and terms and conditions, which can then be escalated to a human moderator to review.