In today’s Finshots, we tell you what an investigative report by Reuters reveals about the social media giant’s performance in India and more
On February 2, Facebook (or its parent Meta Platforms) tried to explain the slowdown in new users to its investors. It said (and we’re paraphrasing here), “Hey, everyone! We have a unique situation in India, our largest market. We believe mobile data costs are too high and that’s throttling our explosive growth.”
Now if you’re looking at this and going — What on earth is Facebook talking about? We are with you.
India has some of the lowest mobile internet costs in the world, with 1GB of data costing a mere $0.68 against a global average of $4.21. But data prices have still jumped 7.5 times since 2020. And new additions of mobile internet users in the country have been slowing down: while new users grew by 12% in 2020, growth fell to a mere 5% in 2021.
So maybe Facebook’s explanation did have merit.
But it seems there’s also a deeper malaise here.
According to an exclusive investigation report by Reuters, Facebook published an internal memo on February 2. You know, the same day they mentioned the data cost problem. And it revealed something else: women in India don't like Facebook that much. There's too much nudity. Too many unwanted advances from men. And they just don't feel safe in general.
Facebook’s own research reveals that 79% of female Facebook users were concerned about their photos being misused. And nearly 30% had seen nudity during the week of the research.
It’s definitely not a great sign for Facebook.
This kind of explains why 75% of its users in the country are men, compared to a global average of 56%. Yes, there is gender disparity everywhere in India, and you could argue that it's men who predominantly access the internet. But Facebook is still concerned. They said so themselves: "While there is a gender imbalance in internet use across India, the imbalance among Facebook users is even more pronounced."
And this is a problem because Facebook makes most of its money through ads. Now imagine a business that wants to target women in this country. It's unlikely to run ads on Facebook if it believes women aren't represented well enough on the platform, no?
And Facebook’s aware that it needs to address this problem urgently. They have been trying to fix things.
By 2020, Facebook had already gathered some insight into women's concerns in India. So it launched a feature called "Profile Lock" that was meant to give women more control. Once a profile was locked, strangers couldn't zoom into or download the profile picture, and the rest of the profile was out of bounds too.
And going by Facebook’s internal reports, 34% of its women users in India had latched on to this feature by June 2021.
Then last year, it also launched the Women’s Safety Hub in 12 Indian languages that had videos and other content related to online safety. Facebook realised that sticking to English and Hindi alone wouldn’t help it expand its 450 million user base in a country with extreme linguistic diversity.
So yeah, it’s definitely a work in progress for Facebook.
But here’s the thing. There’s another problem that everyone seems to have forgotten about. Do you remember the “Facebook Papers”? A leak involving some very sensitive internal documents.
Well, in it, there was something even more disturbing. Back in February 2019, as part of the company's internal research, a test account was set up in Kerala to document what a new user on the platform would experience. For three weeks, this account simply followed whatever Facebook's algorithms recommended: the groups, the videos, the pages.
The end result? A shocking concoction of hate speech, misinformation, and violence.
And here’s what the Facebook researcher wrote after this experiment, and we quote: “Following this test user’s News Feed, I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total.”
It sounds quite scary, doesn’t it?
Well, it is upsetting. And one probable reason Facebook has had a hard time fighting this issue is poor resource allocation. You see, 87% of its budget for fighting misinformation was set aside for the US, even though the US accounted for only about 10% of the platform's daily users. The rest was spread too thin across the world. This lopsided allocation helped create a toxic environment in many places across the globe.
Bottom line — Unless Facebook creates a safe space for most of its users, these issues will keep cropping up. The mobile data costs will only be an afterthought.
Until next time...
Also, don't forget to share this article on WhatsApp, LinkedIn and Twitter