*(Via Business Insider) – I don’t know how many times and ways I can tell you people to get off Facebook. The company doesn’t care about you, it pollutes society with misinformation, and no matter what happens management never stops prioritizing profit over safety.
Over the weekend a low-level hacker published personal data — including phone numbers, locations, and email addresses — belonging to 533 million Facebook users from 106 countries. A Facebook spokesperson said the data had been scraped because of a vulnerability the company patched in 2019.
While the data may be old and the loophole is “fixed,” it’s unclear what, if anything, Facebook is doing to help affected users or prevent more leaks. Right now the best way to tell if your data was part of the leak is to use a third-party site. This kind of “brush it under the rug” mentality is the latest sign that Facebook has become so big it no longer has to put the safety of its users first.
Millions of its customers leaving is not enough to trigger some kind of “come to Jesus” moment because there are hundreds of millions more users where that came from. That means, if you want to protect yourself, leaving the platform is the most rational way to deal with it. Otherwise you’re just exposing yourself to Facebook’s recklessness.
Facebook has been garbage this year, and it’s only April
The leak isn’t the only reason to dump Facebook. Let’s look at the danger Facebook has put society in during 2021 alone.
First up, it was instrumental in helping the rioters plan the events that led to the US Capitol insurrection on January 6. Facebook’s COO Sheryl Sandberg tried to downplay the insurrectionists’ use of the platform prior to the riot, but evidence that Facebook was a critical tool for organizers was everywhere.
And Facebook’s role in fostering the kind of right-wing extremism that led to the Capitol riot should come as no surprise. Last year, the WSJ reported that Facebook has known its algorithms encourage and amplify antisocial behavior like hate speech and extreme political bias in order to keep users engaged.
Recently, MIT’s Technology Review profiled Joaquin Quiñonero Candela, a former director of the AI group at Facebook, who was tasked with training the algorithm to be more pro-social but says he was stopped at every turn.
This opinion piece by Linette Lopez continues at Business Insider.