Facebook's program to share data with academic researchers has run into trouble, BuzzFeed reported this week, citing several sources with knowledge of the program and its participants. According to Facebook's former chief security officer, journalists who covered the company's Cambridge Analytica scandal are at least partially to blame.
Alex Stamos, who oversaw security at Facebook when news of the scandal first broke last year, criticized BuzzFeed and "other outlets" over what he called "unbalanced privacy reporting," saying that media coverage of Facebook's many privacy violations has inhibited the company's ability to share data for legitimate research.
[See updates below for comments from Stamos, who also says this article is an “extremely unfair characterization” of his tweets and “Gizmodo at its finest.”]
"[After what is] viewed as the biggest privacy scandal in the history of the internet, how do you think companies will handle future academic research?" he asked, citing Aleksandr Kogan, the computer scientist whose app, thisisyourdigitallife, collected the Facebook user data later acquired by Cambridge Analytica. (BuzzFeed's story, notably, does not mention Cambridge Analytica once.)
Stamos' characterization of the scandal as "abuse by an academic" elides the fact that Facebook intentionally designed a system in which a single user could consent to the release of data belonging to thousands of other users. Thanks to Facebook, while only a few hundred thousand people downloaded Kogan's app, Cambridge Analytica was able to obtain data on 86 million others, none of whom agreed to participate in Kogan's research.
This practice continued during Stamos' tenure at Facebook, as described in a New York Times story last year, which Stamos attacked on Friday, saying it "completely misrepresent[ed] normal business practices" with third-party partners. The story, which relied on internal documents generated in 2017, described how Facebook allowed companies such as Microsoft and Amazon to access the names and contact information of users' friends without their consent.
The system behind Facebook's $50 billion business makes it a liability for any user to "friend" another. There is simply no way to be sure which friends will agree to surrender one's personal information. And the company has an incentive not to make its data-sharing practices too clear, which is why its privacy settings do not reflect how they really work. (As part of its recent $5 billion settlement with the FTC, Facebook agreed to start asking users for permission before sharing their data beyond what is specified in their privacy settings.)
According to Stamos, this is just "normal" industry practice, and any attempt to shed light on the finer details of Facebook's operations only serves to condemn the behavior. Facebook is now too scared to hand over data that would give researchers a better look at how the platform is affecting democratic elections around the world – something CEO Mark Zuckerberg dismissed as "crazy" after the 2016 election – and according to Stamos, the media is to blame.
Reporting by BuzzFeed and others has shed light on how our personal information is used, citing, among other examples, Facebook's attempt to extract user data through the promotion of its "privacy" product, Onavo.
Onavo, spyware masquerading as a virtual private network (VPN), was marketed to Facebook users as a means to "protect" their accounts. In reality, it gave the company access to a wealth of private information – data that most VPNs proudly announce they don't collect – including daily Wi-Fi and mobile data usage. The app also collected data about other apps running on users' phones, i.e., Facebook competitors like Snapchat. Nevertheless, Facebook presented the app as a way for users to keep their personal information secure.
Onavo remains one of Facebook's most grotesque attempts to mine its users for marketing purposes, and it was reportedly pulled from the Apple App Store on those grounds almost exactly one year ago.
While Stamos has subsequently criticized Facebook's use of Onavo, he was, notably, the company's top security officer when Facebook decided to start advertising Onavo to users inside the Facebook app, and the app was still running last year on the day he quit the company. Then again, maybe he thinks BuzzFeed is somehow to blame for that, too.
Update, 6:00 p.m.: Stamos sent Gizmodo the following:
I'm not sure it would be right to say that I'm "blaming" the media. I'm glad [BuzzFeed's Craig Silverman] wrote the piece, because I really want SS1 to work and for differential-privacy-based solutions to academic research to become the standard. The problem is that most of the media coverage reflects the larger societal issue: we don't know how open we want these companies to be, or how to define "public."
There was a recent mini-scandal about comments being scraped by a Mexican company from public [Facebook] pages. Twitter offers an API that does the same and nobody cares. As the WSJ pointed out, the FBI is trying to hire a company to do something Facebook has to try to prevent.
I think the coverage of data privacy issues at BuzzFeed and elsewhere hardly ever talks about the trade-offs.
"Yes, I think Onavo shouldn't exist and that [Facebook] shouldn't have bought them," he added, "but 'I don't like Facebook for these other things' doesn't address the basic trade-offs with SS1 and other attempts by academics to study what is happening on the world's largest social network."