How Frances Haugen left Mark Zuckerberg speechless


On Monday, ahead of Facebook’s worst outage in some time, details about Haugen emerged. She was a product manager on the company’s “civic integrity” team, where she systematically copied tens of thousands of internal documents to share with the United States Securities and Exchange Commission, members of Congress and The Wall Street Journal before leaving in May. It may turn out to be the most important act in the history of the Facebook business.

It was not a thoughtless act of impulsiveness. Haugen, 37, armed herself with lawyers and provided a trove of material for a WSJ series on Facebook’s misdeeds. Giving her first TV interview on Sunday, she succinctly explained why Facebook’s algorithms were harmful. In another interview, with The Journal podcast, published Monday, she gave clear prescriptions on what could be done: don’t break up Facebook, but hire more people to audit and guide the content the company shows to more than 1.6 billion people every day.

Haugen undoubtedly has a tsunami of legal and corporate backlash heading her way. But Facebook is going to have a hard time discrediting someone who not only speaks well, but has an MBA from Harvard and is so familiar with how algorithms are built that she has patents to her name.

Haugen’s dump of documents revealed what many suspected but couldn’t prove: that Facebook created more lenient secret rules for elite users, that Instagram made body-image issues worse for one in three teenage girls, and that Facebook knowingly stoked outrage on its main site through an algorithm change in 2018, potentially contributing to the storming of the United States Capitol on January 6.

Regulators haven’t been sure how to handle Facebook until now, but Haugen’s suggestions, coupled with internal details about how Facebook’s systems are configured, could provide a clearer path. She argues that dismantling Facebook would be a mistake because it would deprive different parts of the conglomerate of the resources needed to stem harmful content. Instead, the business needs far more people to audit and guide content on the platform.

While Facebook claims to devote real resources to this kind of policing, her account suggests otherwise. Her civic integrity unit, with 200 people, was critically under-resourced and was eventually disbanded by Facebook’s management, she said.

Haugen’s claim that the algorithms underperform is a well-established argument (including here), but she has a huge cache of documentation to back it up. And these aren’t just Facebook’s problems, she notes, but problems with “engagement-based ranking” in general.

Her greatest wish, she says, is real transparency. Imagine if Facebook published daily data feeds on its most viral content, she says. “You would have YouTubers analyzing this data and explaining it to people.” This point is likely to fuel future regulations such as the European Union’s AI Act, designed to force companies to open up the code behind their AI algorithms to regulators.

While the 2018 Cambridge Analytica disclosures resulted in a fine, regulators ultimately left the social media giant alone and its shares rose steadily. This time will likely be different, especially given the change in the White House and Congress since then. US lawmakers recently introduced five antitrust bills targeting the outsized power of Big Tech. In addition to her wealth of documents, Haugen offers lawmakers and regulators extensive insider knowledge.

She describes herself as an algorithm-ranking specialist who, having worked at four social networks – including Alphabet Inc.’s Google and Pinterest, Inc. – understands the intricacies of how computer code chooses what content people see. Her whistleblowing is all the more powerful both for her background and for the sober approach she has taken. Going first to a newspaper known for impartial corporate reporting insulates her from accusations that she is on an ideological mission.

At Facebook, Haugen says she attended regular meetings where staff shared their struggles to stop viral posts that showed beheadings, or posts that likened certain ethnic groups to insects. She ultimately concluded that the underinvestment in safety was inherent to Facebook and virtually impossible to change.

Nick Clegg, Facebook’s communications and public policy manager, recently warned staff in an internal memo that they “will be getting questions from friends and family about this.”

What sets Haugen apart is how she acted on that tension, says Carissa Veliz, author of Privacy is Power, a book on the surveillance economy that describes whistleblowers as the moral canaries in the dark coal mine of Big Tech. Veliz says that when whistleblowers realize they can’t fix wrongdoing inside a business, the cognitive dissonance they experience is so violent it becomes unbearable.

“Most people try to explain it,” says Veliz. “But a whistleblower will decide he just can’t go on like this. He will decide to make a huge sacrifice and get that information out.”

The next step is sure to be terrifying. Whistleblowers often deal not only with complaints from their employer, but also threatening letters from lawyers. (What shouldn’t be lost in any future success for Haugen are the many whistleblowers who have been silenced by such threats.)

During the pandemic, Haugen left the Bay Area to live with her parents. Her mother, an Episcopal priest, told Haugen that she would have to make her concerns public if she believed lives were at stake.

With Haugen speaking publicly, it’s the silence of Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg that rings loudest. They have left it to Clegg to try to explain Facebook’s side of things. That is all the more alarming for investors because Haugen has gone to the SEC to claim that Facebook misled shareholders about the impact of its algorithms.

Zuckerberg seems to have his head in the sand. In recent weeks, he has oddly shared a series of lighthearted posts on his Facebook page about fencing, surfing, and helping his kids raise money for charity.

He can still try to explain away the revelations. But, as Veliz says, Facebook’s employees will increasingly be confronted with the idea that they are working for a toxic company. Others might be inclined to come forward as whistleblowers. It won’t be pretty.

Parmy Olson is a Bloomberg Opinion columnist covering technology. She has previously reported for the Wall Street Journal and Forbes and is the author of “We Are Anonymous”.
