The latest tech hearing was a study in contrasts. Contrasts between lawmakers who made an effort to stay on topic in a hearing ostensibly about social media and the 2020 election and those who… just talked about whatever was on their minds.
Also contrasts between then and now. Social media companies previously treated any attempt at Section 230 reform as radioactive; now, they’ve come around to cooperating so they’re not cut out of the conversation altogether.
But most of all it was a study in contrasts for the two men on the virtual witness stand: Facebook’s equivocating chief executive, who always manages to speak at length in the service of saying very little, and Twitter’s laconic business mystic, who came off as measurably more poised to meet the moment, wizard beard and all.
In a signal that the hearing would stray from its stated purpose into the grab bag of gripes on display Tuesday, the Senate Judiciary Committee’s own chairman, Sen. Lindsey Graham, threw the plan out early and asked the two CEOs whether they had seen any evidence that their platforms were addictive.
Zuckerberg responded with characteristic defensiveness, arguing that the research in this area was not “conclusive.”
“We certainly do not want our products to be addictive,” Zuckerberg said, contradicting behavioral scientists, Facebook defectors and common sense observations of its products. “We want people to use them because they are meaningful,” he added, casting aspersions on “the memes and misinformation out there” about what makes Facebook’s business tick. The response fit neatly into a narrative a few lawmakers pushed that big tech operates out of big tobacco’s playbook.
Given the same question, Dorsey was less disingenuous. “I do think like anything else, these tools can be addictive and we should be aware of that and acknowledge it,” Dorsey said. His statement perhaps stops short of acknowledging the degree to which social media has reshaped the course of modern human behavior, but ultimately it bodes better for Twitter’s health as a platform and for its users’ addled brains.
The two CEOs also sharply contrasted on questions about their algorithms.
When Sen. Amy Klobuchar asked whether social platforms should provide more transparency around the algorithms they use to decide what users see, Dorsey proposed more transparency through user control. “I think a better option is providing more choice to be able to turn off the algorithms or choose a different algorithm so that people can see how it affects one’s experience,” Dorsey said.
Dorsey also suggested that Twitter could expand those options through something like a third-party “marketplace” where users could select ranking algorithms that suited their needs.
Zuckerberg, for his part, didn’t go near this idea with a 10-foot pole, instead lauding the existence of Facebook’s third-party fact-checking program (never mind the too-restrained way Facebook presents those fact checks) and the company’s community standards reports, which present aggregated numbers on the rule-breaking content it removes. Facebook’s algorithm is a black box that users are locked inside and that’s that. (Naturally, the box prints ad dollars.)
In contrast, Twitter has committed to a kind of openness that’s not perfect, but it’s at least refreshing. The company treats its platform policy decisions as a kind of living document, tweeting updates about the most high-profile decisions in near real-time, admitting mistakes and emphasizing that it’s learning and changing things as it goes.
One example of Twitter’s experimental approach: The company universally disabled one-click retweets before the U.S. election, hoping to make user behavior less reactive while slowing down viral election misinformation. The changes were part of Twitter’s recent experiments with introducing more friction to the platform. Twitter also hid tweets and restricted sharing for some particularly egregious bits of misinformation — some of it coming from President Trump. Facebook stuck to “labels,” the current bare minimum content moderation gesture.
Dorsey’s company is still plagued by rampant harassment, brain-melting conspiracies and, for now, a lame duck president actively seeking to destabilize American democracy, but it at least seems open to changes that could shift the dynamics of the platform in the interest of making it better.