Zuckerberg, Dorsey, Pichai addressed online extremism and misinformation before House committee

The display demonstrated just how deep the desire in Washington goes to change how social media companies operate — while also underlining the lack of consensus on how exactly to do that. Some lawmakers proposed new legislation, while others called for reforming Section 230 of the Communications Decency Act, a decades-old law that shields tech companies from lawsuits stemming from the content users post on their sites.

“The power of this technology is awesome and terrifying, and each of you has failed to protect your users and the world from the worst consequences of your creations,” said Rep. Mike Doyle (Pa.), the top Democrat on a House Energy and Commerce panel focused on technology.

The hearing was the first time Facebook’s Mark Zuckerberg, Twitter’s Jack Dorsey and Google’s Sundar Pichai appeared before Congress since the Jan. 6 attack on the Capitol, which exploded out of a vortex of false claims spread by lawmakers and right-wing media figures that the 2020 presidential election had been rigged against President Donald Trump.

Doyle asked all of the CEOs if their companies were partly responsible for the Capitol riot, pressing them to answer in a yes or no format. Only Dorsey said yes.

Still, the companies fielded only a handful of questions on the topic, and the companies’ expulsion of Trump in the aftermath of the attack barely came up.

The executives aren’t new to testifying before Congress. Last summer, the CEOs of Facebook, Google, Amazon and Apple were grilled over antitrust concerns. And late last year, Republicans called Zuckerberg, Dorsey and Pichai to testify specifically on Section 230.

But the rapid-fire question-and-answer format appeared at times to prompt the executives to stumble, and even occasionally drew some interesting answers. Pichai, for example, turned out to be the only one of the three who had so far been vaccinated and who had seen the film “The Social Dilemma,” a Netflix documentary that seeks to unveil the addicting and sometimes dangerous aspects of social media.

Republicans, as they have in previous hearings, accused the companies of censoring conservative voices. But they also demanded that the tech companies do more to protect children and teens from cyberbullying and social media addiction. Several Democrats picked up on the same thread, accusing the CEOs of Google and Facebook of making money by advertising to children who technically aren’t allowed on their platforms.

The current law, the Children’s Online Privacy Protection Act, prohibits companies in most circumstances from collecting data on children under 13 or targeting them with personalized advertising.

“Of course, every parent knows that kids under the age of 13 are on Facebook and Instagram,” said Rep. Kathy Castor (D-Fla.). “The problem is that you know it. And you know that the brain and social development is still evolving at a young age. There are reasons in the law that we said that cutoff is at 13.”

Legislators also questioned why Facebook and Google have created platforms for children. BuzzFeed reported last week that Facebook is planning an Instagram for children. The two companies also have Facebook Messenger Kids and YouTube Kids, respectively.

Some critics have said that those types of underage-targeted services are aimed at getting children hooked on social media early.

Pichai and Zuckerberg said children under 13 aren’t allowed on their platforms, so the companies don’t make money off them.

The original hearing topic of misinformation still loomed large, and the politicians managed to delve into levels of detail that had been missing from previous hearings on the subject. Several asked specifically about covid misinformation spreading through Latino communities. Others cited recent data on a rise in anti-Asian online hate and asked why hashtags like “#chinesevirus” weren’t banned.

Both of those topics forced answers from the CEOs that were more direct than the usual “I’ll get back to you,” a cliche of these events. Zuckerberg defended Facebook’s record on Spanish-language misinformation and committed to making it a priority. Dorsey said Twitter didn’t block potentially racist hashtags because they could also be used by those who fight back against racists online.

Rep. A. Donald McEachin (D-Va.) asked why Facebook wasn’t applying the same rules regarding covid-19 misinformation to climate change misinformation. Zuckerberg’s answer: Covid lies have the potential to cause “imminent physical harm,” while climate change misinformation doesn’t.

Two lawmakers asked about Google’s recent efforts to reform online advertising — a topic that rarely gets any attention outside of wonky tech advertising circles. The company is making changes to its Chrome browser — the most popular way to access the Internet worldwide — that would make it harder for advertisers to track individuals. Privacy advocates support the move, but antitrust officials in the United States and United Kingdom have said it could quash competition.

Taken together, the long list of questions suggested that many committee members showed up prepared, knowing what they wanted to get out of the executives.

“The questioning from both sides shows that lawmakers are serious,” said Alexandra Givens, CEO of the Center for Democracy and Technology, a think tank that takes funding from foundations and companies, including Google and Facebook. “How they actually craft a path forward remains to be seen.”

Still, that doesn’t mean that a consensus has emerged on what comes next.

Lawmakers from both parties suggested the time had come to make changes to Section 230, for example. But the two parties want the law changed in opposing ways. Democrats want it to hold companies to a higher standard for the spread of racism and misinformation. Republicans want the companies to cut back on moderation, arguing that current practices threaten free speech.

Politicians have introduced a flurry of bills that would significantly change Section 230, but they have yet to coalesce around a single proposal. Sens. Brian Schatz (D-Hawaii) and John Thune (R-S.D.) have unveiled the PACT Act, which would force companies to be more transparent about content moderation. A group of Democrats in the House and Senate introduced the Safe Tech Act, which aims to hold the tech companies more accountable when posts on their services result in real-world harms.

At Thursday’s hearing, Rep. Anna G. Eshoo (D-Calif.) floated her bill, the Protecting Americans from Dangerous Algorithms Act, which would amend Section 230 to remove tech companies’ protections from lawsuits when their algorithms amplify content that leads to offline violence.

And Rep. Yvette D. Clarke (D-N.Y.) announced plans to introduce a bill that would change Section 230 to prevent discrimination in online advertising.

Zuckerberg brought his own ideas for changing Section 230. The Facebook CEO said new legislation should require the biggest tech platforms to be more transparent about their rules for taking down content and hold them liable when they fail to remove illegal activity they find on their platforms. Pichai and Dorsey said they generally agreed with Zuckerberg’s proposals, though Dorsey said such a law would be difficult to enact.

“It’s going to be very hard to determine what’s a large platform and what’s a small platform,” Dorsey said.

Below are the updates from the House hearing.

And … that’s a wrap

Five and a half grueling hours and that’s a wrap. The Big Tech CEOs came to Congress — in the case of Facebook’s Mark Zuckerberg, for the seventh time.

But this go-round, instead of focusing primarily on privacy, competition or conservative bias, lawmakers grilled them on the harms of misinformation and extremism.

The CEOs were directly confronted about whether they should be held responsible for the Capitol riot on Jan. 6, but most managed to dance around an answer by saying it was nuanced.

The CEOs were asked repeatedly about vaccine misinformation, election-related misinformation, illegal drug sales and the regulation of the Internet.

The questions were largely substantive. Lawmakers appear to have learned from their past mistakes of asking naive and meandering questions that allowed tech CEOs to skillfully dodge. This time, many used the strategy of asking a “yes or no” question to try to pin down some of the world’s richest and most powerful men.

The CEOs also faced few questions about former president Donald Trump.

But the big takeaway was that lawmakers appear ready to crack down on the loosely regulated tech industry. More than a dozen times, different lawmakers said they were prepared to better regulate the companies, even as soon as this year.

Trump is planning to launch his own social network after large tech companies booted him off

A top Trump adviser confirmed the former president is building his own social network after major tech companies suspended his accounts in the fallout from the Jan. 6 Capitol attack.

“I do think that we’re going to see President Trump returning to social media in probably about two or three months here, with his own platform,” Trump senior adviser Jason Miller told Fox News on Sunday. “And this is something that I think will be the hottest ticket in social media, it’s going to completely redefine the game, and everybody is going to be waiting and watching to see what exactly President Trump does.”

Miller predicted the new platform will be “big” once it launches, suggesting Trump would bring tens of millions of people to the new service. He also said Trump has been having “high-powered meetings” at Mar-a-Lago regarding the venture, and that “numerous companies” have approached the former president.

Trump’s decision to build his own service signals he no longer wants to be dependent on dominant social networks.

Trump has largely been muzzled online since Twitter permanently banned him in the aftermath of the Jan. 6 riot. YouTube and Facebook have also suspended Trump’s accounts, but they’ve left open the possibility he could return to their services. Facebook’s independent oversight board has accepted Trump’s case, and it will make a binding decision in the coming weeks on whether he can return to the platform. YouTube CEO Susan Wojcicki has said Trump will remain suspended until the company can determine the risk of violence has decreased.

Miller made his announcement just days before Thursday’s hearing. It will be the social media executives’ first appearance on the Hill since the Capitol attacks, and the Democratic-led committee has said the hearing will focus on the proliferation of disinformation on their platforms. However, it also will be Republicans’ first opportunity to publicly grill the CEOs about their decisions to suspend Trump’s accounts, which reignited claims tech companies were too powerful and biased against conservatives.

Schrier calls out vaccine misinformation on Facebook, Twitter

Rep. Kim Schrier (D-Wash.), a pediatrician, said she has witnessed firsthand the swirling misinformation about coronavirus vaccines on social media.

When she posted about vaccine legislation she had introduced, comments popped up on her Facebook page threatening her and spreading false claims about harms associated with vaccines.

Most of the concerning comments seemed to come from two Facebook Groups that directed their members toward her post, Schrier said.

“So while the overt threats are unsettling, particularly after Jan. 6, I think about this whole ecosystem, your ecosystem, that directs a hostile sliver of society, en masse, to my official Facebook page,” she said to Zuckerberg.

He acknowledged that vaccine misinformation is an important issue and that the enforcement process can be difficult. The Post reported this month that Facebook is conducting a huge study of doubts expressed by U.S. users about vaccines.

Illegal drug sales on online platforms come up again

Several times during the hearing, the tech CEOs, and particularly Facebook’s Mark Zuckerberg, were asked about the sale of illegal drugs and opioids on their platforms.

This issue — the subject of a Washington Post investigation in 2018 — continues to pop up because such posts are still available, though there is more enforcement now than there was a few years ago.

Rep. David B. McKinley (R-W.Va.) pointed to the charges against drug companies and said to Zuckerberg, “So why shouldn’t you be held liable as well … Do you think you’re above the law?”

But Zuckerberg insisted that drug sales weren’t happening, even as the lawmaker pointed to evidence of Ritalin, Xanax and Adderall sales.

“I don’t think we’re allowing this to take place where we’re building systems that take the vast majority of this content off our systems,” he said.

Lawmakers press tech CEOs to admit fault for Capitol riots

Nearly five hours in, lawmakers pressed the social media CEOs to admit they held some responsibility for the ideas and organizing that led to the Capitol riot on Jan. 6.

Zuckerberg insisted that the reason Facebook is mentioned so frequently in the government charging documents that have emerged from those events is because Facebook has been so helpful to law enforcement. He also compared Facebook’s predicament to that of a police force, pointing out that no one expects a city police force to stop every single crime.

Pichai focused on the content Google and YouTube took down, rather than the content the company missed.

And Dorsey said the company saw no evidence of violence in the days before the riot, despite researchers raising red flags.

This hearing is a sprint, not a marathon

Whew, there are a lot of lawmakers who want the chance to question the CEOs of Google, Facebook and Twitter at Thursday’s hearing. And each is trying to cram as many significant topics as possible into a five-minute window.

Coronavirus disinformation? The dangers to children online? The harmful impact of false claims on climate change? Please answer as quickly as possible. Preferably, yes or no.

The hearing, which has already lasted nearly five hours, has featured lawmaker after lawmaker trying to force the executives to give brief, often one-word answers. That’s likely a tactic to get a straight answer rather than a hedge.

But it’s also a way to make the most of their five-minute time limits, which are being closely enforced by the committee’s chairs. Over and over again, members of Congress remark something to the effect of: “I’m going to reclaim my time. I only have five minutes.”

And if they don’t cut themselves off, the chair is sure to: “The gentlelady’s time has expired.”

Zuckerberg says election misinformation spread on TV, private messages too

Facebook’s Zuckerberg said his company handled misinformation well during the 2020 election cycle and argued that TV broadcasters and news providers should carry some of the blame for pushing false information, too.

“A lot of the stuff, I think, unfortunately, was amplified on TV and in traditional news as well,” Zuckerberg said in response to a question about how misinformation spread through Florida’s Latino community. “There were certainly some of this content on Facebook. And it’s our responsibility to make sure that we’re building effective systems that can reduce the spread of that. I think a lot of those systems performed well during this election cycle.”

Researchers have shown that Facebook was a core arena for supporters of former president Donald Trump to organize rallies protesting the election result. The rally in Washington, D.C., that spawned the Jan. 6 Capitol attack was promoted in Facebook and Instagram posts.

Zuckerberg also pointed out that a lot of misinformation is spread through private messages and groups that the company can’t moderate. Messages sent through Facebook’s WhatsApp messaging service are encrypted to keep them private.

“Someone sends a text message to someone else,” he said. “They’re determining whether that gets delivered. People can just send that to someone else.”

Rice grills Zuckerberg on targeting of veterans online

Rep. Kathleen Rice (D-N.Y.) drilled into disinformation targeted at veterans and military service members at the hearing Thursday. But first, she threw a barb at Twitter’s Dorsey for tweeting during the meeting.

“Your multitasking skills are quite impressive,” she said, after asking Dorsey which option in his “yes/no” tweet poll was winning. (Yes is winning.)

Rice asked Zuckerberg how the company is working with veterans groups to prevent disinformation from being lobbed at military members online.

“It’s deeply disturbing the involvement of our veterans and military service members in the violence that took place on Jan. 6,” she said. The Capitol attack was expected to be a major theme of the hearing, but has taken somewhat of a back seat.

“Nefarious actors” have found ways to use algorithms on social media to push content to veterans and military members that they did not seek out, Rice said. They are targeted in order to “misappropriate their voices, authority and credibility” to disseminate propaganda, she said.

“Do you believe that veterans and military service members are just like other Americans in that they are susceptible to the impulses in human psychology that Facebook exploits to drive engagement?” Rice asked.

“Congresswoman, there’s a lot in your characterization there that I disagree with,” Zuckerberg responded.

Dorsey again admits to booting Trump from Twitter

The hearing is going on four hours, and former president Donald Trump’s name has barely come up.

However, in one short exchange, Rep. Debbie Lesko (R-Ariz.) asked Dorsey whether he signed off on the decision to ban Trump’s account permanently.

The Washington Post has previously reported that his policy team made the decision and Dorsey signed off on it, and Dorsey has discussed the decision in a long Twitter thread. He reiterated that the decision ultimately rolled up to him.

To the extent that Democrats brought up the events of Jan. 6, and many of them did, they focused much more on the role of the protesters than on the role GOP leaders played in stoking anger and egging on rioters.

Trump has barely come up at the first social media hearing since his accounts were suspended

Tech executives are in the hot seat for the first time since their controversial decision to suspend the accounts of a sitting president of the United States.

But even Republicans who have raised unproven allegations of anti-conservative political bias barely touched on the watershed moment for the social media giants.

The lack of questions about Trump underscored that the hearing did not primarily focus on the pivotal decisions tech executives have made about content on their services since their appearance on Capitol Hill last year.

Instead, lawmakers whipsawed among many different issues, ranging from how tech algorithms can promote discrimination to how social media companies amplify disinformation about climate change. A major focus for many was protections for children, as lawmakers raised concerns with the CEOs about how their platforms are used by children under the age of 13 and cited research showing social media has a negative effect on the mental health of teens. Lawmakers from both parties asked questions about this topic, indicating that strengthening online protections for children could be an area of bipartisan consensus.

Throughout the hearing, lawmakers from both parties attempted to pin executives on whether they take responsibility for offline harms fueled by their platforms or their content moderation decisions. That reflected a general interest from lawmakers in examining changes to Section 230, a decades-old law that shields tech companies from legal responsibility for the posts, photos and videos that people share on their services.

Lawmakers from both parties indicated throughout the hearing that they’re ready to regulate the tech companies. Several mentioned legislation they’ve introduced, such as bills to address the algorithmic promotion of extremism or to curtail discriminatory advertising. But it was not clear what path lawmakers would take to follow through on their promises to crack down on Silicon Valley.

Facebook’s climate change center was modeled on its covid-19 page

Rep. A. Donald McEachin (D-Va.) wants to know why Facebook isn’t applying the same level of fact-checking to instances of climate change misinformation as it is to posts and ads regarding the coronavirus.

“As my colleagues and I clearly expressed in our letter, climate change is a real and urgent threat,” the lawmaker said. “The spread of disinformation on your platforms is undermining that fact.”

Zuckerberg agreed that it’s a serious issue and confirmed that Facebook modeled its climate science information center on a similar initiative it launched for covid-19. But Zuckerberg said Facebook divides misinformation into buckets, the most serious being what could cause “imminent physical harm.”

Some false claims about the coronavirus, or the vaccines to prevent it, could lead to someone getting sick, he said, so Facebook will remove those posts. The company often leaves up posts that it deems will not lead to imminent harm, sometimes with labels, something it has faced criticism for in the past.

“That’s the broad approach that we have … that sort of explains some of the differences between some of the different issues and how we approach them,” Zuckerberg said.

Huge cardboard cutouts of tech CEOs portray them as rioters

Nonprofit advocacy group SumOfUs propped up seven-foot-tall cardboard cutouts of the three tech CEOs outside the Capitol — depicting the executives as rioters who stormed the building in January.

The cutouts, positioned on Third Street SW near the Capitol, depict Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai and Twitter CEO Jack Dorsey as some of the recognizable figures who stormed the Capitol during the Jan. 6 attack. Their likenesses were substituted for the faces of the rioters, including one who wore a horned hat, another who stole House Speaker Nancy Pelosi’s lectern, and one more wearing a shirt with the symbol for the extremist ideology QAnon.

SumOfUs, which advocates quelling the power of large corporations, said it made the cutouts after researching the role social media companies played in the attack.

“The platforms’ inability to deal with the disinformation crisis shows that these platforms are failing to regulate themselves, and after the past 5 years of manipulation, data harvesting, disinformation, and hate speech, the time has come to rein in Big Tech,” the organization said in a news release.

Google, Facebook and Twitter declined to comment.

Dorsey tweets yes/no question of his own

Lawmakers were intent on holding the tech CEOs to yes/no answers during Thursday’s hearing, and Dorsey posed a similar question of his own.

Dorsey tweeted a simple “?” during the hearing, and added a poll with just two answers: yes and no.

The poll drew more than 37,600 votes in its first 30 minutes, with “yes” winning nearly 65 percent of the vote.

Lawmakers tried to hold the executives to concise, one-word answers over and over again during the hearing. It only sometimes worked, and usually only on simple questions, such as whether the CEOs had seen the documentary “The Social Dilemma.”

Members of Congress regularly cut off the executives when they hedged instead of answering in one word or tried to add more context to their answers.

“Let me just say this, and it’s I think it’s irritating all of us, and that is that no one seems to know the word yes or the word no,” Rep. Anna G. Eshoo (D-Calif.) said when the executives again tried to answer in more than one word.

CEOs pressed on misinformation in Spanish

The issue of Spanish-language misinformation came up several times in the hearing, with a particular focus on Facebook.

A lawmaker cited a study by the left-leaning human rights group Avaaz, which found that 70 percent of the Spanish-language misinformation it analyzed had not been labeled by Facebook’s fact-checkers, compared with 30 percent of English-language misinformation.

Another lawmaker brought up Spanish-language ads run by the Trump campaign that falsely accused President Biden of being endorsed by Venezuelan President Nicolás Maduro.

Zuckerberg did not answer specific questions about how much the company invests in combating misinformation among its Spanish-speaking users compared with English-language users.