The CEOs of Facebook, Twitter and Google are likely to face tough questions about how their platforms handle disinformation at Thursday's congressional hearing.
When the CEOs of Facebook, Google and Twitter testify before Congress on Thursday, March 25, the future of information flowing unchecked and unregulated across their platforms will be at stake. Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai and Twitter CEO Jack Dorsey are scheduled to appear via teleconference before the House Energy and Commerce subcommittees on Communications and Technology and on Consumer Protection and Commerce at noon EDT. The hearing is titled "Disinformation Nation: Social Media's Role in Promoting Extremism and Misinformation."
“Whether it be falsehoods about the COVID-19 vaccine or debunked claims of election fraud, these online platforms have allowed misinformation to spread, intensifying national crises with real-life, grim consequences for public health and safety. … This hearing will continue the Committee’s work of holding online platforms accountable for the growing rise of misinformation and disinformation,” the committee chairs said in a joint statement announcing the hearing.
Given the role that disinformation, and its rapid, unchecked spread on social media platforms, has played in the last two U.S. presidential elections, the Jan. 6 insurrection that brought Congress to a standstill and left five people dead, and efforts to tame the COVID-19 pandemic, regulating Big Tech is top of mind for U.S. legislators.
“The hearing will shed light on whether the government is likely to be deferential to certain components of industry self-regulation and, if so, how,” said attorney Jennifer Lee, a partner at the law firm Arent Fox who works with tech companies on regulatory issues. “It will also shed light on how executives and CEOs can plan ahead, and what everyone should expect to see coming down the pike in the next few months.”
SEE: Report: SMBs unprepared to tackle data privacy (TechRepublic Premium)
Section 230 of the 1996 Communications Decency Act, which shields tech companies from liability for what is said by third parties on their platforms, will be a likely focus of the hearing, Lee said.
During his time in office, former President Donald Trump repeatedly railed against the protections Section 230 affords social media companies. Conservatives in Congress, citing alleged censorship by Big Tech, have taken issue with Section 230, Lee said. Liberals are concerned about Section 230 as well, because they believe it promotes the spread of disinformation and hate speech.
Another focus of the committee could be how the companies decide what is prohibited speech and what is not, said India McKinney, director of federal affairs at the Electronic Frontier Foundation and a long-time Capitol Hill staffer.
“What is truth? Who gets their posts taken down, and can the company, at scale, tell the difference between hateful speech and people documenting hateful speech?” McKinney said. “They are going to have some questions around transparency. Members are going to want to know how come Facebook suggests all these sites that are, in theory, against their terms of service. How does that happen?”
For example, if a watchdog group is posting information about what they believe to be hate speech, will their posts be taken down even though their intent was not to create hateful speech, but to document it? The problem is very similar to one faced over the years by the U.S. courts when defining obscenity, she said.
Also at issue is the definition of censorship. The First Amendment protects free speech from government censorship. Private companies such as Facebook, however, are under no such Constitutional restrictions. They can set whatever policies they want.
Karen Schuler, a practice leader at BDO’s Governance, Risk and Compliance national practice, said the committee will be looking to see if the companies are promoting fake news and if their algorithms are blocking accounts “that do not hold the same values as the companies.”
The committee will likely want information on how the companies are combating disinformation, for example by vetting news for accuracy before it is promoted, as Facebook has been doing, and on whether their handling of disinformation is disrupting the economy. The committee will also be looking for information on who is creating and spreading disinformation, such as nation-state actors or other criminal enterprises.
SEE: Navigating data privacy (free PDF) (TechRepublic)
While all three companies have for years made efforts to combat misinformation, the results have been mixed. According to a study by the German Marshall Fund of the United States, a non-partisan policy organization focused on democracy, human rights and international cooperation, disinformation “is infecting our democratic discourse at rates that threaten the long-term health of our democracy.”
How the hearing could impact the business of information
Should today’s hearings lead to legislation that is not carefully crafted, the impact could be felt by every business that allows public commentary on its web pages, Lee said.
“The realities of 1996, when Section 230 was enacted, are very different from the realities of 2021 and the future,” she said. “Congress must catch up and understand [that dynamic], in order to write good policy that works for their constituents. If a law is written to hurt tech companies’ liability, the real losers will be the smaller businesses or new entrants who try to innovate.”
Schuler said the hearing could, among other outcomes, lead to legislation requiring social media platforms to verify promoted news stories with humans, not algorithms, and improve guidance regarding the information partners can share without vetting. Giving the FCC and FTC more latitude in investigating misinformation could also be part of future legislation.
“Additionally, as a result of this hearing, environmental, social and corporate governance initiatives will likely start to intersect with privacy and security legislation to ensure that appropriate governance standards are instituted … the link would be that, from a social and governance responsibility perspective, companies might have greater responsibility to provide accurate information to the public,” Schuler said.
Even though the role of disinformation in shaping public opinion and real-world events has been debated by Congress for years, the impact of today’s hearing, the first since Democrats took back Congress and the White House, on future legislation could be significant, McKinney said.
"Sometimes the hearings are just scattershot but, if they come in really targeted and focused, it will be clear that Congress is being very intentional, and I would expect legislation to come out of this," she said. "It's going to shift the conversation. This is a big deal."
Update March 26, 2021: Correctly attributed a quote to India McKinney and not Jennifer Lee.