Senator Chris Coons (D-Del.), Chairman of the subcommittee hosting the hearing, told POLITICO last month that he plans to make social media and algorithmic accountability a key topic for his panel in this Congress.
“Social media platforms use algorithms that determine what billions of people read, see and think every day, but we know very little about how these systems work and how they affect our society,” Coons told POLITICO on Friday. “We hear that these algorithms amplify misinformation, promote political polarization, and make us more distracted and isolated.”
No tech CEOs this time: Coons said in March that he would “very likely” call tech executives, including the CEOs of Twitter and Facebook, to testify on the matter. The upcoming hearing, however, will instead feature testimony from the executives who oversee the companies’ content policies.
Congressional aides said they hope to avoid the airing of grievances that has plagued some hearings with the big tech CEOs by involving company officials who are more focused on content policy. But the aides said the option of bringing in the CEOs remains on the table for the panel in the future.
YouTube in the hot seat, for once: Mark Zuckerberg, CEO of Facebook, testified four times in the past year, while Jack Dorsey, CEO of Twitter, and Sundar Pichai, CEO of Google, each appeared three times over the same period.
YouTube, a Google subsidiary and the second-largest social media platform in the world after Facebook, has been called to testify sparingly over the years compared with its rivals Facebook and Twitter, or even its parent company. YouTube CEO Susan Wojcicki has never testified alongside any of the other tech bosses.
According to congressional aides, researchers have found that YouTube is generally less transparent about how its algorithms work than some other popular platforms. How YouTube determines which videos to recommend to its users will be a major focus of the hearing, they said.
Where the legislation stands: Two prominent House Democrats, Reps. Anna Eshoo (D-Calif.) and Tom Malinowski (D-N.J.), have introduced legislation to strip liability protections from online platforms in cases where they amplify content that leads to certain real-world harms, such as civil rights violations or acts of international terrorism.
Algorithmic amplification has also come under increasing scrutiny from Democrats on the House Energy and Commerce Committee, whose leaders have pledged to legislate against the spread of domestic extremism and misinformation on social media.
However, it remains to be seen whether this is an issue where Democrats can get significant bipartisan buy-in from Republicans, who have traditionally been more concerned with how tech companies restrict, not amplify, content.
“This is not a show hearing for pounding the table; it is an opportunity to learn,” Senator Ben Sasse of Nebraska, the top Republican on the subcommittee, said in a statement.
Congressional aides said they hoped to spark bipartisan interest by focusing the session on structural questions of how companies handle content moderation rather than on how they treat particular types of material, such as political speech.
The hearing could lead to discussions about whether regulators at the Federal Trade Commission should be given more power to oversee company practices, or whether additional federal research into the technology sector is needed, the aides said.