Some Democrats Concerned With Algorithmic Discrimination During TikTok Hearing

While most lawmakers on both sides of the aisle spent their five minutes of questioning on national security concerns during the House Energy and Commerce Committee hearing with TikTok CEO Shou Chew, some raised concerns about discrimination via algorithms. Abortion misinformation was also discussed, with Representatives Lisa Blunt Rochester, D-Delaware, and Diana DeGette, D-Colorado, questioning the CEO about how many such posts had been removed from the platform.

Representative Yvette Clarke, D-N.Y., questioned the CEO about monetization, specifically the policy that bars users who have violated the app's community guidelines from receiving compensation, and the possibility that algorithms incorrectly flag accounts based on the use of certain terms. Rep. Clarke stated that terms like "black lives matter have been flagged as inappropriate content," without citing specific evidence. Mr. Chew responded that there is an appeals process through which users can dispute flags on their accounts.

Rep. Clarke went on to say that she is concerned about the disproportionate harm black users face on TikTok due to algorithmic discrimination. She said there "is nothing new" about black people experiencing exploitation and the erasure of their ideas and creations, adding that action needs to be taken to protect black creators on the platform.

Earlier in the hearing, Representative Doris Matsui, D-Calif., spoke about her bill, the Algorithmic Justice and Online Platform Transparency Act, which would prohibit discriminatory algorithms and require platforms to publish public reports providing transparency around the types of content they moderate. The bill is co-sponsored by Senator Ed Markey, D-Mass., and would empower a group of government agencies to investigate platforms for algorithmic discrimination.

Copyright 2023,