FILE PHOTO: 3-D printed figures are seen in front of a displayed TikTok logo in this picture illustration taken November 7, 2019. REUTERS/Dado Ruvic/Illustration/File Photo
  • Kids are vulnerable to COVID misinformation on TikTok within minutes of signing up, a study shows.
  • Though TikTok prohibits users under 13, younger children can easily lie about their age to sign up.
  • Social media companies continue to face public criticism about their effects on young users.

At this point, it's no secret that social media algorithms inadvertently help peddle COVID misinformation to millions of users. The more pressing problem is who that content reaches.

The popular social media app TikTok is feeding misinformation to young children, sometimes within minutes of their signing up. In a recent study, false information reached children as young as nine even when they neither followed nor searched for that content.

According to a report from media rating firm NewsGuard, COVID-related misinformation reached eight of the study's nine child participants within their first 35 minutes on the platform, and two-thirds of the participants saw false information specifically about COVID vaccines. The content included unsubstantiated claims about COVID and the vaccines, as well as homeopathic remedies for the disease.

"TikTok's failure to stop the spread of dangerous health misinformation on their app is unsustainable bordering on dangerous," Alex Cadier, the UK managing editor for NewsGuard who co-authored the report, told the Guardian. "Despite claims of taking action against misinformation, the app still allows anti-vaccine content and health hoaxes to spread relatively unimpeded."

NewsGuard conducted the study in August and September, asking children ages nine to 17 from different cultural backgrounds to create accounts on TikTok. Though the platform restricts full access to the app for users younger than 13, the three youngest participants were able to create accounts with no outside help. As of March 2021, a quarter of TikTok's 130 million monthly active users in the US were between 10 and 19, according to Statista.

"TikTok is very bad at removing videos with misinformation, and these videos with vaccine misinformation stay for months and months on the platform," University of Illinois School of Public Health Epidemiologist Katrine Wallace, who battles misinformation on Tik Tok, told Insider. "The more viral these videos get, the more eyes will see them, and unfortunately some will be children, due to the nature of the algorithms."

TikTok's community guidelines prohibit "false or misleading" content relating to COVID-19 and its vaccines, and the company employs teams that work to identify and remove misinformation, evaluating all COVID-related content on a case-by-case basis.

The app also says it pushes for an "age-appropriate experience," discouraging and removing accounts created by underage users and restricting LIVE and direct messaging features for younger teens. Douyin, the Chinese version of TikTok, announced in September that it was capping the amount of time users under 14 can spend on the app at 40 minutes per day.

TikTok didn't respond to a request for comment on the NewsGuard report.

Besides TikTok, other platforms like Facebook, Instagram, and Twitter have come under fire in recent months as increased transparency from the companies revealed more about social media's effects on society, particularly on younger generations. This week, a Facebook whistleblower helped shed light on the ways its platforms psychologically harm teenage users. Meanwhile, high-profile influencers on social media continue to spread COVID misinformation, ramping up the amount of harmful content directed at younger viewers.
