Google plans to invest $10 million over the next two years to support literacy efforts, including media literacy, and is partnering with YouTubers to help teach children how to spot fake news.
YouTubers such as AsapSCIENCE, an award-winning science channel with more than 7 million subscribers, and Smarter Every Day, another science channel with more than 5.5 million subscribers, are among the creators YouTube plans to work with, according to a company representative. Both AsapSCIENCE and Smarter Every Day have expressed interest in helping with the initiative, the representative said, and more details are expected to be published in the months leading up to the program's launch.
Creators who focus on other educational areas or on news can also work with YouTube as part of the initiative. The goal is to help teach children how to discern what is real and what is fiction when they read news articles or watch YouTube videos. Google's decision to invest in a media literacy project comes at a particularly interesting and worrying time for YouTube. The company is trying to address its growing conspiracy theory problem, a subject that has been written about extensively and studied by academics.
YouTube is trying to address its growing conspiracy theory problem
After last month's school shooting in Parkland, Florida, videos describing survivors of the attack as "crisis actors" reached the top of YouTube's Trending list, meaning the platform's algorithm recommended them. YouTube CEO Susan Wojcicki responded last week by announcing that the company would begin supplementing such videos with Wikipedia annotations on the topics in question. The decision was met with scorn from reporters, critics, and even Wikipedia itself, whose executives wrote a letter calling out Wojcicki and her team for not contacting the Wikimedia Foundation beforehand.
YouTube is caught in a unique situation: people turn to the platform for news, even watching official news clips from renowned organizations such as CNN, but YouTube does not consider itself a news organization.
"If there is an important news event, we want to provide the correct information," Wojcicki said, as reported by BuzzFeed, before reiterating that "we are not a news organization."
That means YouTube's moderation and engineering teams have been trying to find ways to help viewers separate facts from conspiracy theories. Changing the recommendation algorithm and adding Wikipedia annotations are meant to help. Part of the problem, however, is that YouTube will not take a stand against some of the most notorious conspiracy theories peddled on its platform, even though the company admits they are a problem for its viewers. InfoWars host Alex Jones, for example, who is best known for claiming that the Sandy Hook shooting did not happen, is one of the biggest contributors to YouTube's current conspiracy theory problem. As we said in a previous piece:
Jones, like many other conspiracy theorists and propagandists on YouTube, learned how to take advantage of Google's algorithm to ensure that his content is seen and shared. The trick is simple: take a popular news topic – a school shooting, for example – then add a term like "crisis actor" and post content about it. Jonathan Albright, research director of the Tow Center for Digital Journalism at the Columbia School of Journalism, wrote about this growing trend in late February, right after the shooting at Marjory Stoneman Douglas High School in Parkland, Florida.
"Mass shooting, false flag, and crisis actor conspiracy videos on YouTube are a well-established, if not flourishing, genre," Albright wrote.
Investing in global literacy and media literacy over the next few years, and engaging YouTube creators, may be Google's next step in combating fake news. It is not yet clear whether YouTube will take additional measures and ban conspiracy theorists once and for all.