Honolulu Star-Advertiser


Misinformation thrives on video site popular with far-right

ASSOCIATED PRESS

Affidavit printers are lined up at the Maricopa County Elections Department in Phoenix, on Sept. 8. Big tech platforms say they are working hard to address misinformation about voting and elections ahead of the November midterms, but a look at their sites shows they are still struggling to contend with false claims from 2020.


Election misinformation is thriving on Rumble, a video-sharing platform popular with some conservatives and far-right groups, according to research published today.

Nearly half of the videos suggested by the site in response to searches for common election-related terms came from untrustworthy sources, according to the analysis from NewsGuard, a firm that monitors online misinformation.

The share was far lower at Rumble’s much larger rival, YouTube, where about 1 in 5 suggested videos came from untrustworthy sources. The search terms included the names of candidates as well as politically sensitive words and phrases such as gun rights, voter fraud and abortion.

The findings illustrate how alternative platforms like Rumble have become hot spots for election-related misinformation as their popularity has grown. The site draws conservatives and some far-right groups critical of content moderation efforts by larger platforms such as YouTube.

Misleading or deceptive claims about voting and elections have proliferated heading into next week’s elections and have been blamed for increasing distrust and polarization.

Some of the videos reviewed by NewsGuard’s researchers in October included online shows featuring allies of former President Donald Trump such as Steve Bannon and conspiracy theorists such as Alex Jones. Many videos contained debunked claims about the 2020 election, the Jan. 6, 2021, attack on the U.S. Capitol, the QAnon conspiracy theory, as well as misinformation about voting and the elections.

“Rumble frequently pushes videos from untrustworthy sources that traffic in election misinformation,” NewsGuard’s report found.

The researchers used a number of factors, including the use of deceptive headlines or a history of publishing false content, to determine which sites were untrustworthy.

Messages left with Rumble were not immediately returned Wednesday and today. According to a mission statement on the platform’s website, Rumble aims “to restore the internet to its roots by making it free and open once again.”

Rumble said in September it now has 78 million active monthly users around the world, with 63 million in the United States and Canada. The site boasts a long list of podcasts helmed by prominent conservatives such as Dan Bongino and Bannon, who have millions of subscribers on Rumble.

The Florida-based platform’s growth has come from users interested in news and politics, as well as younger users in the 18-24 age group, Rumble CEO Chris Pavlovski said in September.

Bannon’s show was among the top results when researchers ran a search on the term “voter fraud.” A longtime Trump ally, Bannon was kicked off YouTube last year for repeatedly violating its rules; he was banned from Twitter after calling for Dr. Anthony Fauci, the nation’s leading infectious disease specialist, to be beheaded.

A spokesman for Bannon told the AP today that Bannon’s comment was metaphorical and that he didn’t intend it to be taken literally.

Together, the misinformation-laden videos turned up by researchers at NewsGuard had been viewed nearly 9 million times.

YouTube has been criticized for not doing enough to tackle misinformation on its platform. But the NewsGuard report shows the platform’s efforts are making a difference. Researchers said that in addition to suggesting fewer videos containing misinformation, YouTube did not recommend any videos supporting QAnon.

Nevertheless, a report released this fall by New York University faulted Meta, Twitter, TikTok and YouTube for amplifying Trump’s false statements about the 2020 election. The study cited inconsistent rules regarding misinformation as well as poor enforcement.

In a statement emailed to the AP, YouTube spokeswoman Ivy Choi said the platform, which is owned by Google, has invested in efforts to identify misinformation and limit its spread. The most visited channels and videos about the election all rely on trustworthy, authoritative sources, she added.
