Engadget and Point worked together to investigate the continued activity of Russian troll farms on the popular social media platform Reddit. For clarity, these troll farms are operated by the Russian intelligence community, overseen by President Vladimir Putin, and have no relation to the activity of non-government Russian citizens.
A recent video published by Point details their findings.
Details Of The Investigation
“I’ve continued to hunt down Russian propaganda,” said DivestTrump, an individual speaking on deep background who requested anonymity from Engadget to avoid online abuse. “It’s in the tens of thousands of posts and thousands of users are spreading it — it’s incredibly pervasive,” he continued when speaking to the outlet about the research.
The Point and Engadget investigation found that alt-right.com, veteranstoday.com, and southfront.org were linked to Russia’s Internet Research Agency and were targeting conservative subreddits as part of a larger active measures campaign.
According to the extensive report, propaganda sites are actively targeting “at least 89 subreddits.” Engadget continued, “Our findings suggest a Russian-led attempt to antagonize and influence Americans online, which is still ongoing. The total subscriber base of all 89 subreddits is in excess of 68 million registered users.”
The disinformation campaign on Reddit spread to communities that were non-partisan, left-leaning, or representative of a different brand of authoritarian ideology, as subreddits such as r/worldnews, r/atheism, and r/COMMUNISM were also targets. However, the investigation uncovered even further details.
The LinkedIn page of usareally.com lists its location as Moscow, and its social media manager is a Moscow-based Dmitry Kukushkin. Another Russian, Alexander Malkevich, created the website, and he works for Yevgeny Prigozhin, who has been indicted for interfering with U.S. elections. The FBI indictment says Prigozhin bankrolled Russia’s Internet Research Agency, which has been accused of being a troll factory. The other website that DivestTrump exposed last month, brutalist.press, left similarly easy breadcrumbs to follow. In June 2017, a job listing was posted to a Russian career website with “[email protected]” as the contact email address. The advertised position was for a front-end developer with English skills who would work from home, with occasional work trips to St. Petersburg.
DivestTrump commented, “A lot of times [these sites] can look American. Sometimes they don’t even try to cover their tracks, which was the case with usareally.com — they were registered, they were hosted, all out of St. Petersburg.”
The entirety of the report can be found on Engadget.
What These Findings Detail
Unfortunately, a fringe segment on both sides of the political aisle in the United States continues to argue over whether the Russian intelligence community attempted to influence the 2016 United States presidential election, or whether it had any influence on the actions of voters — despite growing evidence to the contrary. While the investigative report has no way to track changes in the voting habits of individuals in these various subreddits, it does show how widespread and calculated the disinformation campaign is. The study does not assert that these actions were the sole factor in the 2016 election results in the United States; instead, it details the importance of eliminating the use of bots in political discourse.
Within the report, Srijan Kumar, a postdoctoral researcher at Stanford University, provided analysis of conflicts on Reddit uncovered by the investigation. “Less than one percent of communities were responsible for 75 percent of antisocial behavior,” he began. “What a Russian troll or bot could do is essentially start these conflicts so that people get more engaged in the community and that would increase the visibility of the community and therefore increase the anger that they have stirred.”
These amplified conversations push often false and hateful narratives that deflect attention from important topics facing the nation.