In the first, a Rolling Stone article by Darren Linvill and Patrick Warren, the authors suggest that professional trolls don't actually troll. Amateur trolls* are pretty easy to spot if you've been on social media for a moderate amount of time.
Professional trolls, though, are often far from obvious, even to experienced social media users; they work to befriend users and then sway them with spin, using the following strategy:
"Grow an audience in part through heartwarming, inspiring messages, and use that following to spread messages promoting division, distrust, and doubt."The goal is to undermine trust in American institutions and "drive mainstream viewpoints in polar and extreme institutions." Yes, this goal seems somewhat obvious, but it's interesting to note in the context of ubiquitous sneering on the left and the right at "moderates" and "centrists."
And, while I believe that it's generally not helpful to find a "moderate" position between civility and, say, neo-Nazis, the phrase "centrist" (like "neoliberal") is often thrown around on Twitter in pretty disingenuous ways by people acting in both good and bad faith to drive people toward the extremes.
In a related piece, Linvill has noted just how well Russian trolls understand US culture; he's written about disinformation in the context of our national conversations about campus climate, for instance:
"Covert Russian disinformation may seem out of place in the context of a conversation regarding campus climate. It is not, though. The IRA’s attempts to demoralize, distract, and divide have been discussed as a form of political warfare (Galeotti, 2018) and, through social media, it is a form of warfare that extends to our college campuses. Not only does the IRA seek to reach students on our campuses in order to influence their ways of thinking, but also they wish to attack the institution of higher education itself and make it a political wedge between Americans of different ideologies (Bauman, 2018; Morgan, 2019). Bauman pointed out, for instance, that in the run up to the 2016 election, IRA troll accounts repeatedly tweeted segments of conservative media that 'spotlighted incidents of liberalism run amok at colleges' (2018, p. A28)."Here, it's worth pointing out that conservative Christian Rod Dreher bemoans campus political correctness practically on the daily at his blog at The American Conservative, essentially acting as a useful idiot for amplifying, overreacting to, and sowing these divisions. He's hardly alone there, as this PC Gone Awry narrative is a cottage feature of rightwing media.
Anyway, Russian disinformation has been ongoing since before the 2016 election. Knowing this, I've been trying pretty hard, though I don't always succeed, to stay above the fray and avoid getting embroiled in the day-to-day dramas of the 2020 Democratic Primary, particularly online.
Just so you know where I stand, I am leaning heavily toward voting for Elizabeth Warren, because I believe she has the best policies, judgment, and demeanor for the job. But, I also believe we have a solid slate of candidates, with some exceptions, any one of which would be infinitely better than Donald Trump and Mike Pence.
I also think candidates should be critiqued, fairly, when warranted, but Twitter in particular is often used to virally spread some of the most disparaging, superficial, and, yes, dumb critiques of candidates. In fact, the retweet is built for the shallow dunk that's less about analysis and more about feeding users' need for the dopamine hits they get from the attention and notifications that come with likes and retweets.
Relatedly, another takeaway from the Rolling Stone piece is that Russia's disinformation efforts are ongoing, and are bigger than the 2016 and 2020 elections. What I need from political candidates is an acknowledgement of this problem and solutions to address it, not boasts about how they woulda won in 2016 (or will magically win in 2020) even though nothing about their platforms, statements, or mental capacities suggests they even understand the magnitude of the threats facing our nation and democracy.
Finally, these stats:
"Recent research exploring fake news may expand Boyd’s concerns regarding how we have taught digital media literacy. Research examining Twitter suggests that concerns regarding fake news may be based on incorrect assumptions of its prevalence. Grinberg, Joseph, Friedland, Swire-Thompson, and Lazer (2019) found that only 0.1% of users were responsible for sharing 80% of fake news posts, and these users were highly concentrated among conservative voters. Research examining Facebook found similar results. Guess, Nagler, and Tucker (2019) found the sharing of fake news on that platform to be a rare event; and to the extent that it was a problem, it was largely a problem among Baby Boomers. Users over 65 were nearly seven times more likely to share fake news as the youngest cohort of users. This stands in strong support for Lee’s (2018) call to teach digital media literacy to older adults. Yet to date we have been teaching digital literacy in college, when we should be teaching it in retirement homes."Whew.
*I continue to object to using the word "troll" to describe abusive online behavior, because I believe it tends to minimize the harmful impact such behavior can have on legitimate users of social media. I've used it throughout this post for the sake of consistency with how Linvill and Warren use it.