Russian involvement in the 2016 US general election acted as a catalyst for this interest. The spotlight quickly broadened to include Brexit and the Scottish independence vote.
Russian media operatives and bots have shown interest in a number of other areas. They get fired up by the gun regulation debate. They are opposed to vaccination. They are opposed to GMO crops.
Matt Ridley recently wrote an article about how Russia promoted the "nuclear winter" theory - which was subsequently widely discredited.
Fracking, climate change and energy policy have been other targets of the Russians.
Russia's use of disinformation goes back decades, but it is only in the last couple of years that awareness of it has become more widespread. "Disinformation" has become known as "fake news" - a term which exploded in popularity in November 2016.
This article is about Russian efforts, but I don't mean to suggest that other countries are not involved. American cyberwarfare operations include Operation Olympic Games, which produced the Stuxnet worm. The American efforts have been covert, though - while the Russian ones are easier to study because they often involve public distribution.
The topic is currently popular, and it could fairly easily be used as a case study for students of cultural evolution. The most obvious things we want to know are what issues the Russians have attempted to influence, how they have attempted to influence them, how successful they have been and what can be done about the issue.
One frequently reported Russian technique is to play both sides of an issue. However, it is not known precisely why they like to do that. There are several possibilities. One is that they want to create conflict between Americans, creating domestic problems and distracting them from foreign policy issues. Another possibility is that they want to create media noise and controversy as part of spreading their message. If everyone is on one side of an issue, it becomes a non-issue, and that is not newsworthy or spreadable. Another possibility is that by controlling both sides of the argument they can better make one side look stupid. Audiences assess arguments by considering the merits of the case presented by each side. They also consider which side they want to affiliate with. Infiltrating the other side and then presenting weak arguments and behaving like an idiot are techniques which can be used to damage your opponent's position.
For example, here is a Russian anti-DAPL meme and a Russian pro-DAPL meme:
Now, I think it is pretty obvious that the second meme is trolling. Pollution by protesters isn't the reason why some people favored the DAPL. A more realistic reason is that they didn't want a tiny minority messing up energy distribution for everyone else due to selfish, NIMBY issues. The real intended message of the second meme is something like: those who favor the pipeline are assholes. This makes sense as a message that might be favored by the Russian manipulators: opposing the DAPL interferes with the American domestic energy distribution network.
Not everyone seems to agree with this conclusion. For a counterpoint, see this ArsTechnica article, which argues that the Russians were just trying to create conflict.
Disinformation campaigns are part of the dark side of memetics. By working to understand them it might be possible to combine performing useful science with doing social good. I hope that some people in the field will seize these kinds of opportunities.