Fake news is not new; it is probably as old as humanity. It has long been rife in politics (manifesto promises announced but never kept) and in commerce (“marketing is no longer about the stuff you make, but the stories you tell” — Seth Godin, marketer). But most of all it is now rife on the internet, fueled by last year’s US presidential campaign and the new administration this year.
Security firm Trend Micro has examined how the internet fake news market operates, and finds that it is remarkably organized and effective: The Fake News Machine: How Propagandists Abuse the Internet and Manipulate the Public (PDF). For effective fake news via the internet, it finds that campaigns need three elements, which it terms the Fake News Triangle: motivation; tools and services; and social networks.
The tools and services are available on the underground internet. Often, for a fee, these services generate the fake news itself as well as promote it via social media.
The social networks are used to fan the flames of fake news through social engineering techniques: “the fake news posts are crafted to appeal to its readers’ psychological desires,” says the report; “confirming biases, the hierarchy of needs, etc.”
Today, the most common motivation is political; but, warns Trend, “It is inevitable that other motivations — such as profit — will come to the forefront in later years.”
Today, fake news promoters can choose between underground tools and legitimate gray-area promotional tools. State-sponsored campaigns may choose to use their own resources; but market tools offer an additional layer of anonymity.
The Chinese fake news underground largely caters for its own market. It includes the generation and placement of advertorials disguised as news stories, large-scale commenting on news and blog sites, vote manipulation and click farms, and the use of social media to influence public opinion. It also offers the opposite — the removal of content. Methods are known to include bribing website administrators and actual hacking and deletion.
The Russian marketplace, says Trend, “can be likened to a one-stop shop for creating, promoting, and manipulating stories and events, news, and profiles — real or imagined — that favor the clientele’s motive.”
One feature of the Russian market is the use of crowdsourcing. VTope, for example, with 2,000,000 mostly real members, implements tasks that “incentivizes users with points, which they can resell or use for self-promotion.” Other organizations, including SMOFast, Kwoki, like4u, TopSoc, and ZiSMO, offer similar services or variations on this model.
Voter manipulation to influence public opinion is also available. “Siguldin,” says the report, “markets itself to be capable of manipulating almost any voting system in the Internet and bypassing security checks such as source IP address, Captcha, and authentication mechanisms in social media, SMS, and email as well as on-site registration among others.”
Such offerings are not the only option — there are also DIY kits for automated social media spamming, which become particularly effective when used with a botnet.
The Middle East market, suggests Trend, is new but growing — especially in generating social media followers. Interestingly, CoulSouk will promote content for a fee, but prohibits the promotion of racist, pornographic, and illegal content. Dr.Followers is a similar service provider.
The English-speaking world hosts numerous services designed to increase followers or YouTube viewers. These include BeSoEasy, Quick Follow Now, and 100kfollowers. Other sites, such as Break Your Own News and ClassTool’s Breaking News Generator, allow users to generate their own fake news. “While these types of websites are only meant for personal recreation and must be taken with a grain of salt,” comments Trend, “when combined with these underground services, they can be very effective in manipulating a story and leading the public into believing it’s actually authentic.”
The key to getting fake news accepted is in the headline. Modern society is an instant-gratification society — and this applies to news consumption as much as anything else. More people read headlines than the actual article; and Twitter comments show that many people believe they know the content based on just 140 characters in a tweet. In fake news, that headline or tweet is designed to be attention-grabbing and something that reinforces extreme views.
“In the realm of political opinion manipulation,” explains Trend, “this tends to be in the form of highly partisan content. Political fake news tends to align with the extremes of the political spectrum; ‘moderate’ fake news does not really exist.”
Fake news is spread largely via Twitter and Facebook. One Twitter method is to use bots to saturate Twitter searches with relevant and timely keywords plus a link to the fake news site. Alternatively, the bots might generate the same message but mention different users and/or include the hashtags of unrelated but trending topics to increase their reach.
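The hashtag-and-mention tactic can be sketched in a few lines. This is purely illustrative (the message, hashtags, and handles are invented for the example, and the report gives no code): rotating unrelated trending tags and mentions across one template yields many distinct tweets, which widens search reach and defeats naive duplicate filtering.

```python
import itertools

# Hypothetical values for illustration only.
BASE_MESSAGE = "BREAKING: shocking story at http://fake.example"
TRENDING = ["#WorldCup", "#Oscars", "#Election"]   # unrelated trending tags
MENTIONS = ["@user_a", "@user_b", "@user_c"]       # unrelated user handles

def generate_variants(base, hashtags, mentions):
    """One tweet per (hashtag, mention) pair, so no two bot posts are
    byte-identical even though they all carry the same link."""
    return [f"{mention} {base} {tag}"
            for tag, mention in itertools.product(hashtags, mentions)]

variants = generate_variants(BASE_MESSAGE, TRENDING, MENTIONS)
print(len(variants))       # 9 distinct tweets from a single template
print(len(set(variants)))  # 9 — all unique
```

The same sketch explains why such campaigns are cheap to scale: adding one more trending hashtag multiplies the variant count rather than adding to it.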
On Facebook the intent is similar, but the method slightly different. Facebook’s algorithm is likely to promote news stories that receive large numbers of likes. Here a dramatic or tantalizing headline with a brief text expansion seeks the combined effect of garnering likes and sending readers to the fake news site.
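The effect of purchased engagement on a like-ranked feed can be shown with a deliberately simplified model (an assumption for illustration; this is not Facebook's actual ranking algorithm, and the posts and numbers are invented):

```python
# Simplified model: a feed ordered purely by like count.
posts = [
    {"title": "Local charity drive",           "likes": 120, "fake": False},
    {"title": "SHOCKING claim about candidate", "likes": 90,  "fake": True},
]

# Likes bought from an underground promotion service (hypothetical figure).
posts[1]["likes"] += 500

# The fake story now outranks organic content in the engagement-sorted feed.
feed = sorted(posts, key=lambda p: p["likes"], reverse=True)
print(feed[0]["title"])
```

Under any ranking that weights raw engagement, bought likes translate directly into visibility, which is exactly the service the underground markets sell.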
A completely different method is the tampered data leak. “Let us suppose,” postulates Trend, “a leak occurs and 99% of the documents are legitimate, but 1% was tampered to help the leaker’s agenda. The victim organization will have a hard time proving that any tampering did occur, let alone which documents were modified. The very fact that a leak occurred also undermines the target’s security and credibility.” It is believed that just such an event occurred with the Fancy Bear hack and data leak of the World Anti-Doping Agency (WADA) in 2016.
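One reason the victim "will have a hard time proving that any tampering did occur" is that most organizations keep no out-of-band record of document integrity. A sketch of a possible defense (my own example, not from the report; the key, document names, and contents are hypothetical): recording an HMAC of each document at creation time lets the organization later show which leaked copies were altered.

```python
import hashlib
import hmac

# Hypothetical signing key; in practice this would live in a KMS or HSM.
SECRET_KEY = b"org-signing-key"

def fingerprint(doc: bytes) -> str:
    """Keyed hash of a document; an attacker without the key
    cannot forge a matching fingerprint for an edited file."""
    return hmac.new(SECRET_KEY, doc, hashlib.sha256).hexdigest()

# At creation time, fingerprints are stored separately from the documents.
originals = {"memo-001": b"Quarterly results were flat."}
ledger = {name: fingerprint(body) for name, body in originals.items()}

# After a leak: the attacker's edited copy no longer matches its record.
leaked = {"memo-001": b"Quarterly results were falsified."}
tampered = [name for name, body in leaked.items()
            if not hmac.compare_digest(fingerprint(body), ledger.get(name, ""))]
print(tampered)  # the edited memo is flagged
```

This only works if the fingerprint ledger predates the leak and is itself trusted, which is precisely what makes after-the-fact rebuttals so weak.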
Trend Micro sees three current major motivations behind fake news: political, financial gain, and character assassination. Political propaganda is designed to get people to change their mind about their political beliefs or some other opinion. Fake news has a similar intent, but will use falsehoods to manipulate public opinion faster and across a wider audience. Rather than delivering arguments to persuade people to turn towards a new belief, it can use false events to turn people away from an existing belief.
The most obvious financial motivation is advertising. Social media manipulation can be — and already is — used to drive traffic to a particular site, in what is already called clickbait. However, Trend also sees a danger of fake news being used to manipulate share prices. There are already many examples of tweets affecting shares — most notably, perhaps, in 2013, when the Syrian Electronic Army hacked the Associated Press Twitter account and claimed that Obama had been injured by a bomb at the White House. Stock markets plunged instantly.
“It’s no big stretch of the imagination,” claims Trend, “to think that fake news could be used to influence stock prices. This is particularly true for stocks with low prices and those that are infrequently traded, which makes their price easier to manipulate. For more established companies, a campaign could lower the image and reputation of a target company, affecting their earnings and stock price.”
Character assassination by fake news could have many targets. The most obvious one is the politician. But private individuals are also at risk. For example, Mexican journalists are routinely harassed by Twitter bots under the control of drug cartels. In one recent example, a ‘death threat’ was delivered by promoting the fake news that a particular journalist had died in the recent Manchester terror attack.
Trend Micro’s research describes and prices several potential fake news campaigns. These include: creating a celebrity with 300,000 followers in just a month, $2,600; helping instigate a street protest, $200,000; discrediting a journalist, $55,000; and manipulating a decisive course of action, $400,000.
Trend sees three approaches to countering the growing problem of fake news: legal measures, action from the social networking services, and increased reader awareness. In Germany, a new bill seeks to curb the spread of fake news and threatens to fine social networks up to EUR 50 million if they fail to comply with its rules.
“Google,” comments Trend, “rolled out a feature where fact check can be tagged on the blurbs or snippets of news articles posted on its News search page. It is one of Google’s many strategies for ridding its services of fake content — including rewriting the algorithm of its search engine.” Meanwhile, Facebook has suspended 30,000 fake accounts in France; and Twitter regularly shuts down abusive and bot accounts.
The ultimate arbiter, however, has to be the reader. “In a post-truth era where news is easy to manufacture but challenging to verify,” says Trend, “it’s essentially up to the users to better discern the veracity of the stories they read and prevent fake news from further proliferating.”