TROLLS, BOTS AND FAKE OR PUPPET ACCOUNTS

Who is able to produce:

Professional – Amateur – Anyone.

The level of skill needed to run troll, bot, fake or puppet accounts varies. Anyone can create and use a simple fake account, apply trolling techniques or buy bots for click and like farming. There are online tools that can generate all kinds of counterfeit personal information needed to create fake accounts – from phoney names to temporary email addresses to national ID number generation and validation. At least minimal programming skill is needed to create social media bots. The most harmful troll, bot or fake-account operations are usually run by people with professional skills: some bots employ advanced AI techniques to look more realistic, and some trolls use compelling storytelling and manipulation techniques to provoke the reaction they want. The creation of fake social media profiles (or the buying of ‘likes’) is now an industry worth over €700 million.
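
As an illustration of how low the barrier is, the sketch below uses the open-source Python library Faker (a general-purpose test-data generator, not one of the specific tools referred to above) to produce plausible-looking but entirely invented profile details; the choice of fields is our own assumption for this example.

    # Illustrative sketch: generating plausible-looking but entirely invented
    # profile details with the open-source Python "Faker" library.
    # The selection of fields is an assumption made for this example only.
    from faker import Faker

    fake = Faker()

    profile = {
        "name": fake.name(),           # phoney full name
        "username": fake.user_name(),  # handle for the fake account
        "email": fake.free_email(),    # throwaway-style email address
        "birth_date": fake.date_of_birth(minimum_age=18, maximum_age=65),
        "city": fake.city(),
        "bio": fake.sentence(nb_words=8),
    }

    print(profile)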

Level of deception:

Low – Average – High – Very high.

The level of deception varies widely – while some trolls, bots or fake accounts can be identified easily, others look like accounts of real people and require more serious investigation to identify. A study from the University of Reading School of Systems Engineering found that 30% of participants could be deceived into believing that a social media bot account was run by a real person. Trolls usually mislead other social media users by posting seemingly harmless content and creating realistic profiles and stories.

A troll is a person who deliberately tries to upset or start an argument, especially by posting offensive or unkind things on the internet (source: https://www.collinsdictionary.com/dictionary/english/troll ).

A bot is a software application that runs automated tasks over the Internet (in this case, following social media accounts and interacting through likes, comments, shares or other platform functions). Bots behave in a partially or fully autonomous fashion and are often designed to mimic human users.

A puppet account is an account someone sets up to act in ways they cannot act publicly, or to support their main account (upvoting their own material, posting positive comments and praise, or advertising their own work).

Working principle (what it does and how it does it):

Trolls

Some tactics trolls use (source: https://medium.com/better-humans/the-complete-guide-to-understanding-and-dealing-with-online-trolls-4a606ae25c2c):

Refusing to back down on known fallacies: when one troll tells a lie (either directly or through the use of hyperbole, omission, or twisting facts), many will repeat it – even if it can be easily disproved.

Troll telephone: A troll in one forum says something flippant, another troll takes it as truth and repeats it in another forum, and it becomes a lie that gets told repeatedly.

Sea-lioning: Repeated and relentless questioning, often after the question has already been answered in detail multiple times. The sea lion will insist they are acting perfectly civilly, but they are just trying to delay you as long as possible and derail the conversation. The name comes from a webcomic frame: http://www.muddycolors.com/wp-content/uploads/2017/12/81acd-a5b.jpg.

Flaming: Bringing up incendiary and controversial topics to overwhelm a post or moderator, who must deal with finding and policing every post.

Grammar police: Not caring about the content of your post or comment but insisting your spelling and grammar must be perfect, or you can’t possibly make a valid argument.

Boomerang: Someone who keeps returning to comment on a thread. Even if you block them on social media, they will make new accounts and keep commenting, following you until you are convinced they are right.

Flooding: Posting the same thing on your page over and over to destroy the ability to have a conversation with anyone else. Usually it is something like “lol”, something NSFW, or just childish taunting.

Hatemonger: The person who goes straight for the incendiary words and name-calling – or straight for death and rape threats – even when the thread or comments did not warrant that level of response. They drive all your sane commenters into a raging frenzy, and the conversation immediately turns into a melee.

Bots

For social bots to be deployed on a specific social media channel, the platform has to be accessible through an Application Programming Interface (API), as offered, for example, by Twitter and Facebook. By using APIs, a large number of bot accounts can be controlled simultaneously with little effort. With simple keyword searches, the bots scan Twitter timelines and Facebook posts for specific terms or hashtags. As soon as they find what they are looking for, they comment, share links, start a fictitious discussion or comment directly on specific topics. In combination with other bots (forming a botnet), their noise becomes even louder and can mislead other users.
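
The sketch below illustrates this working principle in Python. PlatformAPI is a stubbed, hypothetical stand-in for a real platform client, and the hashtag and canned replies are invented for the example; real platforms expose comparable search, like, reply and share endpoints through their official APIs.

    # Minimal sketch of the working principle described above.
    # "PlatformAPI" is a hypothetical stand-in for a real API client;
    # the hashtag and canned replies are invented for illustration.
    import random
    import time

    class PlatformAPI:
        """Stubbed placeholder for a real social media API client."""
        def search(self, hashtag):       # keyword/hashtag search endpoint
            return []                    # stubbed: no real network calls here
        def like(self, post_id): pass
        def reply(self, post_id, text): pass
        def share(self, post_id): pass

    CANNED_REPLIES = [
        "Exactly what I have been saying!",
        "Everyone should read this.",
        "More people need to know about this.",
    ]

    def run_bot(api, hashtag):
        """Scan for a hashtag and react to every matching post automatically."""
        for post in api.search(hashtag):
            api.like(post["id"])                               # amplify the post
            api.reply(post["id"], random.choice(CANNED_REPLIES))
            api.share(post["id"])                              # spread it further

    # One script can drive many accounts at once; together they form a botnet.
    accounts = [PlatformAPI() for _ in range(100)]
    for _ in range(3):             # in reality this loop would run indefinitely
        for api in accounts:
            run_bot(api, "#example")
        time.sleep(60)             # rescan the platform every minute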

Malicious social media bots can be used for a number of purposes (source: https://www.cloudflare.com/learning/bots/what-is-a-social-media-bot/):

Artificially amplifying the popularity of a person or movement: A person or organisation with millions of social media followers can be seen as important or influential. A primary use case of social media bots is to boost the apparent popularity of other accounts.

Influencing elections: A study published in First Monday, a peer-reviewed journal, found that in the days before the 2016 U.S. presidential election, as much as 20% of the political discussion on social media was generated by about 400,000 social media bots.

Manipulating financial markets: Social media bots can also be used to influence financial markets. For example, bot accounts can flood social media with manufactured good or bad news about a corporation, in an attempt to manipulate the direction of stock prices.

Amplifying phishing attacks: Phishing attacks rely on an attacker gaining their victim’s confidence. Fake social media followers and social engagement can help convince a victim that the scammer can be trusted.

Spreading spam: Social media bots are often used for illicit advertising purposes by spamming the social web with links to commercial websites.

Shutting down free speech: During the 2010–2012 Arab Spring movement, government agencies used Twitter bots to overwhelm social media feeds. These bots were used to deliberately push down the messages of protestors and activists.

More about trolls: https://www.lifewire.com/types-of-internet-trolls-3485894.

More about bots: https://niccs.us-cert.gov/sites/default/files/documents/pdf/ncsam_socialmediabotsoverview_508.pdf?trackDocs=ncsam_socialmediabotsoverview_508.pdf.

Example:

Checking method:

Recognising and Dealing with different types of trolls: https://www.teamtechnology.co.uk/troll-tactics.html.

While some of the most advanced social media bots can be hard to spot even for experts, there are a few strategies to identify some of the less sophisticated bot accounts. These include:

  • Running a reverse image search on their profile picture to see if they are using a photo of someone else taken off the web.
  • Looking at the timing of their posts. If they are posting at times of day that don’t match up with their time zone, or are making posts every few minutes every single day, these are indications that the account is automated (see the code sketch after this list).
  • Using a bot detection service that applies machine learning to detect bot behaviour; Cloudflare Bot Management, for example, uses machine learning to identify bots.
  • Checking the account with a public tool such as Botometer: https://botometer.iuni.iu.edu.
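
A rough sketch of the post-timing heuristic from the list above, written in Python. It assumes you have already collected the account’s post timestamps (how you obtain them depends on the platform’s API), and the thresholds are illustrative assumptions rather than established cut-offs.

    # Rough sketch of the post-timing heuristic mentioned in the list above.
    # Input: a list of datetime objects, one per post by the account.
    # The thresholds are illustrative assumptions, not established cut-offs.
    from datetime import datetime, timedelta
    from statistics import mean, pstdev

    def looks_automated(post_times, min_gap_seconds=180, min_posts=20):
        """Flag accounts that post suspiciously often or with machine-like regularity."""
        if len(post_times) < min_posts:
            return False                              # too little data to judge
        times = sorted(post_times)
        gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
        avg_gap = mean(gaps)
        if avg_gap == 0:
            return True                               # many posts at the same instant
        regularity = pstdev(gaps) / avg_gap           # low value = clockwork rhythm
        posts_per_day = 86400 / avg_gap
        return avg_gap < min_gap_seconds or posts_per_day > 100 or regularity < 0.1

    # Example: an account posting every two minutes, around the clock.
    sample = [datetime(2024, 1, 1) + timedelta(minutes=2 * i) for i in range(50)]
    print(looks_automated(sample))                    # True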

Recognising a fake social media account: https://smallbusiness.chron.com/spot-social-media-fake-46150.html.

Other disinformation types:

Hoax

A hoax is a falsehood deliberately fabricated to masquerade as the truth. What hoaxes have in common is that they are all meant to deceive or lie. For something…

Deepfake

Deepfake is an AI-based technology used to produce or alter video content by editing faces (face-swapping or creating new facial expressions).

Conspiracy Theory

A conspiracy theory is an explanation of an event or situation that invokes a conspiracy by sinister and powerful actors, often political in motivation, when other explanations are more probable…

Contact us

Email: info@checkorcheat.eu
Phone: +370 525 97 247
This project is co-funded by the European Commission
under the preparatory action “Media Literacy for All 2018”.