Fake likes can be purchased and social networking websites are doing little to prevent this manipulation.

A couple of months ago, researchers paid a company a few dollars to draw attention to a social media post. Within 30 minutes, the post had 100 more likes. The researchers had similar results with a Christmas tweet from Vera Jourova, the European Union’s justice commissioner, and with a holiday post on the Instagram account of Margrethe Vestager, the bloc’s competition commissioner.

Companies like Facebook and Twitter are doing a poor job of policing automated bots and other methods of manipulating social media platforms, according to a report published on Friday by researchers at the NATO Strategic Communications Center of Excellence. For a small amount of money, the researchers found, anyone can hire a company to buy likes, clicks and comments.

The group, an independent organization that advises the North Atlantic Treaty Organization, tested the tech companies’ ability to prevent paid influence campaigns by turning to 11 Russian and five European companies that sell fake social media engagement. For $300, the investigators purchased more than 25,000 likes, 3,500 comments, 20,000 views and 5,000 followers, including on posts from prominent politicians like Vestager and Jourova.

Roughly 80 percent of the fake clicks stayed, the investigators said. And virtually all the accounts that had been used to generate the clicks remained active three weeks after the researchers reported them to the companies.

The report highlights the continuing challenges for Facebook, YouTube and Twitter as they try to combat disinformation and other types of online manipulation. After Russia interfered in the United States’ 2016 presidential election, the companies made adjustments to reduce the spread of online disinformation and interference. In recent months, the platforms have announced takedowns of accounts in China, Saudi Arabia and, most recently, Africa, where Russia was testing new strategies.

But the report also brings renewed attention to an often-overlooked vulnerability for the platforms: companies that sell fake comments, likes and clicks on social media networks. Many of the businesses are based in Russia, according to the researchers. Because the networks’ software ranks posts in part by how much engagement they receive, the purchased activity can push posts into more prominent positions.

“We spend so much time considering how to regulate the social media companies – but not so much about how to regulate the social media manipulation industry,” said Sebastian Bay, one of the researchers who worked on the report. “We need to consider whether this is something that should be allowed but, perhaps more important, to be aware that it is so widely available.”

From May to August, the researchers studied the social networks’ ability to counter the for-hire manipulation industry. They said they had identified hundreds of providers of social media manipulation services.

“The openness of this industry is striking,” the report says. “In fact, manipulation providers advertise openly on major platforms.” The researchers purchased engagement on about a hundred posts.

After making their purchases, the researchers identified nearly 20,000 accounts that were being used to manipulate the social media platforms and reported a sample of them to the internet companies. More than 95 percent of the accounts were still active online three months later.

The researchers directed most of the clicks to posts on social media accounts they had created for the experiment. But they also tested some verified accounts, like Vestager’s, to see whether they were better protected. They were not, the researchers said.

The researchers said that, to limit their influence on real conversations, they had purchased engagement on posts from politicians that were at least six months old and contained apolitical messages.

The researchers found that the big tech companies were not equally bad at removing manipulation. Twitter identified and removed more than the others; on average, about half of the likes and retweets bought on Twitter were removed, they said. Facebook, the world’s biggest social network, was best at blocking the creation of accounts under false pretenses, but it rarely took down the content itself.

Instagram was the easiest and cheapest platform to manipulate. YouTube was the worst at removing inauthentic accounts and also the most expensive to manipulate, the researchers found. The investigators reported 100 accounts used for manipulation in their test to each of the social media companies, and YouTube was the only one that did not suspend any; it provided no explanation.
