Most fake accounts on Facebook set up “not with political intent,” but “with commercial intent”: Facebook’s EMEA vice president for public policy

Noting that most fake accounts are created "not with political intent" but "with commercial intent", typically set up in order to sell the accounts themselves and even their followers, Facebook's vice president of public policy for Europe, the Middle East and Africa, Richard Allan, explained that over "a three-month period" the platform removed "hundreds of millions of such accounts", with most taken down "within minutes of account creation".

Speaking in the House of Commons in London on Tuesday (27 Nov), Sun Xueling, Member of Parliament for Pasir Ris-Punggol GRC and a member of Singapore's Select Committee on Deliberate Online Falsehoods, asked Mr Allan how Facebook is "policing the setting up of fake accounts, the shutting down of these fake accounts and their networks". Mr Allan responded that "shutting down fake accounts is an ongoing battle that we have".

"The best way I've found to describe this is that it's a bit like a 'robot war' – that there are people who've created programmes on computers that create fake accounts, not just on Facebook, but other networks … millions of them – they're blasting away – and we have artificial intelligence systems that try and understand what a fake account looks like, and shut them down as soon as they come up," said Mr Allan.

However, he lamented that the real challenge lies in busting fake accounts that are set up in a convincing, realistic manner to mimic legitimate individuals.

“There are people who are very careful, who create one or two accounts, and they act as though they are normal Facebook users … the issue that we saw in the United States with this internet research agency was often accounts like that, where it wasn’t this mass creation [of spam accounts], but very carefully curated [ones] so they look real, even though they weren’t real,” highlighted Mr Allan.

When asked by Ms Sun whether Facebook's removal of Russian-linked fake accounts as recently as August to October this year, following Russian state interference, is "reactive" and "perhaps insufficient", Mr Allan stated that research has shown that "low-quality information has decreased by 50 per cent across the Facebook platform due to a number of measures that we have taken".

"Those are independent studies by academics, and by Les Décodeurs, which is the fact-checking arm of 'Le Monde' from France," he added.

Mr Allan added that while “the battle is not over,” he believes that Facebook is “starting to make inroads.”

He added that determining what constitutes a fake account is one of “the hardest things to do”.

“They can use technology, they can use a VPN [Virtual Private Network] to appear as though they are coming from a different country, they would get hold of photos that will look very legitimate … That stuff is hard, and that’s frankly where we need a lot of cooperation often with law enforcement agencies so we can understand what’s going on and try to deal with those people,” stressed Mr Allan.

When asked if it is possible for future elections to be “interfered with” by unprecedented methods via Facebook, Mr Allan said: “I think it is the case that we will continue to discover groups of people who are doing things that they shouldn’t be doing at election time. Our job is to minimise that as far as we can, but I think it’s unrealistic … As long as we have an internet, there will be … any kinds of attempt to interfere. Technology does give people very strong tools.”

When probed by Ms Sun on how Facebook prioritises credible content and de-prioritises falsehoods when its algorithms continue to push content that readers want to see, "potentially creating online echo chambers and amplifying falsehoods", Mr Allan noted that one of the major changes rolled out by Facebook is "something called 'Meaningful Social Interactions'", which prioritises content from users' family and friends.

"That kind of content tends to be less controversial than some of the content that's coming in from some of these other sources," he reasoned.

Ms Sun interjected, pointing out that such algorithms could “potentially create online echo chambers” because “people are sharing information that they want to see amongst groups of people that they are close to, self-selected individuals”.

Countering her suggestion, Mr Allan responded: “There is some good research that shows if you have a reasonably broad family and friend group, you would actually get more diverse content from your group of family and friends – that is from my experience on Facebook – than you would if you’re simply going to the same restaurants and bars where you normally go and only do [so] with one group of people.”

When asked if he agreed that more would be achieved if Facebook worked with the relevant authorities to take down false information online and shut down fake accounts, Mr Allan agreed, noting that Singapore's parliamentarians are currently "looking at a piece of legislation" relating to such joint efforts.

“We do think it’s important that there is a judicial process in place … I know France has just passed a law … Similarly if someone claims that a politician is corrupt, and that’s false, [if] we don’t take it down … That’s a problem. If it’s true, and we do take it down, that’s equally a problem, because it stops somebody from shedding light on the genuine harm.

“The best person to make a decision about whether that claim is true or false is not Facebook or a Facebook employee, it is the relevant judicial authority in any country,” said Mr Allan.

When further prompted by Ms Sun on whether Facebook is open to "adopting a regulatory approach", which may include "voluntary reporting and independent audits", Mr Allan agreed, adding that Facebook is "keen to see a properly regulated structure".

However, he qualified his statement, saying that this does not mean the social media giant will necessarily agree to everything suggested by governments.

“What we’re trying to do in the process of working with the French government,” he cited as an example, “is to understand how we can work together with regulators.”

Parliamentarians from other countries including Argentina, Belgium, Brazil, Canada, France, Ireland, and Latvia, as well as members of the UK’s Digital, Culture, Media and Sport Committee were also present at the inquiry on disinformation and fake news on Tuesday.
