by Andrew MacGregor Marshall
“Fake news is not our friend,” declared Facebook this year in an aggressive new public relations campaign to counter accusations that the platform enables dangerous disinformation and hate speech.
Earlier this week, Facebook took some of its boldest steps yet to counter misuse of the platform, removing 18 Facebook accounts, one Instagram account and 52 Facebook pages in Myanmar that were together followed by almost 12 million people. Among those banned was Senior General Min Aung Hlaing, commander-in-chief of the Myanmar armed forces.
Many of the pages were banned for “engaging in coordinated inauthentic behavior on Facebook”.
Yet Facebook continues to pander to dictators by also blocking genuine news about their activities and atrocities. While it continues to do so, Facebook cannot genuinely claim to be promoting truth and cracking down on disinformation.
In July 2016, German newspaper Bild published photographs of Thai crown prince Maha Vajiralongkorn — who has since become king — at Munich Airport dressed in a crop top and low-slung jeans and plastered with fake tattoos. After I shared the photographs on Facebook, more than 20 Thai police raided the Bangkok home of Noppowan Bunluesilp, the mother of my son Charlie, and detained her for hours of questioning.
In April 2017, I shared a video of Vajiralongkorn in a Munich mall, again in a crop top and covered with fake tattoos.
The video caused outrage among the Thai junta and palace, which began threatening Facebook with punitive fines, or even blocking the platform altogether, unless it removed content deemed insulting to the monarchy.
In response, Facebook agreed to geoblock the video, making it inaccessible to anyone resident in Thailand. Facebook said that the Thai authorities had produced a court order claiming the video breached the draconian lèse-majesté law — which prohibits any content deemed insulting to the monarchy — and so it was obliged to geoblock the post.
“When governments believe something on the internet violates their laws, they may contact companies like Facebook and ask us to restrict access to that content,” a Facebook spokesperson said.
News of the junta’s threats to Facebook, and the geoblocking of the post, attracted global media attention — an example of the “Streisand effect” in which clumsy efforts to suppress information actually make the information much more widely seen.
Over the past month, the Thai authorities have escalated their efforts to geoblock content deemed embarrassing to King Vajiralongkorn. Twelve of my Facebook posts have been geoblocked this month alone, and posts by several exiled Thai dissidents have also been affected.
Some of the geoblocked posts contained critical commentary about the Thai monarchy, in breach of the lèse-majesté law, so it is perhaps understandable that Facebook felt compelled to restrict access.
But two of the geoblocked posts were photographs showing Vajiralongkorn in Munich with his fake tattoos and crop top.
Although the Thai authorities have claimed that photographs and video of Vajiralongkorn in Munich were “fake” or “doctored”, they have never provided any evidence of this, and no independent analysis has ever established that the images were manipulated. There is a good reason for this: the images are genuine. Facebook has the resources to verify this itself, should it wish to do so.
Geoblocking critical commentary about dictators and despots is one thing, but geoblocking genuine images is a significant step further. If Facebook is serious about combating fake news, why is it restricting access to genuine photographs and video? Is this really the right way to counter disinformation?
The implications are extremely troubling. Facebook’s policy means that any dictator in a country with a compliant judiciary can get unflattering images or photographic evidence of atrocities blocked by producing a court order — even if the images are completely genuine.
For example, the Myanmar authorities would be able to insist that photographic evidence of atrocities and genocide be hidden from Facebook users in Myanmar, preventing them from seeing the truth.
How does that help combat fake news?
This was first published at Andrew MacGregor Marshall’s Facebook page and reproduced with permission.