Source: CNA.

Tech giants grilled by Select Committee on their efforts to combat deliberate online falsehoods

The Parliamentary Select Committee on deliberate online falsehoods scrutinised representatives of technology companies Twitter, Google and Facebook, as well as the Asia Internet Coalition, for more than six hours on Thursday (22 March) over their effectiveness in acting on false or harmful content and the dangers such falsehoods present.

Twitter’s Director of Public Policy and Philanthropy in Asia Pacific, Ms Kathleen Reen; Google’s News Lab Lead, Ms Irene Liu; Facebook’s Asia-Pacific vice-president of public policy, Mr Simon Milner; and the Asia Internet Coalition’s Managing Director, Mr Jeff Paine, had earlier made written submissions on the matter.

During the exchange, the representatives had to answer questions about the algorithms and technology they use to identify falsehoods and content that violates their respective policies on hate speech.

Citing examples, the committee listed posts that could be viewed as having crossed the line in certain jurisdictions, and detailed the failure or refusal of the tech giants to remove them.

Responding to Home Affairs and Law Minister K Shanmugam, Mr Milner said that Facebook does not have a policy that requires all content posted to be 'true, verified and accurate', so as not to be placed in a position where it has to be the arbiter of the truth. He added that it will remove content that is “locally illegal” if it receives a court order.

The Minister further pressed the tech giant on its policies on taking down false information, asking if Facebook would remove information that was identified to be false.

Mr Milner said there are no clear-cut answers, noting that there are instances where rumours become the truth.

"We do not have a policy that says everything has to be true and we do not put ourselves in the position of deciding what is true," he said. However, he noted that Facebook will take down fake accounts and if there is a court order directing them to do so, said Mr Milner.

He said that the company does not have a policy that says everything has to be true, and we do not put ourselves in a position of deciding what is true, adding that if something is shown to be false through a court process and Facebook gets a court order, it will respect that order, saying, "If we get a court order telling us that something is locally illegal, we will take it down. That’s as far as I can go."

The minister pointed out that the courts can only act based on legislation, asking, "Do you not realise then that the natural and logical conclusion, based on your policy that you will not yourself take down falsehoods unless there is a court order, combined with the fact that court orders can only be given pursuant to legislation, means that if a state wants falsehoods to be taken down, that can only be done through legislation, vis-a-vis Facebook."

Responding to this, Mr Milner explained that it would be tremendously difficult to define in law what is or is not a deliberate online falsehood, which was why the tech firms were skeptical about legislation, even as Facebook is taking many steps to address the issue.

He said, "I know it is a real concern and we absolutely share your concern, we know you have certain responsibilities as do we. We are here to talk about all those and we are just saying we are concerned about a rush to legislate and that legislation which is enacted in haste can often be regretted at length."

Social media companies Twitter and Google, as well as industry association Asia Internet Coalition (AIC), took a similar position on the issue of being the arbiter of the truth. Representatives from the three organisations also made oral representations to the committee.

Google stated that it is not positioned to evaluate disputes over facts or characterisations laid out in news articles, pointing out that for both Google Search and Google News, claims that a particular article’s content is inaccurate will generally not result in its removal unless pursuant to a valid legal request.

During the hearing, Social and Family Development Minister Desmond Lee asked Ms Liu to confirm that Google does not have a policy to ascertain truth, or a policy to remove content that is clearly shown to be untrue.

Responding to this, Ms Liu said, "For searches, we do not host the content, we just point to it. Our mission is to point to high quality sources of information. If a particular news article - that in someone’s view - requires a correction, we wouldn’t be able to correct it, because it is another company’s content."

Mr Lee then asked Google to clarify that it is not in a position to discern if content is truthful or a deliberate online falsehood, and that Google would let a legal party “make that determination” for them, and comply with legal authority.

Ms Liu responded, "We stand by our submission."

AIC and the tech firms stated that a stringent self-regulatory approach, executed in coordination and cooperation with the authorities, would be more appropriate than legislation.

Mr Lee then asked if Twitter has a policy to discern what is true and what is a deliberate online falsehood, and whether it was correct that Twitter does not take down information that is known and proven to be false unless there is a legal requirement to do so.

Ms Reen responded by referring to the roll-out of a hateful conduct policy across the platform in November 2016, adding that Twitter subsequently updated the policy last December and created new rules related to violent extremist groups.

She said, "We have detailed much more commitment to making sure that incitement, encouragement, harassment and the inference of that kind of racism is not allowed on our platform. We have zero tolerance for it."

Ms Reen said that there are therefore many instances where Twitter’s policies can be applied to circumstances involving deliberate online falsehoods in Singapore’s context.

As such, the company sees the question of a court order as a forced binary that goes against its commitment to, and constant updates of, its policies.

Twitter was then asked about its responsiveness to pressing situations and how it measures up to its commitments.

Ms Reen stressed that the company’s actions are congruent with its commitments. Referring to Mr Lee’s examples, which included a fake Twitter account masquerading as the Tennessee Republican Party during the 2016 United States elections, she said that the social media platform has since made more than 30 changes to its products and policies. It is also undertaking special measures in anticipation of the upcoming mid-term elections in the US.

"We could progressively measure and say that we not only are congruent, but that moment has passed in 2016," she added.

Mr Lee also pressed Twitter about its efforts at detecting malicious automation, noting that such initiatives, which rely on machine learning, run the risk of being over- or under-inclusive in tackling malicious bots and could, for instance, shut down legitimate handles or allow malicious bots to continue to operate.

Responding to this, Ms Reen said it takes time for these measures to be effective and that Twitter’s technologies are getting smarter every day at detecting malicious bots, adding, "At this point, we are not as worried about the binary you just presented, we are much more interested in the technical detail of how we can continue to grow our capabilities."

Mr Lee then asked Mr Paine about the adequacy of Singapore’s existing legislative framework for countering deliberate online falsehoods.

Mr Paine noted that deliberate online falsehoods are “fairly new things” and that legislation addressing them should not be rushed.

He said that many of the existing laws in Singapore are aligned with the community guidelines and policies of its members, adding, "So we didn’t really see that as a major issue with respect to any kind of legislation on DOFs."

Mr Shanmugam then asked whether Mr Paine accepted that his statement was inaccurate and that there are gaps that are not covered by current legislation.

Mr Paine responded by saying that there could be gaps.

Google was also questioned about relying on algorithms to do the heavy lifting of identifying what is low- and high-quality content.

Responding to this, Ms Liu said that the tech giant is always trying to hone its algorithms, while at the same time working closely with news organisations to identify authoritative sources.

When asked how YouTube, which is owned by Google, could be spending less than 0.1 per cent of its annual revenue on discerning online content given the enormity of the challenge of tackling online falsehoods, Ms Liu said that such figures do not reflect the overall effect of its work.

"To look at just one dollar figure and one budget line, I don’t think it really reflects what we are doing in terms of the overall effect and the leveraging we learn on the technologies from different products," she noted.

No policy on banning foreign currency payments for political advertisements

Responding to questions from Mr Shanmugam, Facebook’s Mr Milner said that the company does not currently have a policy on banning foreign currency payments for political advertisements.

However, he stated that it will consider doing so.

Facebook had admitted in September 2017, at a hearing by a Senate Intelligence Committee investigating Russia’s interference in the election, that Russian-linked ad buyers had spent US$150,000 on thousands of US political ads during the 2016 Presidential campaign.

Mr Milner explained that the reason Facebook does not have such a ban in place is due to the difficulty of defining what a political advertisement is, saying, "Most of the ads, in the case of the US, may not have been classified as political ads under the jurisdiction there because they did not endorse a candidate. It’s not because we don’t want to try to address the issue, it’s just that it is actually not simple."

However, he said that Facebook can take steps to ensure that only people based in the country concerned can advertise that kind of content.

Objection raised on line of questioning

At one point, Mr Milner raised an objection to the Chairman of the Select Committee, Mr Charles Chong, asking whether Mr Shanmugam’s line of questioning was appropriate or a waste of the committee’s time.

Mr Chong said, "I think you should leave it to us to decide what is relevant, what is not relevant. If you are unable to answer questions because you don't know or if you do not wish to answer the question, please state so."

Mr Shanmugam further replied to Mr Milner, "If you feel that you are unable to support them, of course, you can say so. But I think you will leave the relevance of the questions to me and for me to be directed by the chair. Can we move on? I don't need an answer from you."