CheckMate faces scrutiny over government ties, GE2025 focus, and uncritical ST coverage
Mr Tan Bing Wen’s CheckMate fact-checking initiative is under scrutiny for its government ties, perceived narrow focus on GE2025, and lack of transparency in grading misinformation. Critics argue its speed-over-depth approach and unexplained classifications may undermine credibility and user confidence.
The Straits Times recently spotlighted CheckMate, a fact-checking initiative employing AI and volunteers, for its efforts to combat misinformation ahead of Singapore’s General Election 2025 (GE2025).
However, questions have emerged about its founder’s professional background, its funding, and the initiative’s apparent focus on election-related disinformation, prompting calls for greater transparency.
Origins and government links
Founded in 2023, CheckMate claims to have processed over 3,400 user queries via WhatsApp and aims to deploy advanced AI alongside 100 volunteers to address a surge in misinformation leading up to GE2025. While its mission has been widely praised, scrutiny over the initiative’s origins and leadership persists.
According to his LinkedIn profile, founder Tan Bing Wen, 35, has been with the Central Provident Fund (CPF) Board since graduating from the National University of Singapore, becoming Senior Deputy Director in April 2023. The CheckMate domain was registered under his name in July 2022. Critics have raised concerns about potential conflicts of interest stemming from his dual role as a government employee and the leader of an ostensibly independent initiative.
Further scrutiny arises from the profiles of other prominent CheckMate contributors.
Wu You, the AI lead, works with the Inland Revenue Authority of Singapore, having interned with various government agencies.
Amanda Goh, head of fact-checking, has been with the Ministry of Finance since January 2024 and previously served with the CPF Board from 2021. Audrey Tim, in charge of growth and marketing, has been with the Ministry of Defence for six years.
Similarly, other volunteers have current or prior ties to government bodies, including GIC, GovTech, Enterprise Singapore, and even the Prime Minister’s Office.
Adding to this, CheckMate has received the support of major government-related organisations such as the National Library Board and the Centre for Advanced Technologies in Online Safety, which was launched in May with $50 million in funding to tackle online harms. These partnerships have facilitated roadshows and networking opportunities to promote CheckMate’s services and boost volunteer recruitment.
The prevalence of government-linked volunteers and institutional support has led to speculation about CheckMate’s independence and the extent of public sector involvement.
The branded uniforms worn by volunteers and the professional organisation of roadshows have further fuelled doubts about whether the initiative is truly a ground-up effort.
Criticism over narrow focus on elections
CheckMate’s prioritisation of election-related misinformation has also been questioned. While its work debunking false claims during the 2023 presidential election—such as fabricated vote shares and misleading symbolism—was commendable, some argue that broader issues, such as financial scams and public health misinformation, deserve greater attention.
Former Straits Times columnist Gan Swee Leong, a vocal critic, asked why General Election-related disinformation has been prioritised in the ST report.
In a social media post, he noted that misinformation on scams, which directly harm vulnerable individuals, might warrant more immediate focus.
Mr Gan also raised the possibility of government involvement in CheckMate, suggesting that greater transparency about any such collaboration could bolster the initiative’s credibility.
Concerns about sustainability and transparency
CheckMate’s sustainability has also come under scrutiny. While Mr Tan has stated that he funds the initiative’s computing costs personally, its reliance on unpaid volunteers and partnerships with publicly funded organisations raises concerns about its long-term viability and independence.
On LinkedIn, Mr Tan described CheckMate as a “hobby project” that grew into a “tech-for-good” initiative. However, he acknowledged challenges in securing financial backing without evidence of a working business model, stating that “commercial sustainability is something we’re actively thinking about.”
The initiative’s recruitment and training practices have also drawn attention. Volunteers must pass an entry test and are tasked with verifying AI-generated results. Once flagged content is submitted, CheckMate’s volunteers compare it against trusted news outlets, official sources, or online forums to evaluate its credibility. Despite this process, questions remain about the consistency and neutrality of the checks, particularly in politically sensitive contexts.
Mr Tan explained that the service prioritises speed over detail to quickly stem the spread of misinformation, describing it as a “trade-off.” “Giving an indication, rather than full justification, is more effective to stop them [users] from being tricked quickly,” he said. While this rapid approach allows for high-volume responses, it sacrifices the depth required for more nuanced misinformation.
CheckMate’s use of AI aims to enhance efficiency further. According to Mr Tan, the advanced AI being developed will leverage large language models to assess the reliability of flagged content, evaluate tone, and detect signs of scams or unreliable information.
However, the absence of detailed explanations for why flagged content is categorised as “misleading,” “scams,” or “untrue” has raised questions about the transparency of its methodology.
One can’t help but notice that CheckMate’s website provides no clear justification for its classifications, leaving users in the dark about how these determinations are made. This lack of clarity has sparked concerns about the credibility of the initiative and its ability to provide reliable assessments, particularly in politically sensitive scenarios.
While the streamlined method ensures rapid feedback, it risks undermining user confidence in instances requiring detailed context and justification.
Such ambiguity in grading flagged content underscores a critical gap in CheckMate’s fact-checking process, potentially affecting its perceived reliability and its capacity to address the complexity of modern misinformation effectively. Transparency in its methods will be essential to bolster public trust and ensure its evaluations remain credible.
As CheckMate prepares for an anticipated surge in misinformation surrounding GE2025, it will undoubtedly face growing pressure to address concerns about its origins, focus, and operational methods.
While tackling fake news is a crucial endeavour, initiatives like CheckMate must demonstrate their own integrity to gain public trust. As Mr Gan aptly noted, “If an outfit wants to be a legitimate and respected fact-checker, it should allow itself to be checked first—voluntarily.”