SINGAPORE: Organised fake news and disinformation campaigns could already be influencing and undermining Singaporean society, said national security expert Shashi Jayakumar on Thursday (Mar 15) in both written and oral submissions to the Select Committee on Deliberate Online Falsehoods.
“An aggressor could attempt to ‘peel off’ one particular ethnic group or religion, using social media and disinformation to appeal ... to deeply ingrained historical, cultural issues, setting off one group against others, or even against the Government,” he wrote. “Singapore can be a sandbox for subversion.”
Speaking on the committee’s second day of public hearings, Dr Shashi, who heads the Centre of Excellence for National Security at the S. Rajaratnam School of International Studies (RSIS), added: “I don’t want to cast allegations or smears or be a fear-monger, but in my view, it would be a mistake to assume that this is not already happening in Singapore.
“These are advanced, persistent threats. You deploy them long in advance, before you actually need to use them.”
In his written representation, he explained how state actors could employ such techniques and tactics, using the example of Russia and its alleged manipulation of the 2016 US presidential election.
“Spreading rumours to discredit politicians (and to play up certain themes – such as negative portrayals of immigration policy) has been aimed at undermining public trust in democracy and systematically influencing populations to become less trusting of mainstream, established news networks,” said Dr Shashi, who also coordinates the Future Issues and Technology programme at RSIS.
“Some researchers think they have found fake Facebook groups almost entirely populated by bots. These fake groups, convincingly orchestrated but operated by bots and AI, leveraged existing ideological filter bubbles and echo chambers, eventually attracting real fans.”
“It is possible, as some researchers have posited, that many Trump fans were emboldened to declare their support for the candidate by the artificially created perception of a swell in support for him.”
He added: “There exist individual ‘consultants’ and private sector entities specializing in hacking or interfering with elections with the aim of achieving the desired election result for the client. Their methods include smears, hacking, spoofing webpages, and sending mass emails to influence outcomes.”
“More broadly, there also appears to exist a growing shadow market for methods to influence target populations – and outcomes – in nations, using methods like those offered by Cambridge Analytica, the company said by some reports to have profiled, and micro-targeted, the US electorate during the 2016 presidential election.”
REGIONAL THREAT?
During the public hearing, Law and Home Affairs Minister K Shanmugam brought the topic closer to home as he spoke to Dr Shashi, noting: “You say Russians but presumably, this expertise is available even to much smaller countries.”
The Nanyang Technological University (NTU) academic agreed. “There has been a proliferation of agencies which are sort of hired guns which will parlay this kind of experience for the highest bidder. Cambridge Analytica is one of them. I understand it now has a presence in Malaysia, but there are others,” said Dr Shashi.
In his written submission, he explained that “data-driven political consultancies (whose methods may involve disinformation) appear to have been engaged by political parties, as well as individual candidates, in the coming Malaysian general election”.
On Thursday at the hearing, Dr Shashi added: “Let’s say, touch wood, in a worst-case scenario where we have a serious mishap in relations with one of our near neighbours. It will be a mistake to assume the means employed against us will be simply kinetic.
“The means and the tools are actually there,” he commented, pointing to the Saracen fake news mill in Indonesia which he suggested was used to bring down Jakarta governor Basuki Tjahaja Purnama, or Ahok, and now for character assassinations against high-level politicians including President Joko Widodo.
“That’s so far been limited to Indonesian domestic political machinations. But the tools and stratagems, with just one or two tweaks, can be turned against Singapore itself in the event we have a falling-out.”
WHAT CAN SINGAPORE DO?
Dr Shashi then suggested a blend of new and old non-legislative measures, starting with the need to shore up “trust between people and Government”.
“The citizenry should be taken into confidence and the nature of the threat to cohesion should be clearly laid out, without fear-mongering. Because of this underlying trust, citizens are less disposed to believe fake news,” he wrote.
“But even as government builds trust, the responsibility of combatting fake news and disinformation should not solely lie with the authorities.”
He pointed to instances in Europe, Ukraine and Indonesia where think-tanks, citizens and journalists alike run websites and portals to counter disinformation. These “are better placed to act, and to act quickly”, he said.
Dr Shashi proposed setting up a body – “not necessarily a Government one” – that would draw on grassroots participation to carry out research and fact-checking initiatives, congregate various experts under its umbrella, produce content for TV, newspapers and social media, and offer training to media professionals and other relevant parties.
He also pitched the establishment of a Centre of Excellence (COE) analogous to a Latvian NATO facility for hybrid threats.
“There is some merit in studying the COE model with the view of introducing countermeasures customised to Southeast Asia’s cultural and political landscape … This would be a first in Southeast Asia,” said Dr Shashi. “As ASEAN Chair in 2018, and with cyber (and by implication issues relating to social media) on its stated agenda for its chairmanship, Singapore would be well-positioned to promote concrete efforts.”
He also asked the Ministry of Education to do more in teaching children in schools how to spot fake news, noting global calls for such skills to be included in tests.
Such a move, said Dr Shashi, would “make it more likely that the citizenry of Singapore’s SMART Nation will in future have the underlying resilience to recognize filter bubbles and echo chambers of the mind”.
At the same time, he called for a revival of Singapore’s Total Defence movement, in particular the psychological pillar, to recognise the threat of slow-burn issues such as disinformation.
This would emphasise the importance of civil-military cooperation, rather than separating these into different silos, Dr Shashi added.
He also said that ways should be found to help national newspaper The Straits Times “regain readership and respect in the eyes of the Singapore public” in light of its “steady decline over the years”.
“While numerous amateur blogs and forums have sprung up which to some degree provide commentary on Singapore-related issues, their coverage is patchy and none of these platforms can be considered a serious, consistent news source,” he explained.
In response to these suggestions, Mr Shanmugam said: “The evolving nature of the threat and the seriousness appear to be such that, I suppose, all that even you as an expert can say at this point in time is that we have to try all of these, with no certainty any particular method will in fact be a silver bullet.”
NATIONAL CONVERSATION NEEDED
In their concluding remarks to each other, Dr Shashi and Mr Shanmugam both spoke of the need above all for “real world intervention” and “active human agency”.
“I’ve seen many people radicalised online worldwide. I’ve seen very, very few people de-radicalised online,” said Dr Shashi. “Not to say it can’t be done, but whenever I’ve seen it, there must be some active human agency involved together with any online countermeasures. It’s the same case for the fake news and disinformation space.”
“We are going to grow up in 20, 30 years with a digital generation which has only ever known digital. I wonder whether if you look deeper, that brings with it risks as well: Ideas go into your head, you foment, think, get worked up, ideas go out.”
“So I think this will be a big problem … Whatever we do eventually to fix the fake news problem will certainly at some point need real world intervention.”
Mr Shanmugam said he could “not agree more”.
“The antidote … is you do fact checking, though ... truth doesn't really convince many people who want to believe in certain things,” he observed. “Second, I think you’re going to need extensive public education, that’s going to take time.”
“Third, build even more actively the bonding within community. And the kind of conversations … where people feel part of the country, feel valued and feel they have a stake and want to do something for the country and for themselves, and by sticking together we all benefit – that kind of feeling has got to be ingrained.”
Added Mr Shanmugam: “That’s not going to be done by some rules or regulations or laws ... You have to have state agencies do that and you probably need a kind of legal framework to make clear whatever you have outlawed in the physical world you also have outlawed online.
“One thing that has come through is, you’re going to need active intervention by human agencies, Government as well as non-Government.”