Bots and the ballot box: Is Facebook prepared for Asia's elections?
Doubts remain over the tech giant's fight against fake election news. Battling criticism and political pressure in the U.S. and EU, Facebook has gone into damage-control mode ahead of major elections — including Asian polls in Thailand, India, Indonesia and the Philippines.
(NIKKEI ASIAN REVIEW) – In 2015, Twinmark Media Enterprises launched a business that would become one of the most prolific sources of spam and fake news in the Philippines. One of the marketing firm’s specialties was generating viral content by posting glowing stories about President Rodrigo Duterte — and takedowns of his critics — on Facebook.
Typical of Twinmark’s content was a 2016 viral article aimed at discrediting Duterte’s political rival, Vice President Leni Robredo, by alleging a secret first marriage to a “leftist.” The image used as evidence, however, was unrelated; it had been lifted from a private Facebook user’s dedication to his parents’ wedding anniversary, according to an investigation by Rappler, an online news publication based in Manila.
This kind of false content could have vast reach: just one of Twinmark’s Facebook pages was followed by 43 million users.
In early January, Facebook cracked down on Twinmark by taking down 220 of its pages and 29 Instagram accounts. Explaining its decision to ban Twinmark, Facebook said the company had “repeatedly violated our misrepresentation and spam policies,” including through “coordinated inauthentic behavior and use of fake accounts.”
But critics say Facebook should have acted against Twinmark much sooner. Rappler had exposed Twinmark’s practices in its December 2017 report — 13 months before Facebook banned the company.
Facebook “took too long to act,” said Maria Ressa, Rappler’s CEO and executive editor, echoing a frequent criticism of the social networking site.
Facebook’s crackdown on Twinmark came just four months before Filipino voters go to the polls for midterm elections, which are widely seen as a referendum on Duterte’s unconventional policies — including the bloody drug war that has killed thousands of suspects over the last two years.
For a company whose early motto was “move fast and break things,” Facebook has faced frequent accusations that it moves too slowly to react to fake news — often with world-shaking consequences.
The social media company last year acknowledged being “too slow” to react to the brutal crackdown on minority Rohingya Muslims in Myanmar, in which military leaders used Facebook to spread anti-Rohingya propaganda. Mark Zuckerberg, Facebook’s founder, initially dismissed reports that Russia spread divisive messages on the social network ahead of the 2016 U.S. election as “a pretty crazy idea” before finally admitting that it had been a conduit for fake election news.
Now, as it faces withering criticism and political pressure in the U.S., Facebook has gone into damage-control mode ahead of a number of elections — including Asian polls in Thailand, India, Indonesia and the Philippines — that will demonstrate whether it can be trusted to prevent malign actors from interfering with the democratic process.
On Jan. 28, the company announced a number of initiatives aimed at protecting elections, including two new regional operations centers focused on election integrity located in its Dublin and Singapore offices. These teams will “add a layer of defense against fake news, hate speech and voter suppression, and will work cross-functionally with our threat intelligence, data science, engineering, research, community operations, legal and other teams,” it said.
Facebook says it has made “massive investments” in election integrity, increasing the number of people working on “safety and security” from 10,000 in 2017 to 30,000. “Hundreds” of monitors have been added in Asia, it said, but details of what they are doing are scarce.
“2019 is a huge election year. Nearly half the world is voting this year,” said David Madden, founder of Phandeeyar, Myanmar’s leading tech hub. Facebook “has no choice” but to create election centers and attempt to curb electoral manipulation, he said, because “they have created a beast and now they are desperately trying to figure out how to tame it.”
The latest moves had been in the pipeline for many months, “but they were accelerated by recent controversies over the Nigerian elections and fears about upcoming ones this year,” said a Southeast Asia-based tech consultant familiar with Facebook’s discussions on the matter.
The real challenge for Facebook in curbing election manipulation is to harness both technical and human resources in ways that go beyond simple detection of hate speech and threats of violence, said another consultant who has worked with Facebook.
“We are talking about propaganda dressed up as fact in Facebook posts, or campaign-linked information intended to sway voters — these can’t be detected by straightforward algorithms or even human monitors who lack local knowledge,” said a social media expert who has advised the company in Southeast Asia.
Facebook remained reluctant to take on responsibility for monitoring election-related content, said the consultant who has advised the company. “But it reached the point, particularly amid U.S. calls for a breakup of Facebook, where the company knew it had to be seen to be acting on concerns about electoral manipulation.”
Given its record in Asia — in addition to violence in Myanmar, Facebook and one of its messaging units, WhatsApp, have been tools for extremists in Indonesia and India — some are skeptical of its willingness to police the content on its networks.
Supporters rally for Philippine leader Rodrigo Duterte, whose divisive social media “army” helped propel him to the presidency in 2016. (Photo by Ken Kobayashi)
There are also complaints about a lack of transparency in what Facebook claims are "more aggressive" monitoring and deletion operations. And there are questions about its reliance on third-party fact-checkers to root out hate speech and fake news: the fact-checking groups often employ only a handful of people.
Marzuki Darusman, an Indonesian lawyer who chaired the United Nations Independent International Fact-Finding Mission on Myanmar in 2017, said many doubts remain over the extent to which Facebook actively moderates the content on its site.
“Facebook had been overly reliant on third parties to detect dangerous speech and misinformation on the platform and then report it to them,” he said at Myanmar’s Digital Rights Forum in Yangon in January, citing the crackdown on Rohingya in 2017 as a particularly egregious case. “They were far more reactive than proactive. We need to consider that this is still the case.”
In the Philippines, Facebook has hired Rappler as one of its two local fact-checkers. Ressa, the site's editor, says Facebook is "learning" but blames it for shirking responsibility for the content on its network.
“Facebook became the largest distributor of news — and yet they didn’t take the responsibility of the gatekeeper seriously,” Ressa said in an interview. “The reality, still, is they make choices by not making choices. … They are allowing a lie to be treated the same way as truth. And they have already seen the impact on democracies around the world.”
Rappler is bracing for false political content on Facebook reminiscent of the 2016 presidential election, when such material helped propel Duterte, a former mayor in the southern Philippines who fared poorly in early voter surveys, to the presidency.
The company has five people, including research head Gemma Mendoza, working on fact-checking the election. Depending on the subject, the whole newsroom sometimes gets involved in the fact-checking, Mendoza said.
Fake tapes, ethnic tensions
A couple of days into the new year, a mysterious voice recording started doing the rounds on Indonesian social media, including Facebook and WhatsApp. In the recording, a man claimed he had found evidence of vote-rigging for the upcoming presidential election.
He said seven ballot containers had been found at Tanjung Priok, the country's busiest port, in Jakarta, each containing 10 million ballot papers for the April election. The ballots, he claimed, had already been punched next to "1" — the candidate number for incumbent President Joko Widodo and his running mate, Muslim cleric Ma'ruf Amin.
“The … cards were punched in number 1, punched by Jokowi,” the recording said. “There is a possibility [these are] from China.”
Indonesia's General Election Commission, known as the KPU, denied the claims, and police swiftly arrested the man who made the recording, along with three other people who disseminated it. The man was identified as Bagus Bawana Putra, chairman of a support group backing the opposition ticket of Prabowo Subianto and Sandiaga Uno. The Prabowo-Sandi camp denied any knowledge of him.
Indonesian president Joko Widodo, left, and challenger Prabowo Subianto, right, mark the launch of the election campaign period with a ceremony in Jakarta.
As Indonesia heads toward the election, fake news is again becoming commonplace in the archipelago nation, where it played a pivotal role in previous polls. In the 2014 presidential election, which pitted Widodo and Subianto against each other for the first time, smear campaigns accused Widodo of being Chinese and anti-Islam. The 2017 Jakarta gubernatorial election was also rife with ethnically and religiously charged fake news, which eventually led to the defeat — and subsequent jailing — of then-Jakarta governor Basuki Tjahaja Purnama.
“Misinformation on political issues is already filling the minds of social media users in Indonesia,” said Septiaji Eko Nugroho, founder and chairman of Indonesian Anti Hoax Community, or Mafindo. “It’s mostly easy to fact-check any misinformation, but post-truth era is making the delivery of the clarification difficult.”
Indonesia's young population — 60% of its 262 million people are under 40 — and high smartphone penetration make the country fertile ground for the mass transmission of misinformation. The country has one of the largest Facebook user bases in the world, and a recent survey found that 77% of respondents use WhatsApp every day, and 43% Facebook.
Facebook has launched several initiatives in Indonesia to tackle misinformation, including enlisting a number of local media outlets as fact-checkers. If the fact-checkers deem a news item to be fake, Facebook reduces its distribution on the platform.
In a polarized country like Indonesia, fake news can be a potent force. “Poll results suggest that voters have turned partisan, that whether or not they believe in fake news depends on their political preferences,” said Kuskridho Ambardi, communications lecturer at Gadjah Mada University.
“Partisan stances create preferences toward information, in which voters select info based on their political preferences, not the other way around … What happens are monologues, where people will simply delete information they don’t want to hear,” Ambardi said.
Modi’s threats
Across India last summer, rumors about kidnappers spread on Facebook, WhatsApp and other social media sites, causing panic among villagers. The hysteria led to more than 30 deaths and left many people injured simply for being perceived as child abductors.
In one instance, two young men — Nilotpal Das, a dreadlocked musician, and his friend Abhijeet Nath — were lynched in a village in the northeastern state of Assam on suspicion of child abduction. The deaths raised serious concerns over how rural India judges people based on their looks and demeanor.
In other cases, Hindu cow protection vigilantes carried out attacks against people who were suspected of trading in beef, based on rumors spread on social media sites. According to an IndiaSpend analysis, between January 2017 and July 2018, at least 33 persons were killed and at least 99 injured in 69 reported cases.
India is estimated to be Facebook's largest and fastest-growing market, and Prime Minister Narendra Modi is a prolific social media user with 43.4 million Facebook followers.
Facebook and WhatsApp are each used by over 200 million people in India, and those numbers are expected to more than double in the next couple of years. The rampant spread of rumors and fake news on the platforms pushed Modi's government to direct Facebook to contain such messages.
In August, the Indian government asked telecoms and internet service providers to explore ways of blocking Facebook, WhatsApp and other apps during emergencies. WhatsApp responded by rolling out a feature that limits users in the country to forwarding a message to five chats at a time and labels those messages as forwards — a step intended to keep messages from spreading to large numbers of people.
Now, with a general election due by May that will decide whether Modi wins a second term in office, there are fears that the platforms will be used to stir further unrest.
Facebook’s efforts to curb the spread of fake news and hate speech in India include a tie-up with digital fact-checking media startup Boom Live and news agency AFP to fight misinformation. Boom Live’s team of eight people helps Facebook fact-check in Hindi and Bengali, and will also fact-check photos and videos. Facebook’s security team says it can monitor content in 16 languages in India. (India has 22 major languages and nearly 20,000 mother tongues.)
Facebook posts that go viral but are fake come with warning tags when they are shared, said Boom Live deputy editor Karen Rebelo.
“It is still work in progress and it should be perfected closer to the elections,” Rebelo said.
Native speakers needed
For veterans of the social media wars in Asia, concerns remain that Facebook's election-year measures do not go far enough. Some have questioned whether Facebook has enough native speakers monitoring the site ahead of the elections, pointing to the lack of Burmese speakers policing content during the Rohingya crisis.
The company says it does. “Our goal is to make sure we always have the right number of content reviewers to review content in the native language or languages of the communities that we serve, and to action reports within 24 hours,” said Yoko Shimada, a spokesperson in Facebook’s Tokyo office.
But Vera Files, a small news organization that is partnering with Facebook to fact-check the Philippines elections, illustrates the difficult realities on the ground. Vera, which is funded by the Washington-based National Endowment for Democracy, has six fact-checkers taking turns to vet posts in an archipelago with over a hundred distinct local languages.
“We are lucky if we happen to know other dialects and local languages apart from Tagalog,” said Daniel Abunales, deputy project director for fact-checking at Vera Files.
Maria Ressa, founder and editor of Philippine online news outfit Rappler, has clashed with the country's government over her site's investigations of false pro-Duterte content. "Facebook … are the only one that has power to solve this."
Jes Petersen, chief executive of Phandeeyar, Myanmar’s leading tech hub, says Facebook still needs to be more transparent about how it monitors content on its site. He ultimately sees the need for regulation — although he warns there is no “silver bullet.”
“We should be concerned about Facebook’s impact on elections — there are obviously a lot of problems concerning election processes and social media, as we have seen in the U.S. and EU,” he said. “The only examples we’ve seen of Facebook taking really concerted action is in places like the EU where they are being forced to by tough laws.”
Despite being a vocal critic, Rappler’s Ressa said she still plans to cooperate with Facebook. Twinmark’s takedown and similar recent efforts are a step in the right direction, she said.
“I’ll say now I think Facebook is taking it seriously. I think they are trying to solve the problem,” Ressa said. “They are the only one that has power to solve this.”
Additional reporting by Nikkei staff writers Mikhail Flores in Manila, Shotaro Tani in Jakarta, Rosemary Marandi in Mumbai and Yuichi Nitta in Yangon.