This article first appeared in the South China Morning Post.
A Singaporean company is now offering content review services in overseas markets amid rising demand from governments around the world, and as Beijing continues a drive to clean up its domestic cyberspace.
BIGO Technology, a Singapore-based social media company owned by Chinese live broadcaster YY Inc, is helping the Indonesian government to filter its internet content, and is also in talks with authorities in Vietnam, Egypt, India, and the Middle East to export its content review technology, according to James Wang, vice-president of BIGO.
Chinese internet and social media companies have had to develop comprehensive content review capabilities to block material deemed harmful or inappropriate by the Chinese authorities. That experience has put them at the forefront of technologies in increasing demand from governments around the world that want to rid the internet of nefarious and illegal activity.
BIGO’s artificial intelligence-powered content moderation system, which was trained on a huge pool of content to analyze images, boasts 99% accuracy in identifying illegal content such as pornography, violence, and terrorism-related information, according to the company. At BIGO, this is supported by a team of 2,000 staffers around the world who also manually review content to ensure nothing is missed.
“Now that we have the technology, we shall show it,” Wang said in a phone interview last week. “I am only a seller of the kitchen knives. It’s their [the customer’s] business whether they use it to cut vegetables or meat.”
BIGO Technology, which runs the live broadcasting platform Bigo Live, short video app Likee, and messaging and social network service Imo internationally, was founded in Singapore by Chinese entrepreneur Xueling Li, who is also the co-founder and chief executive of YY. Beijing-based, US-listed YY acquired BIGO in March this year.
The company is not shy about promoting its capabilities despite rising concern in some quarters about the rise of technologies that enable governments to censor and surveil the online activities of their citizens. Instead, BIGO sees an opportunity to meet increased demand for technology that helps to combat online lies, propaganda, conspiracy theories, and hate speech.
Even social media giants in the West, such as Google and Facebook, are coming under increased pressure from governments to step up their content moderation efforts, after a series of controversial incidents including the recent New Zealand mosque massacre, which was streamed live on Facebook and posted on YouTube and Twitter.
On March 31, Facebook founder Mark Zuckerberg published an open letter inviting governments and regulators to play “a more active role” in four areas: harmful content, election integrity, privacy, and data portability.
Under a partnership with Indonesia’s Ministry of Communication and Information Technology (KOMINFO), BIGO has been using its filtering system since 2017 to help the authority detect, monitor, and block “negative content” including pornography, fraud, gambling, and terrorist-related information.
Bigo Live was itself banned in Indonesia in 2016 for hosting a large amount of “inappropriate content” on its platform. It returned in January 2017 after a clean-up and the implementation of a monitoring mechanism, and 200,000 pieces of negative content had been blocked as of February 2019, according to a statement by KOMINFO in March.
KOMINFO did not respond to email and telephone requests for comment on its work with BIGO.
The Indonesian government uses BIGO’s technology to filter content across the internet, including social platforms such as Facebook and Instagram, said Wang, who added that its main aim was to block content generally deemed “unacceptable”, such as violence and pornography, rather than political material.
Wang said the company is “a neutral service provider” offering a technology. The customer decides how to set the filters.
Within weeks of the New Zealand shooting, Australia passed sweeping legislation to punish social media companies that fail to remove violent material from their platforms in a timely manner. New Zealand prime minister Jacinda Ardern also joined with French president Emmanuel Macron to launch a plan that brings together governments and tech companies to review how online content is disseminated.
Meanwhile, the UK in April released the Online Harms White Paper, which proposes broad social media regulation, including instituting a new regulator with enforcement powers to fine companies and executives who breach the “code of practice” by failing to remove content promoting terrorism, hate crimes, and self-harm.
Wang says it is crucial that BIGO, which spearheads YY’s international operations, complies with local regulations wherever it operates. User-generated content is difficult to moderate, and the legal risks need to be managed.
BIGO runs apps in 150 countries with combined monthly active users of over 300 million. Its messaging app Imo has reached 212 million monthly users globally, bringing more users and revenues to its short video and live streaming services. North America and South America are currently its fastest-growing markets.
“We don’t see ourselves as an outsider in new markets,” said Wang. “Instead we embrace the local community, culture, religions, and regulations.”