Police have set up a 12-strong advisory panel to combat cybercrimes, after cases quadrupled over the past decade.
According to the force’s crime wing, the number of annual cases surged from 5,133 to 22,797 over the last 10 years, with financial losses more than tripling from HK$916 million to HK$3.2 billion.
Some 86 percent of cybercrimes are online scams stemming from shopping, investment, job hunting and romance, according to data from the force.
Scammers have also approached victims via phishing emails and SMS texts on mobile phones.
In the wake of the surge in cybercrimes, the police in December established the Cybercrime Policing Advisory Panel.
The panel is composed of 12 tech experts, including Hong Kong Science and Technology Park chairman Sunny Chai Ngai-chiu, Cyberport chief executive officer Peter Yan King-shun, Hong Kong General Chamber of Small and Medium Business vice-president Eric Yeung Chuen-sing, Hong Kong Applied Science and Technology Research Institute chief executive officer Denis Yip Shing-fai, and Yiu Siu-ming, associate head of the University of Hong Kong’s department of computer science.
They are led by the police’s director of crime and security Keith Yip Wan-lung.
“The duties of the members are to share recent technological developments, recognize potential threats and advise on police technologies,” Yip said.
Police have several measures against cybercrime, Yip added, including requiring real-name registration for commercial phone numbers, setting up a cybersecurity volunteer team, and enhancing the Scameter+ anti-scam app.
Another possible solution is to allow financial institutions to share bank account information with the government and other institutions if the account is suspected to be related to criminal activities.
Recapping previous panel meetings, Yip said artificial intelligence regulations were a major focus and members agreed that a clear definition for AI’s legal responsibilities is needed. “For example, if an AI-driven car got into a traffic accident, should the AI system provider, the car manufacturer or the driver bear the responsibility?” Yip said.
Yip said scammers often approach victims via AI tools that trace internet users’ activities, but the existing Personal Data (Privacy) Ordinance does not cover such personal information. Panel members have suggested extending the scope of privacy protection.
AI developers should also ensure fairness among users who speak different languages.
“For instance, [AI chatbot] ChatGPT was trained on an English database,” Yip said. “It may be prone to discriminate against users who don’t speak English or those who come from a different culture, race or religion.”
Yip cautioned that criminals can also use AI to write fake news to influence elections and to create fake media content for scams.
Panel experts have advised authorities to study whether a mandatory “created by AI” label should be applied to AI-generated texts, audio, images and videos. Meanwhile, products created by AI should still be protected by intellectual property laws.