The Chinese government is tightening its oversight of artificial intelligence (AI) products, requiring security reviews before they can be sold in the country. The announcement comes alongside the unveiling of a new product by Chinese tech giant Alibaba that is set to challenge one of the biggest names in AI: OpenAI’s ChatGPT.
China’s move to require security reviews of AI products gives authorities a mechanism to guard against potential cybersecurity risks. A review could verify that a product includes built-in safeguards against malicious use, or more broadly that it meets the state’s cybersecurity standards.
The announcement comes shortly after Alibaba revealed its own ChatGPT-style AI product. It works in much the same way as OpenAI’s offering, letting users hold natural conversations with a digital assistant. The key difference lies in who builds it: whereas ChatGPT is produced by the US firm OpenAI, Alibaba’s assistant is being developed in-house in China.
The development of a homegrown AI product shows how seriously China is taking the technology, and how it intends to compete on the global stage. With security reviews in place, businesses within China will also know that their AI products carry government approval – which may prove an advantage for Chinese businesses and customers alike.
Taken together, the two announcements signal China’s ambitions in AI. Security reviews give businesses and customers confidence in the safety of approved products, while Alibaba’s launch shows that Chinese firms are ready to compete globally – a combination that could let them put their best foot forward in a rapidly evolving field.