China has released new draft rules aimed at regulating artificial intelligence systems that act and communicate like humans. The proposal targets AI tools that can form emotional connections with users. Such systems are increasingly common in daily life, including chat apps, digital assistants, and virtual companions.
The draft rules were issued on Saturday by China’s cyber regulator, the Cyberspace Administration of China (CAC), and are open for public comment. They reflect concern over how emotionally responsive AI could affect users, particularly when people spend long hours interacting with these systems.
The move highlights Beijing’s effort to guide the rapid growth of consumer AI while enforcing safety and ethical standards. Authorities are closely monitoring AI behavior, user interaction, and potential risks.
Rules Focus on Human-Like AI and Emotional Interaction
The draft rules apply to AI products and services offered to the public in China that display human-like personality traits, thinking styles, and communication patterns. They also cover AI that interacts emotionally through text, images, audio, video, or other formats.
This includes AI chatbots and virtual companions that respond like friends or helpers. Any system attempting to understand feelings or build emotional bonds falls under the new regulations.
Under the proposal, companies must warn users against overuse and take action if users show signs of excessive engagement. The aim is to reduce addiction and emotional dependence.
The rules also require companies to take full responsibility for AI safety throughout the product lifecycle, including development, testing, launch, and daily operation. Providers must implement systems for algorithm review, data security, and personal information protection to ensure safe and controlled AI use.
Addressing Addiction and Emotional Risks
A central focus of the draft is psychological well-being. Providers are expected to identify users' emotional states during AI interactions and monitor how dependent users become on the service, watching for signs of extreme emotions, obsession, or heavy reliance on the AI.
If users display addiction or strong emotional distress, companies must intervene. Measures could include reminders, reduced interaction time, or adjustments in AI responses. The rules recognize that human-like AI can feel real to users, creating the risk of emotional overdependence.
By placing responsibility on service providers, the rules aim to prevent harm before it escalates. Emotional safety is treated as equally important as technical safety.
Content and Conduct Limits
The draft also sets strict content rules. AI systems must not generate content that threatens national security, spreads rumors, or promotes violence or obscenity. These limits cover text, images, audio, and video. Providers are accountable for ensuring compliance.
Data protection is another key requirement. Companies must safeguard personal information, including emotional data collected during AI interactions. The draft strengthens oversight by adding emotional and psychological safeguards alongside existing technical regulations.
The rules are part of Beijing's broader effort to manage advanced technology while maintaining strict ethical and safety standards. They reflect concern over how AI can affect real people when technology becomes personal and emotionally engaging.
The draft does not change existing laws but adds responsibilities related to emotional interaction and addiction. Public feedback will be reviewed by the CAC before the rules are finalized.
The draft sends a clear message: AI systems that act like humans must follow strict rules. Companies must protect users not only from harmful content but also from emotional harm caused by overuse or dependence.

