AI Toy Data Security and Privacy Protection
Technical Solutions and Industry Initiatives for Child Data Security in the Era of Intelligent Toys
Introduction
As artificial intelligence technology integrates with the children's consumer market, AI toys enhance educational value and fun through features like voice interaction and facial recognition. However, problems such as over-collection of children's biometric data and improper storage of sensitive information occur frequently.
In response, during the recent Personal Information Protection sub-forum of the National Cybersecurity Awareness Week, the China Cyberspace Security Association, together with 16 member companies, released the "Smart Toy Personal Information Protection Initiative" (hereinafter referred to as the "Initiative"). This initiative aims to address child personal information security issues exposed during the rapid development of the smart toy industry and promote compliant and healthy industry development.
The Rapid Growth of AI Toys
Data shows that AI toys have seen significant growth in both the number and value of financing deals this year, with nearly a hundred investment institutions entering the market. According to Tianyancha data, China currently has more than 13.477 million toy-related enterprises in operation or in existence, of which approximately 2.635 million were newly registered in 2025 alone.
Looking at the trend of enterprise registrations, over the past five years, the number of toy-related enterprise registrations has shown a year-on-year growth trend, peaking in 2024. Regionally, Guangdong, Hainan, and Hubei provinces lead in the number of toy-related enterprises, with the combined total of these three provinces exceeding 4.119 million, accounting for 30.57% of the total number of enterprises.
The influx of new players has brought prosperity to the AI toy market. According to forecasts by consulting firm IMARC, the global AI toy market is expected to grow from $18.1 billion in 2024 to $60 billion by 2033, a compound annual growth rate of approximately 14%.
However, the market's rapid development has also brought data security problems. Because AI toys rely on voice, touch, and visual interaction, as well as affective computing and long-term memory backed by large AI models, they inevitably handle large amounts of users' sensitive biometric information.
The Initiative: A Multi-Layered Security Approach
The "Initiative" addresses child personal information security by implementing controls along four pathways: hardware, firmware, cloud, and app, minimizing the risk of information leakage.
Hardware-Level Security
For example, MOS transistors or DIP switches can be added to the power supply lines of microphone and camera modules, allowing parents to physically cut power with the turn of a screwdriver. At the SDK level, the GPIO state is read so that the driver refuses to load unless the switch is closed.
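The SDK-level gate can be sketched as follows. This is a minimal simulation, not a real driver: `read_gpio`, `PRIVACY_SWITCH_PIN`, and `load_mic_driver` are hypothetical names, and the GPIO read is stubbed out (a real device would read, e.g., a sysfs GPIO line).

```python
PRIVACY_SWITCH_PIN = 17  # hypothetical pin wired to the MOS/DIP switch

def read_gpio(pin: int) -> int:
    """Placeholder for a real GPIO read (e.g. via /sys/class/gpio);
    returns 1 when the privacy switch is closed (mic powered)."""
    return 0  # simulate: the parent has left the switch open

def load_mic_driver() -> bool:
    """Refuse to initialize the microphone driver unless the
    hardware privacy switch is closed."""
    if read_gpio(PRIVACY_SWITCH_PIN) != 1:
        return False  # driver not loaded: no power, no capture
    return True
```

With the switch open, `load_mic_driver()` returns `False` and the microphone stack never initializes, mirroring the physical power cut.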
Firmware-Level Security
In MCU/SoC firmware, only necessary HID channels are registered (e.g., only retaining buttons, accelerometers). Voice ADC and Camera MIPI are conditionally compiled out at the driver layer. During OTA updates, hash verification is performed to prevent drivers from being secretly added back.
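The OTA hash check can be sketched in a few lines. This is an illustrative sketch only; `verify_ota_image` and the sample firmware bytes are hypothetical, and a production device would verify a signed manifest rather than a bare digest.

```python
import hashlib

def verify_ota_image(image: bytes, expected_sha256: str) -> bool:
    """Accept an OTA image only if its hash matches the expected value,
    so compiled-out drivers cannot be silently added back."""
    return hashlib.sha256(image).hexdigest() == expected_sha256

# Hypothetical minimal build retaining only buttons and accelerometer
firmware = b"minimal HID-only build: buttons + accelerometer"
manifest_hash = hashlib.sha256(firmware).hexdigest()
```

A tampered image (say, with a camera driver appended) produces a different digest and is rejected before flashing.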
End-to-End Encryption
End-to-end encryption can be layered: the Wi-Fi layer already supports WPA3-SAE, and the application layer adds TLS 1.3, disabling TLS_RSA cipher suites and TLS 1.0/1.1 to prevent man-in-the-middle downgrade attacks. Voice and images are first encrypted in place in the toy's RAM using AES-256-GCM, with keys stored in a TPM or TrustZone. Only after encryption completes are HTTPS interfaces called for upload, so the cloud receives only ciphertext and cannot recover the original files.
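The TLS side of this policy can be expressed with Python's standard `ssl` module; pinning both the minimum and maximum protocol version to TLS 1.3 means TLS 1.0/1.1 and static-RSA cipher suites can never be negotiated via a downgrade. The AES-256-GCM in-RAM step is omitted here because it depends on a hardware-backed key store; `make_upload_context` is a hypothetical name.

```python
import ssl

def make_upload_context() -> ssl.SSLContext:
    """Client-side TLS policy for ciphertext uploads: TLS 1.3 only."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_3
    ctx.maximum_version = ssl.TLSVersion.TLSv1_3
    return ctx
```

Any handshake offering an older protocol version then fails outright rather than silently falling back.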
Third-Party Data Sharing Controls
Even when data is uploaded to the cloud, third-party sharing must be auditable and traceable, only allowing pre-registered third-party SDK access, requiring parental item-by-item authorization. All outbound requests are logged by the gateway, and user ID hashes are embedded in shared data to facilitate leak tracing.
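The SDK allowlist and the embedded user-ID hash for leak tracing might look like the sketch below. Everything here is hypothetical (the key, the SDK names, the 16-hex-character truncation); a keyed HMAC is used so the hash cannot be reversed or precomputed without the gateway's secret.

```python
import hashlib
import hmac
import json

GATEWAY_KEY = b"hypothetical-per-deployment-secret"
REGISTERED_SDKS = {"sdk_speech_v2"}  # pre-registered third parties only

def share_with_third_party(payload: dict, user_id: str, sdk: str) -> str:
    """Allow only pre-registered SDKs and embed a keyed user-ID hash
    so a leaked record can be traced back to its sharing path."""
    if sdk not in REGISTERED_SDKS:
        raise PermissionError(f"SDK {sdk!r} is not pre-registered")
    uid_hash = hmac.new(GATEWAY_KEY, user_id.encode(),
                        hashlib.sha256).hexdigest()[:16]
    # a real gateway would also append this record to its audit log
    return json.dumps({**payload, "uid_hash": uid_hash, "sdk": sdk})
```

If a tagged record later surfaces in a breach, the operator can recompute hashes over its user base to identify which sharing path leaked it.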
Overall, this initiative essentially promotes smart toys to shift from a "function-oriented" to a "privacy-security-first" engineering paradigm, requiring enterprises to embed privacy protection capabilities throughout the entire chain of hardware selection, firmware development, and cloud service design. In the future, certified child-safe smart terminals (similar to the EU's KIDS standard) may become a market access threshold.
Data Privacy Security as a Key Link in the AI Toy Supply Chain
With increasing emphasis on data security, manufacturers across the supply chain, including module providers, AI model service providers, complete machine manufacturers, and platform operators, are exploring feasible privacy protection paths through a combination of technology, systems, and compliance measures.
Case Study: Hape and Aleph Alpha's AI Story Machine
For example, Hape, in collaboration with the European large-model supplier Aleph Alpha, launched an AI story machine. The product incorporates a physical master switch: a MOS transistor is connected in series with the microphone power line, allowing parents to cut power completely with the turn of a screwdriver. At the driver layer, only I²S input is retained, and Camera MIPI is conditionally compiled out.
The network is not continuously connected: Wi-Fi is turned on only from 03:00 to 03:15 daily, using TLS 1.3 with mutual ECDSA certificate authentication, and closes automatically after the 15-minute window. Storage uses a 32 GB eMMC with local cyclic recording and AES-256-XTS hardware encryption; keys are written to an NXP A71CH secure element that erases them if the device is disassembled.
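The daily connection window reduces the attack surface to fifteen minutes a day. A minimal sketch of the window check (function and constant names are illustrative):

```python
from datetime import time

WINDOW_START, WINDOW_END = time(3, 0), time(3, 15)

def wifi_allowed(now: time) -> bool:
    """True only inside the daily 03:00-03:15 sync window;
    outside it the radio stays powered down."""
    return WINDOW_START <= now < WINDOW_END
```

The device would poll this check from its scheduler and hard-disable the radio whenever it returns `False`.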
In daily conversation, the original voice is first sliced into frames, 30% of the frames are randomly discarded, and -40 dB noise is added to achieve ε=1 differential privacy before the data is finally uploaded via HTTPS to update small models.
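A simplified sketch of that privatization step is below. This is not the vendor's actual pipeline: the frame-drop rate and Laplace noise (sampled by inverse CDF, scale = sensitivity/ε) stand in for the described ε=1 mechanism, and the sensitivity value is an assumption.

```python
import math
import random

def privatize_frames(frames, eps=1.0, drop_rate=0.30, sensitivity=1.0, seed=0):
    """Randomly discard `drop_rate` of the frames, then perturb each
    remaining sample with Laplace(sensitivity/eps) noise: a simplified
    sketch of the epsilon=1 differential-privacy step."""
    rng = random.Random(seed)
    kept = [f for f in frames if rng.random() >= drop_rate]
    scale = sensitivity / eps
    noisy = []
    for f in kept:
        u = rng.random() - 0.5  # inverse-CDF Laplace sampling
        noise = -scale * math.copysign(1.0, u) * \
            math.log(max(1e-12, 1.0 - 2.0 * abs(u)))
        noisy.append(f + noise)
    return noisy
```

Only the perturbed frames leave the device; the raw audio never does.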
Domestic Solutions
Some domestic cellular/Wi-Fi modules have also been supplied to European multi-brand AI companion dolls. Some modules feature encrypted MCUs with built-in TrustZone, where microphone data is encrypted in place using AES-256-GCM before entering the SoC. Modules also come pre-equipped with eSIM, only allowing access to whitelisted IPs, using TLS 1.3 + OCSP Stapling, and disabling TLS renegotiation.
For cloud storage, independent child encryption key libraries are used, with object storage buckets encrypted by default (SSE-KMS) and KMS keys physically isolated from adult services. Modules also ship with data residency tags: EU users default to the Frankfurt region with no cross-border data transfer. If cross-border transfer is required, secondary parental confirmation plus an SMS OTP is needed.
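The residency routing can be sketched as a small policy function. All names here are hypothetical (the country set is abbreviated, and "eu-central-1" is the common cloud label for Frankfurt); the OTP check itself is assumed to happen upstream.

```python
EU_COUNTRIES = {"DE", "FR", "IT", "ES", "NL"}  # abbreviated for illustration

def choose_storage_region(country, requested=None, parental_otp_ok=False):
    """EU users default to the Frankfurt region; any cross-border move
    requires secondary parental confirmation via SMS OTP."""
    default = "eu-central-1" if country in EU_COUNTRIES else "home-region"
    if requested is None or requested == default:
        return default
    if not parental_otp_ok:
        raise PermissionError("cross-border transfer requires parental OTP")
    return requested
```

By default the function never moves EU data out of region; the exception path makes the parental-consent requirement explicit in code.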
BubblePal: A Model for Data Security
BubblePal toys, which have gained popularity in recent years, adhere to strict data security standards, using strong encryption and security protocols to protect all data transmission against unauthorized access and leaks. Chat history is retained only with parental consent, and parents can access and delete their children's personal data at any time through a dedicated app.
Some manufacturers follow the principle of minimal data collection, only gathering data necessary for product functionality. For example, some AI toy companies only collect data essential for functions like voice recognition and image recognition in their products, avoiding excessive collection of children's sensitive personal information such as facial data, geolocation, and family contact details.
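The minimal-collection principle reduces to an allowlist filter at the collection boundary. A sketch, with a hypothetical minimum field set for a voice/vision toy:

```python
# Hypothetical minimum field set needed for product functionality
ALLOWED_FIELDS = {"voice_command", "image_label", "session_id"}

def minimize(collected: dict) -> dict:
    """Keep only fields the product function needs; drop sensitive
    extras such as face data, geolocation, or family contacts."""
    return {k: v for k, v in collected.items() if k in ALLOWED_FIELDS}
```

Anything outside the allowlist, however it entered the pipeline, never reaches storage or the network.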
Of course, although many manufacturers have begun to prioritize child data security, numerous small and medium-sized enterprises, constrained by cost and technical capability, struggle to deploy complete on-device AI and encrypted transmission solutions. Some products may quietly expand their data collection scope after a firmware update, which parents find difficult to detect.
Conclusion
In the current era of rapid AI toy development, data security is receiving growing attention. With the introduction of the "Initiative," the industry should come to see that encrypted storage, tiered access control, anomaly alerts, and third-party audits are no longer slogans but can be turned into a set of testable, regression-checkable technical requirements.
For example, requiring voice and image data to be processed locally first reduces leakage risk at the source, making AI toys fun companions that do not steal privacy.
Note: This document is a translation and adaptation of the original Chinese content.