Connected baby products are a fast-growing segment of the consumer technology market. From AI-powered baby monitors to “smart bassinets” that respond to an infant’s cries, companies increasingly promise parents insights that previous generations could only guess at. But as these products begin collecting sensitive data about children—including audio recordings—regulators and industry watchdogs are paying close attention.
A recent review by BBB National Programs examined marketing and privacy practices tied to smart bassinets sold by Dorel Juvenile Group, a manufacturer behind the well-known Maxi-Cosi line of baby products.
The inquiry—conducted jointly by the National Advertising Division (NAD) and the Children’s Advertising Review Unit (CARU)—reached a mixed conclusion.
The watchdogs determined that Dorel’s advertising claims about the functionality and performance of its AI-powered CryAssist™ technology were supported by evidence. However, they also urged the company to strengthen how it provides notice and obtains parental consent under the Children’s Online Privacy Protection Act (COPPA).
The case highlights an emerging regulatory challenge: ensuring that innovative AI features targeting families comply with strict child privacy protections.
The Rise of AI-Powered Nursery Technology
The products at the center of the inquiry are connected nursery devices marketed to parents seeking help understanding infant behavior.
Dorel sells several smart bassinets that incorporate CryAssist technology, including:
- Sibia Bassinet with CryAssist Audio Monitor
- Starling Smart Bassinet
These devices combine traditional baby-monitor functionality with machine-learning models designed to interpret infant cries.
According to marketing materials, the system can analyze crying patterns and categorize them into potential needs such as:
- hunger
- fatigue
- discomfort
- agitation
- gas-related distress
The goal is to provide parents with faster insights into why a baby may be crying, potentially helping caregivers respond more effectively.
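The article does not disclose how Zoundream’s model actually works. Purely as an illustrative sketch, a cry classifier of this general shape might extract simple acoustic features from a waveform and map them to the categories above; every feature, centroid value, and function name below is invented for illustration, and real systems use far richer features and trained models.

```python
import math

# Invented for this sketch: (RMS energy, zero-crossing rate) centroids.
# Not derived from any real cry-analysis research.
CATEGORY_CENTROIDS = {
    "hunger":     (0.60, 0.30),
    "fatigue":    (0.25, 0.10),
    "discomfort": (0.45, 0.50),
    "agitation":  (0.80, 0.70),
    "gas":        (0.35, 0.40),
}

def extract_features(samples):
    """Return (RMS energy, zero-crossing rate) for a mono waveform in [-1, 1]."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    zcr = sum(1 for a, b in zip(samples, samples[1:])
              if (a < 0) != (b < 0)) / (len(samples) - 1)
    return rms, zcr

def classify_cry(samples):
    """Assign the category whose centroid is nearest in feature space."""
    features = extract_features(samples)
    return min(CATEGORY_CENTROIDS,
               key=lambda c: math.dist(features, CATEGORY_CENTROIDS[c]))
```

Even in a toy like this, the output is a best-guess label rather than a diagnosis, which is why substantiation of accuracy claims mattered in the review.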
The underlying AI technology was developed by Zoundream AG, a Swiss firm specializing in acoustic analysis and infant cry research.
Why Advertising Watchdogs Investigated
The inquiry arose through monitoring activities conducted by NAD and CARU. These organizations review marketing claims and data practices to ensure companies adhere to advertising standards and child privacy regulations.
Two key issues triggered scrutiny.
First, the agencies examined whether the company’s AI-related marketing claims accurately reflected the technology’s capabilities.
Second, they reviewed whether the company’s data collection practices complied with child privacy rules, particularly because the products process audio recordings from children under the age of thirteen.
When children’s data is involved, companies must satisfy strict requirements under COPPA, including providing parents with clear notice and obtaining verifiable parental consent before collecting personal information.
Advertising Claims About AI Cry Translation
One of the primary claims evaluated during the review involved the device’s ability to interpret infant cries using artificial intelligence.
Dorel advertised that CryAssist technology could analyze audio signals and translate a baby’s cry into specific categories indicating possible needs.
To support this claim, the company provided evidence including:
- peer-reviewed academic research validating the AI model
- documentation showing the system’s calibration and accuracy
- technical explanations describing how the algorithms process cry audio signals
After reviewing the materials, the National Advertising Division concluded that the claims were sufficiently substantiated.
In particular, NAD determined that the statement describing the AI technology as translating cries into potential emotional or physical states was supported by available evidence.
Evaluating Claims About User Control
Another marketing message under review involved the company’s statements about user choice.
Dorel promoted the product by emphasizing that parents maintain control over certain AI-based features and that cry-response functionalities are optional.
Advertising suggested that parents could decide whether to activate or deactivate features tied to CryAssist.
After evaluating the product design and documentation provided by the company, NAD determined that these claims were also supported.
The watchdog concluded that the technology’s configuration options allowed users to control whether certain features were used.
Claims About Data Security and Privacy
Perhaps the most sensitive aspect of the inquiry involved Dorel’s representations about how cry recordings are stored and protected.
Marketing materials suggested that audio data collected by the devices is:
- anonymized
- encrypted
- securely processed in the cloud
The company provided additional technical documentation describing its data handling practices.
This documentation included explanations about:
- how cry recordings are stored
- the encryption methods used
- retention policies for collected audio data
Based on this evidence, NAD determined that the company’s claims regarding encryption and anonymization were supported.
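Dorel’s actual implementation is not described in the findings. One common pattern behind “anonymized” claims, sketched below with invented names, is to replace the raw device identifier attached to each recording with a keyed, non-reversible token before anything is uploaded, so server-side records cannot be traced to a household without the key.

```python
import hashlib
import hmac
import json

# Hypothetical: in production this key would live in a secrets manager
# and be rotated, never hard-coded in source.
PSEUDONYM_KEY = b"rotate-me-regularly"

def pseudonymize(device_id: str) -> str:
    """Replace a raw device ID with a keyed, non-reversible token (HMAC-SHA256)."""
    return hmac.new(PSEUDONYM_KEY, device_id.encode(), hashlib.sha256).hexdigest()

def prepare_upload(device_id: str, cry_features: dict) -> str:
    """Build a cloud payload carrying only derived features and a pseudonym."""
    record = {
        "device": pseudonymize(device_id),
        "features": cry_features,  # derived metrics, not raw audio
    }
    return json.dumps(record)
```

Transport-layer encryption (e.g., TLS) and encryption at rest would sit on top of this; pseudonymization alone does not make data anonymous in the legal sense if the key holder can re-identify it.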
However, even when security practices appear adequate, child privacy regulations impose additional obligations related to transparency and parental control.
COPPA Compliance Under Review
While the advertising claims largely passed scrutiny, the investigation identified shortcomings in how the company handled certain privacy obligations.
CARU examined whether Dorel’s data practices complied with the Children’s Online Privacy Protection Act, the primary U.S. law governing online data collection from children under thirteen.
COPPA requires companies collecting children’s personal information to implement several safeguards.
These include:
- providing clear privacy notices to parents
- explaining what information is collected and how it is used
- obtaining verifiable parental consent before collecting data
- limiting data collection to information necessary for the service
CARU found that Dorel had implemented reasonable security controls and appeared to limit data collection to what was necessary to operate the cry-analysis service.
The watchdog also concluded that the company was not using children’s audio data for undisclosed secondary purposes.
However, the inquiry identified deficiencies related to COPPA’s notice and consent provisions.
Where Compliance Fell Short
According to CARU’s findings, the company’s current practices lacked several key components required under COPPA.
Specifically, the watchdog found that Dorel’s online privacy disclosures did not fully satisfy requirements related to:
- maintaining a comprehensive online privacy policy
- providing direct notice to parents before collecting children’s data
- implementing a verifiable parental consent mechanism
These elements are central to COPPA compliance.
The law requires companies to verify that a parent—not the child—is authorizing the collection of personal data.
Common methods for verifying parental consent include:
- credit card verification
- government ID validation
- signed consent forms
- secure authentication systems
CARU recommended that the company update its practices to incorporate these procedures.
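COPPA does not prescribe a particular data model or API for consent, so the sketch below is only one hedged way to enforce the gate CARU described: the device refuses to collect children’s audio until a parent has been verified through an accepted method. The class and field names are assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical consent record; real systems would also log when and how
# consent was obtained, and support revocation.
@dataclass
class ParentalConsent:
    verified: bool = False
    method: str = ""  # e.g. "credit_card", "government_id", "signed_form"

@dataclass
class CryRecorder:
    consent: ParentalConsent = field(default_factory=ParentalConsent)

    def record(self, audio: bytes) -> dict:
        # Refuse to collect children's audio until a parent has been
        # verified through one of the accepted consent methods.
        if not self.consent.verified:
            raise PermissionError("verifiable parental consent required")
        return {"audio": audio, "consent_method": self.consent.method}
```

The point of the structure is that collection fails closed: no consent record, no recording.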
Why Smart Baby Products Raise Unique Privacy Issues
The case illustrates how emerging consumer technologies are introducing new privacy risks.
Smart nursery devices can collect a wide range of sensitive data including:
- audio recordings of children
- sleep patterns
- environmental conditions in the home
- behavioral insights derived from AI analysis
Because the children these devices monitor are infants who cannot consent for themselves, regulators treat the data with heightened sensitivity.
Even if the information is primarily used to help parents care for their children, companies must still ensure strict compliance with child privacy laws.
As AI-powered parenting technologies expand, these issues are likely to attract increasing regulatory attention.
The Growing Compliance Challenge for IoT Devices
Connected products are rapidly transforming the consumer electronics landscape.
From baby monitors to smart speakers and health devices, many products now rely on cloud-based analytics and machine learning.
This shift introduces significant privacy governance challenges.
Companies must ensure that:
- device firmware and mobile apps comply with privacy regulations
- data collection practices are clearly disclosed
- consent mechanisms function properly
- sensitive data is protected against misuse
These requirements are particularly complex when devices collect information from children.
Automating Privacy Compliance for Connected Devices
For companies developing connected hardware and AI-powered consumer products, managing privacy compliance can become operationally complex.
Organizations must track how data flows between:
- mobile applications
- cloud servers
- AI processing systems
- third-party analytics tools
Automated privacy governance tools are increasingly important for managing these obligations.
These systems can assist companies in implementing privacy-by-design frameworks that ensure connected devices comply with laws such as COPPA while maintaining transparency for users.
Industry Self-Regulation Still Plays a Major Role
The inquiry also demonstrates the continued importance of industry self-regulatory bodies in advertising oversight.
The National Advertising Division and CARU operate outside of government agencies but serve as influential watchdogs within the marketing ecosystem.
Their decisions often shape industry practices and encourage companies to adjust advertising claims before regulators intervene.
In many cases, companies voluntarily agree to follow these recommendations to avoid potential enforcement actions.
In this case, Dorel indicated that it intends to comply with the findings and implement the suggested privacy improvements.
The Future of AI in Parenting Technology
Artificial intelligence is rapidly entering the consumer parenting market.
Developers are exploring systems capable of:
- interpreting infant cries
- analyzing sleep patterns
- monitoring breathing or movement
- predicting developmental milestones
These technologies may provide meaningful assistance to caregivers.
But they also create new regulatory questions about how sensitive family data should be handled.
As AI becomes more deeply embedded in household devices, regulators will likely demand stronger privacy safeguards—especially when children are involved.
The review of Dorel Juvenile Group’s smart bassinets reflects a broader trend in technology regulation.
Innovation in AI-powered consumer products is moving quickly, but privacy protections must keep pace.
In this case, advertising watchdogs confirmed that the company’s AI performance claims were backed by research and technical evidence.
At the same time, the inquiry revealed that even well-intentioned products can fall short when it comes to the strict procedural requirements of child privacy law.
For companies building connected products for families, the message is clear:
Strong security measures are not enough on their own. Transparent privacy notices, verifiable parental consent, and rigorous compliance processes are essential to earning consumer trust in the age of AI-driven parenting technology.