Parenting by Barbie?

AI AI Barbie explores the relationship between parents, children and smart toys

A Barbie doll that teaches a child manners and encourages good behaviour. Sounds like a dream? With artificial intelligence (AI) and the arrival of smart toys, it may become reality. And the reality may not be quite as cosy as the dream. Smart toys can listen in on conversations, and might interfere with parenting in unwelcome ways. They are therefore at the centre of considerable debate about privacy and data protection and, at least as significantly, about the implications for children's development, socialisation and communicative skills. How does AI influence fantasy and play? Can it change a child's relationship with parents or carers? Such questions led SETUP and Waag to carry out research into smart toys. The AI AI Barbie initiative is taking a critical look at the role of AI in the parent-child relationship. Karien Vermeulen, Head of the Learn Programme at Waag's Creative Learning Lab, talks about the project and the hackathon that the project partners recently organised with support from SIDN Fund.

Research by Stefania Druga has shown that children often think that technology is cleverer than they are. They also tend to place a lot of trust in what smart toys tell them. Smart toys and smart assistants, such as Alexa, could therefore influence a child's views on morality. "I wasn't at all comfortable with that idea," says Karien. "What are the implications of outsourcing difficult conversations -- leaving it to a robot to explain about, say, sex?" Imagine your child being ordered around by a doll all day, and copying its bossy behaviour!

AI AI Barbie

A preliminary meeting between Karien and researcher Druga, followed by an ad hoc brainstorming session with SETUP's Jelle van der Ster, ultimately led to the idea of organising a public event. "How best to enable children to make responsible use of new technology is an established theme at Waag. The new initiative is based on the concept of research through action. We want to both promote awareness and explore the possibilities of smart toys."

The hackathon held on 6 June 2019, with SIDN Fund as one of the sponsors, was the focal point of the initiative's 'action' element. Using a playful approach, a group of technical experts, artists, developers, hackers, designers and parents explored the various uses of smart toys. "Doing some AI programming for yourself is very instructive. You get a clearer idea not only of the dangers, but also of the potential of the technology. That's the beauty of the approach," says Karien. A number of 'provotypes' were used to test the extremes of smart toys. One was the MoodyMonster, a monster that prepares children for disappointment. The smart creature leads a child on an adventure, but ensures that the escapade is ultimately disappointing. Another provotype was Best Fake Friend: a doll that listens to a child's problems, then offers highly commercialised solutions, in a way that almost parodies personalised pop-up ads.

"SIDN Fund is supporting the project because more and more 'smart' toys are coming onto the market. Yet there are no clear, reasoned parameters in place to ensure that these toys are responsibly designed. So, for example, we've had the recent furore surrounding Google's smart assistants behaving like spies in the home, listening in on conversations. The rise of smart toys also raises more fundamental questions about parenting. The unique aspect of this project is that the co-organisers, SETUP and Waag, are addressing the issues in a way that is simultaneously investigative and creative. Insight into how these smart devices work is a prerequisite for informed decision-making and contributes to 'responsible' security behaviour," says Valerie Frissen, CEO of SIDN Fund.

Don't ban smart toys

Should we ban smart toys from our homes? Karien doesn't see that as a solution. She favours promoting awareness. "What we should be doing now is asking ourselves what we want life with social robots to be like. We want people to understand what AI entails, and we want to resist the creation of technological black boxes." Realising those aims will require a lot of work. "Our next objective is to dive deeper into the relevant issues. We'd like to do more research on themes such as the influence of technology on communication, socialisation, fantasy and play. Ideally, we'd use a similar approach to the one adopted so far, involving a wide group of people: from artists to scientists, from parents and children to experts in the relevant disciplines. Then research, awareness and debate go hand in hand," Karien argues. Wouldn't it be great to get civil servants creating a smart Furby? Or to ask young people how they envisage life with smart technology? Interested in this initiative and the developments surrounding smart toys, or curious about the other outcomes of the hackathon? Visit Waag and SETUP for details.
