One of the biggest influences in our children’s lives is someone we barely know. We asked tech trust expert Rachel Botsman how to welcome smart agents safely into the family...
What is the longest river in the world? ‘Ask Daddy’ is invariably the answer in our household, or, more frequently, ‘Ask Alexa’.
Smart agents and technology are becoming an ever-more integral part of our lives and as their role increases, so too does our parental anxiety, fear and confusion. Just look at the recent Momo Challenge – a fabricated game that involved children receiving a series of increasingly dangerous instructions from an anonymous contact – which had schools sending a warning to parents to be vigilant when supervising their children online.
One of our first instincts is to trust, and those blessed with a secure family do so implicitly. Yet when it comes to artificial intelligence, the reverse is true. Rachel Botsman is a globally recognised expert on how collaboration and trust enabled by digital technologies will change the way we live. The first-ever Trust Fellow at Oxford University and author of Who Can You Trust?, she coined the concept of ‘collaborative consumption’ in her first book, What’s Mine Is Yours, which TIME named one of the ‘10 Ideas that Will Change the World’. Her TED Talks boast more than four million views.
“Children are some of the most vulnerable people in our society when it comes to technology, not only because the commercial agenda of corporations is beyond their comprehension, but because children are innately trusting,” she explains.
“Kids will readily answer questions from new acquaintances, let alone inanimate objects that appear to be trustworthy confidants.
“The challenge for our children is how we can continue to have healthy relationships with ‘real’ people and embrace all the incredible benefits technology has to offer.”
We asked the mother-of-two how to safely integrate these devices at home and how we can use them to improve family life.
Go to www.rachelbotsman.com | www.linkedin.com/in/rachel-botsman
What should we encourage our children to ask smart devices?
If you watch a child play with a smart home assistant such as Alexa, it’s clear that kids naturally ask questions to probe the nature of different devices. They start with things they know. ‘What’s the weather like today?’ ‘What’s the time?’ ‘Can you play music from Frozen?’ And so on. They then move on to trying to figure out Alexa’s personality and how much it really knows. ‘Do you know any pirate jokes?’ or ‘What’s the largest shark in the world?’ These stages are fairly predictable, but then children get quite discerning in different ways. ‘What question would you ask me?’ my son shouted, trying to test the device. My daughter wanted to know who made Alexa. ‘Smart people at Amazon,’ replied Alexa. You could see my daughter looked confused. She paused. ‘Who are these Amazons?’ I tried not to laugh. Rather than give children specific questions to ask, it’s more important to teach them to be curious and a little bit sceptical.
Is it safe to let our children talk to bots?
Navigating the dangers of smart toys can feel very intimidating for parents, especially when they are already overwhelmed by demands for the new Dora the Explorer or Transformer collector’s item. A good test is whether or not the child is ready to understand the risks involved with the toy. You wouldn’t give a child scissors until they are ready to understand that the blades are sharp and that the item needs to be carried in a certain way to prevent injury. Or if you did, you would take the time to explain the dangers and watch carefully as they attempt to cut string or paper. I try to use the same approach when I introduce new tech into my home, and to make clear that when I say ‘no’, it’s because I care. For instance, I observed Grace interacting with Alexa, and when I realised that she was being influenced without her knowledge, I decided to unplug it until we could have a conversation about why ‘Amazon’, and not the ‘Amazons’, is inside the little black box. It’s not about putting Alexa away for good, it’s about setting Alexa aside until Grace is ready. Often when we withhold trust, we are making a decision from a place of care.
Do bots store the information children share? Who are we entrusting our information to?
Do you remember having conversations with your teddy or doll as a child? Imagine if those conversations weren’t imaginary, but your ‘friend’ really replied and was fascinated by the details of your family, habits, deepest secrets and wishes. In 2015 Mattel’s Wi-Fi-enabled Hello Barbie came under fire for recording the conversations of its unwitting young owners. The audio was not only being stored without the child’s knowledge, but the data was shown to be vulnerable to external hackers. The incident sparked crucial conversations about the responsibility of the toy manufacturer for the data it collects and whether or not the toy was, in fact, being used as a surveillance device for advertisers to collect information and more effectively market their collection of toys.
What is the ‘robotic moment’ and how is it affecting family life?
Sherry Turkle, a brilliant professor of the social studies of science and technology at MIT, first coined the term ‘robotic moment’ several years ago to describe the amount of human interaction people are happy to delegate to robots or carry out over phones and computers. The concern is that we’re losing real moments of human connection. A simple example is parents busy texting while they’re with their kids in the playground. I’m totally guilty – the other day I was pushing my daughter on the swing while trying to FaceTime with her cousins in America. It ended up in an unfortunate accident. Why do we feel the need to share these moments virtually? Why can’t we just be present? As Turkle puts it, we are ‘alone together’. The challenge for our children is how we can continue to have healthy relationships with ‘real’ people and embrace all the incredible benefits technology has to offer. It’s a critical question to address now because smart agents will become an integral part of children’s lives.
How can we make children aware of the commercial agenda of these bots?
I’m trying to imagine what my kids would say if I asked them: ‘What is a commercial agenda?’ They’d probably laugh and tell me not to talk about work! Even the idea that companies spend a lot of money to try to sell them something is not something most children understand. Language is crucial when it comes to helping children be more aware of what technology is and how it might influence them. Kids naturally ask questions. Some days, lots of them! A great tip is, when they ask you a question about how something works, to follow it with a question for them. If they feel you’re exploring something together rather than preaching or scaremongering, it’s far more effective. For instance, my son asked the other day how the computer remembers the last thing he worked on. Bingo. Here’s a beautiful opportunity to explore together the inner workings of the machine. Three hours later, Jack knows more about computer memory than I do!
Why do you think many parents are concerned about AI devices?
The fear and confusion are entirely understandable. The challenge in making informed decisions is that there is so much garbled and conflicting information out there. Take something simple like screen time. We hear report after report that too much is bad for kids, and then the Royal College of Paediatrics and Child Health (RCPCH) in the UK releases a report saying there is little evidence that screen time is in itself harmful to a child’s health and that it is impossible to recommend age-appropriate time limits. Huh? No wonder we’re confused.
Part of the problem is the label ‘artificial intelligence’. I mean, you don’t exactly want to eat artificial food. It makes the technology seem fake and imposing at the same time. Plus, you have respected people such as Bill Gates and Stephen Hawking saying AI is going to take over the world. I think if you asked most parents about their concerns around AI, they would be afraid to admit they don’t really know what it is and that their real fear is robots taking their children’s jobs.
Our worries are amplified by viral stories such as the Momo Challenge. It’s important not to allow healthy scepticism around the risks of AI devices to explode into blanket fear or rejection.
What tools can we offer to help them make informed decisions about AI devices?
Taking the time to learn about technology with our children is one of the best steps we can take: curiosity is vital. The more our children understand the nature of technology and the importance of asking questions, the more equipped they will be to make smarter trust decisions in their own lives.
The educational tools available really depend on the age of the child. Some great ones are being designed by educational institutions such as the MIT Media Lab, specifically to help parents and children learn together about the nature of programming and the scaffolding behind robots; Cognimates, for instance, is aimed at children aged seven to ten. I’d love to see more material that introduces the concept of, and questions around, artificial intelligence for children between the ages of four and seven. I’m yet to find a beautiful picture book that creatively introduces computers and bots as human inventions.
What do bots teach us about trust?
Bots teach us a lot about our incredible propensity to adapt to new ideas. When we do something new or differently for the first time, we take what I call a ‘trust leap’. From infancy we take trust leaps by the hour – every time we do something new – whether tasting an ice-cream, taking the first step onto an escalator or scooting to school. In the digital age, we are constantly being asked to take untested trust leaps – new technologies, products or services that take us into uncharted territory. Leaps we think will be huge and challenging, such as hopping in a self-driving car or being looked after by a robot for the first time, come rapidly, and before we think twice, we’ve adapted to the change. It feels normal.
When I interviewed the brilliant philosopher Stephen Cave for my last book, Who Can You Trust?, he spoke a lot about the challenges he anticipated for his young daughters, who are a similar age to my children. One profound idea from our conversation stayed with me: our children will need to learn at what point to interrogate the machine.
I can see it now: my son, Jack, in 2035, 25 years old, sitting in a workplace with a robot, asking it, ‘What do you do?’ and ‘What can’t you do?’ Of course, there is another possible future scenario: the robot is interviewing Jack. At the end of the day, the responsibility for making sure robots are trustworthy and behave well must lie with us.
Do you limit the technology your children are allowed to use?
From a young age, we set the same time every day for our kids to watch TV or play with the iPad. They could choose. A very conscious decision was to resist the impulse to snatch the control or device and turn it off, and instead to make them do it themselves (even if it required a few minutes of quibbling over ten more minutes). My intention has always been to try to teach them how to make conscious choices about how they spend their time and what they choose to play or watch. I’ve found that if we’re consistent, they don’t really question the boundary. It’s when we go away, or they’re with other kids who have different rules, that things go a bit pear-shaped. Don’t get me wrong, it’s tough to be consistent. There are many days when I feel like, ‘Oh, please just take the iPad!’ It’s also tricky when we, say, go out to eat, and they look around and see other kids playing on phones. I find the sight utterly depressing. The entire dinner becomes a negotiation as to why they can’t have the phone.
My son is now coming up for eight, and at school, he’s learning how to touch-type and write at the same time. Some of his homework is on spelling apps. He’s learning how to code in science. So, we’ve had to give him a lot more autonomy to make the right decisions.
Are we more untrusting now than we have ever been? And will our children be even more so?
No, far from it! The problem is not a lack of trust but that we give our trust away too easily. We are living in an age of trust on speed. The primary challenge is how we teach children to slow down before they tap, swipe, click, accept and share; to ask the right questions and find reliable information to decide: is this person, product, content or robot worthy of my trust?