Hey, Alexa, should I wear the pink or the sparkly dress today?

By Rachel Botsman

I invited ‘Alexa’, also known as the Amazon Echo, into my home for an experiment with my daughter, Grace, 3. Pointing at the black cylindrical device, I explained it was a talking speaker, a bit like ‘Siri’ but different. “You can ask it anything you want,” I said.
“Hello, Alexa,” said Grace. “Will it rain today?” The turquoise rim around the speaker glowed into life. “Currently, it is 60 degrees,” a female voice answered and assured her it wouldn’t rain.
Over the next hour, Grace quickly figured out she could ask Alexa to play her favourite music from the film Sing. She realized Alexa could tell jokes, do maths or give interesting facts. “Hey, Alexa, what do brown horses eat?” Her favourite discovery was that she could tell the assistant to stop with a simple command. "Alexa, shut up," Grace barked in a raised voice. Looking a little sheepish, she asked me if it was okay to be rude to Alexa. Did she think it had feelings or even deserved respect?
By the next morning, Alexa was the first ‘person’ Grace said hello to as she bounded into the kitchen. My preschool daughter, who can’t yet ride a bike, read a book or properly decipher good from bad, had also quickly mastered that she could buy things. “Alexa, buy the movie Frozen,” she said. Of course, Grace had no idea that Amazon, the world’s biggest retailer, was the corporate master behind the helpful assistant.
This simple experiment is a telling illustration of a profound technological shift. It’s easy enough for adults to be coaxed into giving away their trust to a seemingly ‘helpful’ bot cleverly designed by marketing and technology experts. But when it comes to children, few checks and balances exist to deter them from giving away their trust very quickly.
Two days into living with Alexa, something significant happened. “Alexa, what should I do today?” Grace asked nonchalantly. Shortly afterwards came a question about her fashion choice: “Alexa, what should I wear today?” I unplugged the thing.
In April 2017, Amazon launched the Echo Look device, which comes with a camera. In other words, Alexa doesn’t just hear you, it sees you. Its Style Check feature uses machine-learning algorithms to judge our outfit choices, awarding each an overall rating.
Confronting, isn’t it? We’re no longer trusting machines just to do something but to decide what to do and when to do it.
For generations, our trust in technology has resided in a confidence that the technology will do what it’s expected to do – we trust a washing machine to clean our clothes or an ATM to dispense money. But what happens if I, say, step into an autonomous car? I’ll need to trust the system itself to decide whether to go left or right, to swerve or stop. It’s an often-cited example of how technology is enabling millions to take what I call a ‘trust leap’ – when we take a risk and do something new, or in a fundamentally different way.
The artificial-intelligence trust leap, and others like it, raises a new and pressing question: when an automated machine can have so much power over our children’s lives, how do they set about judging whether to trust its intentions?
The next generation will grow up in an age of autonomous agents making decisions in their homes, schools, hospitals and even their love lives. The question for them won’t be, “How will we trust robots?” but “Do we trust them too much?” In our rush to reject the old and embrace the new, children may end up placing too much trust, too easily, in the wrong places.
One of our key challenges is deciding where and when it is appropriate to make trust a matter of computer code. We need to be giving children the tools to judge whether automated machines are trustworthy (or secure) enough to make decisions. Beyond security concerns, the bigger question is whether we can trust these bots to act ethically. Specifically, how do they ‘learn’ what’s right and wrong?
It would be a shame to find ourselves in a world so automated that we depend solely on machines and algorithms to make decisions about whom to trust. That’s a world devoid of the colour and movement born of human imperfection, and, if we take our hands off the wheel too much, possibly even dangerous. It’s humans, with all our wonderful quirks and mutations, that make trust possible – not technology or mathematics.
If we want the upcoming generation to understand that, we need to design for a ‘trust pause’, an interval in which children stop and think before they automatically click, swipe, share or accept. To ask “Are you sure?” And we need to provide them with the knowledge and education that helps them decide: Is this person, information or thing worthy of my trust?
Rachel Botsman is an author, speaker, university lecturer and global expert on trust. Her work examines how technology is transforming human relationships. She is the author of Who Can You Trust? (Penguin Portfolio, 2017) and co-author of What’s Mine Is Yours (Harper Collins, 2010). She teaches the world’s first MBA course on the collaborative economy, which she designed, at the University of Oxford’s Saïd Business School.