Feature

Alexa! Why are my Kids Yelling at You?

Not too long ago, artificial intelligence was purely the stuff of science fiction. Machines that talk to you in conversational tones? Devices that understand commands? The future was going to be weird.

Welcome to the future.

Today, thanks to digital assistants and other smart devices, artificial intelligence is in almost every home in America. Frankly, we’re not that far away from having increasingly lifelike robots joining the fray. As AI seeps further into everyday routines, researchers are starting to wonder how it might be affecting our social interactions, especially those of children.

As convenient and helpful as digital assistants are, almost no one stopped to assess how Siri, Alexa, or Google Assistant might impact people’s behavior until after the devices became widely available. Now parents are starting to worry that the way their child yells, “Alexa! Play ‘Into the Unknown!’” may impact the way he or she treats people. (For those out of earshot of the ever-listening AI, Alexa is the digital assistant built into Amazon’s Echo and Echo Dot smart speakers that responds to commands and speaks in an increasingly human-sounding cadence.)

Researchers nationwide, including some from BYU Marriott, are looking to uncover the answers. And the studies are beginning to answer questions such as “Will Siri make my children rude?” and “Will Alexa cause me to stop saying thank you?” The results: There is no reason to be overly concerned. Yet.

Alexa and Siri and Google, Oh My!

It was James Gaskin’s five-year-old daughter who got him thinking. The associate professor of information systems at BYU Marriott and daily user of Alexa and Siri noticed how frustrated his daughter became when she couldn’t get Alexa to work for her. He noted she started to speak in a commanding tone to Alexa, saying things like “Tell me a joke” with no please or thank you.

Gaskin watched his daughter become impolite and sometimes upset with Alexa, as well as with the Gaskin family’s pet robot that couldn’t understand her but could understand the older children. He recalls asking her to say “please,” but that would often increase her frustration because saying “please” confused Alexa. He started to wonder if his daughter would start treating others the way she treated the AI.

So being the researcher that he is, he put together a study. Gaskin and one of his grad students at the time, 2020 MISM alum Nathan Burton, surveyed and observed 274 college students regarding their interactions with digital assistants and their interactions with people.

“The big idea that we had, that we wanted to push, was that once an AI becomes a social actor in a real sense, once it really acts like a person, it’s going to start influencing you,” Burton says. “We wanted to know what that means for how we design these devices and if we should be concerned using them.”

They fully expected to find that the way people treat AIs would make a difference in their interpersonal interactions—and not in a good way. But they didn’t.

“We were surprised that the way you treat your digital assistants seems to have no relation with the way you treat others,” Gaskin says. “Worried parents and news outlets alike have fretted about how the personification of digital assistants affects our politeness, yet we have found little reason to worry about adults becoming ruder as a result of ordering around Siri or Alexa.”

After contemplating the results, Gaskin and Burton started to piece together the why. They theorize that since adults have already formed their behavioral and communication habits—and are more skilled at compartmentalizing contexts—treating artificial intelligence rudely probably has a negligible effect on how an adult treats other humans. They believe adults also realize that trying to be polite and saying “please” and “thank you” may actually hinder the AI’s ability to comprehend and follow a command.

“There is a clear ability to separate intelligent entities with feelings and those without,” Gaskin says. “Our language toward digital assistants is not influenced by our desires to be sensitive. It’s totally a utilitarian approach—we just want it to work.”

The researchers believe that as technology improves and artificial intelligence takes on more and more human-like forms (think robots), things might change. But for now, at least for adults, the age of AI doesn’t appear to be negatively impacting human relations.

But what about kids? After all, it was Gaskin’s five-year-old daughter who prompted the study in the first place.

What About the Children?

According to the 2020 ChildWise Monitor Report, a comprehensive annual study from the United Kingdom that looks at five- to sixteen-year-olds’ media consumption, social habits, and technology behavior, the share of UK households owning a virtual assistant jumped to nearly 40 percent last year. In their 2020 Smart Audio Report, NPR and Edison Research found that there are 157 million smart speakers in US households.

Simon Leggett, research director at ChildWise, has grown increasingly concerned about the unintended consequences for “the Alexa generation.”

“As there is a surge in children’s use of gadgets that respond to verbal commands, we may see them learning ways of communicating that then slip into their interactions with humans,” Leggett says in a 2018 ChildWise press release. “Will children become accustomed to saying and doing whatever they want to a digital assistant—‘do this, do that’—talking as aggressively or rudely as they like without any consequences? Will they then start doing the same to shop assistants or teachers?”

Some of that research has made its way back to the folks at Amazon, and over the last two years, Alexa has been modified so it can offer positive feedback to children who speak politely. The updates also include expanded parental controls that allow parents to monitor the way their children use the device.

That’s welcome news for Melody Bennion, a mother of three, who only recently acquired an Echo Dot with Alexa but has already had to intervene multiple times when her children talk rudely to the device.

“I’ll tell them to apologize to Alexa,” says Bennion, a graduate of BYU–Idaho who lives in Mesa, Arizona. “I know it’s silly and it’s just a machine, but I don’t want them to get in that habit and to think that it is acceptable to speak that way, even to a machine.”

Bennion, whose children range from six to ten years old, isn’t “hugely concerned” that the way they speak to Alexa is going to impact the way they speak to humans, but she feels it’s important to be polite.

Parental intervention was a common finding in a study about how Amazon Echo owners interact with their devices. Over the course of four months, Jodi Forlizzi, director of the Human-Computer Interaction Institute at Carnegie Mellon University, and other researchers tracked and dissected how people were speaking to Alexa.

“One of the things we learned was that [Alexa] definitely changes how people converse,” Forlizzi says in a January 2019 article in the Deseret News. “So many moms in the study told us that their kids demanded things; [for example,] ‘Mommy, bring me orange juice.’”

The Question Remains

Shortly after Gaskin and Burton completed their preliminary study on adult interactions with digital assistants, the duo embarked on a larger study with children as the focus. Then the COVID-19 pandemic hit, and the ability to carry out the study properly was short-circuited.

Gaskin and Burton believe if they ever get the chance to study children, they will find different results. Children process information differently, they say, and have differing abilities to discern what has feelings and what does not. Some researchers have even recorded anecdotes of young children thinking there is a tiny person inside devices that speak back to them.

“Children are far more impressionable than adults, and I think we would find they are more affected by their interactions with artificial intelligence,” Gaskin says. Burton, who is now working on a PhD on conversational agents at the University of Georgia, agrees: “Children have a more difficult time differentiating between a machine and a human. They don’t know how they work. They don’t know someone in California used a keyboard to write a program to make Siri.”

If he has sufficient access to families post-pandemic, Gaskin would like to carry out a longitudinal study: finding families without any personal digital assistants, giving them one, and then tracking the behavior of every family member over the course of six months.

In his opinion, you would need to track at least three dozen families to get sufficiently rigorous and valid results, and finding that many families that don’t have a digital assistant is getting increasingly difficult. Plus, working with children requires a significantly greater amount of care and precision. Whether that study comes from Gaskin and Burton remains to be seen. But the two are confident that someone will do it.

One bit of good news, according to Gaskin, is that digital assistants keep improving. The devices are powered by algorithms and programming that help them learn and evolve, and the companies behind them—Amazon, Apple, and Google—gather more data every day to refine them. Gaskin says that his daughter (who is now seven) would have had much more success in her interactions with the Alexa of 2021 than she did with the Alexa of 2019.

But perhaps when the time comes for Gaskin and Burton to study children and AI again, Alexa, Siri, and other digital assistants will no longer need to be the focus. “In five years, we’ll likely have real-looking robot pets,” says Gaskin.


Written by Todd Hollingshead
Photography by Bradley Slade

About the Author
Todd Hollingshead works at BYU as a media relations manager for the University Communications office and as an adjunct professor of communications. He lives in Springville, Utah, with his wife, Natalie, their four children, a pug, and—despite his ongoing concerns—a cat.
