Some days you talk more to Siri, Alexa and Google than to your partner. Could such a new digital relationship even be an advantage during a pandemic?
For many people, voice assistants have long been part of everyday life. They answer questions, help you find information quickly, play music on command or remind you of appointments. But can voice assistants also simulate a human relationship?
Voice assistants are becoming more and more human
Friendly voices and preprogrammed answers to funny or philosophical questions could almost give you the impression of having some kind of relationship. But only almost. When interacting with technology, people like to humanize objects in order to explain processes they would otherwise not understand, says Esther Görnemann from the Vienna University of Economics and Business: “If Cortana doesn’t do what I say, it’s probably because ‘she doesn’t want to’.” Participants in studies report that Alexa is “offended”, “cheeky” or “charming”, or even “a small family member who sits at the breakfast table in the morning”. The tendency towards humanization is particularly pronounced among children.
But there is also a social motive for humanizing objects, says Görnemann. And this is where it becomes interesting with regard to the corona pandemic: “We are trying to compensate for a lack of social ties with other people.” Those who are lonely tend to develop social ties to objects.
In-depth discussions not yet possible
In general, however, you shouldn’t worry if you notice that you talk a lot to a digital assistant, says Prof. Arvid Kappas from Jacobs University Bremen. “We know that solitary confinement is one of the worst things you can inflict on people. If someone has no opportunity to talk to anyone else or to be with others, something like this can happen,” explains the psychologist. In principle, however, you should try to replenish your social interactions by other means, for example by calling real people instead.
Prof. Kappas is not surprised that children, for example, can develop a social relationship with voice assistants and perceive them as real beings: “You don’t think twice when children talk to their teddy bear for a long time and believe that the teddy bear has a soul.” That children are able to have complex interactions with inanimate objects is not a new development. The latest generation of voice assistants simply understands speech much better than before. Nevertheless, we are still a long way from being able to have an in-depth conversation with an assistant.
Esther Görnemann shares this view, but believes that technical progress in the field of artificial intelligence (AI) could soon change this: “With GPT-3 we now have an AI that can formulate amazingly good texts and is surprisingly creative and versatile. Such a good language model is an essential component of a voice assistant with which we can establish a social connection.” It only becomes problematic when people start to replace their social relationships with other people with voice assistants.
Voice assistants cannot replace human relationships
Basically, voice assistants are just another medium for communicating and speeding things up, says Prof. Andreas Dengel, Director of the German Research Center for Artificial Intelligence (DFKI). As a source of emotional comfort, on the other hand, they are of little use.
Among other things, this is because they can only simulate empathy, and only to a limited extent, says Dengel. “People also need negative conversations in order to be able to feel empathy. Interpersonal communication is more complex and multi-dimensional than a conversation with a voice assistant could ever be.”
Despite all the fascination voice assistants hold, children shouldn’t play with them too much, as this could have a negative effect on their communication skills, warns Dengel. “Communication does not consist of language alone; various non-verbal forms of communication are involved, such as facial expressions, gestures or mirroring the other person. And you simply don’t learn that with devices like these.”
Opportunities for Seniors
In addition to the risks, Prof. Kappas also sees opportunities in voice assistants. For older people in particular, they could mean a gain in freedom. A voice assistant can act as a helpful companion, for example by reminding them of appointments or to take their medication, says the psychologist.
“A natural language interface is much better suited to older people who may not be able to type as well or look at a screen,” says Kappas. You can also simply ask the voice assistant to call someone, without having to look up numbers or type. For most people, however, interacting with voice assistants is simply playful in nature.
Be careful what personal information you share
Voice assistants always carry the risk of surveillance, says Görnemann. And: “I see it as problematic that we reveal more personal information when we build a social relationship with our voice assistant. That happens quite involuntarily, and we may not even be aware of it.”
In the background, manufacturers have already developed patents designed to pick out advertising-relevant keywords from voice input, says the researcher. The companies have long been trying to learn as much as possible about their customers and, for example, to deduce which advertising could work and when.
Advertising could then be tailored to situations so individually that you would not even notice your own behavior being manipulated, warns Görnemann. “As long as tech giants examine us down to the smallest detail and this process remains as opaque as it is now, there is a risk that we will behave as the manufacturer wishes without even noticing.”