Returning to Earth after long periods of time in space on the International Space Station can be a strange experience for astronauts. After months in microgravity, even an act as simple as walking may be accompanied by feelings of dizziness, nausea, and vertigo.
Astronauts liken it to the world’s worst hangover as their vestibular systems struggle to re-acclimatize to Earth’s gravity. Some report dropping things, so accustomed are they to letting go of objects and having them float nearby. Others, despite their physical fitness, are surprised by how heavy even a lightweight iPad feels after months of handling tablets in weightlessness.
These physical effects are well-known. But what about the mental side effects of time in isolation, or near-isolation, in space? This is less publicized. But it’s clear that space can have a big impact here, too.
“If you meet [astronauts] after they return, they can act strangely,” Judith-Irina Buchheim, a researcher at the University Hospital of Munich, Germany, told Digital Trends. “It’s quite hard to pinpoint, but you notice that something apparently happens. They may have difficulty in readapting, or they become very thoughtful. We see behavioral changes.”
A smart speaker that really listens
Buchheim’s area of interest involves stress and the physiological changes that can manifest in the human body. For the past several years, she has focused on people voluntarily living and working in isolation, whether that’s astronauts on the ISS or researchers in the Antarctic. She has also been working closely with groups including IBM to help feed these insights into an ambitious project to create a robot assistant to help astronauts in space. Robots, that is, like CIMON (Crew Interactive MObile companioN).
CIMON is an autonomous, free-flying, beach-ball-shaped smart device with a touchscreen smiley face. It was developed by IBM and Airbus, with funding from the German Aerospace Center, and has been deployed on the ISS since 2018. The robot’s latest iteration, CIMON-2, was sent up to the space station at the end of last year aboard a resupply mission.
CIMON assists astronauts with everyday jobs, such as finding objects or documenting procedures. But its creators believe it could do a whole lot more, too. They think it could be a communication platform to keep astronauts company, half-psychotherapist, half-Wilson-in-Cast Away-style buddy. “We’re interested in making a machine that can behave empathically, and can also provide advice,” Buchheim said. “I think a computer that can do that has a lot of potential.”
And a buddy, too
The idea of building a friendly robot for space missions has been a staple of science fiction for years. Star Wars’ R2-D2 and C-3PO, Forbidden Planet’s Robby the Robot, and other fictitious robots have not always been the most efficient or competent of assistants, but they have been lovable support characters in their own right. In other cases, whether it’s Data from Star Trek: The Next Generation or the Arnold Schwarzenegger T-800, they are introduced as functional tools, only for this quality to recede in importance as their “humanity” comes to the fore. Why, then, should real-life astronauts not have access to the same space sidekicks?
“The isolation of being up there [in space] is something that can take a toll on astronauts,” Buchheim said. “But they don’t like to talk about it. We’ve done studies with questionnaires where we ask ‘do you feel stressed?’, ‘do you feel isolated?’, ‘do you feel alone?’, ‘do you feel sad?’, and things like that. Astronauts in general don’t like to report that. Why? Because they’re heroes; they’re very special people. They have been part of a very long selection process. They have to prove that they are healthy, that they are resilient, and that they are able to perform tasks under very, very stressful conditions. So to be honest about themselves, to be honest about psychological problems, is quite a thing for them.”
This was one impetus for the CIMON project’s emotional support research. Plenty of psychological profiling is carried out to ensure that astronauts will be able to work together in space. But there are, inevitably, challenges. Language and cultural differences exist, as does the potential for the squabbles and frictions found in any workplace. Some astronauts may be trained psychologists, but what if the crew’s psychologist needs someone to talk to?
The mission to Mars
As Buchheim explains, in some ways the importance of this research won’t become apparent until the next phase of space exploration. The ISS is close enough to Earth that astronauts can look out of the viewing windows and see their home planet. Communications with the ground station are almost instant. “But once we move further away from Earth — maybe to the moon, but we are also planning interstellar flights — [this could become more significant],” she said.
On a trip to Mars, a round-trip exchange with Earth could stretch to around 30 minutes or more. That means recorded messages to friends and family only. It also means a half-hour wait for feedback in the event of an urgent problem. This, along with more cramped environments for long-haul spaceflight, could place an added strain on team dynamics. Psychological effects like groupthink could wind up excluding particular crew members who struggle to have their voices heard. CIMON, which would not be part of the hierarchy of the crew, would exist as a neutral platform to talk with.
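That half-hour figure falls out of simple light-speed arithmetic: a signal to Mars takes distance divided by the speed of light each way. A quick back-of-envelope check, using approximate Earth-Mars distances:

```python
# Back-of-envelope one-way signal delay to Mars. Distances are
# approximate; the actual Earth-Mars separation varies continuously.
C_KM_PER_S = 299_792  # speed of light in km/s

def one_way_delay_minutes(distance_km: float) -> float:
    """Minutes for a radio signal to cover distance_km at light speed."""
    return distance_km / C_KM_PER_S / 60

# Earth-Mars distance ranges from roughly 55 million to 400 million km.
closest = one_way_delay_minutes(55e6)    # about 3 minutes one way
farthest = one_way_delay_minutes(400e6)  # about 22 minutes one way
# A question-and-answer round trip at the far end approaches 45 minutes.
```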
“CIMON could be a platform that [astronauts] like to talk to because he’s not part of the team,” Buchheim said. “If an astronaut doesn’t want to share personal feelings — for example, if it’s a team leader who doesn’t want to show weakness or open up to the team that [they are] afraid of what might happen — maybe it’s easier for this person to share the problem with CIMON and keep the data files like a diary.”
A robot psychotherapist … in the 1960s?
The idea that a highly trained adult might find solace in speaking to artificial intelligence could sound unlikely or even dismissive of the complexity of actual human communication. But there’s plenty of reason to believe it could help.
In the 1960s, computer scientist Joseph Weizenbaum at the Massachusetts Institute of Technology created ELIZA, a prototypical chatbot designed to function as a computer psychotherapist. ELIZA was intended to engage users in seemingly intelligent conversations via text. Users were asked to type in sentences, which ELIZA would then reflect back to them, either questioning or supporting their statement. Although ELIZA had no actual understanding of the topics being discussed, and was simply following a template, Weizenbaum expressed surprise that students in his lab would willingly pour their hearts out to the program. What was meant, in some senses, as a wry parody of real psychotherapists wound up being the recipient of serious concerns about everything from failing classes to broken relationships.
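Weizenbaum’s template trick can be captured in a few lines. The patterns and canned replies below are invented for illustration, in the spirit of ELIZA’s famous DOCTOR script rather than a reproduction of it:

```python
import re

# Illustrative reflection rules. Each pattern captures part of the
# user's sentence and echoes it back as a question, creating the
# illusion of understanding without any real comprehension.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def respond(sentence: str) -> str:
    """Return the first matching reflection, or a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(match.group(1).rstrip(".!"))
    return "Please go on."
```

Even with only a handful of rules like these, the back-and-forth can feel surprisingly personal, which is exactly what caught Weizenbaum off guard.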
More than half a century later, these technological capabilities have advanced significantly. High-level voice recognition is found in just about every smartphone thanks to breakthroughs in machine learning. Emotional analysis through image recognition is increasingly prevalent. And projects like IBM’s Project Debater, capable of holding its own against expert human debaters, are far beyond what Joseph Weizenbaum could have imagined in the 1960s.
CIMON also draws on IBM’s Watson Tone Analyzer, which can pull out emotive words from the robot’s conversations with astronauts and use them to modify its responses. An astronaut saying that they feel bad will result in a different line of conversation than them saying they would like to have some fun.
Making CIMON empathic
“We want CIMON to be empathic,” Buchheim said. “Seven tones is not much, but it is the beginning. Anger, fear, sadness, being distressed, being inquisitive — all of these tones can be detected, and they will trigger a response in CIMON.”
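The idea of tone-conditioned replies can be sketched as a simple keyword lookup. This is a toy stand-in, with invented keyword lists and responses, not the Watson Tone Analyzer’s actual API or model:

```python
# Toy sketch of tone-conditioned replies. Keywords and responses are
# illustrative only; the real Watson Tone Analyzer is a cloud service
# backed by a far richer machine-learning model.
TONE_KEYWORDS = {
    "sadness": ["sad", "bad", "lonely", "down"],
    "joy": ["fun", "great", "happy"],
    "fear": ["afraid", "scared", "worried"],
}

TONE_RESPONSES = {
    "sadness": "I'm sorry to hear that. Do you want to talk about it?",
    "joy": "Glad to hear it! Shall we put on some music?",
    "fear": "That sounds stressful. What's worrying you?",
}

def detect_tone(utterance: str) -> str:
    """Return the first tone whose keywords appear in the utterance."""
    words = utterance.lower().split()
    for tone, keywords in TONE_KEYWORDS.items():
        if any(word.strip(".,!?") in keywords for word in words):
            return tone
    return "neutral"

def reply(utterance: str) -> str:
    """Pick a response matching the detected tone, else a neutral prompt."""
    return TONE_RESPONSES.get(detect_tone(utterance), "Tell me more.")
```

The point is the branching, not the detection: once a tone is identified, by whatever means, it steers the assistant toward a different line of conversation.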
Unlike ELIZA, CIMON aims to provide input into conversations, rather than simply be a passive listening machine whose only activity is to prompt further insights from the user. “A diary function is nice, but at the end of the day you want to get something back,” Buchheim said. “You want to get advice, like you would from a good friend.”
There’s still much to do, but the team believes it is developing technology that could be a game-changer. And who knows? As the project goes on, Buchheim said, the insights could be used to inform technologies closer to home that could help those who are stressed or living in isolation. (Which, let’s face it, is a whole lot of people right now!)
“We’re always trying to think of ways to transfer this knowledge to Earth,” she said.