Max Lell is currently working with the Austrian Space Forum on his master's thesis about the Aouda spacesuit simulator. In this week's blog, he explains his research.
How did you end up here, and what is it like to work at the OeWF?
My background is in mechatronics, and I was looking for a master's thesis that is both interesting and challenging. Since I have always been interested in space-related research, I got to know the OeWF at the "Lange Nacht der Forschung" (Long Night of Research) in Innsbruck. They had an open project on a gesture control interface (explained below), so I decided to join them, and I have to say: working here is great.
Why a gesture control system?
A gesture control interface offers many advantages when properly deployed. For instance, suppose you want to record a data log point. Taking a data log point is like taking a note of the current situation: in the Aouda.X suit, all currently relevant parameters are stored. The camera installed on the suit takes a picture, and temperature, humidity, GPS and other sensor readings are saved. In the field, you would use it whenever you want to "remember" the current situation for later.
You see a promising rock formation, the soil changes in a way you didn't expect, or something looks worth further investigation or later analysis in the lab? Take a note.
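To make the idea of a data log point a bit more concrete, here is a minimal sketch in Python of what such a record could look like. The field names and the `capture_datalog_point` helper are my own illustration, not the actual Aouda.X software.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class DataLogPoint:
    """One 'note' of the current situation (illustrative only)."""
    timestamp: datetime
    image_file: str           # path of the picture taken by the suit camera
    temperature_c: float      # temperature in degrees Celsius
    humidity_pct: float       # relative humidity in percent
    gps_position: tuple       # (latitude, longitude) fix
    extra_sensors: dict = field(default_factory=dict)  # any further readings


def capture_datalog_point(suit) -> DataLogPoint:
    """Collect the currently relevant parameters from a hypothetical suit interface."""
    return DataLogPoint(
        timestamp=datetime.now(timezone.utc),
        image_file=suit.camera.take_picture(),
        temperature_c=suit.sensors.temperature(),
        humidity_pct=suit.sensors.humidity(),
        gps_position=suit.sensors.gps_fix(),
    )
```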
Currently, in order to take a "note", you need to:
- open the Chest Pack
- press the mechanical buttons to navigate through a menu (like the setup menu on your digital camera)
- search for the entry and activate it
All of these steps take much longer than simply moving your hand in a specific pattern. Since operation time is a critical resource on each mission, it is important not to waste it scrolling and searching through menus.
Another application might be to integrate gesture navigation into a map application. Instead of pressing and selecting the right buttons, you simply perform gestures with your hands that you already know from your smartphone (like zooming in or out), and the map responds accordingly.
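As a rough illustration of how recognised gestures could drive a map view, here is a small dispatch table. The gesture names and the `MapView` class are assumptions made for this sketch, not part of the real onboard software.

```python
class MapView:
    """Toy map view used only to illustrate gesture-driven navigation."""

    def __init__(self):
        self.zoom_level = 10

    def zoom_in(self):
        self.zoom_level += 1

    def zoom_out(self):
        self.zoom_level = max(1, self.zoom_level - 1)


def handle_gesture(gesture: str, map_view: MapView) -> None:
    """Dispatch a recognised gesture name to the corresponding map action."""
    actions = {
        "pinch_open": map_view.zoom_in,    # like spreading two fingers on a phone
        "pinch_close": map_view.zoom_out,  # like pinching two fingers together
    }
    action = actions.get(gesture)
    if action is not None:
        action()


map_view = MapView()
handle_gesture("pinch_open", map_view)   # zoom_level is now 11
```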
How does it work? A short system overview
Several sensors are placed on the astronaut's hands to detect their movements. When the astronaut performs a gesture, an algorithm interprets the sensor signals and outputs a unique gesture. In reality, the whole system is more complex: the sensors only provide noisy signals, which have to be transferred through several components, filtered, and prepared so that a machine learning classifier can process them in real time.
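As a highly simplified sketch of that pipeline (not the actual Aouda.X implementation), the following Python example smooths a noisy hand-motion signal with a moving-average filter, extracts a few basic features per window, and feeds them to a scikit-learn classifier. The training data here is random and only stands in for real recorded gestures.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier


def smooth(signal, window=5):
    """Simple moving-average filter to suppress sensor noise."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")


def extract_features(window):
    """A few basic statistics describing one window of motion data."""
    return [window.mean(), window.std(), window.min(), window.max()]


# Hypothetical training data: each row is one recorded gesture window,
# labels name the gesture that was performed (e.g. 0 = "log point", 1 = "zoom").
rng = np.random.default_rng(0)
raw_windows = rng.normal(size=(40, 50))          # 40 gestures, 50 samples each
labels = np.repeat([0, 1], 20)

features = np.array([extract_features(smooth(w)) for w in raw_windows])

classifier = RandomForestClassifier(n_estimators=50, random_state=0)
classifier.fit(features, labels)

# At runtime, a new window of filtered sensor data is classified the same way:
new_window = smooth(rng.normal(size=50))
predicted_gesture = classifier.predict([extract_features(new_window)])[0]
print("Recognised gesture class:", predicted_gesture)
```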
I also need to consider the environment Mars provides. For example, I have to think about the dust on the Red Planet: dust in a mechanical system is anything but helpful. These and other factors must be taken into account, but all in all it is a lot of fun to design a human-machine interface for such interesting conditions.