For nearly six decades, NASA has been at the cutting edge of space and aeronautics, and has also been involved in some of the earliest research into virtual reality (VR). Now, the US space agency is working with the Unreal Engine real-time visualisation platform to help it create an incredibly real Mixed Reality replica of the International Space Station (ISS) that provides an ‘out of this world’ environment for its astronauts and engineers.
Already recognised as one of the best-in-class 3D computer game engines, having driven many of the top PC, PlayStation and Xbox titles over the past 20 years, Unreal Engine has increasingly become the tool of choice for those working in automotive, aeronautics and architecture, as well as many other areas where real-time visualisation and ultra-high-fidelity graphical representations are important. Many of these applications are designed to allow people to work in environments that are practically impossible to access for training and development, such as the depths of the North Sea, buildings that have not yet been built or new car models that are no more than design files on a computer.
“The International Space Station is a great example of an environment that is simply not available for training in the real world, but which can be created in Virtual Reality,” says Simon Jones, director of Unreal Engine’s Enterprise division.
As NASA software engineer Matthew Noyes explains: “NASA is always interested in how cutting-edge technology could help our programmes. Creating a truly immersive experience for astronauts is a lot like creating a game. With Unreal Engine, we’ve created a completely immersive, three-dimensional, mixed reality training and development environment that is incredibly lifelike. In basic terms, that means we can put our crew in space while they’re still on Earth.”
NASA’s advanced implementation of VR combines Unreal Engine’s ability to create a realistic virtual world with physical models and room-scale tracking to create an immersive International Space Station experience. In addition to tools and elements of the space station, the physical systems include the Active Response Gravity Offload System, a smart robotic crane that offloads the user’s body weight to make it feel as though he or she is in micro, lunar or Martian gravity. The resulting Mixed Reality system provides the tactile sensations of being in orbit, increasing that vital sense of presence.
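The gravity-offload principle described above can be sketched in a few lines: the crane pulls upwards with just enough force that the wearer’s apparent weight matches the target gravity. This is a hypothetical illustration of the underlying physics only, not NASA’s actual control software; the function name and constants are assumptions for the example.

```python
# Illustrative sketch of the gravity-offload idea behind a robotic crane
# such as the Active Response Gravity Offload System: the upward force
# reduces the wearer's apparent weight to that of the target environment.

EARTH_G = 9.81  # surface gravity, m/s^2
LUNAR_G = 1.62  # m/s^2
MARS_G = 3.71   # m/s^2

def offload_force(mass_kg: float, target_g: float) -> float:
    """Upward force (N) the crane must apply so the user feels target_g."""
    return mass_kg * (EARTH_G - target_g)

# An 80 kg astronaut experiencing simulated lunar gravity:
print(round(offload_force(80.0, LUNAR_G), 1))  # prints 655.2
```

In other words, the crane continuously supports most of the astronaut’s weight, leaving only the fraction that would be felt on the Moon or Mars.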
Applications of the system include training astronauts in maintenance tasks and the use of exercise equipment, helping to design new habitats, and supporting engineering development.
Matthew Noyes concludes: “The environments created by Unreal Engine have allowed us to meet many of our training goals. The more realistic your training feels, the faster you can respond in critical real-world situations, which ultimately can save your life.”
Wider Applications of VR
For Simon Jones, director of Unreal Engine’s Enterprise division, the NASA application is part of the growing use of immersive visualisation, accelerated by the technology becoming much more accessible: “Development engineers can look at the execution of detail areas without having to make separate desktop models. Marketing specialists can create visuals before there is a prototype, or customer experiences that pre-sell before production. All of this means that organisations across a range of sectors are now looking at how they can embed VR within their engineering information strategy,” he says. “So what started life as a high-end computer gaming technology has developed to become an application that accelerates innovation, drives new technology and creates incredible new opportunities.”
For more information, please visit: https://www.unrealengine.com/showcase/how-nasa-trains-astronauts-with-unreal-engine