The objective of this subtopic is to develop and mature extended reality (XR) technologies that can support NASA's goal of a sustained presence on the Moon, the exploration of Mars, and the subsequent human expansion and exploration across the solar system. NASA’s current plans call for boots on the surface of the Moon in late 2024. Over time, lunar, Mars, and other solar system exploration missions will be much longer and more complex, and will face more challenges and hazards than the Apollo missions did. These new missions will require that astronauts have the very best training, analysis tools, and real-time operations support tools possible, because a single error during task execution can have dire consequences in the hazardous space environment. Astronauts will also be required to function more autonomously than they have previously. Technologies such as XR that improve training, operations support, health and medicine, and collaboration provide capabilities that were not previously available while also enhancing a crew's ability to carry out activities more autonomously.
Training and operations support during the Apollo era required physical mockups in labs, large hangars, or outdoor facilities. These training modalities had inherent distractors in the background environment, including observers, trainers, cameras, and other objects, which reduced the immersiveness and overall efficacy of the training. Studies show that the more “real” a training environment is, the better the training transfers, because realism improves “muscle memory,” which is critically important, especially in hazardous environments. XR systems can mitigate the distractors posed by observers, trainers, background visuals, etc., which was not possible in Apollo-era environments. The virtual environments that can be created are so lifelike that it can be extremely difficult to determine whether one is looking at a photograph of a real environment or a screen capture of a digitally created scene. XR systems also allow for training that is typically too dangerous (e.g., evacuation scenarios that include fire, smoke, or other dangerous chemicals), too costly (e.g., buildup of an entire habitat environment with all of its subsystems), or not physically possible (e.g., incorporation of large-scale environments in a simulated lunar/Mars environment), and they are far easier and more cost effective to reconfigure for different mission scenarios (i.e., it is easier, quicker, and less expensive to modify digital content than to create or modify physical mockups or other physical components). Industry is using next-generation digital technologies to create XR-based digital twins that facilitate Product Lifecycle Management (PLM).
Most of all, an XR-Based Digital Ground Replicate of the physical systems can serve as a common medium (i.e., a “window”/viewpoint into the actual system) for communication among stakeholders in different locations, who can share and interact within the same virtual workspace simultaneously.
Industry has been using XR successfully for gaming, and although there are some enterprise-level applications, many opportunities in this domain remain unrealized. NASA has a long history of developing and using XR for training, operations, engineering, collaboration, and human performance applications. These are prime areas where NASA can help guide the development of XR technologies for enterprise applications that could provide significant benefits to industry, other government agencies, and companies forging into the space industry domain.
Furthermore, as industry/academia develops XR technologies, the way NASA uses XR will also change. Previously, NASA needed to develop all the required software and hardware itself, but industry is now developing XR technologies that allow NASA to focus on specific use cases. We have also identified several gaps/needs in the XR domain that could further support XR use across NASA, so working with industry/academia to address these gaps would be beneficial to NASA.
XR for Health Care and Health Management
Scope Description:
In the upcoming stages of human exploration, astronauts will embark on missions that take them deeper into space and require them to spend extended durations away from Earth. Consequently, these missions will require astronauts to exhibit a higher degree of autonomy. A critical facet of this increased autonomy revolves around healthcare, as astronauts must be capable of managing healthcare situations with limited Earth-based support. Technologies that allow astronauts to do this will become critically important.
The healthcare industry at large is actively using Extended Reality (XR) as a solution for the planning, training, and real-time support of health-related activities. This field has experienced substantial growth in recent years and currently represents a $7 billion industry. XR offers a range of applications, including enhancing the safety and efficiency of surgical planning and execution by providing 3D perspectives from multiple angles. It streamlines the planning process by seamlessly integrating imaging with immersive 3D content. It enables medical students to practice procedures and familiarize themselves with medical instruments in a lifelike digital environment that is both immersive and cost-effective. It is enabling the transition from traditional cadaver-based training to digital human simulations that allow students to engage in repeated training sessions from the convenience of their homes or any location with a computer. It also allows complications to be simulated during practice sessions to a degree that has not been possible before. Furthermore, XR technology is making inroads into pain management and is being used to support mental health services.
Combining the healthcare industry's XR proficiency with NASA's extensive experience in human spaceflight medicine has the potential to enable astronauts to operate more independently and effectively manage medical emergencies in the challenging space environment.
Expected TRL or TRL Range at completion of the Project: 2 to 5
Primary Technology Taxonomy:
- Level 1: TX 11 Software, Modeling, Simulation, and Information Processing
- Level 2: TX 11.6 Ground Computing
Desired Deliverables of Phase I and Phase II:
- Research
- Analysis
- Prototype
- Hardware
- Software
Desired Deliverables Description:
Phase I awards will be expected to develop theoretical frameworks and algorithms and demonstrate the feasibility (TRL 3) of the overall system (both software and hardware). Phase II awards will be expected to demonstrate the capabilities with the development of a prototype system that includes all the necessary hardware and software elements (TRL 6).
As appropriate for the phase of the award, Phases I and II should include all the algorithms and research results clearly depicting metrics and performance of the developed technology in comparison to state of the art (SOA). Software implementation of the developed solution along with the simulation platform must be included as a deliverable.
State of the Art and Critical Gaps:
Currently, NASA relies on limited training performed before flights, support from a flight surgeon on console, or manuals to carry out health care of crew during missions. As mission durations increase and the distances traveled grow, crews must become more autonomous in managing their health and addressing any medical situations that arise. The commercial health care industry is already leveraging some of these technologies in day-to-day operations, and NASA could benefit significantly from incorporating some of those technologies and concepts.
Novel concepts related to real-time photorealistic visuals, markerless tracking of people/instruments, human interface systems (including haptics/mixed reality), and wearable XR devices could provide a system that provides significantly more capabilities than are currently in use.
Relevance / Science Traceability:
XR technologies can facilitate many missions, including those related to human space exploration. The technology can be used during the planning, training, and operations support phases. The Exploration Systems Development Mission Directorate (ESDMD), Space Operations Mission Directorate (SOMD), Space Technology Mission Directorate (STMD), and Science Mission Directorate (SMD), as well as the Artemis and Gateway programs, could benefit from this technology for various missions. Furthermore, the crosscutting nature of XR technologies allows them to support all of NASA’s Directorates.
https://www.nasa.gov/directorates/heo/index.html
https://www.nasa.gov/directorates/spacetech/home/index.html
https://science.nasa.gov/
https://www.nasa.gov/specials/artemis/
https://www.nasa.gov/gateway
This type of capability would enable the development of immersive systems that could support planning, analysis, training, and collaborative activities related to surface navigation for Artemis missions.
Holodeck Technologies for XR
Scope Description:
One particularly notable technology from Star Trek was the holodeck, which served multiple purposes. It functioned as a data analysis environment and planning tool, offering 3D visualization of data, models, and simulations. The holodeck was also used as a training environment, providing crew members with simulated environments for training. It served as an engineering design tool as well, allowing crew members to create and manipulate 3D models of objects and systems piece by piece before their fabrication. Furthermore, it provided a recreational environment, simulating various locations for crew members to visit.
While today's technologies may not replicate the full range and fidelity of capabilities seen in a Star Trek holodeck, there are Extended Reality (XR) technologies that allow for the creation of a "holodeck-like" system that could provide NASA with significant benefits. Some of these technologies include:
- Multi-user participation - This includes being able to have two or more individuals participating in the same immersive environment.
- Mobility and markerless tracking - This includes determining the position/orientation of the torso, limbs, fingers, and other items in the physical environment while carrying out activities in a much smaller physical space than the virtual space.
- Human interfaces - This includes methods by which users can interact with the immersive environment or other users in the system. This can also include interacting with artificial intelligence (AI) and machine learning (ML) agents and/or other intelligent systems.
- Sensory - This includes improvements to the visuals and the incorporation of haptics (full body or limb/finger), acoustics, and olfactory cues. It also includes the possibility of incorporating temperature control, wind, etc., into the overall physical experience.
- Wearable or projection XR display systems - Visual display quality (resolution, field of view, refresh rate, etc.), comfort for prolonged use, complexity of getting the system operational, and costs to implement are important in this area.
- Dynamic scene generation - This includes the ability to augment the immersive scene (in real time) with new content that is relevant to the current state of the simulation.
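Two of the capabilities above, multi-user participation and dynamic scene generation, can be illustrated with a minimal shared-scene-state sketch. This is purely conceptual: the class and field names are assumptions for illustration, not an existing XR API, and a real system would stream poses from markerless tracking at frame rate over a network transport.

```python
# Illustrative sketch of shared scene state for a multi-user XR session.
# All names (SharedScene, Pose, etc.) are hypothetical, not an existing API.
from dataclasses import dataclass, field

@dataclass
class Pose:
    x: float        # position in meters
    y: float
    z: float
    yaw: float = 0.0  # heading in degrees

@dataclass
class SharedScene:
    """Authoritative scene state that every participant's client mirrors."""
    users: dict = field(default_factory=dict)    # user_id -> Pose
    objects: dict = field(default_factory=dict)  # object_id -> Pose
    revision: int = 0                            # bumped on every change

    def update_user(self, user_id: str, pose: Pose) -> None:
        # Markerless tracking would feed poses here each frame.
        self.users[user_id] = pose
        self.revision += 1

    def spawn_object(self, object_id: str, pose: Pose) -> None:
        # Dynamic scene generation: inject new content at runtime.
        self.objects[object_id] = pose
        self.revision += 1

# Two participants share one scene; new content appears for both.
scene = SharedScene()
scene.update_user("crew_1", Pose(0.0, 0.0, 1.7))
scene.update_user("crew_2", Pose(2.0, 0.0, 1.6))
scene.spawn_object("hab_module", Pose(5.0, 5.0, 0.0))
```

The revision counter stands in for whatever synchronization mechanism keeps distributed participants' views consistent; the design choice of a single authoritative state is one common approach, not the only one.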
The integration of the XR technologies above, combined with NASA’s extensive experience innovating and conducting cutting-edge research in human spaceflight, science, aeronautics, and engineering, could lead to the development of next-generation training systems, applications that improve real-time mission support, a framework that facilitates the engineering design process, and tools that enhance our ability to visualize and analyze complex data. These systems and tools can improve NASA’s risk posture, reduce costs, and provide capabilities not previously possible. These systems are relevant not only to NASA but also to the broader commercial industry.
Expected TRL or TRL Range at completion of the Project: 2 to 5
Primary Technology Taxonomy:
- Level 1: TX 11 Software, Modeling, Simulation, and Information Processing
- Level 2: TX 11.6 Ground Computing
Desired Deliverables of Phase I and Phase II:
- Research
- Analysis
- Prototype
- Hardware
- Software
Desired Deliverables Description:
Phase I awards will be expected to develop theoretical frameworks and algorithms and demonstrate the feasibility (TRL 3) of the overall system (both software and hardware). Phase II awards will be expected to demonstrate the capabilities with the development of a prototype system that includes all the necessary hardware and software elements (TRL 6).
As appropriate for the phase of the award, Phases I and II should include all the algorithms and research results clearly depicting metrics and performance of the developed technology in comparison to state of the art (SOA). Software implementation of the developed solution along with the simulation platform must be included as a deliverable.
State of the Art and Critical Gaps:
The three most used XR immersion methods are projection-based systems, head-mounted displays, and flat screens.
- A cave automatic virtual environment (CAVE) is a type of projection system that can provide users with immersive content. The CAVE consists of a room with multiple projectors displaying content on the walls. Glasses worn by the users provide stereo capabilities. Users usually interact with some of the displayed content via wand-like controllers. The major drawbacks of this type of system are that the scene is not customizable to individuals (everyone has to be in the same virtual area), the ways users interact with the system are typically very rudimentary, the nonvisual sensory feedback incorporated is usually limited, significant space is required to implement the system, and the cost to deploy it is high.
- A head-worn display can provide highly immersive visualization for multiple concurrent users and supports having users at different locations. These types of devices can usually provide higher resolution and brighter displays, do not require a large space to use, and cost less to implement than CAVE-like systems. For those reasons, these types of systems have become very popular. Limitations of this type of system include the difficulty of showing immersive content to a large number of people concurrently and the discomfort the devices can cause during prolonged wear.
- A flat display (computer monitor, tablet, smartphone, etc.) provides accessibility by a large number of people, can be very portable (smartphones/tablets), and thus is the most widely used. The biggest limitation is the level of immersion that can be provided.
Beyond addressing the limitations mentioned above, these types of systems could benefit from integrating some of the holodeck technologies mentioned previously.
Relevance / Science Traceability:
XR technologies can facilitate many missions, including those related to human space exploration. The technology can be used during the planning, training, and operations support phases. The Exploration Systems Development Mission Directorate (ESDMD), Space Operations Mission Directorate (SOMD), Space Technology Mission Directorate (STMD), and Science Mission Directorate (SMD), as well as the Artemis and Gateway programs, could benefit from this technology for various missions. Furthermore, the crosscutting nature of XR technologies allows them to support all of NASA’s Directorates.
https://www.nasa.gov/directorates/heo/index.html
https://www.nasa.gov/directorates/spacetech/home/index.html
https://science.nasa.gov/
https://www.nasa.gov/specials/artemis/
https://www.nasa.gov/gateway
XR Usage for Human Performance Applications
Scope Description:
Industry is using a combination of XR, biometrics, and AI/ML to create applications that monitor and enhance human performance. Examples within industry include personalized training, real-time monitoring, stress and cognitive load management, task assistance, and cognitive improvement. By merging NASA's extensive expertise in human-performance-focused training and operations with the cutting-edge research conducted by industry in XR, biometrics, and AI/ML, we can facilitate the development of next-generation training, planning, and operations support systems. Not only would NASA benefit from these systems, but industry would be able to leverage the systems created for NASA to develop variants for products aimed at the general public.
Key technologies of interest in this domain include:
- Multimodal sensor data integration into an XR system. This includes data from wearable and nonwearable biometric devices. The goal is to minimize the number of biometric sensors required for the task and make the overall system more reliable and easier to use. This includes both optical and non-optical-based biometric systems.
- Cognitive state determination system. An example of this is an adaptable human interface that can dynamically modulate the XR content based on a person's cognitive state. If the system detects that a person is highly stressed, confused, or about to go into cognitive overload, then it could dynamically modulate the content and activities being carried out to reduce the cognitive workload. If the system detects that the person is bored or in a low cognitive workload state, then it would provide more engaging content. The aim is to keep users in the cognitive workload "Goldilocks zone" while learning or operating. This means that the XR system needs to be extremely configurable and able to create and insert new on-demand content of varying fidelity levels into the scene in real time.
- Physiological state determination system. Along with modulating XR content based on a person’s cognitive state, the system could modulate XR content based on a person’s physiological state. If the user is showing signs of high physical fatigue, the system could modulate the content to reduce the physiological workload required to continue. If the user is showing signs of boredom, the system would respond by increasing the exertion required.
- XR-Based Advanced Object Recognition. This system can support navigation in complex environments by combining concepts related to edge detection and AI/ML to identify partially occluded objects in the field of view and provide full object views to a person wearing a headset. The best way to provide this information is still an open area of applied research.
Expected TRL or TRL Range at completion of the Project: 2 to 5
Primary Technology Taxonomy:
- Level 1: TX 11 Software, Modeling, Simulation, and Information Processing
- Level 2: TX 11.6 Ground Computing
Desired Deliverables of Phase I and Phase II:
- Analysis
- Research
- Prototype
- Hardware
- Software
Desired Deliverables Description:
Phase I awards will be expected to develop theoretical frameworks and algorithms and demonstrate the feasibility (TRL 3) of the overall system (both software and hardware). Phase II awards will be expected to demonstrate the capabilities with the development of a prototype system that includes all the necessary hardware and software elements (TRL 6).
As appropriate for the phase of the award, Phases I and II should include all the algorithms and research results clearly depicting metrics and performance of the developed technology in comparison to state of the art (SOA). Software implementation of the developed solution along with the simulation platform must be included as a deliverable.
State of the Art and Critical Gaps:
There are many small businesses currently involved in the XR space, developing XR technologies that are unique, innovative, and proving to be very useful for a wide variety of applications. These companies are making good progress advancing the state of the art in this field. The scope and funding defined in the call are such that small businesses can select a specific technology area and approach to address part of the overall challenge. Funding small businesses to further develop XR capabilities of interest would provide them with additional technologies to include in the applications they are developing, which could also be used to support many NASA applications.
Relevance / Science Traceability:
XR technologies can facilitate many missions, including those related to human space exploration. The technology can be used during the planning, training, and operations support phases. The Exploration Systems Development Mission Directorate (ESDMD), Space Operations Mission Directorate (SOMD), Space Technology Mission Directorate (STMD), and Science Mission Directorate (SMD), as well as the Artemis and Gateway programs, could benefit from this technology for various missions. Furthermore, the crosscutting nature of XR technologies allows them to support all of NASA’s Directorates.
https://www.nasa.gov/directorates/heo/index.html
https://www.nasa.gov/directorates/spacetech/home/index.html
https://science.nasa.gov/
https://www.nasa.gov/specials/artemis/
https://www.nasa.gov/gateway