Dual-Filter Approach to Trust in Hybrid Human-Machine Systems

Status: Completed

Start Date: 2023-08-03

End Date: 2024-09-02

Description: Autonomous systems, when subjected to novel environments and tasks, suffer from unknown unknowns. Under these circumstances, human intervention is frequently required. In deep space, such as on a crewed mission to Mars, round-trip communication with Earth takes many minutes, effectively removing a typical safety net for the astronauts onboard when dealing with autonomous systems and sensors. Kalman filters provide bounds on system error that are accurate only when sensors perform as expected; when a trusted sensor fails, the estimate can leave its advertised bounds. The trust-based filter derived here is intended not only to solve this problem but also to be grounded in a human understanding of trust, making it interpretable by astronauts. By giving both autonomous agents and astronauts the ability to determine which agents are trustworthy, decision-making is simplified and more systems can become autonomous, reducing the mental and physical load on the mission's most important assets, the astronauts.
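The failure mode described above can be illustrated with a minimal scalar Kalman filter; the constant-state model, noise values, and fault (an unmodeled sensor bias) below are illustrative assumptions for a sketch, not the project's actual filter. When the sensor behaves as advertised, the estimate stays within the filter's reported 3-sigma bound; when the sensor silently fails, the error far exceeds the bound the filter still reports.

```python
import math
import random

def kalman_1d(measurements, R=1.0, Q=0.01):
    """Scalar Kalman filter tracking a nominally constant state.

    R is the *advertised* measurement noise variance; Q is process noise.
    Returns the estimate history and the filter's own variance history.
    """
    x, P = 0.0, 10.0  # initial estimate and (large) initial variance
    xs, Ps = [], []
    for z in measurements:
        P += Q                 # predict: state assumed constant, inflate by Q
        K = P / (P + R)        # Kalman gain under the advertised noise R
        x += K * (z - x)       # measurement update
        P *= (1.0 - K)         # posterior variance the filter *believes*
        xs.append(x)
        Ps.append(P)
    return xs, Ps

random.seed(0)
true_value = 0.0
# Healthy sensor: noise actually matches the advertised R = 1.0.
healthy = [true_value + random.gauss(0, 1.0) for _ in range(200)]
# Faulty sensor: an undetected +5 bias the filter does not model.
faulty = [true_value + 5.0 + random.gauss(0, 1.0) for _ in range(200)]

results = {}
for name, zs in [("healthy", healthy), ("faulty", faulty)]:
    xs, Ps = kalman_1d(zs)
    err = abs(xs[-1] - true_value)
    bound = 3.0 * math.sqrt(Ps[-1])  # advertised 3-sigma bound
    results[name] = (err, bound)
    print(f"{name}: |error| = {err:.3f}, 3-sigma bound = {bound:.3f}, "
          f"within bounds: {err <= bound}")
```

The faulty case converges confidently to the wrong answer: the reported covariance shrinks exactly as in the healthy case, so the advertised bound gives no warning that the sensor should no longer be trusted. A trust-based filter, by contrast, would down-weight the offending sensor rather than continue to believe its advertised noise.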
Benefits: The technologies developed under this proposal will be applicable to lunar and Martian missions defined in NASA’s Strategic Goal 2. Specifically, the trust filter applies to multi-agent autonomous and collaborative systems, especially systems for which direct oversight by ground control is infeasible due to complexity or distance.

Beyond NASA missions, the technologies developed under this proposal will be applicable to multi-agent autonomous systems and collaborative robotics. Self-driving cars, manufacturing robots, and drones operating within the National Airspace System can all take advantage of the trust filter derived herein. As the number of autonomous systems interacting with humans grows, so does the number of domains in which this technology is useful.

Lead Organization: Infinity Labs LLC