
A technique that allows robots to detect when humans need help

The robot monitors the user’s eye gaze and speech to determine when to help the user as she prepares to bake cookies. Credit: Wilson, Aung & Boucher.

As robots are introduced in an increasing number of real-world settings, it is important for them to cooperate effectively with human users. Beyond communicating with humans and assisting them in everyday tasks, it would thus be useful for robots to autonomously determine whether their help is needed.

Researchers at Franklin & Marshall College have recently been developing computational tools that could enhance the performance of socially assistive robots by allowing them to process social cues given by humans and respond accordingly. In a paper pre-published on arXiv and presented last week at the AI-HRI 2021 symposium, they introduced a new technique that allows robots to autonomously detect when it is appropriate for them to step in and help users.

“I am interested in designing robots that help people with everyday tasks, such as cooking dinner, learning math, or assembling Ikea furniture,” Jason R. Wilson, one of the researchers who carried out the study, told TechXplore. “I’m not looking to replace people that help with these tasks. Instead, I want robots to be able to supplement human assistance, especially in cases where we do not have enough people to help.”

Wilson believes that when a robot helps humans to complete a given task, it should do so in a ‘dignified’ way. In other words, he thinks that robots should ideally be sensitive to their users’ humanity, respecting their dignity and autonomy.

There are several ways in which roboticists can consider the dignity and autonomy of users in their designs. In their recent work, Wilson and his students Phyo Thuta Aung and Isabelle Boucher specifically focused on preserving a user’s autonomy.

“One way for a robot to support autonomy is to ensure that the robot finds a balance between helping too much and too little,” Wilson explained. “My prior work has looked at algorithms for adjusting the robot’s amount of assistance based on how much help the user needs. Our recent study focused on estimating how much help the user needs.”
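To make the balancing idea concrete, here is a minimal sketch in Python of graded assistance. The level names, thresholds, and the assumption of a need-for-help score between 0 and 1 are purely illustrative; they are not details from the researchers' algorithms.

```python
# A minimal sketch of graded assistance, assuming an estimated
# need-for-help score in [0, 1]. Level names and thresholds are
# illustrative assumptions, not taken from the paper.

def select_assistance_level(need_estimate: float) -> str:
    """Map an estimated need for help to a graded response,
    so the robot neither over- nor under-assists."""
    if need_estimate < 0.25:
        return "observe"      # say nothing; keep monitoring
    if need_estimate < 0.5:
        return "hint"         # e.g., "You might check the recipe."
    if need_estimate < 0.75:
        return "instruct"     # give an explicit next step
    return "demonstrate"      # show or perform the step directly

print(select_assistance_level(0.6))  # -> "instruct"
```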

When humans need help with a given task, they can ask for assistance explicitly or convey that they are struggling in implicit ways. For example, they might make comments such as "hmm, I am not sure," or express frustration through their facial expressions or body language. Another implicit strategy that humans use to communicate that they need help is eye gaze.

“For example, a person may look at the task they are working on, then look at a person that can help them and then look back at the task,” Wilson said. “This gaze pattern, called confirmatory gaze, is used to request that the other person look at what they are looking at, perhaps because they are unsure if it is correct.”
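The sketch below shows one way such a pattern could be flagged, assuming a gaze tracker that labels each fixation as "task", "robot", or "other". The class name, labels, and window length are hypothetical; they are not taken from the researchers' implementation.

```python
# A minimal sketch of confirmatory-gaze detection, assuming a gaze
# tracker that labels each fixation. Labels and window length are
# illustrative assumptions, not details from the paper.

from collections import deque

class ConfirmatoryGazeDetector:
    """Flags the task -> robot -> task fixation sequence described above."""

    def __init__(self, max_fixations: int = 5):
        self.history = deque(maxlen=max_fixations)

    def update(self, fixation_target: str) -> bool:
        # Collapse consecutive fixations on the same target.
        if not self.history or self.history[-1] != fixation_target:
            self.history.append(fixation_target)
        # A confirmatory gaze is the last three distinct targets
        # forming task -> robot -> task.
        return list(self.history)[-3:] == ["task", "robot", "task"]

detector = ConfirmatoryGazeDetector()
for target in ["task", "task", "robot", "task"]:
    if detector.update(target):
        print("Confirmatory gaze detected; the user may want feedback.")
```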

The key objective of the recent study carried out by Wilson, Aung and Boucher was to allow robots to automatically process eye-gaze-related cues in useful ways. The technique they created can analyze different types of cues, including a user’s speech and eye gaze patterns.

“The architecture we are developing automatically recognizes the user’s speech and analyzes it to determine if they are expressing that they want or need help,” Wilson explained. “At the same time, the system also detects users’ eye gaze patterns, determining if they are exhibiting a gaze pattern associated with needing help.”
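Here is a minimal sketch of how the two channels might be fused into a single estimate, using keyword spotting as a crude stand-in for speech understanding. The phrase list, weights, and scores are illustrative assumptions; the researchers' architecture is more sophisticated than this hand-tuned rule.

```python
# A minimal sketch of fusing speech and gaze cues into one
# need-for-help estimate. Phrases and weights are illustrative
# assumptions, not details from the paper.

HELP_PHRASES = ("help", "not sure", "stuck", "how do i")

def speech_indicates_help(utterance: str) -> bool:
    """Crude stand-in for a speech-understanding module."""
    text = utterance.lower()
    return any(phrase in text for phrase in HELP_PHRASES)

def estimate_need_for_help(utterance: str, confirmatory_gaze: bool) -> float:
    """Combine the two channels; explicit speech weighs more than gaze."""
    score = 0.0
    if speech_indicates_help(utterance):
        score += 0.7
    if confirmatory_gaze:
        score += 0.4
    return min(score, 1.0)

# e.g., estimate_need_for_help("hmm, I am not sure", True) -> 1.0
```

The resulting score could then feed a graded-assistance policy like the one sketched earlier, closing the loop between sensing and helping.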

In contrast with other techniques for enhancing human-robot interaction, the approach does not require information about the specific task that users are completing. This means that it could be easily applied to robots operating in various real-world contexts and trained to tackle different tasks.

While the model created by Wilson and his colleagues can enhance user experiences without the need for task-specific details, developers can still provide these details to enhance its accuracy and performance. In initial tests, the framework achieved highly promising results, so it could soon be used to improve the performance of both existing and newly developed social robots.

“We are now continuing to explore what social cues would best allow a robot to determine when a user needs help and how much help they want,” Wilson said. “One important form of nonverbal communication that we are not using yet is emotional expression. More specifically, we are looking at analyzing facial expressions to see when a user feels frustrated, bored, engaged or challenged.”
