Human-robot intelligent user interface


Human-robot intelligent user interface refers to the development of user interfaces designed to improve communication between human users and robots.


Contents

  • Overview
  • Research background
  • Interface design
  • Types of interfaces
  • Further Resources
Is a: Industry

Industry attributes

Parent industries: Intelligent user interface, Artificial Intelligence (AI), Human–computer interaction, Intelligent control, Human–robot interaction

Child industries: Chatbot, Autopilot, Robotic process automation, Autonomous vehicle, Autonomous system

Other attributes

Also known as: RUI, HRI, Human-robot affordances
Overview

A human-robot intelligent user interface is designed to improve communication between humans and robots; researchers, designers, and developers build such interfaces to enhance the flexibility, usability, and power of human-robot interaction for all users. Human-robot interaction involves an input facility, an environment display, intuitive command and reaction, and the architecture of the interface program. For example, an intelligent user interface system for human-robot interaction can be built from ultrasonic sensors, a position sensitive detector (PSD), and DC motors.
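
As a concrete illustration of that sensor-and-motor architecture, the following is a minimal sketch assuming a differential-drive robot; every name and threshold is hypothetical rather than taken from any particular system:

```python
# Minimal sketch (hypothetical names and thresholds): a control loop that
# fuses ultrasonic and PSD distance readings and drives two DC motors,
# mirroring the sensor/actuator mix described above.

from dataclasses import dataclass

@dataclass
class SensorReadings:
    ultrasonic_cm: float   # long-range distance from the ultrasonic sensor
    psd_cm: float          # short-range distance from the PSD

def fused_distance(r: SensorReadings) -> float:
    """Trust the PSD at close range, the ultrasonic sensor farther out."""
    return r.psd_cm if r.psd_cm < 30.0 else r.ultrasonic_cm

def motor_command(distance_cm: float, stop_cm: float = 15.0) -> tuple[float, float]:
    """Map a fused distance to (left, right) motor duty cycles in [0, 1]."""
    if distance_cm <= stop_cm:
        return (0.0, 0.0)                  # obstacle too close: stop
    speed = min(1.0, (distance_cm - stop_cm) / 100.0)
    return (speed, speed)                  # drive straight, slower when near

if __name__ == "__main__":
    readings = SensorReadings(ultrasonic_cm=80.0, psd_cm=120.0)
    print(motor_command(fused_distance(readings)))   # (0.65, 0.65)
```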

Part of the interaction, and of the research into intelligent user interfaces for robotics, aims to equip robots with the intelligence required to actively support humans in completing tasks. Much of this research focuses on developing intuitive, "natural" interfaces and interactions, for instance to allow tool handover in work environments such as an operating theater, a factory floor, or an extreme setting such as a nuclear power plant.

Research background

Human-robot intelligent user interface is a field of research and development built on intelligent user interface research, which fuses human-computer interaction and artificial intelligence to create interfaces that users perceive as "intelligent." Human-computer interaction is concerned with the usability of interfaces, techniques, and artifacts; AI research focuses on techniques for automation that aid users in executing various tasks. An interface is usually perceived as "intelligent" on the basis of the type of interaction it supports, the automation of that interaction or its output, or the interface and interaction as a whole.

As in many related intelligent user interface use cases, human-robot interaction faces challenges such as safety, transparency, trust, gesture and speech recognition, and user experience when helping humans collaborate with robots across industries. Research into human-robot intelligent user interfaces seeks to solve or ease these pain points and increase the ease of human-robot interaction.

Interface design

Whether the robotic design is for delivery robots, self-driving cars, or autopilot systems, there needs to be an interface through which the human can control the machine and receive feedback, which means users must learn their interface. The more complex the interface, the more training a user needs before operating the robotic system.

One long-evolving example is aviation: aviation systems are among the oldest human-machine interfaces, and since the 1950s they have been increasingly automated. Two examples are the systems developed by Airbus and Boeing. Airbus developed a hard autopilot system that defines and controls entire operations and gives the pilot no ability to take over the aircraft, whereas Boeing built a flexible system in which the pilot can take control at any time. The flexible system has allowed flight crews to save their aircraft in unexpected emergencies.
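
The difference between the two policies can be sketched in a few lines; this is an illustrative simplification, not Airbus's or Boeing's actual control logic:

```python
# Sketch of the "flexible" automation policy described above (all names
# hypothetical): autopilot output is used only while the pilot is hands-off,
# and any pilot input immediately takes precedence.

from typing import Optional

def select_control(pilot_input: Optional[float], autopilot_output: float) -> float:
    """Return the control surface command; pilot input always wins."""
    return pilot_input if pilot_input is not None else autopilot_output

def hard_select_control(pilot_input: Optional[float], autopilot_output: float) -> float:
    """A "hard" policy, by contrast, ignores pilot input entirely."""
    return autopilot_output

print(select_control(None, 0.2))    # 0.2  -> autopilot flies
print(select_control(-0.5, 0.2))    # -0.5 -> pilot override takes effect
```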

This suggests robotics interfaces need some flexibility to allow for human control in a given scenario. Any interface also needs to balance four key dimensions in its design: user, application, interface, and technology. This can include designing an interface that is expressive and able to interact with someone through modalities such as speech, gestures, or symbolic expressions.

Types of interfaces

The types of interfaces that can be designed depend on the type of interaction expected or required by a robotic system, and they should be designed to allow intuitive control while balancing automation against user control. These interfaces can use touch, voice, or gesture recognition. For example, a robot supervising operations in a hospital needs to react to a nurse, and react quickly, to perform emergency care.
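
One common way to make a system react quickly to an emergency is to prioritize commands rather than handle them in arrival order; the sketch below assumes a simple two-level priority scheme, and the command strings are illustrative:

```python
# Hedged sketch: a priority queue that lets an emergency request from any
# modality (touch, voice, gesture) preempt routine work, as a hospital
# supervision robot would require.

import heapq

EMERGENCY, ROUTINE = 0, 1   # lower number = higher priority

class CommandQueue:
    def __init__(self) -> None:
        self._heap: list[tuple[int, int, str]] = []
        self._seq = 0        # tie-breaker preserving arrival order

    def submit(self, command: str, priority: int) -> None:
        heapq.heappush(self._heap, (priority, self._seq, command))
        self._seq += 1

    def next_command(self) -> str:
        return heapq.heappop(self._heap)[2]

q = CommandQueue()
q.submit("restock supply cart", ROUTINE)
q.submit("assist code blue in room 4", EMERGENCY)   # e.g., spoken by a nurse
print(q.next_command())   # the emergency request is handled first
```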

Robots, whether mobile or stationary, acting in environments close to humans have the primary purpose of supporting humans at work, at home, or at leisure. As noted above, the expected interaction defines what kind of interface a robotic system needs.

GUI and TUI

The following are two common and contrasting interfaces often used for an assistive manipulation robot system:

  1. graphical user interface (GUI), in which the user operates the robot entirely through a touchscreen representation
  2. tangible user interface (TUI), which makes use of real-world devices, such as laser pointers, projectors, and camera systems, to enable augmented reality

Both systems are designed to allow an operator to use a robotic system in an open environment.
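
One way to realize this design is to hide both interface styles behind a single abstraction, so the manipulation logic is interface-agnostic; the following sketch assumes hypothetical class names and coordinate conventions:

```python
# Sketch of the two contrasting interface styles behind one abstraction
# (hypothetical API): the manipulation planner only sees a target point,
# whether it came from a touchscreen tap or a detected laser-pointer dot.

from abc import ABC, abstractmethod

class OperatorInterface(ABC):
    @abstractmethod
    def next_target(self) -> tuple[float, float]:
        """Return the (x, y) point the operator selected, in robot frame."""

class TouchscreenGUI(OperatorInterface):
    def __init__(self, tap_px: tuple[int, int], px_to_m: float = 0.002):
        self.tap_px, self.px_to_m = tap_px, px_to_m

    def next_target(self) -> tuple[float, float]:
        # Convert a screen tap into workspace coordinates.
        return (self.tap_px[0] * self.px_to_m, self.tap_px[1] * self.px_to_m)

class LaserPointerTUI(OperatorInterface):
    def __init__(self, detected_dot_m: tuple[float, float]):
        self.detected_dot_m = detected_dot_m   # from the camera system

    def next_target(self) -> tuple[float, float]:
        return self.detected_dot_m

def pick_object(ui: OperatorInterface) -> None:
    x, y = ui.next_target()
    print(f"planning grasp at ({x:.2f}, {y:.2f}) m")

pick_object(TouchscreenGUI(tap_px=(640, 360)))        # GUI path
pick_object(LaserPointerTUI(detected_dot_m=(1.28, 0.72)))   # TUI path
```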

Robot with tablet

As robots are increasingly integrated with artificial intelligence or become part of an internet of things (IoT) environment, they can play a greater role in individuals' lives through automation. How users interact with these devices varies, however. For example, a system can pair a human-robot interface with a tablet, a configuration known as a "robot with a tablet," in which the human is given increased control through a tablet-based interface; this does not prevent the robotic system from having additional interfaces, such as speech or expressive systems. A contrasting "robot only" configuration provides no tablet interface yet can offer the same functionality as a tablet-equipped system while being more cost-effective; however, its interface, which is usually vocal, has to be more robust to interact properly with users.

User intention through interface

One development in robot-user interfaces is a method for predicting user intentions as part of the interface, using geometric objects that partition an input space so the robotic system can discriminate individual objects and cluster objects for hierarchical tasks. This could be combined with robots designed to mimic human conversational gaze behavior, helping robots collaborate with humans and helping humans feel that the robotic system understands the task described through a conversational user interface.
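
A minimal sketch of such input-space partitioning, using nearest-centroid regions (an illustrative choice, not necessarily the cited authors' method), might look like this:

```python
# Hedged sketch of intention prediction via input-space partitioning: each
# known object claims the region of pointer space nearest to it, so a noisy
# 2D input maps to a discrete object, and objects can be grouped for
# hierarchical tasks. Object positions and group labels are illustrative.

import math

objects = {
    "cup":    (0.30, 0.10),
    "plate":  (0.35, 0.40),
    "kettle": (0.80, 0.25),
}
groups = {"cup": "tableware", "plate": "tableware", "kettle": "appliance"}

def predict_intention(pointer_xy: tuple[float, float]) -> tuple[str, str]:
    """Nearest-centroid partition: return (object, task group) for an input."""
    name = min(objects, key=lambda n: math.dist(objects[n], pointer_xy))
    return name, groups[name]

print(predict_intention((0.33, 0.05)))   # ('cup', 'tableware')
```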

Voice-controllable intelligent user interface

A social robot, together with an understanding of how humans and robots can interact to accomplish a specific task, can yield a more sophisticated robotic platform that could become integral to human societies. This is particularly because a social robot needs to learn the preferences and capabilities of the people it interacts with in order to adapt its behaviors for more efficient and friendly interaction.

Advances in human-computer interaction technologies have improved human-robot interaction, allowing users to interact with robots through natural communication or speech, and voice-controllable intelligent user interfaces make robots easier for humans to use. Studies of these systems have found that users prefer voice control to manual control; subjects with high spatial reasoning tend to prefer voice control, while those with lower spatial reasoning prefer manual control, and the overall effect of spatial reasoning was shown to be less important with voice control than with manual control.
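
At its simplest, a voice-controllable interface maps recognized utterances to robot actions; the sketch below assumes speech has already been transcribed, and the command table is illustrative:

```python
# Minimal sketch of a voice-controllable interface layer (all commands and
# actions are illustrative): transcribed speech is normalized and mapped to
# robot actions, with unknown phrases rejected rather than guessed.

COMMANDS = {
    "move forward": ("drive", +0.2),
    "move back":    ("drive", -0.2),
    "stop":         ("drive",  0.0),
    "open gripper": ("gripper", 1.0),
}

def interpret(utterance: str):
    """Map a transcribed utterance to an (actuator, value) action, or None."""
    return COMMANDS.get(utterance.strip().lower())

print(interpret("Move Forward"))   # ('drive', 0.2)
print(interpret("do a flip"))      # None -> ask the user to rephrase
```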

Intelligent environment

Because of the complexity of human-robot interactions, developing a user interface can include developing spatial limitations, or intelligent environments. This could be as simple as geofencing, which dictates a robotic system's functionality inside and outside a designated area, and can increase in complexity to systems capable of orienting robots to roads, airspace, or spaces in cities and buildings. Such environments can be part of the interface for order-picking robots that navigate a warehouse based on directions, increasing workers' capability to collaborate with robots; they could extend to delivery robots and lanes reserved for self-driving cars, and be used in experiments on human-robot and human-machine interaction when developing smart cities.
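
Geofencing itself reduces to a spatial membership test that gates the robot's capabilities; the sketch below assumes a rectangular fence, and the speed-gating rule is illustrative:

```python
# Hedged sketch of geofencing as a spatial limit on functionality: the robot
# checks its position against a designated zone and disables motion outside
# it. The fence dimensions and capability gating are illustrative.

from dataclasses import dataclass

@dataclass
class Geofence:
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

warehouse = Geofence(0.0, 0.0, 50.0, 30.0)

def allowed_speed(x: float, y: float, requested: float) -> float:
    """Full requested speed inside the fence, zero outside."""
    return requested if warehouse.contains(x, y) else 0.0

print(allowed_speed(10.0, 5.0, 1.2))   # 1.2 (inside the warehouse)
print(allowed_speed(60.0, 5.0, 1.2))   # 0.0 (outside: motion disabled)
```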


Further Resources

  • Great Interface Design Is the Key to Human-robot Performance. Web, February 24, 2020. https://www.designnews.com/automation-motion-control/great-interface-design-key-human-robot-performance
  • How to Build Collaborative Human-Robot Affordances - Man + Machines. Jean-marc Buchert. Web, August 19, 2021. https://manplusmachines.com/human-robot-affordances/
  • Human-robot interaction via voice-controllable intelligent user interface | Robotica | Cambridge Core. Web. https://www.cambridge.org/core/journals/robotica/article/abs/humanrobot-interaction-via-voicecontrollable-intelligent-user-interface/28B4C91FACF679293F995A94C615760C
  • Human-robot interface design - the 'Robot with a Tablet' or 'Robot only', which one is better? Chih-Chien Hu, Yu-Fen Yang, Nian-Shing Chen. Journal, July 1, 2022. https://www.tandfonline.com/doi/full/10.1080/0144929X.2022.2093271?src=
  • User intentions funneled through a human-robot interface. Michael T. Rosenstein, Andrew H. Fagg, Shichao Ou, Roderic A. Grupen. Conference, January 10, 2005. https://dl.acm.org/doi/10.1145/1040830.1040888

