Intuitive Interfaces

Throughout history, humans have developed many tools and devices to interact and interface with different systems, facilitating work and increasing ergonomics, safety, and precision. In general, interfaces enable communication between systems that do not share the same language.

The best-known examples are the mouse, keyboard, monitor, and speaker and, more recently, the virtual reality headset, touch joystick, and 3D mouse. These technologies, called “human interface devices” (HID), were a decisive factor in the proliferation of computers in society, enabling users to communicate and interact with them quickly and intuitively. When the machine in question is a robot, the interface is called a human-robot interface (HRI), defined as a technique for controlling machines through human activity.

Today, user interfaces play a relevant role in the industrial field, bridging the gap between humans and machines. To enable effective interaction between humans and robots, it is fundamental to ensure correct information exchange between the natural and the artificial side. This requires suitable interfaces to monitor human behavior and to plan the execution of the collaborative task, as well as strategies to increase the mutual awareness of the human–robot dyad. Nevertheless, for vision- or audio-based interfaces to function across a wide range of applications, a level of robot autonomy far beyond the current capabilities of autonomous robots would be required. We therefore address these limitations by developing novel, intuitive interfaces for the execution of dynamic industrial tasks.

The research activities in this area address interfaces for both remote and proximity interaction:

  • Interfaces for remote interaction: The main field of application is teleoperation. The approaches developed in our Lab are designed to interface with a generic robot composed of both manipulators and mobile systems, and their strengths lie in their intuitiveness, functionality, and affordability. The proposed teleoperation framework leverages a motion-capture system composed of wearable, lightweight devices placed on the user’s arms, forearms, and hands. Each device integrates inertial and electromyography (EMG) sensors, thanks to which the user’s movements and intentions can be remapped onto the controlled robot. In addition, stereoscopic cameras are integrated to provide the operator with visual feedback from the remote environment.

This framework has been tested on different robots and in several real-world scenarios: Search & Rescue applications, service robotics, and pandemic response.
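
The remapping from wearable sensing to robot commands can be made concrete with a small example. The following Python sketch is illustrative only: the sensor-reading stubs, gains, and thresholds are assumptions, not the Lab’s actual implementation. It shows one plausible way to turn a forearm IMU orientation into an end-effector velocity command and an EMG activation level into a grasp trigger.

```python
import numpy as np

# Hypothetical sensor stubs: a real system would stream these values
# from the wearable IMU/EMG devices described above.
def read_imu_orientation() -> np.ndarray:
    """Return the forearm orientation as a unit quaternion [w, x, y, z]."""
    return np.array([1.0, 0.0, 0.0, 0.0])

def read_emg_activation() -> float:
    """Return a normalized (0..1) muscle-activation level."""
    return 0.05

def quat_to_rotmat(q: np.ndarray) -> np.ndarray:
    """Convert a unit quaternion [w, x, y, z] to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

EMG_GRASP_THRESHOLD = 0.3  # assumed activation level that signals a grasp
VELOCITY_GAIN = 0.5        # assumed scaling from arm motion to robot velocity

def teleoperation_step(neutral_R: np.ndarray):
    """One control cycle: remap arm pose and muscle activity to commands."""
    R = quat_to_rotmat(read_imu_orientation())
    R_rel = neutral_R.T @ R                      # rotation w.r.t. neutral pose
    fwd = np.array([1.0, 0.0, 0.0])              # forearm "forward" axis
    v_cmd = VELOCITY_GAIN * (R_rel @ fwd - fwd)  # displaced axis -> velocity
    grasp = read_emg_activation() > EMG_GRASP_THRESHOLD
    return v_cmd, grasp

# Calibrate on the user's neutral pose, then run the loop at a fixed rate.
neutral_R = quat_to_rotmat(read_imu_orientation())
v_cmd, grasp = teleoperation_step(neutral_R)
```

In a real deployment, the stubs would be replaced by the streaming interfaces of the wearable devices, and the resulting command would be sent to the robot controller at a fixed rate.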

  • Interfaces for proximity interaction: Although kinesthetic teaching is one of the most effective methods to program industrial robots, it is applicable only to manipulators and often requires expensive extra hardware (e.g., force/torque sensors) to achieve effective gravity compensation. Our method provides a single, user-friendly interface to jog and program mobile manipulators. Inspired by hand-guiding approaches for jogging robotic arms, it enables users to move both manipulators and mobile bases in an intuitive and contactless way through a common smartphone, as in the sketch below.
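
To illustrate how a smartphone could jog a mobile base contactlessly, here is a minimal Python sketch under assumed conventions: the deadband, saturation limits, and tilt-to-axis mapping are hypothetical choices for illustration, not the published method. Phone tilt and yaw rate are shaped into a planar velocity command.

```python
import numpy as np

DEADBAND_RAD = 0.10  # assumed: ignore small tilts so the robot holds still
MAX_LIN_VEL = 0.3    # assumed base translation limit, m/s
MAX_ANG_VEL = 0.5    # assumed base rotation limit, rad/s

def tilt_to_twist(roll: float, pitch: float, yaw_rate: float):
    """Map phone tilt (rad) and yaw rate (rad/s) to a planar base twist."""
    def shape(x: float, limit: float) -> float:
        # Deadband plus saturation keeps the jog gentle and predictable.
        if abs(x) < DEADBAND_RAD:
            return 0.0
        return float(np.clip(x, -limit, limit))

    vx = shape(-pitch, MAX_LIN_VEL)    # tilt the phone forward -> drive forward
    vy = shape(roll, MAX_LIN_VEL)      # tilt right -> strafe right
    wz = shape(yaw_rate, MAX_ANG_VEL)  # rotate the phone -> rotate the base
    return vx, vy, wz

# Example: a strong forward tilt saturates, a slight roll passes through.
vx, vy, wz = tilt_to_twist(roll=0.2, pitch=-0.4, yaw_rate=0.0)
print(vx, vy, wz)  # 0.3 0.2 0.0
```

The deadband keeps the robot still when the phone is roughly level, while saturation bounds the jogging speed; the same kind of mapping could drive the manipulator instead of the base when the user switches modes.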