- Summary: In the ERATO ISHIGURO Symbiotic Human-Robot Interaction Project, a multimodal conversation control system and a multi-robot conversation control system were developed to give robots a stronger human-like presence as well as a 'sense of conversing'. To expand the field of activity of such conversational robots, the project also developed the child-like android 'ibuki', which is equipped with wheels that enable it to move around.
"Ibuki" is a child-like android equipped with a moving unit. By acting
with the human, having a conversation together and consequently sharing
their experience, this robot is expected to become a conversational
robot which is able to construct a deeper relationship with the human.
With regards to the feasibility and the safety aspects, a set of wheels
is adopted as its moving unit. The unit includes a pair of eccentricity
wheels for horizontal body motion and a ball screw driven actuator for
vertical body motion. This replicates the movement of a human's center
of gravity position on the android robot and expresses a human-like
movement even with the wheels. Also, having 47 degrees of freedom
enables it to have various emotional expressions such as gestures and
hand signs, in addition to different facial expressions.
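The idea of reproducing gait-like center-of-gravity motion on a wheeled base can be sketched as below. The amplitudes, step length, and function names are illustrative assumptions for this sketch, not published specifications of "ibuki".

```python
import math

# Hypothetical parameters: the sway amplitude and virtual step length
# are illustrative values, not specifications of the actual android.
VERTICAL_AMPLITUDE_M = 0.015   # peak vertical displacement of the upper body
STEP_LENGTH_M = 0.30           # assumed virtual step length of a walking child

def vertical_offset(distance_travelled_m: float) -> float:
    """Vertical body offset that mimics the up-and-down sway of walking.

    The body rises and falls once per virtual step, so the oscillation
    phase is tied to the distance covered by the wheels.
    """
    phase = 2.0 * math.pi * (distance_travelled_m / STEP_LENGTH_M)
    return VERTICAL_AMPLITUDE_M * 0.5 * (1.0 - math.cos(phase))

def lateral_offset(distance_travelled_m: float) -> float:
    """Lateral sway at half the vertical frequency (one cycle per stride)."""
    phase = math.pi * (distance_travelled_m / STEP_LENGTH_M)
    return 0.01 * math.sin(phase)

# Example: stream offsets to the vertical actuator while the wheels move.
if __name__ == "__main__":
    for step in range(6):
        d = step * 0.05  # metres travelled so far
        print(f"d={d:.2f} m  up={vertical_offset(d)*1000:.1f} mm  "
              f"side={lateral_offset(d)*1000:.1f} mm")
```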
Conversational robots have received
considerable attention in recent research. However, research to date
has not sufficiently explored a robot's "sense of conversing," its
"existence," and its "sociability." In response to this gap, the
ERATO ISHIGURO Symbiotic Human-Robot Interaction Project was launched,
in which the project leader Prof. Hiroshi Ishiguro (Osaka Univ.) and his
team developed humanoid robots capable of human-like conversation. In
this project, we focused on the affinity that emerges when a robot moves
together with a human. To promote an active role for conversational
robots, the child-like android "ibuki" was developed, designed to walk
(move) together with a human using its wheels.
Firstly, a multimodal recognition system utilizing cameras, a
microphone array, and other sensors was developed. Next, to establish a
technological foundation for human-robot interaction, a conversation
control system was developed that controls the speech, motion, gaze, and
emotion of the robot according to its intentions and desires, so that
the human perceives a more human-like presence from the robot during the
interaction. Although the verification experiment was conducted only
over a short period, including conversations with visitors in a waiting
room, it showed that the android "ERICA" can hold natural conversations
and increase the human's perception of the robot's existence, results
that are difficult to achieve with other well-known robots.
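As a rough illustration of how speech, gaze, motion, and emotion might be coordinated from an internal intention/desire state, the following sketch shows one possible organization. The state variables, thresholds, and behavior names are hypothetical and do not reflect the project's actual architecture.

```python
from dataclasses import dataclass, field

@dataclass
class DesireState:
    """Illustrative internal state; the actual ERICA architecture differs."""
    topic: str = "small talk"
    wants_to_speak: float = 0.2    # 0..1 urge to take the turn
    interest: float = 0.5          # 0..1 interest in the current topic

@dataclass
class Percepts:
    user_is_speaking: bool
    user_gaze_on_robot: bool
    detected_keywords: list = field(default_factory=list)

def select_behaviors(state: DesireState, percepts: Percepts) -> dict:
    """Map internal desires plus multimodal percepts to coordinated outputs."""
    actions = {"speech": None, "gaze": "user", "motion": "idle_sway",
               "emotion": "neutral"}

    if percepts.user_is_speaking:
        # Back-channel instead of speaking: nod and keep eye contact.
        actions["motion"] = "nod"
        state.wants_to_speak = min(1.0, state.wants_to_speak + 0.1)
    elif state.wants_to_speak > 0.6:
        actions["speech"] = f"By the way, about {state.topic}..."
        actions["emotion"] = "happy" if state.interest > 0.5 else "neutral"
        state.wants_to_speak = 0.2
    elif not percepts.user_gaze_on_robot:
        # Try to regain attention with gaze and a small gesture.
        actions["motion"] = "lean_forward"

    if percepts.detected_keywords:
        state.topic = percepts.detected_keywords[-1]
        state.interest = min(1.0, state.interest + 0.2)
    return actions
```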
Furthermore, by introducing novel techniques such as natural and
varied nodding during the interaction, asking questions in return based
on an analysis of the linguistic focus terms in the user's utterance,
and a reaction detection mechanism, a conversation system was developed
that gives the robot a more human-like sense of conversing. In an
experiment using this system, participants were asked to converse with
the robot and were interviewed during and after the session. The results
confirmed that, compared with well-known smart-speaker-based systems,
the system successfully induced participants to speak with the robot and
to continue a human-like conversation over a long period of time.
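A toy illustration of "asking in return" based on a focus term is sketched below. The stop-word heuristic and function names are assumptions for illustration; the actual system analyzes the linguistic focus of the utterance with proper language processing.

```python
import re
from typing import Optional

# Hypothetical stop-word list; a real system would use morphological
# analysis to find the linguistic focus of the utterance.
STOP_WORDS = {"i", "you", "the", "a", "an", "my", "your", "to", "of",
              "and", "was", "is", "it", "in", "on", "really", "very"}

def focus_term(utterance: str) -> Optional[str]:
    """Pick the last content-like word as a crude stand-in for the focus."""
    words = [w for w in re.findall(r"[a-zA-Z']+", utterance.lower())
             if w not in STOP_WORDS]
    return words[-1] if words else None

def ask_in_return(utterance: str) -> str:
    """Generate a question that echoes the focus term back to the user."""
    term = focus_term(utterance)
    if term is None:
        return "I see. Could you tell me more?"
    return f"Oh, {term}? What do you like about {term}?"

print(ask_in_return("I really enjoy playing tennis"))
# -> Oh, tennis? What do you like about tennis?
```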
In addition, a group of conversational social robots named "CommU" was
used to develop a multi-robot conversation control system. This system
controls the timing of each CommU's conversational behaviors, such as
the start of speech, nodding, and nonverbal communication behaviors, so
that the robots perform turn-taking interactions among themselves, such
as passing the conversation, playing a specific role in the
conversation, and switching roles with one another. It was found that
when such robot-to-robot conversations are shown to the interacting
human, the human feels that a conversation is actually taking place,
independently of the accuracy of voice recognition (a technology for
conversation without voice recognition). Furthermore, having two or more
robots use conversations containing ambiguous or vague sentences, which
fit more than one meaning or intention, led the human to feel no
contradiction in the conversation regardless of the intention behind the
human's speech (a technology for conversation without intention
detection). With these techniques, the sense of conversing could be
expressed and the human's perception of the conversation could be
improved. In other words, it was found that coordination between two or
more robots can establish a social situation and guide the human's
imagination to interpret the observations in a positive direction, which
consequently reduces the discomfort of the conversation.
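The turn-taking idea can be illustrated with a minimal sketch in which two robots follow a scripted exchange, back-channel each other, and answer the human with deliberately ambiguous lines. The script, replies, and function are hypothetical and only illustrate why the exchange can remain coherent without speech recognition or intention detection.

```python
import random

# Ambiguous replies that fit many possible human utterances.
AMBIGUOUS_REPLIES = [
    "That happens, doesn't it?",
    "I was just thinking about that too.",
    "Interesting. What do you think?",
]

# A short scripted exchange between the two robots.
ROBOT_SCRIPT = [
    ("CommU-1", "I heard the weather will be nice tomorrow."),
    ("CommU-2", "Then we should go outside together."),
    ("CommU-1", "Good idea. Shall we ask our guest?"),
]

def run_dialogue(human_spoke: bool) -> None:
    for speaker, line in ROBOT_SCRIPT:
        print(f"{speaker}: {line}")
        listener = "CommU-2" if speaker == "CommU-1" else "CommU-1"
        print(f"{listener}: (nods)")       # timed back-channel by the listener robot
        if human_spoke:
            # Whatever the human said, an ambiguous reply fits the context.
            print(f"{speaker}: {random.choice(AMBIGUOUS_REPLIES)}")
            human_spoke = False            # hand the floor back to the robots

run_dialogue(human_spoke=True)
```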
Finally, to develop the conversation system technologies required for
a robot that aims to "coexist" with humans, and to provide a platform
for a conversational robot operating in humans' daily lives, the
child-like android "ibuki," equipped with a moving unit, was developed
in this project. It was developed not only to enable an android robot to
move, but also to advance interaction technologies that build affinity
with humans by walking alongside them. It is expected to lead to an
autonomous conversational android that can be active in humans' daily
lives.
The knowledge acquired from this research and from the development of
"ibuki" is expected to be applied to further areas involving similar
social conversational robots, such as providing information, supporting
daily life, and supporting human learning.