Robotics – Field of the Century
What is RoboCup @ Home League?
Figure 1. Physical Design of RE@H-II
RE@H-II was designed and fabricated at the Advanced Robotics and Intelligent Control Centre of Singapore Polytechnic. Repeated experiments were performed to optimize the robot's dimensions, including the height and weight of the body, arm, and other components. The final robot measures 45 cm x 45 cm x 150 cm and weighs about 51 kg; most of its frames are constructed from aluminum. RE@H-II carries a Sony VAIO as its on-board PC, to which the camera is connected for image processing and high-bandwidth communication. A speaker system on the front of the torso and a microphone on the head are used for interaction. The battery lifetime is approximately 40 minutes. The torque and power ratings of the actuators were chosen carefully so that the robot can navigate, turn left or right, and manipulate objects. The robot can express emotions through colored LEDs and servo-controlled mouth and eye movements. A finger system was developed for the arms to manipulate objects, and a 3D virtual-avatar interaction system was developed for a special performance in the Open Challenge. Figure 2 shows the virtual avatar developed for emotion expression and speech.
Figure 2. Virtual Avatar for Emotion Expressive Speech
One of the most important tasks of an autonomous system is to acquire knowledge about its environment. Sensors gather the information the system needs to operate and interact with its surroundings, and a wide variety of sensors can be used on mobile robots. RE@H-II is currently equipped with a camera for perception, a microphone for speech recognition, an electronic compass to measure the robot's heading, and sonar and a laser scanner for measuring distances to objects in the environment. The camera is a monocular Logitech USB device: a 640 x 480 (VGA) progressive-scan CMOS sensor with a standard miniature lens, running at 30 Hz at 640 x 480 with automatic control of exposure, gain, and black level. A SICK LMS 100 laser scanner provides distance information, which is fused with the sonar readings for obstacle avoidance and terrain mapping.
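The laser/sonar fusion for obstacle avoidance can be sketched as follows. This is a minimal illustration only, assuming both sensors have been binned into the same angular sectors; the function names and the conservative minimum-distance rule are our assumptions, not the robot's actual implementation.

```python
# Minimal sketch: fuse laser and sonar readings per angular sector by
# keeping the more conservative (smaller) distance. None marks a sector
# with no valid reading from that sensor.

def fuse_ranges(laser, sonar, max_range=8.0):
    """Return the minimum valid range per sector, in metres."""
    fused = []
    for l, s in zip(laser, sonar):
        candidates = [r for r in (l, s) if r is not None]
        fused.append(min(candidates) if candidates else max_range)
    return fused

def nearest_obstacle(fused):
    """Index and distance of the closest fused reading."""
    i = min(range(len(fused)), key=lambda k: fused[k])
    return i, fused[i]
```

Taking the minimum is a deliberately cautious choice for obstacle avoidance: sonar often detects glass or low objects the laser misses, and vice versa.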
This module provides the control and strategy for the robot when undertaking the different challenges of the RoboCup @ Home league. A framework of hierarchical reactive behaviors is the core of this control module. This structure restricts interactions between system variables and thus reduces complexity. Control happens in three layers: the skill, reactive, and planning layers. The architecture was initially developed for our Robo-Erectus humanoid robots, where it yielded excellent results. Figure 4 shows the control architecture of RE@H-II. Each layer responds to sensor data in a different way, and the interaction of the layers produces the final behavior of the robot. Besides physical sensor data, the system employs abstract sensors to make decisions.
Figure 3. Face Recognition Cases under Different Conditions
These abstract sensors are built by merging data from different sensors with their history records. The best example is the map, which is generated from camera information, speech, compass data, and previous positions. The three layers are described below. The skill layer controls the servos and monitors targets, actual positions, and motor duties. It receives actions from the reactive layer and converts them into motor commands; after executing the commands, it sends feedback back to the reactive layer.
Figure 4. Control Architecture of RE@H-II
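The planning-reactive-skill flow described above can be sketched in a few lines. This is a hedged illustration of the layering only: the class and method names, and the obstacle threshold, are hypothetical, not taken from the RE@H-II codebase.

```python
# Sketch of the three-layer hierarchical control flow:
# planning chooses a goal, reactive turns it into a short-horizon action,
# skill converts the action into motor commands and reports feedback.

class SkillLayer:
    """Converts abstract actions into motor commands; returns feedback."""
    def execute(self, action):
        # ... here real code would write servo targets and duties ...
        return {"done": True, "position": action["target"]}

class ReactiveLayer:
    """Maps sensor data and the planner's goal to an immediate action."""
    def __init__(self, skill):
        self.skill = skill
    def step(self, goal, sensors):
        # Reactive override: stop if an obstacle is too close (0.3 m assumed).
        target = "stop" if sensors.get("obstacle_m", float("inf")) < 0.3 else goal
        return self.skill.execute({"target": target})

class PlanningLayer:
    """Chooses the current goal from the task and abstract sensors (e.g. map)."""
    def __init__(self, reactive):
        self.reactive = reactive
    def run(self, task, sensors):
        goal = task["waypoints"][0]  # simplistic: head to the next waypoint
        return self.reactive.step(goal, sensors)
```

The key property of the hierarchy survives even in this toy form: the planner never talks to the servos directly, and the reactive layer can veto a goal without the planner's involvement.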
Speech Processor Module
The speech processor module consists of speech recognition and speech synthesis sub-modules from Sensory Inc. The speech recognition sub-module can be trained to recognize a set of 50 English commands. The recognized words are used to activate different behaviors of the robot, enabling natural communication between the human user and the robot. The speech synthesis sub-module lets the robot convey its intention naturally through speech by selecting the most suitable sentence from a pre-recorded voice bank. Lip synchronization has been achieved with the 3D-avatar interaction system, and emotional speech, which fuses emotional expressions into speech synthesis, has been tested on RE@H-II.
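The mapping from recognized commands to behaviors can be sketched as a simple dispatch table. The command phrases and behavior names below are purely illustrative; the actual module uses the Sensory Inc. recognizer with its own trained command set.

```python
# Sketch: bind recognized command phrases to robot behaviors.
# Phrases and behavior names are hypothetical examples.

BEHAVIORS = {
    "come here": lambda: "navigate_to_speaker",
    "fetch the cup": lambda: "fetch_and_carry",
    "introduce yourself": lambda: "play_intro_speech",
}

def dispatch(recognized_text):
    """Activate the behavior bound to a recognized command, if any."""
    handler = BEHAVIORS.get(recognized_text.strip().lower())
    # Unrecognized input triggers a clarification behavior rather than silence.
    return handler() if handler else "ask_to_repeat"
```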
We integrated the AL5D robotic arm into RE@H-II to meet the manipulation objectives of the RoboCup@Home competition. The AL5D delivers fast, accurate, and repeatable movements. The arm has a 10.25" median reach, a 13 oz lift capacity, 4 DOF, and all-aluminum construction; it features base rotation, single-plane shoulder, elbow, and wrist motion, a functional gripper, and an optional wrist rotate. Figure 5 shows the arm system of RE@H-II. Inverse kinematics and arm control strategies are handled by the main processor, which also handles decision making. Based on object-of-interest information from the vision processor, the main processor computes the 3D coordinates and sends the appropriate motor commands to the actuators fitted to the arm manipulator. The manipulator then moves to the received coordinates and uses its customized gripper to hold the object. A customized two-finger gripper system was built to handle tasks in the RoboCup@Home competitions.
Figure 5. Robotic Arm System of RE@H-II
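The inverse kinematics step mentioned above can be illustrated with the standard closed-form solution for a planar two-link chain, of the kind applicable to the shoulder/elbow pair of a 4-DOF arm. This is a textbook sketch, not the RE@H-II implementation, and the link lengths are illustrative values rather than the AL5D's dimensions.

```python
import math

# Sketch: closed-form inverse kinematics for a planar two-link arm.
# Link lengths l1, l2 are illustrative, not the AL5D's actual geometry.

def two_link_ik(x, y, l1=0.15, l2=0.11, eps=1e-9):
    """Return (shoulder, elbow) angles in radians reaching point (x, y),
    or None if the point is outside the arm's workspace."""
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if c2 > 1.0 + eps or c2 < -1.0 - eps:
        return None                        # target unreachable
    c2 = max(-1.0, min(1.0, c2))           # clamp numerical noise
    elbow = math.acos(c2)                  # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

In a full pipeline, the base-rotation joint first aligns the arm's plane with the target's bearing, reducing the 3D reach problem to this planar case.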
Human Robot Interaction
RE@H-II is capable of sensing bio-signals including systolic and diastolic blood pressure, temperature, blood sugar level, muscle contraction and expansion, and ECG and EEG signals. With a rising elderly population, the robot is expected to assist closely in elderly care, so a personal care box containing a blood pressure kit, a thermometer, and a diabetes kit was mounted on the robot. RE@H-II was used for our experiments in human-robot interaction. Currently, there is no metric to quantify erroneous interactions between a human and a robot that are meant to interact freely. In most real-life applications, erroneous interactions are common due to uncertainties in both the human and the robot. We defined false alarms in human-robot interaction and categorized them into false positives, where the robot rejects a "correct" interaction, and false negatives, where the robot fails to reject an "incorrect" interaction. False alarms negatively impact performance and fan-out in human-robot teams. Most research assumes zero false alarms, which yields optimistic predictions of performance and fan-out; this not only causes the operator to fall behind schedule because of higher attention demands in the actual situation, but also leaves the operator unable to handle the planned number of robots and tasks. The performance of robots in human-robot teams is complex and multifaceted, reflecting the capabilities of the robots, the operator(s), and the quality of the interactions. We extended the neglect tolerance model, used in the human-robot interaction community as a general index of robot performance in relation to autonomy, by incorporating the demands due to false alarms during human-robot interactions. Figure 6 shows results from our extended neglect tolerance model with RE@H-II on the fetch-and-carry challenge, presenting the relationship between performance, operator, and time.
We also redefined the fan-out metric commonly adopted in the human-robot interaction community, incorporating the demands due to false alarms, to obtain a realistic prediction of the maximum number of robots a single operator can handle simultaneously while maintaining acceptable performance.
The RE@H-II project aims to develop a platform for competing in the RoboCup @ Home League and for our ongoing research in human-robot interaction, robot localization, and navigation. An object-oriented software framework implements the object recognition, motion control, and communication modules to achieve real-time localization and navigation. In our experiments, RE@H-II exhibited excellent navigation, object manipulation, human-robot interaction, and emotion-expression skills. For more detailed information about the Robo-Erectus humanoid robots, please refer to the team's website: www.robo-erectus.org
The authors would like to thank staff and students at the Advanced Robotics and Intelligent Control Centre (ARICC) and higher management of Singapore Polytechnic for their support in the development of our service robots.