Tele-operations in hazardous environments are often hampered by the lack of available information regarding the state of the remote robotic device. Typically, ideal camera placements are not possible, and an operator is left with the problem of performing complex maneuvers in the presence of severe 'blind-spots'. A telerobotic approach can be adopted that allows a robot to detect and avoid collisions with its environment automatically; however, while this simplifies the task for an operator, total robot autonomy is inadvisable in safety-critical applications. At all times an operator must be in total control, yet the remote robot must be allowed to perform its local problem solving autonomously. To address this dilemma, we have been investigating the use of a haptic interface which not only allows an operator to communicate motion commands to a robot, but also allows the robot to communicate its motion to the operator when performing autonomous collision avoidance. This haptic communication thus provides total operator control, plus vital information that can be used to decide if and how a robot's autonomous operation should be overridden. This paper details our work in this area and presents the results we have obtained from operator/task performance experimentation with this new haptic communication method.
Hazardous environment operations such as nuclear plant decommissioning or terrorist bomb disposal typically require the use of a remotely operated mobile manipulator vehicle. Visual information concerning the vehicle and its environment is essential if a remote operator is to successfully achieve a given task. However, ideal camera placements within such environments are not always possible, and in most cases an operator has to depend upon the information provided by a single camera mounted on the vehicle. This provides a very restricted 'window' onto the vehicle and its environment, and a number of 'blind-spots' exist. The lack of visual information when operating in cluttered environments makes vehicle maneuvering very difficult, and when this situation is exacerbated by strict time limits for task completion, vehicle/environment collisions and resultant damage can occur.
An operator can generate x, y motion commands from the joystick whilst feeling appropriately synthesized haptic sensations via the joystick handle. We realized that this bi-directional exchange of information could be used as a low-bandwidth communication method, thus allowing an operator to control a remote robotic device whilst receiving information regarding the robot as it encounters objects within its environment. Furthermore, we also realized that haptic communication could be used not only for standard teleoperation tasks, but also for those applications requiring a telerobotic approach. It has long been recognized that the cognitive loading placed upon an operator could be alleviated if a remote robot were equipped with appropriate sensors and allowed to make its own local decisions regarding problems such as collision detection and avoidance. However, while in principle this may appear desirable, for safety-critical applications total robot autonomy is imprudent.
An operator must be in total control at all times; hence a telerobotic approach is preferable to total robot autonomy. However, the problem remains of how to communicate to a remote operator exactly what the semi-autonomous robot is doing, so that its behavior can be overridden as and when required. We realized that haptic communication may be of use in this area, and we have conducted a number of telerobotic experiments using the haptic joystick. This paper details our work in the areas of teleoperation and telerobotics, with and without haptic feedback to the operator, and the results we have obtained from operator/task performance experiments are presented.
Throughout history, education has evolved and new teaching methods have been adopted in order to improve learning procedures. It has never been a static field, but has always tried to adapt to the current cultural and technological status and to the intellectual requirements of society. One of the most important questions has always been how to make students participate in the learning process. There are essentially two ways of participating in the learning procedure: the passive way and the active way.
The passive way of learning is to obtain knowledge without interacting with the medium that offers it. One such way is by reading books, where the student accepts the knowledge but has limited means of testing whether he has fully perceived it. In the passive way, students read, listen and view, but do not experience, and this leads to a low level of perception of part of the physical world. Moreover, most of us have noticed the decreased interest and enthusiasm that most participants show during such a procedure.
The active way of learning is to gain knowledge by participating, investigating the physical scene and manipulating its elements. One of the first active forms of learning was the experiments performed at school, providing students with the ability to acquire practical knowledge that contributes a great deal to the understanding of science.
In recent years, several multimedia and on-line applications have been released allowing children to study scientific topics. There are commercial applications concerning physics and chemistry, containing images, sounds, videos and animations that describe various phenomena and allow children to participate, interact and play with the content while gaining knowledge. In the active way of learning, students experience the principles that rule nature in a more focused way, and it becomes apparent almost instantly whether they have understood the theory that describes the physical world correctly or erroneously.
Newtonian Physics, trajectories & the Solar System
Within this application, the user navigates through the solar system, collects information about it and interacts with the various elements of which it consists, such as the planets, satellites, comets and asteroids. The user experiences the effect of forces when accelerating objects, as well as the strength of the gravitational forces applied to objects at different distances. For the purpose of interaction, the user is endowed with "super-powers". The MUVII H3DI offers students a chance to experience, feel and steadily learn the effects of simple mechanics at the scale of our solar system.
Model Assembly - Gears
The user will learn about the history of toothed wheels, gears and their applications through the years, and will experience the assembly of selected applications of gears through time. The user experiences the effects of forces such as weight, friction, motion and rotation, and will come to understand concepts such as the transmission of motion from one part of a machine to another.
A basic concept introduced in the applications' description is the metaphor. A metaphor is a visual, acoustic or haptic representation of an event that takes place in the scene. For example, the event of the user's hand grabbing the spaceship has the (visual) metaphor of the spaceship moving according to the hand's movement. When the spaceship approaches a planet, a sound (acoustic metaphor) corresponding to the strength of the gravity field is generated with escalating volume.
A number of related hypotheses were proposed:
If haptic feedback was present during a teleoperation task, then improved operator performance would be obtained.
If a telerobotic approach was adopted, as opposed to teleoperations, then further improved operator performance would be obtained.
If haptic feedback was present during a telerobotic task, then even greater operator performance improvements would be obtained.
The haptic joystick used for the experiments was an Immersion product, a two-degree-of-freedom device. It uses a local processor to read buttons and encoders, and to control two DC motors which generate the desired haptic sensation. The processor also provides high-level communications to a PC via an ISA bus card. For our experiments, the joystick was programmed to generate 'spring' sensations, each with a spring constant k that could be assigned to a particular robot sensor input. The resultant haptic sensation was one of pushing against a stiffening spring as the robot got closer to an obstacle.
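The stiffening-spring sensation described above can be sketched as a Hooke's-law restoring force whose spring constant grows as the robot closes on an obstacle. The following is an illustrative sketch only; the function name, the linear stiffening law and all constants (k_base, k_max, sensor_range) are assumptions, not values from the experiments.

```python
def spring_force(displacement, obstacle_range, k_base=0.2, k_max=4.0, sensor_range=2.0):
    """Restoring force for one joystick axis (illustrative sketch).

    The spring constant stiffens as the robot nears an obstacle, so
    pushing the handle toward it feels like compressing a hardening
    spring. All constants are assumed for illustration.
    """
    if obstacle_range >= sensor_range:
        k = k_base  # beyond sensor range: soft centring spring only
    else:
        # linearly stiffen from k_base up to k_max at zero range (assumed law)
        k = k_base + (k_max - k_base) * (1.0 - obstacle_range / sensor_range)
    return -k * displacement  # Hooke's law: force opposes displacement
```

With no obstacle in range, the joystick behaves as a lightly centred passive stick; as the range shrinks, the same handle displacement meets a progressively stronger opposing force.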
The telerobotic experiments required that the K2A vehicle possess obstacle detection and avoidance capabilities. These were provided by the BSA, which was developed to address the problems of behavior conflict resolution, behavior adaptation and behavior scheduling, and is based upon the assignment of utilities to sensor/response functions called behavior patterns (bp's). The BSA has been demonstrated to work well for real mobile robots and is a simple architecture to implement. Here the sensory stimulus is a forward-facing proximity sensor, and the utility and response functions have been chosen so that as the robot gets closer to an obstacle, the motion value decreases, thus decreasing the forward velocity of the vehicle.
At the same time, the utility (or importance) of this motion increases. Thus, the nearer the robot gets to an obstacle, the more important it becomes to slow the robot down. The response and utility values form a utilitor, and should a situation arise where competing utilitors are generated, these can be resolved by simple vector addition. For these experiments, a proximity sensor system was simulated which generated K2A-to-obstacle distance and direction data. Behaviour patterns were chosen that reduced the forward velocity of the robot as it approached an object, and turned the robot in a direction away from the detected object, thus implementing a simple obstacle detection and avoidance capability.
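The resolution of competing utilitors can be sketched as a utility-weighted combination of each behaviour pattern's proposed motion. This is a minimal sketch, assuming utilitors of the form (utility, forward_velocity, rotational_velocity) and a normalized weighted sum as the "vector addition"; the representation and function name are assumptions, not the paper's implementation.

```python
def combine_utilitors(utilitors):
    """Resolve competing behaviour-pattern outputs (illustrative sketch).

    Each utilitor is a tuple (utility, forward_velocity, rotational_velocity).
    The resultant motion is the utility-weighted combination of the
    competing responses; higher-utility bp's dominate the outcome.
    """
    total_u = sum(u for u, _, _ in utilitors)
    if total_u == 0:
        return 0.0, 0.0  # no active behaviour pattern: stay still
    v = sum(u * fv for u, fv, _ in utilitors) / total_u  # forward velocity
    w = sum(u * rv for u, _, rv in utilitors) / total_u  # rotational velocity
    return v, w
```

For example, an operator command bp driving forward and an avoidance bp demanding a turn, each with equal utility, blend into a slower forward motion with a partial turn.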
Interfacing the Haptic Device
In a series of experiments with ten different operators, five modes of teleoperation/telerobotics were tested, some with and some without haptic communication. As can be seen from Fig. 2, the exercise was to successfully maneuver the K2A vehicle through an obstacle course which comprised pairs of posts (gates) arranged in a slalom fashion. Operator performance measurements were based upon the total number of collisions with the posts, the total distance travelled through the course and the total time taken to negotiate the course. Each operator was allowed two minutes to familiarize themselves with driving the K2A using the joystick interface, and with the correct route through the slalom course. The five modes are described as follows:
A. Mode 0: Teleoperation without haptic feedback
In this mode, the operator was in total control of the K2A at all times. There was no haptic feedback and hence an operator had to rely entirely on visual information to maneuver the K2A through the slalom course. The forwards and rotational velocities of the K2A were proportional to the displacement of the joystick. The joystick spring constants, k, were set to a small value simply to allow the joystick to return to a centre position and hence provide the operator with the sensation that they were using a standard passive joystick.
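The mode 0 mapping can be sketched as a direct proportional relation between joystick displacement and the vehicle's forward and rotational velocities. The function name, axis convention and scale factors below are illustrative assumptions, not the experiment's actual parameters.

```python
def joystick_to_velocity(x, y, v_max=0.5, w_max=1.0):
    """Mode 0 sketch: joystick displacement drives K2A velocities.

    x, y are joystick displacements normalized to [-1, 1]; forward
    and rotational velocities are simply proportional to them.
    v_max (m/s) and w_max (rad/s) are assumed scale factors.
    """
    forward = max(-1.0, min(1.0, y)) * v_max    # push forward -> drive forward
    rotation = max(-1.0, min(1.0, x)) * w_max   # push sideways -> rotate
    return forward, rotation
```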
B. Mode 1: Teleoperation with haptic feedback
As with mode 0, the operator was in total control of the K2A at all times. However, unlike mode 0, the joystick spring constants, k, were dynamic. In addition to the available visual information, haptic data was communicated to the operator in the form of forces that intuitively conveyed information regarding the K2A's environment. Beyond a preset sensor range the joystick behaved as a regular passive device, as in mode 0. Within this preset range, the joystick provided an operator's hand with force sensations that conveyed when motion in a particular direction was likely to cause a collision. The generated x and y haptic force data were inversely proportional to the range d, and dependent on the orientation θ, of the K2A relative to a sensed obstacle.
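One way to realize the mode 1 force generation described above is to grow the force magnitude as the range d shrinks and direct it along the obstacle bearing θ. This is a hedged sketch: the inverse-range law, the sign convention for θ and all parameter values are assumptions made for illustration.

```python
import math

def obstacle_force(d, theta, sensor_range=2.0, gain=1.0):
    """Mode 1 sketch: x, y force components felt at the joystick handle.

    The magnitude is zero at the preset sensor range and grows as the
    obstacle range d shrinks; the force is directed along the obstacle
    bearing theta (radians) relative to the K2A. All parameter names
    and values are illustrative assumptions.
    """
    if d >= sensor_range or d <= 0.0:
        return 0.0, 0.0  # passive joystick beyond sensor range
    magnitude = gain * (sensor_range / d - 1.0)  # 0 at sensor_range, grows as d -> 0
    fx = magnitude * math.sin(theta)  # lateral component of the push
    fy = magnitude * math.cos(theta)  # longitudinal component of the push
    return fx, fy
```

An obstacle dead ahead at half the sensor range thus produces a purely longitudinal force, which the operator feels as resistance to further forward motion.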
C. Mode 2: Telerobotics without haptic feedback
This mode introduced a semi-autonomous capability to the K2A, which provided a simple collision avoidance behavior. The BSA synthesized an operator's joystick-generated forward and rotational velocity commands (implemented as bp's) with appropriate obstacle avoidance bp's, to produce a resultant forward and rotational motion for the K2A. As the vehicle approached an obstacle, the utilities associated with an operator's command bp's were decreased, whilst those associated with the collision avoidance bp's were increased. Hence, as the K2A was commanded by an operator to move closer to a post in a manner that may have caused a collision, control was dynamically moved away from the operator and greater control was given to the K2A's autonomous capability, thus preventing the operator from colliding the K2A with an obstacle. Once past a potential collision situation, control was dynamically returned to the operator. This mode had no joystick haptic feedback, and was similar to using a standard passive joystick device.
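The dynamic shift of control described above can be sketched as a crossfade of utilities between the operator's command bp's and the avoidance bp's, driven by obstacle range. The linear crossfade and the parameter values are illustrative assumptions.

```python
def blend_utilities(d, sensor_range=2.0):
    """Mode 2 sketch: utility crossfade between operator and avoidance bp's.

    At or beyond the sensor range the operator's commands carry full
    utility; as the obstacle range d shrinks toward zero, utility shifts
    linearly to the collision avoidance behaviour. The linear law is an
    assumption made for illustration.
    """
    closeness = max(0.0, min(1.0, 1.0 - d / sensor_range))
    operator_utility = 1.0 - closeness   # decreases as the obstacle nears
    avoidance_utility = closeness        # increases as the obstacle nears
    return operator_utility, avoidance_utility
```

Once the vehicle clears the obstacle, d grows again and the crossfade automatically hands control back to the operator, matching the behavior described above.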
D. Mode 3: Telerobotics with obstacle haptic feedback
As with mode 2, this mode provided autonomous collision detection and avoidance. However, it also provided haptic feedback to an operator in the same manner as mode 1. The major difference of this mode compared to mode 2 was that, as the K2A solved its local collision avoidance problems, the operator was able to 'feel' what the K2A was sensing within its environment. This allowed an operator to experience the presence of a post, and hence infer why the K2A was responding in a particular manner, even when such an obstacle was out of the camera's field of view.
E. Mode 4: Telerobotics with collision avoidance haptic feedback
As with modes 2 and 3, this mode incorporated the K2A's autonomous collision avoidance behaviour. However, the generated haptic sensation was considerably different. Instead of an operator being able to 'feel' what the K2A was sensing, the operator was provided with force feedback relative to the resultant motion generated by the BSA. This was achieved by feeding the forward and rotational velocity components from the collision avoidance behaviour back to the joystick as an updated centre position. Thus, in addition to the operator driving the K2A via the movement of the joystick, the K2A was also able to drive the joystick in a way that conveyed its motions to the operator. Hence an operator was able to 'feel' what the K2A was doing. This mode can be regarded as bidirectional haptic communication, as an operator had to impart forces onto the joystick so as to communicate desired motion to the K2A, while the K2A also generated forces onto the joystick so as to communicate its actual motion to the operator.
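The centre-position update that makes mode 4 bidirectional can be sketched as mapping the avoidance behaviour's resultant velocities back into normalized joystick coordinates, which then become the spring centre the handle is driven toward. Function name, axis convention and scale factors are illustrative assumptions.

```python
def update_joystick_centre(avoid_v, avoid_w, v_max=0.5, w_max=1.0):
    """Mode 4 sketch: feed the avoidance motion back as a joystick centre.

    The avoidance behaviour's forward velocity (avoid_v) and rotational
    velocity (avoid_w) are mapped into normalized joystick coordinates;
    the spring then pulls the handle toward this new centre, so the
    operator feels the K2A's actual motion. Scale factors are assumed.
    """
    centre_y = max(-1.0, min(1.0, avoid_v / v_max))  # forward component
    centre_x = max(-1.0, min(1.0, avoid_w / w_max))  # rotational component
    return centre_x, centre_y
```

Combined with a spring force about this moving centre, the handle both accepts the operator's commands and physically plays back the vehicle's avoidance motion.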
Haptic Applications / products / systems are used in:
Applications in this area are mainly simulators that recreate realistic medical procedures. These simulators allow healthcare providers to practice procedures in an environment that poses no immediate risk to patients. Such applications are provided by companies such as Novint Technologies, which has developed the Virtual Reality Dental System and Medical Imaging, and Immersion, with products such as the CathSim Vascular Access Simulator, AccuTouch Endoscopy Simulator and AccuTouch Endovascular Simulator.
In petroleum exploration, developing accurate models of the subsurface environment is a complex and challenging problem. Using existing two-dimensional mouse-and-keyboard interaction devices to work with three-dimensional data can be slow and cumbersome.
Novint has developed customized software, such as Voxel Notepad and Touchstone, which makes it possible to work in 3D with 3D data by adding haptic feedback and providing real-time, 3D interaction to existing visualization techniques.
Some haptic applications are developed for the simulation of mechanical parts or other systems (e.g. the landing gear system of aircraft), aimed at controlling and testing system operation before a prototype is produced. Boeing Co. has experimentally developed some haptic applications with the Phantom haptic 3D interface.
Haptics technologies offer a new way of creating and manipulating 3D objects. Several modeling systems have been developed to facilitate the digital fabrication of any type of model, from shoes to toys, and from classical fine-arts sculptures to industrial product designs. Examples of such systems include the Freeform modeling system by Sensable and Virtual Hand Studio by Immersion.
Entertainment - Games
The first commercial products with an elementary application of haptics were force-feedback joysticks, which provide the user with the sense of force effects while playing. Logitech and Microsoft have already produced such devices, providing users with more realistic interactivity in the games that support them.
A number of experiments have been performed to test the hypotheses:
Teleoperation performance can be improved if haptic feedback is introduced
Telerobotics yields improved performance over teleoperation alone
Telerobotics, when combined with haptic feedback, yields improved operator performance over telerobotics alone.
The results supported hypotheses 1 and 2, while hypothesis 3 has proved more elusive. When a remote robot is equipped with some autonomous behavior, e.g. collision avoidance, this telerobotic capability is extremely useful when maneuvering the vehicle in a cluttered environment. In the absence of ideal camera placements, haptic communication can be used to good effect to augment the information available to an operator. However, when this communication method is combined with telerobotics, there are likely to be additional costs in terms of operator performance, e.g. a greater task completion time. Nevertheless, improved safety is achieved, with zero collisions and greater information for the operator regarding the remote robot and its environment.
It became obvious that, despite the small sample of participants during these initial tests of the Kiosk, haptics technology improves the level of perception of some areas of the physical world due to the increased immersion.
Robotic arms can perform harder tasks
Construction of a model of a windmill