Reachin API Programmer's Guide


Reachin API 3.0 Programmer's Guide


Copyright © 1998-2001, Reachin AB - All Rights Reserved

The contents of this document may not be copied or duplicated in any form, in whole or in part, without the prior written permission of Reachin Technologies AB.

Revision No: 300-501-002-03000


Table of Contents

PART I - OVERVIEW OF THE REACHIN API

1 The Reachin API Framework
  1.2 Reachin API Scene Graph Overview
    1.2.1 Multi-Sensory Scene Graphs
    1.2.2 Typical Tasks of a Scene Graph
  1.3 Event Handling Overview
    1.3.1 Why Applications Use Field Networks
    1.3.2 Typical Event Handling Tasks
  1.4 Reachin API Module Overview
    1.4.1 Scripting
    1.4.2 Shapes
    1.4.3 Simulation
    1.4.4 Tracking Devices
2 Reachin API Nodes
  2.1 Building a Scene Graph
  2.2 Visible and Touchable Shapes
    2.2.1 Appearance and Surface Nodes
    2.2.2 Materials and Textures
  2.3 Object Positioning and Grouping
  2.4 Lights, Viewpoints and Bindable Nodes
  2.5 TimeSensor
  2.6 Animation and Interpolators
  2.7 Dynamics and Inertia
  2.8 Devices
  2.9 Data Nodes
3 Event Handling
  3.1 Field Networks
    3.1.1 Type Checking
    3.1.2 Event Propagation
    3.1.3 Lazy Evaluation
    3.1.4 Event Dependence / Independence
  3.2 Function Fields
    3.2.1 Process Dependence
  3.3 Overview of Haptics Rendering
    3.3.1 Reachin API Haptics Rendering Behind the Scene


    7.1.6 Running the C++ Created Scene Graph
  7.2 Using VRML to Create Nodes
    7.2.1 Using a Group of Created Nodes
    7.2.2 Creating Several Separate Nodes from VRML
    7.2.3 Using DEFed Nodes and Fields from VRML
  7.3 Fields
    7.3.1 SFields, MFields and ComposeMFields
    7.3.2 Class Modifier Templates
    7.3.3 TypedField and Reachin API's Run-Time Type Checking
    7.3.4 FField and EvaldFField
    7.3.5 Dependent and Independent
    7.3.6 MFNode and SFNode
  7.4 Extending Reachin API Nodes
    7.4.1 Data Member Specialisation
    7.4.2 Allowing Data Member Specialisation by a Sub-Class
    7.4.3 Creating a VRML Interface for a Node
    7.4.4 Loading Created Nodes
    7.4.5 Extending a VRML Interface for a Node
    7.4.6 Initialize
  7.5 Inspecting the Hardware Devices
    7.5.1 The DeviceInfo
  7.6 The Scene Graph Loop
    7.6.1 The Collision Traversal
  7.7 Moving Between Coordinate Spaces
    7.7.1 The Transform Node
    7.7.2 Collision Traversal Coordinate Spaces
    7.7.3 Motion
  7.8 Force Operators
    7.8.1 Interpolation
    7.8.2 Absolute Global Coordinates
    7.8.3 Relative Global Coordinates
    7.8.4 Absolute Local Coordinates
    7.8.5 Relative Local Coordinates
    7.8.6 Creating New Force Operators
    7.8.7 Affecting Scene Graph Objects from the Realtime Loop
    7.8.8 Accounting for Object Motion During the Real-Time Loop
  7.9 Surfaces
    7.9.1 Surface Contacts
    7.9.2 Texture Coordinates
    7.9.3 The Contact Force Operator
    7.9.4 Creating New Surfaces
  7.10 Geometries
    7.10.1 Creating New Geometries


PART IV - CLUSTER TUTORIAL

8 Cluster
  8.1 Building the Scene Graph Representation
  8.2 Setting Default Field Values
  8.3 Creating the VRML Interface
  8.4 New Fields, Defined Names and Routing
  8.5 The Initialize Function
  8.6 Adding Rotational Behaviour for the Planet
  8.7 Building the Cluster World Structure
    8.7.1 Populating the World
  8.8 Adding Planet Behaviour
    8.8.1 EnterNode/ExitNode
  8.9 Creating the Graphical Rubberband
    8.9.1 Composing the Rubberband End Points
    8.9.2 Defining the Rubberband End Points
  8.10 Adding Forces to the Rubberband
    8.10.1 Using a Simple Force Operator
    8.10.2 Creating a Custom Force Operator
    8.10.3 Affecting the Scene Graph with the Force Operator
    8.10.4 Accounting for Planet Motion in the Realtime Loop
    8.10.5 Adding Slack to the Rubberband
  8.11 Hooking On to the Planet's Surface
  8.12 Gradually Raising the Gravity
  8.13 Coordinate Spaces
  8.14 Performing Collision Detection
  8.15 Using the Most Recent Force Operator

APPENDIX A: URNs - Universal Resource Names
  A.1 URN Index File
  A.2 URN Example


APPENDIX B: Environment Variables
  B.1 Syntax
    B.1.1 Simple Example
  B.2 Using Lists of Variables
  B.3 Defaults
  B.4 Easily Specifying URNs
  B.5 Conventions

APPENDIX C: A Very Brief Python Primer

APPENDIX D: Data Specialisation in C++

APPENDIX E: Reachin API and VRML Nodes

REFERENCES
  C++
  Graphics
    General Graphics
    OpenGL
  VRML
  Haptics
  Python

INDEX


Preface

This programmer's guide is intended to enable an understanding of the Reachin API at various levels to be obtained as rapidly as possible. It is not a complete reference to all the features of the API; that is provided by the Reachin API Online Reference Manual, located in the API/doc directory. We aim here to give an understanding of how to use the API's VRML (Virtual Reality Modelling Language) interface, how to use and extend the API using C++, and general haptics programming issues.

The manual is divided into 5 parts. These are:

• Overview of the Reachin API
• High-Level Programming
• Low-Level Programming
• The Cluster Tutorial
• Appendices

The Reachin API is an application programming interface for producing multi-sensory interactive applications. It is designed to give the programmer a fully extensible programming framework for building interactive applications and prototypes. It integrates haptic and graphic rendering capabilities into one easy-to-use API.

The Reachin API is a C++ API based on the VRML scene graph model. As such, it can be programmed using VRML and Python (as a scripting language) as well as with C++. The former allows for extremely rapid and simple prototyping of interactive graphic and haptic environments. However, the Reachin API is still primarily a C++ API, and much of the power of the API can only be accessed by using C++.

Required Knowledge

For those wanting a basic understanding of the Reachin API, reading the first part (chapters 1-4) of the book requires no previous knowledge.

The second part of the manual requires some knowledge of 3D computer graphics. Some familiarity with VRML and Python is desirable.

The third section, Using Reachin API with C++, requires an understanding of C++ and object-oriented concepts. In particular, the reader is assumed to have some familiarity with templates and the STL (Standard Template Library).

The references section at the back of the book is strongly recommended as a starting place for obtaining knowledge of these subjects.

Conventions used throughout the manual

The basic building block in the scene graph is a node. We write the names of nodes in bold, e.g. the Shape node. Abstract nodes are written in bold italics, e.g. Interpolator.

The basic information-holding primitive is called a field. We write the names of fields, field types and events in fixed space characters, as in the "radius" field.

We write new terms in italic, as we have just done. Examples are "a prototype is..." or "the addChildren event causes..."

This manual is written using English spelling; the official VRML97 specification is written in American English. The two differ in some cases. We have chosen to use English spelling in free text, while field and variable names are in American English.

Code examples appear indented and in a fixed space font. Example:

    momentum->set( Vec3f( 0, 0, 0 ) );

Code quotations in the text, e.g. integer values, are also in a fixed space font.

Example Code

The example code is located in the examples\rapg directory.






Part I - Overview of the Reachin API

The topics covered in this part are:

• An overview of what the API does
• An outline of how the API works
• A description of the API's different modules
• An overview of the API's event handling systems

This first part of the book covers the major functional areas of the Reachin API. It is written from a conceptual point of view rather than an in-depth technical view.

This chapter has no particular background knowledge requirements.




1 The Reachin API Framework

The Reachin API brings together a multitude of different techniques for building multi-sensory 3D applications. This chapter explains most of them conceptually. Topics covered in this chapter are:

• Reachin API scene graph overview
• Event handling
• Reachin API module overview
• Reachin API and VRML node overview
• The basics of multi-sensory 3D rendering

1.2 Reachin API scene graph overview

The Reachin API is based around the concept of the scene graph. A scene graph is a hierarchical data structure that describes a 3D scene. It typically holds the geometry of all objects in the scene and their relative positions, as well as appearance attributes such as colour, transparency, textures and surfaces. Other information contained in the scene graph can include light sources and the viewing position.


One of the key programming benefits of a scene graph is the abstraction layer above low-level graphics. APIs such as OpenGL are focused on the "How?" instead of the "What?". Scene graph programming can be considered declarative as opposed to procedural. For programmers this means less code and less complexity.

The scene graph structure also allows optimisations in graphics rendering that are not possible (or very difficult) in low-level graphics APIs. For example, a scene graph can be ordered such that the number of state changes (e.g. changing the colour of objects being rendered) is minimised by rendering objects of the same colour at the same time. The benefits to both development time and rendering performance have made the scene graph an important mechanism in computer graphics.

1.2.1 Multi-sensory scene graphs

To perform haptic (touch) rendering, the haptic rendering engine needs to work with geometric data in order to know when the user should feel an object. Ideally, this should be rendered from the same data that the graphics engine is using for drawing.

It is acceptable for graphics rendering to update the screen 20 to 30 times per second without affecting the user's interaction with the application. In contrast, if the haptics rendering is not performed at close to a thousand times per second, the haptic illusion breaks down. If you push at a solid object and the haptic rendering is not being performed fast enough, your hand can move too far inside the object and will then be jerked out of it. Therefore, we need a haptics loop that is executed more frequently than the graphics loop. We cannot execute the graphics loop as part of the haptics loop either, because the graphics loop would typically take more than one thousandth of a second to render the scene. The solution is to make the application multi-threaded, allowing the haptics rendering to be performed in a separate, higher priority thread to the graphics. This introduces problems of data synchronisation that the application developer should not have to be concerned with.

In older frameworks, the two loops have run so independently that each keeps its own copy of all the triangles, position information, and so on: two scene graphs, in fact two scenes. This both wastes desperately needed CPU time and makes synchronisation tricky. If your eye tracks an object your finger pushes around, the senses should agree on its motion, so the two loops should be updating each other. The API's multi-sensory scene graph shares the data between the graphic and haptic rendering, and hides the multi-threaded aspect of the rendering.


1.2.2 Typical tasks of a scene graph

A scene graph provides a convenient framework for managing objects in a scene, and makes it easy to express the relationship between those objects. For instance, it is useful to group related objects in a shared coordinate system so that even when an object is moved or rotated, the relationship with related objects is not disturbed. For example, to model a chess set you should be able to move the entire board. When doing this, you do not want to separately program the motion of the pieces one at a time, but rather move the board with the pieces staying in their squares. Equally, you want to be able to move the individual pieces when you play the game.

[Figure 1.1 diagram: a "Chess board" Transform at the top, with child Transforms giving the positions of the "Castle", the "Chess board" and the "Pawn"; each leads to a Shape node holding appearance and geometry fields.]

Figure 1.1: The scene graph concept.

Figure 1.1 shows a partial scene for a chessboard, along with a possible scene graph representation. Some of the nodes in the scene graph describe the grouping of related objects in the scene, while the leaf nodes at the bottom of the scene graph contain the actual geometry and appearance of the shapes that constitute the scene objects. By changing the Transform node above any individual chess piece, you can move it without affecting the other pieces or the board. By changing the node at the top, you move the whole board configuration, keeping the pieces in their squares.
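The same structure can be written directly in VRML. The following is a minimal sketch of the chessboard grouping (all geometry and translation values are invented for illustration): translating the outer Transform moves the board and the pawn together, while translating the inner Transform moves only the pawn.

    Transform {                 # "Chess board": moving this moves everything below it
      translation 0 0 0
      children [
        Shape {                 # the board itself
          appearance Appearance { material Material { diffuseColor 0.6 0.4 0.2 } }
          geometry Box { size 0.4 0.01 0.4 }
        }
        Transform {             # position of the "Pawn" relative to the board
          translation 0.05 0.02 0.05
          children [
            Shape {
              appearance Appearance { material Material { diffuseColor 1 1 1 } }
              geometry Cylinder { radius 0.01 height 0.03 }
            }
          ]
        }
      ]
    }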


1.3 Event handling overview

If the scene graph is a structure for arranging the entities of a scene, the event handling capabilities make that scene move and react to the user's actions. A scene graph without dynamic capabilities would not be much more than a still-life painting. Combining event handling capabilities and the multi-sensory scene graph, we turn the painting into an interactive movie.

The basic mechanism that links all the scene graph components together is the event handling field network. Through the field network, we define the interactions between objects and other dynamic elements in the scene.

The key benefit of having a uniform event handling infrastructure is that the programmer has a clear set of methods for describing data dependencies, controlling data access and applying functions to data. The field network as such gives the programmer the right tools to express the "who can cause what" relationships, once the dependencies have been identified. In addition, fields have the mechanisms necessary to express decision points or to transform data as it propagates through the field network.

1.3.1 Why applications use field networks

When your virtual brush meets virtual paper, the paper must 'know' about it and react appropriately. Classical programming had a main routine and passive data structures, enabling the programmer to apply functions to that paper. Object-oriented code makes paper and brush into things that can change themselves, with an event handling network to let them pass messages to one another so that they respond and change.

When we can model the scene in such a way that each parent in the scene graph automatically communicates with each of its children and vice versa, we get the power to express the "What?" of some behaviour as opposed to the "How?". Such behaviour could be achieved in many ways. One way is to include a separate object that does nothing but store relative positions and tell objects where they ought to be; another just tells the graphics engine where to draw. The point is that a scene graph alone could not handle the interaction of brush and paper. We need some sort of data dependency and synchronisation structure in addition to the scene graph. The field network handles such data and message passing tasks in the scene.

If the field network could only pass messages and data, its usefulness would be limited. To further increase their usefulness, fields can be set to perform calculations on incoming and outgoing values. This extremely powerful feature allows the programmer to incorporate complicated behaviour directly within the fields of a node, which makes for easily verifiable building blocks.

1.3.2 Typical event handling tasks

When programming simulations or plain user interaction, the programmer cannot explicitly list all possible interaction patterns between the user and the objects in the scene. What he or she can do is set up rules for how things are allowed to interact. When a field network manages changes in the scene graph, the interaction rules define what can happen, and the field network makes sure that all data needed to perform an action is accessible.

In the example of the chess set, the field network performs a number of tasks. When a chess piece is moved, the scene must be re-rendered. A change in the position of the chess piece will trigger an event propagation through the field network. The field network then updates all fields that have events pending, including the fields used for graphics rendering, which causes the scene graph to be visually rendered.

Additional benefits of field networked scene graphs

The modularity that comes from nodes and field networks lends itself extremely well to systematic, diagram-based modelling. Nodes can be seen as black boxes in the sense that one can use a node without knowing anything of its internal implementation. A node can be accessed through its fields, which act as interfaces to the internal node implementation.

With fields as the foundation of the API, we have a stable base for building complex multi-sensory applications. The field provides a tool to seamlessly integrate modules built on the Reachin API, and is the key to extensibility.

1.4 Reachin API module overview

The Reachin API is an API for defining and interacting with multi-sensory three-dimensional virtual worlds. In particular, we target the co-location of graphics and haptics to interface with the primary senses of interaction.

More concretely, the Reachin API is a multi-sensory rendering engine that integrates visual and haptic rendering through the use of one single scene graph manager.


Before programming with a scene graph, you need to know what nodes are available to be used in a scene graph. Figure 1.2 is one way of organising the major conceptual modules within the Reachin API.

At the top is the multi-sensory scene graph manager, which maintains the integrity of the scene. Directly below this layer we have the event handling capabilities: the framework that allows you to plug in different modules in a coherent way and to define your own interaction models within the scene graph. Event handling and field networks are further discussed in chapter 3 "Event Handling".

[Figure 1.2 diagram: the Multi-sensory Scene Graph Manager sits above the Event Handling Capabilities, which connect the modules: Scripting (PythonScript); Shapes, with geometries (Box, Cone, Cylinder, Sphere, Panel, IndexedFaceSet, IndexedLineSet) and appearances (Surface, Material, Texture); Simulation (Interpolators, Dynamic, Inertia, GravityDynamic, SlotDynamic); and Tracking Devices (PhantomDevice, HapticsDevice, Magellan/SpaceMouse, TrackingDevice, DeviceInfo). Beneath these, via haptic collision detection, surface generation and 3D volume handling, lie the rendering layers: force rendering of surfaces in a 1,000+ Hz realtime loop over haptic I/O and generic drivers, and graphics rendering in a 30 Hz scene graph loop over OpenGL, with display list management, OpenGL code generation and caching.]

Figure 1.2: A conceptual overview of the major modules present in the Reachin API.


Connected to this, a number of modules are integrated into the scene graph manager through the event handling layer. These modules give the API an ever-growing range of different functions. The module list given here is by no means complete, but it makes up a map that we will break down into a more fine-grained picture. We explain each module below.

The third layer is the rendering layer. The API renders objects in pixels and forces. Initially, you do not need a technical understanding of how things are rendered. The API performs a multi-sensory rendering of the virtual objects. It interfaces to OpenGL (for graphics) and the haptics device driver (for haptics) without the programmer needing to be concerned with the details.

We now step through each of the modules in Figure 1.2, starting with the module layer, i.e. with scripting, shapes, simulation and tracking devices. The chapter then turns towards how multi-sensory rendering is achieved, and concludes with a more detailed discussion about event handling and field networks.

1.4.1 Scripting

The Reachin API supports scripting in VRML using the Python scripting language. You can use the VRML parser for rapid prototyping of all or parts of an application. The scripting nodes allow you to integrate application behaviour directly in the scene graph structure. PythonScript nodes let you explore different solutions to a problem without the delays of recompiling and linking.

The close relationship between Python and C++ puts almost all of the API's functionality at your fingertips while prototyping applications. If maximum efficiency is required, Python-scripted solutions can be translated almost directly into C++.

Chapter 6 "Python, fields and field networks" describes how to use Python with the Reachin API, so that you can start modelling application behaviour.
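As a sketch of how a scripting node sits in the scene graph, the fragment below declares a PythonScript node whose url field points out the script to load. The URN is invented for illustration.

    PythonScript {
      url "urn:inet:reachin.se:/demos/example/behaviour.py"   # hypothetical script location
    }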


1.4.2 Shapes

The Reachin API supports multi-sensory rendering of many different built-in shapes, some with fixed geometry like spheres, others user-specifiable like the 'indexed face sets' that let you import your own model, such as a skull or a milk carton. The Shape node is an abstract building block that references the actual building blocks that make up a shape, namely its appearance and geometry. The API supports enough shapes (geometries and appearances) for you to design all the solid objects you can find in reality.

Chapter 2 "Reachin API Nodes" walks you through the different geometries built into the API, to give a basic idea of what they are and what one can do with them. For now, all you need to know is that everything that can be visually and haptically rendered on the screen is a Shape, divided into a geometry and an appearance.

1.4.3 Simulation

The simulation module found in Figure 1.2 is actually not a stand-alone module; it is more a way of grouping a number of different nodes conceptually. The names in the simulation box (Figure 1.2) all identify nodes that can be used to achieve animation and simulated real-world behaviour, including rigid-body physics.

Most of the simulation node types are among the more complex nodes, but they follow exactly the same pattern of use in the scene graph. We will introduce the different node types in order of complexity, ending with the simulation nodes. Section 2.7 describes what you can do with these nodes, and lists them in more detail.


1.4.4 Tracking devices

Two general categories of input devices help us interact with three-dimensional applications.

The first category is tracking devices. Widely available hardware enables a full six degrees of freedom in input, reporting the full motion of a device the user holds: up, down, sideways, and forward and back, plus rotation around any axis. The term "degrees of freedom" is usually shortened to DOF. Examples of 6DOF tracking devices are the Magellan/SpaceMouse and SensAble's PHANTOM.

The second category common in 3D applications is the haptic devices, providing force feedback. 'Haptic' describes what most of us normally call "touch", but it is quite different from tactile feedback. With tactile feedback devices, such as datagloves that vibrate when you pass through an object, you can feel the wall but your hand goes through it. Nothing with leverage based only on the hand and wrist can pull your hand down except by its own weight. With a haptic device, the user holds a tool connected to the device, which can simulate weight, surfaces and so on.

The Reachin API supports 6DOF input and 3DOF haptic output. Full 6DOF haptic output devices are currently very costly, although there is some promising work being done in that area. The most widely used haptic feedback device is SensAble's PHANTOM, which combines 6DOF tracking with 3DOF haptic output, without twisting forces.

The default graphical representation of the input stylus (Figure 1.4) is a small ball disconnected from the handle. It is disconnected to show that the handle is not subjected to force feedback, meaning that the handle is there for visualisation purposes only. However, since the PHANTOM is a 6DOF input device, one can render different forces depending on how the user holds the stylus.

The modules found in Figure 1.2 are, as mentioned, just one way to group the building blocks that make up the API. Now that we have described these modules, it is time to plunge deeper into the actual building blocks of a Reachin API application. The next chapter presents a more fine-grained picture of these modules and presents them for what they are: Nodes.


Figure 1.3: When simulating touch with a 3DOF haptic device, it makes no difference to the user how the tool is held. As seen in the picture, both representations would make the user feel the same forces at the tip, whether the tool is "inside" or above the surface.

Figure 1.4: The default graphical representation of the input stylus. The tip is disconnected from the handle to emphasise that the tip alone is subjected to force feedback, as well as to provide good visibility around the tip.






2 Reachin API Nodes

The relationship between the Reachin API and VRML is critical to understanding nodes. This chapter describes the major categories of nodes present in the API. We start with the top-level nodes, the shape nodes, mentioned in the previous chapter. Other topics covered in this chapter are:

• Scene graph basics and scene graph diagrams
• How to construct scene graphs
• Reachin API and VRML nodes

As stated in the previous chapter, the Reachin API uses a VRML based scene graph. This is because the VRML scene graph is well defined and public, with many excellent tutorial resources. It lets you use a host of sophisticated VRML manipulation tools, such as Cosmo Worlds, in creating objects and applications to be used with the API.

We strongly advise the reader to look at some of the VRML learning and reference resources, in particular the VRML 2.0 Sourcebook [11] and the VRML specification [10].

We cannot use only the VRML standard, because it does not include entities (nodes and fields) for defining haptic properties. The Reachin API extends the VRML scene graph with additional features so that haptic properties can be incorporated directly within the scene.


In order to explain these differences, we begin with the basic building blocks of a scene graph, common to both the Reachin API scene graph and the strict VRML scene graph.

2.1 Building a scene graph

The basic unit of a scene graph is a node. Shape, Appearance and Box are all examples of nodes.

A node is abstractly a building block that has certain attributes. A Box node, for example, knows how to render itself visually to the screen when it is placed in the scene graph. If we merely state that we want a Box, it will be rendered with its default values. The default values for a Box, according to the VRML specification, set all sides of the Box to length 2. More accurately, the field named "size" has its default value set to "2 2 2". Since all units in VRML and the Reachin API are given as SI units, this would result in a box with each side two metres long, most appropriate for head-immersive VR. Using a hand-immersive display, it would be more common to set the box's size to 0.1 metres.

Fields are the second most important building block in the scene graph. Their most basic role is to store data and direct behaviour for the particular node they belong to. Fields have a lot more functionality attached to them, but for learning purposes, you can regard them as the data holders of any particular node.

In most programming languages data holders, i.e. variables, have a type. That is also the case with Reachin API fields. In the Box case, where the field named size contains three individual values denoting the X, Y and Z sides of the box, the type is called SFVec3f. The "SF" part means that it is a Single Field value, the "Vec" section means that it is a vector (it holds more than one value), and the number "3" means that it is a three-dimensional vector. Finally, the "f" indicates that it consists of floating point values.

There is a quite large number of different field types available in the Reachin API. Most of them are self-explanatory and, more importantly, you need not take them all in at once. It is enough to realise that a field has a type, just as any variable or class in most programming languages.

Throughout this book, node names are written in bold. Field names are written in a fixed space font. In scene graph diagrams, nodes appear in light grey and fields in the white boxes underneath, as seen in Figure 2.1.
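To make the units concrete, here is a small VRML fragment contrasting a default Box with one sized for a hand-immersive display. The values follow the defaults quoted above.

    Shape {
      geometry Box { }                    # default size "2 2 2": a two-metre cube
    }

    Shape {
      geometry Box { size 0.1 0.1 0.1 }   # a 10 cm cube, better suited to a hand-immersive display
    }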


[Figure 2.1 diagram: a Shape node drawn in light grey, with its appearance and geometry fields drawn as white boxes underneath.]

Figure 2.1: The elements of a scene graph diagram. Nodes appear in light grey and fields as white boxes underneath. (Informative arrows are printed in light grey.)

Fields and nodes are two separate entities, but fields and nodes may have the same name. It is important to distinguish between the two. For example, the Shape node has an appearance field (used to store an Appearance node) and a geometry field that stores any one geometry, such as a Box. You can distinguish the appearance field from the Appearance node both by the typeface and, more importantly, by the convention of naming only nodes with an initial capital letter. If a field shares its name with some node, that field is always used to store instances of that node.

With this basic nomenclature, we can briefly describe the major node types available. The Reachin API scene graph is structurally equivalent to the VRML one, but omits some nodes present in the official VRML specification. The reason is that the VRML specification is mostly aimed at VRML browsers such as the Silicon Graphics 'Cosmo Player' (a browser plug-in), and some entities serve only that target.

Some nodes in the Reachin API Online Reference Manual are not present in the official VRML specification. Some of these nodes are extensions to standard VRML, others serve as abstract base classes useful when subclassing, i.e. building new node types using C++. For example, the Geometry node is used when building all the different geometry nodes, such as the Box, Sphere and Cone nodes.

Other fields and their uses

All node types found in the API have fields, from just one to more than twenty different fields. These act as the primary data holders for any one node and, more importantly, can be connected to each other to provide synchronised behaviour or synchronised visual and haptic rendering properties. Fields are therefore, apart from being data holders, also event triggers and receivers. Fields are connected to other fields by routes. These field routes define the event handling layer, or the field networks, mentioned above. Understanding field networks requires some understanding of the higher level particulars of the API, so the discussion about field networks and event handling is postponed until Chapter 3 "Event Handling".


2.2 Visible and touchable Shapes

The Shape node is an abstraction of what will be rendered visually and (optionally) haptically. Dividing a shape into its geometry and its appearance (colour, hardness, etc.) lets us reuse a certain appearance on different geometries, or reuse some geometries with a single appearance node.

The Shape node has two fields, appearance and geometry. These are used respectively to store an Appearance node and one of the different geometries available in the API, say a Box node. The scene graph below (Figure 2.2) would produce a red three-dimensional box on the screen.

[Figure 2.2 diagram: a Shape node whose appearance field holds an Appearance node with a material field storing a Material node (diffuseColor 1 0 0), and whose geometry field holds a Box node (size 0.1 0.1 0.1).]

Figure 2.2: The scene graph for a red box. A Shape node contains an appearance and a geometry field. The appearance field holds an Appearance node, and the geometry field stores a Box node. The Appearance node has a material field, used to store a Material node. The Material node holds a diffuseColor field set to red (RGB). The Box's size is set to 0.1, 0.1, 0.1 and the values are stored in the field named size.

This scene graph would result, if rendered, in a plain symmetric 3D box. The material field found in the Appearance node stores a Material node, with data (fields) that set graphic information on how the node should be rendered; here a diffuse colour stored as RGB (Red Green Blue) values.

With this basic understanding of how shapes are handled in both VRML and the Reachin API, let us take a closer look at the various geometries available in the API. The following geometries can be referenced in the geometry field of a Shape node.
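Written out as VRML source, the scene graph of Figure 2.2 reads:

    Shape {
      appearance Appearance {
        material Material { diffuseColor 1 0 0 }   # red, given as RGB values
      }
      geometry Box { size 0.1 0.1 0.1 }            # a 10 cm box
    }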


Geometry nodes

  Box             A three dimensional box
  Cone            A three dimensional cone
  Cylinder        A three dimensional cylinder
  Sphere          A three dimensional sphere
  Panel           A simple flat 2D square
  IndexedLineSet  Arbitrary 3D geometry formed by polylines between 3D vertices
  IndexedFaceSet  Arbitrary 3D shape formed by polygonal faces between vertices
  PointSet        A set of infinitesimal points
  ImageTextFace   A geometry that renders text

Table 2.1: The Reachin API geometry nodes

2.2.1 Appearance and surface nodes

The purely visual node functionality in VRML and the Reachin API, described above, is completely compatible, but the API also supports features for touchable surfaces. The Appearance node has four different fields: material, texture, textureTransform and surface. Note that the surface field (and node) is not standard VRML.

The first field, material, stores a (graphical) Material node, which directs colour attributes, transparency and shininess, all relevant for the graphic rendering engine. The next two standard VRML fields are concerned with textures (showing bricks, scales, photographs, etc. on objects).

The surface field is a pure Reachin API field holding a reference [1] to some type of surface node. A surface node, the haptic equivalent of the Material node, adds haptic properties to the individual object we attach it to in the scene graph. The Reachin API ships with many different predefined surface properties, ranging from simple user-specifiable stiffness to magnetic, bumpmapped and rough surfaces. Designing a red box that the user can feel becomes no more difficult than adding a SimpleSurface node to the surface field in the Appearance node. The resulting scene graph is shown in Figure 2.3.

The different types of surface nodes (for haptic behaviour) are treated just like any node in the official VRML97 specification; but remember that they are exclusive to the Reachin API and therefore will not be available in other third-party VRML packages.

[1] Fields are, as stated, the data holders of a node. More functionality is attached to them, as discussed in Chapter 3 "Event Handling". When programming, fields hold references to other nodes by means of a C++ auto pointer.


[Figure 2.3 diagram: the red box scene graph of Figure 2.2, with a SimpleSurface node (stiffness 100) stored in the surface field of the Appearance node.]

Figure 2.3: The scene graph for a touchable box. Adding a SimpleSurface node to the surface field in the Appearance node makes the box touchable.

With this basic understanding of how surfaces are added, let us take a closer look at the predefined surfaces available. The following surfaces can be referenced in the surface field of an Appearance node.

Surface nodes

  SimpleSurface         A simple surface implementing only surface repulsion
  FrictionalSurface     Extends SimpleSurface, adding static and dynamic friction
  ImageSurface          A surface texture based on an image
  FrictionImageSurface  A FrictionalSurface whose starting_friction and dynamic_friction are determined by the greyscale component of an image
  RoughSurface          Implements an algorithmically textured surface
  BumpmapSurface        Uses a grey-scale image as a height-map over the surface, from texture coordinates to height
  MagneticSurface       A template for modifying surface types so that the surface appears to be magnetic. Current magnetic surfaces available are MagneticSimpleSurface, MagneticBumpmapSurface and MagneticFrictionalSurface
  ButtonSurface         A template for modifying surface types so that they behave like a push-button, for instance ButtonSimpleSurface, ButtonFrictionalSurface and ButtonBumpmapSurface

Table 2.2: The Reachin API surface nodes
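In VRML source, the touchable box of Figure 2.3 adds a single surface line to the red box example:

    Shape {
      appearance Appearance {
        material Material { diffuseColor 1 0 0 }
        surface SimpleSurface { stiffness 100 }   # Reachin API extension: gives the box a touchable surface
      }
      geometry Box { size 0.1 0.1 0.1 }
    }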


2.2.2 Materials and Textures

The material field in the Appearance node is used for referencing a Material node. The Material node, as stated, specifies the colour of the associated geometry. Note that if this field is left unspecified, all lights defined in the world are ignored when rendering the associated geometry.

The fields available for specifying colour, light reflection and transparency are ambientIntensity, shininess, transparency, diffuseColor, emissiveColor and specularColor.

The diffuseColor field defines the colour of the geometry. The emissiveColor is used to define glowing objects. The ambientIntensity field specifies the amount of ambient light the geometry reflects. The specularColor field defines the colour of the shiny spots of the geometry. The shininess controls the intensity of the glow for the shiny spots; small values represent soft glows, whereas high values define smaller and sharper highlights. The transparency field controls the transparency of the associated geometry. If a value of 0.0 is specified then the related geometry is completely opaque; a value of 1.0 means that the geometry is completely transparent.

All the "Color" fields have an RGB value associated, i.e. three floating point values between 0.0 and 1.0. The other fields each hold a single floating point value between 0.0 and 1.0.

When texturing a shape, the texture is applied by default to each of the faces of the shape. One can also specify whether the texture is to be repeated for each of the faces of the shape, as well as repeated vertically, horizontally, both, or neither.

The ImageTexture node specifies the location of the image to be used for texturing the shape, as well as whether the image is to be repeated along each of the faces of the shape. Three fields are present in the ImageTexture node: url specifies the location of the image, while repeatS and repeatT specify whether the image is to be repeated along the horizontal (S) and vertical (T) texture axes respectively. All fields are optional, with default values applied if a field is not specified. If the repeat values are set to true, the image texture will be repeated twice. If more repetitions are needed, use the TextureTransform node.
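As a sketch of these fields in use (the colour values and image file name are invented for illustration), an Appearance combining a semi-transparent, glossy material with a repeated image texture might be written:

    Appearance {
      material Material {
        diffuseColor 0.8 0.2 0.2    # base colour of the geometry
        specularColor 1 1 1         # colour of the shiny spots
        shininess 0.6               # higher values give smaller, sharper highlights
        transparency 0.25           # 0.0 is opaque, 1.0 is fully transparent
      }
      texture ImageTexture {
        url "brick.png"             # hypothetical image file; only PNG is supported
        repeatS TRUE
        repeatT TRUE
      }
    }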


Material and Texture nodes

  Material           A node that contains VRML97 style material fields for defining colour attributes, transparency and shininess
  Texture            The base class for textures
  ImageTexture       A texture from an image file. Currently only the PNG format is supported
  TextureCoordinate  Takes a set of points in 2D to define how a texture is applied to IndexedFaceSets. If this node is not specified, the texture is applied to the shape as a whole. Specifying a TextureCoordinate node causes the texture to be applied to each face of the shape according to the coordinates given
  TextureTransform   Allows you to perform simple geometric transformations (scale, rotation and translation) on texture coordinates

Table 2.3: The Reachin API material and texture nodes

2.3 Object positioning and grouping

The previous examples simply placed a box in the default position in the middle of the screen. It is clear that we need some way to group things together, as in the chess example (Figure 1.1), so that we can move an entire structure without writing code to move each individual object (node).

The Reachin API has several grouping nodes, e.g. the Group, Inline, Switch, Transform, Dynamic and GravityDynamic nodes. The Group node simply collects some nodes under a parent node, i.e. the grouping node. Nodes grouped under a grouping node are said to be children of that node and, consequently, the grouping node is said to be a parent node. The Group node has a field called children, which stores the node's children in the scene graph. Further, there are two fields named addChildren and removeChildren. These do not store data; they dynamically add and remove children from the grouping node, so we postpone that discussion to the programming section.


The bboxCenter and bboxSize fields create a bounding box surrounding the group. Normally you do not have to set these values unless you are modelling something performance critical. The default values are not fixed constants like the size of a Box, but are automatically calculated from the object's geometry when the need arises.

The Transform node arranges (for example) that a pawn belongs with a chessboard: the Transform node extends a Group node by giving it a position relative to the board, such as on a particular square, with its base flat to the board. Among other things, the Transform node can position (translate), rotate and scale a node or a group of nodes in the 3D environment, relative to the World coordinates or to the parent node. Grouping nodes allow you to manage and position your scene graph objects relative to their immediate surroundings. An example scene graph with a touchable box is shown in Figure 2.4.

[Figure 2.4 diagram: a Transform node (translation 0.2 0.1 0.1) whose children field stores the touchable box Shape of Figure 2.3.]

Figure 2.4: A Transform node inserted above the Shape node lets us freely position the shape in the 3D environment. The Shape node is stored in the children field of the Transform node. The values stored in the translation field indicate that the box should be positioned at the coordinates X = 0.2, Y = 0.1 and Z = 0.1 in the World coordinate space, i.e. slightly to the right and slightly closer to you than the default.
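In VRML source, the scene graph of Figure 2.4 reads:

    Transform {
      translation 0.2 0.1 0.1   # slightly to the right, up, and closer to the viewer
      children [
        Shape {
          appearance Appearance {
            material Material { diffuseColor 1 0 0 }
            surface SimpleSurface { stiffness 100 }
          }
          geometry Box { size 0.1 0.1 0.1 }
        }
      ]
    }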


The third grouping node, Inline, acts as a normal Group node, with the only difference that it loads its children from an external file, incorporating them into the scene graph.

The fourth grouping node, the Switch node, is a grouping node that renders only one or none of its children, according to a criterion you specify.

We end this basic overview with the complete listing of grouping nodes found in Table 2.4.

Grouping nodes

  Group      Defines a node that can have children, with fields for storing, adding and removing children
  Transform  A grouping node that defines a coordinate system for its children relative to the coordinate system of its immediate parent in the scene graph. The Dynamic and GravityDynamic nodes are extensions of the Transform node and as such are also grouping nodes
  Inline     A grouping node that specifies its children from an external VRML file
  Switch     A Switch node traverses zero or one of its children nodes as specified

Table 2.4: Reachin API grouping nodes

2.4 Lights, Viewpoints and bindable nodes

The scene graphs so far have been lit by a default light, the headlight, used if no other light is specified. This is a light set to point straight forward from the user. It is turned on or off in the NavigationInfo node.

With a directional light, the light rays are all parallel. This light has no defined location, only a direction. It is as if the light were infinitely far away from your world; consequently all things are lit equally. The DirectionalLight also has intensity, ambient intensity and colour fields for customising the way the light affects the scene.

The Viewpoint node defines the position (direction, distance and field of view) of a camera: how you see things. It describes a number of camera attributes through its fields: set_bind, fieldOfView, jump, orientation, position, description, bindTime and isBound.

Clearly, you cannot have several viewpoints independently active at the same time, i.e. looking at an object from several different angles. It is therefore time to introduce bindable nodes.

Bindable nodes

Bindable nodes are a special type, in that only one instance of each can be active (or bound) at a time. They provide information about the environment and the user. The Reachin API has several bindable nodes that set environment options, etc. The following table summarises them.


Bindable nodes

  Viewpoint       The attributes of a camera, i.e. the user's viewing position
  Background      The background colour of a scene
  StereoInfo      Stereoscopic information that is environment specific, such as the distance between the current user's eyes and the focal distance
  DeviceInfo      The set of hardware input devices available
  NavigationInfo  Attributes of the viewer, such as headlight settings

Table 2.5: The Reachin API bindable nodes

2.5 TimeSensor

The TimeSensor node is a clock that generates events as time goes by. The events can be used, for instance, to drive animation. The TimeSensor node has the fields cycleInterval, enabled, loop, startTime, stopTime, cycleTime, fraction_changed, isActive and time. Their respective uses are as follows: enabled specifies the status of the sensor; startTime specifies when the TimeSensor starts to generate events, given as the number of seconds since midnight GMT, January 1, 1970; and stopTime specifies, in the same units, when the TimeSensor stops generating events. Note that a startTime of 0 implies that the TimeSensor should start when it is created. The cycleInterval field specifies the number of seconds during which the TimeSensor will generate events, and loop specifies whether the TimeSensor should be restarted when the cycleInterval is over.

When the TimeSensor is enabled, it starts ticking when the startTime is reached. While ticking, the timer continuously outputs an event with the current time. The TimeSensor then stops generating events when either the stopTime or startTime + cycleInterval is reached. In the latter case, if loop is true, the TimeSensor will continue generating events again.

If using cycleInterval, the TimeSensor will output an event fraction_changed. The value of this event is between 0.0 and 1.0 and represents the fraction of the cycleInterval elapsed. If loop is true, then when the cycleInterval is over, fraction_changed starts over from 0.0.

TimeSensors output one more event, the isActive event. This event has the value true when the clock starts ticking and false when the clock stops ticking.
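As a small VRML illustration (cycleInterval and loop values as in the Clock node of Figure 2.6), the following TimeSensor starts as soon as it is created and repeatedly emits fraction_changed events over a ten-second cycle:

    DEF Clock TimeSensor {
      startTime 0        # 0 means: start when the sensor is created
      cycleInterval 10   # each cycle lasts ten seconds
      loop TRUE          # restart the cycle when it completes
    }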


Table 2.6 summarises the time-related nodes found in the API.

Time related nodes

  TimeDependent  The abstract base class for all nodes that are played. They have a start and stop time, a loop flag and an active flag
  TimeSensor     TimeSensors generate time-dependent events

Table 2.6: The Reachin API time related nodes

2.6 Animation and Interpolators

The Reachin API's animation capabilities come from a group of nodes collectively called interpolators. With the help of an interpolator node, intermediate steps (of positions, colours, and so on) are calculated automatically for you. You specify a starting position and an ending position of, for example, an animated box. The interpolator node calculates where the box is at intermediate times, and the scene graph displays the appropriate image as time goes by. Much animation is, after all, no more than moving objects (actors, cameras, lights, etc.) over time. Interpolators are most often driven by time (specifically the output of a TimeSensor node); however, they can be driven by any scalar, for example the output of a slider bar.

The various interpolator nodes in the API differ in what types of values they interpolate between. The interpolators are enumerated in Table 2.7 below.

Each interpolator has a standard set of fields, fraction, key, key_value and value, best described with an example. If we would like to move our red box back and forth, we need some sort of continuous loop. Since animations are movements over time, we use the TimeSensor node to produce a smoothly varying control scalar. The course of events can be seen in Figure 2.5.


[Figure 2.5 flow diagram: the Cube starts at position -2.0 0.0 0.0 at time fraction 0.0, is continuously interpolated through 0.0 0.0 0.0 at fraction 0.25 to 2.0 0.0 0.0 at fraction 0.5 (the first half time period), then back through 0.0 0.0 0.0 at fraction 0.75 to -2.0 0.0 0.0 at fraction 1.0 (the second half time period).]

Figure 2.5: The course of a red box moving back and forth. The positions given inside the boxes are 3D coordinates, and the box is moving along the X-axis with X values ranging from -2 to 2. The whole loop takes place in a time frame moving repeatedly from 0 to 1.

Before we look more closely at interpolator nodes, we must more formally introduce routes and the VRML DEF statement.

As you may have noticed in Figure 2.5, the red box has a name, "Cube". This is an essential feature of VRML. The DEF (DEFine) statement allows you to give a name to any node or group of nodes that you use in designing a scene graph. Here, we have DEFed the Transform node above the shape to be called Cube. In doing that, we can refer to the whole shape construction (Shape, Box, Appearance, Surface, etc.) as a single entity. Moreover, since we have named the Cube, we can use this name to route values to and from it.

In designing the sample animation, we connect the TimeSensor node to a PositionInterpolator by a route, and let the PositionInterpolator direct the values in the translation field that moves the whole group around. Figure 2.6 lays out the entire scene graph.


[Figure 2.6 diagram: a Group whose children field holds DEF Cube Transform (translation 0 0 0) containing the touchable red box Shape; DEF CubePath PositionInterpolator with key [0.0 0.5 1.0] and keyValue [-2.0 0.0 0.0, 2.0 0.0 0.0, -2.0 0.0 0.0]; and DEF Clock TimeSensor with cycleInterval 10 and loop TRUE.]

Figure 2.6: The complete scene graph structure for a continuously animated red box moving back and forth over the screen. Routes are placed from Clock's fraction_changed to CubePath's set_fraction, and from CubePath's value_changed to Cube's translation.

We collect the objects to be moved under the Transform node DEFed as "Cube". Two routes are required for the animation to work. The first is from the TimeSensor's fraction_changed to the PositionInterpolator's set_fraction field. This causes the interpolator to change with time. Secondly, we route the interpolated position from the interpolator to the Transform node. Now the Transform node moves with time, causing its children (the Box) to move also.
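Written out as VRML source, the scene graph of Figure 2.6 becomes:

    DEF Cube Transform {
      translation 0 0 0
      children [
        Shape {
          appearance Appearance {
            material Material { diffuseColor 1 0 0 }
            surface SimpleSurface { stiffness 100 }
          }
          geometry Box { size 0.1 0.1 0.1 }
        }
      ]
    }

    DEF CubePath PositionInterpolator {
      key [ 0.0 0.5 1.0 ]
      keyValue [ -2.0 0.0 0.0, 2.0 0.0 0.0, -2.0 0.0 0.0 ]
    }

    DEF Clock TimeSensor {
      cycleInterval 10
      loop TRUE
    }

    # The two routes drive the animation: the time fraction feeds the
    # interpolator, and the interpolated position feeds the Transform.
    ROUTE Clock.fraction_changed TO CubePath.set_fraction
    ROUTE CubePath.value_changed TO Cube.translation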


2.7 Dynamics and Inertia

There are two main approaches to simulating moving objects with the API. The first is the keyframe animation capability discussed above. The other is rigid-body mechanics simulation. The Dynamic node and its sub-classes provide this functionality.

The Dynamic node is a sub-class of Transform and as such performs a coordinate space transform on its children. The rotation and translation of the coordinate space are controlled by forces, torques and momentum.

One can program forces to be applied to the dynamic in order to simulate various physical effects such as gravity, springs, viscous damping, etc. Forces are also contributed by the haptic device when it interacts with children of the dynamic (see Chapter 5.2.6 Dynamic Objects).

For example, a gravitational field can be simulated by routing a dynamic's position to a gravitational force. This is provided in the GravityDynamic node. Springs and dampers can be easily simulated at the scripting level, as in the dynamic demo (see urn:inet:reachin.se:/demos/dynamic/dynamic.wrl).

Dynamic nodes

Dynamic         A rigid-body dynamics simulation node. Its fields include momentum, angularMomentum, velocity, angularVelocity, inertia, mass, force, hapticForce, torque and hapticTorque.
GravityDynamic  A dynamic node with gravitational forces.
SlotDynamic     A dynamic node constrained to move along a line segment.

Table 2.8: The Reachin API dynamic nodes

2.8 Devices

Device nodes are specific to the Reachin API and are not standard VRML. The default values set up hand-eye co-ordination for a Reachin Display; if you are using or making another display system, you will probably need to calibrate your system. Device nodes usually change only with a modification of the physical display environment, not with changes in a particular scene you are creating. As such, device nodes are used in a similar way to the browser-specific nodes found in the VRML standard. Device nodes must be referenced by the DeviceInfo node in the scene graph (a bindable node whose fields are devices, set_bind, bindTime and isBound).

Device nodes provide an abstract interface to input devices, removing the need to be aware of device calibration and low level I/O.


Extending support to a new device then involves writing an I/O handler for it, after which it seamlessly integrates with the device calibration code and thus with Reachin API applications.

Device nodes

Device          A Device node is an abstract base class representing a hardware input device. We provide a mechanism by which the state fields (position, orientation, etc) of sub-class instances can be retrieved by name.
TrackingDevice  A TrackingDevice represents a generic 3D tracker with a switch.
HapticsDevice   A HapticsDevice is a TrackingDevice that is used in constraint-based haptics rendering.
PhantomDevice   A PhantomDevice is a HapticsDevice that sets the calibration components appropriately for the PHANTOM haptic device.
Magellan        A Device that represents the Magellan Space Mouse 6-dof input device.

Table 2.9: The Reachin API device nodes

2.9 Data nodes

Some nodes do not perform any significant functionality but are rather used for storing data. We store these data in a separate node so they may be referenced by several other nodes without the need for replication.

Data nodes

Coordinate  This node is referenced by the IndexedLineSet and IndexedFaceSet. It has a single field which takes a list of 3D coordinates.
Color       This node is referenced by the IndexedLineSet and IndexedFaceSet. It has a single field which takes a list of RGB colours.
Normal      This node appears inside an IndexedFaceSet. It has a single field which takes a list of 3D vectors.
Inertia     Inertia and its sub-classes are referenced by dynamic nodes to describe the mass distribution of a rigid body.

Table 2.10: The Reachin API data nodes
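As an illustration of how data nodes avoid replication, a single Coordinate node can be shared between two geometries using DEF/USE (a sketch in standard VRML; the node name pts is illustrative):

Shape {
  geometry IndexedFaceSet {
    coord DEF pts Coordinate {
      point [ 0 0 0, 0.1 0 0, 0 0.1 0 ]
    }
    coordIndex [ 0 1 2 -1 ]
  }
}
Shape {
  geometry IndexedLineSet {
    coord USE pts            # the same coordinate data, not a copy
    coordIndex [ 0 1 2 0 -1 ]
  }
}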






3 Event Handling

Until now, we have skipped over the details of the event handling mechanism, i.e. the field networks. This is because one can make use of the Reachin API at a high level without this knowledge. However, when building complex applications a deeper knowledge is necessary.

3.1 Field networks

The field network is the basic behavioural building block of the API. At the highest level, fields offer a means of expressing dependency between pieces of data. Those pieces of data, or fields, may be contained within a single node, or spread over the whole scene graph. In building interactive virtual worlds, there is a strong need for complex dependencies between values in various parts of the scene. The API's field classes and templates provide an optimised infrastructure for handling these dependencies. We shall now take a closer look at the field network mechanisms.

A route is a directed connection between the fields of different nodes or within a single node. Graphically, we render routes as dashed, one-directional lines as in Figure 3.1.


Figure 3.1: A route from the value_changed field to translation. (The picture is a close-up from the Interpolator example in Figure 2.6.)

Routes may be established either directly in C++ or through the ROUTE command in VRML. The VRML specification provides a set of field types and rules for how fields may be routed. We begin with the VRML-style routes.

VRML defines four types of fields as follows.

• eventOut - A field that may only be routed from (e.g. the value_changed field in Figure 2.6).
• eventIn - A field that may only be routed to.
• field - A field that may not be routed to or from, but may be initialised with a value.
• exposedField - A field that may be routed to, routed from and be initialised with a value.

We do not make such distinctions in the C++ implementation of fields. In the API we have a much more general control over routability as discussed below.

3.1.1 Type checking

VRML specifies the restriction that only fields of the same type may be routed together. For example an eventOut of type SFVec3f may be routed to an eventIn of type SFVec3f but not of type SFFloat. In the API these rules are expressed using a general run-time type checking mechanism (see section 7.3.3). As we shall see, these rules may be expanded to allow arbitrary typing policies, analogous to the typing of function prototypes in programming languages. The difference is that these typing policies are enforced at run-time, rather than at compile-time.

3.1.2 Event propagation

Typically we will connect a multitude of different fields to enable complex behaviour. Sequences of events in such a network are termed event cascades or event propagation. Event propagation occurs constantly as part of the internal functioning of a Reachin API application. There are a number of optimisations performed by Reachin API to ensure that field propagation and value updating only occur when necessary.


When an event is generated in a particular field, a timestamp is generated. The event (and its timestamp) is propagated throughout the connected fields. Each field maintains its own timestamp which it compares to the timestamp of incoming events. Only events more recent than the field's timestamp are dealt with. In this way a single event can never visit the same field more than once, even if there are cycles in the field network graph.

3.1.3 Lazy evaluation

Events do not bear any field data. In fact the only information transferred in the event propagation stage is the timestamp and a pointer to the field sending the event. Event propagation is simply a means of notifying fields that they are now out-of-date with respect to one or more of their inputs.

When fields are queried for their current value, they first perform a validation check. In validating, we check to see if there has been an event received since the last time we were up-to-date. If so, the field performs an update. It is the update step which finally defines the field's value with respect to any fields routing to it. Note that there may be many events arriving at a particular field before an update is necessary. Since we only update the field value when we really have to, we term this optimisation lazy evaluation.

Figure 3.2: An event is passed from value_changed to translation.

3.1.4 Event dependence / independence

Note that the lazy evaluation optimisation relies on the fact that a field only cares about the most recent event to arrive. For example if we were simply copying our value from the field routed to us, then we only care about the most recent value of the input. In this way we can safely ignore some events and only update our value when it needs to be known.

Consider though, the case where a field's value is dependent upon each event that it receives. For example we might be setting our value as the sum of each input event value or we might want to print each input value to standard output. In this case we can mark the field as being dependent (see section 7.3.5). The event


propagation infrastructure will immediately update the field whenever an eventarrives and lazy evaluation will not be performed. The vast majority of fields inthe <strong>API</strong> are independent and hence lazily evaluated.Initial Color_changedevent, Timestamp "X"A: Color_changedSecond, additional Color_changedevent with Timestamp "X+1"B: set_ColorD: set_ColorPopagated Color_changedevent, still with Timestamp "X"C: set_Color E: Color_changedFigure 3.3: If a color_changed event starts in field A with timestamp X, the value propagates tofields B, C and D. However, the new event from field E will overwrite the first event in fields B,C and D, in the resulting a event cascade.3.2 Function fieldsWe have now described enough to handle the VRML model of fields and events.However in the <strong>Reachin</strong> <strong>API</strong> we extend these concepts further to make fields afundamental building block for node functionality. Central to this capability isthe concept of the function field.Function fields are just like normal value-bearing fields except they providespecial means of defining their value. A field can set its value to be an arbitraryfunction of its input values. For example, let’s examine a function field of typeSFVec3f (a 3-vector of floats) that sets its value as the average of two otherSFVec3f inputs (see Figure 3.4).SFVec3f ASFVec3f BAverage = ( A + B ) / 2Figure 3.4: We define a function field “Average” that defines its value as the average of its inputs.Once we’ve defined such a field type, we can use it in place of a SFVec3ffield. Note that we’ve retained the lazy evaluation optimisation inherent to fieldnetworks. The average will only be computed when absolutely necessary. And,since we implement the field updating function in C++, it executes very efficiently.38<strong>Reachin</strong> <strong>API</strong> Programmer’s <strong>Guide</strong>


Furthermore we can incorporate a runtime typing policy into the Average field to ensure that it will have exactly two inputs of type SFVec3f. Alternatively we could allow it to have any number of SFVec3f inputs and calculate the average of all of them. Finally, since Average is also an SFVec3f we can use one instance as an input to another, as in Figure 3.5.

Figure 3.5: Using Average as an SFVec3f. The second Average field, fed by inputs A and B via the first Average and by C directly, computes (A+B)/4 + C/2.

3.2.1 Process dependence

So far we have examined the use of field networks to express dependency between various values (i.e. value dependence). However the associated infrastructure is general enough to provide dependency between processes. We term this process dependence.

We can use the event mechanisms to carry out a complicated series of procedures in a desired order by using routes to represent computational dependencies. For example, if we use a function field to represent the computations involved in rendering a graphical scene containing a ball and a box, we might declare the following network.

Figure 3.6: Function fields representing rendering dependence of the scene: a "Render a Ball" function field and a "Render a Box" function field both route to a "Render the Scene" function field.

Here we have a function field whose update step is to render the graphical scene. We know that in order to render the scene, we must render the components of the scene - the box and the ball. We can therefore create two function fields for rendering the objects in the scene, and indicate the computational dependency with routes. Now when we query the "Render the Scene" function field, it will first query all of its incoming field routes and then execute its function. The ball and box fields will execute their functions respectively when queried by the "Render the Scene" field. This allows us to implement lazy evaluation of rendering and lazy building of OpenGL cache lists.


3.3 Overview of Haptics Rendering

Programming for three-degrees-of-freedom force-feedback haptics devices essentially consists of generating force vectors to be applied to the haptic device 1,000 or more times per second. Typically the force vectors are computed based on the position of the haptics device, allowing us to generate forces only if the device is inside an object. If the device is inside a cube, for example, we can generate forces that will push the device out of the cube. However the haptic device must be allowed to enter the cube before forces are felt. This allows for the possibility that one could push straight through an object and out the other side.

To combat this, Reachin API performs haptic geometry interaction using a proxy, a small sphere that represents the desired location of the haptics device. An important property of the proxy is that it always remains on the surface of the geometry. With this mechanism, we can render the geometry by applying a force that pushes the haptics device towards the proxy. The proxy constantly tries to reach the haptic device position and is constrained by the surfaces it encounters (see section 7.9.1).

Figure 3.7: The haptic collision state takes place in the scene graph loop. It is here that the Reachin API calculates whether the tip of the input stylus is found to be within an object. If so, the force with which the stylus should be pushed back out of the surface is calculated. The force calculations are done based on surface properties and the distance to the proxy (which remains outside the surface), indicating where the stylus should be.

3.3.1 Reachin API haptics rendering behind the scene

Reachin API is designed to facilitate these tasks by providing a multi-sensory scene graph and a haptics rendering engine. As we saw in Figure 2.3, adding a SimpleSurface to a shape's Appearance node makes a shape in the scene graph touchable by the haptics device. In the Reachin API, haptics programming can be as simple as using predefined surfaces for the shapes in your scene graph.


Two loops are used for haptics rendering: the high frequency loop, referred to as the realtime loop, and the slower scene graph loop, which is run as often as possible without slowing down the realtime loop. These loops are run as separate threads to help ensure that the realtime loop can run every millisecond or faster.

Two types of haptics rendering services are provided: surface rendering (see section 7.9.1) and force operators (see section 7.8). Both of these services can be accessed from the VRML and Python scripting interfaces; however, creating extended or new haptics effects must be done in C++ due to the speed requirements of the realtime loop.

Surfaces

The scene graph loop performs a scene graph traversal to detect collisions between geometry in the scene and the haptic device. If a geometry is being touched by the device, it generates a simplified representation to be used in the realtime loop. The realtime loop renders these representations at 1 kHz to determine what forces should be applied to the haptic device. The scene graph and field network are not touched from the realtime loop. This topic is covered in depth in section 7.9.4.

Force Operators

The second haptics service, force operators, is somewhat more complicated than surface rendering. A force operator is a mapping from position to force, and can be used to achieve free space effects that do not need to be tied to a shape in the scene graph. Like the realtime geometry surfaces, force operators are also created in each scene graph loop traversal and rendered by the realtime loop. This topic is covered in depth in section 7.8.
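To make the idea concrete, a force operator is conceptually nothing more than a function from device position to force. The following is an illustrative pseudocode sketch only, not the actual C++ ForceOperator interface of section 7.8:

# Illustrative sketch: a force operator maps device position to
# force and is evaluated by the realtime loop roughly 1000 times
# per second. A spring pulling the device toward the origin:
def spring_force_operator( position, k = 100.0 ):
    # F = -k * x; a simple free-space effect with stiffness k N/m
    return position * -k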




Part II - High Level Programming

The topics covered in this part are:

• Loading VRML files into the Reachin API
• A brief description of the equivalence between VRML and a scene graph
• The similarities and differences between Reachin API and VRML
• A description of Reachin API's additions to VRML
• How to use Python for scripting with Reachin API

This part provides an introduction to using Reachin API with VRML and Python. It also contains a review of the many important concepts used in the API, including scene graphs and field networks. It is desirable that anyone new to the Reachin API reads this part; it is intended to provide a rapid and simple introduction to using and understanding the API. For readers who are familiar with VRML this part can be skimmed; the most interesting sections for such readers will be those on Reachin API's new nodes and on using Python as a scripting language for the API.




4 VRML and Reachin API

This chapter outlines how to load files into the Reachin API from VRML and describes how to use the API with VRML.

It should be noted that in this chapter and the next ones, VRML is used with several meanings. It is used to refer both to the VRML97 specification and to the VRML interface of the Reachin API. In each case the context makes the meaning clear. It is, however, worth keeping this in mind.

4.1 Loading VRML files

Loading VRML files into the Reachin API is very simple. ReachinLoad.exe is installed with the Reachin API and its source code can be found in the examples/ReachinLoad directory. It is used as follows:

ReachinLoad <file>

As an example of what can be constructed using Reachin API, try loading the Dynamic demo:

ReachinLoad dynamic

Note that with Reachin API's flavour of VRML, you may need to make changes to a VRML file to be able to load it. Note also that the performance of your application depends directly on the computational power of the system it is running on.


If the VRML file does not contain a Display node, ReachinLoad will create one and add the contents of the file to its children field.

4.2 Reachin API, VRML and scene graphs

This section includes a brief review of scene graphs, emphasising the equivalence between VRML syntax and a scene graph, and it gives an introduction to using VRML to create systems with the Reachin API.

4.2.1 Scene graphs

Three-dimensional computer graphics is fundamentally the definition of geometric primitives and their display. This can be done by explicitly using transformation matrices to describe translations and rotations, and explicitly specifying the properties of each piece of geometry. However, this method of defining three-dimensional objects makes for extremely long and complex programs that are unnecessarily difficult to understand and debug. In addition, for multi-sensory applications the primitives must have at least two methods of being rendered, resulting in yet more complexity and duplication.

A simplification is obtained by using the concept of a scene graph. A scene graph is a tree, or directed acyclic graph structure, that is used to specify geometry and properties of that geometry. Each node of the tree is either geometry or a property of the geometry, such as location or colour. The scene graph is displayed by traversing the tree in some order, rendering the geometric primitives that are encountered in the manner described by the properties in the scene graph.

4.2.2 VRML syntax and scene graphs

VRML is a system for describing computer graphics in a compact text format. Its syntax can be used to describe a scene graph. As a straightforward example, the diagram below is equivalent to Program 1: Hello .wrl d!

Group
  children
    Shape
      geometry
        Box (size 0.1 0.1 0.1)


Program 4.1: Hello.wrl scene graph diagram.

4.2.3 Reachin API and VRML scene graphs

Reachin API scenes can be described using VRML syntax. This is useful because the VRML syntax is well defined, public and allows the sophisticated tools that exist for manipulating VRML to be used in the creation of objects and applications for use in the Reachin API. This can speed the development process.

In addition there are a number of excellent resources for learning and using VRML. The reference section of this manual contains a list of VRML references. In particular [11] is recommended as a good introduction and as a good reference book.

However, it is important to note that Reachin API is not VRML. Some parts of VRML are not included in the API and the API also contains things that are not in VRML. This is because VRML is used to define a general method for describing graphical and audio 3D virtual worlds that can be navigated by an avatar (a virtual representation of the user). Reachin API is an API for writing applications on devices that a user operates, meaning that many of the features of VRML are not required and would be superfluous. For example, VRML has a great deal of information about the user, collected in an avatar, which activates things using the VRML node TouchSensor. However, in Reachin API, where the user has a viewpoint and various interaction devices, this makes no sense. The API also contains nodes that describe haptic surfaces, which are not in the VRML specification.

Included in the appendix is a list of VRML and Reachin API nodes, outlining the nodes that exist in VRML and the Reachin API and the differences between the two.

4.2.4 Beginning to write VRML code

This section provides a practical introduction to using Reachin API via its VRML-like interface.

A very basic Reachin API example

Program 1: Hello .wrl d!
File: hello.wrl

#VRML V2.0 utf8
Group {
  children [
    Shape {
      appearance Appearance {}
      geometry Box { size 0.1 0.1 0.1 }
    }
  ]
}


To explain very briefly what is going on: the string at the start of the file simply says that this is a VRML 2.0 file. This line, or something like it, is normally required for all VRML files. However, Reachin API does not require it. The following lines (Group and children)² are used to give the scene graph a point to load into the Reachin API, somewhat like a top to the graph. The appearance line makes the box visible.

A touchable basic Reachin API example

The previous example probably underwhelmed you. One of Reachin API's most crucial features is its ability to easily render an object both graphically and haptically, and in the previous example this was not done at all. In order to do this, all we need to do is add a single line to the above code.

Program 2: Touch me Baby!
File: touchable.wrl

#VRML V2.0 utf8
Group {
  children [
    Shape {
      appearance Appearance {
        surface FrictionalSurface {} # Only this line added!
      }
      geometry Box { size 0.1 0.1 0.1 }
    }
  ]
}

4.2.5 The coordinate space and standard units

The units of the coordinate system used for Reachin API are the standard VRML metric system, using the standard metric units of metres, kilograms etc. However, in most VRML systems the units are used slightly differently to the Reachin API. In Reachin API the units correspond to the real size of objects in the display.

² The top of the scene graph is actually the Display node, which is created by ReachinLoad.


This difference must be noted. VRML files intended for browsers often describe large worlds that the viewer has a small view into; these files will be too large for hand-immersive displays and need to be scaled down.

The axes and workspace are set out in the following way: the x-axis runs from left to right across the screen, with towards the right being positive. The y-axis goes from down to up on the screen, and the z-axis is normal to the screen, with out of the screen being the positive direction.

For readers using the Reachin Display, the file coordinateSpace.wrl in the examples/rapg/PART_II directory will draw a box that shows the screen volume and will place a cross at the origin of the co-ordinate system. For those not using a Reachin Display it would be a worthwhile experiment to modify the program so that the box occupies the effective user area. It is also worth noting where the origin of the co-ordinate system is. In the case of the Reachin Display the origin is along the line of sight into the display.

4.2.6 Instancing (DEF USE) and the lack of PROTO

VRML uses some slightly curious syntax that may come as a surprise to programmers who are familiar with more conventional programming languages. The DEF statement in particular is somewhat misleading. DEF merely gives a name to a single instance of code. USE then reuses the same instance in a different position. Two instances are not created. The PROTO statement in VRML allows new classes to be created. However, PROTO is not yet implemented in the API, as new nodes can be created using C++.
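A short sketch of this instancing behaviour (the node name Ball is illustrative):

DEF Ball Shape {
  appearance Appearance { material Material {} }
  geometry Sphere { radius 0.05 }
}
Transform {
  translation 0.1 0 0
  children [ USE Ball ]  # the same Shape instance again, not a copy
}

Because both occurrences refer to one instance, a change to the Ball's fields affects every place it appears in the scene graph.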




5 Reachin API Specific Nodes

This section provides a description of the additions that the Reachin API makes to VRML. The main classes of these are surfaces, which allow haptic surfaces to be created using VRML, and dynamic objects, which are physical simulations of real bodies and extremely useful for many haptic systems. In addition, forces that act on haptic devices can be described; the example described here is the Magnet node. The PythonScript node, as it requires a more involved treatment, is described in the next chapter.

5.1 Online Reference Manual

The Reachin API Online Reference Manual is located in the API/doc directory. Being able to use and understand the Online Reference Manual is extremely valuable for the Reachin API programmer. In this part of the Programmer's Guide we describe what is available in the Online Reference Manual, in particular how to look at the VRML and Reachin API specifications and what the various signs mean. This information is available online and is accessible by clicking on the icons to the side of the field names, but it is worth considering and clarifying here. For this section it is assumed that the reader has access to the manuals. For the next paragraph please look at the Appearance node section in the manuals.


The Appearance node will be considered, as it is a node that shows a number of the features of the VRML interface. When using VRML the main thing to consider is the Fields section. This shows which fields of the node can be accessed via the VRML interface. The first field, whose name is material, shows most of the properties. The blue-green arrow pointing both ways shows that this field can be set and used to send events, that is, it is an exposedField. By clicking on the arrow, descriptions of what the arrows mean can be obtained. The page showing this describes whether a field is an eventOut, eventIn or an exposedField. The meaning of these is the same as in VRML. To revise this briefly: eventIn can be routed to, eventOut can be routed from, and exposedField can be routed to and from.

Note that no differentiation is made between fields that the VRML97 spec contains and other fields; for example, the surface field has no VRML97 equivalent. Most nodes are the same as in the VRML97 specification but a few are different. Those differences are outlined in the Appendix comparison of Reachin API and VRML97.

5.2 Reachin API's additional non-VRML nodes

One of the main additions that the Reachin API makes to VRML is in the Appearance node. A surface field is added, which is used to give objects haptic surfaces in the same way that the material field of Appearance is used to specify graphic surfaces.

Surfaces make the creation of solid objects extremely easy. By specifying geometry and then describing a surface, VRML files can be made solid and given different haptic properties. A number of surfaces are supplied with the API and more can easily be added. The surface hierarchy can be obtained by looking at the hierarchy diagram in the Online Reference Manual. One thing to note here is that some of the classes are abstract, and thus cannot be instantiated via VRML.

It is strongly recommended, in order to gain an understanding of surfaces, simply to play with some of them by altering their field values. This will give a good understanding of how each surface works.

The first example in this section is given in full, but the later examples have only the parts of their source code that are directly relevant quoted. The examples are in the examples/rapg/PART_II directory.


5.2.1 SimpleSurface

SimpleSurface is the parent class for all the other surfaces and possesses the two fundamental properties of surfaces: stiffness and damping. Stiffness is measured in Newtons per metre and damping in Newton seconds per metre.

Stiffness is the most important variable to alter. Stiffness determines how much the surface pushes against the user's haptic interface; the higher the stiffness, the harder the surface feels. It can be considered similar to a spring from the tip of the haptic interaction device to the surface. For users using PHANTOMs, maximum values of around 1100 are recommended, as beyond that forces tend to become too high and the servo loop shuts down. Interestingly, negative values for stiffness can be used. They suck the user into the surface. The problem with this is that the force gets stronger as the user goes further into the surface, resulting in forces that are too high, causing the haptic rendering loop to fail.

Damping is the retardation force that is offered when entering a surface. In real surfaces it is extremely large and rapidly applied. The limits of haptic devices are important to consider: immediate, extremely large forces, which are created when damping has a value greater than 1.5 combined with low (i.e. around 200) values for stiffness, will cause the haptic servo loop to shut down.

Program 3: A SimpleSurface Example.
File: simpleSurface.wrl

#VRML V2.0 utf8
Group {
  children [
    Shape {
      appearance Appearance {
        surface SimpleSurface {
          stiffness 100 # 0-1100 Play around with these variables
          damping 1.5   # 0-1.5 (keep lower (~0.5) if stiffness > 200)
        }
      }
      geometry Box { size 0.1 0.1 0.1 }
    }
  ]
}


5.2.2 FrictionalSurface

FrictionalSurface adds friction to the model for haptic interaction. Friction is added in the same way as it is modelled in Newtonian mechanics, by having coefficients of friction.

In addition there are coefficients to determine when the type of friction should change. The starting friction is proportional to the amount of resistance that is needed before the tip of the haptic device can move. The stopping friction is the amount of friction that will be encountered before static friction becomes the mode of friction. Setting stopping friction higher than dynamic friction changes the motion to a shuddering motion, as dynamic friction is not used at all. The stiffnessT is the same as stiffness except that it acts along the plane of the surface instead of normal to it; again, think of it as a spring that goes from a theoretical interaction point on the surface to the actual tip of the haptic interaction device.

Program 4: There's a fraction too much Friction

. . .
surface FrictionalSurface {
  startingFriction 0.8 # Play with these 4 values (1)
  dynamicFriction 0.4  # 2
  stoppingFriction 0.2 # 3
  stiffness 900
  stiffnessT 700       # 4
  damping 0
}
. . .

5.2.3 RoughSurface

RoughSurface is a child of FrictionalSurface. It simulates something akin to sandpaper. The haptic device's motion is constantly stopped and a new starting friction value is randomly calculated using a Gaussian probability distribution with a mean and standard deviation supplied by the user. The mean is self-explanatory and the standard deviation allows the randomness to be contained. The effect of RoughSurface is similar to that of setting the stopping friction higher than the dynamic friction, although what occurs is a more interesting haptic texture, as the randomness present alters the values. Note also that some of the variables that can be set have no effect; in particular the starting friction is constantly overwritten.


The two main values that RoughSurface adds are mean and deviation. By increasing the deviation the changes in the surface are increased; by increasing the mean the factor by which the stopping friction is multiplied is changed.

Program 5: A very Rough Example

. . .
mean 0.5      # Play with these two values
deviation 0.1 #
. . .

5.2.4 BumpmapSurface

The BumpmapSurface uses a texture to provide a haptic depth to a surface. This allows complex textures to be easily generated from images. The grey values of a texture are used to generate heights below the surface, with the maximum value being the full bumpHeight below the surface. The fact that the heights are below the surface is slightly surprising, but this is the only unexpected feature of these surfaces.

In this example the graphic texture loaded is different from the texture used to calculate the height of the bump map. The graphic texture itself can be used, but then some parts of the texture feel somewhat odd. The texture used for the bump map was extracted from the visually displayed texture using PhotoShop, by extracting grey values and then using those values.

Program 6: The long and bumpy road

. . .
surface BumpmapSurface {
  texture ImageTexture {
    url "urn:inet:reachin.se:/library/bumpmaps/steel.png"
  }
  bumpHeight 0.001
  . . .
}
. . .

FrictionImageSurface

Whereas BumpmapSurface uses a texture to set the height values below a surface, FrictionImageSurface uses a texture to modulate the starting and dynamic friction of a surface.


Program 7: Sticky pictures
File: frictionImageSurface.wrl

#VRML V2.0 utf8
Group {
  children [
    Shape {
      appearance Appearance {
        texture ImageTexture {
          url "urn:inet:reachin.se:/library/textures/jupiter.png"
        }
        surface FrictionImageSurface {
          texture ImageTexture {
            url "urn:inet:reachin.se:/library/bumpmaps/jupiter.png"
          }
          startingFriction 0.8
          stoppingFriction 0.3
          dynamicFriction 0.5
          stiffness 900
          stiffnessT 700
        }
      }
      geometry Box { size 0.1 0.1 0.1 }
    }
  ]
}

5.2.5 Button Surfaces

ButtonSurface is a template that can be used to create surfaces that send out an event when they are activated. The two events sent out are the armed event, which is sent out when the surface is touched, and the activate event, which is sent out when the surface is released. For an example of the use of one of these surfaces please see Section 6.1.2.

5.2.6 Dynamic Objects

It is important for haptics simulations to include rigid-body dynamics simulation. This basic functionality allows solid objects to be manipulated using forces in a realistic manner. To this end the API includes the Dynamic node. The Dynamic node is a descendant of the Transform node and so inherits its fields; in addition it allows physical properties to be specified. It executes a rigid-body mechanics simulation to define the rotation and translation of the coordinate space. In the example below a cube is drawn and given rotational inertia. The user can then spin the cube with the haptics device.


Note that pushing the cube does not move it. This is because the cube has had no mass specified; when no mass is specified it is set to infinity, allowing no motion to be given to the cube. Similarly, if no rotational inertia is specified an object cannot rotate, as its rotational inertia is then set to be infinite.

The inertia characteristics of the Dynamic node are specified in the Inertia node. This node contains a description of the mass and rotational inertia. Rotational inertia is specified as a 3x3 matrix called the inertia tensor. It is beyond the scope of this manual to describe inertia tensors; however, several utility nodes are provided as sub-classes of Inertia to make it easier to specify the rotational inertia of common shapes and combinations of shapes. These nodes are SphereInertia, HollowSphereInertia, BoxInertia, ConeInertia, CylinderInertia, TransformedInertia and GroupInertia.

Program 8: A really top example
File: top.wrl

#VRML V2.0 utf8
Group {
  children [
    Transform {
      rotation 0 1 0 0.785
      children [
        Dynamic {
          rotation 1 0 0 0.785
          inertia BoxInertia {
            mass 1.0
            size 0.06 0.06 0.06
          }
          mass 0
          children [
            Shape {
              appearance Appearance {
                texture ImageTexture {
                  url "urn:inet:reachin.se/library/textures/bubbles_purple.png"
                }
                surface FrictionalSurface {
                  stiffness 200
                }
              }
              geometry Box { size 0.06 0.06 0.06 }
            }
          ]
        }
      ]
    }
  ]
}


Other Dynamic nodes

The GravityDynamic and SlotDynamic nodes are derivatives of Dynamic. GravityDynamic provides forces to simulate a radial gravity field. SlotDynamic constrains the dynamic to move back and forth along a line segment. There are many more opportunities for specialisation of the Dynamic node for the simulation of various constrained systems.

5.2.7 Magnet

The Magnet node provides a very simple example of a VRML interface to a simple force. The node places a force on the haptic device that attracts it to the current local origin. It provides a spring-like force that grows in proportion to distance. If the force becomes too great it then releases the point. It behaves like a magnet in that there is a bonding force and a minimum bond-breaking force.

Program 9: An attractive idea
File: magnet.wrl

#VRML V2.0 utf8
Group {
  children [
    Magnet {
      startDistance 0.05
      effectDistance 0.06
      stiffness 100
    }
  ]
}

5.2.8 Text

As yet Reachin API does not have the standard VRML Text node; instead a simpler node called ImageTextFace is provided.

File: imageTextFace.wrl

#VRML V2.0 utf8
Group {
  children [
    Shape {
      appearance Appearance {}
      geometry ImageTextFace {
        string "I will not buy this record, eeyt is scratched"
        font "urn:inet:reachin.se:/fonts/verdana.wrl"
      }
    }
  ]
}






6 Python, Fields and Field Networks

This section describes how to use PythonScript nodes for complex behaviour. We describe field routing in VRML and how to use Python to define more complex behaviour.

6.1 Field networks using Python and VRML

The examples of VRML that we have loaded with Reachin API so far have lacked an important element: sophisticated programmatic behaviour. In order to demonstrate this we need to build upon the understanding of the Reachin API's field networks concept.

At the heart of the API lies a very simple and yet powerful mechanism: the field. It is important to gain an understanding of fields and field networks in order to use the API as it has been intended. Reachin API and VRML are based on nodes constructed from fields. A field is a value contained within a node. Fields mentioned above include the size field of the Box and the radius field of the Sphere.

In order to add interactivity to models in Reachin API we can connect fields. Fields then generate events, which can be passed on to other fields. For example we can connect the output of a TimeSensor, via a script, to the colour of an


object so that the colour of the object pulses. Another possibility is that we can connect the output of a TimeSensor via a ScalarInterpolator to the size of some spheres, as in the following code example.

Program 10: Field Networks
File: throbber.wrl

#VRML V2.0 utf8
Group {
  children [
    Transform {
      translation -0.07 0.0 0.0
      children [
        DEF theShape Shape {
          appearance Appearance {
            material Material {
              diffuseColor 0 0 1
            }
          }
          geometry DEF theSphere Sphere { radius 0.1 }
        }
      ]
    },
    Transform {
      translation 0.07 0.0 0.0
      children [ USE theShape ]
    },
    DEF theClock TimeSensor {
      cycleInterval 4
      loop TRUE
    },
    DEF theInterpolator ScalarInterpolator {
      key [ 0.0, 1.0 ]
      keyValue [ 0.0, 0.06 ]
    }
  ]
}

# Routing for the sphere to change sizes using the interpolator
ROUTE theClock.fraction_changed TO theInterpolator.set_fraction
ROUTE theInterpolator.value_changed TO theSphere.radius


There are a few things to note here. Firstly, the TimeSensor generates events that are sent to the ScalarInterpolator, which generates its own events that are then sent to the two spheres. Secondly, since we have used DEF and USE the events are sent to a single Shape instance which appears twice in the scene.

6.2 Using PythonScript nodes

The behaviour shown in the first example was very simple. In order to perform more sophisticated and interesting tasks, a generated event has to be processed, have some calculations performed using its values, and then have values sent out that perform various actions. In order to do this some sort of programming language needs to be used. One way is to create a new type of Node sub-class in C++, provide a VRML interface to it and use it as a processing object within the VRML scene (see section 7.4 Extending Reachin API Nodes). The VRML97 standard defines a Script node that includes a url field. The language to be used is not specified.

In the API, we have a node called PythonScript which contains a url field that references a Python script, or defines Python code directly, prefixed by python:

The reason that PythonScript is used rather than the VRML Script node is that events are passed to Python scripts in a different manner to the way in which VRML passes fields in a Script node. This allows a far simpler way of doing things in Python than would otherwise be the case.

6.2.1 Basic field routing

In order to understand the next example an understanding of Python is required. If you require an introduction to Python please see the Python introduction in Appendix C. There are also a number of Python resources available (many online) which are listed in the references section.

The code that follows is in two parts. The first part is VRML code showing how the PythonScript node is inserted into the VRML interface. The second part is the Python code itself. The first part shows how a PythonScript node is created and used. The most important thing to notice is the way that the PythonScript contains nothing other than a reference to a URL. Also note the way in which the script's fields are referenced. The lines

# Routing for the sphere
ROUTE sphereSurface_BSS.armed TO touchHandler_psct.sphereTouched
ROUTE touchHandler_psct.sphereTouched TO sphereMaterial_m.diffuseColor


show how this is done. A route is created between an instance of a field in the PythonScript and the nodes in the VRML file.

Program 11: Getting snakey - A PythonScript example Part 1
File: box_sphere.wrl

#VRML V2.0 utf8
# The objects to be displayed
Group {
  children [
    Transform {
      translation -0.07 0 0
      children [
        Shape {
          appearance Appearance {
            material DEF cubeMaterial_m Material {
              diffuseColor 1 0 0
            }
            surface DEF cubeSurface_BSS ButtonFrictionalSurface {}
          }
          geometry Box { size 0.07 0.07 0.07 }
        }
      ]
    }
    Transform {
      translation 0.07 0 0
      children [
        Shape {
          appearance Appearance {
            material DEF sphereMaterial_m Material {
              diffuseColor 0 1 0
            }
            surface DEF sphereSurface_BSS ButtonFrictionalSurface {}
          }
          geometry Sphere { radius 0.05 }
        }
      ]
    }
  ]
}

# Python script node
DEF touchHandler_psct PythonScript {
  url "touchHandler.py"
}

# Routing for the sphere
ROUTE sphereSurface_BSS.armed TO touchHandler_psct.sphereTouched
ROUTE touchHandler_psct.sphereTouched TO sphereMaterial_m.diffuseColor


# Routing for the cube
ROUTE cubeSurface_BSS.armed TO touchHandler_psct.cubeTouched
ROUTE touchHandler_psct.cubeTouched TO cubeMaterial_m.diffuseColor

The Python script itself is listed in Program 12: Getting snakey. A PythonScript example Part 2. The first thing to notice is that it is unnecessary to include the line to load Python at the start. This is because the Reachin API loads a Python interpreter itself. It also includes some Python code above the script itself; this code defines things used by Python allowing it to communicate with the API.

The class definition is of interest to us. The first line:

class touchHandlerClass( TypedField( SFColor, None, SFBool ) ):

states that the class we are creating is a subclass of SFColor modified according to a TypedField. TypedField is the Python name for the C++ TypedField template. Normally SFColor fields may only be routed from other SFColor fields. This TypedField modification changes this behaviour to route only from SFBool fields (for more information on TypedField, see 7.3.3 TypedField and Reachin API's run-time type checking).

Another important part of the program is the initialisation function

def __init__( self, inColor ):
    SFColor.__init__( self )
    self.originalColor = inColor
    self.hit = 1

The __init__ member function in Python is equivalent to a C++ constructor. However, in contrast to C++, we must always explicitly call the base-class constructor (in this case SFColor.__init__). This is because all methods in Python are virtual and may be overridden by methods in sub-classes. This is particularly important as the C++ constructor of the node must be called in order to register the object with the runtime library of objects that the API maintains.

Program 12: Getting snakey. A PythonScript example Part 2
File: touchHandler.py

# Note: no "#!/usr/bin/python" loader line is needed

# The class for the instances to be created from
class touchHandlerClass( TypedField( SFColor, None, SFBool ) ):
    def __init__( self, inColor ):
        SFColor.__init__( self )
        self.originalColor = inColor
        self.hit = 1

    def evaluate( self, IN_SFBool ):
        if self.hit == 1:
            self.hit = 0


            return Color( 0, 0, 1 )
        else:
            self.hit = 1
            return self.originalColor

# Create instances of the classes
cubeTouched = touchHandlerClass( Color( 1, 0, 0 ) )
sphereTouched = touchHandlerClass( Color( 0, 1, 0 ) )

6.2.2 Accessing the scene graph directly

The PythonScript node has a second field called references. This is an MFNode which allows Python scripts to directly access nodes defined in the scene graph. Consider the following example:

DEF DIS Display {
  children [
    DEF A Shape {
      appearance DEF B Appearance {
        material DEF C Material {
          diffuseColor 1 0 0
        }
        surface DEF D FrictionalSurface {}
      }
      geometry DEF E Box {
        size 0.05 0.05 0.05
      }
    }
    PythonScript {
      url "python:print references; print writeVrml( references[4] )"
      references [ USE A USE B USE C USE D USE E ]
    }
  ]
}

Note that we have embedded the Python script directly into the VRML file in this example. This is indicated with the "python:" prefix. When the Python script executes, there is a global variable called references which is a list (not a field) of Nodes. By printing out references[4] we see that it is the Box node from the VRML script (labelled E). The Python script could then directly modify this node, for example with

references[4].size.set( Vec3f( 0.1, 0.1, 0.1 ) )

Alternatively it could set up routes to or from fields of the node.
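Continuing the example, a short sketch of direct field access through the references list (the indices follow the references field above; Vec3f and Color are the standard field value types):

# references[4] is the Box (E); references[2] is the Material (C).
references[4].size.set( Vec3f( 0.1, 0.1, 0.1 ) )    # resize the box
references[2].diffuseColor.set( Color( 0, 0, 1 ) )  # recolour it blue
print references[2].diffuseColor.get()              # read the value back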


You may use ReachinLoad to load Python files directly. ReachinLoad will create a Display node and a PythonScript node, add the Display node to the references field, add the filename to the url field and add the PythonScript node to the children field of the Display node.

6.2.3 Creating dependent fields

Reachin API uses lazy evaluation when updating its fields. What this means is that the evaluate function is generally only called on a field when the field's value is required. This is not always desirable behaviour. In particular, let us imagine creating a file using a Python script. No value within the API is required for evaluation, but a value must be written out. In order to do this we need to create a field that is dependent.

The next program, Program 13: Getting dependent Part 1, does this. The activation of a button is again routed to a field within a PythonScript node.

Program 13: Getting dependent Part 1
File: dependent.wrl

#VRML V2.0 utf8
# A nice square to be displayed and generate events.
Group {
  children [
    Transform {
      children [
        Shape {
          appearance Appearance {
            material DEF cubeMat Material {
              diffuseColor 1 0 0
            }
            surface DEF cubeSurface_BSS ButtonFrictionalSurface {}
          }
          geometry Box { size 0.07 0.07 0.07 }
        }
      ]
    }
  ]
}

# Python script node
DEF dependentHandlerS PythonScript {
  url "dependentHandler.py"
}


# Routing for the cube
ROUTE cubeSurface_BSS.armed TO dependentHandlerS.cubeTouched
# Notice no routing out from our field

The Python code is as follows in Program 14: Getting dependent Part II. The difference between this and the first example is that the class here is a subclass of Dependent. It is also interesting to note that as this class performs no initialisation it is safe not to explicitly call the super-class constructor, since it is called automatically when we do not override it.

Program 14: Getting dependent Part II
File: dependentHandler.py

import os

# The class for the instances to be created from
class dependentHandlerClass( Dependent( SFBool ) ):
    def evaluate( self, input ):
        print os.getcwd()
        print os.listdir( os.getcwd() )
        print input.get()

# Actually create the instance of the class
cubeTouched = dependentHandlerClass()

The program also shows how to access an incoming field value by using the get() method on the arguments to the evaluate() method. If the Python class has a number of inputs coming in to its evaluate function, they can be accessed as a Python list. This is done by using inputs[index].get(). For example if we had a class

class fred( TypedField( SFBool, None, SFFloat ) ):

we could access the incoming fields in the evaluate function as follows:

def evaluate( self, inputs ):
    tempFloat1 = inputs[0].get()
    tempFloat2 = inputs[2].get() - inputs[1].get()

There are a few points about Python scripts that are important to note. The first is that the types used by Python to communicate with Reachin API are pre-loaded by the Reachin API. The resource located at urn:inet:reachin.se:/python/PythonScript.py defines the interface used by Python scripts in the API. This file should be referred to as the definitive up-to-date reference for Python scripts in Reachin API. All


PythonScript scripts automatically import this module; however, if you define your own Python module to be imported into your PythonScript script, you may need to explicitly import PythonScript as follows.

from PythonScript import *

See urn:inet:reachin.se:/python/InsertMagellan.py for an example of this usage.

6.2.4 Creating nodes from VRML

Just as VRML scripts can instantiate PythonScript nodes, so can Python scripts create nodes from VRML scripts. As in C++, we can specify VRML scripts via a URN or directly with an in-line string. The following functions return a tuple ( nodes, defmap ) where nodes is a list of Node instances (see below) and defmap is a DefMap instance (see below) which allows access to the nodes labelled with the VRML DEF keyword.

def createVrmlFromString( str ):
    return nodes, defmap

def createVrmlFromURL( url ):
    return nodes, defmap

There is a Node class defined in PythonScript.py that corresponds to the C++ Node class. The Python Node class must not be instantiated directly; instead instances are returned from the createVrml* functions, as well as the get() functions of SFNode and MFNode fields. It defines a number of useful member functions as follows.

def isNull( self ):
    return reachinapi.isNodeNull( self.ptr )

def field( self, name ):
    if self.isNull():
        raise 'Null node reference'
    return reachinapi.getNodeField( self.ptr, name )

def __getattr__( self, name ):
    return self.field( name )

name returns the type name of the node (e.g. "Group").

def name( self ):
    if self.isNull():
        raise 'Null node reference'
    return reachinapi.getNodeName( self.ptr )
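Before continuing with the remaining member functions, a short usage sketch (the DEF name S is hypothetical):

# Create a Shape from an in-line VRML string. nodes is a list of
# Node instances; dm is a DefMap, described below.
nodes, dm = createVrmlFromString( """
DEF S Shape {
    appearance Appearance { material Material { diffuseColor 1 0 0 } }
    geometry Box { size 0.1 0.1 0.1 }
}
""" )
shape = nodes[0]
print shape.name()                   # -> "Shape"
print shape.appearance.get().name()  # -> "Appearance", via __getattr__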


Also, the fields, eventIns, eventOuts, exposedFields and elements functions return lists of strings describing the VRML interface for the node. The most useful member function is __getattr__, which allows usages such as myapp.material.get() where myapp is a Node instance, myapp.material is an SFNode field and myapp.material.get() is another Node instance (or None, indicating a NULL reference).

The DefMap class (defined in PythonScript.py) provides the following member functions, which may be used to retrieve either a DEFed node or a field of a DEFed node.

def findNode( self, node ):
    return reachinapi.defMapFindNode( self.dm, node )

def findField( self, node_dot_field ):
    return reachinapi.defMapFindField( self.dm, node_dot_field )

def __getitem__( self, a ):
    if string.find( a, "." ) != -1:
        return self.findField( a )
    return self.findNode( a )

Generally, when nodes such as FrictionalSurface (not a sub-class of Child) are instantiated in VRML, they must be assigned to a field (in this case the surface field of an Appearance node). This is because they cannot be assigned to the children field (which is of type MFChild), nor can they appear at the highest level of a normal VRML file. The following functions are similar to the general createVrml* functions above except they create a single node at the highest level that does not have to be a sub-class of Child. Also, they return a single node instance rather than a list of nodes. They also return the DefMap, since the single highest-level node may reference other nodes.

def createVrmlSingleFromString( str ):
    return node, defmap

def createVrmlSingleFromURL( url ):
    return node, defmap

An example usage might be as follows.

app, dm = createVrmlSingleFromString( """Appearance {
  material DEF MAT Material {}
}""" )
dm[ 'MAT.diffuseColor' ].set( Color( 1, 1, 0 ) )


Here we create an Appearance node (not a Child sub-class) that references a Material node named MAT. Then, using the DefMap, we set the material.diffuseColor field to yellow. This could also be done with

dm[ 'MAT' ].diffuseColor.set( Color( 1, 1, 0 ) )

Often we are uninterested in the DefMap and may employ the following compact usage.

n = createVrmlSingleFromString( "BumpmapSurface {}" )[0]

Finally we provide a function for returning the names of all the VRML interfaces currently registered with Reachin API.

def allNodes():
    return [ "Appearance", ... ]

6.2.5 Looking up URNs with Python

Python scripts may also access the Reachin API URN resolving system (see URNResolver.h). The following function provides access to the C++ URNResolver::findWithType function.

def lookupURN( urn ):
    return file, maintype, subtype

There is also a URN interface for the Python import command. One may write a Python module (e.g. mylib.py) and install it somewhere where the Reachin API URN resource system can find it with a given URN (e.g. urn:inet:xyz.com:/python/mylib.py). Then one may use the importURN function as follows.

importURN( 'urn:inet:xyz.com:/python/mylib.py' )

This is equivalent to "from mylib import *", except you do not have to make sure that the mylib.py file is somewhere in the Python library path sys.path.

6.2.6 Accessing the bindable stacks

Special access to the Reachin API bindable stacks is provided for Python scripts (see sections 2.4 and 7.5.1 and the Bindable.h and BindableInterface.h files). The general purpose function expects a string giving the name of the bindable stack (e.g. 'DeviceInfo').

def getBindableStackTop( name ):
    ...


For convenience we provide direct functions for the built-in bindable types: getBackgroundTop(), getDeviceInfoTop(), getNavigationInfoTop(), getRenderHintsTop(), getStereoInfoTop() and getViewpointTop(). However, the general function is necessary if you create your own bindable nodes in C++.
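As a minimal sketch, both access styles look as follows; the 'MyBindable' stack name is hypothetical and assumes a corresponding bindable node has been registered from C++.

# convenience function for a built-in bindable type
di = getDeviceInfoTop()

# the general function, required for user-defined bindables
mine = getBindableStackTop( 'MyBindable' )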






Part III - Low-Level Programming

For this section it is assumed that the reader has a good working knowledge of ANSI C++. In particular, more modern C++ features including namespaces, templates and the STL are assumed knowledge. See the C++ reference section at the back for references to background information.

It is particularly useful to have the Reachin API Online Reference Manual (located in the API/doc directory) on hand while working through this section. The Reachin API header files are important for developing an understanding of the API. Another part of the Online Reference Manual that is particularly useful for this section is the Reachin API class hierarchy, which is available from the index page of the documentation via the hierarchy link.




7 Reachin API and C++

7.1 Building a scene graph from C++

The Reachin API uses VRML as more than just a convenient interface for loading objects; it also defines the object structure used in the C++ part of the API to define the complete hierarchy of objects. This means that the scene graph structure internal to the API is the same as the structure used in VRML. Creating scene graphs using C++ is very similar to creating scene graphs using VRML, although it is rather more verbose. The differences are that all objects need to be created by hand, and all field values need to be set. In order to do that, the compiler needs to know about all the class specifications, and so the appropriate header files need to be included.

7.1.1 Creating nodes from C++

To create nodes from C++, include the corresponding header file, and create an instance of the desired node. For example, to create a Material node:

#include <Material.h>

Material *mat = new Material;


7.1.2 Reference counting

Since the nodes in Reachin API are joined together via references to form the scene graph, we implement the reference-counting model of memory management at the node level. Nodes may be allocated as necessary and do not need to be explicitly deleted. When creating scene graph nodes from VRML there is no issue, since all nodes are constructed and immediately referenced by a field (such as the children field of a Group node). However, when creating a scene graph from C++ we need to make sure that all nodes in our scene graph get referenced. The reference counting is managed by the SFNode and MFNode fields, and in general, pointers to Nodes should only be held by SFNodes or MFNodes. If not, the AutoRef template should be used to instantiate the node (see AutoRef.h).

When creating a scene graph from C++ this means that all nodes that are added to the scene graph get referenced when added to another SFNode or MFNode in the scene graph. The exception to this is the top node in the scene graph structure. When creating a scene graph for the Reachin API, the topmost node is the Display, and hence that node is not added to an SFNode or MFNode field. To maintain the reference counting, the Display then needs to be instantiated with the AutoRef template as follows.

#include <Display.h>
#include <AutoRef.h>

AutoRef< Display > display( new Display );

7.1.3 Setting Field Values

From VRML, when we want to initialise fields with a specific value we set them directly as follows.

appearance Appearance {
  material Material {
    diffuseColor 0 1 0
  }
}

To do the same from C++ we need to explicitly create all nodes that we want to use, and we must also set all their fields to their correct values. To achieve the corresponding effect as in the VRML above, we first need to create a Material node and set its diffuseColor field to green.


At this point, it is worth mentioning the naming conventions that are used by Reachin API. When a VRML field name is several words joined together, e.g. diffuseColor, all but the first word begin with a capital letter. Fields in C++ (indeed all data members) are lowercase with words separated by an underscore, e.g. diffuse_color.

To set a field to a specific value you need to know its type. Looking in the Online Reference Manual for the Material node, it can be seen that the diffuse_color field is of type SFRGB, meaning that the field contains an RGB structure as its value. So to set the diffuse_color of the material we call the set function of the field with an RGB value, i.e.

Material *mat = new Material;
mat->diffuse_color->set( RGB( 0, 1, 0 ) );

In the same way, by creating an Appearance node and setting its material field to the Material node created earlier, we create the same structure as specified from VRML above.

Appearance *app = new Appearance;
app->material->set( mat );

7.1.4 Adding children to group nodes

To build up the scene graph structure, the nodes in the scene graph need to be added as children of the Display. Adding a node as a child of a grouping node is done by calling the add function of the children field of the grouping node with the node we want to add as the argument.

display->children->add( sh );

The add function of the children field adds the sh node as a child of this node, if the given child isn't already a child of this node. The function returns true if the child was added successfully.

7.1.5 Making the scene run

To start the event loop and multi-sensory rendering, the static Scene::startScene function needs to be called. For the function to work, there needs to be a scene defined. In this case the Display node takes care of this for us.

#include <Scene.h>

Scene::startScene();


7.1.6 Running the C++ created scene graph

Program 15 shows a simple C++ program that creates a scene graph and renders it using the Display node.

Program 15: Let's make a scene
File: scene.cpp

#include <Display.h>
#include <Shape.h>
#include <Appearance.h>
#include <Material.h>
#include <FrictionalSurface.h>
#include <Sphere.h>
#include <Scene.h>
#include <AutoRef.h>

using namespace Reachin;

void main () {
  try {
    // Create nodes
    AutoRef< Display > display( new Display );
    Shape *sh = new Shape;
    Appearance *app = new Appearance;
    Material *mat = new Material;
    FrictionalSurface *surf = new FrictionalSurface;
    Sphere *sph = new Sphere;

    // Set values
    surf->stiffness->set( 500 );
    sph->radius->set( 0.05 );
    mat->diffuse_color->set( RGB( 0, 1, 0 ) );

    // Build the scene graph
    app->material->set( mat );
    app->surface->set( surf );
    sh->appearance->set( app );
    sh->geometry->set( sph );
    display->children->add( sh );

    Scene::startScene();
  }
  catch( Error::Error &e ) {
    cerr << e << endl;
  }
}


7.2 Using VRML to create nodes

To simplify the creation of scene graph structures for use from C++, there is an option to create the scene graph nodes from VRML code, and then use these nodes from your program. There are two ways to do this. Either the VRML code is specified in a string in the C++ code, or the VRML code is loaded from a separate file. The latter alternative is often preferable as it makes it possible to change parameters without the need to recompile.

With the functions described in this section, only nodes of type Child, i.e. only nodes that can be used in the children field of a group, can be created. These nodes can have nodes of other types as values for their fields. The reason that only children can be the base nodes to these functions is that all nodes created from VRML will be added as children to a Group node. To use these VRML features the header file vrml.h needs to be included. Functions for creating non-Child nodes can be found in VRMLSingle.h.

7.2.1 Using a Group of created nodes

There are a number of ways in which these functions can be used. The simplest is just to use the Group node returned, which has all the nodes specified in VRML as children. This is sufficient if no access to the separate nodes is required.

Group *createVrmlFromString( const char *vrml_string );
Group *createVrmlFromURL( const char *file_name );

To create a program that displays a sphere on the screen, all that is required is to specify the scene graph in VRML, i.e.

Group *g = createVrmlFromString( "Display { ... }" );

For demonstration purposes we use a string directly in the code to specify the VRML code for the nodes to create. However, it is often more useful to use the createVrmlFromURL function and specify a file from which the VRML code comes. The loader for the VRML files in the previous section was created in this way, the only addition being that a Display node was added at the top level in order to simplify things.

7.2.2 Creating several separate nodes from VRML

In all the commands for creating nodes from VRML code, the first argument is always a VRML string, either directly in the code, or specified in a file. It is also possible to specify up to four additional arguments to the function. These


arguments are pointers to nodes (child nodes). The pointers are set to point to the nodes created from the VRML. The first node is placed in the first node pointer, the second node created is placed in the second node pointer and so on. The node pointer must be of the same type as the node created in VRML or else an exception is thrown.

Group *createVrmlFromString( const char *vrml_string,
                             *&n1 [, up to three additional pointers] );

If created in this way we have access to the separate nodes, which can be used, for example, when loading up different nodes for configuration purposes. The following code segment is an example that shows how a calibration file with two Transform nodes is used to initialise the calibration settings of the haptic device.

void PhantomDevice::initialize() {
  DEFMap defmap;
  Group *g = createVrmlFromURL( calib_url->get(), &defmap );
  Transform *pos, *orn;
  Box *pos_reset, *orn_reset;
  defmap.find( "POSITION", pos );
  defmap.find( "ORIENTATION", orn );
  ...

7.2.3 Using DEFed Nodes and Fields from VRML

Reachin API also supports a way of using nodes that have been given a name in VRML through the DEF keyword. This enables one to reference fields and nodes created in VRML via C++.

To use the node names DEFed in VRML, the function to create nodes from VRML needs to be given another argument. This argument is a table in which it places a mapping between DEFed names and actual nodes.

Group *createVrmlFromString( const char *vrml_string,
                             DEFMap *def_use_map );
Group *createVrmlFromURL( const char *file_name,
                          DEFMap *def_use_map );

To gain access to the node identified by a name, the find() function of the mapping table is used. The find function has two arguments: the first is the given name of the node and the second a pointer that will be set to point at the node. Naturally, the type of the node pointer that is the second argument needs to match the node created in VRML. The node found is then returned in the node pointer.

void DEFMap::find( const char *name, T *&t );


There are also ways of directly accessing the fields from VRML. By DEFing a name for a node in VRML, the fields of the node are accessible. Two ways of specifying the field are supported. Either call the find function with the DEFined node name as first argument and the field name as the second argument, or specify the field and node in the form nodename.fieldname, where the nodename is the DEFined name of the node, and fieldname is the name of the field. These functions are declared as follows.

Field *DEFMap::find( const char *node, const char *field );
Field *DEFMap::find( const char *node_field );

It is important to note that the find functions for Field return pointers to objects of type Field, and as such are generally only used for routing. In order to set() or get(), the pointer to the field needs to be cast to the appropriate type, e.g. SFInt32. For more information see section 7.3.

The following example shows some different ways of using the DEFined names from VRML.

Program 16: DEFinitive answers
File: defMap.wrl

#VRML V2.0 utf8
DEF myTransform Transform {
  translation 0.12 0 0
  children [
    Shape {
      appearance Appearance { }
      geometry DEF vrmlSphere Sphere {
        radius 0.1
      }
    }
  ]
}

File: defMap.cpp

#include <Display.h>
#include <Transform.h>
#include <Sphere.h>
#include <Shape.h>
#include <Scene.h>
#include <vrml.h>

using namespace Reachin;

void main () {
  try {


    // Create a Display node as the top of the scene graph
    AutoRef< Display > display( new Display );

    // A table where to store DEFined names from Vrml
    DEFMap def_map;

    // Create nodes and save their names in the mapping table
    Group *g = createVrmlFromURL( "defMap.wrl", &def_map );

    // Find the transform node and add it to the Display
    Transform *my_transform;
    def_map.find( "myTransform", my_transform );
    display->children->add( my_transform );

    // Create a new sphere
    Sphere *my_sphere = new Sphere();

    // Let the radius of the created sphere follow the radius of
    // the sphere created in Vrml
    def_map.find( "vrmlSphere.radius" )->route( my_sphere->radius );

    // Create a shape, and let the new sphere be its geometry.
    // Add this new shape to the Display node
    Shape *my_shape = new Shape();
    my_shape->geometry->set( my_sphere );
    display->children->add( my_shape );

    // start the event loop
    Scene::startScene();
  }
  catch( Error::Error &e ) {
    cerr << e << endl;
  }
}
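As noted above, find() returns a plain Field* when retrieving fields, so to call set() or get() it must be cast first. A minimal sketch, assuming the radius field's type is SFFloat (as in the VRML Sphere specification):

SFFloat *radius =
  static_cast< SFFloat* >( def_map.find( "vrmlSphere.radius" ) );
cout << radius->get() << endl;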


7.3 Fields

This section requires familiarity with templates. It is important that readers are familiar with the syntax and semantics of templates, and have a reference on this topic nearby when reading this section. Consult the references on C++ for more information.

Probably the most difficult part of using fields is carefully and correctly using and understanding the templates that define them. The base class Field itself implements routing properties, but does not actually contain any field value data. Instead, template modifiers are applied to Field types in order to add data members and change the behaviour. They do this by adding extra methods and overriding or extending virtual functions. For example, the SField and MField field templates encapsulate an arbitrary data type. In this sense they can be considered similar to STL containers 3.

As has been outlined in section 3.1.3, fields propagate values in a lazily evaluated manner. This makes them an efficient means of managing change propagation in the scene graph.

Each field has zero or more input fields, which are the fields routed into it. Fields may also have output fields, which are the fields that it routes into. This terminology will be used throughout this section.

In Program 17 below, we encapsulate integers using the SField template. This allows us to route them together and investigate the effects of setting, getting, routing and unrouting.

In particular, note how routing from one field to another does not change the value contained in the field we are routing to. This is because fields only update with respect to input fields when the input fields generate events, that is, when the value of the output field is out-of-date with respect to an input field. Values of output fields become out-of-date with respect to input fields when the value of the input field is changed, or is touched. This is done by using the set() method or the touch() method of the input field and using the get() method on the output field. When the get() method is called on the output field, it checks to see if it has any events pending from any input field routed to it using the validate() method. If it does, then it updates its own value.

Program 17: "Fred, Joe. Joe, Fred."
File: fredJoe.cpp

#include <Fields.h>
#include <iostream>

using namespace Reachin;

3 STL containers contain a data type, as do the derived encapsulation extensions of Field, but not all the extensions of Field have iterators, so they cannot be called containers.


void main() {
  // Usually we would use SFInt32 which is typedefed to SField< int >
  // (see Fields.h).
  SField< int > *fred = new SField< int >;
  SField< int > *joe = new SField< int >;

  fred->set( 3 );
  joe->set( 2 );
  cout << "fred: " << fred->get() << " joe: " << joe->get() << endl;

  // ROUTE FRED TO JOE: JOE'S VALUE IS UNCHANGED UNTIL AN EVENT ARRIVES
  fred->route( joe );
  cout << "fred: " << fred->get() << " joe: " << joe->get() << endl;

  // TOUCH FRED'S VALUE TO SET JOE
  fred->touch();
  cout << "fred: " << fred->get() << " joe: " << joe->get() << endl;
  ...
}


In Program 18, an example is set up where a number of SFields are routed into a ComposeMField, which is then routed into an MField. This example also shows how to iterate over the value of an MField in three different ways.

Program 18: Thar she flows
File: fieldFlow.cpp

#include <Fields.h>
#include <SField.h>
#include <MField.h>
#include <ComposeMField.h>
#include <iostream>

using namespace Reachin;

void main() {
  SField< int > *francis = new SField< int >;
  SField< int > *paulie = new SField< int >;
  SField< int > *chris = new SField< int >;

  typedef ComposeMField< SField< int >, MField< int > > ComposeInt;
  ComposeInt *nick = new ComposeInt;

  MField< int > *matthew = new MField< int >;

  // Route em up
  francis->route( nick );
  paulie->route( nick );
  chris->route( nick );
  nick->route( matthew );

  // Set em up
  francis->set( 2 );
  paulie->set( 4 );
  chris->set( 5 );

  // Go through matthew's values with a straight iterator
  cout << "matthew:";
  for( MField< int >::const_iterator itr = matthew->begin();
       itr != matthew->end(); ++itr ) {
    cout << " " << *itr;


  }
  cout << endl;
  ...
}

7.3.2 Class modifier templates

As noted above, field modifier templates take an existing field class as a template parameter and derive from it, adding extra methods and overriding or extending virtual functions. In outline, a modifier looks as follows.

template< class T >
class MyModifier : public T {
  // override a virtual function.
  virtual void bar() { cerr << "MyModifier::bar" << endl; }
};
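As a usage sketch (the names here are our own illustration, not from the API), such a modifier is applied by instantiating it on a concrete field type:

// apply the modifier to SFInt32: the result behaves as an SFInt32
// with bar() overridden
typedef MyModifier< SFInt32 > MyInt32;
MyInt32 *f = new MyInt32;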


7.3.3 TypedField and Reachin API's run-time type checking

Run-time type checking of a route needs to be done only once, as the route is put in place. For example, the SField and MField types (see 7.3.1) implement run-time typing policies to ensure that they only route from fields of the same type.

The TypedField field modifier template is an easy way to define the type-checking virtual methods which will perform the appropriate dynamic_cast checks as the routes are put in place. It provides type checking for routes. It takes two or three template parameters. The first indicates the field class to be modified; the remaining arguments determine the types of inputs that the class can receive. The second is a list of types that determine what the first N inputs to the field must be. The list of types can be constructed using the STL pair template and is interpreted as follows.

void                          is a list of zero types
G1                            is a list of one type, G1
pair< G1, G2 >                is a list of two types, G1 and G2
pair< G1, pair< G2, G3 > >    is a list of three types, G1, G2 and G3

and so on. The number of types allowed in such a list is, in theory, unlimited. The third argument says that after the N types defined in the second argument, any further inputs must be of a certain type. For example, consider the following TypedField instantiations.

TypedField< F, void, H >      States that the field is a derivative of type F
                              and allows any number of inputs of type H

TypedField< F, void >         States that the field is a derivative of type F
                              and accepts no inputs

TypedField< F,                States that we have a derivative of type F and
  pair< G1,                   must have exactly 3 inputs of type G1, G2
    pair< G2, G3 > > >        and G3

TypedField< F,                States that the field is a derivative of type F
  pair< G1, G2 >, H >         and accepts exactly 2 inputs of type G1 and G2,
                              followed by any number of inputs of type H

Obviously these type specifications can become somewhat unwieldy. Often they are stated only once (for example Group::Renderer in Group.h); however, it is wise to use typedefs for field types that will be used many times. We define a large number of such typedefs in Fields.h. They are for


convenience and in some cases to match the type names used in VRML (e.g. SFInt32, etc.). We recommend that one uses these definitions to avoid confusion. For example, in Program 18 we wrote:

SField< int > *chris = new SField< int >;

Whereas typically this should be written:

SFInt32 *chris = new SFInt32;

7.3.4 FField and EvaldFField

SField and MField update their value when necessary by simply copying the value from the field that most recently sent them an event. This behaviour is implemented in the virtual update method of those fields. We can override this function in sub-classes to change this behaviour. Generally speaking, we could define the update function to set the field value based on an arbitrary function of the input fields' values.

The FField field modifier template is provided for this purpose. It keeps track of all the fields that are routed to it with the inputs data member 5. In the update function we can query any or all of the input fields and define our value however we like. In Program 19 we define a sub-class of SFInt32 that defines its value as the sum of all the SFInt32 fields routed into it.

Program 19: A Function Field Example
File: funkField.cpp

#include <Fields.h>
#include <iostream>

using namespace Reachin;

// The following declaration indicates that Summer is a class derived
// from an SFInt32 and takes inputs from SFInt32s
class Summer : public TypedField< FField< SFInt32 >,
                                  void, SFInt32 > {
  virtual void update() {
    cerr << "Summer::update()" << endl;


    value = 0;
    // sum the values of all the input fields routed to us
    for( unsigned int i = 0; i < inputs.size(); ++i )
      value += static_cast< SFInt32* >( inputs[i] )->get();
  }
};

void main() {
  // construction.
  SFInt32 *fred = new SFInt32;
  SFInt32 *emma = new SFInt32;
  SFInt32 *bred = new SFInt32;
  Summer *sum = new Summer;

  // routing.
  fred->route( sum );
  emma->route( sum );
  bred->route( sum );

  // setting (and event generating).
  fred->set( 3 );
  emma->set( 2 );
  bred->set( 4 );

  // evaluation.
  cout << "the sum is " << sum->get() << endl;
}


7.3.5 Dependent and Independent

By default, a field is lazily evaluated: its update function is not executed until the value is queried. A field made dependent, by contrast, evaluates as soon as an event arrives rather than waiting until the value is queried. In Program 20, the class DependentDrone is made dependent by using the class template modifier Dependent on an SFInt32. It is useful to compare the behaviour of this program to the previous programs. As soon as set() is called on the emma field, the update() function of fred is executed.

Program 20: A Dependency Problem
File: dependencyProblem.cpp

#include "Fields.h"
#include <iostream>

using namespace Reachin;

// This declaration says that we are a subclass of SFInt32 and
// we can have as many SFInt32s routing into us as we like.
class DependentDrone : public TypedField< Dependent< SFInt32 >,
                                          void, SFInt32 > {
  virtual void update() {
    cerr << "DependentDrone::update()" << endl;
  }
};

void main() {
  DependentDrone *fred = new DependentDrone;
  SFInt32 *emma = new SFInt32;

  emma->route( fred );

  // update() is executed immediately, not at the next get().
  emma->set( 2 );
}


7.3.6 MFNode and SFNode

The SFNode and MFNode fields hold references to nodes. They differ from the plain SField and MField encapsulations in the following ways.

• We implement two virtual functions with the following declarations.

  virtual void enterNode( Node *n );
  virtual void exitNode( Node *n );

  These are called when the field begins (or ceases) to hold a reference to the node n. These functions handle the reference counting for the referenced nodes (see section 7.1.2). Sub-classes may extend these functions. Take care to call the base-class versions of these functions from any overridden versions.

• The value data member is initialised to 0 (normally the value is not initialised).

It is important to use MFNode and SFNode whenever storing a reference to a Node. This is because allocation and deallocation of Nodes is managed by a reference counting system.

We provide run-time typing of the types of Node sub-classes that can be stored in a particular SFNode or MFNode with the TypedSFNode and TypedMFNode field modifier templates. They do this by extending the enterNode virtual function of either SFNode or MFNode and by attempting to dynamic_cast the entering node to the specific node type. If this fails then a runtime exception is generated (see TypedSFNode.h and TypedMFNode.h for details).

These types of fields are used extensively in the API. For example, the geometry field in the Shape node is derived from TypedSFNode< Geometry > (as typedefed in Fields.h). This allows it to hold references to Geometry or any of its sub-classes.

7.4 Extending Reachin API Nodes

In order to access the full power of the API, the developer often needs to extend the nodes with additional and modified behaviour. In contrast to ordinary object-oriented programming, where the programmer just extends or overrides virtual functions, in Reachin API the user may extend and override functionality of the data members of the nodes. This is due to the field structure of the API. Since almost all node functionality is implemented by the component fields, it is the fields themselves and their functions that we need to change. To allow this, all nodes provide data member specialisation by using the Defined template in their constructor (see sections 7.4.1 and 7.4.2).

The use of VRML to configure the scene graph is supported by exposing various fields of a node via a VRML interface. Reachin API supplies a means for the user to provide a VRML interface for new node types. This is described in sections


7.4.3 and 7.4.4. To be able to load new nodes via their VRML interface, the Loader (ReachinLoad.exe) needs to be recompiled to incorporate the object files of the new node, which is described in section 7.4.5.

7.4.1 Data member Specialisation

To be able to specialise data members of a base class, the API provides the Defined template; see Appendix D: Data Specialisation in C++ for details. The Defined template allows one to specialise not just member functions, as in C++, but also data members. All node classes use the Defined template when specifying their data members in the constructor. This makes it possible to override a data member of a node by sending a new specialised data member 8 as an argument to the constructor of the base class.

For a non-trivial example, the Geometry class (see the Online Reference Manual class hierarchy diagram and the specification of Geometry) has a data member called renderer. The renderer is of type RenderGLField, which has a virtual method called makeGL(). makeGL() is used to generate the OpenGL code to render a sub-class of Geometry. By default makeGL() does nothing. It is up to the sub-classes to define it. So, in order to specify a geometry represented by certain OpenGL code, a data member which is a sub-class of RenderGLField needs to be declared with a makeGL() function that creates the correct OpenGL code. In the constructor of the new geometry an instance of this data member is sent as the argument to the constructor of the sub-class of Geometry.

OurGeometry() :
  Geometry( 0, new OurRenderer ) {};

Observe that the order of the arguments to the constructor is very important here. In the case of Geometry, Geometry's constructor takes a number of arguments, the second one being the RenderGLField. If there are no arguments to the constructor, or the arguments are just zeros, then the constructor will create an instance of the original data member. If there is a non-zero data member then that data member will be instantiated as the data member instead. If the data member to specialise is the third argument to the constructor then the first and second arguments must be zeros. The arguments after the third argument can either be zeros or left out.

With this in mind we create a geometry sub-class that defines its own RenderGLField that takes care of rendering the object 9.

8 Note that this data member must have the old data member as a base class.
9 Note that each object just specifies how to render itself in its local coordinate space. The transformation of the geometry to the global coordinate space is taken care of by the Transform node.


#include <Geometry.h>

using namespace Reachin;

class OurGeometry : public Geometry {
protected:
  // our specialized renderer
  struct OurRenderer : public RenderGLField {
    virtual void makeGL() {
      // Create the OpenGL code for the geometry
    }
  };

  // the constructor
  OurGeometry() :
    Geometry( 0, new OurRenderer ) {}
};

If the specialised data member contains any new data members of its own, in addition to the original ones, those data members cannot be accessed directly using the old variable name, since its type does not include the new variable. For example, say that we needed a reference to OurGeometry in OurRenderer from the example above; then we would have:

struct OurRenderer : public RenderGLField {
  OurGeometry *our_geometry;  // This is our NEW variable

  virtual void makeGL() {
    // Create the OpenGL code for the geometry in some way
    // using the geometry.
  }
};

In the constructor of OurGeometry, to be able to set the our_geometry variable in the renderer, we need to do a static type cast to reach the new parts of the overridden object. Since the renderer is an auto_ptr, we first need to get the actual pointer to the renderer before we can do the typecast.

static_cast< OurRenderer* >( renderer.get() )->our_geometry = this;

Keep in mind that this usage of .get() usually indicates an auto_ptr member function, whereas ->get() usually indicates an SField or MField member function.


7.4.2 Allowing data member specialisation by a sub-class

In the previous example, we made use of the Geometry node's specialisation-friendly constructor to create the OurGeometry sub-class. Suppose we would now like to sub-class from OurGeometry. It is only polite that the same services provided by Geometry are passed down from OurGeometry. All that needs to be done is to use the Defined template for the member in the constructor of the class. The desired type of the member is the argument to the Defined template, and the default value of the argument should be zero so that the default behaviour is to create the specialised type.

The Geometry node allows specialisation of both the Renderer and Collider components. The following revised example shows how to pass on this capability in our sub-class, as well as for an additional component called NewMember.

#include <Geometry.h>

using namespace Reachin;

class OurGeometry : public Geometry {
protected:
  // our specialized renderer
  struct OurRenderer : public RenderGLField {
    OurGeometry *our_geometry;
    virtual void makeGL() {
      // Create the OpenGL code for the geometry
    }
  };

  struct NewMember {
    // does nothing
  };

  // the constructor
  OurGeometry( Defined< Collider > _collider = 0,
               Defined< OurRenderer > _renderer = 0,
               Defined< NewMember > _new_member = 0 ) :
    new_member( _new_member ),
    Geometry( _collider, _renderer ) {
    static_cast< OurRenderer* >( renderer.get() )->our_geometry = this;
  }

  auto_ptr< NewMember > new_member;
};

Here we are passing Collider along, specialising Renderer and introducing NewMember.
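As a further sketch (the names are our own illustration, following the conventions just described), a sub-class of OurGeometry could in turn specialise the NewMember component, passing zeros for the components it leaves alone:

class OurDerivedGeometry : public OurGeometry {
protected:
  // our specialisation of the NewMember component
  struct OurNewMember : public NewMember {
    // new behaviour would go here
  };

public:
  OurDerivedGeometry() :
    OurGeometry( 0, 0, new OurNewMember ) {}
};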


7.4.3 Creating a VRML-interface for a node

An Interface specifies the VRML aspect of a node. To be instantiated from VRML, a node needs to define its own interface. Through this interface a VRML file can access the different parts of the node. This interface includes the string name of the Node, a function for instantiating an instance of the node (usually the create static method of the node), and a description of the events and fields in the Node's interface. While the details are not particularly complicated, it is usually sufficient to copy-and-paste an interface definition from somewhere and make the necessary text substitutions.

To create an interface for the node, a static constant variable of the interface needs to be added to the class declaration.

static const Interface interface;

Specifying the fields is done by first specifying the access that VRML has to the fields, i.e. whether each field has access defined by eventIn, eventOut, exposedField or field 10. To identify the field, each one is given a VRML name as a string and the address of the associated data member within the node. Observe that the interface does not have any default values for the fields. The default settings of the fields are set in the constructor of the node.

Below is an example from the Reachin API, which shows how the interface of the Group node is specified. By comparing this definition with the VRML interface of the Group node, as in the Online Reference Manual or the VRML specification, it can be seen how the interface function description corresponds to the VRML specification and how straightforward it is to create a VRML interface for a node.

const Interface Group::interface( "Group",
  typeid( Group ), Create< Group >::create,
  eventIn     ( "addChildren",    &Group::add_children    ) +
  eventIn     ( "removeChildren", &Group::remove_children ) +
  exposedField( "children",       &Group::children        ) +
  field       ( "bboxCenter",     &Group::bbox_center     ) +
  field       ( "bboxSize",       &Group::bbox_size       ) );

Note that all fields that are exposed to the VRML interface need to be auto_ptrs, e.g.

auto_ptr< MFChild > children;

and hence also need to be created in the constructor of the node, i.e.

Group::Group() :
  children( new MFChild ) {}

10 Note that this is not a Reachin API field. Field in VRML means a field that can only be set on initialisation and cannot be routed to or from (see section 3.1).
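Following the copy-and-paste advice above, the interface for a hypothetical node MyNode with a single exposed field value (both names are our own illustration) would look like this sketch:

const Interface MyNode::interface( "MyNode",
  typeid( MyNode ), Create< MyNode >::create,
  exposedField( "value", &MyNode::value ) );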


7.4.4 Loading created nodes

The Loader (ReachinLoad.exe) that is included with the API is linked with Reachin API's nodes, but not with nodes created by users. To instantiate newly created nodes, one must create a loader and link it with the new code. Program 21 is such an example.

Program 21: A simple loader

#include <Display.h>
#include <AutoRef.h>
#include <Scene.h>
#include <vrml.h>

using namespace Reachin;

void main( const int argc, const char *argv[] ) {
  if( argc != 2 ) {
    cerr << "usage: " << argv[0] << " <vrml-file>" << endl;
    return;
  }

  AutoRef< Display > display( new Display );
  Group *g = createVrmlFromURL( argv[1] );
  display->children->add( g );

  Scene::startScene();
}


Note that we need to AutoRef the Display since we are creating it ourselves, whereas the nodes returned by createVrmlFromURL are automatically referenced, so we do not need to AutoRef those.

7.4.5 Extending a VRML interface for a node

When building a node by inheritance, it is often desirable to expose the base-class VRML interface in the created node, and add some new exposed fields to the inherited interface. To expose an inherited interface, add the address of the base-class interface as the final argument, as in the following example.

const Interface GravityDynamic::interface( "GravityDynamic",
  typeid( GravityDynamic ), Create< GravityDynamic >::create,
  exposedField( "source",        &GravityDynamic::source        ) +
  exposedField( "gravity",       &GravityDynamic::gravity       ) +
  exposedField( "source_radius", &GravityDynamic::source_radius ),
  &Dynamic::interface );

Here we expose three new fields in GravityDynamic and pull in the fields defined in Dynamic::interface (all 34 of them!).

7.4.6 Initialize

When a node is created via VRML, the order in which the field values are set is quite important.

First the node is created and the constructor of the node is called, so any values that are set in the constructor are set. This includes the node default values (e.g. a Sphere's radius is set to 1). After the construction of the node, all the field values defined in the VRML file are set, in the order in which they are specified.

This is enough in most cases, but in some circumstances final initialisation needs to be done after the values have been set from VRML. To achieve this, nodes have a virtual method called initialize(). The initialize() method is called the first time a node gets referenced (see section 7.1.2), and at this stage the field values have been set both from the constructor and from the VRML interface.

For example, suppose we have a node that uses an initialisation file. The file has a default name "defaultIni" which we would like to allow to be overridden from VRML. We only want to load the contents of the initialisation file once. The constructor is an inappropriate place to load the initialisation file since the VRML code has not had a chance to change the file name yet.


#include <Child.h>
#include <iostream>

using namespace Reachin;

class OurNode : public Child {
public:
  static const Interface interface;
  auto_ptr< SFString > ini_file;

  OurNode() : ini_file( new SFString ) {
    ini_file->set( "defaultIni" );
  }

  virtual void initialize() {
    // At this point the value of the ini-file has
    // either been set explicitly in the VRML file or
    // the constructor's default value has been used.
    cout << ini_file->get() << endl;
  }
};
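A matching interface definition for OurNode might look as follows. This is a sketch in the style of section 7.4.3; the VRML field name iniFile is our assumption rather than a listing from the API.

const Interface OurNode::interface( "OurNode",
  typeid( OurNode ), Create< OurNode >::create,
  field( "iniFile", &OurNode::ini_file ) );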


7.5 Inspecting the hardware devices

7.5.1 The DeviceInfo

The DeviceInfo is a bindable (see Bindable.h) node that represents the set of hardware input devices that are currently available. We query the current DeviceInfo with the getStackTop static function as follows.

DeviceInfo *di = DeviceInfo::getStackTop();

The devices field of DeviceInfo contains a vector of Device instances. An individual device may be queried by name using the operator[] function as follows.

Device &haptic_device = (*di->devices)[ "tracker" ];

Each different type of hardware device has various state fields. The fields of the Device nodes can in turn be retrieved through the operator[] function of the devices, which extracts the field of the given name as follows.

SFVec3f *device_position =
  static_cast< SFVec3f* >( haptic_device[ "position" ] );
SFRotation *device_orientation =
  static_cast< SFRotation* >( haptic_device[ "orientation" ] );

Observe that all values are in global coordinates.

The following program shows how the DeviceInfo can be used to access the button of the haptic device. The button is then routed to a field that just prints out the state of the button. Observe that the button tracker needs to be a Dependent field, since we want it to evaluate as soon as the button is pressed.

#include <DeviceInfo.h>
#include <Scene.h>
#include <Fields.h>

using namespace Reachin;

struct ButtonTracker : public Dependent< SFBool > {
  virtual void update() {
    if( static_cast< SFBool* >( notif )->get() )
      cout << "button pressed" << endl;
  }
};


void main() {
  ButtonTracker *button_tracker = new ButtonTracker;

  DeviceInfo *di = DeviceInfo::getStackTop();
  Device &haptic_device = (*di->devices)[ "tracker" ];
  haptic_device[ "button" ]->route( button_tracker );

  // Start the scene
  Scene::startScene();
}

7.6 The scene graph loop

Reachin API executes two parallel event loops. One of them is configured to execute once per millisecond and is used to interface with the haptics rendering hardware. It is important for haptics rendering to occur at regular one-millisecond intervals in order to preserve the stability and quality of the haptics rendering. This loop is referred to as the realtime loop. The realtime loop does not interface directly with the scene graph or field network. It works with temporary, simplified and localised objects which are derived from the scene graph representation.

Running concurrently with the realtime loop in a separate process is the scene graph loop. The scene graph loop simply runs as fast as possible given the CPU constraints of the realtime loop. It is responsible for interfacing with and interpreting the scene graph data structures (via traversals). Reachin API handles communication between the scene graph loop and the realtime loop at a low level.

Following is pseudo code for the body of the scene graph loop.

• Update the global Timer::time field to the current time.
• Perform the collision traversal of the scene graph (see section 7.6.1).
• Perform graphics rendering of the scene graph.

Note that the frame rate of the graphics rendering is the same as the collision traversal rate and the scene graph loop rate. These terms may be used interchangeably.

7.6.1 The collision traversal

Each scene graph loop, an explicit traversal of the scene graph is performed for the purposes of haptics rendering. This traversal is referred to as the collision traversal, since we are detecting collisions of the haptic device with various objects in the scene. Each Drawable node has a Collider component which manages the collision traversal for that Drawable. A CollisionState object is created at the beginning of the collision traversal and passed to each of the Collider objects


throughout the traversal. A Collider object may inspect and/or modify the CollisionState object as appropriate. For example, Geometric objects will inspect the CollisionState to determine the motion of the haptics device and possibly modify it to register one or more RealtimeGeometry objects for rendering in the realtime loop.

All Collider objects have a drawable data member, which is a pointer to the Drawable node they are associated with. In a Drawable sub-class we can safely static_cast this member to the appropriate node type, as in the following example.

void MyDrawable::Collider::collide( CollisionState *cs ) {
  MyDrawable *d = static_cast< MyDrawable* >( drawable );
  ...
}

The most common actions to be performed by a Collider object in the collision traversal are as follows.

• Add a force operator (section 7.8)
• Add a RealtimeGeometry (section 7.10)
• Record the current accumulated coordinate transform (section 7.7.2)

7.7 Moving between coordinate spaces

Fundamental to the purpose of a scene graph is the notion of hierarchical coordinate space transformation. In building the scene graph, we construct a tree with Transform nodes that modify the coordinate space of the children beneath them. Coordinate spaces in the Reachin API (and in OpenGL) are specified using 4×4 transform matrices defining an affine transformation of homogeneous coordinates. See [5], [6] and [7] for background information on this topic.

7.7.1 The Transform node

In VRML, the scale, rotation, scale_orientation, translation and center fields of the Transform node are just a convenient means of setting the value of the 4×4 transform matrix. In the API this data dependency is reflected using a field network where the above-mentioned fields are routed into the matrix field of the Transform. Since the coordinate spaces may be nested to an arbitrary depth, we will consider a Transform node as well as its parent nodes


leading all the way up to the global coordinate space, as in Figure 7.1. The global coordinate space is the highest-level untransformed space. It is the space that OpenGL projects onto the screen with the camera's projection transform matrix [6] and the space in which the tracking devices move. In Reachin API the global coordinate space is in units of metres.

Figure 7.1: The forward and inverse transform matrices of the Transform::matrix field. A->matrix->getForward() transforms from A's coordinate space to A's parent's coordinate space; A->matrix->getInverse() transforms the other way. Transform B represents the cumulative transformation of all the Transform nodes between A and the global space. Note that the forward (and inverse) matrices of A only transform from A's coordinate space to A's parent's coordinate space (and vice versa). They are unrelated to the Transforms existing above A in the scene graph tree.

One can derive the forward transform matrix of Figure 7.2 from the inverse (and vice versa) by the process of matrix inversion. However, this is an expensive operation, so the Reachin API provides low-level optimisations to derive both the forward and inverse transforms at the same time. This eliminates redundant computation and constructs both matrices for only slightly more than the computational cost of constructing just one of them.

The matrix field provides two other matrix access functions (see Figure 7.2). These are explained in more detail in section 7.7.2.

Figure 7.2: The accumulated forward and inverse transform matrices of the Transform::matrix field (A->matrix->getAccumulatedForward() and A->matrix->getAccumulatedInverse(), between A's coordinate space and the global space). These are the transforms accumulated in the current scene graph loop. They are used to define the last accumulated transforms in the CollisionState in the next collision traversal.

The matrices returned by these four functions are of type Matrix4f. One transforms points according to these transforms by pre-multiplication, as follows.

Vec3f parent_p = transform_node->matrix->getForward() * local_p;


In transforming vectors we want to perform the scaling and rotating components of the transformation, but not the translation part. This is done with the scaleRotate member of Matrix4f as follows.

Vec3f parent_v =
  transform_node->matrix->getForward().scaleRotate( local_v );

7.7.2 Collision traversal coordinate spaces

The CollisionState object of the collision traversal (see 7.6.1) contains a stack with an entry for each of the coordinate spaces between the current one and the global space. All the information relating to the current coordinate space is available from the element at the top of the stack. This is directly analogous to the matrix stacks in OpenGL [6]. In terms of coordinate spaces, the relevant functions in CollisionState are as follows.

Matrix4f getAccumulatedForward() const;

The getAccumulatedForward function transforms from local coordinates to global coordinates.

Matrix4f getAccumulatedInverse() const;

The getAccumulatedInverse function transforms from global coordinates to local coordinates.

Matrix4f getLastAccumulatedForward() const;
Matrix4f getLastAccumulatedInverse() const;

The getLastAccumulatedForward and getLastAccumulatedInverse functions return the accumulated transforms that were calculated in the previous scene graph traversal. These matrices are set from the getAccumulatedForward and getAccumulatedInverse values at the previous collision traversal. They are used to calculate coordinate space motion (see section 7.7.3).

All four transforms are named accumulated, meaning they are accumulated over all the entries in the stack (as in Figure 7.2, not as in Figure 7.1). These accumulated transforms are used by the Reachin API haptics rendering engine so that Geometry nodes can perform collision detection in the local geometric coordinate space without needing to know anything about their position, size and orientation in the global space. However, it is sometimes necessary for nodes to access this information. The following code listing shows how a sub-class of the Box geometry might calculate a global vector global_v corresponding to a local vector local_v.


#include <Box.h>

using namespace Reachin;

class OurBox : public Box {
public:
  struct Collider : public Box::Collider {
    virtual void collide( CollisionState *cs ) {
      // extend the collider not override
      Box::Collider::collide( cs );

      // get the accumulated forward matrix
      Matrix4f accumulated_fwd = cs->getAccumulatedForward();
      Vec3f local_v = Vec3f( .5, .5, .5 );

      // Get the local vector in global coordinates
      Vec3f global_v = accumulated_fwd.scaleRotate( local_v );
    }
  };

  OurBox() : Box( new Collider ) {}
};

Pay special attention to the call to Box::Collider::collide. This makes our collide function an extension of the Box version, not an override as such. Also note that we must pass an instance of our Collider to the Box constructor so that the Box constructor will use our Collider instead of its own (see Appendix D: Data Specialisation in C++).

7.7.3 Motion

In haptics rendering it is useful to have a representation of the instantaneous motion of a particular object. This allows us to interpolate between coordinate spaces at the haptics rendering frequency of 1kHz without having to actually re-calculate the exact intermediary transform matrices. This is the purpose of the getLastAccumulatedForward() and getLastAccumulatedInverse() functions in the CollisionState object. By comparing the forward and inverse transform matrices between successive collision traversals, we can derive the transform matrix associated with the motion of the coordinate space over a certain period. This motion is represented using Reachin API's Motion class. A Motion instance can be scaled to represent motion over an arbitrary time step. Specifically, they are scaled to give motion during the 1kHz haptics rendering loop.


The constructor of Motion takes a Matrix4f defining the transformation of motion for a particular time step.

Motion( const Matrix4f &_m );

We can then divide the motion instance by the time step duration to get the motion for one second.

inline void operator/=( Motion &l, const mgFloat &s );

Reachin API then provides interpolation functions that transform a given point or vector according to the motion, with an interpolation factor f in the range [0,1].

inline Vec3f transform( const Vec3f &p, const mgFloat &f ) const;
inline Vec3f transformVec( const Vec3f &v, const mgFloat &f ) const;

If the Motion has been normalised to one second then the interpolation factor f will be the desired time step as a fraction of a second. For an example of how to use the Motion to move an object in the realtime loop see section 7.8.8.

7.8 Force operators

All haptics rendering can be regarded as implementing a conversion of the haptic device position/orientation to a rendering force. This conversion process will usually involve hystereses, since the renderer often must remember where the haptic device has been and other such state information. The surface rendering infrastructure in the API is an example of a complex state-maintaining force function. While the geometry/surface rendering services will suffice for a large range of haptics requirements, Reachin API offers a less friendly but more powerful service referred to as force operators.

A force operator is simply a function mapping from the haptic device position to a force vector in Newtons. In this way we can implement free-space effects such as springs, strings, magnets, vibration, wind, volume rendering, etc.

7.8.1 Interpolation

Because force operators are only generated at the scene graph loop rate, they only get to glimpse the scene graph perhaps 30 times a second. Meanwhile they have to produce a force vector 1000 or more times a second. When a new force operator is created to replace the one generated in the previous scene graph loop, it might have a slightly different picture of the scene graph state than the previous


one. If we just swapped the new force operator in, there might be a jump in the rendered force. These small jumps are easily discernible as a "tic" in the haptic device. To combat this problem we interpolate linearly between successive force operators, as in Figure 7.3.

Figure 7.3: The timeline of interpolation. The red circle in scene graph loop 1 represents the addition of force operator A. In scene graph loop 2 we submit force operator B as the replacement for A. The coloured triangles indicate the lifetimes of the two force operators. In realtime loop x we evaluate both force operators with weightings W_A and W_B respectively, where W_A + W_B = 1.

The realtime loop is responsible for calling the force operator. It uses a C++ function call operator declared as follows.

inline Vec3f operator()( const Vec3f &p, const mgFloat &w );

where p is the haptic device position (see below) and w is the weighting as in Figure 7.3. Note that the weighting factor w should generally not be used to scale the resulting force in the force operator. It is provided in case the force operator needs to affect the scene graph somehow. In that case it can use the w factor to determine what the actual force on the scene graph will be (see 7.8.7).


Force operators must be added during the collision traversal by overriding the Collider::collide function of a Drawable node. Program 22 demonstrates specialisation of a Child node's Collider component and the addition of a constant force in the positive y direction.

Program 22: A simple force operator example

#include <Display.h>
#include <Scene.h>
#include <Child.h>
#include <ConstForce.h>

using namespace Reachin;

struct NewCollider : public Child::Collider {
  virtual void collide( CollisionState *cs ) {
    Child::Collider::collide( cs );
    ConstForce f( Vec3f( 0, 0.5, 0 ) );
    cs->addAbsoluteGlobalFO( f );
  }
};

void main( ) {
  try {
    Display *ri = new Display();
    Child *force_child = new Child( new NewCollider );
    ri->children->add( force_child );
    Scene::startScene();
  }
  catch( Error::Error &e ) {
    cerr << e << endl;
  }
}


otated and translated then it’s fine to simply transform the vertices of the box justlike in graphics rendering (we transform the shape). However when rendering thesurface of the walls we want to maintain the same stiffness with the same valuein Newtons/metre. Scaling should just change the shape and size of objects, nottheir intrinsic properties (such as colour and stiffness). In other words, we wouldlike to describe the shape of an object separately from the surface properties.When defining scaling or skewing coordinate transforms, it should be clear thatthis is a transformation of shape and nothing else.There are four functions for adding force operators to the currentCollisionState. They are categorised in two ways. Firstly the force operatoris added either in the local coordinate space orientation or in the global. Secondlythe position of the haptic device may be given relative to the coordinate spaceorigin (i.e. absolute) or relative to the position of the haptics device when the forceoperator is created (i.e. relative). The only difference here is the orientation ofthe coordinate space and the origin position (there is no scaling). The followingfunctions are available in CollisionState.void addAbsoluteGlobalFO( const Op &fo );void addRelativeGlobalFO( const Op &fo );void addAbsoluteLocalFO ( const Op &fo );void addRelativeLocalFO ( const Op &fo );These are template functions where the argument type Op can be any type whichis a model of Force Operator. The following sections describe each of these basedon the following example. Suppose we have a cube in a local coordinate spacerotated and translated with respect to the global space as in Figure 7.4.ayxybxFigure 7.4: The global coordinate space is indicated with the axes in the lower left. A Boxgeometry has a local coordinate space indicated with the axes to the upper right. The hapticdevice was at a when our force operator was created and has since moved to b.110<strong>Reachin</strong> <strong>API</strong> Programmer’s <strong>Guide</strong>


The haptic device has moved over the lifetime of the force operator from point a to point b. There are four ways of providing the haptic device position b to the force operator function, as follows.

7.8.2 Absolute global coordinates

If the CollisionState::addAbsoluteGlobalFO function is used to add a force operator, then the haptic device position b is given in metres according to the global orientation and the global origin, as in Figure 7.5.

Figure 7.5: The haptic device position is in metres relative to the global origin.

7.8.3 Relative global coordinates

If the CollisionState::addRelativeGlobalFO function is used to add a force operator, then the haptic device position b is given in metres according to the global orientation and relative to the position a where the operator was created, as in Figure 7.6.


Figure 7.6: The haptic device position is in metres relative to the starting position a.

7.8.4 Absolute local coordinates

If the CollisionState::addAbsoluteLocalFO function is used to add a force operator, then the haptic device position b is given in metres according to the local orientation and relative to the local origin, as in Figure 7.7.

Figure 7.7: The haptic device position is in metres relative to the local origin.


7.8.5 Relative local coordinates

If the CollisionState::addRelativeLocalFO function is used to add a force operator, then the haptic device position b is given in metres according to the local orientation and relative to the position a where the operator was created, as in Figure 7.8.

Figure 7.8: The haptic device position is in metres relative to the starting position a.

The purpose of relative coordinates is not immediately obvious. They are useful in cases where a force function is being approximated in the realtime loop from accurate updates in the scene graph loop. Suppose we have a force function f which we can afford to calculate in the scene graph loop (at say 30 hertz) but not in the realtime loop (at 1000+ hertz). If we can also calculate the three-dimensional gradient of f (that is, ∇f) in the scene graph loop, then we can use a linear approximation of f in the realtime loop. In this case we want to calculate f(a) + ∇f(a)·p, where p is the haptic device position relative to the starting position a. A sketch of such an operator appears after the checklist below.

7.8.6 Creating new force operators

Creating a force operator involves:

• Choosing a coordinate space (as above).
• Creating a type that defines a function call operator of the following type:
  inline Vec3f operator()( const Vec3f &p, const mgFloat &w );
• Creating a node with a specialised Collider object that will add the force operator to the CollisionState when appropriate (see Program 22).
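Returning to the linear approximation of section 7.8.5, a force operator built on that idea might look as follows. This is a hedged sketch: the functor itself is ours, and only the call-operator signature and addRelativeLocalFO are taken from the API described above.

struct GradientForce : binary_function< Vec3f, Vec3f, mgFloat > {
  Vec3f f_a;      // f evaluated at the starting position a (scene graph loop)
  Vec3f grad[3];  // rows of the Jacobian of f at a, also precalculated

  // p is the haptic device position relative to a, because the
  // operator is added with addRelativeLocalFO (or addRelativeGlobalFO).
  inline Vec3f operator()( const Vec3f &p, const mgFloat &w ) {
    // first-order Taylor expansion: f(a) + ∇f(a)·p
    // (Vec3f * Vec3f is the dot product, as elsewhere in this guide)
    return Vec3f( f_a.x + grad[0] * p,
                  f_a.y + grad[1] * p,
                  f_a.z + grad[2] * p );
  }
};

In the Collider::collide function one would fill in f_a and grad from the expensive scene graph calculation and then call cs->addRelativeLocalFO( op ).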


As an example, we create a force operator that snaps the haptic device to a position in the workspace with a spring. The haptic device is released from the snap position if it gets too far away from the snapping point. The complete program is shown at the end of this section.

First the force operator, called SnapForce, is defined. In the realtime loop we need to calculate the snapping force according to the current position of the haptic device. To be able to do that, the force operator needs some member variables defining the position to snap the haptic device to, the stiffness of the spring, and the distance from the snapping point at which the haptic device will be released. These variables are set when an instance of the SnapForce operator is created.

struct SnapForce : binary_function< Vec3f, Vec3f, mgFloat > {
  Vec3f hook_position;
  mgFloat release_distance;
  mgFloat stiffness;

In the function call operator of the SnapForce these variables are used to calculate the force to apply. The operator begins by calculating the distance between the haptic device and the hook_position. (Observe that if the force operator is added in local coordinates then the hook_position needs to be given in local coordinates. If the force operator is added in global coordinates then the hook_position needs to be given in global coordinates.) If the haptic device is within the release distance then we generate a spring force, otherwise we return a zero force. The spring force is the stiffness times the length of the spring, directed towards the hook_position.

  Vec3f operator()( const Vec3f &pos, const mgFloat &w ) {
    Vec3f diff = hook_position - pos;
    if( diff * diff < release_distance * release_distance )
      return stiffness * diff;
    else
      return Vec3f( 0, 0, 0 );
  }
};

The force operator is added during the haptic collision traversal in the Collider::collide function as usual. We first call the collider of the base class. The force operator only needs to be added if the haptic device is close enough to the snapping point that it might be inside the snapping area during the next scene graph traversal.

struct OurCollider : public Collider {
  virtual void collide( CollisionState *cs ) {
    Child::Collider::collide( cs );
    // only add forces if we are close to the snapping point
    Vec3f haptic_position = cs->getFinger();
    if( haptic_position * haptic_position < 0.02 ) {
      SnapForce snap;
      snap.hook_position = Vec3f( 0, 0, 0 );
      snap.release_distance = 0.05;
      snap.stiffness = 50;
      cs->addAbsoluteLocalFO( snap );
    }
  }
};

Note that the test compares squared distances, so the threshold 0.02 corresponds to a distance of roughly 0.14 metres.

Here we create the SnapForce instance. Note that no memory from the heap needs to be allocated; we add the force operator instance directly. The hook_position is set to the origin. Since we are using addAbsoluteLocalFO to add the force operator, the subsequent haptic device positions will be reported relative to this origin.

Program 23 shows the complete example program. It defines a node containing the new Collider, and runs the scene on a scene graph where the node is added to the Display.

Program 23: A Force Function Example

#include <Display.h>   // header names assumed; the angle-bracket
#include <Scene.h>     // includes were lost in this listing

using namespace Reachin;

struct SnapForce : binary_function< Vec3f, Vec3f, mgFloat > {
  Vec3f hook_position;
  mgFloat release_distance;
  mgFloat stiffness;

  Vec3f operator()( const Vec3f &pos, const mgFloat &w ) {
    Vec3f diff = hook_position - pos;
    if( diff * diff < release_distance * release_distance )
      return stiffness * diff;
    else
      return Vec3f( 0, 0, 0 );
  }
};

class OurForceNode : public Child {
  struct OurCollider : public Collider {
    virtual void collide( CollisionState *cs ) {
      Child::Collider::collide( cs );
      // only add forces if we are close to the snapping point
      Vec3f haptic_position = cs->getFinger();
      if( haptic_position * haptic_position < 0.02 ) {
        SnapForce snap;
        snap.hook_position = Vec3f( 0, 0, 0 );
        snap.release_distance = 0.05;
        snap.stiffness = 50;
        cs->addAbsoluteLocalFO( snap );
      }
    }
  };

public:
  OurForceNode() : Child( new OurCollider ) {}
};

int main() {
  try {
    // Build up a scene
    Display *ri = new Display();
    OurForceNode *ge = new OurForceNode();
    ri->children->add( ge );
    Scene::startScene();
  }
  catch( Error::Error &e ) {
    cerr << e << endl;
  }
}


7.8.7 Effecting scene graph objects from the realtime loop

A force operator may also need to push back on objects in the scene graph, for example so that a Dynamic node reacts to the forces rendered on the haptic device. The Dynamic node provides the addImpulse function for this purpose. There are two versions of the addImpulse function, as follows.

inline void addImpulse( const Vec3f &f, const Time &t, const Vec3f &p );
inline void addImpulse( const Vec3f &f, const Time &t );

The first argument f is the force to apply. This should be the opposite of the resulting force operator force, and it should be scaled by the interpolation factor w (see 7.8.1). The second argument t is the time step over which the force is applied. This should be the period of the realtime loop: generally this will be 0.001 s, but it may vary. Accurate measurements can be obtained using the Time object (see Time.h). The time step is important so that the correct impulse can be applied. The third argument p is the position in global coordinates at which to apply the force. This is used to calculate the correct torque on the rigid body. If omitted, the force is applied at the fulcrum (centre of gravity) of the rigid body, meaning no torque is applied.

Here we modify the SnapForce example to include movement of a Dynamic node. The SnapForce instance needs a reference to the dynamic node to affect, as well as a means of keeping track of the duration of each realtime loop. We add the following data members.

Time old_time;
Dynamic *node;

A Time defaults to the current time at creation, so old_time will initially be the current time.

The function call operator of the SnapForce is extended to first calculate the elapsed time since the last call, as follows.

Time now;
mgFloat delta_t = now - old_time;
old_time = now;

We use the addImpulse member to supply the appropriately scaled forces to the Dynamic node. We recommend that you spare a thought now for what will happen if the minus sign is left out.

node->addImpulse( -force * w, delta_t );

Since the force returned from this operator will be scaled before being added to the haptic device (see 7.8.1), the forces applied to the Dynamic also need to be scaled so as to preserve momentum. We use the second argument of the force operator function, w. (If the minus sign were left out, the Dynamic would receive a copy of the device force rather than its reaction, adding energy to the system instead of conserving momentum.) The elapsed time for the forces to be applied to the Dynamic node is approximated as the time since the last call to the operator. Note that the third argument to addImpulse is left out, so that no torque will be applied.
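As a quick worked check of the units involved (an illustration, not API output):

// If the operator returns force = (0, 2, 0) N with weighting w = 0.5,
// the reaction applied to the Dynamic over one 0.001 s realtime step is
//   J = -force * w * delta_t = (0, -0.001, 0) N·s,
// which for a 10 kg Dynamic changes its velocity by
//   J / m = (0, -0.0001, 0) m/s.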


Program 24 is a complete listing for an example that uses a force operator to control a sphere in a dynamic node. The use of Motion is explained below in section 7.8.8.

Program 24: A force operator affecting a dynamic node

#include <Display.h>   // header names assumed; the angle-bracket
#include <Scene.h>     // includes were lost in this listing
#include <Dynamic.h>
#include <Shape.h>
#include <Vrml.h>

using namespace Reachin;

struct SnapForce : binary_function< Vec3f, Vec3f, mgFloat > {
  Time old_time;
  Dynamic *node;
  Motion motion;
  Vec3f hook_position;
  mgFloat release_distance;
  mgFloat stiffness;

  Vec3f operator()( const Vec3f &pos, const mgFloat &w ) {
    Vec3f force;

    // Calculate the elapsed time since the last call
    Time now;
    mgFloat delta_t = now - old_time;
    old_time = now;

    // If we are inside the hook area apply a force to keep the
    // haptic device there, otherwise no force.
    Vec3f diff = hook_position - pos;
    if( diff * diff < release_distance * release_distance )
      force = stiffness * diff;
    else
      force = Vec3f( 0, 0, 0 );

    // Add the impulse back to the dynamic
    node->addImpulse( -force * w, delta_t );

    // Move the object according to its specified motion
    hook_position = motion.transform( hook_position, delta_t );

    // Return the calculated force
    return force;
  }
};

class OurForceNode : public Dynamic {
  struct OurCollider : public Collider {
    virtual void collide( CollisionState *cs ) {
      Dynamic::Collider::collide( cs );
      Dynamic *node = static_cast< Dynamic* >( drawable );
      Vec3f pos = node->translation->get();
      // Add the force operator only if the haptic device
      // is close to the snapping point
      Vec3f diff = pos - cs->getFinger();
      if( diff * diff < 0.016 ) {
        SnapForce snap;
        snap.node = node;
        snap.hook_position = node->translation->get();
        snap.motion = Motion( cs->getAccumulatedForward() *
                              cs->getLastAccumulatedInverse() );
        snap.motion /= cs->getDeltaT();
        // We grab if we are inside the sphere
        snap.release_distance = 0.01;
        snap.stiffness = 300;
        cs->addAbsoluteGlobalFO( snap );
      }
    }
  };

public:
  OurForceNode() :
    Dynamic( 0,0,0,0,0,0,0,0,0,0,0,0,0,0, new OurCollider ) {
    // Let the dynamic contain a sphere
    Shape *sh;
    createVrmlFromString( "Shape {"
                          "  geometry Sphere {"
                          "    radius 0.01 }"
                          "}", sh );
    children->add( sh );
    // Need to set the mass, as a mass of 0 means infinite mass
    mass->set( 10 );
  }
};

int main() {
  try {
    // Build up a scene
    Display *ri = new Display();
    OurForceNode *ge = new OurForceNode();
    ri->children->add( ge );
    Scene::startScene();
  }
  catch( Error::Error &e ) {
    cerr << e << endl;
  }
}

7.8.8 Accounting for object motion during the realtime loop

In Program 24, the force operator moves its hook_position along with the Dynamic node, so that the snapping point follows the object between scene graph loops. The Motion is constructed from the accumulated transformation matrices of the collision traversal:

snap.motion = Motion( cs->getAccumulatedForward() *
                      cs->getLastAccumulatedInverse() );

We then normalise the motion so that it is representative of one second of motion. CollisionState::getDeltaT() returns the duration since the last scene graph loop.

snap.motion /= cs->getDeltaT();

Finally, in the function call operator of the force operator, the motion is used to move the hook_position.

7.9 Surfaces

In section 7.8 we described the concepts underlying the haptic rendering infrastructure of the API. In this section we discuss the application of those concepts to the most common haptics requirement – rendering surfaces. Reachin API provides extensive services specific to surface rendering that are separate from the general force operator services described in section 7.8.

Surface rendering restricts itself to the task of producing force renderings of mostly solid objects. By mostly solid we mean objects that have a well defined surface, but may be somewhat soft and spongy or exhibit unusual effects at that surface. Reachin API implements a surface-rendering force operator at a low level that records state information about the surfaces being touched and the point at which they are being touched. There are two points in space that are important in describing the surface rendering in the API – the proxy and the finger. The proxy is a small sphere that is constrained to remain outside of all surfaces in the scene, whereas the finger is the actual haptic device position, which is not so constrained (see Figure 7.9).

Figure 7.9: Constraint based rendering of a solid. The proxy must stay outside the surface while attempting to minimise the distance to the finger. The finger is constrained only by the force rendering defined by the surface. Note that the proxy has a radius.

The proxy's radius is customisable in the Scene or Display nodes and is usually around 3 millimetres. The graphical representation of the haptic device is drawn at the proxy position, not at the finger. In the Display node we modify this behaviour slightly to draw the pen 10% of the way from the proxy to the finger. This gives the effect of the proxy sphere sinking slightly into the surface when pushed hard, which is visually more realistic.
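In code, the pen placement described above amounts to a simple linear interpolation (the names here are hypothetical; this is what the Display does internally, not user-facing API):

// draw the pen most of the way towards the proxy,
// letting it sink 10% of the way into the surface
Vec3f pen_position = proxy_position + 0.1 * ( finger_position - proxy_position );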


Figure 7.10: Close-up view of a single proxy-surface contact.

Furthermore, we note that the behaviour of the surface is independent of the orientation of the point of contact. We generate an orthonormal¹¹ basis in metres at the contact point, with axes labelled u, v and w. The w axis points in the direction of the surface normal. Also, the API takes care of the fact that the proxy is a sphere and not a point. When defining surface types this allows us to treat the proxy as an infinitesimal point. The final scenario for haptics rendering of a surface is represented in Figure 7.11.

Figure 7.11: Proxy and finger representation at a surface contact point. The proxy is considered to be the origin and the finger is expressed in metres relative to the proxy, according to an orthonormal coordinate system uvw where w is normal to the surface and u and v lie in the surface.

The u, v and w axes are generated independently from the finger position (they depend only on the normal vector of the surface contact). Reachin API transforms the finger from the global space into the uvw space by subtracting the proxy position and projecting onto the u, v and w axis vectors. This produces a natural coordinate system for dealing with surface contacts, because the w coordinate of the finger tells us how far in metres the finger is penetrating the surface. The u and v coordinates of the finger tell us how far in metres the proxy is pulling sideways. Furthermore, the scenario in Figure 7.11 is completely independent of the type of surface or geometry we are dealing with.

¹¹ Orthonormal means that each axis is of unit length (it measures metres) and that each pair of axes is at right angles. This allows us to calculate correct distances and move in and out of the coordinate system without the need to invert a 3×3 matrix.
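A sketch of this projection, assuming a Vec3f whose operator* is the dot product (as elsewhere in this guide) and basis vectors supplied by the caller:

// Express the finger in the uvw contact frame.
// u, v, w are the orthonormal basis vectors in global coordinates;
// w is the surface normal at the contact point.
Vec3f toContactSpace( const Vec3f &finger, const Vec3f &proxy,
                      const Vec3f &u, const Vec3f &v, const Vec3f &w ) {
  Vec3f d = finger - proxy;            // finger relative to the proxy
  return Vec3f( d * u, d * v, d * w ); // project onto each axis
}

A negative w component of the result means the finger lies below the surface, i.e. it is penetrating.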


7.9.2 Texture coordinates

The uvw coordinate system allows us to forget about the 3D position and orientation of a surface contact. However, this also means that we lose where on the surface we are touching. As we shall see in section 7.10, representations of geometry include not only the 3D coordinates of the associated surfaces, but a parameterisation over those surfaces referred to as texture coordinates. Texture coordinates allow us to implement surface types that vary as we move over the surface, without tying them to a particular geometry type.

Supplying a geometry with texture coordinates is similar to wrapping a square piece of rubber over the geometry. A coordinate s, t on the original piece of rubber will map to a 3D point on the geometric surface. Conversely, each point on the geometric surface must have an s, t coordinate. In this sense texture coordinates in the API are no different than in normal OpenGL rendering, where we can specify a texture coordinate for each vertex of a triangle in order to map an image onto 3D geometry. However, in Reachin API we are not just using texture coordinates to map images; they are more general than that.

When dealing with a surface contact as in Figure 7.11, several pieces of texture coordinate information are available (see Figure 7.12).

Figure 7.12: Texture coordinates at a surface contact. We are viewing the contact from above the surface, looking down the w axis. st_origin is the texture coordinate of the proxy. ds and dt are the derivatives of the texture coordinates with respect to the in-surface axes u and v. All three texture values are 2-dimensional vectors.


The derivatives ds and dt form a two-dimensional basis in the surface which is not necessarily orthogonal nor normalised. They allow us to determine how the texture coordinates will change as the proxy moves over the surface. This is important for surface types like BumpmapSurface, which determine the realtime rendering of the surface according to a texture image.

Since the position of the finger under the surface is represented in uvw coordinates relative to the proxy, if we needed the texture coordinate of the finger we could calculate it as

( st_origin_s + ds_u·finger_u + ds_v·finger_v ,
  st_origin_t + dt_u·finger_u + dt_v·finger_v )

Similarly, if we have determined a proxy_movement in uv space, we can calculate the associated change in texture coordinates as follows.

( proxy_movement_u·ds_u + proxy_movement_v·ds_v ,
  proxy_movement_u·dt_u + proxy_movement_v·dt_v )

This is the most common usage of the texture coordinate information. Given a change in texture coordinates ( ∆s, ∆t ), we can determine the change in u and v as follows.

∆u = ( dt_v·∆s – ds_v·∆t ) / d
∆v = ( ds_u·∆t – dt_u·∆s ) / d

where d = ds_u·dt_v – ds_v·dt_u.

Note that if d = 0 (or close to it) then this calculation cannot be performed. In this case one of the ds or dt vectors is zero, or they are co-linear (i.e. they do not span the uv space). Note that this operation is simply inversion of a 2×2 matrix, where the d = 0 case corresponds to inversion of a singular matrix.
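The 2×2 inversion above translates directly into code. A minimal sketch, using the notational equivalence given in the next section (ds_u == ds.x and so on) and a small tolerance in place of an exact zero test; the helper itself is ours, not part of the API:

// Convert a desired texture-coordinate change (delta_st) into a
// proxy movement in the uv plane. Returns false if ds and dt do
// not span the uv space.
bool stToUV( const Vec2f &ds, const Vec2f &dt,
             const Vec2f &delta_st, Vec2f &delta_uv ) {
  mgFloat d = ds.x * dt.y - ds.y * dt.x;   // determinant of [ds dt]
  if( mgAbs( d ) < Util::epsilon )         // singular: degenerate basis
    return false;
  delta_uv.x = ( dt.y * delta_st.x - ds.y * delta_st.y ) / d;
  delta_uv.y = ( ds.x * delta_st.y - dt.x * delta_st.x ) / d;
  return true;
}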


7.9.3 The contact force operator

Surfaces are analogous to force operators (see section 7.8). They are provided with the information of Figure 7.11 and Figure 7.12, and are expected to return a force to apply to the haptic device and a movement to apply to the proxy. The force returned is in Newtons according to the u, v and w axes. The proxy movement is in metres according to just the u and v axes (it must lie in the plane of the surface).

Specifically, Surface nodes define a nested Contact type which defines a virtual function of the following form.

virtual void evaluate( const Vec3f &finger,
                       const Vec2f &ds,
                       const Vec2f &dt,
                       const Vec2f &st_origin,
                       Vec3f &force,
                       Vec2f &proxy_movement );

Note that we are not given the u, v or w axes. They are not needed, since the Contact object operates strictly within the coordinate space of the surface. The force and proxy_movement arguments are output-only and are set according to the result of the function call. Also, since Vec2f and Vec3f are standard Reachin API types with data members x, y and z, we must be aware of the following notational equivalence.

finger_u == finger.x
finger_v == finger.y
finger_w == finger.z
ds_u == ds.x
ds_v == ds.y

and so on.

7.9.4 Creating new surfaces

Creating a new Surface node involves sub-classing one of the existing Surface nodes. The include directory contains the files TemplateSurface.h and TemplateSurface.cpp, which may be used as a starting point for creating such a sub-class. Here we work through an example which modifies SimpleSurface to make it swirl around with small circular vibrations. The full example is listed in Program 25.

First we copy the TemplateSurface.h and TemplateSurface.cpp files to SwirlSurface.h and SwirlSurface.cpp, replacing all instances of TemplateSurface with SwirlSurface and all instances of BaseSurface with SimpleSurface, and modifying the relevant comments.

Each surface contact will maintain a current direction that the swirl force is acting in. This direction will rotate around during the contact, producing a swirling vibration in the haptic device. We add an mgFloat direction data member to the SwirlSurface::Contact object and initialise it to 0 in the constructor. We then define the evaluate function of the SwirlSurface::Contact object to first call the SimpleSurface version.

// get the SimpleSurface to calculate its force as usual.
SimpleSurface::Contact::evaluate( finger, ds, dt, st_origin,


                                  force, proxy_movement );

The SimpleSurface has now taken care of setting the surface-repulsive force and the proxy movement. These have been set in the force and proxy_movement arguments. We then modify the force vector in order to add the swirling effect. Since the texture coordinates are the only way of determining the local surface directionality, we start with the ds axis (it could equally well be the dt axis). This is because the u and v axes can point in any direction, as long as they are orthogonal to the surface normal. If we started with a direction in uv space, it could be different each time.

// construct a dir vector of 0.3 Newtons in the direction of
// the ds axis.
mgFloat ds2 = ds * ds;
if( mgAbs( ds2 ) < Util::epsilon ) return;
Vec2f dir = ds * ( 0.3 / mgSqrt( ds2 ) );

Note that if the ds axis is degenerate, we return without modifying the behaviour of SimpleSurface. Now we rotate the dir vector according to the direction data member of the contact.

// rotate dir according to the current direction.
mgFloat cosd = mgCos( direction );
mgFloat sind = mgSin( direction );
dir = Vec2f( cosd * dir.x - sind * dir.y, sind * dir.x + cosd * dir.y );

Finally we add the swirl force to the force that was calculated in SimpleSurface.

// add it to the resulting force.
force.x += dir.x;
force.y += dir.y;

Also, we must update the direction member so that it actually swirls.

// modify our current direction.
direction += 0.03;
if( direction > Util::two_pi ) direction -= Util::two_pi;

The full listing is shown in Program 25. The SwirlSurface.cpp file could be compiled and linked in with the ReachinLoad utility and used from VRML as shown in the swirl.wrl file.

Program 25: A swirling surface

File: SwirlSurface.h

#ifndef REACHIN_SWIRLSURFACE_H
#define REACHIN_SWIRLSURFACE_H

#include "SimpleSurface.h"

using namespace Reachin;


class SwirlSurface : public SimpleSurface {
protected:

  class Contact : public SimpleSurface::Contact {
  public:

    // initialise any state variables of the contact, possibly
    // copying from the field values of the surface.
    inline Contact( SwirlSurface *s ) :
      SimpleSurface::Contact( s ),
      direction( 0 ) {}

    // we require the void constructor in all sub-classes.
    inline Contact() {}

    // we define evaluate to produce the rendering force and update
    // the proxy position.
    inline virtual void evaluate( const Vec3f &finger,
                                  const Vec2f &ds,
                                  const Vec2f &dt,
                                  const Vec2f &st_origin,
                                  Vec3f &force,
                                  Vec2f &proxy_movement ) {
      // get the SimpleSurface to calculate its force as usual.
      SimpleSurface::Contact::evaluate( finger, ds, dt, st_origin,
                                        force, proxy_movement );

      // modify our current direction.
      direction += 0.03;
      if( direction > Util::two_pi ) direction -= Util::two_pi;

      // construct a dir vector of 0.3 Newtons in the direction of
      // the ds axis.
      mgFloat ds2 = ds * ds;
      if( mgAbs( ds2 ) < Util::epsilon ) return;
      Vec2f dir = ds * ( 0.3 / mgSqrt( ds2 ) );

      // rotate dir according to the current direction.
      mgFloat cosd = mgCos( direction );
      mgFloat sind = mgSin( direction );
      dir = Vec2f( cosd * dir.x - sind * dir.y,
                   sind * dir.x + cosd * dir.y );

      // add it to the resulting force.
      force.x += dir.x;
      force.y += dir.y;
    }


    mgFloat direction;
  };

public:

  // the void constructor.
  inline SwirlSurface() {}

  // sub-classes should contain exactly the following in order to
  // register their new Contact sub-classes.
  inline virtual Surface::Contact *newContact() {
    return new Contact( this );
  }
  inline virtual Surface::Contact *copyContact() {
    return copyContactT( new Contact );
  }

  // our interface.
  const static Interface interface;
};

#endif

File: SwirlSurface.cpp

#include "SwirlSurface.h"

using namespace Reachin;

const Interface SwirlSurface::interface(
  "SwirlSurface", typeid( SwirlSurface ),
  Create< SwirlSurface >::create,
  &SimpleSurface::interface );

File: swirl.wrl

#VRML V2.0 utf8

Display {
  proxyRadius 0.002
  children [
    Transform {
      children [
        Shape {
          geometry Sphere {
            radius 0.05


          }
          appearance Appearance {
            material Material {
              diffuseColor 1 0.8 0.9
            }
            surface SwirlSurface {}
          }
        }
      ]
    }
  ]
}

Note that the choice of SimpleSurface as a base class is somewhat arbitrary. In fact we could have chosen just about any Surface type in Reachin API. This suggests the use of a C++ template, so that we can generate several new surface types for the price of one. The reader is encouraged to examine MagneticSurface.h and MagneticSurface.cpp for an example of just such a modifier template.

7.10 Geometries

Whereas surfaces describe the intrinsic properties of an object with respect to haptics rendering, geometries describe the extrinsic properties. Geometry nodes must define the shape of an object as well as the mapping from position to texture coordinates (see 7.9.2).

When graphically rendering geometries, the surface must be decomposed into polygons so as to be sent to OpenGL for rendering. However, in haptics rendering, geometry does not necessarily have to be described using polygons. The direct mathematical object may be rendered haptically. For example, the Sphere node is haptically rendered as a quadric¹² that may be scaled non-uniformly, skewed, etc. The user of the Sphere node need not be concerned with the difference, since they are dealing with the sphere as a geometric primitive. Its only extrinsic properties are the radius and the texture coordinate mapping.

This section will not go into detail on how to decompose a geometry into polygons (other than by example); we leave that to the books on OpenGL [8] and [9]. Here we describe the means by which the API deals with haptics rendering of primitives.

¹² A quadric is a mathematical representation of a surface expressed with an equation of the form:

c_xx·x² + c_yy·y² + c_zz·z² + c_xy·xy + c_yz·yz + c_zx·zx + c_x·x + c_y·y + c_z·z + c = 0

Equations of this form can be used to describe cones, cylinders, ellipsoids, paraboloids, planes, etc. For example, the unit sphere is the quadric with c_xx = c_yy = c_zz = 1, c = –1 and all other coefficients zero.


Geometry nodes perform haptics rendering in the collision traversal stage of the scene graph loop (see section 7.6.1). Specifically, the Geometry::Collider::collide function interprets the getLastProxy(), getProxy(), getNewProxy() and getFinger() positions of the CollisionState and generates a number of line segments to collide with the geometry. It takes care of:

• Look-ahead of the proxy motion.
• Motion of the coordinate space with respect to the proxy.
• Zero-line-segment culling.

It then calls the virtual function collideSegment() to deal with a single line segment of proxy motion. collideSegment() will be called one, two or three times per collide() call. It is declared as follows.

virtual void collideSegment( const Vec3f &from,
                             const Vec3f &to,
                             const mgFloat &radius,
                             CollisionState *cs );

The specialised Collider component should override collideSegment() as appropriate. The radius argument is the radius in the units of the local coordinate space¹³.

The geometry in question may be composed of a number of components. For example, a Box geometry might be considered to be composed of 6 polygonal components. Each component is considered for collision with the line segment joining the from and to arguments. If a collision may occur, we render the component with a call to the addCollision function of the CollisionState as follows.

cs->addCollision( new SomeRealtimeGeometry(...) );

The SomeRealtimeGeometry(...) instance must be a RealtimeGeometry sub-class that will render the component in the realtime loop (at 1 kHz). The collision detection in collideSegment does not need to be precise. It should generate RealtimeGeometries for any component that might lie within radius of the given line segment. For example, to detect collisions with triangles, one might simply detect collision with a bounding sphere of each triangle. In other words, false-positive collisions are tolerated, but false negatives can cause problems.

¹³ The proxy is spherical in the global coordinate space; however, if the local coordinate space is non-uniformly scaled then it will actually be an ellipsoid when represented locally. For the purposes of collision detection in the scene graph loop we deal with this by providing the radius of the largest axis of the ellipsoid. This problem goes away in the final realtime rendering of the geometry.
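A conservative bounding-sphere test of the kind suggested above might be sketched as follows. The helper is ours, not part of the API; only Vec3f and its dot-product operator* are taken from this guide.

// Returns true if the segment from-to passes within (r + radius)
// of centre, where r is the bounding-sphere radius of the component
// and radius is the proxy radius passed to collideSegment.
bool segmentNearSphere( const Vec3f &from, const Vec3f &to,
                        const Vec3f &centre,
                        const mgFloat &r, const mgFloat &radius ) {
  Vec3f d = to - from;
  mgFloat len2 = d * d;
  // parameter of the point on the segment closest to the centre
  mgFloat t = len2 > 0 ? ( ( centre - from ) * d ) / len2 : 0;
  if( t < 0 ) t = 0;
  if( t > 1 ) t = 1;
  Vec3f closest = from + t * d;
  Vec3f diff = centre - closest;
  mgFloat reach = r + radius;
  return diff * diff <= reach * reach;
}

Because false positives are tolerated, a test like this deliberately errs on the side of adding the component.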


<strong>Reachin</strong> <strong>API</strong> Programmer’s <strong>Guide</strong>Since there may be several calls to collideSegment in a single collisiontraversal we must ensure that the same component is not collided more thanonce. This is typically done with a hit data member of the specialisedCollider. In the case of an IndexedFaceSet, our hit member is a set oftriangle indices. We use this to ensure that the same triangle is not added to therealtime rendering structure more than once per collision traversal. To achievethis we need to override collide() as follows.void IndexedFaceSet::Collider::collide( CollisionState *cs ) {hit.clear();Geometry::Collider::collide( cs );}Then in the collideSegment function calls we can inspect the hit memberand add to it where necessary. In the case of simple objects like the Sphere, wejust have a boolean hit flag to indicate whether the entire sphere has been hityet in the current collision traversal.7.10.1 Creating new geometriesCreating new Geometry nodes is similar to creating new Surface nodes in thatwe must define a sub-class of an existing Geometry type. Here we will workthrough an example that creates a pyramid with a square base. It will exposewidth, depth and height fields to define the shape and size. It is essentialthat we have a clear picture of how the vertices will be placed in local coordinates(see Figure 7.13) and how the texture coordinate will be defined (see Figure 7.14)before we start coding.yheightxdepthzwidthFigure 7.13: The Pyramid geometry in local coordinates. The local origin lies in the center of thepyramid so that the bounding box is centred at (0,0,0) with size (width,height,depth).chapter 7 low-level Magma 131


Figure 7.14: Texture mapping for the triangular sides and square base of the pyramid.

Now we are ready to construct the Pyramid node. We can break down the implementation tasks as follows.

• Create a Geometry sub-class skeleton with the width, depth and height fields in place.
• Create a specialised Renderer field to generate the appropriate OpenGL.
• Create a specialised SFBound field to define the bounding volume based on the dimensions.
• Create a specialised Collider component to perform haptics rendering.
• Define the constructor.
• Instantiate and test.

Creating the Geometry sub-class skeleton

We sub-class from Geometry and add the width, depth and height fields. These will be collectively referred to as the dimension fields. We also create the VRML interface for the Pyramid so that we can instantiate it from VRML (see section 7.4.3).

File: Pyramid.h

class Pyramid : public Geometry {
public:
  auto_ptr< SFFloat > width;
  auto_ptr< SFFloat > depth;
  auto_ptr< SFFloat > height;

  static const Interface interface;
};


File: Pyramid.cpp

const Interface Pyramid::interface(
  "Pyramid", typeid( Pyramid ), Create< Pyramid >::create,
  exposedField( "height", &Pyramid::height ) +
  exposedField( "width",  &Pyramid::width ) +
  exposedField( "depth",  &Pyramid::depth ) );

Creating the specialised Renderer field

We define the following Renderer type nested within the Pyramid declaration. The renderer is dependent only upon the dimensions of the Pyramid. As such, it routes from the width, depth and height fields (in that order). Note the use of TypedField (see section 7.3.3) to ensure that exactly three SFFloat fields are input to the renderer. We override the makeGL virtual function in RenderGLField to examine the dimension fields and generate the appropriate OpenGL function calls (the OpenGL calls are omitted for brevity; see the full code listing in the examples/rapg/PART_III directory).

File: Pyramid.h

struct Renderer : public TypedField< RenderGLField,
                                     pair< SFFloat,
                                     pair< SFFloat,
                                           SFFloat > > > {
protected:
  virtual void makeGL();
};

File: Pyramid.cpp

void Pyramid::Renderer::makeGL() {
  // inputs are the three dimensions
  mgFloat halfwidth  = static_cast< SFFloat* >( inputs[0] )->get() / 2;
  mgFloat halfdepth  = static_cast< SFFloat* >( inputs[1] )->get() / 2;
  mgFloat halfheight = static_cast< SFFloat* >( inputs[2] )->get() / 2;
  ...
}

Note that an essential step in specialising the renderer lies in defining the Pyramid constructor (see below).

Creating the specialised SFBound field

This step is very similar to specialising the Renderer, in that we create a nested type sub-classing from the corresponding nested type in Geometry and we route the dimension fields to it. We use the same field typing policy as for the Renderer field, and we define the update function to access the BoxBound instance. It is essential that the Pyramid constructor installs this instance (see below).

File: Pyramid.h

struct SFBound : public TypedField< FField,
                                    pair< SFFloat,
                                    pair< SFFloat,
                                          SFFloat > > > {
protected:
  virtual void update();
};

File: Pyramid.cpp

void Pyramid::SFBound::update() {
  BoxBound *bb = static_cast< BoxBound* >( value.get() );
  bb->size = Vec3f( static_cast< SFFloat* >( inputs[0] )->get(),
                    static_cast< SFFloat* >( inputs[2] )->get(),
                    static_cast< SFFloat* >( inputs[1] )->get() );
}

To test the resulting BoxBound, we can use the BoundGeometry node as a wrapper around our Pyramid node. This will draw the bounding box in addition to the pyramid itself. In the VRML file below, this is done by specifying:

geometry BoundGeometry {
  geometry Pyramid { … }
}

Creating the specialised Collider component

The Collider is somewhat different to the Renderer and SFBound fields in that it is not a field. However, we can still specialise it in the same manner. We override the collide and collideSegment virtual functions and add the hit flag. The collide function simply resets the hit flag before deferring to the usual Geometry::Collider behaviour.

File: Pyramid.h

struct Collider : public Geometry::Collider {
  virtual void collide( CollisionState *cs );
  virtual void collideSegment( const Vec3f &from,
                               const Vec3f &to,
                               const mgFloat &radius,
                               CollisionState *cs );
  bool hit;
};


File: Pyramid.cpp

void Pyramid::Collider::collide( CollisionState *cs ) {
  hit = false;
  Geometry::Collider::collide( cs );
}

The Geometry::Collider object will only call collideSegment if the segment is non-zero and is intersecting the bounding volume of the Pyramid (as defined above). For such a simple geometry we decide that this is good enough collision detection, and in collideSegment we simply assume that all collisions are occurring. We divide the Pyramid into five components (one for each face) and add a RealtimeGeometry instance to the CollisionState for each of them. We could improve performance by implementing a more selective policy here.

void Pyramid::Collider::collideSegment( const Vec3f &from,
                                        const Vec3f &to,
                                        const mgFloat &radius,
                                        CollisionState *cs ) {
  Pyramid *pyramid = static_cast< Pyramid* >( drawable );
  mgFloat halfwidth  = pyramid->width->get() / 2;
  mgFloat halfdepth  = pyramid->depth->get() / 2;
  mgFloat halfheight = pyramid->height->get() / 2;

  if( ! hit ) {
    hit = true;

    // square base.
    cs->addCollision( new RealtimeSquare(
      Vec3f( -halfwidth, -halfheight, -halfdepth ),  // corner
      Vec2f( 0, 0 ),                                 // corner tex coord
      Vec3f( 2*halfwidth, 0, 0 ),                    // edge 1
      Vec2f( 1, 0 ),                                 // edge 1 tex
      Vec3f( 0, 0, 2*halfdepth ),                    // edge 2
      Vec2f( 0, 1 )                                  // edge 2 tex
    ) );

    // side 1
    cs->addCollision( new RealtimeTriangle(
      Vec3f( 0, halfheight, 0 ),
      Vec2f( 0.5, 1 ),
      Vec3f( -halfwidth, -halfheight, halfdepth ),
      Vec2f( 0, 0 ),
      Vec3f( halfwidth, -halfheight, halfdepth ),
      Vec2f( 1, 0 )
    ) );

    // side 2
    cs->addCollision( new RealtimeTriangle(
      Vec3f( 0, halfheight, 0 ),
      Vec2f( 0.5, 1 ),
      Vec3f( -halfwidth, -halfheight, -halfdepth ),
      Vec2f( 0, 0 ),
      Vec3f( -halfwidth, -halfheight, halfdepth ),
      Vec2f( 1, 0 )
    ) );

    // side 3
    cs->addCollision( new RealtimeTriangle(
      Vec3f( 0, halfheight, 0 ),
      Vec2f( 0.5, 1 ),
      Vec3f( halfwidth, -halfheight, -halfdepth ),
      Vec2f( 0, 0 ),
      Vec3f( -halfwidth, -halfheight, -halfdepth ),
      Vec2f( 1, 0 )
    ) );

    // side 4
    cs->addCollision( new RealtimeTriangle(
      Vec3f( 0, halfheight, 0 ),
      Vec2f( 0.5, 1 ),
      Vec3f( halfwidth, -halfheight, halfdepth ),
      Vec2f( 0, 0 ),
      Vec3f( halfwidth, -halfheight, -halfdepth ),
      Vec2f( 1, 0 )
    ) );
  }
}

Note that we access the dimension fields of the Pyramid directly in the Collider, rather than through an inputs member as in the Renderer and SFBound (which are fields). The use of the hit member means that we only act if we have not already done so in the current collision traversal.

We allocate instances of RealtimeSquare and RealtimeTriangle with the new operator, and we leave it to the haptics rendering infrastructure to de-allocate them when it is finished with them. These are both sub-classes of RealtimeGeometry and accept geometry and texture coordinate information in their constructors. See the associated header files for more information. It is possible to create custom RealtimeGeometry sub-classes for more exotic geometry types. For example, we could create a RealtimePyramid node that would render the whole pyramid very efficiently in realtime. Unfortunately this topic is beyond the scope of this guide.

Defining the constructor

Now that we have defined three new nested types, we need to ensure that they are actually used. We must pass instances of them to the Geometry constructor to effect specialisation (see section 7.4.1 and Appendix D: Data Specialisation in C++). The constructor also places the appropriate intra-node routes and initialises the field values. In particular, we initialise the bound field with an allocated BoxBound instance (see SFBound.h for more information).


File: Pyramid.h

Pyramid( Defined< Collider > _collider = 0,
         Defined< Renderer > _renderer = 0,
         Defined< SFBound >  _bound    = 0,
         Defined< SFFloat >  _width    = 0,
         Defined< SFFloat >  _depth    = 0,
         Defined< SFFloat >  _height   = 0 );

File: Pyramid.cpp

Pyramid::Pyramid( Defined< Collider > _collider,
                  Defined< Renderer > _renderer,
                  Defined< SFBound >  _bound,
                  Defined< SFFloat >  _width,
                  Defined< SFFloat >  _depth,
                  Defined< SFFloat >  _height ) :
  width( _width ), depth( _depth ), height( _height ),
  Geometry( _collider, _renderer, _bound ) {

  width->route( renderer );
  depth->route( renderer );
  height->route( renderer );

  width->route( bound );
  depth->route( bound );
  height->route( bound );

  bound->setBound( new BoxBound( Vec3f( 0, 0, 0 ), Vec3f( 2, 2, 2 ) ) );

  width->set( 2 );
  depth->set( 2 );
  height->set( 2 );
}

Instantiate and test

To test our new Geometry we could either write a C++ program or prepare and load a VRML file. The latter is the medium of choice here. It is important to test contact with all the components of the geometry, as well as the texture coordinate mapping. Hence we make use of a texture mapped, bump mapped Pyramid placed under a Dynamic node to allow rotation. We will look for the following checkpoints.

• Visual correctness (the field routings and the makeGL function are working).
• Correspondence between the visual and haptic surface positions and the haptic texture position (the collideSegment function is working).
• No fall-through at the surface and no missing haptic surfaces (the bound field is functioning).
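For comparison, instantiating the Pyramid from C++ rather than VRML might look like the following sketch. The geometry field access on Shape and the header names are assumptions here, patterned on the nodes used earlier in this chapter.

#include <Display.h>   // header names assumed
#include <Scene.h>
#include <Shape.h>
#include "Pyramid.h"

using namespace Reachin;

int main() {
  try {
    AutoRef< Display > display( new Display );
    Pyramid *pyr = new Pyramid;
    pyr->width->set( 0.1 );    // same dimensions as the VRML test file
    pyr->depth->set( 0.05 );
    pyr->height->set( 0.065 );
    Shape *shape = new Shape;
    shape->geometry->set( pyr );   // assumed field access
    display->children->add( shape );
    Scene::startScene();
  }
  catch( Error::Error &e ) {
    cerr << e << endl;
  }
}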


The VRML file for testing is as follows. Note that we specify the inertia tensor matrix explicitly. We could have just used a BoxInertia or SphereInertia instance to approximate the mass distribution of a pyramid; however, we choose here to specify the correct values explicitly.

File: pyramid.wrl

#VRML V2.0 utf8

Display {
  children [
    # dynamic node to let us examine our new geometry in detail.
    Dynamic {
      inertia Inertia {
        mass 20
        tensor [ 5.66875e-3 0          0
                 0          0.01316875 0
                 0          0          0.0125 ]
      }
      # reset the mass, to make the object rotatable only
      mass 0
      children [
        Transform {
          translation 0 0.01625 0
          children [
            Shape {
              appearance Appearance {
                # we apply a visual and haptic surface texture so
                # that we can test the texture mapping.
                texture ImageTexture {
                  url "urn:inet:reachin.se:/library/textures/steel.png"
                }
                material Material {}
                surface BumpmapSurface {
                  texture ImageTexture {
                    url "urn:inet:reachin.se:/library/bumpmaps/steel.png"
                  }
                }
              }
              # finally we specify the geometry itself.
              geometry Pyramid {
                width 0.1
                depth 0.05
                height 0.065
              }
            }
          ]
        }
      ]
    }
  ]
}






Part IV - Cluster Tutorial

This section describes a non-trivial example of a program that uses many of the features of the Reachin API. It is intended to aid the reader in understanding the API by providing a large amount of code used in a realistic way.

The features demonstrated are:

• C++ and VRML interfacing.
• Use of the Reachin API's rigid-body simulation classes.
• Use of the Reachin API's force operators.
• Use of the Reachin API's field networks.
• Integration issues using the Reachin API in a non-trivial program.

Cluster itself is a simulation of a dynamic system: a group of planet-like objects.




8 Cluster

This tutorial provides an in-depth example of Reachin API code that displays substantial complexity. We will work through the implementation of the cluster demo program.

We suggest that you start by running the cluster demo available in the demos directory.

The whole scene is based on a sun with a fixed position in the middle of the display. Planets move around the sun in a gravity field. The haptic device can be used to push planets around. By clicking the button on the haptic device, a rubber band is created between the haptic device and the nearest planet, and this rubber band can be used to throw the planets around.

Cluster is an example that demonstrates many of the major features found in the API. As such, it explains how to build scene graphs, route fields, use force operators and how to utilise the API's simulation nodes (specifically the GravityDynamic node). Cluster is an example of a Reachin API simulation where the user can interact with all objects in the scene, causing them to move in complex ways.

We will work through Cluster in a step-by-step fashion. This involves extending a node into a prototype node for a planet. The Planet node will then be reused for each planet in Cluster. We also need to create the Cluster scene graph and the different parts therein. This involves force operators, collision detection, creating a VRML interface and the use of function fields. Each step of the tutorial has a corresponding sub-directory in the examples/rapg/PART_IV directory of the Reachin API distribution. For example, source code for section 8.9.1, "Composing the rubberband end points", can be found in examples/rapg/PART_IV/cluster8.9.1.

8.1 Building the scene graph representation

We begin by creating the Planet prototype node. The Planet node should look and behave like a planet, i.e. take the form of a Sphere with dynamic behaviour incorporated so that it behaves as if it is located in a gravity field. There is a node type called GravityDynamic that can be used for objects that are supposed to move in a gravity field in a physically correct way. The GravityDynamic node is a subclass of the Dynamic node, specialising it with functionality for describing forces similar to gravity according to a spherical mass. The Dynamic node adds rigid-body dynamics to a Transform node, and provides both linear and angular motion.

To create a prototype node for the Planet, we begin by choosing the "topmost" node as the base class for the prototype. The scene graph representation that we want for the Planet is as in Figure 8.1.

Figure 8.1: The scene graph structure used for a planet in cluster.

In the scene graph representation for the planet, GravityDynamic is the topmost node. We use it as a base class for the Planet node. We add the rest of the scene graph nodes as children to the GravityDynamic.

To implement the above scene, we begin by creating a directory called cluster. In it, create a header file for the Planet node (i.e. Planet.h). Since we are using GravityDynamic as the base class, include its header file. We are using the Reachin namespace, so we state that as well with the using keyword.


#include <GravityDynamic.h>

using namespace Reachin;

To define the Planet class with GravityDynamic as its base class, we declare a public default constructor for the Planet node.

class Planet : public GravityDynamic {
public:
  Planet();
};

We create a second file in the cluster directory, Planet.cpp. We include Planet.h in it so we can access the declarations.

#include "Planet.h"

Planet::Planet() {
}

In the constructor we would like to create the rest of the scene graph representation of the Planet node. We can do that directly in C++, but the easiest way of building the scene graph is to read it in from a VRML file. So, we create a VRML file and name it Planet.wrl. Edit Planet.wrl and put in the VRML code defining the scene graph representation.

Since the base class node (GravityDynamic) is accounted for in the C++ files, we just need to add its children in the scene graph. The code is as follows:

Shape {
  geometry Sphere {
    radius 0.025
  }
  appearance Appearance {
    material Material {
      diffuseColor 1 0 0
    }
    texture ImageTexture {}
    surface SimpleSurface {
      stiffness 100
    }
  }
}

Figure 8.2: Nodes created using C++ and VRML. GravityDynamic is a C++ node; the Shape node and its sub scene graph are loaded from a VRML file.


We want to create the entire Planet node directly in the constructor, reading in the children nodes from the VRML file. In order to do that, we must include some more headers. In Planet.cpp, include the following files: Vrml.h, to be able to read in the scene graph representation from a VRML specification, and Shape.h, to be able to specify a shape for the planet from the VRML file.

First, we need to create a pointer to a Shape node directly in the constructor. In it, we store the VRML-created Shape node. To create a node from VRML, use the createVrmlFromURL() function. It creates the VRML nodes found in the specified VRML file, and assigns the first node to the Node pointer sent as the second argument. Observe that the type of the pointer sent as the argument to createVrmlFromURL() and the node created in VRML must be of the same type. We can then add the shape created in VRML as a child of this node. In code, it appears as follows.

Planet::Planet() {
  Shape *planet_shape;
  createVrmlFromURL( "Planet.wrl", planet_shape );
  children->add( planet_shape );
}

We now have a planet node with a scene graph representation. To be able to look at the scene, we write the following test program, testPlanet.cpp.

Program 8.1: testPlanet.cpp

#include <Display.h>   // header names assumed; the angle-bracket
#include <Scene.h>     // includes were lost in this listing
#include <AutoRef.h>
#include "Planet.h"

using namespace Reachin;

int main() {
  AutoRef< Display > display( new Display );
  Planet *planet = new Planet();

  // Add the planet to the scene graph
  display->children->add( planet );

  // start the simulation
  Scene::startScene();
}

First we created an instance of a Display for visual and haptic rendering. The Display node contains information about the graphical representation of the haptic device and viewpoint data, and it creates and sets up the Scene node. Children added to a Display get into the scene graph with the Display as root node.


Using AutoRef

In order for a node to be used in the scene graph, it needs to be referenced. When a node is added as a child of another node, references are made automatically. In this case, the Display is not added as a child, and consequently needs to be referenced manually. This is done using the AutoRef template.

Create a planet, and add it as a child of the Display node, which adds it to the rendered scene.

To start the simulation we then call the startScene() function, which starts the simulation on the scene created by the Display.

8.2 Setting default field values

Default values for the Shape section of the planet scene graph are easiest to set directly within the VRML file (Planet.wrl). Default values for the fields of the base node, GravityDynamic, need to be set in the constructor.

Lastly in the constructor we set the default values for the translation, mass, momentum, gravitational source and the amount of gravitation. The momentum is set as the mass times the desired velocity.

The fields are set as in the following code (Planet.cpp).

translation->set( Vec3f( 0.03, 0.1, 0.07 ) );
mass->set( 50 );
momentum->set( mass->get() * Vec3f( -.05, 0, 0 ) );
source->set( Vec3f( 0, 0, 0 ) );
gravity->set( 0.0005 );

Compile and run the program, and the planet will orbit around the centre of gravity.

8.3 Creating the VRML interface

If we want to be able to create Planet nodes by using VRML, we need to create an interface to VRML. An interface specifies the VRML aspect of a node. A VRML interface includes the string name of the node, the type_info of the node type, a function for creating instances of the node (usually the create static function of the node), and a description of the fields and events in the node.

Begin by declaring the interface in the header file (Planet.h). The interface belongs in the public section.
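To illustrate the idea behind AutoRef, a simplified sketch is given below. This is not the real template (which lives in AutoRef.h); it merely assumes a ref()/unref() reference-counting interface on nodes, which is our assumption here.

// A stripped-down illustration of RAII node referencing.
template< class NodeType >
class AutoRefSketch {
public:
  explicit AutoRefSketch( NodeType *n ) : node( n ) {
    node->ref();     // keep the node alive while we hold it
  }
  ~AutoRefSketch() {
    node->unref();   // release our reference on destruction
  }
  NodeType *operator->() const { return node; }
private:
  NodeType *node;    // assumed ref()/unref() interface
};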


// The VRML interface
static const Interface interface;

The VRML interface can then be used to access the node. We define the interface at the very beginning of Planet.cpp, just before the constructor. The definition consists of a string name for the node (here "Planet"), the type_info, a function for instantiating an instance of the node (Create< Planet >::create), and then a description of the events and fields in the node's interface.

const Interface Planet::interface(
  "Planet", typeid( Planet ), Create< Planet >::create,
  exposedField( "gravity",  &Planet::gravity ) +
  exposedField( "source",   &Planet::source ) +
  exposedField( "position", &Planet::translation ) +
  exposedField( "momentum", &Planet::momentum ) +
  exposedField( "mass",     &Planet::mass ) );

The above declaration states that we can create a Planet node in VRML by using the name "Planet". Each exposed field consists of two parts: an identifier for the field and a reference to that field in the node. By using the identifier we are able to set field values directly in a VRML file.

With this interface, we can set, route to and route from the translation field by using the identifier name "position". Note that we have set the identifying name position for the translation field, which is inherited from the GravityDynamic class.

The momentum default value should preferably be set to zero. Therefore, in the constructor of the planet, we change the old setting of momentum as in the following code. The same applies to the amount of gravity.

momentum->set( Vec3f( 0, 0, 0 ) );
gravity->set( 0 );

To create the Planet using the VRML interface, open up a new file and name it World.wrl. Put the code below in it. Now, to show the Planet in a display, we must add the Planet as a child of the Display node. While we are at it, we also set some values for the fields that we have exposed through the VRML interface.

Display {
  children [
    Planet {
      gravity 0.00025
      source 0 0 0
      position -0.05 0 0.12
      momentum 0.15 0 0
      mass 4
    }
  ]
}
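Because "position" is an exposed field, it can also participate in field routing. As a hypothetical illustration (not part of the tutorial), one planet's position could drive another planet's gravitational source from C++, so that the second planet orbits the first:

// p1 and p2 are Planet* instances already added to the scene graph.
// Route the translation field (exposed as "position") of p1 into the
// gravity source of p2; the field types are assumed compatible.
p1->translation->route( p2->source );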


To see our planet, we need a test program to run the scene. So, create a new file, name it worldLoader.cpp and put the following code in it. Compile and run to start the scene. The listing is a minimal version: it loads the VRML file given on the command line (World.wrl contains the Display node itself) and starts the simulation. The exact header names are assumptions and may differ in your installation.

Program 8.2: worldLoader.cpp

  #include <vrml.h>      // createVrmlFromURL
  #include <Display.h>   // header names assumed
  #include <iostream>
  using namespace Reachin;
  using namespace std;

  int main( const int argc, const char *argv[] ) {
    if( argc != 2 ) {
      cerr << "Usage: " << argv[0] << " <world.wrl>" << endl;
      return 1;
    }
    // The loaded world is not added as a child of any node, so it
    // must be referenced manually with AutoRef.
    AutoRef< Node > world( createVrmlFromURL( argv[1] ) );
    // Start the simulation on the scene created by the Display.
    startScene();
    return 0;
  }
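Assuming the executable is built under the name worldLoader, the scene is then started with the World.wrl file as argument:

  >>worldLoader World.wrl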


The solution is to create new fields for the Planet's color, radius and texture. These fields we can expose through the VRML interface and use later when setting the values in the application scene graph. So, begin by creating the public fields in Planet.h. We use the same field types as in the VRML version of the scene graph.

  // Our public fields
  auto_ptr< SFRGB >     color;
  auto_ptr< SFFloat >   planet_radius;
  auto_ptr< SFTexture > texture;

Since they are auto_ptr's, they need to be created directly in the constructor of the planet. The new constructor is as follows (in Planet.cpp):

  Planet::Planet() :
    color        ( new SFRGB ),
    planet_radius( new SFFloat ),
    texture      ( new SFTexture ) {
    ...

To be able to access the values from the VRML file, we need to add them to our VRML interface in Planet.cpp:

  const Interface Planet::interface(
    "Planet", typeid( Planet ), Create< Planet >::create,
    exposedField( "gravity",  &Planet::gravity ) +
    exposedField( "source",   &Planet::source ) +
    exposedField( "position", &Planet::translation ) +
    exposedField( "momentum", &Planet::momentum ) +
    exposedField( "mass",     &Planet::mass ) +
    exposedField( "color",    &Planet::color ) +
    exposedField( "radius",   &Planet::planet_radius ) +
    exposedField( "texture",  &Planet::texture ) );

With the newly created fields, we can change field values directly from the VRML interface. The fields are not yet connected to the colour, radius and texture of the Planet's scene graph representation. To connect the values, we need a way of referencing the different nodes in the VRML file. By using the VRML DEF command, we can use the DEFed names to access those nodes.

The nodes defined in Planet.wrl that we need to access are the Shape node (to add it as a child), the Material node (to change the color), and the Texture node (to change the texture).

In Planet.wrl, we do this by using the VRML DEF command to name the relevant node instances.


  DEF planetShape Shape {
    geometry DEF planetSphere Sphere {
      radius 0.025
    }
    appearance DEF planetAppearance Appearance {
      material DEF planetMaterial Material {
        diffuseColor 1 0 0
      }
      texture ImageTexture {}
      surface SimpleSurface {
        stiffness 100
      }
    }
  }

To be able to use these names, we need to save all the defined names in a mapping table. To do that, declare a protected DEFMap in Planet.h to save the values in, as follows:

  // Store defined nodes from VRML
  DEFMap defmap;

To populate the DEFMap, we need to invoke the createVrmlFromURL function in a slightly different manner.

  createVrmlFromURL( "Planet.wrl", &defmap );

Since we no longer have an explicit reference to the Shape node, we must obtain the reference through the DEFMap so that we can add the shape scene graph as a child of Planet, just as we did before.

To get the reference, we use the DEFMap find function. The find function takes the DEFed identifier name as the first argument and a node pointer as the second argument. Observe that the pointer used as the second argument must be a node pointer of the same type as the node created in VRML.

  // Add the shape of the planet as a child of this node
  Shape *planet_shape;
  defmap.find( "planetShape", planet_shape );
  children->add( planet_shape );

All that is left now is to route the C++ created fields to the corresponding fields in the VRML scene graph. This must be done so that values set through the VRML interface propagate to the correct fields in the scene graph.

Previously we used the DEFMap find function with an identifier name and a node pointer as arguments. When we want to use the DEFMap for routing purposes, we can use the find function in a different manner.


By specifying a nodeIdentifierName.fieldName, we get a pointer to the field. This allows us to route directly to the returned field, without first creating a new variable for the node.

  color->route( defmap.find( "planetMaterial.diffuseColor" ) );
  texture->route( defmap.find( "planetAppearance.texture" ) );
  planet_radius->route( defmap.find( "planetSphere.radius" ) );

We can now use our own fields using VRML syntax in the World.wrl file.

  color 0.8 0.6 0.2
  radius 0.03
  texture ImageTexture {
    url "urn:inet:reachin.se:/library/textures/moon.png"
  }

8.5 The initialize function

We have one more improvement to make before the planet can be considered complete. Instead of setting the mass using the VRML interface, we would like to set the mass according to the radius of the planet, given a certain density. To do so we need to create a new field called density, remove the mass field from the interface, and then calculate the mass of the planet. The calculation must be done after the radius and density fields have been set in the VRML instantiation of the Planet.

Instantiation of the VRML fields takes place after construction, hence we cannot calculate the mass directly in the constructor. Instead we must use a function called initialize(), which initialises a node the first time it gets referenced (either by being added as a child of another node, or straight after creation if it is created with the AutoRef template). This moment always occurs after both the creation of the node and the initialisation of VRML-set values.

We begin by creating the new field as well as removing the old mass field. In Planet.h, add a density field of type SFFloat.

  auto_ptr< SFFloat > density;

In Planet.cpp, replace the mass field in the interface with the density field.

  exposedField( "density", &Planet::density ) +

In the constructor, add a line to create a field for the auto_ptr for density.

  density( new SFFloat )

In the constructor, also set the default value for the density.

  density->set( 50 );


In Planet.h, override the initialize function by adding the following line to the public section of Planet.h.

  virtual void initialize();

Define the mass calculation in Planet.cpp. We first get hold of the updated radius and density values, and then use them to calculate and set the mass of the planet.

  void Planet::initialize() {
    // Get the radius and density values
    mgFloat pl_radius  = planet_radius->get();
    mgFloat pl_density = density->get();
    // Calculate and set the mass of the planet
    mass->set( Util::pi * pl_radius * pl_density * 4 / 3 );
  }

Replace, in World.wrl, the mass value with an appropriate density value. For example, with radius 0.03 and density 40, the formula above gives a mass of pi * 0.03 * 40 * 4 / 3, roughly 5, close to the mass of 4 that we set explicitly before.

  Planet {
    density 40
    ...

8.6 Adding rotational behaviour for the planet

To add rotational behaviour to the Planet node, we need to set the Dynamic's inertia field to a SphereInertia. Since we need to set the mass of the SphereInertia, we do this in the initialize function.

Instead of setting the mass of the Dynamic directly after the calculation, we save it in a variable.

  mgFloat m = Util::pi * pl_radius * pl_density * 4 / 3;

Then create a SphereInertia, and set its values according to the Planet's values. Observe that the mass of the Inertia is set ten times higher than the mass of the Dynamic node. This is to allow small planets while still maintaining a stable simulation.

  SphereInertia *sp = new SphereInertia;
  sp->mass->set( m * 10 );
  sp->radius->set( planet_radius->get() );

Set the inertia of the GravityDynamic to the created SphereInertia.

  inertia->set( sp );


The mass of the Dynamic node is routed from the mass field of the Inertia node. We would like the mass of the Dynamic to be the actual mass set for the Planet node, and we achieve this by setting the value of the Dynamic's mass field to the actual calculated value for the mass.

  mass->set( m );

8.7 Building the cluster world structure

The Planet node is now finished. It is used in the Cluster node as the common representation of all planets.

In the Cluster world, we need to combine some different types of objects: first, the sun in the middle, and a number of planets moving around it in its gravity field. To be able to draw the line for the rubberband between the haptic device and a grabbed planet, a geometric object for that line is also needed.

It is the job of the Cluster node to group these objects together, as well as to add behaviour for collision detection between planets. It should also handle the rubberband used for grabbing hold of planets. We use the Group node as base class for the Cluster node, and add additional behaviour to it.

Now we would like to have the sun, rubberband and planets as children of the Cluster node. By doing so, the rendering behaviour of Group takes care of rendering those objects.

Since the children field of the Group is of type MFNode, we need a way of combining the sun, rubberband and planets together into a single MFNode. Then we can route the value of the MFNode to the children field of our group node.

Open and edit a header file for the Cluster node in the cluster directory. Name it Cluster.h and use Group as the base class:

  #ifndef CLUSTER_H
  #define CLUSTER_H

  #include <Group.h>
  using namespace Reachin;

  class Cluster : public Group {
  public:
    Cluster();
  };

  #endif // CLUSTER_H


We create one SFNode auto_ptr for the sun and one SFNode auto_ptr for the rubberband. Since we add planets from a VRML file, we would like to ensure that all added nodes are of type Planet. Therefore, define MFPlanet as a TypedMFNode< Planet > to get type checking for all added Planet nodes. Also, we create an auto_ptr for storing the MFPlanet. It would be better if these fields were protected; however, we would like the MFPlanet to be exported through the Cluster VRML interface, and consequently it has to be public.

In Cluster.h, we add the following code.

  protected:
    struct MFPlanet : public TypedMFNode< Planet > {};
  public:
    auto_ptr< MFPlanet > planets;
  protected:
    auto_ptr< SFNode > sun;
    auto_ptr< SFNode > rubberband;

To be able to create the MFPlanet we need the specification of the Planet node.

  #include "Planet.h"

To be able to have planets, sun and rubberband stored as children of Cluster, they need to be combined into a single MFNode. We can combine nodes together using a function field. The function field shall allow us to route the sun, rubberband and planets to it, and have as its value an MFNode with the sun, rubberband and planets composed. The routes to the function field are known in advance, which allows us to use the EvaldFField type.

We name the function field WorldComposer. Routes to WorldComposer are the sun (SFNode), the rubberband (SFNode) and the planets (MFPlanet). We define the actual updating (composing) behaviour in the Cluster.cpp file later. Add the following code, still in Cluster.h:

  struct WorldComposer : public EvaldFField< WorldComposer, MFNode,
                                             SFNode, SFNode, MFPlanet > {
    void evaluate( SFNode *sun, SFNode *rubberband, MFPlanet *planets );
  };

Create an auto_ptr for an object of WorldComposer.

  auto_ptr< WorldComposer > world_composer;


Ok, time to create the C++ file for the Cluster. Create the file and call it Cluster.cpp. At this stage the file consists of two parts: the constructor and the definition of the WorldComposer's evaluate function. We begin by including the Cluster.h file.

  #include "Cluster.h"

In the constructor, create objects for the auto_ptr's. Set up the routing of the different objects to the composer, and route the value of the world_composer to the children field of the Cluster.

  Cluster::Cluster() :
    sun           ( new SFNode ),
    rubberband    ( new SFNode ),
    planets       ( new MFPlanet ),
    world_composer( new WorldComposer ) {

    // compose the sun, rubberband and planets together, and let
    // them be the children of this node
    sun->route( world_composer );
    rubberband->route( world_composer );
    planets->route( world_composer );
    world_composer->route( children );
  }

Now, each time the value of the sun, rubberband or planets field is set (changed), the composer knows that it needs to update. The composer then tells the children field that its value is out of date. The next time the renderer tries to render the children field, the children field tries to update itself, and in doing that it asks the world_composer to update itself. Consequently, we now need to declare what the world_composer shall do when updating itself.

When an EvaldFField needs to update, it calls the evaluate() function with the routes as arguments. In our case, the composer needs to get the updated values from the sun, rubberband and planets (whose values do not get updated until we ask for them).

The evaluate function should then delete all old nodes from the MFNode and add all the nodes currently routed to it. The problem with doing this is that both the add and delete functions generate events, which we do not want the evaluate function to do. Instead of deleting all nodes from the MFNode by using the delete function, we just call the exitNode function for each node. The exitNode function does everything needed when a node is deleted, but does not generate events. After the exitNode calls, all the nodes have been un-referenced and the MFNode can be resized to zero.


After that, the sun and the rubberband are added first in the MFNode, followed by all the planets. Then we call the enterNode function for all nodes to achieve the ordinary behaviour for adding nodes, such as reference counting.

  void Cluster::WorldComposer::evaluate( SFNode *sun,
                                         SFNode *rubberband,
                                         MFPlanet *planets ) {
    // exit all existing values
    Util::for_each( value.begin(), value.end(), this, &MFNode::exitNode );

    // Set the MFNode to contain nothing
    value.resize( 0 );

    // Add the new values
    value.push_back( sun->get() );
    value.push_back( rubberband->get() );
    for( MFPlanet::const_iterator i = planets->begin();
         i != planets->end(); i++ ) {
      value.push_back( (*i) );
    }

    // Enter all new values
    Util::for_each( value.begin(), value.end(), this, &MFNode::enterNode );
  }

Observe that the values of the sun and the rubberband get validated with the calls to their respective get() functions. The MFPlanet containing the planets gets validated with the call to the planets' begin() function (or end() function).

Now we have created a structure for the Cluster node. Using this structure, we can have a sun, a rubberband and some planets composed together into an MFNode, to be rendered via the children field of the Cluster node.

8.7.1 Populating the world

Now we have created the structure of the world, but we have not filled it with any real nodes.

The sun and the rubberband we can create from VRML, directly in the constructor. The planets we would like to read in using the VRML interface of the Cluster node. That makes it possible to create an arbitrary number of planets, specifying their properties directly in the VRML code.


We start by creating the sun and the rubberband nodes. Do so by creating a VRML file called Cluster.wrl. In it, we specify the sun, i.e. a Shape whose appearance and geometry we specify to match our purpose.

  #VRML V2.0 utf8

  DEF sun Shape {
    appearance Appearance {
      material Material {
        diffuseColor 0.9 0.6 0.2
      }
      texture ImageTexture {
        url "urn:inet:reachin.se:/library/textures/jupiter.png"
      }
      surface SimpleSurface {
        stiffness 600
      }
    }
    geometry Sphere {
      radius 0.04
    }
  }

The rubberband is also a Shape. We specify the line for the rubberband as an IndexedLineSet with two endpoints. Initially, we set one of the endpoints at the origin and the other one some distance away, enabling us to see that the line is actually drawn.

  DEF rubberband Shape {
    appearance Appearance {}
    geometry IndexedLineSet {
      colorPerVertex FALSE
      color Color {
        color [ 1 0 0 ]
      }
      coordIndex [ 0 1 ]
      coord Coordinate {
        point [ 0 0 0, 0.1 0.1 0.1 ]
      }
    }
  }

Now we need to read in these two nodes in the constructor of the Cluster, assigning them to the sun and rubberband node holders we have created.

In Cluster.h, create a DEFMap table to save the VRML DEFined nodes, and include vrml.h.

  #include <vrml.h>

  DEFMap defmap;


In the constructor in Cluster.cpp, after the setting up of the cross-field dependencies, create the nodes from the Cluster.wrl file, and fill the DEFMap table with the DEFed names. Then use the sun and rubberband nodes from the table.

  createVrmlFromURL( "Cluster.wrl", &defmap );

  // Assign the sun node created in VRML as the value of the sun
  Shape *sun_node;
  defmap.find( "sun", sun_node );
  sun->set( sun_node );

  // Assign the rubberband node created in VRML as the value of the
  // rubberband
  Shape *rubberband_node;
  defmap.find( "rubberband", rubberband_node );
  rubberband->set( rubberband_node );

Include Shape.h to enable usage of Shapes.

  #include <Shape.h>

Now it's time to make a VRML interface for the Cluster node. In Cluster.h, declare the VRML interface.

  static const Interface interface;

In Cluster.cpp, define the interface. For starters, it consists of the identifier name, the type_info and the creation function as usual. The only field at this moment is the planets field.

  const Interface Cluster::interface(
    "Cluster", typeid( Cluster ), Create< Cluster >::create,
    exposedField( "planets", &Cluster::planets ) );

To run the program we can use the same worldLoader as in the previous example; it just needs to be re-linked with this new node type. So, change World.wrl to specify the world we would like to show now. It still consists of the Display with Cluster as a child. The planets we put in the planets field of Cluster as follows.

  Display {
    children [
      Cluster {
        planets [
          Planet {
            position 0 0.04 0.03
            color 0.1 0.2 1


            radius 0.03
            density 400
            texture ImageTexture {
              url "urn:inet:reachin.se:/library/textures/moon.png"
            }
          }
        ]
      }
    ]
  }

Now we should see a sun in the middle, a line drawn from the sun and upwards, and a planet in front.

8.8 Adding planet behaviour

Now we would like to be able to change the sun's color, radius and gravity from the Cluster node's VRML interface. Additionally, we would also like the planet behaviour to follow the specifications of the Cluster, with respect to the source and amount of gravity.

So, we start by creating fields for the gravity, and for the radius and color of the sun. We do this so we can expose them through the VRML interface. In Cluster.h, add auto_ptr's for the fields in the public section.

  auto_ptr< SFRGB >   sun_color;
  auto_ptr< SFFloat > sun_radius;
  auto_ptr< SFFloat > gravity;

Create instances of the fields in the constructor:

  Cluster::Cluster() :
    sun           ( new SFNode ),
    rubberband    ( new SFNode ),
    planets       ( new MFPlanet ),
    sun_radius    ( new SFFloat ),
    sun_color     ( new SFRGB ),
    gravity       ( new SFFloat ),
    world_composer( new WorldComposer ) {

Add the fields to the VRML interface for the Cluster:


  const Interface Cluster::interface(
    "Cluster", typeid( Cluster ), Create< Cluster >::create,
    exposedField( "sunRadius", &Cluster::sun_radius ) +
    exposedField( "sunColor",  &Cluster::sun_color ) +
    exposedField( "gravity",   &Cluster::gravity ) +
    exposedField( "planets",   &Cluster::planets ) );

We need to route these exposed fields to the actual values in the scene graph, directly in the constructor. We do this to ensure that values set from the VRML interface propagate into the rendered scene graph. The radius and color fields for the sun can be routed to their corresponding parts in the scene graph via DEFed names in the VRML file.

We start by DEFining the names in VRML:

  DEF sun Shape {
    appearance Appearance {
      material DEF sunMaterial Material {
        diffuseColor 0.9 0.6 0.2
      }
      texture ImageTexture {
        url "urn:inet:reachin.se:/library/textures/jupiter.png"
      }
      surface SimpleSurface {
        stiffness 600
      }
    }
    geometry DEF sunSphere Sphere {
      radius 0.04
    }
  }

Now we can find the nodes from the DEFMap table and route their fields. In the constructor, add the routing code:

  sun_color->route( defmap.find( "sunMaterial.diffuseColor" ) );
  sun_radius->route( defmap.find( "sunSphere.radius" ) );

The sun_radius field of the Cluster node is going to be used later in the program. For this reason, we need to ensure that sun_radius contains the correct radius value whether or not the values are set from the Cluster node's VRML interface. To achieve this, the sun_radius value needs to be given a correct initial value using the value found in the scene graph representation. The value returned from the defmap.find function is an ordinary field pointer, so we need to typecast it to an SFFloat pointer before we can get its actual value.


  sun_radius->set( dynamic_cast< SFFloat * >(
    defmap.find( "sunSphere.radius" ) )->get() );
  sun_color->set( dynamic_cast< SFRGB * >(
    defmap.find( "sunMaterial.diffuseColor" ) )->get() );

8.8.1 enterNode/exitNode

In order for the planets to be affected by the sun's gravity, we must somehow route the gravity source and the amount of gravity to all involved planets, and this must be done each time a new planet is added. We can do this using the enterNode and exitNode functions present in MFNode.

Each time a node is added to an MFNode, the enterNode function is called, and each time a node is deleted, the exitNode function is called. These functions can be extended to achieve special behaviour when a planet is added or removed.

In Cluster.h, we extend these functions. Before doing so, we need a reference to the Cluster node to be able to reach the different fields of Cluster from the MFNode member functions, so we add a holder for that reference to MFPlanet. Some of these fields are protected, so we also have to declare MFPlanet to be a friend class.

  struct MFPlanet : public TypedMFNode< Planet > {
    Cluster *cluster;
    virtual void enterNode( Node *n );
    virtual void exitNode ( Node *n );
  };

  friend class MFPlanet;

We set the reference to the Cluster node in the constructor:

  planets->cluster = this;

In Cluster.cpp, we specify the behaviour of the enterNode and exitNode functions. The enterNode function first calls the enterNode function of the MFNode, to achieve the standard enterNode behaviour for MFNodes. The next step is to set up the routes. This involves first dynamically casting the node to a Planet. If the cast succeeds, we route the sun's exposed radius field to the planet's source_radius field, and we route the Cluster gravity to the gravity of the planet.

  void Cluster::MFPlanet::enterNode( Node *n ) {
    MFNode::enterNode( n );
    Planet *planet = dynamic_cast< Planet * >( n );
    if( planet ) {


      cluster->sun_radius->routeAndTouch( planet->source_radius );
      cluster->gravity->routeAndTouch( planet->gravity );
    }
  }

When a planet is deleted, the exitNode function is called. When deletion occurs, we need to unroute the routes that were set up in the enterNode function. Lastly, we call the exitNode function of the MFNode to fulfil the standard exitNode behaviour.

  void Cluster::MFPlanet::exitNode( Node *n ) {
    Planet *planet = dynamic_cast< Planet * >( n );
    if( planet ) {
      cluster->sun_radius->unroute( planet->source_radius );
      cluster->gravity->unroute( planet->gravity );
    }
    MFNode::exitNode( n );
  }

In both these functions, the cluster reference that was set up in the constructor is used to reach the member properties of the Cluster.

Now we can set the sun's properties in the Cluster using the VRML interface. The values for the sun's radius and the amount of gravity consequently apply to all planets added.

8.9 Creating the graphical rubberband

Here we work through the graphical part of the rubberband functionality. We describe how to define the end points of the rubberband IndexedLineSet, how the planet is grabbed, and how to use information from the haptic device. In section 8.10 we will add forces to the rubberband.

8.9.1 Composing the rubberband end points

To be able to change the end points of the rubberband string, we must be able to route an SFVec3f to each endpoint. The point field of the Coordinate node is an MFVec3f, and hence we cannot route the SFVec3f's directly to the point field. To set the desired routes, we therefore need a field in between that composes these two SFVec3f's into a single MFVec3f. It should also do more than just compose the two SFVec3f's together: preferably, it should provide a reset function to use when no line is going to be drawn, ensuring that both end points are set to zero.


We start by creating the RubberbandComposer in Cluster.h. It should inherit from the ComposeVec3f field, which takes a number of SFVec3f's as input and creates an MFVec3f from them. The updating behaviour of the field is inherited from ComposeVec3f; the RubberbandComposer just needs to be extended with the reset function. The reset function shall, when called, make sure that all routes to the field are unrouted. It should also set the length of the MFVec3f to 2, where both positions consist of a point at the origin.

In the member functions of a function field, the fields that are routed to it are available in a vector of fields called inputs. So in the reset function we first unroute everything that is in the inputs vector. We then resize the value of the composer to length 2, and in both positions we put a zero point. When the field is reset, the value of the field has changed, so we need to call the touch function. We do this to notify the fields routed from the composer that its value has changed.

  struct RubberbandComposer : public ComposeVec3f {
    inline void reset() {
      while( inputs.size() ) inputs[0]->unroute( this );
      value.resize( 2 );
      value[0] = value[1] = Vec3f( 0, 0, 0 );
      touch();
    }
  };

Create an auto_ptr to a RubberbandComposer field:

  auto_ptr< RubberbandComposer > rubberband_composer;

And create an instance of it in the constructor:

  rubberband_composer( new RubberbandComposer ) {

The output of the rubberband_composer shall be routed to the point field of the Coordinate node in the rubberband line's scene graph representation. In doing that, the rendered line is synchronised with the values from the rubberband_composer.

In Cluster.wrl, DEFine a name for the Coordinate node:

  DEF rubberband Shape {
    appearance Appearance {}
    geometry IndexedLineSet {
      colorPerVertex FALSE
      color Color {
        color [ 1 0 0 ]
      }
      coordIndex [ 0 1 ]
      coord DEF stringLine Coordinate {


        point [ 0 0 0, 0.1 0.1 0.1 ]
      }
    }
  }

The DEFined name of the Coordinate can now be used to set up routes from the rubberband_composer to the rubberband line points in the scene graph. The routes are set up in the constructor.

  rubberband_composer->route( defmap.find( "stringLine.point" ) );

In the constructor we also reset the rubberband_composer, since the rubberband should be "hidden" at startup.

  rubberband_composer->reset();

When running the program, the rubberband line is hidden, since the rubberband_composer is reset to origin points in the constructor. The values of the rubberband_composer are routed to the endpoints of the line, which means that changing the endpoint values for the rubberband involves no more than routing (and touching) the start and stop positions to the rubberband_composer. We are going to use this in section 8.10.

8.9.2 Defining the rubberband end points

As stated in the introduction, we want to use the haptic device to grab a planet. The rubberband line between the planet and the haptic device works as the visual representation when grabbing a planet. No force operators are involved at this stage, just rendering of the rubberband.

To be able to grab a planet when pushing the haptic device's button, as well as to release a hooked planet when releasing the button, we need to keep track of the button presses. For that, we can use a function field that takes care of what to do when the button is pressed or released. To that function field, we route the state of the button of the haptic device.

As with the WorldComposer, we know what the types routed to this function field are going to be: in this case, just a route from the button on the haptic device. Knowing that, we can use the EvaldFField. The field needs to be a Dependent field, since it needs to update each time the button is pressed or released, and consequently cannot wait with the evaluation.


In Cluster.h, declare the EvaldFField, to be called ButtonTracker. The value of the field is just an ordinary Field, since the ButtonTracker defines the behaviour when the button is pressed, not a value. The function field takes one SFBool as input route.

To be able to access the fields in Cluster from the ButtonTracker field, we need a reference to the Cluster. We also need a holder for the planet we are going to hook.

The evaluate function of the EvaldFField is declared with an SFBool as input field, which is the haptic device's button.

  struct ButtonTracker : Dependent< EvaldFField< ButtonTracker,
                                                 Field, SFBool > > {
    Cluster *cluster;
    Planet *hooked_planet;
    virtual void evaluate( SFBool *button_down );
  };

  friend class ButtonTracker;

Create an auto_ptr for a ButtonTracker in the header file:

  auto_ptr< ButtonTracker > button_tracker;

Create an instance of the ButtonTracker in the constructor:

  button_tracker( new ButtonTracker ),

In the constructor, we also set up the route from the haptic device's button to the button_tracker. To get hold of the haptic device's button, we use the DeviceInfo node. DeviceInfo is a bindable node and represents the set of hardware input devices. From the DeviceInfo node, we pick the tracker, which is the current haptic device. This tracker device has a field called button, which is an SFBool indicating the state of the stylus switch. We route the button field to the button_tracker, so that the button_tracker can handle the state changes of the button.

  DeviceInfo *di = DeviceInfo::getStackTop();
  Device &haptic_device = (*di->devices)["tracker"];
  haptic_device["button"]->route( button_tracker );

The header file of the DeviceInfo needs to be included:

  #include <DeviceInfo.h>

The button_tracker's pointer to the Cluster needs to be initialised in the constructor. Similarly, the hooked_planet pointer needs to be nullified.

  button_tracker->cluster = this;
  button_tracker->hooked_planet = 0;


The evaluate function's behaviour is defined in Cluster.cpp. Since the ButtonTracker field is Dependent, it will be called each time the state of the button changes. In the ButtonTracker evaluate function, we need to check whether the button was pressed or released. If the button is pressed, we grab a planet, otherwise we release a grabbed planet.

  void Cluster::ButtonTracker::evaluate( SFBool *button_down ) {
    if( button_down->get() ) {
      if( !hooked_planet ) {
        // Grab a planet
      }
    } else {
      if( hooked_planet ) {
        // Release the hooked planet
      }
    }
  }

We begin with defining the pressed button's behaviour. We grab a planet by pointing the haptic device towards a planet while pressing the button. The algorithm should then pick the closest planet found in the direction that is being pointed in. To be able to calculate this, we need to know the haptic device's position, and we need to know in which direction it is pointing. To find this information about the haptic device we can, again, use the DeviceInfo node.

  DeviceInfo *di = DeviceInfo::getStackTop();
  Device &haptic_device = (*di->devices)["tracker"];

This time, we use two different HapticsDevice fields: position and orientation. The position is an SFVec3f holding the position of the tip of the haptic device, and the orientation is an SFRotation describing the rotation of the device relative to a starting orientation, pointing down the negative z-axis (see section 7.7).

First, we pick the two fields from the haptic device:

  SFVec3f *device_position =
    static_cast< SFVec3f * >( haptic_device["position"] );
  SFRotation *device_orientation =
    static_cast< SFRotation * >( haptic_device["orientation"] );

We also need the vector indicating the direction of the haptic device. The orientation is given as the rotation of the haptic device's local coordinate space. By specifying the direction we want in the haptic device's local coordinate space, and then rotating it according to the rotation, we get the current direction.


In the haptic device's coordinate space, the direction we are searching for is the one pointing down negative z, since at startup this direction points into the display (away from the user). We take this vector and rotate it according to the rotation of the haptic device.

  Vec3f hd_direction = Vec3f( 0, 0, -1 ) * device_orientation->get();
  Vec3f hd_position = device_position->get();

Now that we have the haptic device's direction and the position of the haptic device's endpoint, it is time to find the nearest planet.

We calculate a score for each planet that is a measure of how close the planet is to the haptic device in relation to aiming accuracy: the dot product of the aiming direction with the vector to the planet, divided by the squared distance. When we have iterated through all planets, the planet with the best score is contained in hooked_planet.

  // Find the nearest planet the haptic device is pointing at
  mgFloat best_score = -1e9;
  mgFloat planet_score;
  for( MFPlanet::const_iterator i = cluster->planets->begin();
       i != cluster->planets->end(); i++ ) {
    Planet *planet = static_cast< Planet * >( *i );
    Vec3f diff = planet->translation->get() - hd_position;
    mgFloat diff2 = max( diff * diff, Util::epsilon );
    planet_score = ( diff * hd_direction ) / diff2;
    if( planet_score > best_score ) {
      best_score = planet_score;
      hooked_planet = planet;
    }
  }

Now we have stored the chosen planet (if any) in the hooked_planet pointer, so it is time to set up the rubberband routings. Since the rubberband shall be shown between the haptic device and a grabbed planet, we route the haptic device's position and the planet's position to the rubberband_composer.

      // if we have hooked a planet, then the rubberband endpoints
      // shall follow the position of the haptic device and the
      // position of the hooked planet
      if( hooked_planet ) {
        hooked_planet->translation->route( cluster->rubberband_composer );
        device_position->route( cluster->rubberband_composer );
      }
    }
  }


When pressing the button, we now hook up a planet. When the button is released, we also need to let go of the hooked planet. To unroute the rubberband's endpoints and set them to zero, we call the rubberband_composer's reset function. Since we release the hooked planet, we also set its pointer to 0.

  else {
    // The button is released
    if( hooked_planet ) {
      cluster->rubberband_composer->reset();
      hooked_planet = 0;
    }
  }
  }

When we run the program and press the button, we see the rubberband between the haptic device and a planet. Release the button and the rubberband disappears.

8.10 Adding forces to the rubberband

For us to feel the weight of a planet, a force must be applied through the rubberband. Furthermore, the rubberband affects the planet with a force as well, which enables us to move or swing the planet. To achieve realistic effects, we calculate these forces based on how stretched the rubberband is compared to its original length.

Doing things one step at a time, we start by simply adding a simple force to the haptic device, directed from the rubberband towards the planet.

At each scene graph traversal, force operators are added, and they are in use until the next scene graph loop (or rather, they are used until the second next scene graph loop, since two force operators always work simultaneously). During the collision traversal, the Collider::collide function of each Drawable gets called so that it can add the force operators for its Drawable.

To change Cluster's ordinary haptic rendering behaviour, we need to specialise the Collider component inherited from the Group node.

In Cluster.h, specify a Collider object that inherits from Group's Collider. We are going to extend the collide function to add the rubberband force operator.

  struct Collider : public Group::Collider {
    virtual void collide( CollisionState *cs );
  };

  friend class Collider;


To inform the renderer that we would like to use this Collider object instead of the Group Collider, we need to tell the Group constructor to use this Collider object instead of its own.

To allow this, the Group constructor uses the Defined template for all of its parameters. Let us look at the Group constructor:

  inline Group( Defined< Collider > _collider = 0,
                Defined< Renderer > _renderer = 0,
                Defined< SFBound > _bound = 0,
                . . .

Using the Defined template is an example of template specialisation. It ensures that if we send a zero, or no argument at all (0 is the default), as the first argument to the Group constructor, the constructor will create the default Collider instance. If we instead call the Group constructor with an object of Collider type as the first argument, the constructor will instantiate that Collider object as its Collider. The same of course applies to the Renderer as the second argument, the SFBound as the third, and so on.

We can use this sort of construction when we want Group to instantiate the Cluster Collider as its Collider. So in the Cluster constructor, we call the Group constructor with our new Collider as the first argument. Since we do not provide any more objects as arguments, the Group constructor will use the defaults.

  . . .
  world_composer     ( new WorldComposer ),
  rubberband_composer( new RubberbandComposer ),
  Group( new Collider ) {

    // Set up the dependencies
    sun->route( world_composer );
    rubberband->route( world_composer );
    . . .

We specify the collide function of the Collider object in Cluster.cpp. In it, we need to call the Group Collider's collide function to ensure that the ordinary Group behaviour is preserved.

  void Cluster::Collider::collide( CollisionState *cs ) {
    Group::Collider::collide( cs );
  }


After we have called the Group collide function, we can add the force operators we want. We start by adding a trivial constant force operator in section 8.10.1. We then create a customised force operator with successively improved behaviour.

8.10.1 Using a simple force operator

The simplest force operator is one that generates a constant force over its lifetime. The ConstForce operator applies a constant force according to the vector sent as argument at creation time. Forces shall only be applied when there is a hooked planet; consequently, the first thing to do is to check whether we have a hooked planet. We check that with the button_tracker, which keeps track of the hooked planet.

To access the button_tracker, we need a reference to the Cluster. We have that reference through the drawable data member. In the collide function, this variable always contains the node currently being rendered; in this case, it thus contains a reference to the Cluster node. The reference needs to be type-cast for us to be able to reach the button_tracker, since there is no button_tracker in the drawable member.

  Cluster *cluster = static_cast< Cluster * >( drawable );

Use the cluster reference to get the currently hooked planet from the button_tracker.

  Planet *hooked = cluster->button_tracker->hooked_planet;

Now, if we have a hooked planet, we can add a force operator.

  if( hooked ) {
    // Add force operator

The first thing we need to calculate is the force vector. We will add it as a constant force. The force should apply in the rubberband's direction, so the first thing to do is to find the vector from the haptic device to the planet. We retrieve the vector by taking the position of the hooked planet, minus the position of the haptic device.

We get the position of the planet (in cluster local coordinates) from the reference to the hooked planet. We get the haptic device position by using the getFinger function of the CollisionState object sent as argument to the collide function. The getFinger function returns the current position of the haptic device, in coordinates local to the node of the collider (in this case cluster coordinates).

From these two, we get the rubberband vector.


  Vec3f rubberband_vector = hooked->translation->get() - cs->getFinger();

The force to apply is in Newtons. Since the distances here are small, the applied forces would get very small. Therefore, we increase the force by multiplying the rubberband vector by a suitable value.

  rubberband_vector *= 25;

Time to create and add the force operator. When creating a force operator we do not allocate memory from the heap; memory allocation is taken care of by Reachin API when the force operator is added to the CollisionState object.

Create the constant force by sending the rubberband_vector as argument, to specify the amount and direction of the force.

  ConstForce c_force( rubberband_vector );

The ConstForce is declared in ForceOperator.h, so this file has to be included at the top of the file.

  #include <ForceOperator.h>

Now we just need to add the force to the CollisionState to activate it.

  cs->addAbsoluteGlobalFO( c_force );
  } // endif

Now, if you run the program and grab a planet, there will be a force applied in the direction of the rubberband.

  >>cluster World.wrl

This works fine when just using simple graphics. If we instead run the program using a VRML world with, say, nine planets, we can feel that the haptic forces update too slowly.

In the cluster directory, there is a VRML file called 9planets.wrl. Run it with the current program. Observe that the forces are unstable, and that the planets might get stuck in the sun. If you use a high performance computer, you might not observe these errors.

  >>cluster 9planets.wrl

When running the program with nine planets in the world, the haptic device no longer behaves very well. That is because we are using a constant force to simulate a force that depends on the position of a moving object. The same force is used in every iteration of the realtime loop during the entire next scene graph loop. To achieve good haptic behaviour, the forces to apply need to be recalculated more often than at each scene graph traversal.


In the next section, we create a custom force operator that recalculates the force to apply at each step in the realtime loop, instead of in the scene graph loop.

8.10.2 Create a custom force operator

To make the rubberband force operator behave smoothly, we create a force operator that updates its force in the realtime loop. The force to apply is calculated according to the position of the haptic device at every realtime cycle.

A force operator is a binary function from a Vec3f (the position of the haptic device) and an mgFloat (the weighting of this force operator) to a Vec3f (the force to apply to the haptic device during the next time step).

We create a force operator for the rubberband force, and declare the function call operator that is called at each realtime loop.

To be able to calculate what force to return, we need to keep track of the planet position. When using force operators we save the planet position in the actual force operator, and use this for reading, updates etc., as opposed to getting and setting the values in the scene graph. It must be done this way because the set and get functions can start scene graph updating, and scene graph updating shall only take place in the scene graph loop. In the realtime loop, it is desirable to keep calculations as sparse as possible to maintain a high update rate.

  struct RubberForce : binary_function< Vec3f, mgFloat, Vec3f > {
    Vec3f planet_position;
    Vec3f operator()( const Vec3f &haptic_position,
                      const mgFloat &weight );
  };

Time to specify the function call operator of our RubberForce in Cluster.cpp. The RubberForce function is called with two arguments. The first argument is the position of the haptic device; since we are planning to add the RubberForce with the addAbsoluteGlobalFO() command, it is given as the actual position of the haptic device in global coordinates. The second argument is the weighting of the current force operator. Since two force operators are always working simultaneously, the weighting indicates what fraction of the total applied force this force operator is to contribute.

  Vec3f Cluster::RubberForce::operator()( const Vec3f &haptic_position,
                                          const mgFloat &weight ) {
  }


Now we calculate the force to apply to the haptic device, for every realtime loop. First we calculate the rubberband vector, the same way as we did earlier in the collide function. We multiply it by a value to achieve an appropriate force, and return the force from the function.

Observe that haptic_position is given in global coordinates, and that the planet_position therefore also needs to be given in global coordinates. We need to ensure this when we set the planet_position in the collide function.

  Vec3f rubberband_vector = planet_position - haptic_position;
  rubberband_vector *= 25;

The force applied to the haptic device throughout the next realtime loop (from this force operator) will be the return value of this function, weighted according to the weight factor sent as an argument. Reachin API takes care of the weighting, so the resulting force from the force operator should not be weighted.

  return rubberband_vector;

In the Collider::collide function, we need to change the behaviour from adding the constant force to adding our custom force operator instead. After the check that there is a hooked planet, we need to make some changes.

First we create a force operator of our newly declared type RubberForce:

  RubberForce rf;

When the operator gets called in the realtime loop, the haptic device's position is given in global coordinates. To perform correct rubberband vector calculations, the position of the planet needs to be given in global coordinates as well. For this, we use the accumulated forward matrix from the CollisionState to transform the position of the planet from local coordinates to global coordinates.

  rf.planet_position = cs->getAccumulatedForward() *
                       hooked->translation->get();

Then add our force operator to the CollisionState.

  cs->addAbsoluteGlobalFO( rf );

This added force operator will now be called at 1000 Hz in the realtime loop, executing over two scene graph loops. At each realtime loop, a new force value will be calculated according to the current position of the haptic device.

Time to compile and test the nine planets scenario.

  >>cluster 9planets.wrl


Now the rubberband behaves much better. If you want to compare the behaviour of the two versions of the program but have overwritten the previous one, use the example program directory for the previous section.

8.10.3 Affecting the scene graph with the force operator

Now we would also like the rubberband to apply forces back to the planet node. The rubberband shall apply an equal and opposite force to the planet.

When we want to add forces to a Dynamic node from the realtime loop, we use the addImpulse function of the Dynamic. The addImpulse function takes three arguments: the force to apply, the time period over which to apply the force, and the position at which to apply the force (in global coordinates).

In the force operator, we need to calculate the time since the last call. We use the time since the last call to estimate the time to the next call, which is the time during which the calculated force is going to be applied to both the haptic device and, hence, the planet. We can use the last elapsed time as an estimate for this, since the loop time does not vary much between two consecutive calls.

Add a Time variable and a pointer to a planet in the RubberForce declaration.

  Time last_time;
  Planet *planet;

Initialise the variables in the collide function.

  rf.last_time = Time();
  rf.planet    = hooked;

To calculate the elapsed time since the last call, we store the time of the last call in a variable that we initialise at construction time. We use the fact that the constructor of Time initialises its value to the current time by default.

  Time now;
  mgFloat delta_t = now - last_time;
  last_time = now;

Then we calculate the rubberband_vector as before, but instead of actually changing the value of the rubberband vector, we set the value of a force variable to the calculated value.

  Vec3f rubberband_vector = planet_position - haptic_position;
  Vec3f force = rubberband_vector * 25;

The first argument to addImpulse is the force to apply. This shall be the same force as applied to the haptic device, but in the opposite direction. The force returned by this function will be weighted according to the weight variable before


it is added to the haptic device. To apply the same amount of force to the planet as to the haptic device, we need to scale the force manually according to the weighting before we add it as an impulse to the planet.

The second argument is the estimated time during which the force is going to affect the haptic device.

The third argument is the position at which to apply the force. This third argument is optional, and since we do not want to add a rotational force to the planet, we can leave it out. In this case the force is applied at the fulcrum (centre of gravity) of the Dynamic and no rotation occurs.

  planet->addImpulse( -force * weight, delta_t );

Then we simply return the force as the result of the function.

  return force;

Compile and run.

Now the forces that are applied to the haptic device are also applied to the hooked planet. This means we can use the rubberband to grab planets and move them around.

8.10.4 Accounting for planet motion in the realtime loop

Now that we are adding forces to the planet from the realtime loop, we need to keep better track of the position of the planet. We constantly add forces to the planet, yet calculate them as if the planet stayed in the same position. To achieve better behaviour, we keep track of the movement of the planet in the scene graph loop, and then assume that the planet continues to move in the same manner during the realtime loops.

In the force operator we need to keep track of the motion, so we add a member variable to the RubberForce.

  Motion motion;

In the Collider we set the motion to the movement of the planet over the last scene graph loop. We then need to normalise the motion so that it describes the motion over one second. The getDeltaT() function of the CollisionState returns the elapsed time since the last scene graph traversal.

  rf.motion = Motion( cs->getAccumulatedForward() *
                      cs->getLastAccumulatedInverse() );
  rf.motion /= cs->getDeltaT();


In the realtime loop, we then use the motion object to move the planet at each timestep.

  planet_position = motion.transform( planet_position, delta_t );

The planet motion is now recorded during each scene graph traversal, and used to move the planet at each step in the realtime loop.

8.10.5 Adding slack to the rubberband

Currently, we have a somewhat primitive way of calculating the rubberband forces. Rather than basing the magnitude of the force on the length of the rubberband, we would like to base it on how stretched the rubberband is.

To be able to do that, we need to keep track of the original length of the rubberband when we hook a planet. When the button is pushed and we grab a planet, we measure the length of the rubberband and save that length.

In the ButtonTracker we need a holder for the length:

  mgFloat rubber_length;

In the evaluate function of the ButtonTracker we measure the length of the rubberband. When we try to find the planet nearest to the haptic device, we already use the squared distance between the haptic device and each planet, in the diff2 variable. When we find a planet to grab, we take the square root of this squared distance to get the length of the rubberband.

  if( planet_score > best_score ) {
    best_score = planet_score;
    hooked_planet = planet;
    rubber_length = mgSqrt( diff2 );
  }

We also need a variable in the force operator where we can store this initial length of the rubberband. To do that, we add a data member for the initial rubber length in the RubberForce.

  mgFloat initial_rubber_length;

In Collider::collide, where we create the force operator, this variable is initialised to the calculated distance:

  rf.initial_rubber_length = cluster->button_tracker->rubber_length;

Now it's time to change the calculations in the rubberband operator.


We would like it to work so that we measure the part of the rubberband that is stretched out, and multiply that part by a certain stiffness to get the amount of force to apply. The direction of the force should still be along the rubberband, from the haptic device towards the hooked planet.

First, set a value for the stiffness of the rubberband:

  const mgFloat stiffness = 70;

We need to calculate the direction and the length of the current rubberband. The rubberband vector we calculate as before; after that we add the calculation of the length of the rubberband and the unit vector for the rubberband.

  Vec3f rubberband_vector = planet_position - haptic_position;
  mgFloat rubberband_length = rubberband_vector.length();
  Vec3f rubberband_unit_vec = rubberband_vector / rubberband_length;

To calculate the stretched part of the current rubberband, we use the current rubberband_length calculated above and the saved initial_rubber_length:

  mgFloat length_difference = rubberband_length - initial_rubber_length;

Now we have all the values needed to calculate the force to apply. If the rubberband is shorter than it was when the planet was grabbed, we do not want this force operator to add any force at all, so in that case the force to apply is zero. If the rubberband is stretched out, the force to apply is calculated by taking the stretched-out part times the stiffness, in the direction of the rubberband:

  Vec3f force;
  if( length_difference > 0 )
    force = stiffness * length_difference * rubberband_unit_vec;
  else
    force = Vec3f( 0, 0, 0 );

Update the position, add the impulse to the planet and return the force as before.

Now, when grabbing a planet, the amount of force applied depends on the stretched part of the rubberband, and when the planet is closer to the haptic device than it was when grabbed, no forces are applied at all.
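For reference, assembling the fragments from sections 8.10.2 to 8.10.5, the complete function call operator now looks roughly like this (a sketch using the member names declared earlier; your version may differ in detail):

  Vec3f Cluster::RubberForce::operator()( const Vec3f &haptic_position,
                                          const mgFloat &weight ) {
    // Estimate the length of this timestep from the previous one.
    Time now;
    mgFloat delta_t = now - last_time;
    last_time = now;

    // Move the planet according to its recorded scene graph motion.
    planet_position = motion.transform( planet_position, delta_t );

    // Direction and length of the current rubberband.
    Vec3f rubberband_vector = planet_position - haptic_position;
    mgFloat rubberband_length = rubberband_vector.length();
    Vec3f rubberband_unit_vec = rubberband_vector / rubberband_length;

    // Only the stretched-out part of the rubberband generates force.
    const mgFloat stiffness = 70;
    mgFloat length_difference = rubberband_length - initial_rubber_length;
    Vec3f force;
    if( length_difference > 0 )
      force = stiffness * length_difference * rubberband_unit_vec;
    else
      force = Vec3f( 0, 0, 0 );

    // Apply the equal and opposite (manually weighted) force to the planet.
    planet->addImpulse( -force * weight, delta_t );

    return force;
  }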


8.11 Hooking on to the planet's surface

By connecting the rubberband to a point on the surface of the planet, instead of to the middle of the planet, we can use the rubberband to rotate planets.

First, the rubberband representation needs to have its endpoint changed to be on the planet's surface. When we grab a planet, we keep track of the position on the surface of the planet where the planet is grabbed. While updating the IndexedLineSet endpoints for the rubberband's graphical representation, we translate the grabbed point on the planet to global coordinates to represent one endpoint of the rubberband.

To simplify this, we create a function field where we can store the grabbed position in planet local coordinates. Then we can route the matrix of the planet to it, and have as its value the global position of the surface point where the planet is hooked.

  struct PlanetSurface : public EvaldFField< PlanetSurface,
                                             SFVec3f, Transform::Matrix > {
    Vec3f hooked_surface;
    virtual void evaluate( Transform::Matrix *matrix ) {
      value = matrix->getAccumulatedForward() * hooked_surface;
    }
  };

  auto_ptr< PlanetSurface > planet_surface;

Don't forget to create it in the constructor:

  planet_surface( new PlanetSurface ),

In the ButtonTracker::evaluate function, where we set up the routing to the rubberband_composer, we now use the PlanetSurface to represent one of the rubberband's endpoints.

First, we calculate the hooked position on the surface of the planet. To get the position in planet coordinates, we first take the position of the planet in cluster coordinates. The point on the surface where the planet is hooked is this vector shortened by the radius of the planet.

The planet's matrix->getInverse() function is then used to transform that point from cluster coordinates to planet coordinates.

  Vec3f surface = hooked_planet->translation->get();
  surface.setLength( surface.length() -
                     hooked_planet->planet_radius->get() );
  cluster->planet_surface->hooked_surface =
    hooked_planet->matrix->getInverse() * surface;


We change the earlier routing for the rubberband_composer to:

hooked_planet->matrix->routeAndTouch( cluster->planet_surface );
cluster->planet_surface->route( cluster->rubberband_composer );
device_position->route( cluster->rubberband_composer );

And add the unrouting before hooked_planet is set to 0:

hooked_planet->matrix->unroute( cluster->planet_surface );

The force operator also needs to change its behaviour. It shall add impulses at the place on the surface where the rubberband is attached instead of at the fulcrum. First, at creation time of the force operator we need to set the position of the planet as the position of the hooked surface point (in global coordinates). The easiest way of getting this is to use the calculated value from our PlanetSurface field.

rf.planet_position = cluster->planet_surface->get();

In the RubberForce::operator() the impulses added back to the dynamic shall be added at the planet position (now the position of the hooked point on the surface of the planet).

planet->addImpulse( -force * weight, delta_t, planet_position );

Now the rubberband is connected to the planet surface, and the rubberband can be used to rotate the planet.

8.12 Gradually raising the gravity

The first scene graph traversals that take place consume a lot more time than the following ones. This can cause a problem when the gravity gets higher, since the planets move for too long before a new direction is calculated from the gravity. This can result in planets moving out of the workspace in the initial phase. To avoid this we can increase the gravity slowly during start up.

First, we create a changing gravity constant that is going to be the one we route to the gravity fields of the planets. This ch_grav_const will go from zero up to the set value during the first scene graph traversals.

auto_ptr< SFFloat > ch_grav_const;

and create it in the constructor.

ch_grav_const( new SFFloat ),

In the constructor we set the changing gravity constant to its initial value zero.

ch_grav_const->set( 0 );


In the enterNode and exitNode functions, route and unroute the ch_grav_const to the gravity field of the planets instead of the constant one used before.

In enterNode:

cluster->ch_grav_const->routeAndTouch( planet->gravity );

In exitNode:

cluster->ch_grav_const->unroute( planet->gravity );

During the collision traversal we slowly raise the gravity. This is done by adding the following line to the Collider::collide() function.

cluster->ch_grav_const->set( 0.99 * cluster->ch_grav_const->get() +
                             0.01 * cluster->gravity->get() );

Each traversal moves the changing gravity constant one percent of the remaining distance towards the set value, so it approaches the target geometrically (after 300 traversals it has covered roughly 95% of the distance). Now the gravity can be set to a value of 0.05, for example, and the planets are not going to jump off the workspace.

8.13 Coordinate spaces

If we place the Cluster node under a Transform node that changes the coordinate space, we will have a problem. This is due to the fact that we use global coordinates for the position of the haptic device and cluster coordinates for the position of the planet when calculating distances etc. in the ButtonTracker. The endpoints of the rubberband are given in global coordinates, but the drawing of the IndexedLineSet takes place in the cluster coordinate space.

To correct this we first extend the RubberbandComposer's update function to transform the vectors to cluster coordinate space. To do this we save the accumulated inverse matrix for the cluster during the collision traversal. This matrix is then used in the update function of the RubberbandComposer to translate the vectors.

Add a cluster member variable to save the matrix in:

Matrix4f last_accumulated_inverse;

Set the variable in the collider:

cluster->last_accumulated_inverse = cs->getAccumulatedInverse();

And extend the update behaviour of the rubberband composer to use the matrix to change all vectors to cluster coordinate space.


struct RubberbandComposer : public ComposeVec3f {
  Cluster *cluster;
  inline virtual void update() {
    ComposeVec3f::update();
    for( vector< Vec3f >::iterator i = value.begin();
         i != value.end(); i++ )
      (*i) = cluster->last_accumulated_inverse * (*i);
  }
};
friend class RubberbandComposer;

The cluster pointer is initialised in the Cluster constructor.

rubberband_composer->cluster = this;

In the ButtonTracker the position of the planet needs to be transformed into global coordinates to be compared in the correct way with the position of the haptic device, which is given in global coordinates. To do this the accumulated forward matrix also needs to be saved during the collision traversal. So we create and set the forward matrix of the cluster in the same way we set the inverse matrix above.

Member variable:

Matrix4f last_accumulated_forward;

Set in the collider:

cluster->last_accumulated_forward = cs->getAccumulatedForward();

In the ButtonTracker, calculate the position of the planet in global coordinates using the forward matrix, before calculating the distance between the planet and the haptic device.

for( MFPlanet::const_iterator i = cluster->planets->begin();
     i != cluster->planets->end(); i++ ) {
  Planet *planet = static_cast< Planet * >( *i );
  Vec3f planet_pos_glob_coord =
    cluster->last_accumulated_forward * planet->translation->get();
  Vec3f diff = planet_pos_glob_coord - hd_position;
  ...

The length of the rubberband needs to be calculated after we have got the surface position in global coordinates.

Vec3f surface_glob_coord =
  cluster->last_accumulated_forward * surface;
rubber_length = (surface_glob_coord - hd_position).length();

Now it should be possible to put the cluster under a transform node and scale it, rotate it and translate it properly.
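The pattern used here is worth spelling out. The following is a minimal sketch, assuming a collision-traversal state object cs as in the collider code above: the accumulated forward matrix maps local (cluster) coordinates to global coordinates, and the accumulated inverse matrix maps global coordinates back to local ones.

// Sketch of the two cached matrices and their roles (illustrative only).
Matrix4f forward = cs->getAccumulatedForward();   // local  -> global
Matrix4f inverse = cs->getAccumulatedInverse();   // global -> local

Vec3f local_point  = planet->translation->get();  // cluster coordinates
Vec3f global_point = forward * local_point;       // global coordinates

// Mapping back through the inverse recovers the original point
// (up to floating point rounding):
Vec3f roundtrip = inverse * global_point;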


8.14 Performing collision detection

Now it's time to include collision detection for the planets and the sun. It is beyond the scope of this tutorial to discuss the energy preservation optimisation and the sphere-sphere collision handling (specifically the Cluster::preserveEnergy and Cluster::collisionDetect functions). See the source files in the examples/rapg/PART_IV directory for details.

8.15 Using the most recent force operator

To get a better behaviour we could add all impulses to a grabbed planet from the most recently created force operator, and no impulses from the older one. The most recent force operator will have more correct values than the old one regarding the planet position, and can then add more correct impulses back to the planet.

To achieve this we use a data member for the last_weight in the RubberForce.

mgFloat last_weight;

Initialise it when the force operator is created.

rf.last_weight = 0;

Add all the impulses from the newly created force operator. Because the weight of a newly created force operator grows during interpolation while the weight of the operator it replaces shrinks, the test below only succeeds for the most recent operator.

if( last_weight < weight ) {
  last_weight = weight;
  planet->addImpulse( -force, delta_t, planet_position );
}
return force;
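To see why the test singles out the newest operator, consider the weight sequences the two operators see during interpolation. The sketch below is standalone illustrative code with made-up weights, not Reachin API scheduling.

#include <cstdio>

int main() {
  // A newly created operator sees rising weights; the operator it
  // replaces sees falling ones (values are hypothetical).
  double rising[]  = { 0.25, 0.50, 0.75, 1.00 };
  double falling[] = { 0.75, 0.50, 0.25, 0.00 };
  // The old operator's stored last_weight has already risen to 1
  // during its own ramp-up.
  double last_weight_new = 0, last_weight_old = 1;
  for( int i = 0; i < 4; ++i ) {
    // The new operator passes the last_weight test on every call...
    if( last_weight_new < rising[ i ] ) {
      last_weight_new = rising[ i ];
      std::printf( "new operator adds impulse at weight %.2f\n", rising[ i ] );
    }
    // ...while the old operator never does.
    if( last_weight_old < falling[ i ] ) {
      last_weight_old = falling[ i ];
      std::printf( "old operator adds impulse at weight %.2f\n", falling[ i ] );
    }
  }
  return 0;
}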




Appendix A: URNs - Universal Resource Names

Reachin API comes with a resource database that contains all of the VRML, image and other data resources used by Reachin software. Resources stored here can be accessed by applications through Universal Resource Names (URNs).

A URN is a unique name for a particular resource (conceptually a superset of ISBN). It is an identifying name that indicates nothing of the location of the resource (just as an ISBN indicates nothing of where you can buy a book). Reachin API incorporates a simple URN locator that takes a URN and searches for a copy of the resource. It will search for files using the REACHIN_URN_INDEX file as the main index for URN resolution.

Files contained in the resources directory can be specified with the URN name domain of urn:inet:reachin.se:/*. For example, loading the Earth demo can be performed by executing the ReachinLoad utility as:

ReachinLoad urn:inet:reachin.se:/demos/earth/earth.wrl

This will ask the URN resolver to find a set of Universal Resource Locations (URLs) for the VRML file describing the Earth demo, returning the possible locations to the VRML parser.


As a time-saving feature, if the URN resolver cannot locate a particular resource, it will look in the current directory, so that you do not have to put all files in the resource database. Also, if the REACHIN_URN_INDEX is not specified or does not exist, the default index of "all resources exist relative to the current working directory" will be assumed.

A.1 URN Index File

URN index files define the rules to be used for resolving a name to a location. Lines commencing with a hash (#) are comments, and are ignored by the parser. The following represents the grammar for URN files:

urn_file   := header rules
header     := "#URN1.0\n"
rules      := rule
            | rule rules
rule       := urnpattern location
            | urnpattern '@' index_url
            | urnpattern '!' error_message
urnpattern := {alphanumeric, "/", ":" and "*"}

The location can be either a URL or a UNIX filename/path. The URN resolver will test a URN against each URN pattern in the index file, and if the pattern matches, the resulting location is generated based on the pattern. The URN resolver then returns a set of possible locations that it was able to resolve. Each location can be tested for a copy of the resource.

A location commencing with the '@' symbol, followed by a URL, indicates that the resolver should recurse into another URN index file if the resource name matches the pattern. A location commencing with the '!' symbol denotes that the URN resolver should raise an unrecoverable error with error_message being the reason given for the error.

A.2 URN Example

A suggested URN index configuration is for users to have their own index file that defines personal resources for application development, and includes the Reachin API URN index, or others such as a company resource database. Such a file for an employee of the Foo company may look like this:


#URN1.0

# first search in the user's directory for personal resources
urn:inet:foo.com:/personal/mark/*   /usr/people/mark/VRML/*

# check the company index for any Foo URN
urn:inet:foo.com:*   @/foo/index.urn

# explicit rule for the spotted chicken
urn:inet:reachin.se:/spotted_chicken.wrl   /api/special/spotted_chicken.wrl

# rule for project resources
urn:inet:reachin.se:/ro/*   /api/reachover/resources/*

# check Reachin API's index for any URN, "*" will match every URN
*   @/api/resources/index.urn

Here an example URN urn:inet:reachin.se:/ro/loader.wrl would match the last two patterns in the file. The UNIX path /api/reachover/resources/loader.wrl would be added to the list of locations returned by the resolver. The resolver would then continue by recursing into the /api/resources/index.urn index file, possibly adding more locations before finishing.
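The substitution step itself is simple: the tail of the URN matched by a trailing "*" in the pattern replaces the "*" in the location. The following is a hypothetical sketch of that step, not the actual Reachin API resolver; the function name resolve and its behaviour for non-matching URNs are assumptions.

#include <string>
#include <iostream>

// Returns the location for a URN under one index rule, or "" on no match.
std::string resolve( const std::string &urn,
                     const std::string &pattern,
                     const std::string &location ) {
  std::string::size_type n = pattern.size();
  if( n == 0 || pattern[ n - 1 ] != '*' )
    return urn == pattern ? location : "";   // exact rule, no wildcard
  if( urn.compare( 0, n - 1, pattern, 0, n - 1 ) != 0 )
    return "";                               // prefix does not match
  std::string tail = urn.substr( n - 1 );    // the part matched by "*"
  std::string result = location;
  std::string::size_type star = result.find( '*' );
  if( star != std::string::npos )
    result.replace( star, 1, tail );
  return result;
}

int main() {
  std::cout << resolve( "urn:inet:reachin.se:/ro/loader.wrl",
                        "urn:inet:reachin.se:/ro/*",
                        "/api/reachover/resources/*" ) << std::endl;
  // Prints: /api/reachover/resources/loader.wrl
  return 0;
}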




Appendix B: Environment Variables

Reachin API supports overriding of VRML field values via user-specified environment variable settings. This can be useful when you want to quickly change the behaviour of an application between executions without editing the VRML source files.

An obvious example here is in modifying configuration attributes for a particular application. If you wish to swap quickly between using the Reachin Display and an ordinary workstation setup (non-mirrored etc.), then you can use environment variables to override the value of various fields in the configuration files.

B.1 Syntax

Field names in a VRML file can be augmented with a list of environment variables to search. The syntax for this is as follows:

fieldname(VARIABLE1/VARIABLE2/VARIABLE3) <value>

There must be no spaces in the fieldname or variable list, so that the fieldname and search list is recognised as one field name by the parser. More than one variable can be specified by separating each variable name with a "/".


When setting this field, the parser will first check the user's environment for "VARIABLE1", then "VARIABLE2" and "VARIABLE3" in order. If it finds an environment variable declared, it then parses the value of the environment variable and sets the value of the field to this value instead of the "<value>". If none of the environment variables have been defined, then the "<value>" is set as per normal.

B.1.1 Simple Example

The following is a code fragment of the test.wrl file, in which we declare a Shape node:

Shape {
  geometry Sphere {
    radius(REACHIN_MY_SPHERE_RADIUS) 0.01
  }
  appearance USE GOLD
}

This will create a sphere of radius 1cm, unless we override the radius with the environment variable REACHIN_MY_SPHERE_RADIUS. For example, if we typed:

tcsh# setenv REACHIN_MY_SPHERE_RADIUS 0.05

then when we run the test.wrl file, the sphere would have a radius of 5cm.

B.2 Using lists of variables

Users can specify more than one environment variable to search for by separating variable names with a "/" character. The main purpose of this functionality is to present a more hierarchical way of setting values, if set up correctly.

For instance, we can specify a hierarchy of variables to determine which display hardware to use for a particular application. An application that has three balls could support the ability to set the size of all balls at once, or each ball individually, e.g.:

DEF BALL1 Shape {
  geometry Sphere {
    radius(REACHIN_BALL1_RADIUS/REACHIN_BALL_RADIUS) 0.012
  }
  appearance USE GOLD
}


DEF BALL2 Shape {
  geometry Sphere {
    radius(REACHIN_BALL2_RADIUS/REACHIN_BALL_RADIUS) 0.011
  }
  appearance USE GOLD
}

DEF BALL3 Shape {
  geometry Sphere {
    radius(REACHIN_BALL3_RADIUS/REACHIN_BALL_RADIUS) 0.010
  }
  appearance USE GOLD
}

Here, setting REACHIN_BALL_RADIUS would set all three balls to the same value. Then setting REACHIN_BALL_RADIUS and REACHIN_BALL3_RADIUS would effectively set the first two balls to the same value and the third ball to a (possibly) different value.

This hierarchical feature can be used to good effect in supporting common subsets of configuration that can be easily customised.

B.3 Defaults

If you do not want to specify a value for a field in your own application, but still want to allow environment variable overriding of the field, then you can make use of the "default" keyword. In the above example, if we wanted the radius of the sphere to be the VRML default unless the user overrides it with an environment setting, then you must specify "default" as the last environment variable in the list, i.e.:

Shape {
  geometry Sphere {
    radius(REACHIN_MY_SPHERE_RADIUS/default) NULL
  }
  appearance USE GOLD
}

The "default" keyword is necessary because the VRML grammar requires that each field name is followed by a value for that field. The field value "NULL" is recommended to clearly indicate that the value will not be set unless overridden by an environment variable.


B.4 Easily Specifying URNs

As a convenience feature, the parser will check for values starting with the string "urn:" and insert quote characters around the string. This means that instead of typing:

tcsh# setenv REACHIN_DISPLAY \"urn:inet:reachin.se:/Display1.wrl\"

to declare a URN, you can skip the escaped quotes and simply type:

tcsh# setenv REACHIN_DISPLAY urn:inet:reachin.se:/Display1.wrl

B.5 Conventions

It is recommended that all Reachin API VRML environment variables are uppercase, underscore-separated words starting with "REACHIN". This will ensure a common convention that is easily recognisable, and should not interfere with the environment variables of other applications. Examples are:

REACHIN_DISPLAY
REACHIN_CLUSTER_GRAVITY






Appendix C: A Very Brief Python Primer

This section is extremely brief. It is intended to allow someone who is familiar with object-oriented languages and scripting languages to use Python very quickly, to the extent needed for the Reachin API.

Python is an object-oriented, easily extensible language that is very easy to learn and is quite elegant. For this reason it was adopted as the scripting language for Reachin API.

Only two examples are given, each being packed with features to learn.

The first example, Program 3, shows Python's basic syntax and structure: how to define a function and call it, and how to use an imported library.

Program 3: A first Python script : first_function.py

#!/usr/freeware/bin/python
# This first line needs to be set for your environment.
# NOTE: For Python scripts for Reachin API this line should
# not be used.

# first_function.py

# An import statement ( very important )
import os

# The print statement, how useful !
print 'Eh Oh!'


# To define a function ( def !! )
def fibonnaciFunction(n):
    a = 0
    b = 1
    # The while statement below shows how
    # a block is constructed. A statement and
    # then a (:) followed by indented lines
    # that are within the block.
    # The usual if, else and for statements are used
    # in the same way.
    while b < n:
        print b
        a, b = b, a + b

# Call the function
fibonnaciFunction(2000)


The second example, Program 4, shows as much Python as most Reachin API scripting will require. It shows how to define a class and subclass, and how to use that class for a trivial operation.

Program 4: A really classy Python script : first_class.py

#!/usr/freeware/bin/python
# Set the above line for the local environment

# first_class.py

# A simple class, defined using the class keyword.
# The methods are defined in the same way as functions, with
# the only notable thing being that the first argument
# must be (self) !
# Internal variables of the class are referenced by
# preceding them with self.
class incrementer:
    # Special member function __init__ called when the class
    # is instantiated.
    # In Python this is just a normal function
    def __init__(self):
        self.myVal = 1
        pass

    def incVal(self):
        self.myVal = self.myVal + 1

    def returnmyVal(self):
        p = self.myVal
        return p

# define a subclass
# This class decrements as well ! How exciting
class changer(incrementer):
    # NOTE NOTE NOTE NOTE NOTE NOTE
    # This part is critical for Reachin API; a brief explanation
    # is that subclasses must explicitly call superclass
    # constructors when the child has a
    # constructor. This is bad but must be dealt with.
    def __init__(self):
        incrementer.__init__(self)
        pass

    def decVal(self):
        self.myVal = self.myVal - 1

## MAIN ##
# Use the class
Eric = incrementer()
# Set a variable within the class


Eric.myVal = 2
print "Eric.myVal = ",
print Eric.returnmyVal()
Eric.incVal()
print "Eric.myVal = ",
print Eric.returnmyVal()

# Use the subclass
Spam = changer()
# Set a variable in the superclass
Spam.myVal = 2
print "Spam = ",
print Spam.returnmyVal()
# Use the exciting new decrement function.
Spam.decVal()
print "Spam = ",
print Spam.returnmyVal()
Spam.incVal()
print "Spam = ",
print Spam.returnmyVal()

The most important thing to notice from the class listing here is the way in which the subclass constructor explicitly calls the superclass constructor. This is not normal practice in an object-oriented language. The reason things are done like this is that in Python all methods are virtual, i.e. are overridden by subclasses. This is particularly important for scripting in Reachin API, as Python scripts for Reachin API tend to be used to route information, so they are subclasses of VRML types such as SFVec3f. For these to be placed into the run-time Reachin API library of fields they must call their C++ constructors, and so it is crucial that this part of the code is included.

For more information on Python the reader is advised to use the extensive resources available on the web and the books listed in the references for learning Python. The reader is encouraged to look through the reference section at the back, which lists Python resources that are available on the web.






Appendix D: Data Specialisation in C++

C++ has a neat way to specialise the member functions of a base class: virtual functions. Often, however, it is necessary to specialise the data members also. Consider a Door class that references a Handle component with a pointer.

struct Handle {
  virtual void turn();
};

struct Door {
  Door() : handle( new Handle ) {}
  virtual ~Door() { delete handle; }
  void open() { handle->turn(); }
  Handle *handle;
};

Suppose we now refine Handle to produce a FancyHandle sub-class.

struct FancyHandle : public Handle {
  virtual void turn();
};

Now we wish to create a FancyDoor that makes use of a FancyHandle instead of a normal Handle. However, this cannot be done using virtual functions, since they cannot be called from within the constructor of Door. A crude solution would be as follows.


struct FancyDoor : public Door {
  FancyDoor() {
    delete handle;
    handle = new FancyHandle;
  }
};

This is not efficient though, because the Handle member will be created by the Door as usual and then deleted by the sub-class's constructor and replaced by a FancyHandle. What is needed is a means by which the Door constructor can be instructed not to create a Handle, as usual, but to accept an alternate Handle from the sub-class constructor.

struct Door {
  Door( Handle *_handle = 0 )
    : handle( _handle ? _handle : new Handle ) {}
  . . .
};

struct FancyDoor : public Door {
  FancyDoor() : Door( new FancyHandle ) {}
};

Now we have removed the inefficiency of needlessly creating a Handle instance. Note that the Door class has a default constructor that sets handle to new Handle as before, but now if we pass an alternate Handle instance to the constructor, it uses that instead. This syntax can be cleaned somewhat by using the following template definition.

template< class T >
struct Defined {
  Defined( T *_ptr ) : ptr( _ptr ? _ptr : new T ) {}
  operator T*() { return ptr; }
  template< class S >
  operator Defined< S >() { return Defined< S >( ptr ); }
private:
  T *ptr;
};

Now we can use a Defined< T > in place of a T* argument to a function when we want a T to be created if a 0 is passed to the function. For example:

struct Door {
  Door( Defined< Handle > _handle = 0 ) : handle( _handle ) {}
  . . .
};

struct FancyDoor : public Door {
  FancyDoor( Defined< FancyHandle > _handle = 0 ) : Door( _handle ) {}
};

Externally we can instantiate Doors and FancyDoors as normal without specifying arguments to the constructors, but the internal construction occurs such that a FancyDoor instantiates a FancyHandle and a Door instantiates a Handle.


Finally, as a safety and brevity measure we use the auto_ptr mechanism (as described in the 3rd edition of Stroustrup's "The C++ Programming Language") to store the Handle pointer. The final implementation looks like this:

struct Handle {
  virtual void turn();
};

struct FancyHandle : public Handle {
  virtual void turn();
};

struct Door {
  Door( Defined< Handle > _handle = 0 ) : handle( _handle ) {}
  void open() { handle->turn(); }
  auto_ptr< Handle > handle;
};

struct FancyDoor : public Door {
  FancyDoor( Defined< FancyHandle > _handle = 0 ) : Door( _handle ) {}
};
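As a short usage sketch of the pattern (hypothetical calling code, assuming the declarations above plus definitions for the two turn() functions):

// Each default constructor creates the appropriate Handle variant.
Door plain;        // internally creates a Handle
FancyDoor fancy;   // internally creates a FancyHandle

plain.open();      // calls Handle::turn()
fancy.open();      // calls FancyHandle::turn() through the same open()

// A yet fancier sub-class could pass its own handle type down in the
// same way, via Door's Defined< Handle > constructor argument.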




Appendix E: Reachin API and VRML nodes

VRML NODE                        Reachin API   API Differences
Anchor                           No
Appearance                       Yes           Reachin API adds surface field for haptic surface.
AudioClip                        No
Background                       Yes           Has only colour
Billboard                        No
Box                              Yes
BoxInertia
BumpmapSurface
ButtonSurface
CasheHints
Collision                        No
Color                            Yes
ColorInterpolator                Yes
Cone                             Yes
ConeInertia
Coordinate                       Yes
CoordinateInterpolator           Yes
Cylinder                         Yes
CylinderInertia
DeviceInfo
CylinderSensor                   No
DeformedCoordinate
DirectionalLight                 Yes
Dynamic
ElevationGrid                    No
FrictionImageSurface
FrictionalSurface
Extrusion                        No
Fog                              No
FontStyle                        No
GlutWindow
GraspableTransform
GravityDynamic
Group                            Yes
GroupInertia
HapticsDevice
HiddenlineGeometry
HollowSphereInertia
IFSInertia
ImageSurface
ImageTextFace
ImageTexture                     Yes
Import
IndexedFaceSet                   Yes
IndexedLineSet                   Yes
IndexedTriangleSet
Inertia
Inline                           Yes
Interpolator                     Yes
ITSInertia
LaparoscopicImpulseEngine
LayeredGroup
LOD                              No
Magnet
MagneticSurface
Material                         Yes
Membrane
MovieTexture                     No
NavigationInfo                   Yes
Normal                           Yes
NormalInterpolator               No
Optimised IFS
OrientationInterpolator          Yes
Panel
PhantomDevice
PiecewiseLinearStiffnessSurface
PixelTexture                     No
PlaneSensor                      No
PointLight                       No
PointSet                         No
PositionInterpolator             Yes
ProximitySensor                  No
PythonScript
RoughSurface
ScalarInterpolator               Yes
Scene
Script                           Yes           PythonScript Node
SetVariables
Shape                            Yes
SimpleSurface
SlotDynamic
Sound                            Yes           SoundSource Node
SoundBuffer
SoundSource
Sphere                           Yes
SphereInertia
SphereSensor                     No
SplitGeometry
SpotLight                        Yes
StereoInfo
StiffnessImageSurface
Surface
Switch                           Yes
Text                             No            ImageTextFace
TextureCoordinate                Yes
TextureTransform                 Yes
TimeSensor                       Yes
TouchSensor                      No
TrackingDevice
Transform                        Yes
TransformedCoordinate
TransformedInertia
VibratingSurface
Viewpoint                        Yes
Window
WireframeGeometry
VisibilitySensor                 No
WorldInfo                        No






References

C++

[1] "The C++ Programming Language", Bjarne Stroustrup, Addison Wesley, ISBN 0-201-88954-4, Copyright AT&T 1997.

This book is probably the standard C++ reference, written by Bjarne Stroustrup, the creator of C++. Its style is somewhat terse and the examples are often a little involved, but its descriptions are excellent and it is a very valuable reference.

[2] "The Waite Group's C++ Primer Plus", Stephen Prata, Waite Group Press, ISBN 1571691626, 1998.

This and the next book are very large, comprehensive references for C++ that describe almost everything in the language.


[3] "C++ Primer", Stanley Lippman & Josée Lajoie, Addison Wesley, ISBN 0201824701, 1998.

As said before, a very large and comprehensive reference for C++. Either of these books makes a valuable contribution to programming effectively in C++.

[4] "Effective C++ 2nd Edition & More Effective C++", Scott Meyers, Addison Wesley, ISBN 0201924889 & 020163371X, 1998 & 1995.

These two books provide great in-depth explanations of many of the subtleties and problems in C++. They are not for beginners, but for people who have an understanding of C++ they will reduce errors and improve the code that is produced.

Graphics

General Graphics

[5] "Computer Graphics: Principles and Practice", James Foley & Andries van Dam & Steven Feiner & John Hughes, Addison-Wesley, ISBN 0-201-84840-6, 1997.

This book is the bible for graphics programming and is a superb book. It is strongly recommended for anyone doing computer graphics.

[6] "Interactive Computer Graphics - A Top-Down Approach with OpenGL", Edward Angel, Addison-Wesley, ISBN 0-201-85571-2, 1997.

A strong graphics textbook that uses OpenGL to explain concepts, which is a good idea.

[7] "3D Computer Graphics 2nd Edition", Alan Watt, Addison-Wesley, ISBN 0-201-63186-5, 1993.

Another good graphics textbook. Well worth looking at. Good diagrams and explanations.


OpenGL

[8] "OpenGL Programming Guide: The Official Guide to Learning OpenGL, Version 1.1", Mason Woo & Jackie Neider & Tom Davis, Addison Wesley, ISBN 0201461382, 1997.

The standard guide to learning OpenGL; for programming OpenGL this is just about mandatory.

[9] "OpenGL Reference Manual: The Official Reference Document to OpenGL, Version 1.1", OpenGL Architecture Review Board & Chris Frazier & Renate Kempf, Addison Wesley, ISBN 0201461404, 1997.

This book defines the standard and is an excellent reference book.

VRML

[10] "The Annotated VRML 2.0 Reference Manual", Rikk Carey & Gavin Bell, Developers Press, ISBN 0201419742, 1997.

The standard reference for VRML. Not an introductory text but THE VRML reference.

[11] "The VRML 2.0 Sourcebook", Andrea Ames & David R. Nadeau & John L. Moreland, John Wiley and Sons, ISBN 0471165077, 1997.

Quite simply the best overall book on VRML and the only good introductory manual that we have found. The examples in the book are extensive and show how some of the features of VRML can be cleverly used. As a reference it is not as succinct as the previous VRML book, but it is still strong in this area.


Haptics

Haptics are a new technology. This means that there are few books that make any reference to them, let alone give thorough descriptions of how haptic rendering works. For gaining further understanding of haptic rendering, it is recommended that the internet is used. A number of useful links are listed here.

[12] "Haptics Community Web Page", http://haptic.mech.nwu.edu/

One of the better resources on the web for information on haptics.

[13] "Haptics Bibliography", http://marg.www.media.mit.edu/people/marg/haptics-bibliography.html

An extensive list of publications on haptic devices.

[14] "SensAble Technologies, Inc.®", www.sensable.com

SensAble Technologies, the manufacturers of the PHANTOM, have a good listing of haptics resources on the web.

[15] "Proceedings of the PHANTOM Users Group"

These contain a number of excellent technical articles describing haptic rendering. They are available from SensAble Technologies.

Python

There are not a large number of books on Python. However, the online reference material is excellent. It is strongly recommended that the online references are investigated.


[16] "Learning Python", Mark Lutz & David Ascher, O'Reilly and Associates, ISBN 1-56592-464-9, 1999.

A strong book that introduces Python easily and effectively.

[17] "Programming Python", Mark Lutz, O'Reilly & Associates, ISBN 1565921976, 1996.

This book is the only general reference book available on Python. The authors have found that the online references were a better source of information than this book.

[18] www.python.org

This is the online reference for Python. It is a really well organised and thorough reference.





