Modular design

The control software is divided into the following modules: interface, sensor, behaviour, command, control, speech and simulation.
A modular design was chosen in order to isolate code functionality into logical building blocks. The design used a tree structure, with top-level executable programs displaying the user interface and lower-level modules providing functionality as required. Error trapping was used at all levels to record any errors to a log file.
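The "error trapping at all levels" strategy can be sketched as follows. This is a minimal illustration, not the report's implementation: the decorator, function names and log file name are all assumptions.

```python
import logging

# Assumed log destination; the report only says errors go to a log file.
logging.basicConfig(filename="inkha_errors.log", level=logging.ERROR)

def trapped(func):
    """Wrap a module entry point so any error is logged rather than fatal."""
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            logging.exception("Error in %s", func.__name__)
            return None
    return wrapper

@trapped
def read_sensors():
    # Hypothetical module call that fails.
    raise RuntimeError("camera unavailable")

read_sensors()  # the error is logged and the program continues
```

Trapping at every level of the module tree means a failure in one building block degrades that block's output without bringing down the top-level executable.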
Interface module

This module is the executable that is run to start the program, and it is the only control system module that has a user interface. It displays the results of simulation and allows servo limits, servo speed information and any other parameters required by the other control system modules to be set and recorded. Experience of unattended installations of Inkha has shown that any user interface parameter that can be changed will be changed, so childproofing was added to the interface: all parameter setting is disabled unless a password is entered.
A timer-driven loop within the interface module calls the sensor module to read the camera, PIR and microphone statuses. A second timer-driven loop interrogates the sensor module periodically in order to retrieve a vector indicating the direction in which the head assembly should move and a colour which it should report. Should the cameras have moved recently, they are not re-sampled until the motors have stopped moving, to allow the image to settle: the ASC16 servo controller notifies the control software when motor movement is complete, whereas the SSC servo controller must use a time delay set from the user interface. There is no need for the cameras to settle before time-based actions such as relaxation occur, as these actions are not vision dependent.
The algorithm required, defined in pseudocode, is as follows:

    If Time - LastPIRMicrophoneTime > ColourTime
        Vector = PIR or microphone vector
        LastPIRMicrophoneTime = Time
    If camera is not moving
        Vector = read vector from sensor module
    If Time - LastColourReadTime > ColourTime
        Colour = read colour from sensor module
        LastColourReadTime = Time
    ProcessVector(Vector)
    DoTimedActions
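A minimal executable sketch of this polling logic follows. The sensor calls are stand-ins for the real module interfaces, and the two-second ColourTime threshold is an invented value:

```python
COLOUR_TIME = 2.0  # assumed sampling interval, seconds

class StubSensors:
    """Stand-in for the sensor module's interface."""
    def __init__(self):
        self.moving = True          # whether the camera motors are moving
    def pir_mic_vector(self):
        return (1, 0)               # vector derived from PIR/microphone
    def camera_moving(self):
        return self.moving
    def read_vector(self):
        return (0, 1)               # vector derived from camera motion
    def read_colour(self):
        return "red"

class Poller:
    """One tick of the interface module's timer-driven sampling loop."""
    def __init__(self, sensors):
        self.sensors = sensors
        self.last_pir_mic = 0.0
        self.last_colour = 0.0
        self.vector = (0, 0)
        self.colour = None

    def tick(self, now):
        # Use the PIR/microphone vector if it has not been sampled recently.
        if now - self.last_pir_mic > COLOUR_TIME:
            self.vector = self.sensors.pir_mic_vector()
            self.last_pir_mic = now
        # Only re-sample the camera once the motors have stopped,
        # so the image has settled.
        if not self.sensors.camera_moving():
            self.vector = self.sensors.read_vector()
        # Colour is read on its own, slower schedule.
        if now - self.last_colour > COLOUR_TIME:
            self.colour = self.sensors.read_colour()
            self.last_colour = now
        return self.vector, self.colour
```

While the motors are moving, only the PIR/microphone vector is updated; once they stop, the camera-derived vector takes over.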
For some commissions a touch screen is used, and the interface module calls into the speech module to give directions, what's-on information, etc. when the touch-screen buttons are pressed.
Sensor module

This module integrates the cameras, PIRs and microphone and displays the user interface for them. It allows parameters and thresholds for motion detection and colour recognition to be set and displayed. A mirror image of the image received by the camera is displayed, as are images showing detected motion and areas of attention, PIR indicators and a microphone level. The sensor module is called periodically by the interface module. It passes back a direction vector, indicating the direction in which the control system needs to move the head, and a colour, which is interpreted by the behaviour module and results in a statement issued using the speech module. Information about how the sensor module detects movement and colour can be found in the vision section.
Behaviour module

This module modifies sensory input according to user settings (the selected behaviour); it alters Inkha's responses to sensory input. The behaviours are:

activity: nonchalant, skittish (interprets interest and fright responses)
verbal: charming, tetchy, obnoxious (selects a speech-set from the database in response to activity)
mindset: astrological, fashion, factual (selects the speech-set for the hue in response to colour)
colour: density, average (alters the colour recognition algorithm)
Modifying vectors:
A direction vector passed from the sensor module is modified before it is sent on to the command module. For example, if the behaviour is nonchalant, then a "high level of interest at centre" vector passed from the interface module is converted to a "low level of interest at centre" vector before it is passed to the command module.
Modifying colour interpretation:
A hue is determined by the sensor module. For example, if the behaviour is astrological, then any given hue will produce a verbal comment related to astrology.
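Both kinds of modification can be sketched as simple lookups and scalings. This is illustrative only: the 0.5 damping factor and the mindset table are invented, and the real module works on the report's vector and hue representations rather than these simplified ones.

```python
# Assumed damping applied by the nonchalant activity behaviour.
NONCHALANT_DAMPING = 0.5

def modify_vector(activity, direction, interest):
    """Damp the interest level of a direction vector before it is
    passed on to the command module."""
    if activity == "nonchalant":
        # A high level of interest becomes a low level of interest.
        interest *= NONCHALANT_DAMPING
    return direction, interest

# Invented mapping from the selected mindset to the speech-set used
# to comment on a detected hue.
MINDSET_SPEECH_SETS = {
    "astrological": "astrology",
    "fashion": "fashion",
    "factual": "factual",
}

def speech_set_for_hue(mindset, hue):
    """Choose the speech-set used to comment on any hue."""
    return MINDSET_SPEECH_SETS[mindset]
```

With the astrological mindset selected, every hue routes to the astrology speech-set, regardless of which hue was detected.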
Command module

This module produces instructions for moving the head assembly in a lifelike way, according to the passed-in command vectors and its co-operation rules. It also caters for time-based movements such as relaxation (centring of the eyes), blinking and boredom, the periods of which can all be controlled from the user interface.
After experience of unattended installations, logic was added to control a relay so that Inkha's motors can be turned off whenever sleep mode is entered. This allows the motors to rest during unattended installations whenever there are no visitors. Furthermore, a working-hours facility was added so that the motors can be rested during given hours of the day.
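The working-hours facility reduces to a simple time-window check. The hours below are invented examples; the report does not state the actual schedule:

```python
from datetime import time

# Assumed working hours; outside them the relay cuts motor power.
WORK_START = time(9, 0)
WORK_END = time(17, 0)

def motors_enabled(now, visitors_present):
    """Motors run only during working hours and only while visitors
    are present (sleep mode rests them otherwise)."""
    in_hours = WORK_START <= now <= WORK_END
    return in_hours and visitors_present
```

Combining the two rules means the motors rest both overnight and during quiet periods within the working day.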
For each non-central vector received, the target eye position is incremented by one tenth of its total movement in the direction of the vector; this percentage was determined by experimentation. Co-operation logic is used to move the other axes and to limit the eye movement as required. Next, any relaxation is applied to the target positions before they are sent to the control module.
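The one-tenth stepping rule can be sketched in the normalised co-ordinate system (0 for the lowest or most anticlockwise position, 1 for the highest or most clockwise). The clamping to the axis limits is an assumption about how the limiting logic behaves:

```python
STEP_FRACTION = 0.1  # one tenth, determined by experimentation

def step_eye(position, direction, lo=0.0, hi=1.0):
    """Move the target eye position one tenth of its total movement in
    the direction of the received vector, staying within the limits."""
    step = STEP_FRACTION * (hi - lo)
    position += step if direction > 0 else -step
    return min(hi, max(lo, position))
```

Stepping a fixed fraction per received vector, rather than jumping straight to the target, is what makes the resulting motion gradual and lifelike.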
Colour recognition, boredom and blinking occur periodically, and rules were designed to govern blinking. For some commissions the command module also checks databases and the internet periodically for other information. Once the desired position of the head assembly has been determined by the command module, flow passes to the control module.

Control module

This module uses a normalised co-ordinate system, with a value of zero representing an axis at its lowest or most anticlockwise position and a value of one representing it at its highest or most clockwise position. It takes the target normalised position of each head assembly servo and converts it to the raw servo values that need to be sent, using the limit parameters that are passed in from the interface module.
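The normalised-to-raw conversion is a linear mapping between the two ranges. The raw limit values below (500..2500) are invented examples of the per-servo limit parameters passed in from the interface module:

```python
def to_raw(normalised, raw_min, raw_max):
    """Map a normalised position in 0..1 onto a servo's raw value range,
    clamping out-of-range input to the servo's limits."""
    normalised = min(1.0, max(0.0, normalised))
    return round(raw_min + normalised * (raw_max - raw_min))
```

Keeping the limits as passed-in parameters means the same conversion code serves every servo, whatever its mechanical range.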
It was implemented in such a way as to allow multiple servos to move simultaneously, and it is also possible to control the maximum speed of any servo. More information can be found in the control section.
Speech module

This module uses comment types passed in from any module, together with parameters provided by the behaviour module, to produce speech. For example, the command module may instruct the speech module to produce speech for the fourth time that it is bored, and the behaviour module will add to this that the speech should be obnoxious. This module also supplies a list of available voices in a format suitable for display in a list box, and allows the active voice to be selected. Information about how speech is produced can be found in the speech section.
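Selecting a phrase from a comment type, a verbal behaviour and an occurrence count can be sketched as a table lookup. The phrase table here is entirely invented; the real speech-sets live in the database:

```python
# Hypothetical speech-sets keyed by (comment type, verbal behaviour).
PHRASES = {
    ("bored", "obnoxious"): [
        "Go away.",
        "Still here?",
        "How dull.",
        "Leave me alone!",
    ],
}

def speak(comment_type, verbal_behaviour, occurrence):
    """Return the phrase for the nth occurrence of a comment type,
    cycling through the selected speech-set."""
    phrases = PHRASES[(comment_type, verbal_behaviour)]
    return phrases[(occurrence - 1) % len(phrases)]
```

For the example in the text, the fourth bored comment with obnoxious behaviour selects the fourth phrase of that speech-set.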
Simulation module

This module receives the positions of the various head assembly axes in normalised co-ordinates and produces a three-dimensional scaled graphical simulation of the assembly. The simulation can be displayed by the interface module and used to plot trajectories.

Accurate measurements of the size and position of the key elements of the head assembly were taken and stored as constants in the simulation module, so that they can be changed easily if the mechanism changes in future. Measurements were also taken of the limits of rotation of each axis. From these measurements a number of simulation primitives were constructed, each using a series of straight lines between sampled or geometrically created points.

The origin of each element's centre of rotation was calculated relative to the zero origin at the base of the neck. The sequence of transformations of each element around each origin was analysed, and the transformations were effected for each element using a series of homogeneous co-ordinate transformations. After the last transformation, the x and y co-ordinates of the resulting transformed straight lines had a scale and offset factor applied to map them to screen co-ordinates, and the lines were then plotted on screen using Visual Basic line drawing routines.
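The homogeneous-transformation pipeline can be illustrated in 2D for brevity (the report's simulation is three-dimensional). All numeric values below, including the scale and offset, are invented for illustration:

```python
import math

def mat_mul(a, b):
    """Multiply two 3x3 homogeneous transformation matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rotation_about(cx, cy, angle):
    """Rotate about the element's origin of rotation (cx, cy):
    translate to the origin, rotate, translate back."""
    c, s = math.cos(angle), math.sin(angle)
    to_origin = [[1, 0, -cx], [0, 1, -cy], [0, 0, 1]]
    rotate = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    back = [[1, 0, cx], [0, 1, cy], [0, 0, 1]]
    return mat_mul(back, mat_mul(rotate, to_origin))

def apply(m, x, y):
    """Apply a homogeneous transform to the point (x, y)."""
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

def to_screen(x, y, scale=100.0, ox=320.0, oy=240.0):
    """Scale-and-offset mapping from model to screen co-ordinates
    (screen y grows downwards)."""
    return ox + scale * x, oy - scale * y

# Rotate a line endpoint 90 degrees about an origin of rotation at (1, 0),
# then map the result to screen co-ordinates for plotting.
m = rotation_about(1.0, 0.0, math.pi / 2)
x, y = apply(m, 2.0, 0.0)
sx, sy = to_screen(x, y)
```

Composing the per-element transforms as matrices is what allows each primitive's points to be pushed through the whole chain of rotations in one multiplication per point.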