A systems-based, physiologically robust reference architecture for designing and refining interactive human-computer interface systems in ways that increase the operational throughput of information.
The purpose of this dissertation is to develop a systems model which is thought to be more representative of reality than traditional models, in that it is consistent with the phenomenological aspects of human perception.
The capacity of computers to receive, process, and transmit massive amounts of information is continually increasing. Current attempts to develop new human-computer interface technologies have given us devices such as gloves, motion trackers, and 3-D sound and graphics. Such devices greatly enhance our ability to interact with this increasing flow of information. Interactive interface technologies emerging from the next paradigm of human-computer interaction are directly sensing bio-electric signals (from eye, muscle, and brain activity) as inputs and rendering information in ways that take advantage of the psycho-physiologic signal processing of the human nervous system (perceptual psychophysics). The next paradigm of human-computer interface will optimize the technology to the physiology -- a biologically responsive interactive interface.
INTERACTIVE INFORMATION TECHNOLOGY
Interactive information technology is any technology which augments our ability to create, express, retrieve, analyze, process, communicate, and experience information in an interactive mode. Biocybernetics optimizes the interactive interface, promising a technology that can profoundly improve the quality of life of real people today. The next paradigm of interface technology is based on new theories of human-computer interaction which are physiologically and cognitively oriented. This emerging paradigm of human-computer interaction incorporates multi-sense rendering technologies, giving sustained perceptual effects, and natural user interface devices which measure multiple physiological parameters simultaneously and use them as inputs. Biologically optimized interactive information technology has the potential to facilitate effective communication. This increase in effectiveness will impact both human-computer and human-human communication: "enhanced expressivity."
Interactive interface technology renders content-specific information onto multiple human sensory systems, giving a sustained perceptual effect, while monitoring human response in the form of physiometric gestures, speech, eye movements, and various other inputs. Such quantitative measurement of activity during purposeful tasks allows us to quantitatively characterize individual cognitive styles. This capability promises to be a powerful tool for characterizing the complex nature of normal and impaired human performance. The systems of the future will monitor a user's actions, learn from them, and adapt by varying aspects of the system's configuration to optimize performance. By immersing the external senses and iterating interaction with biosignal-triggered events, complex tasks are more readily achieved.
This paradigm shift of mass communication and information technologies is providing an exciting opportunity to facilitate the rapid exchange of relevant information, thereby increasing the individual productivity of persons involved in the information industry. Areas such as computer-supported cooperative work, knowledge engineering, expert systems, interactive attentional training, and adaptive task analysis will be changed fundamentally by this increase in informatic ability. The psycho-social implications of this technologically mediated human-computer and human-human communication are quite profound. Providing the knowledge and technology required to empower people to make a positive difference with information technology could foster the development of an attitude of social responsibility towards the usage of this technology, and may be a profound step forward in modern social development. Applications which are intended to improve quality of life, such as applications in medicine, education, recreation, and communication, must become a social priority.
From the early days of computer programming, when each logical connection was "hard wired" by an army of technicians, to the time of punch cards, the concept of interaction with a computer had a very limited and specific interpretation. In the time between those early days and the late 70's, the ability of the computer to respond to a task given it by a human was, for the most part, limited by computation power. A great advance was made when the computer gained a "typewriter"-like mechanism which allowed computers to be programmed through "terminals". As the speed of the computer increased along with its capacity to respond to the human, it began to become apparent that the human's ability to convey intent to the computer, and the computer's capacity to display the results of its calculations to the human, would become a limiting factor in the "interaction" between humans and computers. Innovators at Xerox PARC, MIT, NASA, DARPA, and other computer research facilities began to rethink the concept of how humans and computers would be able to communicate more effectively (in this context the term "communicate" is used to mean the ability to intentionally exchange meaningful information). The first significant breakthrough to make it out of the lab was the "WIMP" (Windows, Icons, Mouse, Pointer) graphical user interface, referred to in the nerd zone as the GUI (pronounced "gooey"). In the late 70's, with the commercial release of Apple's "personal" computer, with its GUI, to the general public, this style of interaction began to reach a mass audience. Ultimately, the interface will map very closely onto the human body.
PHYSIOLOGICALLY ORIENTED INTERFACE DESIGN
Knowledge of sensory physiology and perceptual psychophysics is being used to optimize our future interactions with the computer. By increasing the number and variation of simultaneous sensory inputs, we can make the body an integral part of the information system, "a sensorial combinetric integrator". We can then identify the optimal perceptual state space parameters in which information can best be rendered; that is, which types of information are best rendered to each specific sense modality, "a sense-specific optimization of rendered information". Research in human sensory physiology, specifically sensory transduction mechanisms, shows us that there are designs in our nervous systems optimized for feature extraction of spatially rendered data, temporally rendered data, and textures. Models of information processing based on the capacity of these neurophysiological structures to process information will help our efforts to enhance perception of complex relationships by integrating visual, binaural, and tactile modalities. Then, by using natural bioelectric energy as a signal source for input -- electroencephalography, electrooculography, and electromyography (brain, eye, and muscle) -- we can generate highly interactive systems in which these biological signals initiate specific events. Such real-time analysis enables multi-modal feedback and closed-loop interactions.
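The idea of biological signals initiating specific events can be sketched as a simple threshold detector over a smoothed signal envelope. The following is only an illustrative sketch: the class name, threshold, and window size are assumptions for the example, not the actual signal processing used in this work.

```python
# Hypothetical sketch: mapping a rectified EMG sample stream to discrete
# trigger events, in the spirit of biosignal-initiated interface actions.
# Threshold and window values are illustrative assumptions.

from collections import deque

class EMGTrigger:
    """Fires one event when the smoothed EMG envelope crosses a threshold."""

    def __init__(self, threshold=0.3, window=5):
        self.threshold = threshold
        self.samples = deque(maxlen=window)  # sliding window for the envelope
        self.active = False                  # debounce: fire once per contraction

    def update(self, sample):
        """Feed one rectified EMG sample; return True when a new event fires."""
        self.samples.append(abs(sample))
        envelope = sum(self.samples) / len(self.samples)
        if envelope > self.threshold and not self.active:
            self.active = True
            return True
        if envelope < self.threshold:
            self.active = False
        return False

trigger = EMGTrigger(threshold=0.3, window=5)
stream = [0.0, 0.1, 0.9, 0.9, 0.9, 0.1, 0.0, 0.0, 0.0, 0.0, 0.8, 0.9]
events = [trigger.update(s) for s in stream]
```

In a closed-loop system, each fired event would drive a rendered response (sound, graphics, device motion), which the user perceives and reacts to in turn.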
The purpose of the reference architecture is to provide insight into the various components of the system in the context of how they might affect the flow of information as information is passed through them.
An understanding of human neurophysiology allows for exploitation of predictable adaptive capabilities.
The nervous system is the primary information infrastructure for humans.
The nervous system supports the transduction, transmission, representation of, and response to information in the environment.
The phenomenon of interest (perception) occurs at the anthroscopic scale.
Mind happens at an anthroscopic scale.
Anthroscopic scale: the natural scale of perceptibility of an individual human.
Anthroscopic epistemology is biased by neural systems.
There are three "state spaces" (a state space is a set whose members are defined by an n-tuple of values corresponding to the parameter values of a system):
One is the set of parameter values of a computational system.
One is information which is biologically/physiologically mediated.
One is information which is directly experienced.
Human perception is mediated, for the most part, by the nervous system.
The physicality of the nervous system constrains perception: space, time, mass, and energy.
The physiology of the human nervous system restrains perception: complexity, functionality, capacity.
The intra-activity of the nervous system sustains perception.
Thus the form and function of the nervous system influence various parameters of perception.
A state of any system is defined by the set of values which describe the condition of the system at any given point in time (the values of all the state vectors).
Perceptual state space: the set of all perceptual states.
Thus the perceptual state space is dynamically forged by combining the information of sensation with interpretative mental constructs.
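The notion of a state as an n-tuple of parameter values, and of a state space as a set of such tuples, can be made concrete in a few lines. The parameter names below are hypothetical examples chosen only to illustrate the formalism.

```python
# Illustrative sketch: a state as an n-tuple of parameter values, a state
# space as a set of such states, and a subspace selected by a condition.
# Parameter names are hypothetical, not taken from the dissertation.

from collections import namedtuple

# One perceptual state: the values of all state vectors at an instant.
PerceptualState = namedtuple(
    "PerceptualState",
    ["visual_intensity", "auditory_pitch", "tactile_pressure"])

s1 = PerceptualState(0.8, 440.0, 0.1)
s2 = PerceptualState(0.2, 220.0, 0.9)

# A (finite sample of a) perceptual state space: the set of all such states.
state_space = {s1, s2}

# A subspace selected by a condition -- cf. perceptual state space
# modulation, which biases which subspace will contain the next states.
high_pitch_subspace = {s for s in state_space if s.auditory_pitch > 300.0}
```

Each member of the set is one momentary state; modulation, in these terms, is any act that changes which subspace the next states are drawn from.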
Mind happens.
Mental constructs comprise the contents of consciousness.
Mental constructs are forged by integrating the information of experience and sensation.
A neuro-computational matrix is responsible for the actual function of forging mental constructs.
A notational system has been derived which represents the flow of information between the environment and the neurally mediated experience of consciousness.
This formal, descriptive notational system will enable the creation of "most probable maps" of information flow between humans and their environment.
A perceptual state space is the set of all experienceable conscious awareness.
Perceptual dimensions are experienced with varying levels of awareness.
A perceptual state is a unique momentary experience of conscious awareness.
A perceptual state may be comprised of several experiential dimensions.
An experiential dimension is a specific, undecomposable quality.
A given mental condition can bias the occurrence probabilities of emerging perceptions.
Emerging perceptions can be dynamically influenced, thus influencing the quality of the experienced states.
Perceptual state space modulation is the intentional act of influencing emerging perceptual states such that the perceptual state space which will contain the next series of perceptual states is a specific subspace.
The physio-informetrics of the neural info-matrix determine the through-flux.
Information is distinct from the energy which conveys it.
The nervous system is the primary tissue for sustaining consciousness.
The nervous system couples the phenomena of experience to the environment.
The exchange of that information can be abstracted.
The flow of information can be parameterized by temporal, spatial, and ergo-dyno-morphic flux.
Any state of information is characterized by a specific state-space parameter set.
The nervous system's capacity to transduce, transmit, characterize, experience, and respond to information about environmental conditions limits the knowability of the environment.
Theoretical construct:
A notational system which exploits interdisciplinary interaction.
A languaging system which can classify emerging observations.
Phx -- life
Phi -- physics
Khi -- synthetic
Sky -- rendering
Sns -- transductive sensing
Med -- healing intent
Edu -- knowledge transfer
Com -- directed communication
Rec -- recreational enrichment
Physio-informetrics: the quantitative measure of the information-carrying capacity of a physiologic system.
Physiologically mediated information exchange between the external environment and experiential awareness.
Both the physicality and the physiology contribute to the set of bio-physical restraints.
Demonstration of an integrated system of human-to-computer input devices with multisensory rendering systems.
Demonstration of an interactive, experiential environment optimized for intelligent interaction with information.
Anthrotronic systems designed for interactive information exchange continue to evolve.
Applying first principles of physio-informetrics facilitates design innovation for operational refinement of the evolving interface system.
The development of hardware and software systems based on principles derived from the reference architecture, and the implementation of such systems in real-world settings.
Integrating these devices into a single interface system, and demonstrating data fusion to enable meaningful correlations across various input modalities, will significantly enhance progress toward this end.
Viewing
the entire body as a perceptual and expressional technology opens up
possibilities for exploiting the heretofore untapped richness and greater
volumetric potential of its informatic capacities. Hence, we propose to develop
an interactive environment incorporating new ways to render complex information
to the user by optimizing the interface system to match the human nervous
system’s ability to transduce, transmit, and render to consciousness the
necessary information.
We assessed the limits of usability of traditional input devices such as mice, joysticks, and keyboards to determine when interface complexity precludes their use as primary inputs.
We researched an integrated system of human-to-computer input devices with multisensory rendering systems.
We researched an interactive, experiential environment optimized for intelligent interaction with information.
Such systems have been developed and refined.
ASHLEY
The main focus of the case examples will be to show the utility of this reference architecture, and especially to show the capacity for enhanced perceptibility and expressibility that may be achieved through an intelligent application of this physiologically robust interface systems model.
Biologically mediated
Externally apparent
Diagrammatic info-flow
Any state of information can be represented as a point in a complex hyper-dimensional state-space
Expressional formulation
Derivational symbology
Notational examples
Developed experimental systems of instrumentation and software
QSI-AVS
Glove talker generations:
Introduction -- incl. overall design philosophy
Neat DOS -- first generation
JOJO
BEC
NEAT
JOJO
GUO
Really neat
JOJO
EJ
and similarly one on hardware:
After all of the technological success and potential of these 1990 and 1991 efforts, 1992 brought some new difficulties. Dr. Will, who headed the Neurological Research Group at Loma Linda, was appointed as Dean, and the research group disintegrated. Space became an issue, as well as personnel, but eventually some space was made available in a new building, the Outpatient Rehabilitation Center. Dave Warner, still a medical student, was in charge of the new area, which was called the Human Performance Institute. Here Dave and his new group tested a sound chair created in Finland by a company called Next Wave. The chair helped in relaxing spastic muscle groups by low-frequency sonic stimulation. Dave made a point of giving frequent lectures on this project, as well as on the work with BioMuse -- at conferences, at universities, and at hospitals. The work was publicized on TV programs and in other media. In a process that was to be repeated frequently, the TV coverage attracted the attention of a highly talented individual, a programmer named Jo Johansen, who approached the group at a conference in 1994 and volunteered to work without pay on software for controlling the chair. Three engineers from Walla Walla University in Washington similarly responded to a talk by Dave Warner; the presentation moved them to work on altering a Radio Shack remote-control car so it could be controlled by a disabled individual through the BioMuse system. Jo Johansen then altered the chair software so it would take in signals from BioMuse and send commands through the parallel port of a computer to control the car. Disabled kids could have fun doing this, and rehabilitation patients could use the technology to exercise arm muscles in therapy sessions. Still, these useful capabilities depended on the expensive BioMuse as a middleman between the sensors on patients and the computer. And most individuals could not afford the machine.
The first breakthrough in this impasse came in 1995, when Salomo Murtonen, the Finnish inventor of the sound chair, came to America to volunteer for the project. A self-taught electronics engineer, Salomo committed himself to create the equivalent of the BioMuse device at low cost. At first, he worked for next to nothing, since the group had no substantial source of funds to pay its volunteers. Salomo created a four-channel EMG interface that could take any signal derived from muscle movements into the computer. The device was named TNG-1 (Thing 1, from The Cat in the Hat by Dr. Seuss); TNG is short for Totally Neat Gadget. Salomo produced TNG-1 with Radio Shack parts for a cost of $200, far less than the cost of BioMuse.
In 1994, the group had begun to work with a seven-year-old girl named Ashley Hughes. As a result of a broken neck during birth, Ashley is a C1 quadriplegic, paralyzed from the neck down and dependent on a respirator for breathing. She could move facial muscles, and TNG-1 could transmit the EMG signals from her facial movements to her 286 computer. But then software was needed to make those signals meaningful to the computer, and to display them on the screen. Jo Johansen wrote a program called BioEnvironmental Control (BEC) to make the EMG gesture signals usable for the computer, and to convert these to graphical outputs. BEC was designed for Ashley's facial capabilities; it allowed her to express herself in rich and complex ways, using her body as a way to control a computer. With these powerful technologies, Ashley played computer games, drove a remote-controlled car, experienced her world remotely through a camera and microphone mounted on a styrofoam structure named Cindy Cyberspace, and interacted with others in her environment. Now the group had created both affordable hardware and software. But though TNG-1 was inexpensive, it was not free of problems. For one thing, the electrodes that detected the muscle movement were not stable, and setting up TNG-1 was not easy for the families -- it could not just be left with them. Further development was needed to make the technology more stable and easier for family members to use.
Besides Jo Johansen and Salomo Murtonen, a host of other volunteers committed themselves to this project of developing state-of-the-art software and hardware to improve conditions of life for the disabled. At least 20 individuals joined the project in California, including a bright physicist named Markus Schmidt from Germany, who read about the effort in a German article. Some volunteers had solid professional credentials. Others were young students in high school or college. Some had college degrees, but no career plans. They either read about the project, learned about it on TV, or encountered it at a conference or a talk; in some cases, their parents found out about it and steered them to join. Until the summer of 1993, they could not be paid. Beginning that summer, Dave Warner began to rent a group house where the volunteers could live and work and interact. And slowly, as monies were available, he began to pay some salary and expenses. When Dave completed his medical degree in 1995, he accepted a Nason Postdoctoral Fellowship at Syracuse University, and many of the volunteers came with him. The tradition of the group house, where volunteers live rent free and work together, has continued in Syracuse. In both California and Syracuse, the group houses were given the same name: Center for Really Neat Research. Both have operated as development and demonstration lab environments. They have applied the newest developments in technology to show the viability of new concepts to help the disabled. Typically, the new commercial technologies are too expensive for use with the disabled population. The lab, once having tested that a concept could work, then marshals its efforts to create a powerful, inexpensive version. But the lab does not serve as a clinical facility or as a marketing facility. The group shows the viability of a new technology by testing it with some disabled patients. They then publicize the development at conferences or through talks; with grant funding, they have formed some partnerships with selected national facilities that use these technologies in clinical situations with large numbers of patients.
With the development of the TNG-1 interface device and the BEC software in California, the capability to interact with a computer was now affordable for the disabled. But TNG-1 depended on the use of EMG sensors attached to the faces of quadriplegic patients, where they had some muscle control. Whether they were mounted on cheeks or foreheads, the EMG sensors would not work for very long. They didn't stick to the face well, coming loose easily with repeated movement. To get around the problem posed by dependence on EMG sensors, Salomo went to work to create a more flexible interface device that could receive a variety of types of sensory signals. TNG-2 could detect changes in light signals resulting from cheek motions, since such a motion would distort the light path to a light receptor and thus create a signal. The light sensors did not have to be attached to the patient's skin, as did the EMG sensors; they could be mounted from a hat, a helmet, or, most recently, from a pair of glasses. But light was only one possible type of signal detectable by TNG-2, which was constructed to receive up to four general-purpose analog inputs. The use of light signals had advantages over the EMG approach, but raised new problems. Because photocells change their signal levels when the brightness level changes in a room, the use of light signals required frequent calibration, often beyond the capacities of a disabled individual's family. The group then experimented with signal sources other than light -- by using Hall effect transducers to detect changes in magnetic field, and by using pressure transducers. Other approaches include using bend sensors or tilt sensors. For Ashley Hughes, for instance, the group had been able to use a tilt sensor on her head along with a pointing stick attached to her head. With this combination, she could use a screen keyboard to type.
TNG-1 and TNG-2 could each convey four channels of information to NeatTools. In real terms, this meant that the system was limited to information from, say, four muscles, or from four light detectors. Otherwise the user would have to depend on two TNG devices at once -- thus requiring two available ports on the computer. With the creation of TNG-3, 16 channels became available: 8 analog and 8 digital. As of January 2000, the group is testing a working prototype of TNG-4, which has eight analog and 16 to 20 digital lines, each of which can serve as input or output. This increased capability will mean that the parallel port can be left free for other functions, such as connecting a printer or a Zip drive. TNG-3 and TNG-4 have been developed by Edward Lipson and Paul Gelling at Syracuse University.
Similarly, as computer technology has advanced, the group has been working on creating new iterations of the software, to take advantage of the capabilities that the newest developments of technology will allow. In 1993, the BEC software was renamed Neat; it was written for the MS-DOS operating system, then common for PCs. In 1996, a version of this software was created for the Windows 95 PC environment; and in late 1996, work began on another version based on a Java-like environment. This is the current version, called NeatTools. It is suitable for virtually any computer system: it can run on Windows 95, 98, or NT; it can run on Unix, Irix, or Linux; and it will be able to run on Macs upon the availability of a multithreaded operating system. It can interface with a much broader range of devices. Its capabilities include both Internet connectivity and multimedia sound. A user can simultaneously develop, edit, and execute programs in NeatTools.
TNGs and widgets
Intro -- design philosophy; microcontroller technology; ...
TNG-1 -- EMG
TNG-2 -- 4 analog input channels
TNG-3 -- 8 analog and 8 digital channels
TNG-4 -- ~32 channels, bi-directional
Widgets -- sensors and transducers
Representative Applications
Universal Interfacing System for Interactive Technologies in Telemedicine, Disabilities, Rehabilitation, and Education
Edward Lipson (1), David Warner (2), and Yuh-Jye Chang (3)
(1) Department of Physics and (1-3) Northeast Parallel Architectures Center, Syracuse University, Syracuse, NY 13244; and (1,2) MindTel LLC, 2-212 Center for Science and Technology, 111 College Place, Syracuse, NY 13244
A modular
hardware and software system for human-computer interaction is described that
allows for flexible, affordable interfacing of people, computers, and
instruments. The approach is illustrated with an application in the
disabilities area. Other application areas are outlined.
Emerging
methods for human-computer interaction [HCI; 1] offer revolutionary
opportunities to advance healthcare and quality of life, particularly as the
power, functionality, and affordability of computers continues to soar. In
particular, the advent of wearable computers calls for new types of interfaces,
since the users are typically not desk-bound. Further, for people with
disabilities who are unable to use a keyboard and/or mouse, the need for
alternative interfaces is compelling. Clinical environments can enjoy improved
efficiencies and outcomes, as new ways evolve to interface patients,
caregivers, and instruments to computers and networks.
Our group has been developing powerful, low-cost technologies combining modular software and hardware that accommodate expressional gestures and perceptual modalities as essential parts of the interface. These systems allow for adaptive rapid prototyping in which practically any input to the computer can be mapped to appropriate actions and outputs.
The
NeatTools visual-programming environment allows rapid prototyping and
implementation of HCI and other dataflow applications, in conjunction with
custom sensors, mounting hardware, computer interface boxes (TNGs), and
clinical/scientific instruments.
NeatTools
constitutes a visual-programming and runtime environment that produces
fine-grain dataflow networks for data acquisition and processing, gesture
recognition, external device control, virtual world control, remote
collaboration, and perceptual modulation. The design goals of NeatTools have
been to make it simple, object-oriented, network-ready, robust, secure,
architecture neutral, portable, high-performance, multithreaded, and dynamic.
The program and representative applications are downloadable from http://www.pulsar.org/. NeatTools can readily
accommodate custom interface devices, or commercial devices including clinical
instruments. Figure 1 shows two simple NeatTools programs. For a full-fledged
application program, see the section below on the JoyMouse Network.
NeatTools is written in C++ but built on top of a thin-layer Java-like cross-platform C++ application programming interface (API), which operates presently on Windows 95/NT, Unix (Sun), Irix (SGI), and Linux. In due course, Macintosh will be supported, once its multitasking, multithreaded operating system is released (note that it can run provisionally on a Mac-based PC-simulator, such as Connectix Virtual PC(TM)), along with appropriate C++ development tools.
Currently, NeatTools includes serial, parallel, and joystick port interfaces; multimedia sound; MIDI (Musical Instrument Device Interface) controls; recording and playback; Internet connectivity (sockets, telephony, etc.); various display modalities including for time signals; time generation functions; mathematical and logic functions (including a state machine module); character generation; and a visual relational database system including multimedia functionality. Keyboard and mouse events can be received or generated via Keyboard and Mouse modules. This allows, among other things, the user to control a graphical user interface by alternative input devices that in effect simulate keyboard and mouse events. Data types in NeatTools include integer, real, string, block, byte array, MIDI event, and audio or video streams. NeatTools allows the visual programmer to package a dataflow network inside a container module that constitutes a reusable "complex module" with simple overt appearance. This procedure can be iterated to accommodate several layers of hidden complexity.
NeatTools modules provide multithreaded, real-time support. Editing and execution are active concurrently, without need for compilation steps. This generally accelerates system design, and facilitates rapid prototyping and debugging. To construct a dataflow network, the user drags and drops modules (objects) from toolboxes to the desktop and then interconnects them with input/output and control/parametric lines. Properties of the desktop and many of the modules are set via a right-mouse-click. In this way, users are in effect developing elaborate interface programs without having to know C++ or the fundamental structure of NeatTools, or indeed having to write any textual program code at all. On the other hand, the system is open, so that experienced programmers can develop external modules by following instructions in an online developer’s kit. External modules can be loaded into the system at runtime, or arranged to preload automatically. The NeatTools executable development program, while massive in terms of source code (~40,000 lines of C++), is compact; the downloadable compressed archive file is about 600 kilobytes in size, so it easily fits on a diskette along with a compressed archive (under 100 kilobytes) of representative "*.ntl" files.
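The fine-grain dataflow idea described above -- modules with inputs and outputs, wired together, propagating values as they arrive -- can be sketched in a few lines. NeatTools itself is written in C++; this toy Python sketch, with invented class names, only illustrates the execution model, not the actual NeatTools architecture.

```python
# Toy sketch of a fine-grain dataflow network in the spirit of NeatTools:
# modules expose inputs/outputs; connecting an output to an input causes
# values to propagate immediately as they change. Names are invented.

class Module:
    def __init__(self):
        self.listeners = []          # downstream (module, input_name) pairs

    def connect(self, target, input_name):
        self.listeners.append((target, input_name))

    def emit(self, value):
        for target, input_name in self.listeners:
            target.receive(input_name, value)

class Threshold(Module):
    """Outputs 1 when its input exceeds a level, else 0 (a logic-style module)."""
    def __init__(self, level):
        super().__init__()
        self.level = level

    def receive(self, name, value):
        self.emit(1 if value > self.level else 0)

class Probe(Module):
    """Records everything it receives (stands in for a display module)."""
    def __init__(self):
        super().__init__()
        self.values = []

    def receive(self, name, value):
        self.values.append(value)

# Wire: sensor -> threshold -> probe, then push samples through.
thresh, probe = Threshold(level=0.5), Probe()
source = Module()
source.connect(thresh, "in")
thresh.connect(probe, "in")
for sample in [0.2, 0.7, 0.4, 0.9]:
    source.emit(sample)
```

Because the network executes as data arrives, editing the wiring while the network runs -- as NeatTools allows -- does not require a separate compile step.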
The
system hardware consists of mounting components, sensors, serial interface
boxes, computer, and optional output interfaces and devices. Our current
electronic interface module (TNG-3; www.mindtel.com/mindtel/anywear.html)
accommodates up to 8 analog and 8 digital (switch) sensors and streams the data
at 19,200 bits per second to the serial port of a computer. Connections are
made via standard stereo and mono plugs. The heart of TNG-3 is a programmable
microcontroller integrated circuit [2], a type of computer-on-a-chip commonly
used in industrial and office automation, and in automotive, communication, and
consumer electronics under the general rubric of embedded control systems. The
microcontroller in TNG-3 is programmed in assembly language for optimal
performance. TNG-3 requires no batteries or wall transformer, as it derives 5
volt power for the onboard circuitry and sensors (requiring only modest power)
by exploiting some of the unused serial-port lines—a technique commonly used to
power a serial mouse on a PC.
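Decoding such a serial stream into channel values is straightforward once a frame boundary is found. The actual TNG-3 wire protocol is not documented in this text, so the byte layout below (a sync byte, eight analog bytes, one packed digital byte) is an assumed, hypothetical framing used only to illustrate turning the 19,200-bps stream into 8 analog and 8 digital channel values.

```python
# Hypothetical sketch of decoding a TNG-3-style serial frame. The layout
# assumed here (0xFF sync byte, 8 analog bytes, 1 packed digital byte) is
# illustrative only; the real TNG-3 protocol may differ.

SYNC = 0xFF

def parse_frame(frame: bytes):
    """Decode one assumed 10-byte frame into (analog_list, digital_list)."""
    if len(frame) != 10 or frame[0] != SYNC:
        raise ValueError("malformed frame")
    analog = list(frame[1:9])                              # 8 channels, 0-255
    digital = [(frame[9] >> bit) & 1 for bit in range(8)]  # 8 switch states
    return analog, digital

# Example: analog channels 0 and 7 active, switches 0 and 3 closed.
analog, digital = parse_frame(
    bytes([SYNC, 200, 0, 0, 0, 0, 0, 0, 180, 0b00001001]))
```

A real decoder would also need to resynchronize on the sync byte within a continuous stream; that detail is omitted here for brevity.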
Among
the sensors we have used are switches, cadmium-sulfide (CdS) photocells, Hall
Effect transducers (magnetic sensors), rotary and linear-displacement potentiometers,
bend sensors, piezo film sensors, strain gauges, and custom
electroconductive-plastic pressure sensors. Most of these sensors are
inexpensive, some costing under a dollar and some costing but a few dollars.
Certain types (Hall Effect and capacitive) require preamplifiers and/or signal
processing electronics, which increase the cost, but not unduly.
The
most substantive technical result of our work to date is the development of the
NeatTools system along with the TNG interfaces and sensors, as described above.
We have begun to apply the core technologies in a number of key application
areas.
For
illustration, we describe the types of systems we developed for Eyal Sherman, a
member of our team who is a brainstem quadriplegic, unable to move his head or
to vocalize. He is currently a senior at Nottingham High School in Syracuse. We
have enabled Eyal to precisely control mouse motion, and thereby control
graphical user interfaces, such as Windows 95. Eyal and his family have
achieved independence in using this system; his mother is able to set up the
hardware and software routinely in a matter of minutes.
The primary interface device is a chin joystick,
extracted from an inexpensive game controller, mounted to a curved support rod,
which is clamped in turn to the wheelchair headrest post, thereby allowing the
device to be rotated away when not in use. To allow easy mounting and
adjustment of sensors near Eyal’s expressive facial regions—mainly cheeks and
forehead—an industrial designer on our team, Michael Konieczny, built
lightweight adjustable mounts that attach to eyeglasses. Currently we are using
small switches as the expressional sensors, but we have also used Hall Effect
transducers (together with tiny rare-earth magnets) and photocells to detect
facial gestures.
An
application program demonstrating the considerable power of NeatTools is the
JoyMouse dataflow network (Fig. 2), which Eyal and other youngsters with
quadriplegia have been using with good results. For details, manual, images,
and downloads, see http://www.pulsar.org/neattools/edl/joymouse_docs/JoyMouseManual.html.
The network uses a modest fraction of the channel capacity of TNG-3 (2 of the 8 analog inputs and, currently, 3 of the 8 digital inputs). The JoyMouse application uses
advanced features of NeatTools including logic gates, multiplexers and
demultiplexers, encoders and decoders, various timing and mathematical
operations, and sockets (here in "localhost" mode so that two windows
on the same platform can communicate). The network is shown here both in developer
mode and in user mode, wherein editing is blocked and only essential
regions of the network are visible.
Figure 2 includes a graph of the available relationships between mouse-cursor velocity and analog-joystick displacement. For all three functions, there is a dead band, or free-play zone, near the origin so that the mouse cursor is not subject to jitter when the joystick is physically at rest. The linear relation offers essentially proportional control. The nonlinear relations—quadratic (necessarily inverted for negative displacement) and cubic—offer fine control for up to about half-maximal displacement, and rapid travel with larger displacements. In most applications, the cubic function offers the best performance. Various parameters (pertaining to gain, resolution, etc.) can be set or modified using sliders while remaining in user mode.
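The dead band and the three transfer functions described above can be sketched as follows. This is a minimal Python illustration; the function name, the default dead-band width, and the gain value are assumptions, not the actual JoyMouse parameters.

```python
def joystick_to_velocity(x, mode="cubic", dead_band=0.1, gain=10.0):
    """Map a normalized joystick displacement x in [-1, 1] to cursor velocity.

    A dead band near the origin suppresses jitter when the stick is at rest;
    the quadratic and cubic curves give fine control at small displacements
    and rapid travel at large ones. All parameter values are illustrative.
    """
    if abs(x) <= dead_band:
        return 0.0                                    # free-play zone: no motion
    s = (abs(x) - dead_band) / (1.0 - dead_band)      # rescale to (0, 1]
    if mode == "linear":
        v = s                                         # proportional control
    elif mode == "quadratic":
        v = s ** 2
    elif mode == "cubic":
        v = s ** 3
    else:
        raise ValueError("unknown mode")
    # Restore sign for negative displacement (the "inversion" noted above).
    return gain * v if x > 0 else -gain * v
```

At roughly half-maximal displacement the cubic curve moves the cursor at only about an eighth of full speed, which is what gives the fine control described above, while full deflection still reaches full speed.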
The network also accommodates input from three switches: a) a left cheek switch for left-mouse-button; b) a variable-use right-cheek switch for right-mouse-button, enter-key or backspace-key; and c) a forehead switch to dynamically select action mode of the right-cheek switch. Alternatively, the switches could be replaced with analog sensors for which thresholds would be set with sliders within the JoyMouse program. Calibrator modules are included in the JoyMouse program, as in many other NeatTools programs, to automatically adjust to the signal range for analog inputs.
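A calibrator of the kind mentioned above can be sketched as a running min/max normalizer. This is a hypothetical Python analogue of the NeatTools calibrator module, not its actual implementation.

```python
class Calibrator:
    """Auto-ranging normalizer: tracks the minimum and maximum raw values
    seen so far and maps each new reading into [0, 1]. Illustrative only."""
    def __init__(self):
        self.lo = None
        self.hi = None

    def update(self, raw):
        self.lo = raw if self.lo is None else min(self.lo, raw)
        self.hi = raw if self.hi is None else max(self.hi, raw)
        if self.hi == self.lo:
            return 0.0          # no range observed yet
        return (raw - self.lo) / (self.hi - self.lo)

cal = Calibrator()
cal.update(50)    # sensor's observed minimum so far
cal.update(250)   # observed maximum so far
mid = cal.update(150)
```

Because the range adapts automatically, the same network works without manual adjustment whether a given sensor happens to swing over its full range or only part of it.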
The network as shown can be minimized, once the Enable button has been activated, so that the operating system desktop becomes fully available to other application programs while the JoyMouse runs in background. An optional small satellite window (Fig. 2), a related NeatTools application, can remain visible to display the state of essential options that are under dynamic control of the user; this is made possible by using socket modules to communicate locally between the JoyMouse main window and satellite window. The user can toggle, for example, between mouse click and drag modes by a "smile" gesture (both cheek switches activated for 1 second).
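The smile-gesture toggle can be sketched as a small state machine. The 1-second hold time and the click/drag modes come from the description above; the class and method names, and the starting mode, are assumptions for illustration.

```python
class SmileToggle:
    """Toggle between mouse click and drag modes when both cheek switches
    are held simultaneously for `hold_s` seconds. Illustrative sketch."""
    def __init__(self, hold_s=1.0):
        self.hold_s = hold_s
        self.start = None        # time at which both switches closed
        self.fired = False       # has this hold already toggled?
        self.mode = "click"      # assumed starting mode

    def update(self, left, right, t):
        """Feed switch states and a timestamp; returns the current mode."""
        if left and right:
            if self.start is None:
                self.start = t
                self.fired = False
            elif not self.fired and t - self.start >= self.hold_s:
                self.mode = "drag" if self.mode == "click" else "click"
                self.fired = True    # one toggle per continuous hold
        else:
            self.start = None
        return self.mode

g = SmileToggle()
g.update(True, True, 0.0)   # smile begins
g.update(True, True, 0.5)   # still held, under a second
mode = g.update(True, True, 1.0)  # held a full second: mode toggles
```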
By using the JoyMouse in conjunction with low-cost commercial utility programs, including an onscreen keyboard (Fitaly™ from Textware Solutions, sometimes with their InstantText™ program for word/phrase prediction and abbreviation expansion), Eyal has been able to a) type text, b) generate speech, c) dial in to a server, d) invoke and use Web browsers and other application programs, e) compose and send e-mail messages, f) play video games alone or with others, g) operate remote-controlled cars, h) draw sketches, and i) participate in science experiments and data analysis at school. Other people with severe disabilities, for example children with cerebral palsy, have also used the JoyMouse system and other applications successfully.
NeatTools
has many possible applications and roles in the education arena. We have
mentioned the use of NeatTools to allow students with disabilities to
participate actively in science laboratory activities. More generally,
NeatTools lends itself well to student projects in the classroom, laboratory,
science fairs, etc. Moreover, NeatTools can be used for training and
prototyping in an industrial or community college setting. Because NeatTools
can accommodate diverse external modules, the environment can be adapted to a
wide range of simulation applications, notably in medicine. With the increasing
use of sophisticated technology in healthcare, environments like NeatTools can
be expected to play an increasing role in practice and in training of
healthcare practitioners. Medical students, interns, and residents can benefit
from the rapid prototyping capability and flexibility of NeatTools. While prior
programming experience is clearly of benefit for those who wish to write
applications in NeatTools, it can serve, on the other hand, as a training
ground for practitioners and others who want to get their feet wet in
programming before learning conventional languages like C and C++. The immediacy of results in this visual programming/runtime environment, with no need to cycle through edit/compile/execute steps, is a clear advantage.
In limited testing, we have observed that schoolchildren are often able to grasp the essentials of NeatTools programming quite rapidly. For example, at SIGGRAPH 98 in Orlando, a number of schoolchildren came to our exhibit in the sigKIDS area. Typically, after the first daytime session, they downloaded NeatTools at home the first evening, proceeded to develop applications of their own, and then returned to our site the following morning to continue their programming and obtain more advanced training. Some of the programs they wrote were quite remarkable.
In
the rehabilitation field, our devices have been used for monitoring range of
motion (for example, at an elbow or knee joint) during exercises, as well as
other aspects of human performance. Our systems are currently in use at two
rehabilitation centers, namely the Sister Kenny Institute at Abbott
Northwestern Hospital in Minneapolis and at East Carolina University Medical
Center. They are currently being implemented at the Extended Care Facility of
Oneida City Hospitals in Oneida, NY in a context focused more specifically on
monitoring of care of residents.
Development
of external modules for digital signal processing, digital image processing,
and a host of other advanced modalities will expand the scope of NeatTools for
clinical applications, basic research, and education and training. Areas of
telemedicine that we anticipate would be well served by NeatTools include
telerehabilitation, teleradiology, and general remote patient monitoring,
including home healthcare, particularly for the elderly still living at home
but in need of continual observation. NeatTools already includes a module for the Welch Allyn Vital Signs Monitor™. The Internet socket feature of NeatTools, in conjunction with its audio (and soon video) codec, recording, and database functions, already provides base functionality for telemedicine applications.
Another
new project area for our HCI technologies concerns landmine detection and
related applications involving wearable computers and distributed robotics (our
BotMasters project, funded by DARPA). NeatTools and interfaces like ours can
facilitate the signal processing and alerts in such critical real-time
environments. Given the scourge of 100 million landmines on our planet, often
from conflicts settled long ago, we hope that our technology can help reduce
this nightmare while affording maximal safety to those engaged in this
dangerous task.
Our work is based on a
systems approach wherein we have developed modular HCI hardware and software
that is customizable, scalable, and extensible. Although most of the core
functionality is in place, NeatTools remains under development. Improvements in
the visual interface for the end user are needed. Expanding and enhancing the
documentation is now a major priority. Much of the functionality and design of
our software and hardware has been introduced according to the real needs of
users like Eyal, and this will continue as these systems evolve.
References
[1] B. Shneiderman, Designing the User Interface: Strategies for Effective Human-Computer Interaction, 3rd ed. Addison-Wesley, Reading, MA, 1998. ISBN 0-201-69497-2.
[2] J. B. Peatman, Design with PIC Microcontrollers. Prentice Hall, Upper Saddle River, NJ, 1998. ISBN 0-13-759259-0.
The goal is to extend these environmental control systems into new methods of investigative research, such as a test of basic cognitive functionality or of the capacity to maintain the attentional focus necessary to complete an iterative series of cognitive tasks. Data fusion of sensor data with user-interaction parameters will allow meaningful correlations to be made across various performance modalities. A goal of this application is to identify a qualitative difference between the two performance/behavior states and then to investigate various methods of quantifying that difference in a way that can be generalized.
It is postulated that a difference will be seen in the modulation of some of the natural rhythms. It is also postulated that a cognitively induced modification would be consistent within an individual but would most likely differ between individuals. The psycho-social-behavioral nature of individuals factors into the initial assessment of their cognitive function. Other indicators of cognitive function are short-, intermediate-, and long-term memory, sound judgment, and the ability to identify similarities in related objects. Performance of these cognitive functions is a strong indicator of the biologic health of the brain; poor performance is highly correlated with organic brain dysfunction.
The following abstracts demonstrate the application of dynamical analysis to physiological signals and show that it is possible to characterize abnormal electrophysiological rhythms as low-dimensional attractors.
Sale EJ, Warner DJ, Price S, Will AD. Compressed complexity parameter. Proceedings of the 2nd International Brain Topography Conference, Toronto, Ontario, 1991.
Warner DJ, Price SH, Sale EJ, Will AD. Chaotropic dynamical analysis of the EEG. Brain Topography, 1990.
Warner DJ, Price SH, Sale EJ, Will AD. Chaotropic dynamical analysis of the EEG. Electroencephalography and Clinical Neurophysiology, 1990.
Warner D, Will AD. Dynamical analysis of EEG: evidence for a low-dimensional attractor in absence epilepsy. Neurology, April 1990;40(1):351.
Warner DJ, Will AD, Peterson GW, Price SH, Sale EJ, Turley SM. Quantitative motion analysis instrumentation for movement-related potentials. Electroencephalography and Clinical Neurophysiology, 1991;79:29-30.
The basic problem addressed by the following abstracts is that clinical research involving neurological disorders is severely limited by the inability to objectively and quantitatively measure complex motor performance. Large double-blind randomized controlled trials of novel therapies continue to rely on clinical rating scales that are merely ordinal and subjective. In addition, research on the basic neuroscience of motor control is greatly impeded by the lack of quantitative measurement of motor performance.
Will AD, Sale EJ, Price S, Warner DJ, Peterson GW. Quantitative measurement of the "milkmaid" sign in Huntington's disease. Annals of Neurology, 1991;30:320.
Warner DJ, Will AD, Peterson GW, Price SH, Sale EJ. The VPL data glove as an instrument for quantitative motion analysis. Brain Topography, 1990.
Will AD, Warner DJ, Peterson GW, Price SH, Sale EJ. Quantitative motion analysis of the hand using the data glove. Muscle and Nerve, 1990.
Will AD, Warner DJ, Peterson GW, Sale EJ, Price SH. The data glove for precise quantitative measurement of upper motor neuron (UMN) function in amyotrophic lateral sclerosis (ALS). Annals of Neurology, 1990;28:210.
Will AD, Warner DJ, Peterson GW, Price SH, Sale EJ. Quantitative analysis of tremor and chorea using the VPL data glove. Annals of Neurology, 1990;28:299.
Warner DJ, Will AD, Peterson GW, Price SH, Sale EJ. The VPL data glove as a tool for hand rehabilitation and communication. Annals of Neurology, 1990;28:272.
A Clinical Application of Machine-Resident Intelligence (1993)
Human Performance Institute
Loma Linda University Medical Center
data formats and structures.
The other focus of our efforts is on developing highly interactive, biocybernetic systems in which biological signals can modify an environmental chamber's parameters, allowing the user to interface bioelectrically with spatialized environments. We believe that such physiologically modulated environmental systems may have a health-preserving function. Interfaces to control stimulation can adaptively utilize any biosignal. The result is the capacity to create a stimulus regime that accelerates relaxation and facilitates stress reduction. This is an application of wellness-maintenance technology.
"The
Nirvana Express"
thus removing ourselves from the determination of the problem. The ever-increasing ability of technology to quantitate complex physiological parameters and to image volumetric anatomical structures is taking us to a point where we will soon be unable to assimilate all the available information through traditional means (i.e., numbers and graphs). Recent attempts to solve this problem have focused primarily on advanced visualization techniques. While much progress has been made in this field, the visual sense is finite and is reaching its saturation level.
complex
dynamic interactions involved in pathophysiological processes.
paradigm
shift and the beginning of a new era of computer assisted medicine.
Medical
Neuroscientist
Dir.
Medical Intelligence
MindTel
davew@well.com
www.pulsar.org
informatic systems which both augment the general flow of medical information and provide decision support for the health care professional; public-access health information databases designed to empower the average citizen to become more involved in their own health care; and advanced training technologies which will adaptively optimize interactive educational systems to the capacity of the user. Selected publications are:
Society
Press.