Sensors 2019, 19, x; doi: FOR PEER REVIEW www.mdpi.com/journal/sensors
Type of the Paper (Article)

A New Integrated System for Assistance in Communicating with and Telemonitoring Severely Disabled Patients

Radu Gabriel Bozomitu 1,*, Lucian Niță 2, Vlad Cehan 1, Ioana Dana Alexa 3, Adina Carmen Ilie 3, Alexandru Păsărică 1 and Cristian Rotariu 4

1 Faculty of Electronics, Telecommunications and Information Technology, “Gheorghe Asachi” Technical University, Iași 700050, Romania; bozom[anonimizat]
2 Faculty of Electrical Engineering, Energetics and Applied Informatics, “Gheorghe Asachi” Technical University, Iași 700050, Romania; [anonimizat]
3 “Grigore T. Popa” University of Medicine and Pharmacy, Iași 700115, Clinical Hospital “Dr. C.I. Parhon”, Iași, Romania; [anonimizat]; adinacarmenilie@yahoo.com
4 Faculty of Medical Bioengineering, “Grigore T. Popa” University of Medicine and Pharmacy, Iași 700115, Romania; [anonimizat]
* Correspondence: [anonimizat]; Tel.: +40-740-319-451

Received: date; Accepted: date; Published: date
Abstract: In this paper, we present a new complex electronic system for facilitating communication with severely disabled patients and telemonitoring their physiological parameters. The proposed assistive system includes three subsystems (Patient, Server, and Caretaker) connected to each other via the Internet. The two-way communication function is based on keywords technology using a WEB application implemented at the Server level, and the application is accessed remotely from the patient’s laptop/tablet PC. The patient’s needs can be detected by using different switch-type sensors that are adapted to the patient’s physical condition or by using eye-tracking interfaces. The telemonitoring function is based on a wearable wireless sensor network, organized around the Internet of Things concept, and the sensors acquire different physiological parameters of the patients according to their needs. The mobile Caretaker device is represented by a Smartphone, which uses an Android application for communicating with patients and performing real-time monitoring of their physiological parameters. The prototype of the proposed assistive system was tested in “Dr. C.I. Parhon” Clinical Hospital of Iași, Romania, on hospitalized patients from the Clinic of “Geriatrics and Gerontology”. The system contributes to an increase in the level of care and treatment for disabled patients, and this ultimately lowers costs in the health care system.

Keywords: assistive technology; communication system; disabled patients; eye tracking; human–computer interfaces; physiological parameters; telemonitoring; user interface; web browser; wireless sensors
1. Introduction

In recent years, national and international efforts have been made to increase the quality of care for people with various disabilities while maintaining costs that are supportable by society. To achieve these conflicting requirements, researchers have made efforts to introduce various equipment and techniques that involve progressively more sophisticated components of informatics and telecommunications.

When it comes to caring for severely disabled people, the requirements for round-the-clock surveillance are universally accepted. As a rule, several trained people, who are responsible for medical compliance and surveillance, attend to the physiological needs of disabled patients. However, the former can do very little for the psychological needs and the quality of life of the latter.
Assistive technology ensures greater independence for people with disabilities, allowing them to perform tasks that are otherwise impossible or very difficult to accomplish. The proposed system provides these possibilities by improving or changing the way that patients interact with the objects and equipment necessary to perform that task [1], [2].

The progress in the field of electronics in the last few years has generated a real interest in the development of new assistive systems adapted for different types of patients. These systems are very useful for medical investigations and observation and also contribute to the increase in the patient’s quality of life.

Two research directions have been developed for the care of patients with severe disabilities: (1) telemonitoring physiological parameters and (2) ensuring communication with disabled patients. Both directions, alongside remote diagnostics, GPS tracking, and others, are included in what can be called telemedicine.

Nowadays, telemonitoring is considered an excellent method for diagnosis and surveillance, as proven by numerous studies and projects that are finalized or still in progress [3], [4], [5], [6], [7], [8]. Some of the most significant projects at the international level include U-R-SAFE (“Universal Remote Signal Acquisition for Health”) [9] and “CodeBlue” [10], which can be considered reference projects carried out at Harvard University. Many systems have been implemented to assist people with cardiac diseases, such as HEARTS, “Health Early Alarm Recognition and Telemonitoring System” [11]; EPI-MEDICS, “Enhanced Personal, Intelligent and Mobile System for Early Detection and Interpretation of Cardiological Syndromes” [12]; and the real-time health monitoring system for remote cardiac patients using a Smartphone and wearable sensors described in [13].

The advanced technologies applied by many researchers in different technical fields meet the needs of patients with autism [14] and dementia. Moreover, human–robot interaction systems [15] are now able to recognize gestures that are usually employed in human non-verbal communication [16] to aid visually impaired people in performing ordinary activities [17], [18] or capturing non-visual communication [19]. These systems also facilitate the rehabilitation and augmentation of children who suffer from motor and social function impairments [20] and people who have suffered a stroke [21], [22]. Lately, particular attention has been paid to the implementation of assistive systems for neuromotor rehabilitation to enhance the recovery and societal reintegration of severely disabled patients [23].

Despite all efforts, telemedicine has not yet become a standard procedure in current medical practice, with many aspects still under study. Some relevant considerations are presented in [24]. One of the reasons is the distinctiveness of the systems used for telemonitoring (like the systems described in [25], [26]) and telecommunication with disabled patients (like the systems described in [27], [28]). These systems are usually realized by different researchers or providers ([29], [30], [31], [32]) and are not compatible. The direct consequences are difficulties acquiring, installing, and maintaining these systems and, last but not least, training staff and patients. Moreover, these systems have high acquisition costs.

In general, the assistive systems presented in the literature have limited functions and address a specific type of disease (e.g., cardiovascular diseases, neuromotor rehabilitation after a stroke). Our purpose is to implement a new complex system that includes two functions: (1) communication with disabled patients by using keywords technology and (2) telemonitoring their physiological parameters. Unlike other systems, the two functions of the proposed system are software-controlled using a new WEB application. One of the advantages of the proposed system is that it is easy to configure and can be adapted to patients' needs. These needs can be detected by using switch-type sensors that are suitable for different degrees of disability or by using an eye-tracking interface for communication with severely disabled patients.

The proposed system represents the results of many years of the authors’ work performed as part of several research contracts won by competition [33], [34], [35]. Partial results obtained during these research activities have been presented in several papers published over the years [36], [37], [38], [39], [40], but this paper is the first comprehensive presentation of the whole system.
The main beneficiaries of assistive technologies are patients with severe disabilities, which range from severe neuromotor sequelae to permanent bed immobilization due to various and multiple causes: advanced chronic diseases (cardiovascular, renal, respiratory), osteoarticular pathologies with functional impotence, neglect, depression, etc. These patients usually have a low quality of life because they lack the ability to communicate with the people surrounding them. Connecting with other people, even at a basic level, may ameliorate physical and psychological needs, thus improving their life quality and self-esteem.

The paper is organized as follows. Section 2 describes the hardware and the software of the proposed system, as well as its functions and operating modes. System testing done in laboratory conditions and performed on hospitalized patients is illustrated in Section 3. Section 4 presents the main results and discussions. In Section 5, some conclusions are drawn.
2. Materials and Methods

The system illustrated in this paper, implemented at the prototype level, is the result of an interdisciplinary collaboration of specialists from the fields of electronics, telecommunication, computer science, and medicine. The proposed system helps the healthcare system move toward a drastic decrease in care costs. It functions by using keywords technology for two-way communication with severely disabled patients and uses the Internet of Things (IoT) concept to implement the wireless and wearable sensor network, which is used for telemonitoring the physiological parameters of the patients. Both functions of the system (communication and telemonitoring) share the same hardware platform (except for the sensors used for capturing the physiologic parameters) and most of the software components.

From the communication perspective, the target consists of patients that are able to hear and/or see and understand but are unable to easily communicate (because of a neuromotor handicap, severe dyspnea, or depression) using conventional methods, such as speech, writing, or hand gestures. Usually, these patients are able to make only limited movements, such as muscle contractions (raising a forearm, moving a finger or foot) or, as is the case in most situations, eye movements and blinking. Whatever the situation, the problems always remain the same: Patient ↔ Caretaker communication and medical investigation and observation.

The purpose of telemonitoring physiological parameters is to efficiently assess the patient’s current condition, allowing rapid intervention if necessary. The system constantly observes the previously programmed physiological parameters and sends alarms to the caretaker when the values of the monitored parameters are beyond normal limits. All monitored parameters, alarms, and caretaker interventions are recorded in the Server memory for later analysis.
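The alarm logic described above can be sketched as a simple range check per parameter. This is a minimal illustration; the limit values and parameter names below are assumptions, not the system's actual configuration:

```python
# Hypothetical normal ranges per monitored parameter (illustrative values only).
NORMAL_LIMITS = {
    "SpO2": (94, 100),   # blood oxygen saturation, %
    "HR": (60, 100),     # heart rate, bpm
    "RR": (12, 20),      # respiratory rate, breaths/min
    "BT": (36.1, 37.5),  # body temperature, degrees Celsius
}

def check_alarms(sample):
    """Return a list of (parameter, value) pairs outside normal limits."""
    alarms = []
    for name, value in sample.items():
        low, high = NORMAL_LIMITS[name]
        if not (low <= value <= high):
            alarms.append((name, value))
    return alarms
```

In the real system, each alarm would additionally be timestamped and written to the Server memory together with the caretaker's intervention.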
2.1. Description of assistive system hardware

The proposed assistive system is designed for severely disabled patients to enable the simultaneous two-way communication and telemonitoring of the patient’s vital physiologic parameters. The structure of the system is illustrated in Figure 1; it includes the following three subsystems: Patient, Server, and Caretaker. The Server Subsystem is represented by a conventional desktop PC, and the Caretaker Subsystem consists of a Smartphone device (Figure 1).

The Patient Subsystem includes a communication module, implemented by different types of human–computer interfaces and used for communication through keywords technology; a telemonitoring module, which is represented by a network of wireless and wearable sensors that capture physiological parameters; and a laptop (tablet PC or other device) for the collection of communication and monitored data values.

2.1.1. Communication module

The communication module of the proposed assistive system is based on human–computer interfaces that can be adapted for different types and degrees of disability. These devices are used for detecting the patient’s needs and are implemented in our system depending on the user’s physical condition. The devices that can be used with this module are the following:
Figure 1. Structure of the proposed integrated system for assistance in communicating with and telemonitoring severely disabled patients.

• Switch-type sensors are for patients who can perform controlled muscular contraction. These sensors are USB-connected to the laptop and adapted to the physical condition of the patient. Some examples of the switch-based sensors used by this system are illustrated in Figure 2: hand-controlled sensors, including a hand switch-click (Figure 2.a), pal pad flat switch (Figure 2.b), ribbon switch (Figure 2.c), and wobble switch (Figure 2.d); foot-controlled sensors, such as a foot switch (Figure 2.e); and a sip/puff breeze switch with headset, as illustrated in Figure 2.f (the leading sip-puff system for individuals with motor disabilities and limited dexterity).
• Eye-tracking-based interfaces are for fully immobilized patients who cannot perform any controlled movement, except eyeball movement, and who are unable to communicate verbally (Figure 3).

Eye tracking is the process of measuring either the point of gaze or the motion of an eye relative to the head. An eye tracker is a device for measuring eye positions and eye movement [41], [42], [43], [44]. Lately, gaze direction detection techniques have developed in two basic directions: electro-oculography (EOG), which was used by our team in the ASISTSYS project [34], and digital image processing using video-oculography (VOG), which is used in the present system.

Digital image processing techniques use video cameras in the visible and infrared (IR) spectrum to take video eye images, which are subsequently processed frame by frame using special software installed on a computer to extract the coordinates of the eye pupil. Currently, increasing computing power has led to the diversification of pupil detection algorithms (PDAs). The advantages of the methods based on the analysis of the eye images provided by video cameras lie in their versatility; moreover, they are practically independent of the individual characteristics of the patient’s eye.

The proposed assistive system uses two types of real-time eye-tracking interfaces to communicate with severely disabled patients:

• a head-mounted device (Figure 3.a), which measures the angular position of the eye with the head as the point of reference [45];
• a remote device (Figure 3.b), which measures the position of the eye with the surrounding environment as the point of reference and is implemented with a commercially available interface developed by Tobii [32]; the IR sensors are placed at the base of the screen.

The head-mounted eye-tracking interface illustrated in Figure 3.a consists of an infrared video camera mounted on an eyeglasses frame placed close to the eyes and connected to a Patient Subsystem (laptop) for eye image acquisition and processing.
Figure 2. Different types of USB switch-type sensors used for keywords-based detection: (a) hand switch-click (mouse button switch) with a Swifty USB switch interface; (b) pal pad flat switch; (c) ribbon switch; (d) wobble switch; (e) foot switch; (f) sip/puff breeze switch with headset.
Figure 3. Eye-tracking-based interfaces: (a) head-mounted eye-tracking interface consisting of an infrared video camera mounted on an eyeglasses frame; (b) remote eye-tracking interface consisting of a commercially available infrared sensor mounted on the patient’s laptop.
This communication module is used by patients with severe disabilities who cannot communicate with other people through classical methods: speech, writing, or hand gestures.

The remote device presented in Figure 3.b is used by patients who, as a result of their physical condition, cannot support a helmet on their head.

Because numerous types of available sensors can be used for detecting the patient’s needs, our system can be easily adapted and configured to different types of diseases and patients’ needs.

The communication function is two-way: Patient ↔ Server ↔ Caretaker. Patient ↔ Caretaker communication relies on keywords technology: the system ensures the successive visualization/audio playback of keywords/ideograms/alphanumeric characters. The patient has the possibility of choosing the desired keyword, as in Figure 4.a, or the character needed by using a virtual keyboard, as in Figure 4.b. The patient expresses their will or need by selecting a keyword by a technique imposed by the disability: (1) using a switch-type sensor mounted onto the patient’s forearm/finger/leg or operated by sip/puff or (2) by detecting the patient’s gaze direction using the eye-tracking-based interface and performing a video analysis of the eyeball movement. The selected keyword is then sent to the caretaker via the WI-FI network.
Figure 4. Two-way communication function of the proposed assistive system: (a) ideograms/keywords displayed on the patient display; (b) virtual keyboard used to build sentences.
The system thus allows patients to use computers to communicate with other people not only for medical investigation but also for other more complex activities. The caretaker’s response is sent to the patient as sound and/or visually as text/ideogram. Consequently, a dialogue may take place between the patient and the caretaker (doctor, nurse, etc.). This is essential for diagnosing patients who are unable to communicate otherwise; using our system, they can answer questions (e.g., “Does it hurt?”, “Yes”; “The leg?”, “No”; “The arm?”, “No”; “The throat?”, “Yes”). Using the keywords technology based on the switch-type sensor, the patients can select only one object (ideogram/keyword) for each phase (Figure 4.a). In such cases, the use of alphanumeric characters to build words and phrases would be too slow.

The technique of patient gaze detection by video-oculography (eye pupil image analysis) makes it possible to identify and communicate ideas and complex moods by selecting alphanumerical characters displayed on a virtual keyboard (Figure 4.b). This technique involves the display of several characters on the screen simultaneously and the gradual construction of words or phrases using the characters displayed. It can also be used to command the computer to use the Internet and even e-mail. Furthermore, it can be used for reading books, listening to music, and watching TV programs and movies. Of course, all of these options are available to patients who are cognitively able but cannot communicate normally for a limited period of time, such as after surgery. However, these alternatives can be adapted to the patient’s level of cognition, preoccupations, and interests, as they can greatly contribute to increasing the individual’s quality of life.
2.1.2. Telemonitoring module

The telemonitoring module is based on a wireless and wearable sensor network in which each node is equipped with a sensor for capturing the physiological parameters of patients according to their needs. Thus, each Sensor Node (SN), which represents a medical device in the network, is wirelessly connected to the Central Node (CN), which is USB-connected to the patient’s computer, which in turn routes the data to the Server Subsystem. The structure of this function is based on the IoT concept: by using sensors, the entire physical infrastructure of the system is interconnected to transmit useful measurement information via the distributed sensor network.

The network management is efficiently controlled by the Server, which transmits the measured data values to the caretaker’s mobile device.

The telemonitoring module illustrated in the diagram from Figure 5 consists of a wireless body area network (WBAN) of medical devices attached to the patient’s body for acquiring the following physiological parameters: blood oxygen saturation (SpO2), heart rate (HR), respiratory rate (RR), and body temperature (BT). The number and type of monitored parameters can be adapted to the patient’s medical needs and can be selected according to medical recommendations.

The medical devices of the network nodes consist of commercial modules with low energy consumption that capture the above-mentioned physiological parameters.
Figure 5. WBAN of telemonitoring module used for the acquisition of the different physiological parameters of patients: SpO2, HR, RR, and BT.
They are directly connected to the wireless data transmission module (eZ430-RF2500) [46] placed in the network central node, which is equipped with a microcontroller (MSP430F2274) [47] and a radio transceiver (CC2500) [48], and are battery powered.

For measurements of SpO2 and HR, a commercially available medical module, AFE44x0SPO2EVM [49], is used. The module, which uses the AFE4400 circuit, is intended for the evaluation of the acquisition and processing of photoplethysmography signals. The measurement of the respiratory rate makes use of a specialized piezoelectric transducer, PNEUMOTRACE [50], mounted around the patient’s thorax. It provides an electrical signal in response to a modification in thoracic circumference due to patient respiration. Body temperature measurements are taken using an integrated temperature sensor, TMP275 [51], that is directly connected to the eZ430-RF2500 data transmission module through an I2C serial interface.
The patient’s computer runs a background software application (presented in paragraph 2.2) that takes the data collected by the medical devices and sends them to the Server device for processing; from here, the data are wirelessly transmitted to the caretaker device, where they are displayed in real time. When the Server, by automatic data analysis, detects that certain limits are exceeded to a level that is considered dangerous, it alerts the caretakers, and their interventions are saved in the Server memory for later analysis. The monitored data values can be processed at any time in order to see the evolution of the patient’s physiological parameters over time. This type of analysis is vital for establishing optimal treatment.
2.2. Description of assistive system software

The three components of the assistive system (Patient, Server, Caretaker) are connected via the Internet. The operating modes of the system components are controlled by software. In the following, all the system software components are described in detail.

2.2.1. Software component of Patient Subsystem

The software component of the Patient Subsystem was developed for both communication and telemonitoring functions.

The software component of the Patient Subsystem includes the Patient WEB Application (PWA), which is implemented on the Server Subsystem and can be accessed remotely through a WI-FI Internet connection by each patient’s device. This network organization of patients’ devices simplifies the software component because the Server manages all the devices through the same configurable and adaptable application for each type of patient.
The software component of the communication function is based on keywords technology, implemented using the Patient WEB Application, which enables both switch-based and eye-tracking-based operating modes for needs detection. Independent of the PWA, the software developed for the eye-tracking interfaces runs on the patient’s device. All of these software components are based on algorithms proposed by the authors.
The software component of the telemonitoring function is also included in the Patient WEB Application. It ensures the collection of data provided by the wireless and wearable sensors and the real-time transmission of the monitored values to the supervisor device via the Server. The telemonitoring function includes a graphical interface that runs on the patient’s device and is used for the real-time numeric and graphical display of the monitored physiological parameters, alerts resulting from their processing, and the status of each node in the network. The communication protocols used for transmitting monitored data to the Server are also included.

1. Software component of communication function: Operating modes
In the case of switch-based communication, the PWA cycles through the ideograms on the user screen by cyclical highlighting (Figure 6). The patient can select an ideogram by activating the switch sensor only while it is highlighted, which lasts for a time interval that is set according to the patient’s experience in using the system. The database with ideograms/keywords is organized hierarchically. Each level of the database contains a set of ideograms/keywords referring to a class of objects defined in ascending order. When the patient selects an ideogram using the switch (which can be of any type, as illustrated in Figure 2), the request is forwarded to the caretaker device via the Server.
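This scan-and-select navigation over a hierarchical keyword tree can be sketched as follows. The tree content and the switch interface below are illustrative assumptions, not the actual PWA database:

```python
# Hypothetical hierarchical ideogram/keyword database (illustrative content).
IDEOGRAM_TREE = {
    "Emergencies": {"Doctor": {}, "Nausea": {}},
    "Needs": {"Water": {}, "Food": {}},
}

def scan_select(level, switch_pressed, start=0):
    """Cyclically highlight the keywords of one hierarchical level and
    return the one selected by the switch, or None if the switch is never
    activated. `switch_pressed(keyword)` models the patient's switch state
    while that keyword is highlighted."""
    keywords = list(level)
    for step in range(len(keywords)):
        keyword = keywords[(start + step) % len(keywords)]  # cyclic highlighting
        if switch_pressed(keyword):
            return keyword
    return None
```

Selecting “Emergencies” and then “Doctor” corresponds to walking two levels of the tree, after which the request would be forwarded to the caretaker device via the Server.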
Figure 7 presents the process of sending a message using the PWA when the ideogram “Emergencies”, highlighted in Figure 6, is selected. Figure 7.a presents the set of ideograms (“Doctor” and “Nausea”) that belong to the previously selected class, and Figure 7.b illustrates the last hierarchical level of the tree, corresponding to the transmission of the message “Doctor”.

When the patient considers that the message can be sent, s/he selects the “Send” ideogram (Figure 7.b) by using one of the previously described techniques, and the message is sent to the app on the mobile Smartphone of the caretaker (nurse, medical staff).

The application running on the patient’s laptop displays the messages sent by the patient during the dialogue with the caretaker on the right-hand side of the screen (Figures 6 and 7). The history of these conversations is stored in the Server memory and can be checked if needed.
Figure 6. Patient WEB Application for communication using switch-based or eye-tracking-based detection of the patient’s needs. (Patient device screen: laptop; Caretaker device screen: Smartphone; raw eye image provided by the IR video camera.)
Figure 7. The process of sending a message to the caretaker: (a) ideograms belonging to the second hierarchical level after the first selection by the patient; (b) ideogram corresponding to the transmission of the wanted message (“Doctor”).
It is worth noting that the proposed system has the ability to customize the databases of ideograms and keywords, organized in the form of trees, for each individual patient. Thus, each patient has an account in the PWA that contains a personalized set of ideograms/keywords. The telemonitored physiological parameters for each patient, according to medical recommendations, are also specified in the PWA. Depending on the evolution of the patient’s condition during treatment, this database can be easily updated as needed.
In the case of eye-tracking-based communication, the user controls the PWA by using one of the two types of eye-tracking interfaces illustrated in Figure 3. The patient’s screen is the same as in the previous case (Figure 6), but the ideograms are not cyclically highlighted, as the user is able to fully control the application by using gaze direction.

The operation of the eye-tracking interface is based on the real-time detection of the pupil’s center coordinates in the raw eye image provided by the infrared video camera (as shown in Figure 6). In order to do this, the authors developed an eye-tracking algorithm, which runs independently of the PWA on the patient’s laptop. The purpose of this algorithm is to move the cursor on the screen according to the user’s gaze direction and to perform ideogram/keyword selection by simulating a mouse click. The eye-tracking algorithm includes the following main procedures, which are described in [36] in detail: (1) real-time eye image acquisition; (2) system calibration; (3) real-time pupil center detection; (4) mapping between raw eye image and scene image; (5) ideogram and/or object selection; and (6) algorithm optimization, which is needed to stabilize the cursor position on the user screen by real-time filtering and high-frequency spike canceling from the PDA signals.
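The stabilization step (6) can be sketched as a median filter (for spike rejection) followed by a short moving average (for smoothing) over the most recent pupil-center samples. The filter structure and window lengths here are assumptions for illustration, not the parameters of the authors' optimization procedure:

```python
from collections import deque
from statistics import median

class CursorStabilizer:
    """Suppress high-frequency spikes in one axis of the pupil-detection
    signal with a median filter, then smooth the result with a short
    moving average. Window sizes are illustrative, not the system's
    actual settings."""

    def __init__(self, median_win=5, avg_win=4):
        self.raw = deque(maxlen=median_win)
        self.smoothed = deque(maxlen=avg_win)

    def update(self, x):
        self.raw.append(x)
        self.smoothed.append(median(self.raw))   # spike rejection
        return sum(self.smoothed) / len(self.smoothed)  # smoothing
```

One instance per coordinate axis would be fed with each new pupil-center sample, and its output would drive the cursor position.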
In order to perform pupil detection, the dark-pupil technique was implemented [52], [53]. Dark-pupil techniques illuminate the eye with an off-axis infrared source such that the pupil is the darkest region in the image (Figure 6) and can be easily detected by using a threshold-based binarization technique, illustrated in [37]. Moreover, by using this illumination type, a consistent and uniform illumination of the eye can be obtained without any user discomfort.
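The threshold-based binarization step can be sketched in a few lines of array code. The fixed threshold value and the centroid-based center estimate are illustrative assumptions; the actual technique is described in [37]:

```python
import numpy as np

def binarize_dark_pupil(gray, threshold=40):
    """Mark as pupil candidates all pixels darker than the threshold
    (the pupil is the darkest region under off-axis IR illumination)."""
    return gray < threshold

def pupil_center(mask):
    """Estimate the pupil center as the centroid of the binary mask."""
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())
```

On a synthetic 640 × 480 frame with a dark square standing in for the pupil, the centroid of the binarized mask recovers the square's center.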
In order to perform eye-tracking system calibration, nine calibration points are displayed on the user screen in the order illustrated in Figure 8 [36]. These points, denoted by Mi (i = 1, …, 9), represent the calibration points in the coordinate system of the user display. During the calibration process, the user’s gaze (the eyeball) follows these points in the order indicated in Figure 8, with equal pauses between them. Corresponding to these points are another nine resulting points, denoted by Ci (i = 1, …, 9), in the coordinate system of the eye image (with 640 × 480 resolution) provided by the IR video camera.
The conversion between the eye image coordinates (pupil’s center position) and the user screen coordinates (user’s cursor position) is performed by a special mapping function [54]. The coefficients of the mapping function are obtained during the calibration process, when a set of targets in known positions is displayed to the subject and the eye tracker position data are recorded.

The calibration process is performed for each subject who uses the system. The accuracy of the eye tracker operation depends on the successful completion of the calibration stage.
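A common choice for such a mapping function, shown here as an assumption rather than the specific function of [54], is a second-order polynomial per screen axis whose coefficients are fitted by least squares over the nine calibration pairs (Ci → Mi):

```python
import numpy as np

def design_matrix(pts):
    """Second-order polynomial terms [1, x, y, xy, x^2, y^2] per point."""
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_mapping(eye_pts, screen_pts):
    """Least-squares fit of the eye-image -> screen-mapping coefficients
    from the calibration pairs (Ci -> Mi)."""
    A = design_matrix(eye_pts)
    coeffs, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)
    return coeffs  # shape (6, 2): one column of coefficients per screen axis

def map_to_screen(coeffs, eye_pt):
    """Convert one pupil-center position to a cursor position."""
    return design_matrix(np.array([eye_pt], dtype=float)) @ coeffs
```

With nine point pairs and six coefficients per axis, the fit is overdetermined, which makes it tolerant to small measurement errors in the recorded pupil positions.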
Figure 8. Calibration of the eye-tracking device.
Patients with different ocular diseases (e.g., strabismus) or neurologic diseases who were unable to complete the calibration stage successfully were excluded from these tests and included in the category of patients who were not able to perform their tasks by using the eye-tracking interface.
The pupil center can be detected by using different principles illustrated in the literature: the least-squares fitting of ellipse (LSFE) algorithm [55,56], the RANdom SAmple Consensus (RANSAC) paradigm [57,58], circular/elliptical Hough transform-based approaches (CHT/EHT) [59,60], and projection method (PROJ) algorithms [61,62]. On the basis of these principles, we developed original PDAs adapted to our eye-tracking interface; the algorithms were tested and compared with the open-source Starburst algorithm illustrated in [63]. Some results on the performance of these algorithms come from the authors' previous studies, including the LSFE [64], RANSAC [65], and CHT [36] algorithms. Details of the test modes and the results obtained are given in paragraph 3.1.1.
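The circular Hough transform principle behind the CHT family can be sketched as a voting procedure over candidate circle centres. This is a deliberately minimal illustration, far simpler than the authors' CHT-based PDA; the angular resolution and radius set are assumptions.

```python
import numpy as np

def hough_circle_center(mask, radii):
    """Minimal circular Hough transform over a binary edge mask.

    Each edge pixel votes for every candidate centre lying r pixels
    away, for each radius in `radii`; the most-voted accumulator cell
    is the circle-centre estimate (here: the pupil centre).
    """
    h, w = mask.shape
    acc = np.zeros((h, w), dtype=np.int32)
    ys, xs = np.nonzero(mask)
    thetas = np.linspace(0, 2 * np.pi, 72, endpoint=False)  # 5-degree steps
    for r in radii:
        for t in thetas:
            cx = np.round(xs - r * np.cos(t)).astype(int)
            cy = np.round(ys - r * np.sin(t)).astype(int)
            ok = (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
            np.add.at(acc, (cy[ok], cx[ok]), 1)       # accumulate votes
    cy, cx = np.unravel_index(acc.argmax(), acc.shape)
    return int(cx), int(cy)
```

A key practical property of this voting scheme is robustness to partial occlusion: even when the eyelid hides part of the pupil contour, the remaining edge pixels still vote for the correct centre.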
After detection, the coordinates of the pupil center are transformed in real time into cursor coordinates on the user screen by using the mapping function. The pupil detection algorithm delivers two real-time signals corresponding to the cursor position on the user screen in both directions of the coordinate system. In this way, the cursor movement on the user screen is controlled by the user's gaze direction. In order to perform this action, both types of eye-tracking interfaces illustrated in Figure 3 are used in our system, depending on the patient's physical condition. In recent years, remote video-based eye trackers, which are preferred by patients, have become increasingly widespread.
Regardless of the type of PDA used, the user must be able to perform two actions:
• cursor movement on the user screen according to the user's gaze direction, used only for visual inspection of PWA objects, and
• ideogram/keyword selection by simulating a mouse click when the cursor controlled by the user's gaze direction is placed in the selection area of the wanted object on the screen.
In order to find the best solution for our system, different ideogram selection methods were investigated. The first method tested relied on the algorithm's ability to identify voluntary blinking by using the total number of consecutive frames that capture the blinking or by using a dynamic threshold for selection [66]. This approach requires the user's attention and concentration, and using the system for a long time can become tiresome. Consequently, it may cause errors in distinguishing between voluntary and involuntary blinking.
The proposed system uses the solution that was evaluated as the best for implementing a mouse click: the user's gaze direction is focused on the selection area of the wanted ideogram/keyword, and the cursor position is maintained in that area for a certain dwell time. However, this technique can lead to the false selection of ideograms as a result of the Midas touch problem, in which unwanted ideograms lying along the path of the user's gaze are selected at random [44]. For optimal operation, the system must be able to differentiate viewing from gaze control. This can be achieved by setting the dwell time according to the user's ability to use the system.
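The dwell-time click logic can be sketched as a small state machine fed with timestamped gaze samples. This is an illustrative sketch, not the paper's implementation; the class and parameter names are hypothetical, and timestamps are supplied by the caller so the example stays deterministic.

```python
class DwellClickDetector:
    """Trigger a selection when the gaze cursor stays inside one
    selection area for at least `dwell_time` seconds.

    A longer dwell time reduces Midas-touch false selections at the
    cost of slower interaction, which is why it should be tuned to
    each user's ability.
    """
    def __init__(self, dwell_time=1.0):
        self.dwell_time = dwell_time
        self.region = None       # selection area currently under the cursor
        self.entered = None      # time at which that area was entered

    def update(self, t, region):
        """Feed the selection area under the cursor at time t.

        Returns the selected area once the dwell time elapses, else None.
        """
        if region != self.region:            # moved to another area: restart timer
            self.region, self.entered = region, t
            return None
        if region is not None and t - self.entered >= self.dwell_time:
            self.region, self.entered = None, None   # reset after the click
            return region
        return None
```

Differentiating viewing from gaze control then reduces to choosing `dwell_time` long enough that casual inspection of an ideogram does not trigger it.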
In order to improve the user's experience of operating the system, the application provides feedback to the user by highlighting the selection area of the ideogram after its successful selection on the user screen.
Any eye-tracking system is affected by many sources of noise that significantly influence the pupil detection process. These sources cause noise signals that overlap the target signals provided by the eye-tracking algorithm. They include a high-frequency noise component produced by the video camera's electronic circuits, different errors of the PDA during the detection process, artifacts due to variable and non-uniform lighting conditions, and a low-frequency drift component due to the difficulty of keeping the eye absolutely fixed on a point on the screen [67].
In addition, there are many other sources of noise that should be considered when the eye-tracking system is designed, such as corneal reflection due to the infrared illumination of the eye, physiological tremor of the eye [68], non-constant eye movement made up of saccades (rapid movements) and fixations (short stops) [67], modification of the pupil image shape and position during fixations, obstruction of the pupil by the eyelids or other artifacts, involuntary blinking, and head movements, which produce high-frequency spikes in the signals provided by the PDA.
Because of these many noise sources and the physiological eye movement during the eye-tracking process, the vertical component of the signals provided by the PDA is more affected by noise than the horizontal one (Figure 20).
These multiple sources of noise cause the main drawback of this technique, which consists of the difficulty of maintaining the cursor in a fixed position on a selection area of the display for a certain dwell time in order to perform a selection. To increase the cursor stability on the user screen, the eye-tracking algorithm implemented in this system was optimized by introducing different techniques based on real-time filtering and high-frequency spike canceling from the signals provided by the PDA. The best results were obtained by using the snap-to-point technique, especially when the selection area of an ideogram on the user screen is small, as is the case in the communication procedure based on a virtual keyboard controlled by eye tracking. By using this technique, it was possible to introduce more complex applications, such as browsing the Internet and accessing personal e-mail, into our communication module.
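The two stabilization steps named above, spike canceling followed by snap-to-point, can be sketched as follows. The median-filter window and the snap radius are illustrative values, not the ones used in the system, and the exact filtering chain in the implemented algorithm is more elaborate.

```python
import numpy as np

def stabilize(xs, ys, targets, win=5, snap_radius=50):
    """Stabilize the PDA cursor signals in two steps.

    1. Median-filter the x and y signals to cancel high-frequency
       spikes (e.g. from blinks or head movements).
    2. Snap each sample to the nearest target centre when it falls
       within `snap_radius` pixels: the snap-to-point step.
    `targets` is an (N, 2) array of selection-area centres.
    """
    def medfilt(v):
        pad = win // 2
        vp = np.pad(v, pad, mode='edge')
        return np.array([np.median(vp[i:i + win]) for i in range(len(v))])

    fx = medfilt(np.asarray(xs, float))
    fy = medfilt(np.asarray(ys, float))
    out = []
    for x, y in zip(fx, fy):
        d = np.hypot(targets[:, 0] - x, targets[:, 1] - y)
        i = d.argmin()
        out.append(tuple(targets[i]) if d[i] <= snap_radius else (x, y))
    return out
```

Snapping is what makes small selection areas (such as virtual-keyboard keys) usable: once the filtered gaze point is close enough to a key centre, the cursor locks onto it for the whole dwell interval.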
2.2.1. Software component of telemonitoring function
As a network protocol, we decided to use SimpliciTI [69] from Texas Instruments to transfer data through the WBAN. SimpliciTI's main features include low memory needs, advanced network control, and sleep mode support. It is intended to support the development of wireless networks that contain battery-operated nodes and require low data rates.
The eZ430-RF2500 modules connected to the medical devices were configured as End Devices (EDs). The same module is connected to the patient's computer as an Access Point (AP), and several others can be configured as Range Extenders (REs) (Figure 9).
The flowchart of the firmware running on the MSP430F2274 microcontroller of the ED is represented in Figure 10. In this instance, the eZ430-RF2500 module is initialized onto the network; then, after a START command, it wakes up to read the SpO2, HR, RR, and BT values from the medical devices. The MSP430F2274 also reads the battery voltage and communicates the data to the central monitoring station through the RE and AP. In order to minimize energy waste, since the radio transceiver is an important power consumer, the CC2500 enters a low-power mode after each transmission cycle.
A user-friendly Graphical User Interface (GUI) was developed for the patient's monitor application to display the received measurements and alarms. The GUI running on the central monitoring station was developed using LabWindows/CVI and is shown in Figure 11a [38] and Figure 11b.
The GUI displays the temporal waveform of the SpO2, HR, and RR parameters for the selected patient, together with the status of the node (the battery voltage and the distance from the nearby RE or AP). The distance is represented as a percentage and computed on the basis of the received signal strength indication (RSSI) measured from the power of the received radio signal.
In order to alert the caretaker when the normal values of the telemonitored patient's physiological parameters are exceeded, the proposed assistive system uses an alert detection algorithm, which is implemented in the Server Subsystem.
Figure 9. SimpliciTI wireless protocol [69].
Figure 10. Flowchart of the firmware running on the MSP430F2274.
Figure 11. Graphical interface running on the patient's device used for telemonitoring: (a) heart rate, blood oxygen saturation (SpO2), and respiration rate [38]; (b) body temperature.
The physiological conditions that may trigger alerts are low SpO2 if SpO2 < 93%, bradycardia if HR < 40 bpm, tachycardia if HR > 150 bpm, HR arrhythmia if ΔHR/HR > 20% over the last 5 min, HR variability if the maximum HR variability > 10% over the last 4 readings, low body temperature if BT < 35 °C, high body temperature if BT > 38 °C, low respiration rate if RR < 5 rpm, low battery voltage if VBAT < 1.9 V, and a low RSSI value if the measured RSSI < 30%.
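The instantaneous threshold rules above can be sketched as a single check over one set of readings. This is an illustrative sketch of the rule set, not the Server's implementation; the two history-based rules (HR arrhythmia over 5 min and HR variability over the last 4 readings) need stored readings and are omitted here.

```python
def detect_alerts(spo2, hr, bt, rr, vbat, rssi):
    """Apply the instantaneous alert thresholds to one reading set.

    Units follow the text: SpO2 and RSSI in %, HR in bpm, BT in degrees
    Celsius, RR in rpm, VBAT in volts. Returns the triggered alerts.
    """
    alerts = []
    if spo2 < 93:
        alerts.append('low SpO2')
    if hr < 40:
        alerts.append('bradycardia')
    if hr > 150:
        alerts.append('tachycardia')
    if bt < 35.0:
        alerts.append('low body temperature')
    if bt > 38.0:
        alerts.append('high body temperature')
    if rr < 5:
        alerts.append('low respiration rate')
    if vbat < 1.9:
        alerts.append('low battery voltage')
    if rssi < 30:
        alerts.append('low RSSI')
    return alerts
```

Note that the last two rules flag device problems (a dying battery, a node out of radio range) rather than medical emergencies, so a caretaker application would typically present them differently.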
Figure 17 illustrates the data values provided by the sensors mounted on the patient and recorded in the Server database during the testing of the system on hospitalized patients.
2.2.2. Software component of Server Subsystem
The Server Subsystem is a conventional desktop PC operating as a dispatcher with many functions: it manages the patients' databases (the keywords database, the personal data and medical records of the patients, and the data values of the monitored parameters); it receives, organizes, and processes the data values of the patients' telemonitored physiological parameters and sends this information to the caretaker devices (Smartphone); it records and organizes the evidence and history of the patients' conversations with the caretakers; and it detects alarming situations and alerts whoever may be concerned, among other functions. All of these features are available by running the Patient WEB Application, implemented at the Server level.
Sensitive data are encrypted by the Server Subsystem before being sent to the system clients, so the communication channel is protected regardless of whether the HTTP or HTTPS protocol is used. Medical data are completely anonymized and encapsulated into customized system objects so that nobody can access the medical data of a given patient. Even if the communication system is hacked, the data are unreadable and cannot be linked to any given patient, because the patient's name is never sent out by the Server; all medical data communicated in the system are assigned to a Globally Unique Identifier (GUID). Only system members with a specific account and password can access the data. Each user has specific rights and can access only the data for which s/he has received rights from the system administrator.
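The GUID-based anonymization step can be sketched as follows. The record field names here are hypothetical, and the paper does not specify this exact procedure; the sketch only illustrates the stated principle that identifying fields never leave the Server, while medical values travel under a GUID.

```python
import uuid

def anonymize(record, identifying_fields=('name',)):
    """Strip identifying fields and attach a GUID before transmission.

    The Server would keep the GUID-to-patient mapping privately, so
    intercepted payloads cannot be linked to a patient. Field names
    are illustrative, not the system's actual schema.
    """
    guid = str(uuid.uuid4())
    payload = {k: v for k, v in record.items() if k not in identifying_fields}
    payload['guid'] = guid
    return guid, payload
```

In a real deployment the payload would additionally be encrypted, as described above; anonymization and encryption are complementary, since the GUID protects identity even if the ciphertext is broken.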
Since the system uses an Internet connection, the number of patient and caretaker devices that can be connected in the network is unlimited, and these devices can be placed at any distance (in different hospital rooms). All of the system components are controlled by the Server through the Internet. As a consequence, system operation (installation, configuration, and administration) is simplified. Since the Patient Subsystem must be periodically adapted and configured to the patient's needs, the remote administration of these settings represents an important advantage.
The Patient WEB Application deals with both patient interaction and medical data management (Figure 12). The "Patient inputs and dialogs" block is detailed in Figure 13, where the tasks of each input device are illustrated.
The system uses the new SignalR technology [70], which makes the dialogue between the browser and the Server very easy to implement [39].
PWA has many benefits: it does not require installation on the patient's device, it does not require synchronization, and it is easy to maintain.
The home page of the Patient WEB Application is shown in Figure 14. PWA can be accessed at http://siact.rms.ro. By accessing this link, PWA can be customized and run by each patient in the network according to their needs and medical recommendations.
Figure 12. Overview of database responsibilities.
Figure 13. Overview of tasks for each device.
Figure 14. Home page of the Patient WEB Application.
In the following, the structure and main functions of this WEB application are presented in detail. The first task of the application is to define the organizational framework in which the assistive system can be used.
The Patient WEB Application is designed to be flexible and easily adaptable to the specificities of each type of patient, depending on the type and degree of disability, and uses switch-based or eye-tracking-based communication modes, as presented previously. For each patient, the personal and medical data are loaded, and the structure and content of the keyword/ideogram database used for communication are built. PWA sustains the dialogue between the patient and caretaker by building the patient's message and the conversation page. At the same time, the physiological parameters of the patients are monitored in real time. When the normal values of these parameters are exceeded, the application alerts the supervisor by means of a message accompanied by a beep that is received on the supervisor's device, represented here by a Smartphone.
The system administrator assigns one of the following roles to users: admin, doctor, patient, or nurse. Depending on the assigned role, each user has specific access rights.
The communication function of PWA is based on keywords technology, as previously described. Ideograms accompanied by keywords are intuitive images that characterize a particular state of the patient. The patient can build simple sentences to express a certain need or state by putting the ideograms in a logical sequence. This action is shown in Figures 6 and 7.
The database of ideograms and/or keywords is organized hierarchically as logical trees, which can be customized for each patient. There is a root ideogram to which child ideograms are attached, and child ideograms may expand the idea started from the root. Additional levels of ideograms lead to an accurate description of the patient's state or need.
The ideograms used by the system must be personalized for each patient because patients have different pathologies and needs. Each patient has a personal account in the application. Thus, the application has the ability to introduce or remove ideograms from the system database. An example of such a logical tree is illustrated in Figure 15.
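The hierarchical ideogram organization described above can be sketched as a simple tree structure. The class and keyword names are illustrative, not the PWA's actual data model; the sketch shows how a root-to-node path yields the logical keyword sequence that forms a patient's message.

```python
class Ideogram:
    """Node in a hierarchical keyword/ideogram tree.

    Each patient gets a customized tree: a root ideogram with child
    ideograms that progressively refine the expressed state or need.
    """
    def __init__(self, keyword):
        self.keyword = keyword
        self.children = []

    def add(self, keyword):
        """Attach and return a child ideogram."""
        child = Ideogram(keyword)
        self.children.append(child)
        return child

    def path(self, keyword, trail=()):
        """Depth-first search for `keyword`.

        Returns the root-to-node keyword sequence (the logical
        sequence forming the patient's message), or None if absent.
        """
        trail = trail + (self.keyword,)
        if self.keyword == keyword:
            return list(trail)
        for child in self.children:
            found = child.path(keyword, trail)
            if found:
                return found
        return None
```

Personalizing a patient's vocabulary then amounts to adding or removing subtrees in this structure, which matches the application's ability to introduce or remove ideograms per account.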
Figure 15. Building a tree with several hierarchical levels in the Patient WEB Application.
Another function of PWA consists of telemonitoring the patient's physiological parameters.
The list of sensors attached to the patient is editable at any time; thus, it is possible to add new types of sensors for a particular patient. All sensors are defined in the "Configuration/Sensor Types" page of the application, illustrated in Figure 16.
The application exposes a series of web services to which the Patient WEB Application can connect in order to transmit to the Server the values recorded by the sensors mounted on the patient. These values are loaded into the system asynchronously, so the services must be active 24 hours a day. All of the data values received by the Server can be viewed in the "Configuration/Sensor Values" page of the application, illustrated in Figure 17.
The system administrator can view all dialogues between the patient and the caretakers using the "Dialogs" page of the application, illustrated in Figure 18.
Figure 16. Defining the types of sensors used in the Patient WEB Application.
Figure 17. Data values provided by the sensors mounted on the patient and recorded in the Server database.
Figure 18. Dialogues between patients and caretakers recorded in the Server database.
Thus, the evolution of the patient's condition and of the medical and care services during hospitalization can be monitored. Messages cannot be erased from the database by supervisors (nurses, medical staff); only the administrator has deletion rights.
2.2.3. Software component of Caretaker Subsystem
The Caretaker Subsystem consists of a Smartphone (or tablet PC), as illustrated in Figure 1, used for communication with patients via the Server and for real-time monitoring of the patient's physiological parameters. The software application is written in Java for Android-enabled mobile devices using the JSON protocol, and it provides the following actions [40]:
• displays the keywords selected and transmitted by the patient in written or audio form;
• sends back the supervisor's response to the patient;
• displays the values of the patient's monitored physiological parameters;
• receives an alarm notification in audio and visual form, triggered by the Server, when the normal values of the patient's vital physiological parameters are exceeded;
• displays the time evolution of the physiological parameter values monitored during treatment.
The initiator of communication is always the Server Subsystem, which sends a notification when a patient has a need or a request. Every alarm notification includes the message ID, message priority, patient information, the patient's vital physiological parameters, the patient's message, and the caretaker/nurse's confirmation once at the patient's bedside [40]. The Server also sends notification alarms to the caretaker devices in case of emergency, when normal values of the vital physiological parameters are exceeded according to the alert detection algorithm implemented in the Server Subsystem.
Figure 19. Two-way communication protocols between the Caretaker device and the Server Subsystem [40].
The two-way communication protocols between Patient and Caretaker are as follows [40]:
a) Normal operation, when the caretaker reads and answers a notification alarm.
b) Special situation (1), when the caretaker does not answer after reading the message.
c) Special situation (2), when the caretaker does not read the notification alarm.
In cases (b) and (c), when the caretaker does not answer or read the notification within the scheduled time, the Server Subsystem automatically sends the message to another caretaker, deleting the one already sent to the first caretaker. This operating mode ensures that an emergency expressed at any time is definitely resolved by a nurse. All communications are logged on the Server for further analysis. The three communication protocols are illustrated in Figure 19 [40].
One of the main advantages of this application is that one caretaker can assist several patients by communicating with them and telemonitoring their physiological parameters; as a consequence, the cost of treatment can be reduced.
3. System Testing
The proposed assistive system was tested in laboratory conditions and in a hospital with different types of severely disabled patients.
3.1. System testing in laboratory conditions
First, the proposed system was tested in laboratory conditions with healthy individuals for both functions of the system: communication and telemonitoring of the physiological parameters.
3.1.1. Testing the communication function
The communication function was tested for the detection of all types of patient needs using switch-type sensors and eye-tracking interfaces. All switch-type sensors and both types of eye-tracking interfaces illustrated in Figures 2 and 3 were tested using PWA, which was personalized for detecting each type of patient need, as shown in paragraph 2.2.1.
The communication function based on the switch-type sensor was tested very easily on all subjects, and it was learned in a very short time. During the tests, the frequency of the cyclical highlighting of the ideograms on the user screen was varied in order to determine the optimal value, i.e., the value appropriate for most subjects. Obviously, the optimal value of this parameter depends on the type of switch sensor used and can be personalized according to the user's experience with the system.
In order to implement the communication function based on patient gaze detection, the following pupil detection algorithms were investigated and tested: LSFE, RANSAC, CHT, EHT, and PROJ (developed by the authors) and the open-source Starburst algorithm. The algorithms used by the eye-tracking interface were tested under different illumination conditions and for different operating scenarios specific to assistive technologies in order to implement the optimum software solution for the proposed system.
The accuracy of the analyzed pupil detection algorithms was tested on static eye images from different databases and on video eye images for real-time applications by using a new testing protocol developed for the scene image.
The subjects who tested the system in the real-time scenario were required to keep their head in a fixed position and look at the user screen placed approximately 60 cm away. In this case, the test consisted of moving the cursor by gaze direction over nine identical quadrants in the order shown in Figure 20a,b while maintaining the cursor position as stable as possible in each circular target area with a 50-pixel radius (illustrated in yellow in Figure 20) located in the center of each quadrant; the cursor was to be maintained on the target for a certain period of time before moving to the next quadrant. The stationary time in each target area of each quadrant represents the dwell time used for simulating a mouse click. The performance of each algorithm was evaluated using the detection rate at 50 pixels in the scene (user screen) image. The detection rate at 50 pixels represents the number of frames (images) for which the Euclidean distance between the detected cursor position in a quadrant on the user screen and the center of the target circle placed in that quadrant is less than or equal to 50 pixels. According to the experimental results, the highest detection rate (91%) was obtained with the CHT algorithm. Considering the trade-off between accuracy, noise sensitivity, running time, and user ability, the optimum solution for real-time applications is the PDA based on the circular Hough transform implemented in our system.
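The detection-rate figure of merit described above can be computed as follows; this is a direct sketch of the stated definition, expressed as a fraction of frames rather than a raw count.

```python
import numpy as np

def detection_rate(cursor_pts, target, radius=50):
    """Detection rate at `radius` pixels for one quadrant.

    cursor_pts: sequence of (x, y) cursor positions, one per frame.
    target:     (x, y) centre of the target circle in that quadrant.
    Returns the fraction of frames whose Euclidean distance to the
    target centre is less than or equal to `radius`.
    """
    pts = np.asarray(cursor_pts, float)
    d = np.hypot(pts[:, 0] - target[0], pts[:, 1] - target[1])
    return float((d <= radius).mean())
```

For example, with cursor samples at distances 0, 50, and about 141 pixels from the target, the detection rate at 50 pixels is 2/3, since the first two samples fall inside the target circle.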
The tests performed in laboratory conditions showed that subjects need significant learning time to gain the ability to use the eye-tracking system. It was also found that some people do not have the necessary skills to use such a system, as shown in [71].
Figure 20a,b illustrate the eye-tracking test results obtained with a non-experienced user and an experienced user, respectively.
3.1.2. Testing the telemonitoring function
The tests performed on the telemonitoring subsystem in laboratory conditions included two stages: assessing measurement accuracy and testing energy consumption.
The accuracy of temperature measurement was assessed using a high-precision medical thermometer as a reference. The reference thermometer was placed in a thermally insulated enclosure together with the body temperature monitoring device. The results, presented graphically in Figure 21a, show that the body temperature telemonitoring device measures the temperature with an error of less than ±0.15 °C, a tolerance suitable for medical measurements.
The accuracy of the oxygen saturation (SpO2) and heart rate (HR) measurements was evaluated using the METRON SpO2 Analyzer simulator, which is usually used for the high-precision testing of commercially available pulse oximeters. The results obtained are illustrated in Figure 21b and show that the heart rate is calculated with the maximum possible accuracy in the range of 30–250 bpm. Figure 21c shows that the SpO2 measurement error is less than or equal to ±2% for 80–99% SpO2, similar to that of a commercial pulse oximeter.
For the evaluation of the energy consumption and battery life of each telemonitoring device, we used the configuration presented in Figure 22.
Figure 20. Signals provided by the PDA on both axes of the coordinate system and cursor movement tracking on the user screen: (a) non-experienced user; (b) experienced user.
Figure 21. Testing the telemonitoring function: (a) accuracy of body temperature measurements (°C); (b) accuracy of HR measurements (bpm); (c) accuracy of SpO2 measurements (%).
Figure 22. Evaluation of energy consumption.
The power consumption of each telemonitoring device involves two components: the power consumed by the acquisition (measurement) of the signals (physiological parameters) and the power consumed by the radio transmission component (the eZ430-RF2500 wireless development kit). The transmission of the measured values to the Server Subsystem is performed at discrete time points (listed in Table 1) to optimize the power consumption of the radio module; after each transmission, the device is switched to sleep mode. In addition, the microcontroller of each sensor is switched to a low-power operation mode during periods of inactivity in order to preserve energy.
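The battery-life figures in Table 1 can be cross-checked with the ideal estimate of capacity divided by average current. This is only an upper-bound sketch: real batteries derate with temperature, self-discharge, and cut-off voltage, which is presumably why some of the measured figures in Table 1 are somewhat below the ideal value.

```python
def battery_life_days(capacity_mah, avg_current_ma):
    """Ideal battery life in days: capacity / average current / 24.

    An upper bound; measured lifetimes include battery derating and
    can therefore be lower than this estimate.
    """
    return capacity_mah / avg_current_ma / 24.0

# HR/SpO2 node from Table 1: 1250 mAh at 6.1 mA average
print(round(battery_life_days(1250, 6.1), 1))   # ~8.5 days
```

The HR/SpO2 device matches the ideal estimate almost exactly (8.5 days), while the respiratory-rate and body-temperature devices come in at or somewhat below it, consistent with the derating caveat above.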
Table 1. Data transmission rate, energy consumption, and battery life.

Telemonitoring device | Data transmission rate | Average current consumption | Battery capacity | Days of operation
HR and SpO2 | 1 measurement/10 s | 6.1 mA | 1250 mAh | 8.50
Respiratory rate | 1 measurement/10 s | 0.14 mA | 240 mAh | 71.0
Body temperature | 1 measurement/10 s | 0.20 mA | 240 mAh | 45.5

3.2. System testing in hospital
The end product is intended for use in hospitals (for acute cases), rehabilitation centers (for subacute and chronic cases), and nursing homes (for elderly patients with severe handicap and multiple concomitant diseases), as well as in the patient's home (for chronic patients), in order to enable individuals with chronic conditions to remain independent and not require a nursing home.
The prototype of the proposed assistive system was tested at the "Dr. C.I. Parhon" Clinical Hospital of Iași, Romania, on 27 patients hospitalized in the Clinic of "Geriatrics and Gerontology" in July 2017. The medical team involved in system testing comprised two doctors and two nurses.
The Ethics Board's approval of our study is documented in the Agreement of the Ethics Board no. 4/12.07.2017 of the "Dr. C.I. Parhon" Clinical Hospital of Iași, Romania.
The testing procedure consisted of the following steps:
1. Training the patient to use the system. Depending on the patient's condition, the patient learning time varied between 10 and 20 minutes.
2. Testing both functions of the system: communication using keywords technology and real-time telemonitoring of the patient's physiological parameters. Depending on the patient's experience and cooperation, this stage took between 20 and 30 minutes.
The system functionality tests were conducted independently for each patient, as the patients had different diseases, needs, capabilities, and learning times. One of the great advantages of the proposed system resides in its adaptability to very different patients' needs. Depending on the degree of disability, patients can communicate using different switch-type sensors (illustrated in Figure 2) or eye-tracking interfaces (illustrated in Figure 3). In addition, the list of ideograms/keywords used by the Patient WEB Application can be configured according to the patient's needs and level of understanding. Also, the telemonitored physiological parameters can be customized according to each patient's needs.
3.2.1. Participants
The mean age of the patients included in this study was 72.27 ± 8.23 years; the minimum age was 55 years and the maximum age was 89 years. The proportion of female patients was 76.9%. The most frequently associated pathologies were degenerative osteoarticular diseases (73.1%) and cardiovascular diseases (34.6%). Some patients had more than one condition leading to disability. All patients were able to complete a questionnaire regarding the usage of the new system. The relevant demographic details are described in Table 2.
Table 2. The demographic details of the study population.

Characteristic | Included (n = 27)
Age | 72.27 ± 8.23
Female | 20 (76.9%)
Cardiovascular diseases | 9 (34.6%)
Pulmonary diseases | 1 (3.8%)
Stroke | 3 (11.5%)
Other neurological diseases | 2 (7.7%)
Degenerative osteoarticular diseases | 19 (73.1%)
Amputations | 1 (3.8%)
The inclusion criteria were the following: (1) male and female hospitalized adults; (2) the presence of documented comorbidities leading to disability, such as neurodegenerative disorders and chronic severe diseases associated with a high degree of handicap (cardiovascular disorders: congestive heart failure, severe valvular disease, severe arterial disease; chronic pulmonary diseases; previous stroke or other neurological diseases associated with disability; amputations; etc.); (3) a preserved ability to answer questions regarding the system usage; (4) a cooperative attitude and the ability to use the system.
The exclusion criteria were the following: (1) refusal to sign the informed consent form; (2) the presence of a significant visual disorder; (3) the presence of documented severe neurocognitive impairment; (4) the presence of unstable major medical conditions; (5) the use of medication that could interfere with the patient's ability to test the system.
All patients included in the study population underwent a general medical evaluation, and comorbidities were identified from previous medical records or the patients' charts. The general medical evaluation was performed by the geriatric medical team that participated in the study. All of the patients read and signed the informed consent form before any measurements were taken. The study team informed the patients about the testing procedures and all activities, and the patients agreed to perform them for the purpose of testing the system.
3.2.2. Experimental sessions
In order to check all func tions of the proposed assistive system, three types of tests were carried 716
out on the patients, as shown by the testing plan detailed in Table 3. 717
The communication function based on a switch -type sensor was tested with all six type s of switch 718
sensors listed in Table 3. The sensor type was chosen according to the degree of disability of the 719
patients who tested the system. This mode of patient s’ needs detection is used for disabled pat ients 720
who can perform some controlled muscle contractions, such as mov ing a finger, forearm, or foot; 721
inspiration/expiration ; sip/puff (patients with partial paralysis, paraplegia). The test consist ed of 722
sending s everal messages ( composed by the testing team and includ ing keywords from different 723
Sensors 2019 , 19, x FOR PEER REVIEW 22 of 30
database hierarc hy levels ) to the supervisor using the switch -based communication mode of PWA. The 724
task was considered to be carried out when the patient’s request wa s received by the caretaker device 725
and the supervisor sent a response that was displayed on the patient ’s screen. This procedure 726
require d a minimum learning time (10 min s.). All patients w ith discernment were able to use this 727
type of communication. 728
The communication function based on the eye-tracking interface was tested with both types of eye-tracking interfaces illustrated in Table 3. This mode of detecting the patient's needs is used for severely disabled patients who cannot perform any controlled muscle contraction apart from eyeball movement and blinking (complete paralysis, tetraplegia, myopathies, amyotrophic lateral sclerosis, etc.). The test consisted of sending several messages (composed by the testing team) to the supervisor using the eye-tracking-based communication mode of the PWA. The task was considered to be carried out when the patient's request was received by the caretaker device and the supervisor sent a response that was displayed on the patient's screen. This procedure required a medium (15 min) or long (20 min) learning time depending on the type of eye-tracking interface used. Only patients with ocular disorders (strabismus) failed to successfully use the eye-tracking interface.
Table 3. Testing plan: tested hardware/software components of the Patient Subsystem, patients' diseases, and learning time depending on system function.

Tested system function: Communication
  Tested hardware components of the Patient Subsystem: Switch-type sensor: hand switch-click; pal pad switch; ribbon switch; wobble switch; foot switch; sip/puff breeze switch.
  Tested software components of the Patient Subsystem: PWA (keywords technology for switch-type sensor).
  Patients' diseases: Disabled patients who can perform some controlled muscle contractions, such as movement of a finger, forearm, or foot; inspiration/expiration; sip/puff, etc. (partial paralysis, paraplegia, amputations, congestive heart failure, degenerative osteoarticular diseases, stroke, chronic respiratory failure, severe COPD, recovery after major surgery, etc.).
  Patients' learning time: Minimum (10 min).

Tested system function: Communication
  Tested hardware components of the Patient Subsystem: Eye-tracking interface: head-mounted device; remote device.
  Tested software components of the Patient Subsystem: PWA (keywords technology for eye-tracking interface); eye-tracking algorithm (including virtual keyboard and Internet/e-mail browser).
  Patients' diseases: Severely disabled patients who cannot perform any controlled muscle contractions apart from eyeball movement and blinking (complete paralysis, tetraplegia, myopathies, amyotrophic lateral sclerosis, stroke, severe degenerative osteoarticular disease, severe arthritis in the shoulder, amputations, etc.).
  Patients' learning time: Medium or long (15–20 min).

Tested system function: Telemonitoring
  Tested hardware components of the Patient Subsystem: Wireless body area network: oxygen saturation; heart rate; respiratory rate; body temperature.
  Tested software components of the Patient Subsystem: SimpliciTI protocol; GUI for telemonitoring the physiological parameters.
  Patients' diseases: All patients.
  Patients' learning time: –
The telemonitoring function was tested by all patients. The test consisted of using a wireless network of medical devices (wearable sensors) mounted on the patient's body for the acquisition of the following physiological parameters: blood oxygen saturation, heart rate, respiratory rate, and body temperature. Data values provided by the sensors are transmitted by the patient's device to the Server, where they are recorded and can be viewed with the PWA (Figure 17). As needed, the system can also monitor other physiological parameters, such as blood pressure (BP) and galvanic skin response (GSR).

The hospital testing of the telemonitoring function consisted of tracking the acquisition and correct transmission of the measured values to the Server database and from the Server to the caretaker's mobile device. The values recorded in the database (Figure 17) were compared with those registered by the hospital equipment, and a 98% similarity was obtained.
The alerting of the supervisor in special situations, when the normal values of the monitored parameters were exceeded, was also tested using the alert detection algorithm. This procedure does not require any patient training, so the learning time is not applicable.
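The alert detection step can be illustrated with a minimal sketch. A simple range check per monitored parameter is assumed here; the parameter names and normal ranges below are illustrative placeholders, not the thresholds used by the actual system.

```python
# Illustrative normal ranges (placeholders, not the system's thresholds).
NORMAL_RANGES = {
    "spo2": (94, 100),            # blood oxygen saturation, %
    "heart_rate": (60, 100),      # beats per minute
    "resp_rate": (12, 20),        # breaths per minute
    "temperature": (36.1, 37.5),  # degrees Celsius
}

def detect_alerts(sample):
    """Return the list of parameters whose values fall outside their
    normal range, for which the supervisor should be alerted."""
    alerts = []
    for name, value in sample.items():
        low, high = NORMAL_RANGES[name]
        if not (low <= value <= high):
            alerts.append(name)
    return alerts
```

In the deployed system such a check would run on the Server each time a new set of sensor values arrives, and a non-empty result would trigger a notification to the caretaker device.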
Regardless of the test procedure stage, a test was immediately stopped if requested by a patient, a member of the medical staff, or a legal guardian. All actions undertaken by the team performing system testing were done only with the patient's consent and for their benefit. During the tests, the confidentiality of the patient's identity was respected by anonymizing personal data.
4. Results and Discussion

To assess the functionality of the proposed system, the patients who participated in testing answered a questionnaire that focused on its necessity, usefulness, operating modes, flexibility, and adaptability to the patient's needs.
The questionnaire prepared by the testing team included the following questions for patients:

1. Do you consider that the system responds to your needs for communication with the supervisor/others?
2. How do you evaluate the operating modes of the system?
3. How do you evaluate the two functions of the system (communication and telemonitoring of physiological parameters)?
4. Do you think that the keywords that run on the screen respond to your needs?
5. How do you evaluate the graphics used by the system?
6. Do you consider the hierarchically organized database useful?
7. How do you rate the selection of keywords/ideograms by using the eye-tracking technique?
8. How do you rate the selection of keywords/ideograms with the switch-type sensor?
9. Do you think the ideograms that accompany the keywords are suggestive?
10. Do you consider that you were able to communicate effectively with the supervisor during the tests?
11. Do you consider the telemonitoring of the physiological parameters useful?
12. Do you think the system is flexible enough to be quickly reconfigured to your needs (the frequency of highlighting the ideograms/keywords, modifying the dwell time, changing databases to enter or remove keywords, changing icons or sounds that accompany keyword displaying)?
13. Do you consider the system tested to be useful in the hospital/at home?
14. What was the level of fatigue experienced during testing?
15. Did you feel physical discomfort due to the different modules of the system you tested?
For questions 1–13, the patients' responses were scored on a scale from 0 (strongly disagree) to 5 (strongly agree). The answers to questions 14 and 15 were measured on a scale from 0 (the highest level of fatigue/discomfort) to 5 (no fatigue/discomfort). The value corresponding to the patient's evaluation was recorded in the response table. The average scores of the patients' responses to each question are illustrated in Figure 23.

Figure 24a–f shows images taken during the testing of the system with hospitalized patients at "Dr. C.I. Parhon" Clinical Hospital of Iași, Romania.
Figure 23. Testing scores obtained by the proposed system at "Dr. C.I. Parhon" Clinical Hospital of Iași, Romania.
The communication rate depends on the type of communication used by the system (using the switch-type sensor or the eye-tracking interface) and the patient's physical condition. It varies greatly from patient to patient and also evolves over the patient's training period.

For the communication function based on the switch-type sensor, the communication rate depends on the cyclical highlighting speed of the ideograms, with the speed varying between 1 ideogram/second and 1 ideogram/10 seconds according to the patient's ability. On the other hand, the duration of the selection depends on the ideogram's position in the scene image and in the keywords database hierarchy. As a result, for a keywords database with several levels of hierarchy, the duration of ideogram selection can vary from a range of 1–4 seconds to a range of 80–120 seconds. Of the total number of patients, 15 participants opted for a highlighting speed of 1 ideogram/3 seconds. They succeeded in making the selection in the first scroll cycle. Of the patients who failed to make the selection on the first attempt, 9 patients opted for a highlighting speed of 1 ideogram/5 seconds and 3 patients selected 1 ideogram/10 seconds. Only one participant (3.7% of the total) was unable to complete the task because of difficulties in understanding.
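The selection-time figures above follow directly from the scanning parameters: at each level of the keyword hierarchy, the highlight must step through the ideograms preceding the target before the switch is pressed. This can be sketched with the hypothetical helper below (an illustration of the arithmetic, not part of the system's software):

```python
def selection_time(period_s, positions):
    """Rough worst-case scanning selection time for single-switch
    scanning: at each hierarchy level, the highlight steps through
    `position` ideograms (1-based position of the target) at
    `period_s` seconds per ideogram before the switch is pressed."""
    return period_s * sum(positions)
```

A first-position ideogram in a one-level hierarchy at 1 s/ideogram takes about 1 s, while a deeply nested keyword at the slow 10 s/ideogram setting can reach the 80–120 s range reported above.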
For the communication function based on the eye-tracking interface, the communication rate depends on the dwell time. According to Päivi Majaranta, expert eye typists require only a very short dwell time (e.g., 300 ms), while novices may prefer a longer dwell time (e.g., 1000 ms) to give them more time to think, react, and cancel the action [72]. Experimental results have shown that disabled patients also prefer to use a long dwell time [73]. For this reason, in our tests, the dwell time varied between 1 and 3 seconds depending on the user's ability to use the system. This range of values was experimentally determined according to the needs of the patients who tested the system and is in line with the results reported in [73] and [74]. In [73], it is shown that the dwell time threshold for people with disabilities depends on the patient's physical condition; the same dwell time may be "short" for one user and "long" for another. Thus, the system must be flexible in order to accommodate a wider range of dwell time values and the different needs of users with various disabilities. Of the total number of patients, 3 participants (11.1% of the total) were unable to complete the calibration stage successfully and, as a consequence, could not test this function of the system. Two other participants (7.4% of the total) could not use the head-mounted eye-tracking interface because it was too difficult for them to keep their head in a fixed position. Of the patients who tested the communication function based on the eye-tracking interface, 3 participants opted for a dwell time of 1 second, 5 opted for 2 seconds, and 16 opted for 3 seconds. Compared with other commercial systems, ours has the advantage of being able to change this time according to the user's experience in using the system. On the other hand, patients with stroke and neurological diseases had difficulty in using the eye-tracking interface.
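The dwell-time selection principle discussed above can be sketched in a few lines. The gaze sampling period and the representation of gaze samples as target identifiers are illustrative assumptions, not details of the PWA implementation:

```python
def dwell_select(gaze_targets, dwell_time_s, sample_period_s=0.05):
    """Return the target selected once gaze has rested on it for
    `dwell_time_s` seconds, or None if no selection occurs.
    `gaze_targets` lists the target hit by each successive gaze
    sample (None = gaze not on any selectable target)."""
    needed = round(dwell_time_s / sample_period_s)  # samples required
    current, count = None, 0
    for target in gaze_targets:
        if target is not None and target == current:
            count += 1  # gaze still on the same target
        else:
            current = target  # gaze moved: restart the dwell timer
            count = 0 if target is None else 1
        if current is not None and count >= needed:
            return current
    return None
```

Lengthening `dwell_time_s` (as the patients in our tests preferred) trades communication rate for fewer accidental selections, since brief glances no longer accumulate enough consecutive samples to trigger a selection.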
Furthermore, the communication rate through eye tracking also depends on the running time of the pupil detection algorithm used in the system. To be suitable for real-time applications, the algorithm must run at more than 10 frames/second. The PDA based on the circular Hough transform used in our system runs at 20 frames/second; thus, it is a user-friendly algorithm suitable for real-time applications.
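The cited PDA implementation [36] is not reproduced here, but the voting principle of the circular Hough transform can be sketched compactly. The single known radius, the synthetic pupil contour, and the function name below are illustrative assumptions:

```python
import math
from collections import Counter

def hough_circle_center(edge_points, radius):
    """Minimal circular Hough transform for one known radius: every
    edge point votes for all candidate centres lying `radius` away,
    and the accumulator peak is the estimated circle centre."""
    votes = Counter()
    for y, x in edge_points:
        for deg in range(360):
            t = math.radians(deg)
            cy = round(y - radius * math.sin(t))
            cx = round(x - radius * math.cos(t))
            votes[(cy, cx)] += 1
    return votes.most_common(1)[0][0]

# Synthetic "pupil contour": 60 points on a circle of radius 10
# centred at row 30, column 40 of the image.
points = [(30 + 10 * math.sin(math.radians(6 * i)),
           40 + 10 * math.cos(math.radians(6 * i))) for i in range(60)]
center = hough_circle_center(points, radius=10)
```

A production implementation searches over a range of radii and restricts the vote to gradient directions, which is what makes the 20 frames/second figure achievable on full video frames.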
The system usability was assessed by the testing score obtained by the system (presented in Figure 23) and by the task success rate (Table 4).
Figure 24. System testing on hospitalized patients at "Dr. C.I. Parhon" Clinical Hospital of Iași, Romania. Testing the communication function based on a switch-type sensor (a), (b); using the head-mounted eye-tracking interface (c); using the remote eye-tracking interface (d); testing the telemonitoring function by measuring the blood oxygen saturation and heart rate values and transmitting these values to the caretaker device (e); training the patients to use the system (f).
Table 4. Task success rate.

                     Communication function    Communication function based on    Telemonitoring
                     based on switch-type      eye-tracking interface             function
                     sensor                    head-mounted    remote device
Task success rate    96.3%                     81.5%           88.9%              98%
In the evaluation of system performance, 10 participants gave the maximum score (5), and the lowest result was 3.67. Thus, the overall mean score was 4.74, with a standard deviation of 0.38.

For the communication function, the task success rate represents the percentage of the patients who successfully accomplished their task.

For the telemonitoring function, the task success rate represents the ability of the system to correctly transmit the measured values to the Server database and from the Server to the mobile device of the caretaker. The transmitted values were compared with those registered with the hospital equipment, and the obtained similarity (expressed as a percentage) represents the task success rate of the telemonitoring function.
5. Conclusions

The assistive system proposed in this paper was developed as a functional and testable prototype that enables two-way communication with severely disabled patients while simultaneously telemonitoring the patients' vital physiological parameters.

The proposed assistive solution contributes to the development of techniques and technologies for monitoring and communicating with severely disabled patients. It provides a new communication method for people lacking normal communication abilities, offering them adequate procedures and equipment based on keywords technology. The system also contributes to the development of techniques and technologies for telemonitoring the vital physiological parameters of disabled patients.

Furthermore, the proposed assistive system contributes to the acquisition of new knowledge about disabled people's contact behavior and communication with intelligent machines through a computer-human interface (a difficult field of study with a clearly interdisciplinary nature) and through a combination of proper ergonomic design, communication techniques and technologies, and postoperative care aspects.
The proposed assistive system presents a series of apparent advantages compared with other similar ones: (1) the innovative principles of two-way communication between severely disabled patients and caretakers/medical staff by gaze detection techniques based on video-oculography; (2) the possibility of investigating patients with severe disabilities who cannot communicate with other people by means of classical communication (spoken, written, or by signs) by using the communication technique that makes use of keywords/ideograms organized in several hierarchical levels; (3) the ability to implement advanced communication by alphanumeric characters, thus allowing severely disabled patients to use the Internet and e-mail; (4) the ability to dynamically adapt both the software and hardware structure according to the patient's needs, the evolution of their condition, and medical recommendations; (5) the permanent monitoring of several physiological parameters, their analysis, and the alerting of caretakers in emergency situations, all of which are carried out using the same resources as those used for communication; (6) the increased accountability of medical personnel as a result of logging caretaker interventions; (7) lower caretaking costs due to fewer caretakers needed to assist a larger number of patients.
Many attempts and tests (both in the laboratory and on patients) were performed in order to obtain an optimal solution for the prototype. The final tests of the prototype were performed at "Dr. C.I. Parhon" Clinical Hospital of Iași, Romania, with patients from the Geriatrics Clinic.

Although the number of patients included in prototype testing was small, we consider them highly representative of the population of Romanian seniors with severe disabilities and low quality of life.
The test results demonstrate the system's good performance in both of its primary functions: communication with disabled patients and telemonitoring their physiological parameters. Both the medical staff and the patients involved in system testing positively evaluated its functions, ease of use, and adaptability to the needs of patients, highlighting its utility in a modern medical system.

The main benefits of our assistive system conferred to disabled people include the patients' improved health and wellbeing, their faster reinsertion into society, an increase in the quality of medical services, a decrease in patients' expenses, and an increase in the number of medical services afforded by ambulatory medical care.

The results obtained make us confident enough to continue this study in other medical facilities that care for people with severe disabilities, particularly patients with neuromotor disabilities.
Supplementary Materials: The data on medical records of patients, the patient information form, and the patient consent form used to support the findings of this study are restricted by the Agreement of the Ethics Board no. 4/12.07.2017 of "Dr. C.I. Parhon" Clinical Hospital of Iași, Romania, in order to protect patient privacy. For researchers who meet the criteria for access to confidential data, more information is available upon request from Prof. Ioana Dana Alexa, MD, PhD, Senior Specialist in Internal Medicine and Geriatrics, Head of the Geriatrics Department, "Grigore T. Popa" University of Medicine and Pharmacy, "Dr. C.I. Parhon" Clinical Hospital of Iași, Romania (e-mail: ioana.b.alexa@gmail.com).

Author Contributions: Conceptualization, Radu Gabriel Bozomitu; Funding acquisition, Radu Gabriel Bozomitu; Investigation, Radu Gabriel Bozomitu, Lucian Niță, Ioana Dana Alexa, Adina Carmen Ilie and Cristian Rotariu; Methodology, Radu Gabriel Bozomitu and Vlad Cehan; Project administration, Radu Gabriel Bozomitu; Resources, Radu Gabriel Bozomitu, Lucian Niță, Ioana Dana Alexa, Adina Carmen Ilie and Cristian Rotariu; Software, Radu Gabriel Bozomitu, Lucian Niță, Alexandru Păsărică and Cristian Rotariu; Supervision, Radu Gabriel Bozomitu; Validation, Radu Gabriel Bozomitu, Ioana Dana Alexa, Adina Carmen Ilie, Alexandru Păsărică and Cristian Rotariu; Writing—original draft, Radu Gabriel Bozomitu, Lucian Niță, Vlad Cehan, Ioana Dana Alexa, Adina Carmen Ilie, Alexandru Păsărică and Cristian Rotariu; Writing—review & editing, Radu Gabriel Bozomitu.

Funding: The work was carried out within the program Joint Applied Research Projects, funded by the Romanian National Authority for Scientific Research (MEN – UEFISCDI), contract PN-II-PT-PCCA-2013-4-0761, no. 21/2014 (SIACT).

Conflicts of Interest: The authors declare no conflict of interest.
References 922
1. Hussey ACS, Cook S. Assistive Technologies: Principles and Practice. Baltimore: Mosby. 2002 . 923
2. Bryant DP, Bryant BR. Assistive technology for people with disabilities. Allyn and Bacon Boston; 2003. 924
3. Varshney U. Perva sive healthcare computing: EMR/EHR, wireless and health monitoring. Springer Science 925
& Business Media; 2009. 926
4. Xiao Y, Chen H. Mobile telemedicine: a computing and networking perspective. Auerbach Publicati ons; 927
2008. 928
5. Bronzino JD. Biomedical Enginee ring and Instrumentation. PWS Publishing Co.; 1986. 929
6. Bronzino JD. Biomedical engineering handbook. Vol. 2. CRC press; 1999. 930
7. Dawant BM, others. Knowledge -based systems for intelligent patient monitorin g and management in 931
critical care environments. Biomedical Engineering Handbook CRC Press Ltd. 2000;208. 932
8. Scherer M. Living in the State of Stuck: How Technology Impacts the Lives of People with Disabilities, 933
Cambridge, MA, 2000: Retrieved June 7, 2005 . 934
9. Castinié F, others. The UR -Safe project: A multidisciplinary approach for a fully “nomad” care for patients. 935
Invited paper for SETIT. 2003;17 –21. 936
10. Malan D, Fulford -Jones T, Welsh M, Moulton S. Codeblue: An ad hoc sensor network infrastructure f or 937
emergency medical care. In: International wo rkshop on wearable and implantable body sensor networks. 938
Boston, MA; 2004. 939
11. Kropp A. Wireless communication for medical applications: the HEARTS experience. Journal of 940
Telecommunications and Information T echnology. 2005;40 –41. 941
Sensors 2019 , 19, x FOR PEER REVIEW 28 of 30
12. Rubel P, Fayn J, N ollo G, Assanelli D, Li B, Restier L, et al. Toward personal eHealth in cardiology. Results 942
from the EPI -MEDICS telemedicine project. Journal of electrocardiology. 2005;38(4):100 –106. 943
13. Kakria P, Tripathi N, Kitipawang P. A real -time health monitoring system for remote cardiac patients using 944
smartphone and wearable sensors. International journal of telemedicine and applications. 2015;2015:8. 945
14. Bishop J. Supporting communication between people with social orientation impairments usi ng affective 946
compu ting technologies: Rethinking the autism spectrum. In: Assistive technologies for physical and 947
cognitive disabilities. IGI Global; 2015. p. 42 –55. 948
15. Bowes A, Dawson A, Greasley -Adams C. Literature review: the cost effectiveness of ass istive technology 949
in supporting people with dementia. 2013 . 950
16. Canal G, Escalera S, Angulo C. A real -time human -robot interaction system based on gestures for assistive 951
scenarios. Computer Vision and Image Understanding. 2016;149:65 –77. 952
17. Liu KC, Wu CH, Tseng SY, Tsa i YT. Voice Helper: A Mobile Assistive System for Visually Impaired 953
Persons. In: 2015 IEEE International Conference on Computer and Information Technology; Ubiquitous 954
Computing and Communications; Dependable, Autonomic and Secure Computin g; Pervasive Intel ligence 955
and Computing. 2015. p. 1400 –5. 956
18. Hersh M, Johnson MA. Assistive technology for visually impaired and blind people. Springer Science & 957
Business Media; 2010. 958
19. Yang C -H, Huang H -C, Chuang L -Y, Yang C -H. A mobile communicati on aid system for persons with 959
physical disabilities. Mathematical and computer modelling. 2008;47(3 –4):318–327. 960
20. Tsun MTK, Theng LB, Jo HS, Hui PTH. Robotics for assisting children with physical and cognitive 961
disabilities. In: Assistive Technologies for Physical and C ognitive Disabilities. IGI Global; 2015. p. 78 –120. 962
21. Constantinescu V, Matei D, Costache V, Cuciureanu D, Arsenescu -Georgescu C. Linear and nonlinear 963
parameters of heart rate variability in ischemic stroke patients. Neurologia i neur ochirurgia polska. 964
2018;52(2):194 –206. 965
22. Cristina LM, Matei D, Ignat B, Popescu CD. Mirror therapy enhances upper extremity motor recovery in 966
stroke patients. Acta Neurologica Belgica. 2015;115(4):597 –603. 967
23. Montaño -Murillo R, Posada -Gómez R, Martí nez-Sibaja A, Gonz alez-Sanchez B, Aguilar -Lasserre A, 968
Cornelio -Martínez P. Design and assessment of a remote vibrotactile biofeedback system for neuromotor 969
rehabilitation using active markers. Procedia Technology. 2013;7:96 –102. 970
24. World Health Organiza tion 2010. 2010 Op portunities and developments | Report on the second global 971
survey on eHealth | Global Observatory for eHealth series – Volume 2: TELEMEDICINE. 2011 Jan. 972
25. Evans J, Papadopoulos A, Silvers CT, Charness N, Boot WR, Schlachta -Fairchild L , et al. Remote he alth 973
monitoring for older adults and those with heart failure: adherence and system usability. Telemedicine and 974
e-Health. 2016;22(6):480 –488. 975
26. Kumpusch H, Hayn D, Kreiner K, Falgenhauer M, Mor J, Schreier G. A mobile phone based tele monitoring 976
concept for the simultaneous acquisition of biosignals physiological parameters. Studies in health 977
technology and informatics. 2010;160(Pt 2):1344 –1348. 978
27. Pino A. Augmentative and Alternative Communication systems for the motor disabled. In: Disability 979
Inform atics and Web Accessibility for Motor Limitations. IGI Global; 2014. p. 105 –152. 980
28. Pinheiro CG, Naves EL, Pino P, Losson E, Andrade AO, Bourhis G. Alternative communication systems 981
for people with severe motor disabilities: a survey. Biomedical enginee ring online. 2011;10(1):31. 982
29. AXIWI, Wireless communication system for disabled people [Internet]. Available from: 983
https://www.axiwi.com/wireless -communication -system -for-disabled -people/ 984
30. Origin Instruments, Assistive Technology for Computer and M obile Device Access [Internet]. Available 985
from: http://www.orin.com/ 986
31. EnableMart, Assistive Technology [Internet]. Available from: https://www.enablemart.com/ 987
32. Tobii Eye Trackers [Internet]. Available from: https://gaming.tobii.co m/products/ 988
33. Bozomitu RG. SIACT, Integrated system for assistance in communicating with and telemonitoring severe 989
neuromotor disabled people [Internet]. TUIASI; 2014 -2017. Available from: 990
http://telecom.etc.tuiasi.ro/telecom/staff/rbozomitu/SIACT/index .htm 991
34. Bozomitu RG. ASISTSYS, Integrated System of Assistance for Patients with Severe Neuromotor Affections 992
[Internet]. TUIASI; 2008 -2011. Available from: http://telecom.etc.tuiasi.ro/telecom/staff/rbozomitu/asistsys/ 993
35. Cehan V. TELPROT, Communication system with peo ple with major neuromotor disability [Internet]. 994
TUIASI; 2006 -2008. Available from: http://telecom.etc.tuiasi.ro/telprot/ 995
Sensors 2019 , 19, x FOR PEER REVIEW 29 of 30
36. Bozomitu RG, P ăsărică A, Cehan V, Lupu RG, Rotariu C, Coca E. Implementation of eye -tracking system 996
based on circular Hough trans form algorithm. In: 2015 E -Health and Bioengineering Conference (EHB). 997
2015. p. 1 –4. 998
37. Păsărică A, Bozomitu RG, Tărniceriu D, Andruseac G , Costin H, Rotariu C. Analysis of Eye Image 999
Segmentation Used in Eye Tracking Applications. Revue roumaine des sciences techniques. 2017;62(2):215 – 1000
22. 1001
38. Rotariu C, Bozomitu RG, Pasarica A, Arotaritei D, Costin H. Medical system based on wireless senso rs for 1002
real time remote monitoring of people with disabilities. In: 2017 E -Health and Bioengineering Conference 1003
(EHB). 2017. p. 753 –6. 1004
39. Nita L, Bozomitu RG, Lupu RG, Pasarica A, Rotariu C. Assistive communication system for patients with 1005
severe neurom otor disabilities. In: 2015 E -Health and Bioengineering Conference (EHB). 2015. p. 1 –4. 1006
40. Lupu RG, Bozomitu RG, Nita L, Romila A, Pasarica A, Arotaritei D, et al. Medical professional end -device 1007
applications on Android for interacting with neuromotor d isabled patients. In: 2015 E -Health and 1008
Bioengineering Conference (EHB). 2015. p. 1 –4. 1009
41. Al-Rahayfeh A, Faezipour M. Eye Tracking and Head Movement Detection: A State -of-Art Survey. IEEE 1010
Journal of Translational Engineering in Health and Medicine. 2013;1:2100212 –2100212. 1011
42. Duchowski AT. Eye tracking methodology. Theory and practice. 2007;32 8. 1012
43. Majaranta P, Räihä K -J. Twenty years of eye typing: systems and design issues. In: Proceedings of the 2002 1013
symposium on Eye tracking research & applications. ACM; 2002. p. 15 –22. 1014
44. Majaranta P, Bulling A. Eye tracking and eye -based human –compu ter interaction. In: Advances in 1015
physiological computing. Springer; 2014. p. 39 –65. 1016
45. Cuong NH, Hoang HT. Eye -gaze detection with a single WebCAM based on geometry features extraction. 1017
In: 2010 11th International Conference on Control Automation Roboti cs Vision. 2010. p. 2507 –12. 1018
46. eZ430 -RF2500 Development Tool User’s Guide [Internet]. Available from: 1019
http://www.ti.com/lit/ug/slau227f/slau227f.pdf 1020
47. MSP430F2274, 16 -bit Ultra -Low -Power Microcontroller, 32KB Flash, 1K RAM [Internet]. Available from : 1021
http://www.ti.com/lit/ds/symlink/msp430f2274.pdf 1022
48. CC2500, Low Cost, Low -Power 2.4 GHz RF Transceiver Designed for Low -Power Wireless Apps in the 2.4 1023
GHz ISM B [Internet]. Available from: http://www.ti.com/lit/ds/symlink/cc2500.pdf 1024
49. AFE4400 and AF E4490 Development Guide, Texas Instruments [Internet]. Available from: 1025
http://www.ti.com/lit/ug/slau480c/slau480c.pdf 1026
50. UFI Model 1132 Pneumotrace II [Internet]. Available from: 1027
http://www.ufiservingscience.com/model_1132.html 1028
51. TMP275 Temperature Se nsor, Texas Instruments [Internet]. Available from: 1029
http://www.ti.com/lit/ds/symlink/tmp275.pdf 1030
52. Li D, Winfield D, Parkhurst DJ. Starburst: A hybrid algorithm for video -based eye tracking combining 1031
feature -based and model -based approaches. In: 2005 IEE E Computer Society Conference on Computer 1032
Vision and Pattern Recognition (CVPR’05) – Workshops. 2005. p. 79 –79. 1033
53. Li D, Parkhurst JD. Starburst: A robust algorithm for video -based eye tracking. In: Proceedings of the IEEE 1034
Vision for Human -Computer Inte raction Workshop. 2005. 1035
54. Sheena D, Borah J. Compensation for some second order effects to improve eye position measurements. 1036
Eye movements: Cognition and visual perception. 1981;257 –268. 1037
55. Halır R, Flusser J. Numerically stable direct least square s fitting of ellipses. In: Proc 6th International 1038
Conference in Central Europe on Computer Graphics and Visualization WSCG. Citeseer; 1998. p. 125 –132. 1039
56. Fitzgibbon AW, Pilu M, Fisher RB. Direct least squares fitting of ellipses. In: Proceedings of 13t h 1040
International Conference on Pattern Recognition. Vienna; 1996. p. 253 –7. 1041
57. Świrski L, Bulling A, Dodgson N. Robust real -time pupil tracking in highly off -axis images. In: Proceedings 1042
of the Symposium on Eye Tracking Research and Applications. Santa B arbara, California: ACM; 2012. p. 1043
173–176. 1044
58. Fischler MA, Bolles RC. Ran dom sample consensus: a paradigm for model fitting with applications to 1045
image analysis and automated cartography. In: Readings in computer vision. Elsevier; 1987. p. 726 –740. 1046
59. Cherabit N, Chelali FZ, Djeradi A. Circular hough transform for iris locali zation. Science and Technology. 1047
2012;2(5):114 –121. 1048
Sensors 2019 , 19, x FOR PEER REVIEW 30 of 30
60. Rhody H. Lecture 10: Hough circle transform. Chester F. Carlson Center for Imaging Science, Rochester Institute of Technology. 2005.
61. Feng G-C, Yuen PC. Variance projection function and its application to eye detection for human face recognition. Pattern Recognition Letters. 1998;19(9):899–906.
62. Zhou Z-H, Geng X. Projection functions for eye detection. Pattern Recognition. 2004;37(5):1049–1056.
63. openEyes, Starburst [Internet]. Available from: http://thirtysixthspan.com/openEyes/software.html
64. Bozomitu RG, Păsărică A, Cehan V, Rotariu C, Coca E. Eye pupil detection using the least squares technique. In: 2016 39th International Spring Seminar on Electronics Technology (ISSE). Pilsen, Czech Republic; 2016. p. 439–442.
65. Bozomitu RG, Păsărică A, Lupu RG, Rotariu C, Coca E. Pupil detection algorithm based on RANSAC procedure. In: 2017 International Symposium on Signals, Circuits and Systems (ISSCS). 2017. p. 1–4.
66. Sato H, Abe K, Ohi S, Ohyama M. Automatic classification between involuntary and two types of voluntary blinks based on an image analysis. In: International Conference on Human-Computer Interaction. Springer; 2015. p. 140–149.
67. Olsson P. Real-time and offline filters for eye tracking [Master's Thesis]. Stockholm, Sweden; 2007.
68. McAuley J, Marsden C. Physiological and pathological tremors and rhythmic central motor control. Brain. 2000;123(8):1545–1567.
69. Introduction to SimpliciTI [Internet]. Available from: http://www.ti.com/lit/ml/swru130b/swru130b.pdf
70. Valdez G. SignalR: Building real-time web applications. Microsoft. 2012;17:12.
71. Singh H, Singh J. Human eye tracking and related issues: A review. International Journal of Scientific and Research Publications. 2012;2(9):1–9.
72. Majaranta P. Communication and text entry by gaze. In: Gaze Interaction and Applications of Eye Tracking: Advances in Assistive Technologies. IGI Global; 2012. p. 63–77.
73. Majaranta P, Räihä K-J. Text entry by gaze: Utilizing eye-tracking. Text Entry Systems: Mobility, Accessibility, Universality. 2007;175–187.
74. Hansen DW, Hansen JP. Eye typing with common cameras. In: Proceedings of the 2006 Symposium on Eye Tracking Research & Applications. ACM; 2006. p. 55.
© 2019 by the authors. Submitted for possible open access publication under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
