In Robot-Assisted Rehabilitation (RAR), accurate estimation of the patient's limb joint angles is critical for assessing therapy efficacy. In RAR, the use of classic motion capture systems (MOCAPs) (e.g., optical and electromagnetic) to estimate the Glenohumeral (GH) joint angles is hindered by the exoskeleton body, which causes occlusions and magnetic disturbances. Moreover, the exoskeleton posture does not accurately reflect the limb posture, as their kinematic models differ. To address these limitations in posture estimation, we propose installing the cameras of an optical marker-based MOCAP on the rehabilitation exoskeleton itself. The GH joint angles are then estimated by combining the estimated marker poses with the exoskeleton Forward Kinematics. Such a hybrid system prevents problems related to marker occlusions, reduced camera detection volume, and imprecise joint angle estimation due to the kinematic mismatch between the patient and exoskeleton models. This paper presents the formulation, simulation, and accuracy quantification of the proposed method with simulated human movements. In addition, a sensitivity analysis of the method's accuracy to marker position estimation errors, caused by system calibration errors and marker drifts, has been carried out. The results show that, even with significant errors in the marker position estimation, the method's accuracy is adequate for RAR.
The application of robotics and Virtual Reality (VR) to motor neurorehabilitation (Figure
Robotic and VR-based rehabilitation.
The assessment of (a) patient movement compliance with the prescribed exercises and (b) patient long-term improvement is critical when planning and evaluating the efficacy of RAR therapies. Obtaining the patient motion data needed for these assessments requires estimating the patient's posture (i.e., the joint angles of the limbs). Posture estimation methods must be practical and easy for the physician to set up, so that these assessments can become an integral part of the therapy.
Current methods for estimating patient posture are either cumbersome or not accurate enough for exoskeleton-based therapies. To overcome these limitations, we propose a method in which low-cost RGB-D cameras (which capture color and depth images) are installed directly on the exoskeleton and colored planar markers are attached to the patient's limb to estimate the angles of the GH joint, thereby overcoming the individual limitations of each of these systems.
Optical, electromagnetic, and inertial MOCAPs have been used in many rehabilitation scenarios for accurate posture estimation [
Optical marker-based systems (e.g., Optotrak, CODA, Vicon) are considered the most accurate for human motion capture [
Electromagnetic systems do not suffer from optical occlusions. However, they are easily perturbed by surrounding metallic objects (e.g., exoskeletal body) and electric/magnetic fields [
Inertial and Magnetic Measurement Systems are robust, practical, and economical for full-body human motion detection (upper limb tracking in [
In exoskeleton-based rehabilitation, the prevailing approach to estimating human limb joint angles (e.g., [
Recognizing the differences in the kinematic structures of the limb and exoskeleton, [
Reference [
Reference [
We remind the reader that the general context of this paper is the estimation of the GH joint angles.
According to our literature review, no MOCAPs have been developed for the specific scenario of exoskeleton-based rehabilitation. Even if current MOCAPs and the exoskeleton could be set up for simultaneous use (e.g., [
Exoskeleton-based posture estimations are limited in accuracy due to the kinematic mismatch between the limb and the exoskeleton [
The accuracy of the GH joint angle estimations provided by computational methods in [
In response to the limitations discussed in the estimation of patient joint angles in exoskeleton-based therapy (Sections
Occlusions, a major limitation of optical systems, are minimized.
The accuracy of joint angle estimation, a major weakness of exoskeleton-based systems, is improved.
This paper presents the implementation and assessment of our method using simulated human motion data. In addition, a sensitivity analysis of the accuracy of our method with respect to marker position estimation errors is carried out.
We have considered the following scenarios of application for the proposed method in the RAR domain:
Precise estimation of GH joint angles during rehabilitation or evaluation sessions involving analytic movements of the GH joint.
Acquisition of GH joint movement data, enabling the validation and improvement of other posture estimation methods without resorting to expensive redundant optical MOCAPs.
This section presents the problem of estimating the patient's GH joint angles during RAR of the GH joint using the proposed hybrid motion capture system (a detailed version of the problem definition is given in the Appendix). The problem can be stated as follows.
Patient: (a) the kinematic model (e.g., the Denavit-Hartenberg parameters [
Exoskeleton: (a) the kinematic model of the exoskeleton (
Marker-based optical motion capture system (
Components of the GH joint angle estimation system: (a) human kinematic model, (b) exoskeleton kinematic model, (c) marker-based optical motion capture system, and (d) hybrid GH joint angle estimation system.
This section discusses the main features of the kinematic models of the human limb and exoskeleton used for the posture estimation method.
The human kinematic model is denoted by
It can be easily implemented in robotic simulators and similar tools.
It is suitable for simulating human-robot interaction in real time [
The spherical model of the GH joint avoids limitations of other representations of this joint, such as the gimbal lock that occurs when using the model of three concurrent and orthogonal 1-DOF revolute joints [
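The gimbal-lock degeneracy mentioned above can be checked numerically: in a three-revolute-joint (Euler angle) model, when the middle rotation reaches 90 degrees, the first and third axes align and one degree of freedom is lost. A minimal sketch with numpy (the Z-Y-X convention here is illustrative, not the paper's joint parameterization):

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def euler_zyx(yaw, pitch, roll):
    """Rotation from Z-Y-X Euler angles: yaw about z, then pitch about y, then roll about x."""
    return rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

# At pitch = 90 deg, the yaw and roll axes coincide: only (yaw - roll)
# matters, so distinct angle triples produce the same rotation matrix.
R_a = euler_zyx(0.7, np.pi / 2, 0.2)
R_b = euler_zyx(0.5, np.pi / 2, 0.0)   # same yaw - roll = 0.5
```

Here `R_a` and `R_b` are identical despite different angle triples, which is exactly the singularity a spherical joint model sidesteps.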
The exoskeleton kinematic model is denoted by
The aim of the method is to estimate the GH joint angles with respect to (w.r.t.) a coordinate system (CS) attached to the scapuloclavicular system. Figure
Marker
Marker
(a) Schematic diagram of the hybrid GH joint angle estimation system and (b) high-level operation of the system.
Reference [
The cameras of the optical motion capture system
The cameras used in our system are low-cost. Commercial cameras with specifications similar to the ones simulated here (Table
Vision sensor features.
Color camera resolution (px) 

Depth camera resolution (px) 

Field of view (deg.)  Horizontal = 45; vertical = 45 
Minimum sensing distance (meters)  0.05 
Maximum sensing distance (meters)  0.3 
Figure
A summary of the steps to estimate the GH joint angles is as follows:
Estimate the pose of the markers w.r.t. the cameras.
Estimate the pose of the cameras w.r.t. the exoskeleton.
Estimate the pose of the markers w.r.t. the exoskeleton.
Estimate the upper arm pose w.r.t. the exoskeleton.
Express the GH joint angles w.r.t. the acromion (marker
The details of these steps are presented in the following sections.
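Steps 1 to 3 above amount to composing rigid homogeneous transforms: the marker pose w.r.t. the exoskeleton is the product of the camera pose w.r.t. the exoskeleton and the marker pose w.r.t. the camera. A minimal numpy sketch (matrix names and numeric values are illustrative, not the paper's notation):

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and a translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
# Step 1: marker pose w.r.t. the camera (example: 90 deg about z, small offset).
T_marker_cam = make_transform(np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]]),
                              [0.1, 0.0, 0.2])
# Step 2: camera pose w.r.t. the exoskeleton (example: fixed mounting offset).
T_cam_exo = make_transform(np.eye(3), [0.0, 0.05, 0.3])
# Step 3: marker pose w.r.t. the exoskeleton is the composition of the two.
T_marker_exo = T_cam_exo @ T_marker_cam
```

The same chaining, with the exoskeleton Forward Kinematics supplying the camera poses, carries the marker poses into the exoskeleton CS for steps 4 and 5.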
The purpose of this step is to estimate the position and orientation of the markers (Figure
The RGB image is
The depth image associated with the scene in
Schematic diagram of the iterative estimation of the pose of the markers.
The pose estimation of the markers w.r.t. the cameras is based on the reconstruction of the 3D position of the colored disks on the markers. The following steps are taken to estimate the marker pose:
Estimation of disk coordinates in color image (Figure
Color segmentation in image
Blob extraction on image
Disk center coordinates estimation: for each
Estimation of disk coordinates in the camera
Convert the positions
Compute the indices
The point
The approximate marker disk centers detected by camera
Computation of the marker
Make
Use the four disk centers in the marker (Figure
The submatrix
Estimation of disk coordinates in the color image. (a) Simulated RGB image, (b) result of the color segmentation (zoomed image), and (c) result of the blob extraction (zoomed image).
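The reconstruction pipeline above can be sketched in two pieces: back-projecting a detected disk-center pixel into a 3D camera-frame point using the depth value and a pinhole model, and then building an orthonormal marker frame from the reconstructed disk centers. The intrinsic parameters and the frame convention below are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Pinhole back-projection of pixel (u, v) with depth z (meters) into the camera frame."""
    z = depth
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

def marker_frame(p0, px, py):
    """Orthonormal marker frame from three disk centers: origin at p0, x-axis
    toward px, y-axis toward py (orthogonalized by Gram-Schmidt), z-axis by
    cross product. Returned as a 4x4 homogeneous transform."""
    x = px - p0
    x = x / np.linalg.norm(x)
    y = py - p0
    y = y - np.dot(y, x) * x
    y = y / np.linalg.norm(y)
    z = np.cross(x, y)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, p0
    return T

# Example: the principal-point pixel at 0.2 m lies on the optical axis.
p0 = backproject(320, 240, 0.2, fx=500, fy=500, cx=320, cy=240)
T = marker_frame(p0, p0 + np.array([0.05, 0.0, 0.0]), p0 + np.array([0.0, 0.05, 0.0]))
```

The paper uses four disk centers per marker; a fourth point can serve as a redundancy check on the plane fit, which this three-point sketch omits.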
The goal of this step is to find the transformation
Schematic diagram of the iterative estimation of the pose of the cameras.
The rigid transformation matrices
The objective of this step is to estimate the transformation (
Schematic diagram of the iterative estimation of the pose of the markers w.r.t. the exoskeleton CS.
The purpose of this step is to estimate the upper arm pose (
Estimate the position of the GH joint center: the rigid transformation matrix
Estimate
Extract
Estimate the position of the elbow joint center: the rigid transformation matrix
Estimate
Extract
Estimate the upper arm position:
Estimate the arm direction vector as
Estimate the origin of the upper arm CS as
Estimate the upper arm orientation: the estimated orientation of the upper arm is computed using Euler angle
Estimate the rotation of the arm around the
Compute the rotation of the arm around the mobile
Estimate the rotation of the upper arm around its longitudinal axis
Express the pose of the upper arm w.r.t. the
Schematic diagram of the iterative estimation of the upper arm pose.
Coordinate systems for the upper arm pose estimation.
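The first part of the procedure above, computing the arm direction vector from the GH and elbow joint centers, can be sketched as follows. The "elevation" angle here (angle of the humeral axis from the downward vertical) is an illustrative proxy for one GH angle; the axial rotation of the humerus cannot be recovered from the two centers alone, which is why the paper derives it from the marker orientation:

```python
import numpy as np

def upper_arm_axis_and_elevation(p_gh, p_elbow):
    """Unit arm direction vector from the GH joint center to the elbow joint
    center, and the angle of that axis from the downward vertical (assumed
    -z in this sketch). The upper arm CS origin would sit at the GH center."""
    d = np.asarray(p_elbow, float) - np.asarray(p_gh, float)
    d = d / np.linalg.norm(d)                       # arm direction vector
    cos_e = np.clip(np.dot(d, [0.0, 0.0, -1.0]), -1.0, 1.0)
    elevation = np.arccos(cos_e)                    # 0 for an arm hanging at rest
    return d, elevation

# Arm hanging straight down: zero elevation.
d_rest, e_rest = upper_arm_axis_and_elevation([0, 0, 0], [0, 0, -0.3])
# Arm raised to the horizontal: 90 degrees of elevation.
d_horiz, e_horiz = upper_arm_axis_and_elevation([0, 0, 0], [0.3, 0, 0])
```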
Since
The arm posture estimation method was implemented using the V-REP robotics simulator [
For the estimation of the coordinates of disk centers
The accuracy of the proposed method is determined by comparing its estimates of the upper arm poses with those of the simulated human patient (ground-truth values of
Armeo movement generation: we recorded four time-sequence datasets of the actual Armeo joint measurements (sampled at 66.6 Hz) while performing the following shoulder movements (Figure
Patient movement generation: the movements of the patient's upper limb that correspond to the recorded movements of the Armeo are computer-generated with the method in [
GH joint movements: (a) shoulder flexion-extension (SFE), (b) shoulder horizontal abduction-adduction (SAbAd), and (c) shoulder internal rotation (SIR).
In this way, four sets (one per movement dataset) of known upper arm poses are obtained by simulating patient movement and are compared against those estimated with our method. The accuracy of our method is assessed without compensating for any time offsets between the reference and estimated angles; the real-time accuracy of the method is thus assessed. Table
Movement dataset features.
Movement dataset  Amplitude (deg.)  Samples 

SAbAd  (6°, 31°, 10°)  1000 
SFE  (31°, 8°, 1°)  1000 
SIR  (3°, 3°, 34°)  1000 
COMB  (40°, 90°, 60°)  2000 
Error in the estimation of the markers position: the error in the position estimation of markers
Error in the estimation of the arm pose: the error in the arm position estimation for a GH joint movement dataset (
To quantify the error in the arm orientation estimation (
Compute the matrix of rotation error
Express
Compute
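The orientation-error steps above reduce to forming the rotation error matrix between the estimated and reference orientations and extracting an angular measure from it. A hedged sketch: the paper decomposes the error into components, while here a single geodesic angle is used as an illustrative scalar metric, together with the RMS aggregation used in the result tables:

```python
import numpy as np

def rotation_error_deg(R_est, R_ref):
    """Geodesic angle (degrees) of the rotation error matrix R_err = R_est * R_ref^T,
    recovered from its trace: cos(angle) = (trace(R_err) - 1) / 2."""
    R_err = R_est @ R_ref.T
    cos_a = (np.trace(R_err) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

def rms(errors):
    """Root mean square of a sequence of per-sample errors."""
    errors = np.asarray(errors, float)
    return np.sqrt(np.mean(errors ** 2))

# Example: a 10 deg rotation about z versus the identity yields a 10 deg error.
a = np.radians(10.0)
c, s = np.cos(a), np.sin(a)
R = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
err = rotation_error_deg(R, np.eye(3))
```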
A sensitivity analysis is carried out to study the influence of relevant parameters on the method accuracy. Formally, the sensitivity analysis determines the effect of the perturbation of the parameter
The upper arm pose accuracy (and, therefore, that of the GH joint angles) relies on the precise estimation of the position of the centers of the elbow and GH joints (
The conducted sensitivity analysis focuses on errors in
Inaccurate computation of
Relative displacement of the markers w.r.t. the GH and elbow joints due to skin movement.
In the sensitivity analysis, translation errors in matrices
For the sensitivity analysis (see (
Parameters of function
Parameter  Meaning  CS of reference 


Translation with magnitude 
GH joint 



Translation with magnitude 
GH joint 



Translation with magnitude 
GH joint 



Translation with magnitude 
Elbow joint 



Translation with magnitude 
Elbow joint 



Translation with magnitude 
Elbow joint 
The sensitivity analysis procedure (Figure
Load the movement dataset of the GH joint to test (SFE, SAbAd, SIR, and COMB).
Select the parameter
Apply the translation indicated by
Compute the estimation errors of the upper arm position and orientation
Compute the position and orientation components of
Increment
Sensitivity analysis steps.
Sensitivity analysis. Coordinate systems of reference for the translations of (a) marker
The complete sensitivity analysis was performed for each movement dataset (SFE, SAbAd, SIR, and COMB). The directions in which marker translations occur (Table
Parameters of the sensitivity analysis.
Minimum marker translation 
0 
Maximum iterations of the sensitivity analysis 
10 
Increment of marker translation in each iteration 
0.002 
Movement datasets evaluated  4 
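With the parameters listed above (10 iterations, 0.002 m increments starting from 0), the sensitivity sweep can be sketched as a loop over perturbation magnitudes. The `estimate_errors` callable is a placeholder for the full pipeline (replay the movement dataset, translate the marker, re-estimate the upper arm pose, and compare with ground truth); the names and the demo error model are illustrative assumptions:

```python
import numpy as np

def sensitivity_sweep(estimate_errors, direction, n_iters=10, step=0.002):
    """For each perturbation magnitude, translate the marker along `direction`
    and record the resulting pose-estimation error.

    estimate_errors: callable taking a 3D offset (meters) and returning an
    error measure; stands in for the simulation-and-estimation pipeline."""
    direction = np.asarray(direction, float)
    direction = direction / np.linalg.norm(direction)
    results = []
    for k in range(n_iters):
        magnitude = k * step                       # 0.0, 0.002, ..., 0.018 m
        offset = magnitude * direction
        results.append((magnitude, estimate_errors(offset)))
    return results

# Demo with a placeholder error model: error equal to the perturbation norm.
demo = sensitivity_sweep(lambda off: float(np.linalg.norm(off)), [1.0, 0.0, 0.0])
```

One sweep per parameter in the table above and per movement dataset (SFE, SAbAd, SIR, COMB) yields the error curves reported in the results section.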
This section presents and discusses the results of (a) estimation accuracy of the marker 3D position, (b) estimation accuracy of the upper arm pose, and (c) sensitivity analysis of the estimation accuracy of the upper arm pose w.r.t. translation errors in
Table
RMS of errors (and standard deviation in parentheses) in the position estimation of markers
Movement 



SAbAd  0.00089 (0.0001)  0.00175 (0.001) 
SFE  0.00060 (0.0002)  0.00197 (0.0008) 
SIR  0.00088 (0.0001)  0.00135 (0.0007) 
COMB  0.00097 (0.0003)  0.00324 (0.002) 
Figure
Box plots of estimation errors in marker positions and in upper arm position and orientation for all movement datasets.
The RMS values of the errors in the upper arm pose estimation are presented in Table
RMS (and standard deviation in parentheses) of errors in the upper arm position and orientation estimation in the assessed movement datasets.
Movement  Position [mts]  Orientation [deg.] 

SAbAd  0.00109 (0.0005)  0.92039 (0.4842) 
SFE  0.00094 (0.0004)  0.83796 (0.3763) 
SIR  0.00091 (0.0002)  0.73465 (0.4156) 
COMB  0.00145 (0.0008)  1.0638 (0.5238) 
In motor rehabilitation, angular errors in the range of 3–5 degrees are considered acceptable for mobility evaluation of patients [
The results of the sensitivity analysis per movement dataset of the shoulder are presented in Figures
Error in upper arm position estimation (
Error in upper arm orientation estimation (
Position component of
Orientation components of
Results of the sensitivity analysis with the SAbAd movement dataset (
Position component of
Orientation component of
Results of the sensitivity analysis with the SFE movement dataset (
Position component of
Orientation component of
Results of the sensitivity analysis with the SIR movement dataset (
Position component of
Orientation component of
Results of the sensitivity analysis with the COMB movement dataset (
Position component of
Orientation component of
Regarding the arm position estimation, one can observe that translations of marker
Observing the behavior of the position component of
In Figures
A side effect of the marker position perturbation is that the marker
In Figures
The results of the sensitivity analysis show that the assumption that transformations
Marker drifts must be mitigated by the marker attachments to the human body. Furthermore, marker attachments should be designed to minimize the effect of errors in
The results presented suggest that the method we implemented is a feasible alternative for estimating the GH joint angles in a RAR scenario.
The literature review provided no references other than [
Table
Contributions of this paper w.r.t. comparable works.
Work  Method  Method evaluation  Accuracy of GH joint angles 

[ 
IK-based swivel angle estimation  (
Mean RMSE: 4.8 deg. (best-case scenario)


This paper  Hybrid exoskeleton-optical MOCAP  (
(a) Mean RMSE: 0.9 deg. (assuming no marker drift or calibration errors) 
In the context of RAR, this paper presents the formulation, implementation, and in-silico assessment of a novel, accurate method to estimate the patient's GH joint angles during therapy. Our method does not require redundant markers or cameras, and it relies on simple geometric relationships and on tools from standard robotics and computer vision libraries. These characteristics make it economical and readily applicable in RAR.
The accuracy and robustness of our method are evaluated using computer-generated human movement data corresponding to actual movement datasets of the Armeo Spring. We present a formal sensitivity analysis of the pose estimation accuracy w.r.t. marker position estimation errors produced by (a) system calibration errors and (b) marker drifts (due to skin artifacts). This analysis indicates that, even in the presence of large marker position errors, our method achieves an accuracy that is acceptable for patient mobility appraisal.
Future work includes (a) implementation of the method using commercially available RGBD vision sensors, (b) evaluation of the method accuracy with actual human movement data, (c) adaptation of the method using solely RGB cameras, and (d) extension of our method to address other limbs.
A human patient upper body with a kinematic model
The model is a simplified version of the spine, arm, and scapuloclavicular systems. However, since we focus on the study of the upper limb, only the kinematic model of that limb is described in detail.
The set of links is
The set of joints is
An exoskeleton with a kinematic model
The set of links is
The set of joints is
The
Also
The exoskeleton may be configured to impose specific motion constraints on the patient by blocking specific joints of the
A marker-based optical tracking system
A set
All
The set of colors of the disks mounted on each
The rigid transformation matrices
A set
The rigid transformation matrices
Remarks on each camera
The system of cameras
Find the values of
Region of the scapula bone above the GH joint
Bone of the shoulder girdle located at the root of the neck
Coordinate system(s)
Combination of movements of the GH joint (SAbAd, SFE, and SIR)
Degree(s) of freedom
Glenohumeral
Upper arm bone
Motion capture system(s)
Meters
Robot-Assisted Rehabilitation
Root mean square
Bone that connects the humerus to the clavicle
Shoulder horizontal abduction-adduction
Shoulder flexion-extension
Shoulder internal rotation
Virtual Reality
Virtual Robot Experimentation Platform
With respect to
Exoskeleton kinematic model
Human upper body kinematic model
Set of planar markers mounted on the patient
Position of the GH joint w.r.t. the
Position of the elbow joint w.r.t. the
Set of vision sensors that compose the optical MOCAP
3tuple of joint angles of the GH joint at instant
Tuple of joint angles of the exoskeleton kinematic model at instant
Transformation matrix of marker
Transformation matrix of marker
Transformation matrix of the GH joint w.r.t. the
Transformation matrix of the elbow joint w.r.t. the
The authors declare that there are no competing interests regarding the publication of this paper.
This research is part of the HYPER Project funded by CONSOLIDER-INGENIO 2010, Spanish Ministry for Science and Innovation.