An Application for Skin Macules Characterization Based on a 3-Stage Image-Processing Algorithm for Patients with Diabetes

Diabetic skin manifestations, previous to ulcers and wounds, are rarely accounted for as part of diagnosis even though they represent the first symptom of vascular damage and are present in up to 70% of patients with diabetes mellitus type II. Here, an application for skin macule characterization based on a three-stage segmentation and characterization algorithm, used to classify vascular, petechiae, trophic change, and trauma macules from digital photographs of the lower limbs, is presented. First, in order to find the skin region, a logical multiplication is performed on two skin masks obtained from color space transformations; dynamic thresholds are established to self-adjust to a variety of skin tones. Then, in order to locate the lesion region, illumination enhancement is performed using a chromatic model color space, followed by a principal component analysis gray-scale transformation. Finally, characteristics of each type of macule are considered and classified; morphologic properties (area, axes, perimeter, and solidity), intensity properties, and a set of shade indices (red, green, blue, and brown) are proposed as measures that obviate skin color differences among subjects. The values calculated show statistically significant differences between macules, which agree with the physician's diagnosis. The macule properties are then fed to an artificial neural network classifier, which achieved 97.5% accuracy in differentiating between them. Characterization is useful for tracking macule changes and development over time, provides meaningful information to support early treatment, and offers support in the prevention of amputations due to diabetic foot. A graphical user interface was designed to show the properties of the macules; this application could be the basis of a future Diagnosis Assistance Tool for educational (i.e., untrained physicians) and preventive assistance technology purposes.


Introduction
Diabetes is a rapidly growing chronic disease with a 20% prevalence, catalogued as a noncommunicable disease [1]. Diabetes mellitus type II is characterized by insulin resistance; insulin is a hormone that helps deliver glucose to cells, e.g., to muscle cells, where it is metabolized for energy. The 2016 National Health and Nutrition Survey (ENSANUT 2016) [4] reported that 9.4% of Mexican adults (i.e., 6.5 million) have been diagnosed with diabetes. However, in 2017, the International Diabetes Federation (IDF) [5] reported that 12 million Mexican adults live with diabetes, but 37.5% are not aware that they have the disease.
Comorbidities such as obesity, hypertension, and dyslipidemia, among others, are precipitating factors for developing diabetes [6]. Moreover, when these comorbidities are present along with diabetes, a rapid deterioration of body functions can arise and persist; diabetic retinopathy can cause blindness, and diabetic foot [4] can lead to amputations and disability.
Diabetes is associated, in the long term, with degenerative processes that affect the cardiovascular and nervous systems, as well as the eyes and skin [7]. From 30 to 70% of patients with diabetes develop skin problems [7,8]. Neuropathy, microangiopathy, and macroangiopathy are the main predisposing factors for diabetic foot. Their evolution leads to blood flow reduction and ischemia, structural and functional damage, and an overloaded extremity due to the lack of sensitivity; all these put the foot at risk. Moreover, even a simple trauma or an infection could lead to ulcers, lesions, and even necrosis [9].
Although microangiopathy and macroangiopathy are major contributors to complications such as skin lesions or diabetic foot, metabolic disruptions also have a significant direct effect, especially as alterations of the skin [7]. Some of these manifestations are called macules [10], defined as flat, distinct, discolored areas of skin. Other manifestations may include lack of body hair, yellowish coloration, callus formation, onychomycosis, foot and toe deformation, and others [7,8]. Even though macules occur commonly, they are not taken into account as a diagnostic element [11,12], nor are they registered as information that could lead to an early diabetic foot diagnosis [12,13].
Relevantly, microangiopathy and macroangiopathy are also the cause of most skin manifestations found in patients with diabetes mellitus who have not been diagnosed with diabetic foot [14].
In the case of diabetes mellitus, skin manifestations have not been considered an important aspect of the disease [15]. In fact, there is a high prevalence of skin disorders among these patients; various authors report that these disorders are present in ∼70% of their patients [14].
Kiziltan et al. [14] state that diabetic dermopathy is more common in patients with neuropathy or large-vessel disease; they also report it as frequently present in patients with signs and symptoms of polyneuropathy. Pavicic and Korting [16] report that peripheral arterial obstructive disease (PAOD) is up to 6 times more frequent in patients with diabetes and that PAOD, neuropathy, and macroangiopathy are key pathophysiologic factors in its development.
Several related studies report that 73% [15] to 80% [17] of the sampled patients present skin lesions or changes, with diabetic dermopathy consistently ranked as the most common skin manifestation in patients with diabetes. Pavicic and Korting [16] also state that increasing duration of the disease raises the likelihood of skin involvement; 45% of patients who had suffered diabetes for more than 20 years developed peripheral vascular disease, and 75-82.1% presented xerosis, which can cause skin tears [16].
Any change in skin pigmentation is called a macule. Macules can be erythematosus (caused by blood vessel dilation or the formation of new vessels), pigmentosae (which can be hyperpigmented, hypopigmented, or achromatic), or artificial, among others. Vascular macules occur as a secondary reaction, e.g., to medication, due to peripheral venous insufficiency or trauma [18].
A vascular macule originates from a micro- or macrovascular problem, in which the vessels underneath the skin are affected. These macules are rounded and reddish to brown; they typically have a diameter of 1 cm but can be smaller. Petechiae are very small (the size of a pinhead), reddish, rounded spots that appear on the shins, usually as a secondary effect of treatment with acetylsalicylic acid. Macules due to trophic changes are present when the patient has chronic venous insufficiency.
They are darker patches of skin, have a larger area than other macules, and appear mainly on the ankles and shins. Macules due to trauma are the evidence (other than a scar or scab) of a traumatic event, such as a blow to the shin. They are brownish, and their shape varies according to the trauma. This type of macule lingers on the skin of a patient with diabetes for a longer period of time than it would on a healthy person.
All these skin manifestations are present before a diabetic foot diagnosis; patients can present them all at the same time, and they are generally overlaid. These macules appear in different parts of the leg and have large areas with undefined borders. Their localization and subsequent segmentation represent a challenging task, but the results can eventually be used as a tool for macule characterization, foot health prognosis, and even amputation risk assessment.
Regarding image-processing algorithms, these types of macules are not evaluated or processed until they become lesions or ulcers [19]. Computer-aided diagnosis has been used for skin lesions in dermatology and dermoscopy (e.g., carcinomas and melanomas) [20][21][22] by means of support vector machines [21], support vector classification [20], or seeded region growing [22], but not in the prevention of diabetic foot development. Generally, tools for assessing skin problems due to diabetes mellitus type 2 are focused on advanced lesions and use questionnaires [18] that evaluate lesions such as ingrown toenails, ulcers, calluses, or fissures, which appear after the diabetic foot diagnosis.
In this paper, we present the design of a graphical user interface (GUI) developed in Matlab® as an application for the characterization of skin macules. The GUI is based on a segmentation algorithm that applies image-processing techniques in order to find the region of interest (ROI) and characterize the macules present in images of the leg and foot of patients with diabetes mellitus type 2. We also present a statistical study of the calculated properties and a classifier for the 4 types of macules.

Materials and Methods
The first step was to acquire color digital photographs of skin macules ("skin images") from the lower limb. For this purpose, a device called the Wireless Image Acquisition System (WIAS) [23] was used. The device included a digital wireless camera (Sony DSC-QX100, 18 MP), which provided an RGB image (Figure 1). Zoom and flash were never used, in order to avoid changes in resolution or capturing bright areas, respectively. Changes in the area, shape, and coloration of macules were documented in the skin images.
The macules studied in this work were vascular macules, petechiae, macules due to trophic changes, and macules due to trauma. The study was performed at the Cardiac Rehabilitation Service of a National Institute in Mexico City. Skin images were processed using the Image Processing Toolbox of Matlab®. They were taken from 19 Mexican patients diagnosed with diabetes mellitus type II, but not yet with diabetic foot, who gave their signed informed consent.
Segmentation and characterization were performed through a proposed 3-stage image-processing algorithm, as described below:

Stage 1 (Skin Region).
The aim of using the WIAS device was to acquire repeatable digital photographs of areas of interest from the lower limb; these images were called skin images. The color skin image contained elements that were not of interest, e.g., the robe, the bed clothing, and other background components, so the first objective was to segment the legs of the patient from it.
A color image can be transformed to different color spaces [24] (i.e., domain transformation) in order to enhance the characteristics of interest, i.e., the differences between skin and nonskin and the similarity among different skin tones. Seen as a matrix, the skin image has a size determined by the resolution of the camera. The image has 3 levels of depth; each level corresponds to one RGB color matrix, and each cell in these matrices corresponds to a pixel whose value is an 8-bit intensity level.
In Stage 1, the first step was to transform the image from the RGB to the HSV color space. RGB describes an image by the amount of red, green, and blue in it; HSV describes it in terms of hue, saturation, and value. The algorithm [25] is described by equation (1), with the RGB values normalized to the range [0, 1] (e.g., R = R/255):

V = max(R, G, B),
S = (V − min(R, G, B))/V, with S = 0 when V = 0,
H = 60° × (G − B)/Δ (mod 360°) if V = R; 60° × (B − R)/Δ + 120° if V = G; 60° × (R − G)/Δ + 240° if V = B, with Δ = V − min(R, G, B), (1)

where V is the brightness value, S is the saturation, and H is the hue matrix. The Hue matrix was selected because this property allowed the differentiation between ROIs and background; a fixed threshold was set, and an intensity range was determined to find a tone set of values (equation (2)). The pixels falling within this range became the first skin mask (Skin Mask 1).

With Skin Mask 1, it was not possible to identify a wide range of skin tones, so it was necessary to make the algorithm more robust. Therefore, a second color space transformation was applied, using the standard YCbCr transformation [26]:

Y = 16 + 65.481 R + 128.553 G + 24.966 B,
Cb = 128 − 37.797 R − 74.203 G + 112.000 B,
Cr = 128 + 112.000 R − 93.786 G − 18.214 B. (3)

Then, a dynamic range was used. The histograms of the Cb and Cr matrices were calculated, and these values were used to set dynamic limits in order to process skin colors and tones over a wide range. This means that, depending on the histogram values, the algorithm adjusted its threshold, tuning itself to the skin tone of the patient. The values found within the dynamic range outlined the second skin mask (equation (4)):

Skin Mask 2 = (Cb_min ≤ Cb ≤ Cb_max) AND (Cr_min ≤ Cr ≤ Cr_max), (4)

where the limit values of Cb and Cr changed for every skin tone found. Then, in order to link the data from both color spaces, the HSV and YCbCr masks were combined with a logical AND operation; this allowed the resulting mask to work over a wide range of skin tones. This yielded a more robust algorithm for this stage and a more precise skin region.
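As a minimal sketch of this stage (not the published Matlab implementation), the two masks and their logical AND can be written with NumPy. The fixed hue band and the percentile-based Cb/Cr limits below are illustrative stand-ins for the paper's thresholds, which are not reproduced here:

```python
import numpy as np

def skin_mask(rgb):
    """Combine an HSV hue threshold with a dynamic YCbCr range (illustrative values)."""
    rgb = rgb.astype(np.float64) / 255.0          # normalize RGB to [0, 1]
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

    # --- Skin Mask 1: standard RGB -> hue conversion, fixed hue band ---
    v = rgb.max(axis=-1)
    c = v - rgb.min(axis=-1)                      # chroma
    h = np.zeros_like(v)
    m = c > 0
    rmax = m & (v == r)
    gmax = m & (v == g) & ~rmax
    bmax = m & (v == b) & ~rmax & ~gmax
    h[rmax] = ((g - b)[rmax] / c[rmax]) % 6
    h[gmax] = (b - r)[gmax] / c[gmax] + 2
    h[bmax] = (r - g)[bmax] / c[bmax] + 4
    h = h / 6.0                                   # hue in [0, 1)
    mask_hsv = (h < 0.10) | (h > 0.90)            # hypothetical skin-tone band around red

    # --- Skin Mask 2: YCbCr (ITU-R BT.601) with data-driven limits ---
    cb = 128 - 37.797 * r - 74.203 * g + 112.0 * b
    cr = 128 + 112.0 * r - 93.786 * g - 18.214 * b
    # dynamic range: percentiles stand in for the paper's histogram-derived limits
    cb_lo, cb_hi = np.percentile(cb, [5, 95])
    cr_lo, cr_hi = np.percentile(cr, [5, 95])
    mask_ycbcr = (cb >= cb_lo) & (cb <= cb_hi) & (cr >= cr_lo) & (cr <= cr_hi)

    return mask_hsv & mask_ycbcr                  # logical AND of the two skin masks
```

The AND keeps only pixels that both color spaces agree look like skin, which is what makes the combined mask tolerant of tones that either space alone would miss.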

Stage 2 (Lesion Region).
Once the skin region was segmented, skin lesions had to be identified. From the raw image in the RGB color space, pixel values had to be amplified so that they became darker or lighter as they corresponded to healthy or damaged regions. For this purpose, the process described below was followed. The CIE 1976 L*a*b* color space was used to handle luminosity [27], in order to saturate the intensity values. This transformation was derived from the standard equations [28], with X_n, Y_n, and Z_n the tristimulus values of the reference white:

L* = 116 f(Y/Y_n) − 16,
a* = 500 [f(X/X_n) − f(Y/Y_n)],
b* = 200 [f(Y/Y_n) − f(Z/Z_n)], (5)

where f(t) = t^(1/3) for t > (6/29)^3 and f(t) = t/(3(6/29)^2) + 4/29 otherwise.

For this stage, the L matrix, corresponding to luminosity from black to white, was the one selected [29]. The resulting saturated image was then re-enhanced by converting it to grayscale using principal component analysis (PCA) [30]. The lesion region was calculated using the histogram of the PCA grayscale image, where a threshold was set to find the damaged areas. This threshold also shifted depending on the tones detected from the healthy and the lesioned skin, but it took approximately 10% of the values found in the image (Figure 2).
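A compact sketch of the PCA grayscale conversion and the shifting ~10% threshold, in NumPy rather than the Image Processing Toolbox; the 10% fraction follows the text, while the darkest-values convention is an assumption of this sketch:

```python
import numpy as np

def pca_grayscale(rgb):
    """Project pixels onto the first principal component of their color distribution."""
    x = rgb.reshape(-1, 3).astype(np.float64)
    x -= x.mean(axis=0)                       # center each channel
    cov = np.cov(x, rowvar=False)             # 3x3 channel covariance
    vals, vecs = np.linalg.eigh(cov)
    pc1 = vecs[:, np.argmax(vals)]            # direction of greatest color variance
    gray = x @ pc1                            # 1-D projection = grayscale
    rng = gray.max() - gray.min()
    gray = (gray - gray.min()) / (rng + 1e-12)  # rescale to [0, 1]
    return gray.reshape(rgb.shape[:2])

def lesion_mask(gray, frac=0.10):
    """Keep roughly the darkest 10% of PCA grayscale values (the shifting threshold)."""
    thr = np.quantile(gray, frac)
    return gray <= thr
```

Unlike a fixed luminance formula, the projection direction is recomputed per image, so the threshold adapts to each patient's skin tone, mirroring the self-tuning behavior described above.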

Stage 3 (Characterization).
Characterization of the damage in the lower limbs of the patients was performed in 2 steps. (1) Data values of features extracted at the segmented lesion region were classified into 2 types: morphologic properties (area, major axis, minor axis, perimeter, and solidity) and intensity properties (maximum intensity and minimum intensity). (2) The Shade Index (ShI) was a parameter used to measure color variations from the raw RGB image. An equation was designed for each color: equation (6) for the Shade Index Red (ShI_R), equation (7) for the Shade Index Green (ShI_G), and equation (8) for the Shade Index Blue (ShI_B):

ShI_R = mean(M_red)/mean(HS_red), (6)
ShI_G = mean(M_green)/mean(HS_green), (7)
ShI_B = mean(M_blue)/mean(HS_blue), (8)

where M_red, M_green, and M_blue are the red, green, and blue components of the area inside the segmented macule, and HS_red, HS_green, and HS_blue are the corresponding components of an area of healthy skin around the macule, all in RGB. Finally, a Shade Index Brown (ShI_BR) (equation (9)) was used to identify brownish changes in the skin:

ShI_BR = (mean(M_red) + mean(M_blue))/(mean(HS_red) + mean(HS_blue)). (9)

Figure 3 shows the flow diagram of the 3-stage algorithm for skin and lesion region segmentation, in addition to the characterization stage.
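The Shade Indices reduce to channel-mean ratios between the macule and its surrounding healthy skin, so they can be sketched directly in NumPy (the array names are hypothetical):

```python
import numpy as np

def shade_indices(macule_rgb, healthy_rgb):
    """Shade Indices: mean macule channel divided by mean healthy-skin channel."""
    m = macule_rgb.reshape(-1, 3).astype(np.float64).mean(axis=0)    # [R, G, B] means
    hs = healthy_rgb.reshape(-1, 3).astype(np.float64).mean(axis=0)
    return {
        "ShI_R": m[0] / hs[0],
        "ShI_G": m[1] / hs[1],
        "ShI_B": m[2] / hs[2],
        "ShI_BR": (m[0] + m[2]) / (hs[0] + hs[2]),   # brown = red + blue ratio
    }
```

Because each index is a ratio against the patient's own healthy skin, it is largely insensitive to the absolute skin tone and to global illumination changes, which is exactly why the indices transfer across subjects.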
It was necessary to find out whether the differences among the values calculated for the features extracted by the algorithm were statistically significant for each type of macule. In order to validate this, Student's t-test was performed using SPSS v17 with a confidence interval of 95% (p < 0.05).
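The statistics were computed in SPSS; purely for illustration, the two-sample Student's t statistic behind each pairwise comparison can be computed in plain Python (the p value would then be read from a t table with nx + ny − 2 degrees of freedom):

```python
import math

def student_t(x, y):
    """Pooled-variance two-sample Student's t statistic and degrees of freedom."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)   # sample variance of x
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)   # sample variance of y
    sp2 = ((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2)   # pooled variance
    t = (mx - my) / math.sqrt(sp2 * (1 / nx + 1 / ny))
    return t, nx + ny - 2
```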
Also, a classifier was designed to identify each macule, built as an artificial neural network fed with the feature vectors that characterize each of them. 60% of the data was used to train the network and 40% to test it.

Results and Discussion
The 3-stage image-processing algorithm reported in this paper comprises the segmentation of the skin and its lesions, as well as the extraction of features, including the shade indices.
Using the skin images acquired with the WIAS, the specialist classified the macules found in the patients as vascular, petechiae, due to trophic changes, or due to trauma. The results of image processing for the segmentation of the skin region are shown in Results for Stage 1. Segmentation of the lesion region is displayed in Results for Stage 2, and the features for macule characterization are obtained and analyzed in Results for Stage 3.

Results for Stage 1 (Skin Region).
An example of the histogram obtained after the YCbCr color space transformation, used to find the dynamic range that self-adjusted to a wide variety of skin tones, is shown in Figure 4. The mean value for each matrix fell in the valley of the histograms; the first section of Cb and the second section of Cr were selected in order to find the values that outlined the second skin mask.
The skin region was obtained from the skin image, as shown in Figure 5. The background was eliminated in order to avoid segmentation errors due to, e.g., the logo on the bed sheets or any object in the background.

Results for Stage 2 (Lesion Region).
Figure 6 shows examples of different lesion regions (which include vascular, petechiae, trophic change, and trauma macules) found in 4 patients. These images were the result of applying the proposed algorithm. From these examples, it was noticeable that some areas could be overlooked in the RGB images; after the enhancement with the CIE 1976 L*a*b* color space transformation, the selection of the luminosity matrix, and the PCA gray-scale transformation, these hidden macules fell within the spectrum of the dynamic range selected from the histogram. From this stage, a general state of health of the extremity was calculated and displayed as a percentage of damage (29% for patient no. 1, 24% for patient no. 2, 31% for patient no. 3, and 21% for patient no. 4).

Results for Stage 3 (Characterization of Features).
In order to characterize the macules (vascular, petechiae, due to trophic changes, or due to trauma), feature extraction of morphologic properties, intensity properties, and Shade Indices was performed on 82 macules obtained from the lesion regions found. Table 1 shows the values obtained.
By means of statistical analysis, significant differences (p < 0.05) were found among the macules studied; these p values are shown in Table 2. According to it, petechiae and vascular macules can be differentiated through morphologic properties and Shade Indices (except ShI_B). Differences between petechiae and macules due to trophic changes can be found by comparing their morphologic properties; morphologic properties and the Shade Index ShI_R were significantly different for petechiae and trauma macules. Vascular macules and those resulting from trophic changes can only be differentiated through their morphologic properties, while trauma macules can be differentiated by comparing all properties except solidity and minimum intensity. Macules due to trophic changes and trauma can be differentiated using the Shade Indices ShI_G, ShI_B, and ShI_BR, plus 4 other properties. The concatenation of the properties calculated for each macule forms the feature vector for that example. Figures 7(a) and 7(b) show the average value of each property per macule type, i.e., the average feature vector.
So, in order to identify each macule, the proposed architecture is a feedforward backpropagation network with 2 hidden layers and 4 neurons per layer; the transfer functions are hyperbolic tangent sigmoid and log-sigmoid. The training function updates weight and bias values according to the Levenberg-Marquardt optimization [31]. To train the classifier, an 11 × 40 matrix was built, in which each type of macule contributed 10 examples; 60% of the data was used for training, and the remaining 40% was used to test the network. The results were displayed through a confusion matrix (Figure 8(a)), showing the coincidence between each feature vector and the target class. The correct identification rate was 97.5%. A linear regression of the data (Figure 8(b)) shows the relation between the target data and the network outputs, where R = 0.95054 indicates that the model identified ∼95% of the segmented lesions.
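The network was built and trained (with Levenberg-Marquardt) in Matlab; as an architectural sketch only, a forward pass through the described topology — 11 input features, two 4-neuron tanh hidden layers, and a log-sigmoid output unit per macule class — looks like this in NumPy. The weights below are random placeholders, not trained values:

```python
import numpy as np

def logsig(z):
    """Log-sigmoid transfer function (Matlab's logsig)."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, params):
    """Forward pass: 2 hidden layers of 4 tansig neurons, logsig output layer."""
    (W1, b1), (W2, b2), (W3, b3) = params
    a1 = np.tanh(W1 @ x + b1)          # hidden layer 1 (hyperbolic tangent sigmoid)
    a2 = np.tanh(W2 @ a1 + b2)         # hidden layer 2
    return logsig(W3 @ a2 + b3)        # one output per macule class

def init_params(n_features=11, n_hidden=4, n_classes=4, seed=0):
    """Random placeholder weights matching the paper's layer sizes."""
    rng = np.random.default_rng(seed)
    shapes = [(n_hidden, n_features), (n_hidden, n_hidden), (n_classes, n_hidden)]
    return [(rng.standard_normal(s) * 0.1, np.zeros(s[0])) for s in shapes]
```

At classification time, the feature vector of a segmented macule is fed in and the output unit with the highest activation gives the predicted macule type.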

Journal of Healthcare Engineering
The graphical user interface, the SMaC Software©, showed the results of the 3-stage image-processing algorithm for segmentation of the skin macules, feature extraction, and characterization of the 4 types of macules (vascular, petechiae, trophic changes, and trauma), as seen in Figure 9.
Macule images need special processing algorithms because they are very peculiar and present different features depending on the patient. There are no algorithms reported to address this particular problem. Moreover, because of the wide range of human skin tones, the major challenge to overcome was to find the macules in spite of the changes in illumination among the skin images.
Color space transformations became a useful tool to find different views of the image, enhancing characteristics that were convenient for solving the segmentation problem. In the HSV color space transformation, the Hue values selected showed a good performance with medium skin colors, but it depended on the light in the room. To address this situation, a second color space transformation (YCbCr) was applied. This color space extracted the red and blue components from the image; since macule color varies from red to brownish, these components became very helpful for macule location and segmentation.
In this case, a fixed range for the Cb and Cr values was not useful, even though it is the method of choice in the literature [26,28], because it limits the variety of human skin tones detected to a small selection. In contrast, the dynamic range proposed in this paper allowed the algorithm to adjust to a wide range of skin tones, which increases its usefulness meaningfully. The minimum and maximum values taken from the Cb and Cr matrices represented the illumination range of the image. Histogram values allowed the algorithm to self-tune to the specific image, and hence to the specific skin tone and illumination, maintaining the simplicity and efficiency of the algorithm without the computational cost of neural networks. The position of the camera can be adjusted using the WIAS device to avoid areas with too much brightness or intense illumination.
A conventional grayscale transformation was not useful for skin segmentation because it equalizes the distribution of gray levels, which is counterproductive in this scenario. In contrast, PCA generates a gray-level image within the limit values of the histogram of each specific image; it suppresses the healthy tissue in the image and keeps the sections with clear manifestations of saturation, which are then classified and selected as lesioned skin.
This technique takes advantage of the illumination enhancement achieved by the CIE 1976 L*a*b* transformation used in Stage 2. Again, setting thresholds and ranges from histogram values allows the algorithm to adjust to the particularities of the lesion region found, without the need for more complex processing to classify healthy versus lesioned skin. This algorithm is thus a simple and efficient solution for processing macules, with multiple potential applications.
Ideally, in an image identification process following segmentation, it is important to have a feature or property that distinguishes between the classes of a group of data. This can be complicated depending on the characteristics of what is being identified in the image, so quantitative parameters are preferred to guarantee a more robust result. Therefore, the macules measured were characterized using morphologic properties, to define shape and geometry; intensity properties, to establish maximum and minimum pixel values within the lesion and to separate them from the healthy skin; and the proposed Shade Indices, to identify lesions by color tone. During the development of the characterization (Stage 3), it became evident that, e.g., the color red (in the RGB image) did not look the same on every skin tone or even under different illumination, so the reference value could not be fixed. The solution was to devise a novel set of Shade Indices in which the healthy skin around the macule is used as the reference for color tone shifts. Notably, ShI_R and ShI_BR turned out to be the indices that helped differentiate most between the macules studied.
Morphologic properties are features of the macule that made it possible to point out geometric and shape variations among most of them. Intensity properties did not seem to contribute considerably to the classification of the data, since their p values were significant in less than 40% of the comparisons studied.
In general, this analysis made clear how complex the problem was, since different kinds of macules were present at the same time and, moreover, were overlaid. From the image segmentation and processing point of view, it was highly difficult to isolate the lesion region in order to provide an accurate assessment. Nevertheless, with the macule properties chosen and calculated, it was possible to classify each type of macule with 97.5% accuracy.
With the characterization and later classification provided by the SMaC Software©, the macules of patients with diabetes can be measured and tracked along the development of the disease in order to prevent further disabilities and comorbidities. The use of this software can be especially beneficial for physicians who do not have specialized training or enough expertise to identify specific macules; it can also be used as an educational tool. The perception of skin manifestations that appear before ulcers or amputations must change, since they appear to be the first symptom of endothelial decay and vascular damage, which lead to worse symptoms of diabetic foot and, eventually, to amputation. From the clinical perspective, the origin of skin and limb damage is multifactorial, but it relates mainly to endothelial decay.
In the future, we aim to turn this GUI Software into a Diagnosis Assistance Tool, which would include clinical variables and other diabetic foot manifestations in order to gather enough data to eventually form a database of patients with diabetes, for preventive purposes.

Conclusion
Nowadays, lower limb skin manifestations are not taken into account in the general evaluation of the state and development of diabetes mellitus type II, even though they have an underlying vascular origin. This paper presents an algorithm for the segmentation, characterization, and classification of skin manifestations from photographic images of the lower limbs of diabetic patients. An efficient image-processing algorithm for the characterization of skin macules, based on extracting morphologic and intensity properties, is proposed, along with a new set of Shade Indices used to assess color shifts across different skin tones. Of the three sets of features, morphologic properties and Shade Indices proved statistically significant for differentiating among macules of various origins. The indices described here are a new way to assess changes in color for different skin tones, which increases the usefulness of the application.
The properties extracted are used as feature vectors for the input of a classification network, which reached 97.5% accuracy for the 4 types of macules studied in this paper: vascular, petechiae, trophic changes, and trauma. The SMaC Software© was designed to bring the proposed algorithm to the physician as a tool to aid in the identification and assessment of skin lesions in the lower limbs of patients with diabetes.

Data Availability
Photographic images of macules on the lower limbs of patients with diabetes used to support the findings of this study are restricted by our Institutional Ethics and Research Review Board in order to protect patients' privacy. The data may be released upon petition to the Research Review Board, which establishes the criteria for accessing confidential data.

Conflicts of Interest
There are no conflicts of interest to declare.