Nonlinear Radon Transform Using Zernike Moment for Shape Analysis

We extend the linear Radon transform to a nonlinear space and propose a method that applies the nonlinear Radon transform to Zernike moments to extract shape descriptors. These descriptors are obtained by computing Zernike moments on the radial and angular coordinates of the pattern image's nonlinear Radon matrix. Theoretical and experimental results validate the effectiveness and robustness of the method. The experimental results show that the performance of the proposed method in the nonlinear case equals or exceeds that in the linear Radon case.


Introduction
Shape analysis methods have been broadly applied to biomedical signal processing, object recognition, image retrieval, target tracking, and so forth [1]. Moment methods [2,3] can serve as shape descriptors because they characterize different shapes well. The most important properties of shape descriptors obtained from different moments are invariance (to translation, rotation, scaling, and stretching), stability to noise, and completeness [4].
In the past twenty years, much attention has been paid to the completeness property of invariant descriptor sets in pattern recognition and similar application fields. Such methods can be obtained by the following process. First, a Fourier or Radon transform is employed to map the image into another space. Second, different strategies can be conceived to construct invariant descriptors based on the information in the new space. Sim et al. [5] gave a new method for texture image retrieval: they transformed the images into the Fourier domain and calculated modified Zernike moments to extract texture descriptors.
Experiments showed that this descriptor achieves higher accuracy than Gabor-, Radon-, and wavelet-based methods and requires little computational effort; however, it is not invariant to scale. Wang et al. [6] and Xiao et al. [7] combined the Radon transform with the Fourier-Mellin transform to achieve RST (rotation, scaling, and translation) invariance and combined RS and blur invariance, respectively. Building on Xiao's idea, Zhu et al. [8] constructed RST invariants using Radon transforms and complex moments for digital watermarking. Similarly, Zernike moments can be combined with the Radon transform. Rouze et al. [9] described an approach that calculates the Zernike moments of an image from its Radon transform using a polynomial transform in the position coordinate and a Fourier transform in the angular coordinate; however, the resulting descriptors are only invariant to rotation. Meanwhile, in order to improve image retrieval precision and noise robustness, Hoang and Tabbone [10] proposed a method similar to Xiao's descriptor to obtain RST invariance based on the Radon, Fourier, and Mellin transforms.
The Radon transform is widely applied in many methods mainly because of its favorable properties in projection space [11-15]. In projection space, a rotation of the original image results in a translation in the angle variable, and a scaling of the original image leads to a scaling in the spatial variable together with an amplitude scaling [16,17]. Based on these properties, a rotation- and scaling-invariant function is easy to construct and highly robust to noise.
Enlightened by these research works, we extend the Radon transform to a nonlinear Radon transform and propose a new set of complete invariant descriptors by applying Zernike moments to the radial coordinate of the pattern's nonlinear Radon space [18-22].
The remainder of this paper is organized as follows. In Section 2, we briefly review the definitions of the nonlinear Radon transform and Zernike moments and propose a new method based on them. In Section 3, comparative experiments of the proposed approach against the Hu moment invariants and Chong's method are conducted in terms of image retrieval efficiency and robustness to different types of noise. Section 4 concludes the paper.

Nonlinear Radon Transform and Zernike Moments
The nonlinear Radon transform of an image f(x, y) is defined as

R(r, θ) = ∬_D f(x, y) δ(ρ(φ(x, y), θ) − r^{k₁}) dx dy,

where f(x, y) ∈ L²(D), k₁ is a real constant, θ denotes the angle vector formed by the function φ(x, y), and ρ(φ(x, y), θ) is the rotation of φ(x, y) by the angle θ, defined by ρ(φ(x, y), θ) − r^{k₁} = 0. The nonlinear Radon transform is thus a curve integral of the image function f(x, y) along different curves. The parameter k₁ controls the shape of the curve, and different curves are obtained from the values of k₁ and the function φ(x, y). In particular, when φ(x, y) is the linear function of x and y and k₁ = 1, ρ(φ(x, y), θ) = x cos θ + y sin θ, which shows that the linear Radon transform is a special case of the nonlinear Radon transform. The Radon transforms of the different curves are shown in Table 1.
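As a concrete illustration, the curve integral can be discretized by binning pixel intensities according to the value of the curve parameter. The sketch below is a minimal implementation under assumed curve parametrizations: the `parabola` and `ellipse` forms here are illustrative choices, not necessarily the exact φ(x, y) and k₁ of Table 1.

```python
import numpy as np

def nonlinear_radon(image, thetas, n_r=64, curve="linear"):
    """Discrete generalized Radon transform: for each angle, rotate the
    coordinate frame and accumulate pixel intensities into bins of the
    curve parameter rho.  (Sketch; the curve families are assumptions,
    not the paper's exact parametrization.)"""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # normalize pixel coordinates to [-1, 1]
    x = 2 * xs / (w - 1) - 1
    y = 2 * ys / (h - 1) - 1
    out = np.zeros((n_r, len(thetas)))
    for j, t in enumerate(thetas):
        # rotate the coordinate frame by theta
        u = x * np.cos(t) + y * np.sin(t)
        v = -x * np.sin(t) + y * np.cos(t)
        if curve == "linear":        # straight lines u = rho
            rho = u
        elif curve == "parabola":    # parabolic level sets u**2 - v = rho
            rho = u**2 - v
        elif curve == "ellipse":     # elliptic level sets u**2 + 2*v**2 = rho
            rho = u**2 + 2 * v**2
        else:
            raise ValueError(curve)
        # accumulate intensities along each level set rho = const
        bins = np.clip(((rho + 2) / 4 * n_r).astype(int), 0, n_r - 1)
        np.add.at(out[:, j], bins.ravel(), image.ravel())
    return out
```

Because every pixel is assigned to exactly one bin per angle, each column of the output conserves the total image mass, mirroring the integral formulation.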
The nonlinear Radon transform has some properties that are beneficial for pattern recognition as outlined below.
(1) Periodicity: the nonlinear Radon transform R(r, θ) of f(x, y) is periodic in the variable θ with period 2π when φ(x, y) is an arbitrary parametric function.

(2) Resistance: if f₁(x, y) and f₂(x, y) are two images with little difference, then for an arbitrary parametric function φ(x, y) their nonlinear Radon transforms R₁(r, θ) and R₂(r, θ) likewise differ only slightly.

(3) Translation: a translation of f(x, y) by a vector u⃗ = (x₀, y₀) results in a shift in the variable r of R(r, θ) by the distance d = x₀ cos θ + y₀ sin θ, which equals the length of the projection of u⃗ onto the line x cos θ + y sin θ = r.

(4) Rotation: a rotation of f(x, y) by an angle θ₀ implies a shift in the variable θ of R(r, θ) by θ₀ when φ(x, y) is an arbitrary parametric function.

(5) Scaling: a scaling of f(x, y) by a factor λ results in a scaling by λ in the variable r and by 1/λ in the amplitude of R(r, θ), respectively, when φ(x, y) represents an ellipse or hyperbola curve.

2.2. Zernike Moment. The radial Zernike moment of order (n, m) of a function f(r, θ) in polar coordinates is defined as

Z_{nm} = ((n + 1)/π) ∫₀^{2π} ∫₀^{1} f(r, θ) R_{nm}(r) e^{−jmθ} r dr dθ,

where the radial Zernike polynomial R_{nm}(r) of order (n, m) is defined by the following equation:

R_{nm}(r) = Σ_{s=0}^{(n−|m|)/2} (−1)^s [(n − s)! / (s! ((n + |m|)/2 − s)! ((n − |m|)/2 − s)!)] r^{n−2s}.
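These standard Zernike definitions can be sketched numerically as follows; the polar sampling grid and the discrete quadrature rule are assumptions of this sketch, not part of the paper's formulation.

```python
import math
import numpy as np

def radial_poly(n, m, r):
    """Radial Zernike polynomial R_{n,m}(r) (standard definition)."""
    m = abs(m)
    out = np.zeros_like(r, dtype=float)
    for s in range((n - m) // 2 + 1):
        c = ((-1) ** s * math.factorial(n - s)
             / (math.factorial(s)
                * math.factorial((n + m) // 2 - s)
                * math.factorial((n - m) // 2 - s)))
        out += c * r ** (n - 2 * s)
    return out

def zernike_moment(f_polar, n, m):
    """Zernike moment Z_{n,m} of a function sampled on a polar grid:
    f_polar[i, j] = f(r_i, theta_j), r in (0, 1], theta in [0, 2*pi)."""
    n_r, n_t = f_polar.shape
    r = (np.arange(n_r) + 0.5) / n_r           # midpoint radial samples
    theta = 2 * np.pi * np.arange(n_t) / n_t   # uniform angular samples
    R = radial_poly(n, m, r)
    kern = np.outer(R * r, np.exp(-1j * m * theta))
    # discrete approximation of the double integral
    return (n + 1) / np.pi * (f_polar * kern).sum() * (1.0 / n_r) * (2 * np.pi / n_t)
```

For example, R_{2,0}(r) = 2r² − 1, and the moment of the constant function f ≡ 1 at order (0, 0) evaluates to 1, matching the normalization (n + 1)/π.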

NRZM Descriptor Based on Nonlinear Radon Transform and Zernike Moment. The Zernike moments are computed after the projection matrix of the nonlinear Radon transform is mapped to polar coordinates (NRZM). The computational process of the proposed method, NRZM, is illustrated in Figure 1.

Suppose f̃(x, y) is the image f(x, y) rotated by an angle β and scaled by a factor λ. By the rotation and scaling properties, the nonlinear Radon transform of f̃(x, y) is given by

R̃(r, θ) = (1/λ) R(r/λ, θ − β).

The radial Zernike moments of R̃(r, θ) are

Z̃_{nm} = ((n + 1)/π) ∫₀^{2π} ∫₀^{1} R̃(r, θ) R_{nm}(r) e^{−jmθ} r dr dθ. (12)

The radial Zernike polynomial R_{nm}(λr) can be expressed as a series of the R_{km}(r) as follows:

R_{nm}(λr) = Σ_{k=|m|}^{n} c_{nmk}(λ) R_{km}(r). (13)

The derivation process of (13) is given in the Appendix. According to (12), we have

Z̃_{nm} = ((n + 1)/π) (e^{−jmβ}/λ) ∫₀^{2π} ∫₀^{1} R(r/λ, θ − β) R_{nm}(r) e^{−jm(θ−β)} r dr dθ. (14)

Let r′ = r/λ and θ = θ′ + β; then (14) can be rewritten as

Z̃_{nm} = λ e^{−jmβ} Σ_{k} ((n + 1)/(k + 1)) c_{nmk}(λ) Z_{km}. (15)

Equation (15) shows that the radial Zernike moments of the rotated and scaled image can be expressed as a linear combination of the radial Zernike moments of the original image. Based on this relationship, we can construct a set of rotation invariants.
The resulting descriptor set is then invariant to rotation and translation.
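Putting the pieces together, a miniature NRZM-style descriptor can be sketched as below: it computes Zernike moments directly on an (r, θ) Radon matrix and keeps their magnitudes, which are unchanged by a circular shift in θ (i.e., a rotation of the original image). The exact invariant combination of the paper is not reproduced; the chosen orders are illustrative.

```python
import math
import numpy as np

def nrzm_descriptor(radon_polar, orders=((0, 0), (1, 1), (2, 0), (2, 2), (3, 1))):
    """Rotation-invariant NRZM-style descriptor: magnitudes of Zernike
    moments of the (r, theta) matrix of a (nonlinear) Radon transform.
    A sketch; the paper's exact invariant set is not reproduced here."""
    n_r, n_t = radon_polar.shape
    r = (np.arange(n_r) + 0.5) / n_r
    theta = 2 * np.pi * np.arange(n_t) / n_t
    desc = []
    for n, m in orders:
        # radial Zernike polynomial R_{n,m}(r), standard definition
        R = np.zeros_like(r)
        for s in range((n - abs(m)) // 2 + 1):
            c = ((-1) ** s * math.factorial(n - s)
                 / (math.factorial(s)
                    * math.factorial((n + abs(m)) // 2 - s)
                    * math.factorial((n - abs(m)) // 2 - s)))
            R += c * r ** (n - 2 * s)
        kern = np.outer(R * r, np.exp(-1j * m * theta))
        z = (n + 1) / np.pi * (radon_polar * kern).sum() / n_r * (2 * np.pi / n_t)
        desc.append(abs(z))  # magnitude is invariant to a shift in theta
    return np.array(desc)
```

A circular shift of the input along the θ axis only multiplies each Z_{nm} by a unit-modulus phase e^{−jmΔθ}, so the magnitude vector is unchanged, which is the mechanism the derivation above exploits.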

Experimental Results and Discussions
This section tests the performance of the complete family of similarity invariants introduced in the previous section for image retrieval, in comparison with Chong's method presented in [12] and the Hu moments presented in [13].
Three subsections are included in this section. In the first subsection, we test the retrieval efficiency of the proposed descriptors on the Shapes216 dataset. This dataset is composed of 18 shape categories with 12 samples per category, and no shape can be obtained by an RST transformation of any other shape from the same category. In the second subsection, we test the robustness of the proposed descriptors on different noisy datasets. In the third subsection, we verify the rotation invariance of the proposed method.

Experiment 1.
The kind of curve changes as the control parameters vary, so the retrieval efficiency also varies with the control parameters. Many experiments were conducted to find the best parameter values of every curve in the nonlinear Radon transform, and the most suitable values are listed in Table 2. In the subsequent experiments, we analyze the retrieval efficiency of the linear Radon transform, ellipse Radon transform, hyperbola Radon transform, and parabola Radon transform combined with Zernike moments, referred to as RZ, ERZ, HRZ, and PRZ, respectively. To compare the best retrieval efficiency of each curve Radon transform, the precision-recall curves on Shapes216 are shown in Figure 2. It can be seen that the precision-recall curve of PRZ moves downward more slowly than the others, which indicates that the retrieval efficiency of PRZ is slightly higher than that of RZ, while HRZ is weaker than both PRZ and RZ.
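The precision-recall evaluation used in this experiment can be sketched as follows, assuming Euclidean distance between descriptors and a query-by-example protocol in which the remaining images of the query's category form the relevant set (11 per query for a 12-per-category dataset).

```python
import numpy as np

def precision_recall(descriptors, labels, n_relevant=11):
    """Precision at each recall level, averaged over all queries.
    Each image is used as a query against all the others; assumes every
    category holds exactly n_relevant + 1 images."""
    descriptors = np.asarray(descriptors, dtype=float)
    labels = np.asarray(labels)
    n = len(labels)
    precisions = np.zeros(n_relevant)
    for q in range(n):
        d = np.linalg.norm(descriptors - descriptors[q], axis=1)
        order = np.argsort(d)
        order = order[order != q]                  # drop the query itself
        hits = (labels[order] == labels[q]).astype(float)
        # 1-based ranks at which each relevant image is retrieved
        ranks = np.flatnonzero(hits)[:n_relevant] + 1
        # precision when the k-th relevant image has just been retrieved
        precisions += np.arange(1, len(ranks) + 1) / ranks
    recall = np.arange(1, n_relevant + 1) / n_relevant
    return recall, precisions / n
```

A perfectly separated descriptor space yields precision 1 at every recall level; curves that "move downward more slowly" correspond to descriptors whose relevant images appear earlier in the ranking.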
The number of relevant images retrieved per category gives better insight into the performance of the proposed method, as shown in Figure 3. It is easy to see that in almost every category the number of relevant images is higher than 6, especially for bird, children, elephant, face, glass, hammer, heart, and misk.

Experiment 2.
The robustness of the proposed descriptors is demonstrated using eight datasets corrupted with additive "salt & pepper" and "Gaussian" noise, respectively. The first seven datasets are generated from the original Shapes216 database, with each image corrupted by "salt & pepper" noise with SNR varying from 16 to 4 dB in 2 dB decrements. The last one is generated from Shapes216 with added "Gaussian" noise with noise density 0.01, . . . , 0.2.
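The noisy datasets can be generated along the following lines; this is a sketch, and the noise densities are illustrative since the exact SNR-to-density mapping used by the authors is not specified here.

```python
import numpy as np

def salt_and_pepper(image, density, rng=None):
    """Flip a fraction `density` of pixels to the image's darkest or
    brightest value ("salt & pepper" corruption)."""
    rng = np.random.default_rng(rng)
    out = image.copy()
    mask = rng.random(image.shape) < density
    out[mask] = rng.choice([image.min(), image.max()], size=mask.sum())
    return out

def gaussian_noise(image, sigma, rng=None):
    """Add zero-mean Gaussian noise with standard deviation `sigma`."""
    rng = np.random.default_rng(rng)
    return image + rng.normal(0.0, sigma, image.shape)
```

Sweeping `density` (or `sigma`) over a range of values reproduces the graded noise levels used to draw the precision-recall curves of Figure 4.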
The retrieval experiments were conducted again on the datasets mentioned above, and the precision-recall curves of the comparative descriptors are depicted in Figure 4. From Figures 4(a)-4(g), it can be observed that the efficiency of PRZ and RZ is similar. It can also be seen that the PRZ and RZ descriptors perform better than the other comparative methods on the "salt & pepper" noisy datasets from SNR = 16 to 8, while the Hu moments and Chong's descriptors perform similarly worst. However, when SNR = 6 and SNR = 4, the situation changes. PRZ and RZ deteriorate, as their precision-recall curves move downward more rapidly than those of HRZ and ERZ, though still more slowly than those of Chong's method and the Hu moments. This demonstrates that the PRZ and RZ descriptors are more sensitive than the other nonlinear descriptors when the SNR falls below 8, though they remain more robust than Chong's method and the Hu moments. In short, the impact of noise on the RZ, ERZ, HRZ, and PRZ curves is sometimes similar and sometimes differs from one to another. It is also observed that (1) as the SNR decreases, the curves of all descriptors generally move downward; (2) the Hu moments and Chong's descriptors are very sensitive to noise, and their performance does not change much under different levels of noise; (3) the Hu moment method is more resistant to "salt & pepper" noise than Chong's descriptors; (4) among RZ, ERZ, PRZ, and HRZ, PRZ has the strongest resistance to "salt & pepper" noise, and RZ is close to PRZ when the SNR is higher than 6; (5) PRZ is always slightly more robust to "salt & pepper" noise than RZ except at SNR = 6 and SNR = 4; (6) the ERZ and HRZ descriptors are more robust to "salt & pepper" noise than PRZ and RZ when the SNR is 6 or lower.
However, the retrieval results shown in Figure 4(h) are essentially different from those in Figures 4(a)-4(g). It is clear that ERZ and HRZ are more robust to "Gaussian" noise than the other methods, because their precision-recall curves lie entirely above the others on the "Gaussian" noisy dataset. This indicates that "Gaussian" noise results in poor performance in the linear case; in such cases, the nonlinear Radon transform should be the first choice in the proposed method.

Experiment 3.
The last test dataset is a color object dataset generated by choosing 7 sample images from the Col and View subsets. Each image is rotated by 72 angles (10-360 degrees) in 5-degree increments. As a result, the last dataset consists of 504 images, and the retrieval results are shown in Figure 5. From the figure, it can be concluded that the proposed descriptors are invariant to rotation, and the retrieval performance of PRZ is the most efficient.
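This invariance check can be reproduced in miniature: rotate an image, recompute the descriptor, and compare. The sketch below uses exact 90-degree rotations (to avoid interpolation error) and a simple radial intensity histogram as a hypothetical stand-in for the NRZM descriptor.

```python
import numpy as np

def radial_histogram(image, n_bins=8):
    """Toy rotation-invariant descriptor: intensity mass per radial
    ring about the image center (a stand-in for NRZM, not the paper's
    descriptor)."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    r = np.hypot(xs - (w - 1) / 2, ys - (h - 1) / 2)
    bins = np.minimum((r / r.max() * n_bins).astype(int), n_bins - 1)
    return np.bincount(bins.ravel(), weights=image.ravel(), minlength=n_bins)

# rotation leaves each pixel's distance from the center unchanged,
# so the descriptor is identical for all four 90-degree orientations
img = np.zeros((33, 33))
img[10:20, 12:25] = 1.0
for k in range(1, 4):
    assert np.allclose(radial_histogram(np.rot90(img, k)), radial_histogram(img))
```

The full experiment follows the same protocol with arbitrary rotation angles and the NRZM descriptor in place of the histogram.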

Conclusion
In this paper, we proposed a method to derive a set of rotation invariants using the Radon transform and Zernike moments, and extended the linear Radon transform to a nonlinear Radon transform.
Compared with the linear Radon transform, the proposed method performs better or similarly. The numerical experiments show that the different curve Radon transforms combined with Zernike moments perform differently. On the noiseless dataset, the retrieval efficiency of PRZ is higher than that of the comparative methods. On the "salt & pepper" noise datasets, PRZ consistently performs best except at SNR = 6 and SNR = 4, where ERZ and HRZ are more robust than RZ. On the "Gaussian" noise dataset, the proposed method with a nonlinear Radon transform is more robust than with the linear Radon transform. Moreover, thanks to its good characteristics, especially its robustness, the nonlinear Radon transform can be exploited in other engineering and recognition applications.

Appendix
Proof of (13). From (12), the radial Zernike polynomials can be expressed as a series of decreasing powers of r as follows: