Storage Capacities of Twin-Multistate Quaternion Hopfield Neural Networks

A twin-multistate quaternion Hopfield neural network (TMQHNN) is a multistate Hopfield model that can store multilevel information, such as image data. Storage capacity is an important issue for Hopfield neural networks. Jankowski et al. approximated the crosstalk terms of complex-valued Hopfield neural networks (CHNNs) by two-dimensional normal distributions and evaluated their storage capacities. In this work, we evaluate the storage capacities of TMQHNNs based on their idea.

In this work, we study twin-multistate quaternion Hopfield neural networks (TMQHNNs) [18]. A neuron of a TMQHNN consists of a pair of complex-valued multistate neurons. A TMQHNN requires only half the connection weight parameters of a CHNN.
Storage capacity is an important issue in Hopfield neural networks. When a Hopfield neural network is given a training pattern, the weighted sum input is decomposed into a main term and a crosstalk term. The main term enables the Hopfield neural network to memorize the training patterns, whereas the crosstalk term interferes with their storage. The storage capacity of the conventional Hopfield neural network has been investigated by evaluating the crosstalk term [19]. Jankowski et al. [1] and Kobayashi [20] applied this technique to CHNNs and rotor Hopfield neural networks (RHNNs), respectively. An RHNN is an extension of the CHNN using vectors and matrices [21]. In this work, we provide the Hebbian learning rule for TMQHNNs and evaluate their storage capacity based on Jankowski's approach.
In the case of TMQHNNs, the crosstalk term is decomposed into two complex parts. By evaluating both parts, we determine the storage capacity of TMQHNNs. The theory suggests that TMQHNNs have half the storage capacity of CHNNs. In addition, we compared the storage capacities of CHNNs and TMQHNNs by computer simulation. The rest of this paper is organized as follows: Sections 2 and 3 introduce CHNNs and TMQHNNs, respectively. Section 4 provides the Hebbian learning rule for the TMQHNN and evaluates the storage capacity; it also describes the computer simulations conducted to verify our analysis. Section 5 concludes this paper.

Complex-Valued Hopfield Neural Networks
CHNNs are briefly described here [1]. Let $z_a$ and $w_{ab}$ be the state of neuron $a$ and the connection weight from neuron $b$ to neuron $a$, respectively. The weighted sum input to neuron $a$ is given by
$$I_a = \sum_{b=1}^{N_C} w_{ab} z_b,$$
where $N_C$ is the number of neurons. For the resolution factor $K$, we define $\theta_K = \pi/K$. For the weighted sum input $I = r \exp(i\theta)$, where $r \ge 0$ and $0 \le \theta < 2\pi$, the complex-valued multistate activation function is defined by
$$f_C(I) = \exp(2ki\theta_K) \quad \big((2k-1)\theta_K \le \theta < (2k+1)\theta_K\big).$$
We define the set of neuron states and denote it as follows:
$$S = \{1, \exp(2i\theta_K), \ldots, \exp(2(K-1)i\theta_K)\}.$$
The connection weights must satisfy the conditions $w_{ab} = \overline{w_{ba}}$ (3) and $w_{aa} \ge 0$ (4). Then, the CHNN converges to a fixed point.
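As an illustration, the complex-valued multistate activation function described above can be sketched as follows. This is a minimal sketch, assuming the sector-quantization form with sectors of half-width $\theta_K$ centered on each state; the function name is ours, not from the original paper.

```python
import numpy as np

def multistate_activation(I, K):
    """Quantize the phase of a complex input I to the nearest of the K
    states exp(2*i*k*theta_K), theta_K = pi/K (sectors centered on states)."""
    theta_K = np.pi / K
    # shift by theta_K so each state's sector maps to a single index k
    k = int(np.floor((np.angle(I) + theta_K) / (2 * theta_K))) % K
    return np.exp(2j * k * theta_K)
```

For example, with $K = 4$ the states are $1, i, -1, -i$, and any input whose phase lies within $\pm\pi/4$ of a state is quantized to that state.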
Let $\mathbf{z}^p = (z_1^p, z_2^p, \ldots, z_{N_C}^p)$ be the $p$th training pattern, where $P_C$ is the number of training patterns. The Hebbian learning rule is defined as
$$w_{ab} = \sum_{p=1}^{P_C} z_a^p \overline{z_b^p}.$$
Then, the connection weights satisfy $w_{ab} = \overline{w_{ba}}$. Giving the $q$th training pattern to the CHNN, the weighted sum input to neuron $a$ is
$$I_a^q = \sum_b w_{ab} z_b^q = N_C z_a^q + \sum_{p \ne q} \sum_b z_a^p \overline{z_b^p} z_b^q. \quad (7)$$
The second term of (7) is referred to as the crosstalk term.
The crosstalk term interferes with the storage of training patterns. We define
$$A_a^q = \frac{1}{N_C} \overline{z_a^q} \sum_{p \ne q} \sum_b z_a^p \overline{z_b^p} z_b^q,$$
so that $I_a^q = N_C z_a^q (1 + A_a^q)$. If $|\arg(1 + A_a^q)| < \theta_K$, then we have $f_C(I_a^q) = z_a^q$. Therefore, if $|\arg(1 + A_a^q)| < \theta_K$ for all $a$, the $q$th training pattern is a fixed point. For simplicity, we regard $N_C A_a^q$ as the summation of $P_C N_C$ random variables of $S$, although the summation consists of exactly $(P_C - 1) N_C$ terms. The real and imaginary parts of each random variable have equal variance $\sigma^2$ and are uncorrelated. Setting $P_C = \alpha N_C$, $N_C A_a^q$ is regarded as the summation of $\alpha N_C^2$ random variables; then, the real and imaginary parts of $A_a^q$ have variance $\alpha \sigma^2$. Let $X_A$ and $Y_A$ be the real and imaginary parts of $(1/\sqrt{\alpha}) A_a^q$, respectively. From the central limit theorem, $X_A$ and $Y_A$ independently follow the normal distribution $N(0, \sigma^2)$.
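The fixed-point criterion $|\arg(1 + A_a^q)| < \theta_K$ under the normal approximation can be probed numerically. The sketch below is our own illustration of this Jankowski-style estimate, not code from the paper; the per-component variance `sigma2 = 0.5` is an assumed normalization for uniformly distributed unit states.

```python
import numpy as np

def fixed_neuron_prob(alpha, K, sigma2=0.5, trials=200_000, seed=0):
    """Monte Carlo estimate of P(|arg(1 + A)| < theta_K) when the real and
    imaginary parts of the crosstalk A are independent N(0, alpha*sigma2)."""
    rng = np.random.default_rng(seed)
    theta_K = np.pi / K
    s = np.sqrt(alpha * sigma2)
    A = s * rng.standard_normal(trials) + 1j * s * rng.standard_normal(trials)
    return np.mean(np.abs(np.angle(1 + A)) < theta_K)
```

As expected, the probability that a neuron keeps its stored state falls as the load $\alpha$ grows or as the resolution factor $K$ increases (narrower sectors).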

Twin-Multistate Quaternion Hopfield Neural Networks
A quaternion is expressed as $q = q_0 + q_1 i + q_2 j + q_3 k$ using real numbers $q_0$, $q_1$, $q_2$, and $q_3$. The imaginary units $i$, $j$, and $k$ satisfy
$$i^2 = j^2 = k^2 = ijk = -1.$$
The quaternions satisfy the associative and distributive laws. For a complex number $c$, we have the important equality $jc = \bar{c} j$. Putting $x = q_0 + q_1 i$ and $y = q_2 + q_3 i$, the quaternion $q$ is described as $q = x + yj$. For quaternions $x + yj$ and $x' + y'j$, the addition and multiplication are described as
$$(x + yj) + (x' + y'j) = (x + x') + (y + y')j,$$
$$(x + yj)(x' + y'j) = (xx' - y\overline{y'}) + (xy' + y\overline{x'})j.$$
The conjugate of $q$ is defined as $\overline{q} = \overline{x} - yj$. Then, we have the equality $q\overline{q} = |x|^2 + |y|^2$. In TMQHNNs, the neuron states and connection weights are represented by quaternions and are denoted in the same way as those of CHNNs. The number of neurons in a TMQHNN is denoted as $N_Q$. The weighted sum input to neuron $a$ is given by
$$I_a = \sum_{b=1}^{N_Q} w_{ab} z_b.$$
For the weighted sum input $I = I_x + I_y j$, the activation function is defined as
$$f_Q(I) = f_C(I_x) + f_C(I_y) j.$$
Therefore, the set of neuron states is $S + Sj$. The connection weights must satisfy conditions (3) and (4). Then, the TMQHNN converges to a fixed point.
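The complex-pair representation $q = x + yj$ makes quaternion arithmetic easy to implement with ordinary complex numbers. The following sketch (our own helper functions, not from the paper) encodes a quaternion as a tuple `(x, y)` and applies the multiplication and conjugation rules above.

```python
import numpy as np

def qmul(p, q):
    """Quaternion product of complex pairs p = x + y*j and q = x2 + y2*j,
    using (x+yj)(x'+y'j) = (xx' - y*conj(y')) + (xy' + y*conj(x'))j."""
    x, y = p
    x2, y2 = q
    return (x * x2 - y * np.conjugate(y2), x * y2 + y * np.conjugate(x2))

def qconj(p):
    """Quaternion conjugate: conj(x + y*j) = conj(x) - y*j."""
    x, y = p
    return (np.conjugate(x), -y)
```

For instance, with $i = (i, 0)$ and $j = (0, 1)$, `qmul` reproduces $ij = k = (0, i)$, and $q\overline{q}$ comes out as the real number $|x|^2 + |y|^2$.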

Storage Capacity of Twin-Multistate Quaternion Hopfield Neural Networks
We provide the Hebbian learning rule for TMQHNNs. Let $\mathbf{z}^p = (z_1^p, z_2^p, \ldots, z_{N_Q}^p)$ be the $p$th training pattern, where $P_Q$ is the number of training patterns. The Hebbian learning rule is given by
$$w_{ab} = \sum_{p=1}^{P_Q} z_a^p \overline{z_b^p}.$$
Then, the connection weights satisfy $w_{ab} = \overline{w_{ba}}$. Giving the $q$th training pattern to the TMQHNN, the weighted sum input to neuron $a$ is
$$I_a^q = \sum_b w_{ab} z_b^q = 2 N_Q z_a^q + \sum_{p \ne q} \sum_b z_a^p \overline{z_b^p} z_b^q. \quad (22)$$
The second term of (22) is also referred to as the crosstalk term and interferes with the storage of training patterns. To investigate the storage capacity, we decompose the quaternion $z_a^p$ into a pair of complex numbers by $z_a^p = x_a^p + y_a^p j$. Then, we have
$$I_a^q = 2 N_Q \big( x_a^q (1 + B_a^q) + y_a^q (1 + C_a^q) j \big), \quad (24)$$
which defines $B_a^q$ and $C_a^q$. If $|\arg(1 + B_a^q)| < \theta_K$ and $|\arg(1 + C_a^q)| < \theta_K$, then we have $f_Q(I_a^q) = z_a^q$. We regard $2 N_Q B_a^q$ and $2 N_Q C_a^q$ as the summations of $4 N_Q P_Q$ random variables of $S$. Then, $B_a^q$ and $C_a^q$ follow the same distribution; thus, we discuss only $B_a^q$. If a TMQHNN is used instead of a CHNN, $N_C = 2 N_Q$ is required, since a twin-multistate quaternion neuron consists of two complex-valued multistate neurons. Setting $P_Q = \beta N_C = 2 \beta N_Q$, $2 N_Q B_a^q$ is regarded as the summation of $8 \beta N_Q^2$ random variables; then, the real and imaginary parts of $B_a^q$ have variance $2 \beta \sigma^2$. Let $X_B$ and $Y_B$ be the real and imaginary parts of $(1/\sqrt{2\beta}) B_a^q$, respectively. From the central limit theorem, $X_B$ and $Y_B$ independently follow the normal distribution $N(0, \sigma^2)$. Requiring the same distributions for $A_a^q$ and $B_a^q$, we obtain $\alpha = 2\beta$. Thus, the CHNN has double the storage capacity of the TMQHNN.
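The quaternion Hebbian rule and the factor 2 in the main term can be checked concretely: a twin-multistate state has squared norm $|x|^2 + |y|^2 = 2$, so with a single stored pattern the weighted sum is exactly twice the stored state. The sketch below is our own illustration under these assumptions; `qmul`, `qconj`, and `hebbian_weight` are hypothetical helper names.

```python
import numpy as np

def qmul(p, q):
    x, y = p; x2, y2 = q
    return (x * x2 - y * np.conjugate(y2), x * y2 + y * np.conjugate(x2))

def qconj(p):
    x, y = p
    return (np.conjugate(x), -y)

def hebbian_weight(z_a, z_b):
    # One Hebbian weight w_ab = z_a * conj(z_b) for a single training pattern
    return qmul(z_a, qconj(z_b))

# Twin-multistate states: each complex component drawn from S (K-th roots of unity)
K = 4
theta_K = np.pi / K
z = [(np.exp(2j * 0 * theta_K), np.exp(2j * 1 * theta_K)),
     (np.exp(2j * 3 * theta_K), np.exp(2j * 2 * theta_K))]

# With one stored pattern, w_01 * z_1 = z_0 * conj(z_1) * z_1 = |z_1|^2 * z_0 = 2 * z_0,
# so the weighted sum points exactly at the stored state (no crosstalk).
w01 = hebbian_weight(z[0], z[1])
I0 = qmul(w01, z[1])
```

With more patterns, the extra terms form the crosstalk in (22), whose two complex parts give $B_a^q$ and $C_a^q$.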
Computer simulations were conducted to verify our analysis. $K$ was varied from 4 to 12 in steps of 2, and $P$ was varied from 1 to 20. For each pair of $K$ and $P$, 100 sets of training patterns were randomly generated; that is, 100 trials were performed.
The CHNN and TMQHNN attempted to store the training patterns by the Hebbian learning rule. If all the training patterns were fixed points, the trial was regarded as successful; otherwise, it was regarded as failed. Figure 1 shows the simulation results. The horizontal and vertical axes indicate the number of training patterns and the success rate, respectively. The simulation results showed that the storage capacity of the TMQHNN was slightly larger than half that of the CHNN.
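The CHNN side of such an experiment can be sketched compactly. This is our own reconstruction, assuming the Hebbian rule with zeroed diagonal and the fixed-point test $|\arg(I_a \overline{z_a^p})| < \theta_K$; the function name and trial count are ours.

```python
import numpy as np

def chnn_success_rate(N, P, K, trials=50, seed=1):
    """Fraction of trials in which all P random patterns are fixed points
    of a CHNN trained with w_ab = sum_p z_a^p conj(z_b^p) (diagonal zeroed)."""
    rng = np.random.default_rng(seed)
    theta_K = np.pi / K
    states = np.exp(2j * theta_K * np.arange(K))
    successes = 0
    for _ in range(trials):
        Z = states[rng.integers(0, K, size=(P, N))]   # P patterns of N neurons
        W = Z.T @ np.conj(Z)                          # Hebbian weights
        np.fill_diagonal(W, 0)
        I = Z @ W.T                                   # I[p, a] = sum_b w_ab z_b^p
        # every neuron of every pattern must stay in its own phase sector
        if np.all(np.abs(np.angle(I * np.conj(Z))) < theta_K):
            successes += 1
    return successes / trials
```

A single stored pattern is always a fixed point (the weighted sum is $(N-1)z_a$), while the success rate drops toward zero as $P$ approaches the network's capacity.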

Conclusions
A TMQHNN needs only half the connection weight parameters of a CHNN and is therefore expected to have a smaller storage capacity. However, its storage capacity had not yet been analyzed. In this work, we defined the Hebbian learning rule for TMQHNNs and analyzed their storage capacity. The analysis demonstrated that a TMQHNN has half the storage capacity of a CHNN. In addition, a computer simulation was conducted to verify the analysis, and the simulation results confirmed it. In the future, we intend to study the storage capacity using different methods [19, 22, 23].

Data Availability
No data were used to support this study.