
Many models of neural networks have been extended to complex-valued neural networks. A complex-valued Hopfield neural network (CHNN) is a complex-valued version of a Hopfield neural network. Complex-valued neurons can represent multistates, so CHNNs are suitable for storing multilevel data, such as gray-scale images. However, CHNNs are often trapped in local minima, and their noise tolerance is low. Lee improved the noise tolerance of CHNNs by detecting and exiting the local minima. In the present work, we propose a new recall algorithm that eliminates the local minima. Through computer simulations, we show that the proposed recall algorithm not only accelerates the recall but also improves the noise tolerance.

In recent years, complex-valued neural networks have been studied and applied to various areas [

CHNNs have two update modes, the asynchronous and synchronous modes. In the asynchronous mode, only one neuron updates at a time. In the synchronous mode, all the neurons update simultaneously, and the CHNN converges to a fixed point or to a cycle of length 2 [

In the present work, we propose a recall algorithm that accelerates the recall. The proposed algorithm removes the autoconnections from the update rule and uses them only to detect the local minima and the cycles. As a result, it eliminates the local minima and the cycles. We performed computer simulations on the recall speed and the noise tolerance, and showed that the proposed recall algorithm not only accelerated the recall but also improved the noise tolerance.

The rest of this paper is organized as follows. Section

In the present section, we introduce complex-valued Hopfield neural networks (CHNNs). First, we describe the architecture of a CHNN. A CHNN consists of complex-valued neurons and the connections between them. The states of the neurons and the connection weights are represented by complex numbers. We denote the state of neuron

The activation function of complex-valued neurons in the case of

Next, we describe the recall process. There are two update modes, the synchronous and asynchronous modes. In the asynchronous mode, only one neuron updates at a time. If the connection weights satisfy the stability condition and the autoconnection
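The activation and the two update modes can be sketched as follows. This is a minimal illustration, not the paper's implementation: the number of states K = 4 and the nearest-phasor quantization are our assumptions.

```python
import numpy as np

K = 4                                            # assumed number of neuron states
PHASES = np.exp(2j * np.pi * np.arange(K) / K)   # the K unit phasors

def csign(z):
    """Quantize each complex input to the nearest of the K unit phasors."""
    z = np.atleast_1d(z)
    idx = np.argmax((z[:, None] * PHASES.conj()[None, :]).real, axis=1)
    return PHASES[idx]

def synchronous_update(W, x):
    """Synchronous mode: all neurons update at once from their weighted sum inputs."""
    return csign(W @ x)

def asynchronous_update(W, x, i):
    """Asynchronous mode: only neuron i updates; the rest keep their states."""
    x = x.copy()
    x[i] = csign(np.array([W[i] @ x]))[0]
    return x
```

Repeated synchronous updates either reach a fixed point or settle into a cycle of length 2, which is the behavior the recall algorithms below must detect.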

(a) A CHNN. (b) A CHNN represented with two layers. After the right layer updates, the state of the right layer is transferred to the left layer.

Learning is defined as determining the connection weights that make the training patterns fixed points. The projection rule is one such learning algorithm. We denote the
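As a concrete sketch, assuming the standard form of the projection rule (training patterns stored as the columns of a matrix X whose Gram matrix is invertible):

```python
import numpy as np

def projection_rule(X):
    """Projection-rule weights for training matrix X (N neurons x P patterns).

    W is the orthogonal projector onto the column space of X, so every
    training pattern satisfies W @ x == x and is a fixed point of recall.
    """
    return X @ np.linalg.inv(X.conj().T @ X) @ X.conj().T
```

Because W x = x for each stored pattern, the quantizing activation leaves the pattern unchanged, which is exactly the fixed-point requirement stated above.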

We now describe the recall algorithm proposed by Lee, which uses the synchronous mode. A CHNN converges to a fixed point or a cycle. Fixed points fall into two categories, local and global minima, and we can determine which category a given fixed point belongs to. At a fixed point, we calculate the weighted sum inputs. If all the lengths of the weighted sum inputs are

Lee proposed exiting a local minimum or a cycle by changing a neuron’s state. In this work, we instead add a small amount of noise to the fixed vector in order to exit a local minimum or a cycle.
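The small perturbation can be sketched as follows. The noise model here, flipping each neuron to a uniformly random state with a given probability, is our assumption for illustration.

```python
import numpy as np

K = 4                                            # assumed number of neuron states
PHASES = np.exp(2j * np.pi * np.arange(K) / K)   # the K unit phasors

def add_noise(x, rate, rng):
    """Replace each neuron's state by a random phasor with probability `rate`."""
    mask = rng.random(x.shape) < rate
    noisy = x.copy()
    noisy[mask] = rng.choice(PHASES, size=int(mask.sum()))
    return noisy
```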

For the fast recall algorithm, we modify only the weighted sum input. The modified weighted sum input
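Under our reading of the proposal (autoconnections excluded from the weighted sum used for updating, while the full sum is kept for detecting minima), the modification amounts to:

```python
import numpy as np

def weighted_sums(W, x):
    """Return the full weighted sum (with autoconnections) and the modified
    sum (autoconnections removed), as we interpret the proposed algorithm."""
    full = W @ x
    modified = full - np.diag(W) * x   # drop the w_ii * x_i term from each input
    return full, modified
```

Updating from `modified` removes the self-reinforcement that stabilizes spurious states, while `full` remains available for classifying a fixed point as a local or global minimum.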

We performed computer simulations to compare two recall algorithms. The number of neurons was

Here we describe the recall process in the computer simulation.

A training pattern with noise was given to the CHNN.

The CHNN continued to update in the synchronous mode until it reached a fixed point or a cycle.

If the CHNN was trapped at a local minimum or a cycle, the noise was added at the rate

After updating, if the pattern was identical to the pattern that preceded the previous pattern, the CHNN was trapped in a cycle. If the pattern was equal to the previous pattern, the CHNN was trapped in a local or global minimum. If the CHNN did not reach a global minimum within 10,000 iterations, the recall process was terminated. We randomly generated 100 training pattern sets. For each training pattern set, we performed 100 trials; therefore, the number of trials was 10,000 for each condition.
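The simulation procedure above can be sketched end to end. This is a minimal Python illustration; the 4-state neurons, the random-flip noise model, and the parameter values are our assumptions, not the paper's settings.

```python
import numpy as np

K = 4                                            # assumed number of neuron states
PHASES = np.exp(2j * np.pi * np.arange(K) / K)   # the K unit phasors

def csign(z):
    """Quantize each complex input to the nearest unit phasor."""
    idx = np.argmax((z[:, None] * PHASES.conj()[None, :]).real, axis=1)
    return PHASES[idx]

def projection_rule(X):
    """Projection-rule weights; every column of X becomes a fixed point."""
    return X @ np.linalg.inv(X.conj().T @ X) @ X.conj().T

def add_noise(x, rate, rng):
    """Flip each neuron to a random phasor with probability `rate`."""
    mask = rng.random(x.shape) < rate
    x = x.copy()
    x[mask] = rng.choice(PHASES, size=int(mask.sum()))
    return x

def recall(W, x0, targets, noise_rate=0.1, max_iters=10_000, rng=None):
    """Synchronous recall with trap detection, following the procedure in
    the text: stop on a training pattern (global minimum); on a fixed point
    or a 2-cycle, add noise and continue; give up after max_iters steps."""
    rng = np.random.default_rng() if rng is None else rng
    prev = None
    x = x0
    for step in range(max_iters):
        if any(np.allclose(x, t) for t in targets):
            return x, step                       # global minimum reached
        nxt = csign(W @ x)
        fixed = np.allclose(nxt, x)              # equal to the previous pattern
        cycle = prev is not None and np.allclose(nxt, prev)
        if fixed or cycle:
            nxt = add_noise(nxt, noise_rate, rng)
        prev, x = x, nxt
    return x, max_iters                          # recall did not complete
```

Replacing `W @ x` with the autoconnection-free weighted sum gives the proposed variant, which no longer needs the noise-injection branch for local minima.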

First, we performed computer simulations for the recall speed. The numbers of training patterns used were

Recall speed in the case of

Recall speed in the case of

Recall speed in the case of

Next, we performed computer simulations for the noise tolerance. The number of training patterns was

Noise tolerance in the case of

Noise tolerance in the case of

The computer simulations for the recall speed show that the conventional algorithm tended to be trapped in the local minima and the cycles. The autoconnections worked to stabilize the states of the neurons. When

Autoconnections stabilize the states of neurons.

In the case of

Lee improved the noise tolerance of CHNNs with the projection rule by detecting and exiting the local minima and the cycles. We proposed a new recall algorithm to accelerate the recall. The proposed algorithm eliminates the local minima and the cycles, and thereby accelerated the recall. In addition, it improved the noise tolerance. In contrast, the conventional recall algorithm rarely completed the recall in cases in which

The author declares that there are no conflicts of interest regarding the publication of this article.