On Permutation Symmetries of Hopfield Model Neural Network

Reprints available directly from the publisher. Photocopying permitted by license only. © 2001 OPA (Overseas Publishers Association) N.V.

The discrete Hopfield neural network (DHNN) is studied by performing permutation operations on the synaptic weight matrix. We study the storable patterns sets that can be stored with the Hebbian learning algorithm in a network without losing memories, and propose a condition which ensures that all the patterns of a storable patterns set have the same basin size of attraction. The permutation symmetries of the network are then studied in association with the stored patterns set. A construction of storable patterns sets satisfying that condition is achieved by consideration of their invariance under a point group.

INTRODUCTION

Symmetry plays an important role in the unification and classification of physical systems [1], e.g., crystals by their point group symmetry or the atomic nucleus by the isospin symmetry. Permutation symmetry may also be important; a prominent example is replica symmetry breaking in spin glass theory, where a classification of the mean field theoretical solutions is achieved by consideration of their invariance under subgroups of the symmetric group [2].
* Corresponding author, e-mail: jydong@jingxian.xmu.edu.cn

Another object for which the study of symmetries proves to be very fruitful is the area of neural networks. Global properties of an individual network can be found from symmetry considerations of the invariance group of the specific pattern set stored by some learning rule [3, 4].
Here, the permutation symmetries of the DHNN are associated with those of the stored patterns set. The organization of this paper is as follows. Section 2 gives a brief introduction to the discrete Hopfield neural network. In Section 3, the DHNN is studied by performing permutation operations on the synaptic weight matrix, and some interesting properties are found. The permutation symmetries of the network are studied in association with the stored patterns set, and a condition that ensures all patterns of a storable patterns set have the same basin size of attraction is proposed. In Section 4, a construction of storable patterns sets with the same basin size of attraction is achieved by consideration of their invariance under a point group. We give some discussion and a conclusion in Section 5.

DISCRETE HOPFIELD NEURAL NETWORK
The discrete Hopfield neural network [5] (DHNN) was proposed mainly as an associative memory model. It is a dynamical system uniquely defined by (W, θ). Here W is a symmetric zero-diagonal real weight matrix, whose element W_ij denotes the weight of the connection from neuron j to neuron i (i, j = 1, 2, ..., N for an N-neuron network), and θ is a vector of dimension N, whose component θ_i denotes the threshold value of neuron i. The state of neuron i is denoted by X_i(t) and is either "0" or "1". In synchronous mode, the network evolves according to

X_i(t+1) = step(∑_{j=1}^{N} W_ij X_j(t) − θ_i),  i = 1, 2, ..., N,  (1)

where step(·) denotes the threshold function. Without loss of generality, we let θ_i = 0, i = 1, 2, ..., N, for simplicity in this paper.
A state X = (X_1, X_2, ..., X_N) of the network is called stable if

X_i = step(∑_{j=1}^{N} W_ij X_j − θ_i),  i = 1, 2, ..., N,  (2)

i.e., if the state of the network never changes as a result of evolution. We usually call a stable state an attractor of the network.
Let X be a stable state of the network. If X^{uv}(t) → X for some integer t (including t → ∞), we say that X^{uv}(0) is attracted by X, and the state set {X^{uv}(0) | X^{uv}(t) → X, v = 1, 2, ..., A}, whose members are all attracted by X, is the attractor basin of X.
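As an illustration, the synchronous dynamics and the stability test above can be sketched in a few lines (an illustrative sketch: the function names are ours, and the tie convention step(0) = 0 is our assumption, since the paper leaves the tie case implicit):

```python
import numpy as np

def step(h):
    # Threshold activation: 1 if the local field is positive, else 0.
    # The tie case step(0) = 0 is our assumption; the paper leaves it implicit.
    return (h > 0).astype(int)

def update(W, x):
    # One synchronous evolution step with zero thresholds (theta_i = 0).
    return step(W @ x)

def is_stable(W, x):
    # Eq. (2): a state is stable (an attractor) iff one update leaves it unchanged.
    return np.array_equal(update(W, x), x)

# Toy 3-neuron network: symmetric, zero-diagonal weight matrix.
W = np.array([[ 0,  1, -1],
              [ 1,  0, -1],
              [-1, -1,  0]])
```

For this toy matrix, the state (1, 1, 0) is left unchanged by an update, so it is an attractor.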
In associative memory, the weight matrix W is generally designed with the Hebbian algorithm. That is, to store M patterns X^k ∈ {0,1}^N, k = 1, 2, ..., M, in an N-neuron network, the synaptic weight matrix is

W = ∑_{k=1}^{M} [(2X^k − 1)(2X^k − 1)^t − I].  (3)

Here, we first convert the mono-polar patterns into bipolar ones, then define the synaptic weight matrix of the network as the summation of the outer products of the corresponding bipolar patterns. So, letting v^k = 2X^k − 1, we have

W = ∑_{k=1}^{M} (v^k (v^k)^t − I),  (4)

where I is the identity matrix.
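A minimal sketch of the Hebbian construction of Eq. (4) (illustrative code; the function name `hebbian_weights` is our own):

```python
import numpy as np

def hebbian_weights(patterns):
    # Eq. (4): convert mono-polar {0,1} patterns to bipolar via v = 2X - 1,
    # then sum the outer products, subtracting I per pattern so that the
    # diagonal of W stays zero.
    V = 2 * np.asarray(patterns) - 1
    N = V.shape[1]
    return sum(np.outer(v, v) - np.eye(N, dtype=int) for v in V)

# Two 4-component patterns stored in a 4-neuron network.
X = np.array([[1, 0, 1, 0],
              [1, 1, 0, 0]])
W = hebbian_weights(X)
```

The resulting W is symmetric with a zero diagonal, as the model requires.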
It can easily be verified that the network with either W_ii = 0 or W_ii ≠ 0 has the same properties discussed in this paper. But we prefer the model of DHNN that keeps W_ii = 0, for two reasons: (1) it is proved that a DHNN with a symmetric zero-diagonal real weight matrix will evolve to a stable state or a limit cycle whose period is 2 from any initial state in synchronous mode [6]; (2) when W_ii = 0, the network is easier to implement in applications.

A permutation can be expressed in terms of a matrix called a permutation matrix, which is a doubly stochastic square (0,1) matrix having exactly one 1 in each row and column. For example, consider the transposition q = (i j), whose permutation matrix is conveniently obtained from the identity matrix by exchanging rows i and j. Let A be an N-dimensional column vector and C be an N × N matrix; then the action of the permutation q = (i j) on them is

qA, which exchanges component i with component j;

qCq^t, which exchanges column i with column j and row i with row j,

where q^t is the transposed matrix of q.
An important property of the permutation matrix used in this paper is qq^t = I, i.e., q^t = q^{-1}, where q^{-1} is the inverse matrix of q.
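The matrix form of a transposition and the property qq^t = I can be checked directly (a sketch using 0-based indices; the helper name is ours):

```python
import numpy as np

def transposition(i, j, N):
    # Permutation matrix of the transposition (i j) on 0-based indices:
    # the identity matrix with rows i and j exchanged, so each row and
    # each column contains exactly one 1.
    q = np.eye(N, dtype=int)
    q[[i, j]] = q[[j, i]]
    return q

q = transposition(1, 3, 4)
A = np.array([10, 20, 30, 40])
C = np.arange(16).reshape(4, 4)
```

Here q @ A exchanges components 1 and 3 of A, and q @ C @ q.T exchanges both the corresponding rows and columns of C, while q @ q.T recovers the identity.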
When the synaptic weight matrix W of a network H is operated on by a random permutation matrix q, the relation between H and the new network H' whose synaptic weight matrix is W' = qWq^t can be described by the following theorem.

THEOREM 1 Let H and H' be two discrete Hopfield neural networks whose synaptic weight matrices are W and W' respectively, such that W' = qWq^t. Let the set of attractors of the network H be denoted as Ω = {V^u, u = 1, 2, ..., M}, and let the attractor basin of V^u be denoted as Ω^u = {X^{uv}, v = 1, ..., A}.
Then the set of attractors of the network H' must be Ω' = {V'^u = qV^u, u = 1, 2, ..., M}, and the attractor basin of V'^u must be Ω'^u = {X'^{uv} = qX^{uv}, v = 1, ..., A}.
Proof There are two parts to this theorem.
First, let X^{uv} denote an initial state that converges to the attractor V^u in the network H, i.e., V(0) = X^{uv} and V(t) → V^u. Since W' = qWq^t and q^t q = I, for any state V we have step(W'qV) = step(qWV) = q · step(WV), so the trajectory of H' from the initial state qV(0) is q times the trajectory of H from V(0). Hence every state in Ω'^u = {X'^{uv} = qX^{uv}, v = 1, ..., A} must converge to the attractor V'^u = qV^u in the network H'. Second, assume there were an initial state of H' not behaving in this way; then, proceeding similarly as above, let V(t) and V'(t) respectively denote the trajectories of the dynamical systems H and H' from the initial states V(0) and V'(0), and consider V(0) = q^t V'(0); the corresponding trajectory of H contradicts the assumption.
Thus the theorem is proven. ∎

Theorem 1 describes how the network changes after a random permutation operation is performed.
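Theorem 1 can be checked numerically: if V is an attractor of W, then qV is an attractor of qWq^t. A small sketch (one stored pattern, random permutation; all names and the tie convention step(0) = 0 are our own assumptions):

```python
import numpy as np

def step(h):
    # Threshold activation (tie convention step(0) = 0 assumed).
    return (h > 0).astype(int)

def is_stable(W, x):
    return np.array_equal(step(W @ x), x)

# Store one pattern V by the Hebbian rule (zero diagonal), so V is stable.
V = np.array([1, 0, 1, 1, 0])
v = 2 * V - 1
W = np.outer(v, v) - np.eye(5, dtype=int)

# A random 5 x 5 permutation matrix q.
rng = np.random.default_rng(7)
q = np.eye(5, dtype=int)[rng.permutation(5)]

# Theorem 1: the permuted network W' = q W q^t has qV as an attractor.
W2 = q @ W @ q.T
```

Both stability checks succeed: V is stable in the original network, and qV is stable in the permuted one.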
However, there exist some permutations under which the network remains invariant. Those permutations are the permutation symmetries of the network.
Before studying the permutation symmetries of the network, we first introduce some definitions.

DEFINITION 1 The symmetric permutation of a patterns set Φ = {X^u, u = 1, 2, ..., M} is defined as a permutation q which maps the patterns set to itself, i.e., Φ = {qX^u, u = 1, 2, ..., M}. The symmetric permutation of a network with synaptic weight matrix W is defined as a permutation g such that gWg^t = W.

DEFINITION 2 A patterns set {X^u, u = 1, 2, ..., M} is storable [8] if all M sample patterns can be stable states of some network whose synaptic weight matrix W is a zero-diagonal real symmetric matrix.

DEFINITION 3 The Hamming weight w(V) of an N-dimensional binary vector V ∈ {0,1}^N is defined as the number of 1-components in the vector, w(V) = ∑_{i=1}^{N} V_i.
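Definitions 1 and 3 translate directly into code (an illustrative sketch; the function names are ours):

```python
import numpy as np

def hamming_weight(V):
    # Definition 3: the number of 1-components of a binary vector.
    return int(np.sum(V))

def is_symmetric_permutation(g, W):
    # Definition 1 (network form): g is a symmetric permutation of the
    # network with weight matrix W if g W g^t = W.
    return np.array_equal(g @ W @ g.T, W)

# Example: exchanging neurons 0 and 1 leaves this weight matrix unchanged.
g = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 1]])
W = np.array([[0, 1, 2],
              [1, 0, 2],
              [2, 2, 0]])
```

Swapping rows 0, 1 and columns 0, 1 of this W reproduces W itself, so g is a symmetric permutation of the network.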
In a Hebbian neural network, the permutation symmetries of the network can be studied through the stored patterns set. We establish the following theorem to show the relationship between them.
THEOREM 2 Let G be the set of symmetric permutations of the sample patterns set Φ = {X^u | X^u = (X^u_1, X^u_2, ..., X^u_N), u = 1, 2, ..., M}, i.e., G = {q | qΦ = Φ}, where X^u ∈ {0,1}^N and q is a permutation matrix. If for all X^i, X^j ∈ Φ there exists at least one permutation q ∈ G such that qX^i = X^j, then we have: (1) All patterns in Φ have the same Hamming weight.
(2) All of the permutations in G are symmetric permutations of the network with the synaptic weight matrix W in which the patterns of Φ are stored by the Hebbian rule.
(3) All patterns in Φ are stable in the network (W) and have the same size of attractor basins if only one of the patterns is stable in the network. (4) G is a group; it is a subgroup of the symmetric group S_n, which consists of all permutations of the elements of the set {1, 2, ..., n}.

Proof
(1) Let X^u and X^m belong to the patterns set Φ, and let their Hamming weights be K and L respectively. Let q ∈ G be a permutation relating them, which replaces the component order (1, 2, ..., N) with (i_1, i_2, ..., i_N), i.e.,

X^m = qX^u = q(X^u_1, X^u_2, ..., X^u_N)^t = (X^u_{i_1}, X^u_{i_2}, ..., X^u_{i_N})^t.

Assume K ≠ L, i.e., w(X^m) ≠ w(X^u). On the other hand,

w(X^m) = w(qX^u) = ∑_{k=1}^{N} X^u_{i_k}.

Because the summation of the components of a vector is immaterial to their order,

∑_{k=1}^{N} X^u_{i_k} = ∑_{i=1}^{N} X^u_i,  so  w(X^m) = w(X^u),

i.e., K = L, which contradicts the assumption. Therefore, all patterns in Φ have the same Hamming weight.
(2) To store the patterns of Φ in a network, the synaptic weight matrix is

W = ∑_u [(2X^u − 1)(2X^u − 1)^t − I].

For any q ∈ G, i.e., Φ' = {qX^u} = Φ, let W' be the weight matrix that stores the patterns of Φ' = {qX^u}. Since q(2X^u − 1) = 2qX^u − 1 and qIq^t = I, we have

W' = ∑_u [(2qX^u − 1)(2qX^u − 1)^t − I] = qWq^t.

But Φ' = Φ, so W' = W, and hence qWq^t = W; that is, every q ∈ G is a symmetric permutation of the network.
(3) Suppose X^m ∈ Φ is stable in the network, i.e., step(WX^m) = X^m. Consider another vector X^u ∈ Φ. Let the permutation relationship between X^u and X^m be X^u = qX^m, q ∈ G. We have

step(WX^u) = step(WqX^m) = step(qWq^t qX^m)   (since q is a symmetric permutation of the network, W = qWq^t)
= step(qWX^m) = q · step(WX^m) = qX^m = X^u,

where step commutes with q because q merely permutes components. That is, X^u is an attractor of the network.
Similarly, we can prove that all patterns in Φ are attractors of the network, i.e., the patterns set Φ is storable. Moreover, since qWq^t = W, Theorem 1 shows that q maps the attractor basin of X^m onto that of X^u = qX^m, so all patterns in Φ have the same size of attractor basin.
(4) In the permutation set G, the product of two permutations pq is defined as the resultant permutation of first permuting with q and then with p. Now we prove that the group postulates are satisfied.

Closure If q and p belong to G, then c = qp also belongs to G: since qΦ = Φ and pΦ = Φ, we have (qp)Φ = q(pΦ) = qΦ = Φ.
Associativity We know from group theory that associativity holds for any permutations, i.e., c(qp) = (cq)p for any permutations c, q, p. Obviously, G also satisfies this postulate.
Identity Element e The permutation matrix of e is an identity matrix, so eq = qe = q holds for any permutation, in particular for the permutations in G; and since eΦ = Φ, we have e ∈ G. Therefore, G satisfies this postulate.
Inverse Element We know that q^{-1} = q^t holds for any permutation.
For any symmetric permutation q ∈ G, we have qΦ = Φ, so q^{-1}Φ = q^{-1}(qΦ) = Φ. So q^{-1} is also a symmetric permutation of Φ, i.e., q^{-1} ∈ G.
The four postulates are satisfied for G. Therefore, G is a group; more precisely, it is a subgroup of the symmetric group S_n consisting of all permutations of the elements of the set {1, 2, ..., n}.
Thus, the theorem is proved. ∎

In Theorem 2, we give a condition of permutation symmetry that ensures a set of patterns is storable and that all its patterns have the same basin size of attraction, which can serve as a selection rule for the stored patterns set in associative memory, especially for special uses of it.
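Theorem 2 can be illustrated end to end with a small pattern set that is an orbit of the cyclic-shift group (a toy example of our own, again assuming the tie convention step(0) = 0): all patterns share one Hamming weight, all are stable once stored by the Hebbian rule, and exhaustive enumeration of the 2^N states shows equal basin sizes.

```python
import numpy as np
from itertools import product

def step(h):
    # Threshold activation (tie convention step(0) = 0 assumed).
    return (h > 0).astype(int)

def attractor_of(W, x, max_iters=50):
    # Follow synchronous updates; return the fixed point reached,
    # or None if the trajectory falls into a cycle instead.
    for _ in range(max_iters):
        nxt = step(W @ x)
        if np.array_equal(nxt, x):
            return tuple(x)
        x = nxt
    return None

# Pattern set: the orbit of (1,1,0,0,0,0) under cyclic shifts -- the shift
# group acts transitively on it, as Theorem 2 requires.
N = 6
base = np.array([1, 1, 0, 0, 0, 0])
patterns = [np.roll(base, k) for k in range(N)]

# Hebbian weight matrix with zero diagonal (Eq. (4)).
W = sum(np.outer(2 * p - 1, 2 * p - 1) - np.eye(N, dtype=int) for p in patterns)

# (1) One common Hamming weight; (3) every pattern is stable.
weights = {int(p.sum()) for p in patterns}
stable = [bool(np.array_equal(step(W @ p), p)) for p in patterns]

# (3) Equal basin sizes: enumerate all 2^N initial states and count how
# many converge to each stored pattern.
basin = {tuple(p): 0 for p in patterns}
for bits in product([0, 1], repeat=N):
    a = attractor_of(W, np.array(bits))
    if a in basin:
        basin[a] += 1
basin_sizes = sorted(basin.values())
```

The equality of basin sizes is exactly what the cyclic symmetry guarantees: the shift permutation maps each basin bijectively onto the next.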

APPLICATION
In associative memory, the main task is to design a weight matrix that stores the given patterns set through a certain learning process, so that the stable states of the network are exactly the given patterns set. If some of the given patterns are not stable states of the network, we say that the network loses some memories. If a state V is a stable state of the network but does not belong to the given patterns set, we say that V is a spurious stable state of the network. Both losing memories and the existence of spurious stable states are the main problems of associative memory, and the former mainly results from the stored patterns set not being storable; e.g., similarity among the stored patterns may cause this problem. In this section, we demonstrate how to construct a storable patterns set by the method of group theory.
It is well known that a point group can perfectly summarize the rotation symmetries of a geometric graph in Euclidean space, and any point group is a subgroup of the symmetric group S_n. So we can use the rotation symmetries of a geometric graph to construct the storable patterns set described in Theorem 2. For example, there are 24 rotation symmetries of the cube. These 24 rotation symmetries constitute a group, called the hexahedron group, which is isomorphic to S_4. We can randomly label the vertices of the cube with {1, 2, ..., 8} (see Fig. 1). Then a symmetric rotation of the cube corresponds to a permutation of the vertex indices. For example, when we rotate the cube 120° around the axis 1-3, the cube occupies the same position in space, but the vertices' positions are changed, which amounts to the permutation operation (4 7 5)(8 2 6).
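The cycle notation above can be turned into an explicit vertex mapping (a sketch; the cycles (4 7 5)(8 2 6) are taken from the text, the labeling depends on Fig. 1, and the helper name is ours):

```python
def apply_cycles(cycles, n):
    # Build the mapping of vertex labels 1..n induced by disjoint cycles:
    # [4, 7, 5] means 4 -> 7, 7 -> 5, 5 -> 4; labels not listed stay fixed.
    perm = {i: i for i in range(1, n + 1)}
    for cyc in cycles:
        for a, b in zip(cyc, cyc[1:] + cyc[:1]):
            perm[a] = b
    return perm

# The 120-degree rotation about the axis 1-3: vertices 1 and 3 are fixed.
rot120 = apply_cycles([[4, 7, 5], [8, 2, 6]], 8)

# Applying the rotation three times returns every vertex to its place.
rot360 = {i: rot120[rot120[rot120[i]]] for i in rot120}
```

As expected for a 120° rotation, the permutation has order 3: composing it with itself three times gives the identity.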
We can examine this by constructing a network with the 6 vectors. Computed by Eq. (3), the synaptic weight matrix is given by Eq. (7).

TABLE I The classification table of the network with the synaptic matrix of Eq. (7). In each column, the vector above the line is an attractor, and the vectors below the line form its basin of attraction.

In this example, we show that a storable patterns set can be constructed by the octahedral group with the aid of a geometric graph in Euclidean space. However, the method can easily be generalized to other groups.

DISCUSSION AND CONCLUSION
For convenience of writing, we list all the attractors and their domains of attraction in their transposed forms in Table I.
We can see from Table I that the patterns set consisting of the 6 vectors is storable and that all 6 vectors have the same basin size of attraction.
The discrete Hopfield neural network was proposed mainly as an associative memory model. Hopfield demonstrated by computer simulation that a network with N neurons can store about 0.138N patterns in the form of stable states; that is, not every patterns set is storable. One may think that the main reason for the small capacity is the dynamical system of the network itself, and some researchers have tried to improve the capacity by modifying the dynamics of the network, such as the dynamical behavior [9], the connecting dynamics [10, 11], the output function [12], and so on. But the results are not satisfactory. All this shows that a storable patterns set is necessary for associative memory, especially for special uses of it.
In addition, it should also be noticed that all the prototypes built to date, in optical or very large-scale integrated technology, are still of fairly small size. So it is necessary to derive exact properties of the network, as opposed to asymptotic or not completely rigorous ones, such as those based on mean-field theory in statistical mechanics, which may be valid only in the limit of large systems.
In this paper, we study the discrete Hopfield neural network by the method of permutation, and associate the permutation symmetries of the network with its stored patterns set. In this way, we find some interesting properties of the DHNN, which can be applied to associative memory in selecting a storable patterns set. It can easily be verified that the well-known Hadamard patterns set [13], omitting the trivial pattern (0, 0, ..., 0), whose permutation symmetries were proved by Folk [14] to be isomorphic to the group GL(n, F_2), satisfies the condition proposed in Theorem 2. So the Hadamard network, in which the Hadamard patterns set is stored by the Hebbian learning rule, can be regarded as a special case of this paper.