AN EVOLUTIONARY RECURSIVE ALGORITHM IN SELECTING STATISTICAL SUBSET NEURAL NETWORK/VDL FILTERING

We propose an evolutionary recursive algorithm, for the exact windowed case, to estimate subset vector discrete lag (SVDL) filters with a forgetting factor and an intercept variable. SVDL filtering has been demonstrated as a basis for constructing a multi-layered polynomial neural network by Penm et al. (2000). The newly proposed time update recursions allow users to update SVDL filters at consecutive time instants, and can show evolutionary changes detected in filter structures. With this new approach we are able to analyse more effectively complex relationships in which the relevant financial time series have been generated from structures subject to evolutionary changes in their environment. An illustration of these procedures is presented to examine whether the integration between the Australian and the Japanese bond markets, and between the USA and the UK bond markets, changed over the period. The proposed algorithms are also applicable to full-order vector discrete lag (VDL) filtering with a forgetting factor and an intercept.


Introduction
Statistical filter researchers for financial time-series systems are often concerned that the coefficients of their established filters may not be constant over time, but vary when the filters are disturbed by changes arising from outside environmental factors. This concern has motivated researchers to develop sequential estimation algorithms that allow users to update subset time series filters at consecutive time instants, allow the coefficients to evolve slowly, and thereby reveal evolutionary changes in filter structures. Hannan and Deistler [4] propose a recursive estimation of an autoregressive (AR) filter. Azimi-Sadjadi et al. [1] suggest a full-order updating procedure for the training process of a multilayer neural network.
These studies utilise the fixed forgetting factor (henceforth called the forgetting factor) in the filtering and simulations of nonstationary time series. The forgetting factor described in Hannan and Deistler [4] has been widely adopted to account for nonstationarity of time series, and is suitable for capturing nonstationarity in a filter in which the underlying relationships between the variables involved change smoothly and gradually. To emphasise the importance of more recent observations in filtering, the forgetting factor allocates greater weight to more recent estimated residuals and "forgets" some of the past. When the recursions are implemented in the stationary situation, the value of the forgetting factor is set to one.
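As a minimal numerical sketch (not from the paper, and with an illustrative function name), the exponential weighting implied by a forgetting factor λ can be realised recursively: accumulating S_T = λ S_{T−1} + y(T) y(T)′ gives the observation at time t the weight λ^{T−t}.

```python
import numpy as np

def forgetting_sum(ys, lam):
    """Recursive accumulation S_T = lam * S_{T-1} + y_T y_T',
    so the observation y_t carries weight lam ** (T - t)."""
    d = ys.shape[1]
    S = np.zeros((d, d))
    for y in ys:                      # oldest observation first
        S = lam * S + np.outer(y, y)  # discount the past, add the present
    return S
```

With lam = 1 every observation is weighted equally, matching the stationary situation described above.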
The use of vector discrete lag (VDL) filtering in financial time series is versatile. As shown in Penm et al. [6], a VDL filter has been demonstrated as a basis for constructing a multi-layered polynomial neural network. Further, Holmes and Hutton [5] suggest the use of an SVDL system to assess the relationship between z(t) and the set of current and lagged y(t), where there is a continuous or random delay.
In order to increase the filtering and analysis power of neural networks applied to a time series system, Penm et al. [6] introduce the inhibitor arc and the switchable connection to the neural network structure. The inhibitor arc was introduced to network theory by Petri [9], and the associated connection strength for all such arcs is constrained to zero at all times. The switchable connection is obtained from switching theory, and its strength is switchable between zero and non-zero at any time. To incorporate such powerful connections, the commonly used estimation algorithms for full-order filters are not applicable, because the structure of the lag coefficients is estimated without the "presence and absence" restrictions. As a result, it is necessary to develop an estimation algorithm for SVDL filters which includes full-order filters as a special case.
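The distinction between the two connection types can be sketched as a simple masking rule; this is an illustrative data structure of ours, not the authors' implementation. Inhibitor arcs are pinned to zero strength at all times, while switchable connections may hold any estimated strength, including zero.

```python
import numpy as np

INHIBITOR = 0   # connection strength constrained to zero at all times
SWITCHABLE = 1  # strength may switch between zero and non-zero

def constrain_strengths(strengths, arc_types):
    """Zero out every connection marked as an inhibitor arc;
    switchable connections keep their estimated strength."""
    return np.where(arc_types == INHIBITOR, 0.0, strengths)
```

Under this rule, re-estimating the unconstrained strengths and then re-applying the mask keeps the "presence and absence" restrictions intact at every time instant.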
While there are well developed time update full-order VDL filtering algorithms [see Carayannis et al. [3]], these are not applicable to VDL filters with a subset structure. This is because the "optimal" subset filter at time instant t may become "suboptimal" at time instant t + i, i = 1, ..., n. If one simply assigns zero values to the coefficients of the missing lags and then applies the time update algorithms for the full-order case, this leads to a loss of efficiency in the filter performance, as the subset structure of the filter is not updated accordingly. Consequently, an efficient algorithm needs to be developed for time update SVDL filters, with full-order filters obtained as special cases.
In this paper a forward time update algorithm is developed for the exact windowed case, to recursively estimate SVDL filtering with a forgetting factor and an intercept variable. Compared to the residual-based algorithm proposed in Penm et al. [6], the current algorithm utilises only the available observations, without any assumption on unseen observations, to estimate filter coefficients. In addition, the current algorithm is a coefficient-based time update algorithm, which can detect evolutionary changes in filter structures. By contrast, the algorithm proposed in Penm et al. [6] is a residual-based order update algorithm, which undertakes recursions moving from low-order filters to high-order filters, so no evolutionary changes are captured through parameter updating. The focus of Penm et al. [6] is therefore not on the evolution over time of the parametric structure of the system. We propose forward time update algorithms because they allow users to update SVDL filters at consecutive time instants and can show evolutionary changes detected in filter structures.

Andrew H. Chen et al. 3
To demonstrate the effectiveness of the proposed time update algorithm, we apply the recursive extended two-layer neural network algorithm, equivalent to time update SVDL filtering, to examine the causal linkages between the government bond markets of three leading industrialised countries, namely the United States, the United Kingdom and Japan, during the period from August 1994 through December 2004. The findings are helpful in explaining linkages between the government bond markets involved.
The remainder of the paper is organised as follows. In Section 2 we present the algorithm for recursively estimating SVDL filtering with a forgetting factor and an intercept variable. An empirical example assessing whether there is any changing integration between the Australian and the Japanese bond markets, and the USA and the UK bond markets, is presented in Section 3. In Section 4 a summary is given.

New evolutionary recursive algorithms for SVDL filtering
In this section we introduce forward time-update recursions which recursively estimate an SVDL filter for the exact-windowed case.
In SVDL filtering, it is desirable to relate z(t) to present and past data for y(t). We consider an SVDL filter of the form (2.1), in which h_i, i = 1, 2, ..., p, are g × r parameter matrices, ρ is an intercept variable, and ε_h(t) is a g × 1 stationary process with E{ε_h(t)} = 0. Equation (2.1) and the properties associated with ε_h(t) together constitute a VDL filter, which involves a g-dimensional regressand vector z(t) and an r-dimensional regressor vector y(t). The order of the system is p. Given two finite data sample sets, {z(n), ..., z(T)} and {y(n), ..., y(T)}, it is necessary to sequentially estimate all possible SVDL filters from (2.1) using the exact-windowed case. Since the actual scheme of (2.1) may not be of order p, the resulting estimate of h_i is denoted by h_{p,n,T}(i), where T is the sample size under examination. The predictor of an SVDL system of (2.1) and the residual vector for observation i then follow directly.

In reality, many time-series systems present complex non-stationary features and cannot be filtered by assuming that y(t) and z(t) are stationary. Thus, for a VDL filter fitted to these two sample sets using the exact windowed method of forming the sample covariance matrix, the observations are weighted by λ, 0 < λ ≤ 1, the forgetting factor as described in Hannan and Deistler [4]. The forgetting factor is suitable for a filter in which the underlying relationships between the variables involved change smoothly and gradually. When λ = 1, the recursions are implemented in the stationary situation. Further, the corresponding relations at t = T + 1 have been established.

To develop time update recursions for SVDL filtering, we consider the forward VAR(p, I_s) filter with an intercept variable of the form (2.9), where ε(t) is an independent and identically distributed random process. We also consider a backward VAR(p) filter of the form (2.11), where E{ε̄(t)} = 0 and the disturbance variance is V̄(M_s); M_s represents an integer set with
elements m_1, m_2, ..., m_s, where m_j = p − i_j, j = 1, 2, ..., s. A reciprocal integer pair for a forward subset VAR filter and a backward subset VAR filter is a pair of (2.9) and (2.11). Figure 2.1 shows a lag tree diagram which illustrates the reciprocal integer pairs of all subset VAR processes up to and including lag length k = 4. Note that numerals represent particular lags in a forward VAR and underlined numerals represent the corresponding leads in a backward VAR. We need to sequentially estimate all possible subset VAR filters from (2.9) and (2.11) using the exact-windowed case. Then, for observation i, we define

Y_{p,i} = [I, y(i − 1), ..., y(i − p)]′.  (2.15)

In addition, for the corresponding backward VAR(p, M_s), we will have B_{p,n,T}(M_s) and Y_{p−1,i}(M_s), which are formed by removing the (p + 2 − m_1), ..., (p + 2 − m_s)th rows of B_{p,n,T} and Y_{p−1,i} respectively. We note that p + 2 − m_j = 2 + i_j = o_j.

Next, we consider a forward VAR(p + 1, I_s) filter with an intercept variable, in which we have shifted the intercept variable to the third term of the filter for ease of matrix algebra operations. Supposing the filter is based on the sample set {y(n), y(n + 1), ..., y(T + 1)}, we obtain (2.18). Again we consider a forward VAR(p, L_s) filter with intercept, where we keep the intercept variable in the third term to assist with our algebraic manipulations. Supposing this filter is fitted to the sample set {y(n + 1), y(n + 2), ..., y(T + 1)}, analogously we obtain (2.20). The matrix inversion of R_{p,n,T}(I_s) then provides the quantities required for the time updates.

From Penm et al. [7], if we permute the first row and the second row of K_{p+1,n,T+1}(I_s), the resulting vector is the gain associated with a forward VAR(p + 1, L_s) filter of the form (2.25), which means that K_{p+1,n,T+1}(L_s) = P_{p+1} K_{p+1,n,T+1}(I_s), where P_{p+1} is a permutation matrix. Note that if there is a consecutive set of k deleted lags beginning at lag 1 in the forward VAR(p, I_s) filter fitted using the sample {y(n), ..., y(T + 1)}, then I_s contains i_1, ..., i_k, ..., i_s with i_j = j, j = 1, 2, ..., k, and I_k contains i_{k+1}, ..., i_{s−k}.

We now turn to the SVDL filtering. Using (2.6), (2.8) and (2.15), the time update recursions (2.28) for SVDL filtering can then be developed. The forward-time update algorithm from T to T + 1 is summarised in Algorithm 2.1.
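The shape of one forward time-update step can be sketched in code. This is a schematic re-expression of the recursions summarised in Algorithm 2.1, under assumed conventions of ours: τ is treated as a scalar normaliser, K as a gain vector of regressor dimension, and the sign of the prediction error follows the text. It is not the authors' exact implementation.

```python
import numpy as np

def svdl_time_update(H, Omega, K, tau, Y, z, lam):
    """One sketched step from T to T+1: prediction error, normalised
    error, coefficient time update, and forgetting-factor update of
    the residual covariance."""
    e = z + H @ Y                               # a priori error e_h
    eps = e / tau                               # normalised error
    H_new = H - np.outer(eps, K)                # coefficient time update
    Omega_new = lam * Omega + np.outer(eps, e)  # discounted covariance update
    return H_new, Omega_new
```

Running this update at every new observation, and re-selecting the subset structure at each step, is what distinguishes the time-update recursions from a one-off order-update fit.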
To determine the optimal SVDL filter at each time instant, we utilise the filter selection criterion suggested by Hannan and Deistler [4]; from now on, MHQC is used as an abbreviation for this criterion. In its definition, f(T) = Σ_{t=p−1+n}^{T} λ^{T−t} is the effective sample size, and N is the number of functionally independent parameters. The optimal filter selected has the minimum value of MHQC.
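A sketch of the selection step follows. The effective sample size f(T) is computed exactly as defined above; the criterion itself is written as a Hannan-Quinn-type penalised log-determinant, which is our assumption, since the text does not reproduce the closed form of MHQC.

```python
import numpy as np

def effective_sample_size(lam, n, p, T):
    """f(T) = sum over t = p - 1 + n .. T of lam ** (T - t)."""
    ts = np.arange(p - 1 + n, T + 1)
    return float(np.sum(lam ** (T - ts)))

def mhqc_like(Omega, N, f):
    """Hannan-Quinn-style criterion: log-det of the residual covariance
    plus a 2 N log log f / f penalty (the exact MHQC form is assumed)."""
    _, logdet = np.linalg.slogdet(Omega)
    return logdet + 2.0 * N * np.log(np.log(f)) / f
```

Candidate subset structures are scored at each time instant, and the filter with the minimum criterion value is retained as optimal.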

Applications
To demonstrate the effectiveness of the proposed recursive algorithm, we investigate whether the causal relationships between the Australian and Japanese bond markets and the world's two major bond markets, those of the USA and the UK, changed over the period. The approach uses government bond indices from the Australian and the Japanese bond markets in a system that also includes indices of the USA and the UK markets.

Given that A_{p,n,T}(I_s), V_{p,n,T}(I_s), K_{p,n,T}(I_s), τ_{p,n,T}(I_s), B_{p,n,T}(M_s), λ, Y_{p,T+1}(O_s), Y_{p−1,T+1}(I_s), V̄_{p,n,T}(M_s), H_{p,n,T}(I_s), Ω_{p,T}(I_s), y(T + 1) and z(T + 1) are available, the recursions of Algorithm 2.1 also update the forward VAR quantities:

e_{p,n,T+1}(I_s) = [1 : A_{p,n,T}(I_s)] Y_{p,T+1}(O_s),
ε_{p,n,T+1}(I_s) = e_{p,n,T+1}(I_s) τ^{−1}_{p,n,T}(I_s),
A_{p,n,T+1}(I_s) = A_{p,n,T}(I_s) − K_{p,n,T}(I_s) ε_{p,n,T+1}(I_s),
V_{p,n,T+1}(I_s) = λ V_{p,n,T}(I_s) + ε_{p,n,T+1}(I_s) e′_{p,n,T+1}(I_s).
The former two markets are considered to be representative of markets in the Asia Pacific Basin region, while the latter two markets are treated as a proxy of the world bond market. We undertake this research within the framework of SVDL, which provides a new approach to examining such relationships. The more traditional framework for examining these questions is the VAR. However, VAR filtering does not explicitly detect evolutionary causal changes in the current and contemporary variable structures. It only provides the relationships detected from the lagged filter structure. The current and contemporary causal relations become increasingly important in the efficient and competitive financial markets of the developed economies. Therefore the filter development should focus on VDL filtering.

The J. P. Morgan monthly bond price indices are sourced from Datastream over the period August 1994 through December 2004. The data, in US dollar terms, are sampled from the government bond indices of Australia (AUS), Japan (JP), the UK (UK) and the USA (USA). The sample size of each series is 125. To examine stationarity of each series, Microfit 4.0 is used to carry out the augmented Dickey-Fuller (ADF) unit root test. The 95 per cent critical values for each test, computed using the response surface estimates, indicate that all series are I(1).
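The paper runs the ADF tests in Microfit 4.0. As a self-contained illustration of the screening step only, a minimal Dickey-Fuller regression (no augmentation lags, intercept only; the function name and details are ours) can be written as:

```python
import numpy as np

def dickey_fuller_stat(y):
    """t-statistic on rho in: diff(y)_t = c + rho * y_{t-1} + err.
    Strongly negative values reject a unit root; values near zero
    are consistent with an I(1) series."""
    dy = np.diff(y)
    ylag = y[:-1]
    X = np.column_stack([np.ones_like(ylag), ylag])  # intercept + lagged level
    beta = np.linalg.lstsq(X, dy, rcond=None)[0]
    resid = dy - X @ beta
    sigma2 = resid @ resid / (len(dy) - 2)           # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)            # OLS covariance
    return float(beta[1] / np.sqrt(cov[1, 1]))
```

The statistic is compared against Dickey-Fuller critical values (around −2.9 at the 95 per cent level for an intercept-only regression), not against standard t-tables.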
The algorithm developed in Section 2 is used to assess the relationships between the AUS-JP pair and the UK-USA pair. In detecting the causal relationship from UK-USA to AUS-JP, the variables used are z_1(t) = log AUS, z_2(t) = log JP, y_1(t) = log UK and y_2(t) = log USA. As discussed above, none of the logarithms of the four bond indices is stationary. Therefore, exponential forgetting is used with a forgetting factor of 0.99 to allow for the presence of non-stationarity. To begin, it is assumed that p = 16, which corresponds to a one and a quarter year period. The proposed evolutionary SVDL recursions described above are then applied to the logarithms of the data to select the "optimal" specification of the vector discrete lag filters. An SVDL filter with lags (0, 8) is selected by the MHQC at T = 120 and 121. At T = 122, the lag structure selected changes to (0). Detailed filter specifications with the zero constraints selected are reported in Table 3.1.
To check the adequacy of each optimal filter fit, the strategy suggested in Brailsford et al. [2] is used, with the Penm and Terrell [8] algorithm applied to test each estimated residual vector series. The results in Table 3.1 support the hypothesis that each residual vector series is a white noise process. These optimal filters are then used as the benchmark filters for analysing the causal relationships. The analysis indicates that from T = 120 to 121, AUS is caused by the current and lagged UK; from T = 122 to 125, AUS is caused by the current UK only; and AUS is caused only by the current USA from T = 120 to 125. JP is caused only by the current UK and the current USA from T = 120 to 125. All these results occur when emphasis is placed on recent data. The findings show that the efficient market hypothesis holds for the system under examination from T = 122. As time goes on, no arbitrage opportunity can be exploited in those developed economies.
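The Penm and Terrell [8] residual test itself is not reproduced in this paper. As a generic stand-in for checking that an estimated residual series is white noise, a Ljung-Box Q statistic (a standard whiteness diagnostic, our choice rather than the authors') can be computed as:

```python
import numpy as np

def ljung_box_q(x, max_lag=8):
    """Ljung-Box Q = n(n+2) * sum_k r_k^2 / (n - k), k = 1..max_lag,
    where r_k are sample autocorrelations. Compare Q with a chi-squared
    quantile on max_lag degrees of freedom: small Q supports whiteness."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    denom = x @ x
    ks = np.arange(1, max_lag + 1)
    r = np.array([x[k:] @ x[:-k] / denom for k in ks])  # autocorrelations
    return float(n * (n + 2) * np.sum(r ** 2 / (n - ks)))
```

Applied column by column to a residual vector series, a Q value below the chi-squared critical value at every lag horizon is consistent with the white-noise hypothesis reported in Table 3.1.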
For the causal relationship from the AUS-JP pair to the UK and the USA, Table 3.2 shows the optimal discrete lag filters with λ = 0.99. These results strongly support the existence of a current causal relationship between AUS and the UK. No causality from JP to the UK is detected, and no causality from either AUS or JP to the USA is identified. Also, the results in Table 3.2 support the hypothesis that each estimated vector residual series is a white noise process. The findings confirm that the US economy is the hub of the world economy. The UK bond market movements are not affected by the Japanese bond market movements. Since the Japanese government imposes a zero-interest-rate policy to avoid deflation in the Japanese economy, no evidence of any causality from the Japanese bond market movements to the UK and US bond markets can be identified.

Conclusion
In this paper an evolutionary recursive algorithm, using the exact windowed case, has been developed to sequentially select the best specification for a statistical subset neural network/VDL filter with a forgetting factor and an intercept variable.

A_{p,n,T} = [τ_{p,n,T}, a_{p,n,T}(1), ..., a_{p,n,T}(p)],   B_{p,n,T} = [b_{p,n,T}(p), ξ_{p,n,T}, ..., b_{p,n,T}(1)].  (2.12)

For a reciprocal integer pair of the forward VAR(p, I_s) and the backward VAR(p, M_s) filters fitted to this sample set, we have

R_{p,n,T}(I_s) = Σ_{i=p+n}^{T} λ^{T−i} Y_{p,i}(O_s) Y′_{p,i}(O_s).
K_{p,n,T+1}(L_s) = P_p K_{p,n,T+1}(I_s),
e_{h,p,n,T+1}(I_s) = z(T + 1) + H_{p,n,T}(I_s) Y_{p−1,T+1}(I_s),
ε_{h,p,n,T+1}(I_s) = e_{h,p,n,T+1}(I_s) τ^{−1}_{p,n,T+1}(I_s),
H_{p,n,T+1}(I_s) = H_{p,n,T}(I_s) − K_{p,n,T+1}(I_s) ε_{h,p,n,T+1}(I_s),
Ω_{p,T+1}(I_s) = λ Ω_{p,T}(I_s) + ε_{h,p,n,T+1}(I_s) e′_{h,p,n,T+1}(I_s).

Algorithm 2.1. The forward-time update recursions from T to T + 1 for SVDL forgetting-factor-inclusive filters with an intercept variable.
O_s represents an integer set with elements o_j, j = 1, ..., s, and o_j = i_j + 2. Y_{p,i}(O_s) and Y_{p−1,i}(O_s) are formed by removing the (o_1), ..., (o_s)th rows of Y_{p,i} and Y_{p−1,i} respectively. L_s represents an integer set with elements l_j, with l_j = i_j + 1, and Y_{p,i}(L_s) is formed by removing the (l_1), ..., (l_s)th rows of Y_{p,i}. Also, A_{p,n,T}(I_s) is formed by removing a_{p,n,T}(i_1), ..., a_{p,n,T}(i_s) from A_{p,n,T}.

Table 3.1. The SVDL filter for the relationship linking government bond indices from the UK-USA pair to Australia and Japan, selected by MHQC using the GLS procedure. The filter is of the form z(t) − ρ − Σ_τ h_τ y(t − τ) = ε_h(t); non-zero (i, j)th entries in the estimated coefficient matrices h_τ and ρ are reported.