Extremal Inverse Eigenvalue Problem for a Special Kind of Matrices

Peng et al. [1] solved two inverse eigenvalue problems for symmetric arrow matrices, and a correction to one of the problems stated in [1] was later presented in [2]. In a recent paper [3], Nazari and Beiranvand introduced an algorithm to construct symmetric quasi anti-bidiagonal matrices with prescribed eigenvalues. Pickmann et al. [4] introduced an algorithm for an inverse eigenvalue problem on symmetric tridiagonal matrices. In this paper we introduce the symmetric doubly arrow matrix, a structure that generalizes the symmetric arrow matrix, and construct it from the minimal and maximal eigenvalues of its leading principal submatrices.


Introduction
Peng et al. [1] solved two inverse eigenvalue problems for symmetric arrow matrices, and a correction to one of the problems stated in [1] was later presented in [2]. In a recent paper [3], Nazari and Beiranvand introduced an algorithm to construct symmetric quasi anti-bidiagonal matrices with prescribed eigenvalues. Pickmann et al. [4] introduced an algorithm for an inverse eigenvalue problem on symmetric tridiagonal matrices. In this paper we introduce the symmetric doubly arrow matrix

$$
A = \begin{pmatrix}
a_1    &        &         & b_1     &         &        &         \\
       & \ddots &         & \vdots  &         &        &         \\
       &        & a_{p-1} & b_{p-1} &         &        &         \\
b_1    & \cdots & b_{p-1} & a_p     & b_p     & \cdots & b_{n-1} \\
       &        &         & b_p     & a_{p+1} &        &         \\
       &        &         & \vdots  &         & \ddots &         \\
       &        &         & b_{n-1} &         &        & a_n
\end{pmatrix}, \tag{1}
$$

where $b_j \geq 0$, $1 \leq j \leq n-1$. If $p = 1$ or $p = n$, then the matrix $A$ of the form (1) is a symmetric arrow matrix:

$$
A = \begin{pmatrix}
a_1     & b_1 & b_2 & \cdots & b_{n-1} \\
b_1     & a_2 &     &        &         \\
b_2     &     & a_3 &        &         \\
\vdots  &     &     & \ddots &         \\
b_{n-1} &     &     &        & a_n
\end{pmatrix}. \tag{2}
$$

This family of matrices appears in certain symmetric inverse eigenvalue and inverse Sturm-Liouville problems [5, 6], which arise in many applications [7-12], including modern control theory and vibration analysis [7, 8]. In this paper we construct a matrix $A$ of the form (1) from a special kind of spectral information which has only recently been considered. Since this type of matrix structure generalizes the well-known arrow matrices, we think that it will also become of interest in applications.
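The sparsity pattern of form (1) can be sketched numerically. The snippet below is an illustrative sketch, not code from the paper: it assumes the doubly arrow layout described above (diagonal entries $a_i$, entries $b_1, \ldots, b_{p-1}$ in column $p$, and $b_p, \ldots, b_{n-1}$ in row $p$), and the helper name `doubly_arrow` is ours.

```python
import numpy as np

def doubly_arrow(a, b, p):
    """Assemble a symmetric doubly arrow matrix of form (1).

    a : diagonal entries a_1, ..., a_n
    b : off-diagonal entries b_1, ..., b_{n-1} (b_j >= 0)
    p : position of the arrow head, 1 <= p <= n (1-indexed)
    """
    n = len(a)
    A = np.diag(np.asarray(a, dtype=float))
    for j in range(1, n):          # j plays the role of the subscript of b_j
        i = j if j < p else j + 1  # rows 1..p-1 and p+1..n couple to row/column p
        A[i - 1, p - 1] = A[p - 1, i - 1] = b[j - 1]
    return A

A = doubly_arrow([1.0, 2.0, 3.0, 4.0, 5.0], [1.0, 1.0, 1.0, 1.0], p=3)
# For p = 1 (or p = n) the pattern collapses to an ordinary arrow matrix:
arrow = doubly_arrow([1.0, 2.0, 3.0], [1.0, 1.0], p=1)
```

Note that for interior $p$ the nonzero off-diagonal entries all sit in row and column $p$, which is what makes the leading principal submatrices $A_j$ diagonal for $j < p$.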
We will denote by $I_j$ the identity matrix of order $j$; by $A_j$ the $j \times j$ leading principal submatrix of $A$; by $P_j(\lambda)$ the characteristic polynomial of $A_j$; and by $\lambda_1^{(j)} \leq \lambda_2^{(j)} \leq \cdots \leq \lambda_j^{(j)}$ the eigenvalues of $A_j$.
We want to solve the following problem.

Problem 1. Given the real numbers $\lambda_1^{(j)}$ and $\lambda_j^{(j)}$, $j = 1, 2, \ldots, n$, construct an $n \times n$ matrix $A$ of the form (1) such that $\lambda_1^{(j)}$ and $\lambda_j^{(j)}$ are, respectively, the minimal and maximal eigenvalues of $A_j$, $j = 1, 2, \ldots, n$.
Our work is motivated by the results in [2], where the authors solved this kind of inverse eigenvalue problem for the symmetric arrow matrix $A$ of the form (2).
The paper is organized as follows. In Section 2 we discuss some properties of $A$. In Section 3 we solve Problem 1 by giving a necessary and sufficient condition for the existence of the matrix $A$ in (1), and we also solve the case in which the matrix $A$ of Problem 1 is required to have all its entries $b_j$ positive. Finally, in Section 4 we give some examples to illustrate the results.

Properties of the Matrix 𝐴
Lemma 4. Let $A$ be a matrix of the form (1). Then the sequence of characteristic polynomials $\{P_j(\lambda)\}_{j=1}^{n}$ satisfies the recurrence relation

$$
P_1(\lambda) = \lambda - a_1,
$$
$$
P_j(\lambda) = (\lambda - a_j)\,P_{j-1}(\lambda), \quad 2 \leq j \leq p-1,
$$
$$
P_p(\lambda) = (\lambda - a_p)\,P_{p-1}(\lambda) - \sum_{i=1}^{p-1} b_i^2 \prod_{\substack{k=1 \\ k \neq i}}^{p-1} (\lambda - a_k),
$$
$$
P_j(\lambda) = (\lambda - a_j)\,P_{j-1}(\lambda) - b_{j-1}^2 \prod_{\substack{i=1 \\ i \neq p}}^{j-1} (\lambda - a_i), \quad p+1 \leq j \leq n.
$$

Proof. It is easy to verify by expanding the determinant of $\lambda I_j - A_j$ along its last row.
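The recurrence can be spot-checked numerically against determinants computed directly. The sketch below assumes our reading of the doubly arrow layout of form (1) and of the Lemma 4 recurrence (diagonal blocks for $j < p$, a sum of correction terms at $j = p$, and a single $b_{j-1}^2$ correction for $j > p$); the helper names are illustrative.

```python
import numpy as np

def doubly_arrow(a, b, p):
    """Symmetric doubly arrow matrix of form (1) (our reading of the layout)."""
    n = len(a)
    A = np.diag(np.asarray(a, dtype=float))
    for j in range(1, n):
        i = j if j < p else j + 1
        A[i - 1, p - 1] = A[p - 1, i - 1] = b[j - 1]
    return A

def char_poly_recurrence(a, b, p, lam):
    """Evaluate P_1(lam), ..., P_n(lam) via the recurrence of Lemma 4."""
    n = len(a)
    P = [lam - a[0]]
    for j in range(2, n + 1):
        if j < p:                       # A_j is diagonal: P_j = (lam - a_j) P_{j-1}
            P.append((lam - a[j - 1]) * P[-1])
        elif j == p:                    # arrow head: one correction term per b_i, i < p
            s = sum(b[i - 1] ** 2 * np.prod([lam - a[k - 1]
                    for k in range(1, p) if k != i]) for i in range(1, p))
            P.append((lam - a[j - 1]) * P[-1] - s)
        else:                           # j > p: single correction term with b_{j-1}
            q = np.prod([lam - a[i - 1] for i in range(1, j) if i != p])
            P.append((lam - a[j - 1]) * P[-1] - b[j - 2] ** 2 * q)
    return P

a, b, p = [1.0, 2.0, 3.0, 4.0, 5.0], [1.0, 0.5, 2.0, 1.5], 3
A = doubly_arrow(a, b, p)
lam = 0.7
direct = [np.linalg.det(lam * np.eye(j) - A[:j, :j]) for j in range(1, 6)]
via_recurrence = char_poly_recurrence(a, b, p, lam)
assert np.allclose(direct, via_recurrence)
```

Evaluating at a single sample point $\lambda$ suffices for a spot check because both sides are polynomials of the same degree built from the same data.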

Solution of Problem 1
The following theorem solves Problem 1. In particular, the theorem shows that condition (6) is necessary and sufficient for the existence of the matrix $A$ in (1).

Theorem 8. Let the real numbers $\lambda_1^{(j)}$ and $\lambda_j^{(j)}$, $j = 1, 2, \ldots, n$, be given. Then there exists an $n \times n$ matrix $A$ of the form (1) such that $\lambda_1^{(j)}$ and $\lambda_j^{(j)}$ are, respectively, the minimal and maximal eigenvalues of its leading principal submatrix $A_j$, $j = 1, 2, \ldots, n$, if and only if

$$
\lambda_1^{(n)} \leq \cdots \leq \lambda_1^{(2)} \leq \lambda_1^{(1)} \leq \lambda_2^{(2)} \leq \cdots \leq \lambda_n^{(n)}. \tag{13}
$$

Proof. Let $\lambda_1^{(j)}$ and $\lambda_j^{(j)}$, $j = 1, 2, \ldots, n$, satisfy (13). Observe that $a_1 = \lambda_1^{(1)}$ and $P_1(\lambda) = \lambda - a_1$. From Theorem 2, there exist $a_j$, $j = 2, \ldots, p$, and $b_j$, $j = 1, \ldots, p-1$, realizing the prescribed extremal eigenvalues of $A_2, \ldots, A_p$. For $j = p+1, \ldots, n$, if the determinant $h_j$ of the coefficient matrix of the system (15) is nonzero, then the system has unique solutions $a_j$ and $b_{j-1}^2$; in this case, from Lemma 6, we have $h_j > 0$. By solving the system (15) we obtain $b_{j-1}^2 \geq 0$; then $b_{j-1}$ is a real number and, therefore, there exists a matrix $A$ with the required spectral properties. Now we show that, if $h_j = 0$, the system (15) still has a solution. We do this by induction, showing that the rank of the coefficient matrix is equal to the rank of the augmented matrix.
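The inductive step for $j > p$ can be sketched numerically as a round trip: start from a known matrix, extract the extremal eigenvalues of its leading principal submatrices, and recover the entries $a_j$ and $b_{j-1}$ by solving a $2 \times 2$ linear system playing the role of system (15). The layout of form (1), the choice $p = 2$, and all names below are our illustrative assumptions, not the paper's code.

```python
import numpy as np

# Known doubly arrow matrix with p = 2, so A_2 is a full 2x2 block and the
# steps j = 3, ..., n are exactly the "system (15)" steps of the proof.
p = 2
a_true = np.array([1.0, 2.0, 3.0, 4.0])
b_true = np.array([1.0, 1.0, 1.0])
n = len(a_true)
A = np.diag(a_true)
for j in range(1, n):
    i = j if j < p else j + 1
    A[i - 1, p - 1] = A[p - 1, i - 1] = b_true[j - 1]

# Spectral data: minimal and maximal eigenvalues of each leading submatrix.
lam_min = [np.linalg.eigvalsh(A[:j, :j])[0] for j in range(1, n + 1)]
lam_max = [np.linalg.eigvalsh(A[:j, :j])[-1] for j in range(1, n + 1)]

# Reconstruction. Step j = 1: a_1 = lam_min[0]. Step j = 2 = p: recover
# a_2 and b_1 from the trace and determinant of the 2x2 block.
a = np.zeros(n); b = np.zeros(n - 1)
a[0] = lam_min[0]
a[1] = lam_min[1] + lam_max[1] - a[0]
b[0] = np.sqrt(a[0] * a[1] - lam_min[1] * lam_max[1])

def P(j, lam):
    """P_j(lam) via the Lemma 4 recurrence (here p = 2, j >= 2)."""
    val = (lam - a[0]) * (lam - a[1]) - b[0] ** 2
    for k in range(3, j + 1):
        q = np.prod([lam - a[i - 1] for i in range(1, k) if i != p])
        val = (lam - a[k - 1]) * val - b[k - 2] ** 2 * q
    return val

for j in range(p + 1, n + 1):
    # System (15): P_j vanishes at both extremal eigenvalues, i.e.
    #   a_j * P_{j-1}(lam) + b_{j-1}^2 * Q_{j-1}(lam) = lam * P_{j-1}(lam).
    rows, rhs = [], []
    for lam in (lam_min[j - 1], lam_max[j - 1]):
        Pv = P(j - 1, lam)
        Qv = np.prod([lam - a[i - 1] for i in range(1, j) if i != p])
        rows.append([Pv, Qv]); rhs.append(lam * Pv)
    aj, bj2 = np.linalg.solve(rows, rhs)   # assumes determinant h_j != 0
    a[j - 1], b[j - 2] = aj, np.sqrt(bj2)

assert np.allclose(a, a_true) and np.allclose(b, b_true)
```

Since the spectral data came from an actual matrix, each $2 \times 2$ system here is consistent, and when its determinant is nonzero the solve recovers the original entries; the rank argument at the end of the proof handles the singular case.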