OBSTACLES TO BOUNDED RECOVERY

In this article, we are concerned with the following problem: let X be a Banach space over the field F (F = C or R), let V ⊂ X be an n-dimensional subspace of X, and let u1, ..., um be m linearly independent functionals on X. Given x ∈ X we want to recover x on the basis of the values u1(x), ..., um(x) ∈ F. Hence we are looking for a map F : X → V such that uj(Fx) = uj(x) for all j = 1, ..., m. Since we do not know x a priori, we choose to look for a map F such that the norm of F is as small as possible.


Introduction
In this article, we are concerned with the following problem: let X be a Banach space over the field F (F = C or R), let V ⊂ X be an n-dimensional subspace of X, and let u1, ..., um be m linearly independent functionals on X. Given x ∈ X we want to recover x on the basis of the values u1(x), ..., um(x) ∈ F.
Hence we are looking for a map F : X → V such that uj(Fx) = uj(x) for all j = 1, ..., m. Since we do not know x a priori, we choose to look for a map F such that the norm of F is as small as possible. We may also require additional properties of F, such as linearity and idempotency.
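For concreteness, the three classes of recovery maps and the associated recovery constants can be written out as follows. This is a standard formulation consistent with the inclusions in (1.3) and the inequalities in (1.4), with the norm of a (possibly nonlinear) map F understood as its supremum over the unit ball:

```latex
\begin{aligned}
\mathcal{F}(X,U,V) &= \{\,F\colon X\to V \;:\; u_j(Fx)=u_j(x),\ j=1,\dots,m,\ \text{for all } x\in X\,\},\\
\mathcal{L}(X,U,V) &= \{\,F\in\mathcal{F}(X,U,V) \;:\; F \text{ is linear}\,\},\\
\mathcal{P}(X,U,V) &= \{\,P\in\mathcal{L}(X,U,V) \;:\; P^2=P\,\},\\
r(X,U,V)  &= \inf\{\|F\| : F\in\mathcal{F}(X,U,V)\},\\
lr(X,U,V) &= \inf\{\|F\| : F\in\mathcal{L}(X,U,V)\},\\
pr(X,U,V) &= \inf\{\|P\| : P\in\mathcal{P}(X,U,V)\},
\qquad \|F\| := \sup_{\|x\|\le 1}\|Fx\|.
\end{aligned}
```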
Clearly
Ᏺ(X, U, V) ⊃ ᏸ(X, U, V) ⊃ ᏼ(X, U, V), (1.3)
and hence
1 ≤ r(X, U, V) ≤ lr(X, U, V) ≤ pr(X, U, V). (1.4)
The class ᏼ(X, U, V), and hence the rest of the classes, is nonempty if and only if
dim U|V = m, (1.5)
where U|V denotes the space of restrictions of the functionals from U to V.
In particular, we will always assume that m ≤ n. If m = n and (1.5) holds, then all three classes coincide and consist of a single, uniquely defined linear projection. Hence the problem of estimating the recovery constants is reduced to estimating the norm of one projection. The problem of estimating r(X, U, V) can also be considered as a local version of the "SIN property" described in [1].
In this paper, we will characterize the recovery constants in terms of geometric relationships between Banach spaces X, U , V , and their duals.
In our setting U is an m-dimensional subspace of functionals on X. If we restrict the functionals in U to V, we obtain a new Banach space Ũ (1.6). Of course, algebraically it is the same space, but the norm on Ũ is computed over the unit ball of V as opposed to the unit ball of X, and hence topologically these are two different spaces. In fact Ũ ⊂ V* and may not even be isometric to any subspace of X* (and in particular to U). It turns out that the recovery constants depend on how well U can be embedded in V* and X*, as well as how well U* can be embedded into V. These results will be presented in Section 2.
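Concretely, for u ∈ U with restriction ũ ∈ Ũ, the two norms in question are the dual norms computed over the unit balls of V and of X, respectively (this spells out the definition (1.6)):

```latex
\|\tilde u\|_{\tilde U} \;=\; \sup\{\,|u(v)| : v\in V,\ \|v\|\le 1\,\},
\qquad
\|u\|_{X^*} \;=\; \sup\{\,|u(x)| : x\in X,\ \|x\|\le 1\,\},
```

so that ‖ũ‖ ≤ ‖u‖ always.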
In Section 3, we will construct examples of triples (X, U, V) for which the different recovery constants coincide, and also examples for which all three of them differ from each other. Here we will use Banach space theory to determine whether a given Banach space can or cannot be embedded into another Banach space. In particular, we will prove that r(X, U, V) = lr(X, U, V) if X = L1, and thus generalize some results of [8].
In the last section we will give some applications of these results when the space V consists of polynomials. We will reprove some known results and prove some new results on interpolation by polynomials, by interpreting the norms of the interpolation operators as recovery constants.
We will use the rest of this section to introduce some useful concepts from the local theory of Banach spaces. All of them can be found in the book [2].
Let E and V be two k-dimensional Banach spaces. The Banach–Mazur distance is defined to be
d(E, V) = inf{ ‖T‖ ‖T⁻¹‖ : T is an isomorphism from E onto V }.
Analytically, d(E, V) ≤ d0 for some d0 ≥ 1 if and only if there exist bases e1, ..., ek in E and v1, ..., vk in V and constants c, C with C/c ≤ d0 such that
c ‖Σ αj vj‖ ≤ ‖Σ αj ej‖ ≤ C ‖Σ αj vj‖ (1.10)
holds for all α1, ..., αk ∈ F. By homogeneity, this is equivalent to finding bases e1, ..., ek ∈ E and v1, ..., vk ∈ V with
‖Σ αj vj‖ ≤ ‖Σ αj ej‖ ≤ d0 ‖Σ αj vj‖. (1.11)
The following properties are obvious: d(E, V) ≥ 1; d(E, V) = d(V, E); d(E, W) ≤ d(E, V) d(V, W).
Next we will need the notion of a projection constant. Let V be a subspace of X. Define the relative projectional constant λ(V, X) to be
λ(V, X) = inf{ ‖P‖ : P is a projection from X onto V }. (1.13)
Now the absolute projectional constant λ(V) of an arbitrary space V is defined to be
λ(V) = sup{ λ(Ṽ, X) : X is a Banach space and Ṽ ⊂ X is isometric to V }. (1.14)
Here are a few of its properties:
λ(V, X) ≤ λ(V), λ(E) ≤ d(E, V) λ(V); (1.16)
the last property shows that the absolute projectional constant is an isomorphic invariant.
Let E and X be Banach spaces and let a ≥ 1 be fixed. We say that E a-embeds into X, and write E →a X, if there exists a subspace Ẽ ⊂ X with d(E, Ẽ) ≤ a. The rest of the notions and results from the theory of Banach spaces will be introduced as needed.

General theorems
The following two theorems of Helly will play a fundamental role in this section (cf. [3]).

Boris Shekhtman 385
We now turn our attention to the recovery constants. Let (X, U, V) be a recovery triple.
Theorem 2.3. Let r0 ≥ 1. Then r(X, U, V) ≤ r0 if and only if the operator J : Ũ → U defined by J⁻¹u = ũ has norm ‖J‖ ≤ r0. In other words,
‖Σ aj uj‖ ≤ r0 ‖Σ aj ũj‖ for all a1, ..., am ∈ F. (2.8)
Proof. Let u1, ..., um be a basis in U. Then ũ1, ..., ũm is a basis in Ũ. Let x ∈ X, ‖x‖ = 1, and uj(x) = αj. Suppose r(X, U, V) ≤ r0. Then for every ε > 0 there exists v ∈ V with ‖v‖ ≤ r0 + ε and ũj(v) = αj for all j = 1, ..., m. Hence for every x ∈ X with ‖x‖ ≤ 1 and every a1, ..., am ∈ F,
|Σ aj uj(x)| = |Σ aj ũj(v)| ≤ (r0 + ε) ‖Σ aj ũj‖. (2.7)
Passing to the supremum over all x with ‖x‖ ≤ 1 (and letting ε → 0) we obtain (2.8), or equivalently ‖J‖ ≤ r0.
For the proof of the converse, assume that r0 is such that (2.8) holds. Then for every fixed x ∈ X with ‖x‖ ≤ 1 and every a1, ..., am ∈ F,
‖Σ aj ũj‖ ≥ (1/r0) ‖Σ aj uj‖ ≥ (1/r0) |Σ aj uj(x)|. (2.10)
Now by Theorem 2.2, for every ε > 0 there exists v ∈ V such that ‖v‖ ≤ r0 + ε and uj(x) = ũj(v).
Corollary 2.4. The quantity r(X, U, V) = r0 if and only if the operator J : Ũ → U defined by ũ = J⁻¹u realizes an r0-embedding of U into V*.
Proof. J is an isomorphism from U onto Ũ ⊂ V*. Since ũ is a restriction of u, we have ‖ũ‖ ≤ ‖u‖. Hence ‖J⁻¹u‖ ≤ ‖u‖ and d(U, Ũ) ≤ ‖J‖ ‖J⁻¹‖ ≤ r0.
Corollary 2.5. If r(X, U, V) ≤ r0 then there exists an embedding U →r0 V*.
This corollary is completely obvious and we state it solely for future use.
At the end of this section, we will give an example showing that the converse to Corollary 2.5 does not hold. It does not suffice to have some embedding U →r0 V* to obtain r(X, U, V) ≤ r0; it has to be the very specific embedding J : ũ → u.
We will now deal with pr(X, U, V) = inf{ ‖P‖ : P ∈ ᏼ(X, U, V) }. For the next theorem we fix a basis u1, ..., um ∈ U and for any sequence α1, ..., αm ∈ F define
M(α1, ..., αm) := inf{ ‖x‖ : x ∈ X, uj(x) = αj, j = 1, ..., m }. (2.12)
Theorem 2.6. Let r1 ≥ 1. Then pr(X, U, V) ≤ r1 if and only if for every ε > 0,
‖Σ αj vj‖ ≤ (r1 + ε) M(α1, ..., αm) for all α1, ..., αm ∈ F (2.13)
for some vj ∈ V with uj(vk) = δjk.
Proof. We want to show that (2.16) holds. For every ε > 0 let xε ∈ X be such that ‖xε‖ ≤ M + ε and uj(xε) = αj. We have (2.17). Since this is true for all ε, and in view of (2.16), we obtain the right-hand side inequality in (2.15). For the left-hand side we have (2.18). To prove the converse, let v1, ..., vm ∈ V with uk(vj) = δkj be such that (2.13) holds for some arbitrary ε. Define P ∈ ᏼ(X, U, V) by Px = Σ uj(x)vj. Then
‖Px‖ = ‖Σ uj(x)vj‖ ≤ (r1 + ε) M(u1(x), ..., um(x)) ≤ (r1 + ε)‖x‖, (2.19)
so ‖P‖ ≤ r1 + ε.
Corollary 2.7. If pr(X, U, V) ≤ r1, then for every ε > 0 there exists a subspace of V that is (r1 + ε)-isomorphic to U*.
Comparing Corollaries 2.5 and 2.7, we see that an operator P ∈ ᏼ(X, U, V) with a small norm forces a good embedding (2.22), while having an operator F ∈ Ᏺ(X, U, V) with a small norm implies only a sort of "dual embedding" (2.23). In general, (2.22) does not imply (2.23), and that is why (as we will see in the next section) pr(X, U, V) may be much larger than r(X, U, V). However, there are cases when (2.22) and (2.23) are equivalent. This happens if there exists a projection from V onto TU* or from V* onto JU of small norm, that is, if λ(TU*, V) or λ(JU, V*) is small. To rephrase: (2.22) and (2.23) are equivalent if one of the two embeddings is well complemented.
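In the proof above, M plays the role of the minimal norm of an element of X consistent with the data, that is, M(α1, ..., αm) = inf{‖x‖ : uj(x) = αj}. In a Hilbert-space toy case this infimum and a dual system vj with uj(vk) = δjk can be computed explicitly via the pseudoinverse. The following sketch (X = l_2^5 with m = 2, an illustration only; the paper works with general Banach norms, where these explicit formulas do not apply) checks the basic facts numerically:

```python
import numpy as np

rng = np.random.default_rng(0)

# X = R^5 with the Euclidean norm; u_j(x) = <U[j], x> are m = 2 functionals.
U = rng.standard_normal((2, 5))
alpha = np.array([1.0, -2.0])

# M(alpha) = inf{ ||x|| : u_j(x) = alpha_j }.  In l_2 the infimum is
# attained at the pseudoinverse solution.
x_min = np.linalg.pinv(U) @ alpha
M = np.linalg.norm(x_min)
assert np.allclose(U @ x_min, alpha)

# Any other solution of U x = alpha is x_min plus a null-space vector
# and has strictly larger norm.
null_vec = np.linalg.svd(U)[2][-1]        # orthogonal to the rows of U
x_other = x_min + 3.0 * null_vec
assert np.allclose(U @ x_other, alpha)
assert np.linalg.norm(x_other) > M

# A dual system v_1, v_2: the columns of pinv(U) satisfy u_j(v_k) = delta_jk,
# since U @ pinv(U) = I for a full-row-rank U.
Vmat = np.linalg.pinv(U)                  # 5 x 2, columns are v_1, v_2
assert np.allclose(U @ Vmat, np.eye(2))

# P x = sum_j u_j(x) v_j is then an orthogonal projection of norm 1.
P = Vmat @ U
assert np.allclose(P @ P, P)
assert np.isclose(np.linalg.svd(P)[1][0], 1.0)
```

In this Euclidean setting the projection built from the dual system automatically has norm one; the content of Theorem 2.6 is precisely that in a general Banach space the ratio ‖Σ αj vj‖ / M(α1, ..., αm) controls the best achievable ‖P‖.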
Proof. For the proof it is convenient to consider the diagram (2.25).
The converse of Proposition 2.8 may not be true. A small change in wording, however, makes it true.
Corollary 2.9. Let r0 = r(X, U, V) and let a ≥ 1. Then pr(X, U, V) ≤ ar0 if and only if for every ε > 0 there exists a projection Q from V* onto Ũ such that ‖JQ‖ ≤ ar0 + ε.
Proof. The sufficiency follows from Proposition 2.8. Suppose that pr(X, U, V) ≤ ar0. Then there exists a projection P ∈ ᏼ(X, U, V) such that ‖P‖ ≤ ar0 + ε. Since P maps X into V, we have P* : V* → X* and
Im P* = U. (2.26)
Thus Q := J⁻¹P* projects V* onto Ũ and
‖JQ‖ = ‖J J⁻¹ P*‖ = ‖P*‖ = ‖P‖ ≤ ar0 + ε. (2.27)
We will now rephrase Corollary 2.9 in terms of the diagram (2.28).
Corollary 2.10. Let r1 ≥ 1. Then pr(X, U, V) ≤ r1 if and only if for every ε > 0 the operator J in (2.28) can be extended to an operator Ĵ from V* onto U with ‖Ĵ‖ ≤ r1 + ε.
Proof. If pr(X, U, V) ≤ r1, then we conclude from Corollary 2.9 (cf. diagram (2.25)) that Ĵ := JQ is the desired extension of J. Conversely, let Ĵ be an extension with ‖Ĵ‖ ≤ r1 + ε; then the conclusion follows.
Since U ⊂ X*, we can view J as an embedding of Ũ into X* and Ĵ as an extension of J from V* into all of X*. However, there are other extensions of J to an operator from V* into X* with range not limited to U. This subtle difference turns out to be the key to linear recovery.
Theorem 2.11. Let (X, U, V) be a recovery triple, let r2 ≥ 1, and let J : Ũ → U ⊂ X*. Then lr(X, U, V) ≤ r2 if and only if for every ε > 0 there exists a linear extension S : V* → X* of the operator J : Ũ → X* such that
‖S‖ ≤ r2 + ε. (2.29)
Proof. We again illustrate it on the diagram (2.30). Let S be such an extension with ‖S‖ ≤ r2 + ε. Since S is an extension of J, we have Sũ = u for every ũ ∈ Ũ ⊂ V*. Therefore, for every x ∈ X** and every u ∈ U,
x(u) = x(Sũ) = (S*x)(ũ). (2.31)
In particular, if x ∈ X ⊂ X**, we have S*x ∈ V and (2.32) holds. Conversely, let L be the operator defined in (2.33). Then L*ũ = u for every ũ ∈ Ũ and ‖L*‖ ≤ r2 + ε. Hence L* is the desired extension of J.
It is a little surprising that r(X, U, V) and pr(X, U, V) depend (at least explicitly) only on the relationship between U and V, yet lr(X, U, V), which is squeezed in between those two constants, depends explicitly on the space X as well as on U and V.
We finish this discussion by demonstrating that the converses of Corollaries 2.5 and 2.7 are false. Thus only the existence of the specific embeddings U → V* and U* → V gives the estimates for the recovery constants.

Hence l_1^2 is isometric to l_∞^2 = (l_1^2)* and all the spaces U, V, U*, V* are isometric. Therefore U* →1 V and U →1 V*, and since all the spaces are of the same dimension, the embeddings are 1-complemented. Thus all the conditions of Corollaries 2.5 and 2.7 are satisfied with r0 = r2 = 1. Yet we will show that r(X, U, V) ≥ 2. Indeed, let r̃1, r̃2 be the restrictions of r1 and r2 onto V. Then (2.37) holds. Choosing α = 1, β = 1, we conclude that ‖J‖ ≥ 2 and, by Theorem 2.3, r(X, U, V) ≥ 2.
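The isometry between l_1^2 and l_∞^2 invoked here can be realized by the map T(x, y) = (x + y, x − y), since max(|x + y|, |x − y|) = |x| + |y|. A quick numerical sanity check (an illustration, not part of the argument):

```python
import numpy as np

# T(x, y) = (x + y, x - y) maps l_1^2 onto l_infty^2 isometrically:
# max(|x + y|, |x - y|) = |x| + |y| for all real x, y.
T = np.array([[1.0, 1.0],
              [1.0, -1.0]])

rng = np.random.default_rng(1)
for _ in range(1000):
    z = rng.standard_normal(2)
    assert np.isclose(np.abs(z).sum(), np.abs(T @ z).max())
```

This two-dimensional identity is what makes all four spaces U, V, U*, V* isometric in the example, while the recovery constant still fails to be small.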

Comparison of the recovery constants
In this section, we will establish some relationships between the various recovery constants. Recall that for E ⊂ X the notation λ(E, X) stands for the relative projectional constant
λ(E, X) = inf{ ‖P‖ : P is a projection from X onto E }. (3.1)
Proof. Let Q be a projection from X* onto U and let S be an extension of J (cf. diagram (2.30)) to an operator from V* into X* with ‖S‖ ≤ lr(X, U, V) + ε. Then Ĵ := QS maps V* onto U, is an extension of J to an operator from V* onto U, and satisfies ‖Ĵ‖ ≤ ‖Q‖ ‖S‖. By Corollary 2.10, we obtain the required estimate. Hence we have proved the left-hand side of (3.3). The right-hand side follows from the standard estimate (cf. [4]). The left-hand side of (3.4) is a reformulation of Proposition 2.8, and the right-hand side of (3.4) follows from another standard estimate (cf. [4]).
Remark 3.2. Using the estimate for the relative projectional constant in [4], the right-hand side of (3.4) can be improved.
It was observed in [8] that r(X, U, V) = pr(X, U, V) if X = L1(µ) and U = span[u1, ..., um] ⊂ L∞, where u1, ..., um are functions with disjoint supports. In this case U is isometric to l_∞^m. We are now in a position to extend this observation in two different directions.
It is well known (cf. [10]) that every operator with range in l_∞^m can be extended to an operator from a bigger space (in this case V*) with the same norm. Let A be such an extension of the operator TJ. Then Ĵ := T⁻¹A is an extension of J to an operator from V* to U with ‖Ĵ‖ ≤ ‖T⁻¹‖ ‖A‖ = ‖T⁻¹‖ ‖TJ‖ ≤ ‖T‖ ‖T⁻¹‖ ‖J‖.
Proof. In this case X* = L∞(µ), and hence the operator J : Ũ → U can be considered as an operator from Ũ into L∞(µ). Using again the "projective property" of L∞(µ) (cf. [10]), we can extend J to an operator S from V* to L∞(µ) so that ‖J‖ = ‖S‖. By Theorem 2.11, we obtain the conclusion of the proposition.
Example 3.7 will demonstrate that "lr" in this proposition cannot be replaced by "pr".
We now wish to demonstrate (by means of examples) that r(X, U, V) can be arbitrarily large; that one can find a sequence (X, Um, Vn) such that r(X, Um, Vn) is bounded, yet lr(X, Um, Vn) tends to infinity as √m; and that there exists a sequence (X, Um, Vn) such that lr(X, Um, Vn) is bounded, yet pr(X, Um, Vn) tends to infinity as √m. Also, the estimates (3.3) and (3.4) are asymptotically best possible. These examples serve to demonstrate the usefulness of the results in Section 2 for estimating the recovery constants.
Example 3.5. For arbitrary X, V, and M > 0 there exists U ⊂ X* such that r(X, U, V) ≥ M.
Construction 3.6. Fixing X, V, and M > 0, it is a matter of triviality to show that there exists a projection P from X onto V,
Px = Σ uj(x) vj. (3.12)
For the next two examples we will need the Rademacher functions rj(t) := sign sin(2^(j−1) πt), 0 ≤ t ≤ 1. It is well known (cf. [2]) that
C (Σ |aj|²)^(1/2) ≤ ‖Σ aj rj‖_{L1} ≤ (Σ |aj|²)^(1/2)
for some absolute constant C > 0.
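Since the Rademacher functions are step functions on dyadic intervals, both their orthonormality in L2[0, 1] and the Khintchine-type equivalence between ‖Σ aj rj‖_{L1} and (Σ |aj|²)^(1/2) can be checked exactly on a midpoint grid. In the sketch below, the lower constant 0.5 is a deliberately weak stand-in for the sharp Khintchine constant:

```python
import numpy as np

N = 4096                                   # dyadic grid on [0, 1]
t = (np.arange(N) + 0.5) / N               # midpoints: each r_j is constant there

# Rademacher functions r_j(t) = sign sin(2^{j-1} pi t), j = 1, ..., 8.
R = np.array([np.sign(np.sin(2.0 ** (j - 1) * np.pi * t)) for j in range(1, 9)])

# Orthonormality in L_2[0, 1]: grid averages are exact for these step functions.
G = (R @ R.T) / N
assert np.allclose(G, np.eye(8))

# Khintchine-type comparison: the L_1 norm of sum a_j r_j is equivalent
# to the l_2 norm of the coefficient vector.
rng = np.random.default_rng(2)
a = rng.standard_normal(8)
L1 = np.abs(a @ R).mean()
l2 = np.linalg.norm(a)
assert 0.5 * l2 <= L1 <= l2 + 1e-12
```

The upper bound is just ‖·‖_{L1} ≤ ‖·‖_{L2} together with orthonormality; the lower bound is the nontrivial half of the equivalence.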
Example 3.7. There exists a sequence of recovery triples (X, Um, Vn) such that lr(X, Um, Vn) stays bounded while pr(X, Um, Vn) tends to infinity.
It is easy to see that ‖Σ αj r̃j‖ = ‖Σ αj rj‖∞ = Σ |αj|. Hence by Theorem 2.3 we have r(X, U, V) = 1. Since X = L1, we use Proposition 3.4 to conclude that lr(X, U, V) = 1 as well. It is a well-known fact (cf. [6]) that the corresponding estimate holds for every subspace, where C1 > 0 is some universal constant. Thus the estimate holds for every subspace, and by Corollary 2.7 we obtain the desired lower bound on pr(X, U, V).
Example 3.9. There exists a constant C > 0 such that for every integer m there exists a recovery triple (X, U, V) with the following properties. As in the previous example, we conclude that the corresponding estimate holds for every subspace, and by Corollary 2.7 we obtain the required lower bound. We will now choose the intervals Aj so that r(X, U, V) = 1, or equivalently (by Theorem 2.3) so that (3.24) holds. In order to do that, recall that for every distribution of signs ε1, ..., εm with ε1 = 1 and εj = ±1, there exists a subinterval A in our partition such that
εj = sign rj(t) for t ∈ A. (3.25)
Choose A3 to satisfy this; continuing this way, we come down to choosing Am accordingly. Expanding the integral in (3.24), we obtain a sum in which εk,j = ±1 and, for each k, the collection (εk,1, ..., εk,m) is distinct, with εk,1 = 1. Since there are precisely 2^(m−1) such choices, we obtain (3.29). Combining this with (3.28), we have
max_k |Σ aj εk,j| ≥ Σ |aj| (by (3.28)). (3.30)
This proves (3.24) and thus r(X, U, V) = 1.
Remark 3.11. In this example dim V = 2^(m−1) is much greater than dim U = m. I could not construct an example of triples (X, Um, Vn) in which m is proportional to n. It would be interesting to know whether such an example is possible. In view of the next section, it will also be interesting to find out whether such an example is possible with n = m + o(m).

Applications to polynomial recovery
In this section, we will examine the situation where X is one of the following Banach spaces: C(T), L1(T), H1(T), A(T), the last being the disk algebra on the unit circle T. Let Hn be the space of polynomials of degree at most n − 1, and let Um be an arbitrary subspace of X* of dimension m.
Proof. Part (a) was proved in [9], part (b) follows from an observation of Pełczyński and Bourgain (cf. [10, Proposition 3E15]), and part (c) follows from the fact that any sequence of finite-dimensional spaces can be uniformly embedded into (H1(T))* and (L1(T))*.
For the linear recovery there is a strengthening of the Faber theorem (cf. [7, 8]).
Theorem 4.3. Under the notation in this section.
In [8], it was observed that r(L1, Um, Hn) → ∞ under the additional condition that d(Um, l_∞^m) is uniformly bounded. The following corollary follows immediately from Theorem 4.3 and Proposition 3.4.

Corollary 4.4. For any
Here is a partial result that uses Proposition 2.8.
From Theorem 4.3, we have the corresponding lower estimate. In the positive direction, Bernstein proved (cf. [5]) that for any constant a > 1 there exists a suitable subspace, provided n ≥ am. The functionals in Um lie in the linear span of point evaluations, and thus Um is isometric to l_1^m. Hence we have the following corollary.
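The norms of interpolation operators appearing here can be computed explicitly in the model case of interpolation by trigonometric polynomials at m = 2k + 1 uniform points of T, where the fundamental Lagrange functions are translates of the normalized Dirichlet kernel. The following sketch (the grid size and the values of k are illustrative choices) evaluates the resulting Lebesgue constant, that is, the norm of the interpolation projection, and exhibits its slow unbounded growth in the spirit of the Faber-type results above:

```python
import numpy as np

def lebesgue_constant(k, grid=20000):
    """Sup-norm of trigonometric interpolation at 2k+1 uniform points of T."""
    m = 2 * k + 1
    nodes = 2.0 * np.pi * np.arange(m) / m
    # Offset sample points so they never coincide with the nodes:
    # (2i + 1) * m is odd while 2 * j * grid is even.
    t = (np.arange(grid) + 0.5) * 2.0 * np.pi / grid
    x = t[:, None] - nodes[None, :]
    # Fundamental Lagrange functions (normalized Dirichlet kernel):
    #   l_j(t) = sin(m (t - t_j) / 2) / (m sin((t - t_j) / 2)).
    L = np.sin(m * x / 2.0) / (m * np.sin(x / 2.0))
    # Lebesgue constant = sup_t sum_j |l_j(t)|.
    return np.abs(L).sum(axis=1).max()

lam = [lebesgue_constant(k) for k in (2, 8, 32)]
assert 1.0 < lam[0] < lam[1] < lam[2]    # slow (logarithmic) growth
```

The growth is logarithmic in k, which is consistent with interpolation projections at m = n points having unbounded norms, while oversampled schemes with n ≥ am, as in Bernstein's result above, can remain bounded.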
Corollary 4.6. For any a > 1 there exists a constant C(a) and a subspace for which (4.9) holds.
We will end this section (and this paper) with a discussion of a "dual version" of the problem of polynomial recovery. The exact relationship between this problem and the problem of bounded recovery is not known to me at the present time.
which is equivalent to the statement of the theorem.
We hope to explore further similarities between this problem and recovery constants in a subsequent paper.

Hence in each one of the spaces X there exists an obstacle to bounded recovery. It is interesting to observe that only in C(T) is this the strong obstacle.
Let t1, ..., tm ∈ T and this time m ≥ n. Let p ∈ Hn. Can one bound the uniform norm of the polynomial p in terms of bounds on the values |p(tj)|? Just as in the case of polynomial recovery, the answer is "yes" if m > an with a > 1.
Theorem 4.7. Let a > 1 and let m > an. Let t1, ..., tm be uniform points on T.
Theorem 4.8. Let m = n + o(n) and let t1, ..., tm be arbitrary points in T. Then there exist polynomials pn ∈ Hn such that |pn(tj)| ≤ 1, j = 1, ..., m, and yet ‖pn‖∞ → ∞.
Here we will prove an analogue of Proposition 4.5 in this case.
Theorem 4.9. Let t1, ..., tm ∈ T and m = n + o(log² n). Then there exist polynomials pn ∈ Hn such that
|pn(tj)| < 1, j = 1, ..., m, ‖pn‖∞ → ∞. (4.11)
Proof. Let Tn be the linear map from Hn^∞ onto l_∞^m defined by
Tn p = (p(tj)) ∈ l_∞^m. (4.12)
Then ‖Tn‖ ≤ 1; Tn is one-to-one, and thus Tn induces an isomorphism T̃n from Hn^∞ onto En := Tn(Hn^∞). It now follows from [4] that