Lp-INVERSE THEOREM FOR MODIFIED BETA OPERATORS

© Hindawi Publishing Corp.

We obtain a converse theorem for the linear combinations of
modified beta operators whose weight function is that of the
Baskakov operators. To prove our inverse theorem, we use a
linear approximating method, namely, the Steklov mean.


Introduction.
For f ∈ L_p[0, ∞), p ≥ 1, the modified beta operators with the weight function of the Baskakov operators are defined as

B_n(f, x) = ((n − 1)/n) Σ_{v=0}^∞ b_{n,v}(x) ∫_0^∞ p_{n,v}(t) f(t) dt, (1.1)

where

b_{n,v}(x) = (1/B(v + 1, n)) x^v/(1 + x)^{n+v+1}, p_{n,v}(t) = \binom{n+v−1}{v} t^v/(1 + t)^{n+v},

and B(v + 1, n) is the beta function (see, e.g., [3]).
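The following Python sketch (function and parameter names are ours, not from the paper) evaluates the operator numerically, assuming the kernel b_{n,v}(x) = x^v/(B(v+1,n)(1+x)^{n+v+1}) paired with the Baskakov basis p_{n,v}(t) = \binom{n+v-1}{v} t^v/(1+t)^{n+v}, normalized so that B_n(1,x) = 1. It truncates the series and integrates after the substitution u = t/(1+t), which lets one confirm B_n(1, x) = 1 numerically.

```python
import math

def modified_beta(f, n, x, vmax=300, quad_pts=1200):
    """Numerically evaluate B_n(f, x):
    B_n(f, x) = ((n-1)/n) * sum_{v>=0} b_{n,v}(x) * Int_v(f), with
      b_{n,v}(x) = x^v / (B(v+1, n) (1+x)^(n+v+1)),
      Int_v(f)  = binom(n+v-1, v) * int_0^inf t^v (1+t)^-(n+v) f(t) dt.
    The substitution u = t/(1+t) maps the integral to (0, 1):
      Int_v(f) = binom(n+v-1, v) * int_0^1 u^v (1-u)^(n-2) f(u/(1-u)) du,
    approximated here by the midpoint rule; the series is truncated at vmax."""
    h = 1.0 / quad_pts
    grid = [(i + 0.5) * h for i in range(quad_pts)]
    total = 0.0
    for v in range(vmax + 1):
        # log of b_{n,v}(x); note 1/B(v+1, n) = Gamma(n+v+1)/(Gamma(v+1) Gamma(n))
        log_b = (math.lgamma(n + v + 1) - math.lgamma(v + 1) - math.lgamma(n)
                 + v * math.log(x) - (n + v + 1) * math.log1p(x))
        # log of binom(n+v-1, v) from the Baskakov basis
        log_c = math.lgamma(n + v) - math.lgamma(v + 1) - math.lgamma(n)
        acc = 0.0
        for u in grid:
            acc += math.exp(log_b + log_c + v * math.log(u)
                            + (n - 2) * math.log1p(-u)) * f(u / (1.0 - u))
        total += acc * h
    return (n - 1) / n * total
```

All factorial-sized quantities are combined in log space before exponentiating, so the sketch stays stable for moderately large n and v.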
It is easily verified that the operators B_n are linear positive operators; also, B_n(1, x) = 1. It turns out that the order of approximation by the operators (1.1) is at best O(n^{−1}), howsoever smooth the function may be. With the aim of improving the order of approximation, we have to slacken the positivity condition of these operators, for which we may take appropriate linear combinations of the operators (1.1). We now consider the linear combinations B_n(f, k, x) of the operators B_{d_j n}(f, x) defined as

B_n(f, k, x) = Σ_{j=0}^{k} C(j, k) B_{d_j n}(f, x), (1.3)

where

C(j, k) = Π_{i=0, i≠j}^{k} d_j/(d_j − d_i), k ≠ 0, C(0, 0) = 1, (1.4)

and d_0, d_1, d_2, ..., d_k are (k + 1) arbitrary but fixed distinct positive integers.
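The coefficients C(j, k) are Lagrange-type: they sum to 1 (so constants are reproduced) and annihilate the terms n^{−1}, ..., n^{−k} of the asymptotic expansion, which is the standard rationale for the combinations raising the order to O(n^{−(k+1)}). A quick exact-arithmetic check (the function name is ours):

```python
from fractions import Fraction

def comb_coefficients(d):
    """C(j, k) = prod_{i != j} d_j / (d_j - d_i) for the combination
    B_n(f, k, x) = sum_{j=0}^{k} C(j, k) * B_{d_j n}(f, x)."""
    k = len(d) - 1
    coeffs = []
    for j in range(k + 1):
        c = Fraction(1)
        for i in range(k + 1):
            if i != j:
                c *= Fraction(d[j], d[j] - d[i])
        coeffs.append(c)
    return coeffs
```

For instance, d = (1, 2) gives C(0, 1) = −1 and C(1, 1) = 2, i.e., B_n(f, 1, x) = 2B_{2n}(f, x) − B_n(f, x).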
Throughout this note, let 0 < a_1 < a_2 < a_3 < b_3 < b_2 < b_1 < ∞ and 0 < a < b < ∞, and let the Steklov mean f_{η,m} of mth order be defined by

f_{η,m}(t) = η^{−m} ∫_{−η/2}^{η/2} ⋯ ∫_{−η/2}^{η/2} ( f(t) + (−1)^{m−1} Δ^m_{(1/m) Σ_{i=1}^{m} t_i} f(t) ) Π_{i=1}^{m} dt_i,

where t ∈ I and Δ_h^m f(t) is the mth-order forward difference of the function f with step length h. It follows from [5, 7] that
(i) f_{η,m} has derivatives up to order m, f^{(m−1)}_{η,m} ∈ AC(I_1), and f^{(m)}_{η,m} exists a.e. and belongs to L_p(I_1).
In this note, we obtain an inverse theorem in L_p-approximation for the linear combinations of the operators (1.1).
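For concreteness, the mth-order forward difference entering the Steklov mean can be sketched as follows (a minimal helper, not from the paper). It annihilates polynomials of degree below m and maps t^m to m! h^m, which is what makes the Steklov mean a good smooth approximant:

```python
from math import comb

def forward_diff(f, t, h, m):
    """Delta_h^m f(t) = sum_{i=0}^{m} (-1)^(m-i) * C(m, i) * f(t + i*h)."""
    return sum((-1) ** (m - i) * comb(m, i) * f(t + i * h) for i in range(m + 1))
```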

Auxiliary results.
In this section, we give certain results which are necessary to prove the inverse theorem.
Lemma 2.1 [3]. Let the mth-order moment be defined by

T_{n,m}(x) = B_n((t − x)^m, x);

then T_{n,0}(x) = 1, T_{n,1}(x) = (1 + 3x)/(n − 2), and there holds the recurrence relation (2.2).

Lemma 2.2. The estimate (2.3) holds, where the constant K_5 is independent of n and h.
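Under the kernel assumed above (b_{n,v}(x) = x^v/(B(v+1,n)(1+x)^{n+v+1}) paired with the Baskakov basis, normalized so that B_n(1,x) = 1), the t-integrals of monomials have closed beta-function forms, so the moments T_{n,m}(x) can be checked numerically. A sanity-check sketch, with names of our choosing:

```python
import math

def moment(n, x, m, vmax=600):
    """T_{n,m}(x) = B_n((t - x)^m, x), using the closed form
    binom(n+v-1, v) * int_0^inf t^(v+r) (1+t)^-(n+v) dt
        = (v+1)(v+2)...(v+r) * Gamma(n-r-1) / Gamma(n)."""
    total = 0.0
    for v in range(vmax + 1):
        # log of b_{n,v}(x), combined in log space for stability
        log_b = (math.lgamma(n + v + 1) - math.lgamma(v + 1) - math.lgamma(n)
                 + v * math.log(x) - (n + v + 1) * math.log1p(x))
        b = math.exp(log_b)
        # expand (t - x)^m and integrate term by term
        val = 0.0
        for r in range(m + 1):
            rising = 1.0
            for i in range(1, r + 1):
                rising *= v + i
            t_int = rising * math.exp(math.lgamma(n - r - 1) - math.lgamma(n))
            val += math.comb(m, r) * (-x) ** (m - r) * t_int
        total += b * val
    return (n - 1) / n * total
```

In particular, one can confirm T_{n,0}(x) = 1 and T_{n,1}(x) = (1 + 3x)/(n − 2), and that T_{n,2}(x) decays like O(n^{−1}), consistent with the saturation order noted in the introduction.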
Proof. Applying Fubini's theorem and Hölder's inequality, we obtain the first estimate. Now, we use the identities in which P_k(j) and Q_k(j) are polynomials in j of degree 2k.
Using the fact stated in (2.7) for the sum over j = 0, 1, ..., 2m, then applying Hölder's inequality for summation and the compact support of h, we obtain the required result (2.8).

Proof of Lemma 2.3. Applying Jensen's inequality repeatedly, we obtain (2.9). We break the interval [x, t] in the first term as in (2.11) and apply Hölder's inequality for the infinite sum, Lemma 2.1, and Fubini's theorem to obtain the required estimate. The presence of the factor (1 − ϕ(t)) in the second term of (2.9) implies that |t − x|/δ > 1, which yields a term of arbitrary order O(n^{−m}). This completes the proof of the lemma.
Lemma 2.4. There exist polynomials q_{i,j,r}(x), independent of n and v, such that (2.12) holds. The proof of this lemma is similar to that of [3, Lemma 2.2].

Lemma 2.5. The estimate (2.13) holds, where the constants K_8 and K_9 are independent of n and h.
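For the Baskakov basis functions p_{n,v}(x) = \binom{n+v-1}{v} x^v/(1+x)^{n+v}, the first-order instance of such a Lorentz-type identity reads x(1 + x) p'_{n,v}(x) = (v − nx) p_{n,v}(x), since the logarithmic derivative of p_{n,v} is v/x − (n+v)/(1+x); higher derivatives are obtained by iterating, which is where the polynomials q_{i,j,r}(x) arise. A numerical spot check (helper names are ours):

```python
import math

def baskakov_p(n, v, x):
    """Baskakov basis function p_{n,v}(x) = binom(n+v-1, v) x^v / (1+x)^(n+v)."""
    return math.comb(n + v - 1, v) * x ** v / (1.0 + x) ** (n + v)

def check_r1_identity(n, v, x, h=1e-6):
    """Residual |x(1+x) p'_{n,v}(x) - (v - n x) p_{n,v}(x)|,
    with p' approximated by a central difference."""
    deriv = (baskakov_p(n, v, x + h) - baskakov_p(n, v, x - h)) / (2.0 * h)
    return abs(x * (1.0 + x) * deriv - (v - n * x) * baskakov_p(n, v, x))
```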

Inverse theorem.
In this section, we prove the following inverse theorem.

Proof. We choose a function g ∈ C_0^{2k+2} such that supp g ⊂ (x_2, y_2) and g(t) = 1 on [x_3, y_3], where Δ^{2k+2}_γ denotes the (2k + 2)th-order forward difference. Applying Jensen's inequality repeatedly and Fubini's theorem to the second term, we obtain (3.4). Applying Lemma 2.5 and using the properties of the Steklov mean, we get the corresponding estimate. Following [1], we can complete the proof of the theorem if we show (3.6). We prove (3.6) by the principle of mathematical induction on α. First, we consider the case α ≤ 1; thus, for some ξ lying between t and x, using Lemma 2.1 and the compactness of f to estimate the second term, and the assumption of the theorem for the first term, we get the required bound. This completes the proof of (3.6) for the case α ≤ 1. Now, assume that (3.6) holds for all values of α satisfying m − 1 < α < m; we prove that the same holds true for m < α < m + 1. Thus, we have a decomposition into the terms J_1, J_2, and J_3. The assumed smoothness of f implies the corresponding representation. The direct theorem in [4] and Lemma 2.1 imply that J_1, J_3 = O(n^{−(k+1)}) as n → ∞. Using Jensen's inequality, the mean value theorem, and breaking [x, t] as in Lemma 2.3, we obtain the required estimate for J_2 in the L_p-norm over [x_1, y_1]. This completes the proof of the theorem.
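Although this section proves the converse direction, the order improvement that motivates the combinations is easy to observe numerically. The sketch below (based on the same reconstructed kernel assumed earlier, with names of our choosing) compares the single operator B_n against the combination 2B_{2n} − B_n, i.e., the case k = 1 with d_0 = 1, d_1 = 2, on a smooth function; the combination's pointwise error should be markedly smaller.

```python
import math

def modified_beta(f, n, x, vmax=400, quad_pts=1200):
    """Numerical evaluation of B_n(f, x): truncated series plus midpoint rule
    after the substitution u = t/(1+t); all large factors kept in log space."""
    h = 1.0 / quad_pts
    grid = [(i + 0.5) * h for i in range(quad_pts)]
    total = 0.0
    for v in range(vmax + 1):
        log_b = (math.lgamma(n + v + 1) - math.lgamma(v + 1) - math.lgamma(n)
                 + v * math.log(x) - (n + v + 1) * math.log1p(x))
        log_c = math.lgamma(n + v) - math.lgamma(v + 1) - math.lgamma(n)
        acc = 0.0
        for u in grid:
            acc += math.exp(log_b + log_c + v * math.log(u)
                            + (n - 2) * math.log1p(-u)) * f(u / (1.0 - u))
        total += acc * h
    return (n - 1) / n * total

def errors(f, n, x):
    """Return (|B_n f - f|, |2 B_{2n} f - B_n f - f|) at the point x."""
    single = modified_beta(f, n, x)
    double = modified_beta(f, 2 * n, x)
    fx = f(x)
    return abs(single - fx), abs(2.0 * double - single - fx)
```

The combination cancels the n^{−1} term of the expansion exactly (since 2/(2n) − 1/n = 0), leaving an O(n^{−2}) remainder.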
