A General Total Variation Minimization Theorem for Compressed Sensing Based Interior Tomography

Recently, in the compressed sensing framework, we found that a two-dimensional interior region-of-interest (ROI) can be exactly reconstructed via total variation minimization if the ROI is piecewise constant (Yu and Wang, 2009). Here we present a general theorem characterizing a minimization property of a piecewise constant function defined on a domain in any dimension. Our major mathematical tool for proving this result is functional analysis, without involving the Dirac delta function, which was used heuristically by Yu and Wang (2009).


Introduction
While in general an interior region-of-interest (ROI) cannot be uniquely reconstructed from projection data associated only with lines through the ROI [1, 2], in the compressed sensing framework we recently found that a two-dimensional interior ROI can be exactly reconstructed via total variation minimization if the function on the ROI is piecewise constant [3, 4]. The major idea behind our analysis is that the total variations of a piecewise constant function and a smooth artifact function are separable. The main mathematical tool is the expression of the two-dimensional gradient in terms of the Dirac delta function. In our analysis [3], the delta function was instrumental but applied heuristically, without mathematical rigor. In this note, we rigorously prove a more general theorem, as an extension of the total variation minimization property presented in [3], to characterize the total variation minimization property of a piecewise constant function defined on a domain of any dimension. Such a theorem may serve as a theoretical basis for further development of interior tomography algorithms.
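The separability idea can be illustrated numerically. The following sketch is our own illustration (not the paper's algorithm; the grid size and the particular functions f and g are arbitrary choices): on a one-dimensional grid, the discrete total variation of a piecewise constant function plus a smooth "artifact" splits, up to discretization error, into the jump part of f and the smooth part of g.

```python
# Illustration only: discrete 1-D total variation and its separability for
# a piecewise constant f plus a smooth artifact g (grid and functions are
# arbitrary choices for this sketch).
import math

def tv1d(values):
    """Discrete total variation: sum of absolute successive differences."""
    return sum(abs(b - a) for a, b in zip(values, values[1:]))

n = 20001
xs = [i / (n - 1) for i in range(n)]
f = [0.0 if x < 0.5 else 1.0 for x in xs]          # one unit jump -> TV(f) = 1
g = [0.2 * math.sin(2 * math.pi * x) for x in xs]  # smooth, TV(g) = 0.2*2*pi*(2/pi) = 0.8
fg = [a + b for a, b in zip(f, g)]

# TV(f + g) is close to TV(f) + TV(g); the gap is O(grid spacing).
print(tv1d(f), tv1d(g), tv1d(fg))
```

The agreement improves as the grid is refined, mirroring the exact continuum identity proved below.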

Theoretical Result
For piecewise constant or piecewise smooth functions, it is natural to use the space of functions of bounded variation [5] to capture the discontinuities. For an integer d > 0, let Ω ⊂ R^d be a d-dimensional open bounded set and denote its boundary by ∂Ω. Then the space of functions of bounded variation is

    BV(Ω) = { v ∈ L^1(Ω) : |v|_{BV(Ω)} < ∞ }.    (1)

It is a Banach space with the norm

    ‖v‖_{BV(Ω)} = ‖v‖_{L^1(Ω)} + |v|_{BV(Ω)},    (2)

where ‖v‖_{L^1(Ω)} is the integral of |v(x)| over Ω, and

    |v|_{BV(Ω)} = sup { ∫_Ω v(x) div φ(x) dx : φ ∈ C^1_0(Ω)^d, |φ(x)| ≤ 1 for x ∈ Ω }    (3)

is the total variation of the function v ∈ BV(Ω). Here C^1_0(Ω) is the space of continuously differentiable functions that vanish on ∂Ω, and for φ = (φ_1, …, φ_d) ∈ C^1_0(Ω)^d, div φ = Σ_{i=1}^d ∂φ_i/∂x_i. The Sobolev space

    W^{1,1}(Ω) = { v ∈ L^1(Ω) : ∂v/∂x_i ∈ L^1(Ω), 1 ≤ i ≤ d }    (4)

is a subspace of BV(Ω), and

    |v|_{BV(Ω)} = ∫_Ω |∇v(x)| dx    for v ∈ W^{1,1}(Ω).    (5)

We assume that Ω has a piecewise C^1 boundary and that it is decomposed into a union of a finite number of subsets with disjoint interiors,

    Ω̄ = ∪_{m=1}^M Ω̄_m,    (6)

such that each subset Ω_m has a piecewise C^1 boundary. The unit outward normal vector on ∂Ω_m is denoted by ν_m. Denote Γ_{ij} = Ω̄_i ∩ Ω̄_j, which may be empty for some pairs of i and j between 1 and M. We write meas(Γ_{ij}) for the (d − 1)-dimensional measure of Γ_{ij}; it is the area of Γ_{ij} for d = 3 and the length of Γ_{ij} for d = 2. The symbol Σ_{i<j} will refer to a summation over those i and j with a nonempty Γ_{ij} in the range 1 ≤ i < j ≤ M. The main result of this note is the following.
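For v ∈ W^{1,1}(Ω), the total variation is simply the integral of |∇v|. As a quick numerical sanity check of this identity (an illustrative discretization of ours, not part of the theory above), the sketch below approximates ∫ |∇v| dx with forward differences on a uniform grid; for v(x, y) = x + y the gradient has length √2 everywhere, so the total variation over the unit square is √2.

```python
# Illustration only: approximate the total variation of a smooth function
# as the integral of |grad v| over (0,1)^2, using forward differences.
import math

def tv2d(v, n):
    """Approximate integral of |grad v| over (0,1)^2 on an n x n grid."""
    h = 1.0 / (n - 1)
    total = 0.0
    for i in range(n - 1):
        for j in range(n - 1):
            x, y = i * h, j * h
            dx = (v(x + h, y) - v(x, y)) / h
            dy = (v(x, y + h) - v(x, y)) / h
            total += math.hypot(dx, dy) * h * h
    return total

# v(x, y) = x + y has |grad v| = sqrt(2), so TV on the unit square is sqrt(2).
print(tv2d(lambda x, y: x + y, 101))
```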

Theorem 1. Let f be a piecewise constant function corresponding to the decomposition (6):

    f(x) = c_m for x ∈ Ω_m, 1 ≤ m ≤ M,

with constants c_m ∈ R. Then, for any g ∈ W^{1,1}(Ω),

    |f + g|_{BV(Ω)} = ∫_Ω |∇g(x)| dx + Σ_{i<j} |c_i − c_j| meas(Γ_{ij}).    (7)

In particular,

    |f|_{BV(Ω)} = Σ_{i<j} |c_i − c_j| meas(Γ_{ij}).    (8)

Consequently, we have the minimization property

    |f + g|_{BV(Ω)} ≥ |f|_{BV(Ω)} for all g ∈ W^{1,1}(Ω),    (9)

with equality if and only if ∫_Ω |∇g(x)| dx = 0.

Proof. After an integration by parts and some rearrangement, we have, for any φ ∈ C^1_0(Ω)^d,

    ∫_Ω (f + g)(x) div φ(x) dx = −∫_Ω ∇g(x) · φ(x) dx + Σ_{i<j} (c_i − c_j) ∫_{Γ_{ij}} φ · ν_i ds.    (10)

By the definition (3),

    |f + g|_{BV(Ω)} = sup { −∫_Ω ∇g · φ dx + Σ_{i<j} (c_i − c_j) ∫_{Γ_{ij}} φ · ν_i ds : φ ∈ C^1_0(Ω)^d, |φ(x)| ≤ 1 }.    (11)

Taking g(x) = 0 in (11), the formula (8) follows (cf. the argument in the next paragraph for a more general situation). Moreover, from (11) again,

    |f + g|_{BV(Ω)} ≤ ∫_Ω |∇g(x)| dx + Σ_{i<j} |c_i − c_j| meas(Γ_{ij}).    (12)

For the opposite inequality, we first consider the case where g ∈ C^1(Ω). For any ε > 0, define two open subsets

    Ω_1^ε = { x ∈ Ω : dist(x, Γ) < ε },  Ω_2^ε = { x ∈ Ω : dist(x, Γ ∪ ∂Ω) > 2ε },  Γ = ∪_{i<j} Γ_{ij}.

Here, dist(x, D) = min{ |x − y| : y ∈ D } is the distance between x and a closed set D. Obviously, for some constant c > 0,

    meas(Ω \ (Ω_1^ε ∪ Ω_2^ε)) ≤ cε.

We start with a function ψ_ε(x) ∈ C(Ω)^d satisfying

    |ψ_ε(x)| ≤ 1 for x ∈ Ω,  ψ_ε = sign(c_i − c_j) ν_i on Γ_{ij} (i < j),  ψ_ε = −∇g/|∇g| on { x ∈ Ω_2^ε : |∇g(x)| > ε },

and then apply the well-known mollification technique in the theory of Sobolev spaces [6] to define

    φ_{ε,δ}(x) = ∫_{B_δ} η_δ(y) ψ_ε(x − y) dy,

where B_δ is the ball of radius δ centered at the origin, η_δ(x) = η(x/δ)/δ^d, and

    η ∈ C^∞(R^d),  η ≥ 0,  supp(η) ⊂ B_1,  ∫_{R^d} η(x) dx = 1.

Then |φ_{ε,δ}(x)| ≤ 1 for x ∈ Ω and, for δ sufficiently small, φ_{ε,δ} ∈ C^1_0(Ω)^d is admissible in (11). Letting δ → 0, we obtain, with some constant c_1 > 0,

    |f + g|_{BV(Ω)} ≥ −∫_Ω ∇g · ψ_ε dx + Σ_{i<j} |c_i − c_j| meas(Γ_{ij}) − c_1 ε.

Using the defining properties of ψ_ε, we further have

    |f + g|_{BV(Ω)} ≥ ∫_Ω |∇g(x)| dx + Σ_{i<j} |c_i − c_j| meas(Γ_{ij}) − c_2 ε

for some other constant c_2 > 0. Since ε > 0 is arbitrary, we obtain from the above relation that

    |f + g|_{BV(Ω)} ≥ ∫_Ω |∇g(x)| dx + Σ_{i<j} |c_i − c_j| meas(Γ_{ij}).    (21)

Combining (12) and (21), we conclude (7) for g ∈ C^1(Ω). For g ∈ W^{1,1}(Ω), we use the density of C^1(Ω) in W^{1,1}(Ω) [6] and choose {g_n} ⊂ C^1(Ω) such that

    ‖g_n − g‖_{W^{1,1}(Ω)} → 0 as n → ∞.    (22)

Since (3) defines a seminorm on BV(Ω), we have

    | |f + g_n|_{BV(Ω)} − |f + g|_{BV(Ω)} | ≤ |g_n − g|_{BV(Ω)} = ∫_Ω |∇(g_n − g)(x)| dx → 0.    (23)

Thus, taking the limit n → ∞ in (7) for g_n ∈ C^1(Ω), we obtain (7) for g ∈ W^{1,1}(Ω). □
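The mollification step of the proof has a simple discrete analogue, sketched below purely for intuition (the kernel, grid, and step function are our own arbitrary choices): convolving a bounded function with a nonnegative kernel of unit mass smooths it near its discontinuity while preserving the bound |φ| ≤ 1, which is exactly why the mollified φ_{ε,δ} remains admissible in the supremum defining the total variation.

```python
# Illustration only: a discrete analogue of mollification. Convolution with
# a nonnegative, unit-mass kernel preserves the bound |phi| <= 1 and leaves
# the function unchanged away from its jump.
def mollify(values, kernel):
    """Discrete convolution with a normalized kernel (zero-padded ends)."""
    k = len(kernel) // 2
    out = []
    for i in range(len(values)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = i + j - k
            if 0 <= idx < len(values):
                acc += w * values[idx]
        out.append(acc)
    return out

# A triangular bump kernel with unit mass, and a +/-1 step to smooth.
raw = [1.0, 2.0, 3.0, 2.0, 1.0]
kernel = [w / sum(raw) for w in raw]
psi = [-1.0] * 50 + [1.0] * 50

phi = mollify(psi, kernel)
# Max magnitude stays (up to rounding) within the original bound of 1,
# and phi agrees with psi away from the jump at index 50.
print(max(abs(v) for v in phi), phi[10], phi[50], phi[90])
```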
As an example of (8), let Ω = B_{r_0} ⊂ R^2 be a disk of radius r_0 centered at the origin. Consider a piecewise constant, radial function f(r) defined on B_{r_0} such that it has a jump j_m ∈ R at r_m, 1 ≤ m ≤ M, where 0 < r_1 < · · · < r_M < r_0. Then by (8), we have

    |f|_{BV(B_{r_0})} = 2π Σ_{m=1}^M r_m |j_m|.
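This radial formula is easy to evaluate directly. The sketch below (the two jump radii and jump heights are made-up illustration data, not from the paper) computes 2π Σ_m r_m |j_m| for a disk with two concentric jump circles.

```python
# Worked instance of the radial total variation formula for a piecewise
# constant radial function on a disk: TV(f) = 2*pi * sum_m r_m * |j_m|.
import math

def radial_tv(radii, jumps):
    """Each jump j_m across the circle of radius r_m contributes its jump
    magnitude times the circle's circumference, 2*pi*r_m*|j_m|."""
    return 2.0 * math.pi * sum(r * abs(j) for r, j in zip(radii, jumps))

# Made-up example: jumps of +2 at r = 0.3 and -1 at r = 0.6 inside r0 = 1.
print(radial_tv([0.3, 0.6], [2.0, -1.0]))  # 2*pi*(0.3*2 + 0.6*1)
```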

Discussions and Conclusion
Some comments on the name of our approach, "CS-based interior tomography," are in order. In the strict sense, compressed sensing refers to situations where the sampling scheme is designed (often with random techniques) to achieve specific properties for satisfactory recovery of an underlying signal, rather than imposed by a specific detector arrangement as in limited-data tomography. However, in a broad sense, compressed sensing can be interpreted as achieving better reconstruction from less data relative to common practice. Hence, while the current name is not far off, an alternative phrase for our approach could be "total variation minimization-based interior tomography."

In conclusion, we have extended the total variation minimization property of a piecewise constant function from two dimensions to any dimensionality in the setting of the Sobolev space W^{1,1}, which can be used for exact reconstruction of a piecewise constant function on an ROI by minimizing its total variation under the constraint of the truncated projection data through the ROI. Previously, we implemented an alternating iterative reconstruction algorithm to minimize the total variation, which is time-consuming and needs improvement. Under the guidance of the theoretical finding presented here, we are working to develop a multidimensional ROI reconstruction algorithm with better performance. Clearly, major efforts are still needed in this direction.