Similar Documents
20 similar documents found.
1.
The maximum likelihood (ML) expectation maximization (EM) approach in emission tomography has been very popular in medical imaging for several years. In spite of this, no satisfactory convergent modifications have been proposed for the regularized approach. Here, a modification of the EM algorithm is presented. The new method is a natural extension of the EM algorithm for maximizing likelihood with concave priors. Convergence proofs are given.

2.
The image space reconstruction algorithm (ISRA) was proposed as a modification of the expectation maximization (EM) algorithm, based on physical considerations, for application in volume emission computed tomography. As a consequence of this modification, ISRA searches for least squares solutions instead of maximizing Poisson likelihoods as the EM algorithm does. It is shown that both algorithms may be obtained from a common mathematical framework. This fact is used to extend ISRA to penalized likelihood estimates.
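Concretely, the ISRA iteration is a multiplicative correction driven by a ratio of back-projections, and it preserves nonnegativity as long as the starting image is positive. Below is a minimal sketch in Python, assuming a known nonnegative system matrix A and measured data n; the names are illustrative, not taken from the paper.

import numpy as np

def isra(A, n, n_iter=100, eps=1e-12):
    """Image Space Reconstruction Algorithm: a multiplicative update that
    seeks a nonnegative least-squares fit of A @ lam to the data n."""
    lam = np.ones(A.shape[1])                    # positive initial estimate
    At_n = A.T @ n                               # back-projected data (fixed)
    for _ in range(n_iter):
        lam *= At_n / (A.T @ (A @ lam) + eps)    # ISRA ratio update
    return lam

Each factor compares the back-projected data with the back-projection of the current forward model, so fixed points satisfy the optimality conditions of the nonnegatively constrained least-squares problem.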

3.
Recently, we proposed an extension of the expectation maximization (EM) algorithm that was able to handle regularization terms in a natural way. Although very general, the convergence proofs were not valid for many possibly useful regularizations. We present here a simple convergence result that is valid assuming only continuous differentiability of the penalty term and can also be extended to other methods for penalized likelihood estimation in tomography.

4.
Maximum likelihood reconstruction for emission tomography
Previous models for emission tomography (ET) do not distinguish the physics of ET from that of transmission tomography. We give a more accurate general mathematical model for ET where an unknown emission density $\lambda = \lambda(x, y, z)$ generates, and is to be reconstructed from, the number of counts $n^*(d)$ in each of $D$ detector units $d$. Within the model, we give an algorithm for determining an estimate $\hat\lambda$ of $\lambda$ which maximizes the probability $p(n^* \mid \lambda)$ of observing the actual detector count data $n^*$ over all possible densities $\lambda$. Let independent Poisson variables $n(b)$ with unknown means $\lambda(b)$, $b = 1, \dots, B$, represent the number of unobserved emissions in each of $B$ boxes (pixels) partitioning an object containing an emitter. Suppose each emission in box $b$ is detected in detector unit $d$ with probability $p(b, d)$, $d = 1, \dots, D$, where $p(b, d)$ is a one-step transition matrix, assumed known. We observe the total number $n^*(d)$ of emissions in each detector unit $d$ and want to estimate the unknown $\lambda(b)$, $b = 1, \dots, B$. For each $\lambda$, the observed data $n^*$ has probability or likelihood $p(n^* \mid \lambda)$. The EM algorithm of mathematical statistics starts with an initial estimate $\hat\lambda^{(0)}$ and gives the following simple iterative procedure for obtaining a new estimate $\hat\lambda^{\text{new}}$ from an old estimate $\hat\lambda^{\text{old}}$, producing $\hat\lambda^{(k)}$, $k = 1, 2, \dots$:
$$\hat\lambda^{\text{new}}(b) = \hat\lambda^{\text{old}}(b) \sum_{d=1}^{D} \frac{n^*(d)\, p(b,d)}{\sum_{b'=1}^{B} \hat\lambda^{\text{old}}(b')\, p(b',d)}, \qquad b = 1, \dots, B.$$
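For illustration, the quoted update translates almost line for line into code. The sketch below assumes dense NumPy arrays and, as in the formula above, no separate sensitivity normalization (i.e., the probabilities $p(b, d)$ sum to one over $d$); the names are ours.

import numpy as np

def em_update(lam, p, n_star, eps=1e-12):
    """One Shepp-Vardi EM iteration.
    lam    : current emission estimate, shape (B,)
    p      : detection probabilities p[b, d], shape (B, D)
    n_star : observed counts per detector unit, shape (D,)"""
    proj = lam @ p                     # sum over b' of lam(b') p(b', d)
    ratio = n_star / (proj + eps)      # n*(d) divided by the projection
    return lam * (p @ ratio)           # lam(b) * sum over d of p(b,d) ratio

With imperfect detection one would additionally divide by the pixel sensitivity, the sum over d of p(b, d). The iteration preserves nonnegativity and increases the likelihood at every step.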

5.
The expectation-maximization (EM) algorithm for computing maximum-likelihood estimates of transmission images in positron emission tomography (PET) (see K. Lange and R. Carson, J. Comput. Assist. Tomogr., vol. 8, no. 2, p. 306-16, 1984) is extended to include measurement error, accidental coincidences, and Compton scatter. A method for accomplishing the maximization step using one step of Newton's method is proposed. The algorithm is regularized with the method of sieves. Evaluations using both Monte Carlo simulations and phantom studies on the Siemens 953B scanner suggest that the algorithm yields unbiased images with significantly lower variances than filtered backprojection when the images are reconstructed to the intrinsic resolution. Large features in the images converge in under 200 iterations, while the smallest features require up to 2,000 iterations. All but the smallest features in typical transmission scans converge in approximately 250 iterations. The initial implementation of the algorithm requires 50 sec per iteration on a DECstation 5000.

6.
A novel method of reconstruction from single-photon emission computerized tomography data is proposed. This method builds on the expectation-maximization (EM) approach to maximum likelihood reconstruction from emission tomography data, but aims instead at maximum posterior probability estimation, which takes account of prior belief about smoothness in the isotope concentration. A novel modification to the EM algorithm yields a practical method. The method is illustrated by an application to data from brain scans.

7.
In virtual colonoscopy, minimizing the blind areas is important for accurate diagnosis of colonic polyps. Although useful for describing the shape of an object, the centerline is not always the optimal camera path for observing the object. Hence, conventional methods in which the centerline is directly used as a path produce considerable blind areas, especially in areas of high curvature. Our proposed algorithm first approximates the surface of the object by estimating the overall shape and cross-sectional thicknesses. View positions and their corresponding view directions are then jointly determined to enable maximal observation of the approximated surface. Moreover, by adopting bidirectional navigation, we can reduce the blind areas blocked by haustral folds. For comfortable navigation, we carefully smooth the obtained path and minimize the amount of rotation between consecutive rendered images. For the evaluation, we quantified the overall observable area on the basis of a temporal visibility measure that reflects the minimum interpretation time of a human observer. The experimental results show that our algorithm improves visibility coverage and also significantly reduces the number of blind areas of clinically meaningful size. A sequence of rendered images shows that our algorithm can provide centered and comfortable views for colonography.
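As one hedged illustration of the final smoothing stage, a moving average over the sequence of view positions (with renormalized view directions) is a simple way to remove jitter from a navigation path; the paper's actual smoothing and rotation-minimization scheme is more elaborate.

import numpy as np

def smooth_path(positions, directions, window=9):
    """positions, directions: arrays of shape (n, 3) along the path."""
    kernel = np.ones(window) / window
    pad = window // 2
    def _smooth(tracks):
        padded = np.pad(tracks, ((pad, pad), (0, 0)), mode="edge")
        return np.column_stack([np.convolve(padded[:, i], kernel, mode="valid")
                                for i in range(3)])
    pos_s = _smooth(positions)
    dir_s = _smooth(directions)
    dir_s /= np.linalg.norm(dir_s, axis=1, keepdims=True)   # keep unit length
    return pos_s, dir_s

Averaging the view directions and renormalizing keeps consecutive rendered frames from rotating abruptly, which relates to the comfort criterion the authors describe.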

8.
The EM method that was originally developed for maximum likelihood estimation in the context of mathematical statistics may be applied to a stochastic model of positron emission tomography (PET). The result is an iterative algorithm for image reconstruction that is finding increasing use in PET, due to its attractive theoretical and practical properties. Its major disadvantage is the large amount of computation that is often required, due to the algorithm's slow rate of convergence. This paper presents an accelerated form of the EM algorithm for PET in which the changes to the image, as calculated by the standard algorithm, are multiplied at each iteration by an overrelaxation parameter. The accelerated algorithm retains two of the important practical properties of the standard algorithm, namely the self-normalization and nonnegativity of the reconstructed images. Experimental results are presented using measured data obtained from a hexagonal detector system for PET. The likelihood function and the norm of the data residual were monitored during the iterative process. According to both of these measures, the images reconstructed at iterations 7 and 11 of the accelerated algorithm are similar to those at iterations 15 and 30 of the standard algorithm, for two different sets of data. Important theoretical properties remain to be investigated, namely the convergence of the accelerated algorithm and its performance as a maximum likelihood estimator.
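A minimal sketch of the acceleration described above, assuming the simplest nonnegativity safeguard (clipping at zero); the paper's exact safeguard and its choice of the overrelaxation parameter may differ.

import numpy as np

def accelerated_em_step(lam, p, n_star, omega=1.5, eps=1e-12):
    """One overrelaxed EM step: compute the standard EM image, then
    scale the change it makes by an overrelaxation parameter omega > 1."""
    lam_em = lam * (p @ (n_star / (lam @ p + eps)))   # standard EM step
    lam_new = lam + omega * (lam_em - lam)            # overrelaxed change
    return np.maximum(lam_new, 0.0)                   # keep estimates nonnegative

Setting omega = 1 recovers the standard algorithm; values nearer 2 roughly double the effective step, consistent with the speedup reported above (iterations 7 and 11 versus 15 and 30).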

9.
A new class of fast maximum-likelihood estimation (MLE) algorithms for emission computed tomography (ECT) is developed. In these cyclic iterative algorithms, vector extrapolation techniques are integrated with the iterations in gradient-based MLE algorithms, with the objective of accelerating the convergence of the base iterations. This results in a substantial reduction in the effective number of base iterations required for obtaining an emission density estimate of specified quality. The mathematical theory behind the minimal polynomial and reduced rank vector extrapolation techniques, in the context of emission tomography, is presented. These extrapolation techniques are implemented in a positron emission tomography system. The new algorithms are evaluated using computer experiments, with measurements taken from simulated phantoms. It is shown that, with minimal additional computations, the proposed approach results in substantial improvement in reconstruction.
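As a concrete instance of the idea, textbook minimal polynomial extrapolation (MPE) combines several successive iterates with weights obtained from a least-squares problem on their differences. The sketch below is generic MPE under that formulation, not the paper's implementation; in emission tomography the extrapolated image would typically be clipped back to nonnegative values before restarting the base iteration.

import numpy as np

def mpe(iterates):
    """Minimal polynomial extrapolation from k+2 successive iterates
    x_0, ..., x_{k+1} of a fixed-point iteration (a list of 1-D arrays)."""
    X = np.column_stack(iterates)             # shape (n, k+2)
    U = np.diff(X, axis=1)                    # first differences u_j
    c, *_ = np.linalg.lstsq(U[:, :-1], -U[:, -1], rcond=None)
    c = np.append(c, 1.0)                     # fix the last coefficient to 1
    gamma = c / c.sum()                       # normalized weights
    return X[:, :-1] @ gamma                  # weighted combination of x_0..x_k

In a cyclic scheme, a few base EM iterations are run, an extrapolated estimate is formed from them, and the cycle restarts from that estimate.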

10.
This paper has the dual purpose of introducing some new algorithms for emission and transmission tomography and proving mathematically that these algorithms and related antecedent algorithms converge. Like the EM algorithms for positron, single-photon, and transmission tomography, the algorithms provide maximum likelihood estimates of pixel concentration or linear attenuation parameters. One particular innovation we discuss is a computationally practical scheme for modifying the EM algorithms to include a Bayesian prior. The Bayesian versions of the EM algorithms are shown to have superior convergence properties in the vicinity of the maximum. We anticipate that some of the other algorithms will also converge faster than the EM algorithms.

11.
The problem of reconstruction in positron emission tomography (PET) is essentially one of estimating the number of photon pairs emitted from the source. Under the maximum-likelihood (ML) criterion, reconstruction reduces to determining the emitter density estimate that maximizes the probability of observing the actual detector count data over all possible emitter density distributions. A solution using an expectation maximization (EM) algorithm with a fixed grid size is severely handicapped by the slow convergence rate, the large computation time, and the nonuniform correction efficiency of each iteration, which makes the algorithm very sensitive to the image pattern. An efficient knowledge-based multigrid reconstruction algorithm based on the ML approach is presented to overcome these problems.

12.
An accurate model of the nonstationary geometrical response of a camera-collimator system is discussed, together with a least-squares algorithm for region-of-interest (ROI) evaluation that incorporates it. The algorithm is compared with three other algorithms specialized for region-of-interest evaluation, as well as with the conventional method of summing the reconstructed quantity over the regions of interest. For noise-free data and regions of accurate shape, least-squares estimates were unbiased to within roundoff error. For noisy data, estimates remained unbiased but precision worsened for regions smaller than the resolution: simulating the typical statistics of brain perfusion studies performed with a collimated camera, the estimated standard deviation for a 1-cm-square region was 10% with an ultra-high-resolution collimator and 7% with a low-energy all-purpose collimator. Conventional region-of-interest estimates showed comparable precision but were heavily biased when filtered backprojection was used for image reconstruction. Using the conjugate-gradient iterative algorithm and the model of nonstationary geometrical response, the bias of the estimates decreased as the number of iterations increased, but precision worsened, reaching an estimated standard deviation of more than 25% for the same 1-cm region.

13.
This paper introduces and evaluates a block-iterative Fisher scoring (BFS) algorithm. The algorithm provides regularized estimation in tomographic models of projection data with Poisson variability. Regularization is achieved by penalized likelihood with a general quadratic penalty. Local convergence of the block-iterative algorithm is proven under conditions that do not require iteration-dependent relaxation. We show that, when the algorithm converges, it converges to the unconstrained maximum penalized likelihood (MPL) solution. Simulation studies demonstrate that, with a suitable choice of relaxation parameter and restriction of the algorithm to respect nonnegativity constraints, the BFS algorithm converges to the constrained MPL solution. Constrained BFS often attains the maximum penalized likelihood faster than other block-iterative algorithms designed for nonnegatively constrained penalized reconstruction.

14.
Since the publication of Shepp and Vardi's [14] maximum likelihood reconstruction algorithm for emission tomography (ET), many medical research centers engaged in ET have made an effort to change their reconstruction algorithms to this new approach. Some have succeeded, while others claim they could not adopt it, primarily because of limited computing power. In this paper, we discuss techniques for reducing the computational requirements of the reconstruction algorithm. Specifically, the paper discusses the data structures one might use and ways of taking advantage of the geometry of the physical system. The paper also treats some of the numerical aspects of the EM (expectation maximization) algorithm and ways of speeding it up using traditional techniques of numerical analysis.
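One widely used realization of these ideas is to exploit the sparsity and geometry of the system matrix: most pixel-detector pairs have zero weight, so a compressed sparse representation makes each EM iteration touch only the nonzero entries. A sketch using SciPy, with illustrative names rather than the paper's data structures:

import numpy as np
from scipy.sparse import csr_matrix

def em_sparse(A, n_star, n_iter=50, eps=1e-12):
    """EM iterations with a sparse system matrix.
    A      : csr_matrix of detection probabilities, shape (D, B)
    n_star : observed counts per detector, shape (D,)"""
    lam = np.ones(A.shape[1])
    sens = np.asarray(A.sum(axis=0)).ravel()   # pixel sensitivities
    for _ in range(n_iter):
        ratio = n_star / (A @ lam + eps)       # sparse forward projection
        lam *= (A.T @ ratio) / (sens + eps)    # sparse backprojection
    return lam

# Toy example (hypothetical 3-detector, 4-pixel system):
# A = csr_matrix([[0.5, 0.5, 0.0, 0.0],
#                 [0.0, 0.5, 0.5, 0.0],
#                 [0.0, 0.0, 0.5, 0.5]])
# lam = em_sparse(A, np.array([10.0, 20.0, 30.0]))

Symmetries of the detector geometry can further reduce storage by letting many rows share a single stored pattern.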

15.
The development and testing of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts is described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images through the choice of a single adjustable parameter. A feasible image is defined as one that is consistent with the initial data (i.e., an image that, if truly the source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of the likelihood with data increment parameters as the conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure, in the absence of a priori knowledge about the image configuration, is a uniform field.

16.
The expectation maximization (EM) algorithm is presented for the case of estimating the directions of arrival of unknown deterministic wideband signals. Alternative regularized least squares estimation techniques for the required signal estimation, and a tree structure for the data mapping in the EM algorithm, are proposed. Extensive simulation results are presented comparing the proposed algorithms with the conventional EM approach and with current high-resolution methods of wideband direction finding.

17.
The imaging characteristics of maximum likelihood (ML) reconstruction using the EM algorithm for emission tomography have been extensively evaluated. There has been less study of the precision and accuracy of ML estimates of regional radioactivity concentration. The authors developed a realistic brain slice simulation by segmenting a normal subject's MRI scan into gray matter, white matter, and CSF, and produced PET sinogram data with a model that included detector resolution and efficiencies, attenuation, scatter, and randoms. Noisy realizations at different count levels were created, and ML and filtered backprojection (FBP) reconstructions were performed. The bias and variability of ROI values were determined. In addition, the effects of ML pixel size, image smoothing, and region size reduction were assessed. ML estimates at 3,000 iterations (0.6 sec per iteration on a parallel computer) for 1-cm² gray matter ROIs showed negative biases of 6% ± 2%, which can be reduced to 0% ± 3% by removing the outer 1-mm rim of each ROI. FBP applied to the full-size ROIs had a 15% ± 4% negative bias with 50% less noise than ML. Shrinking the FBP regions provided partial bias compensation, with noise increases to levels similar to ML. Smoothing of ML images produced biases comparable to FBP with slightly less noise. Because of its heavy computational requirements, the ML algorithm will be most useful for applications in which achieving minimum bias is important.
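The ROI methodology described here (many noisy realizations reconstructed and compared against the known digital phantom) reduces to a short computation. A sketch with hypothetical array shapes:

import numpy as np

def roi_bias_and_std(recons, truth, roi_mask):
    """Percent bias and variability of mean ROI values.
    recons   : reconstructed images, shape (n_realizations, ny, nx)
    truth    : noise-free phantom, shape (ny, nx)
    roi_mask : boolean ROI mask, shape (ny, nx)"""
    true_mean = truth[roi_mask].mean()
    est_means = recons[:, roi_mask].mean(axis=1)   # one mean per realization
    bias_pct = 100.0 * (est_means.mean() - true_mean) / true_mean
    std_pct = 100.0 * est_means.std(ddof=1) / true_mean
    return bias_pct, std_pct

Numbers such as the 6% ± 2% above are of exactly this form: the first value is the bias of the ROI mean, the second its variability across noisy realizations.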

18.
The expectation maximization method for maximum likelihood image reconstruction in emission tomography, based on the Poisson distribution of the statistically independent components of the image and measurement vectors, is extended to maximum a posteriori image reconstruction using a multivariate Gaussian a priori probability distribution of the image vector. The approach is equivalent to penalized maximum likelihood estimation with a special choice of the penalty function. The expectation maximization method is applied to find the a posteriori probability maximizer. A simple iterative formula is derived for a penalty function that is a weighted sum of the squared deviations of the image vector components from their a priori mean values. The method is demonstrated to be superior to pure likelihood maximization, in that the penalty function prevents the occurrence of irregular high-amplitude patterns in the image at large numbers of iterations (the so-called "checkerboard effect" or "noise artifact").
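With a penalty that is a weighted sum of squared deviations, (gamma/2) * sum over b of w(b) * (lambda(b) - mu(b))^2, the per-pixel maximizer of the EM surrogate is the positive root of a quadratic, giving a simple closed-form iteration of the kind described above. The sketch below follows that derivation; the notation is ours, and the paper's exact formula may differ in detail.

import numpy as np

def map_em_step(lam, p, n_star, mu, w, gamma, eps=1e-12):
    """One penalized EM step with a Gaussian (quadratic) prior.
    p : detection probabilities p[b, d]; n_star : detector counts;
    mu, w : prior means and weights; assumes gamma * w > 0 everywhere."""
    s = p.sum(axis=1)                             # sensitivities sum_d p(b,d)
    e = lam * (p @ (n_star / (lam @ p + eps)))    # EM "expected emissions"
    a = gamma * w
    b = s - a * mu
    # positive root of: a * lam^2 + b * lam - e = 0
    return (-b + np.sqrt(b * b + 4.0 * a * e)) / (2.0 * a)

As gamma tends to 0 the update tends to the ordinary EM step e / s, while larger gamma pulls the image toward the prior means mu, suppressing the checkerboard patterns mentioned above.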

19.
Using the theory of list-mode maximum-likelihood (ML) source reconstruction presented recently by Barrett et al. (1997), this paper formulates a corresponding expectation-maximization (EM) algorithm, as well as a method for estimating noise properties at the ML estimate. List-mode ML is of interest in cases where the dimensionality of the measurement space impedes a binning of the measurement data. It can be advantageous where a better forward model can be obtained by including more of the measurement coordinates provided by a given detector. Different figures of merit for detector performance can be computed from the Fisher information matrix (FIM). This paper uses the observed FIM, which requires a single data set, thus avoiding costly ensemble statistics. The proposed techniques are demonstrated for an idealized two-dimensional (2-D) positron emission tomography (PET) detector. The authors compute from simulation data the improved image quality obtained by including the time of flight of the coincident quanta.
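Structurally, a list-mode EM iteration replaces the sum over detector bins with a sum over the recorded events, each carrying its own vector of probabilities. A minimal sketch under that reading (the names are ours; the cited formulation also includes a voxel sensitivity term, denoted s here):

import numpy as np

def list_mode_em_step(lam, p_event, s, eps=1e-12):
    """One list-mode EM iteration.
    p_event : per-event detection probabilities, shape (n_events, B)
    s       : voxel sensitivities over all possible events, shape (B,)"""
    denom = p_event @ lam + eps          # expected rate for each event
    back = p_event.T @ (1.0 / denom)     # sum over events of p / denom
    return lam * back / (s + eps)

Because events are never binned, extra attributes such as time of flight enter simply through more informative per-event probabilities p_event.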

20.
A filtered backprojection reconstruction algorithm was developed for cardiac single photon emission computed tomography with cone-beam geometry. The algorithm reconstructs cone-beam projections collected from 'short scan' acquisitions of a detector traversing a noncircular planar orbit. Since the algorithm does not correct for photon attenuation, it is designed to reconstruct data collected over an angular range of slightly more than 180 degrees, with the assumption that the range of angles is oriented so as not to acquire the highly attenuated posterior projections of emissions from cardiac radiopharmaceuticals. This sampling scheme minimizes the attenuation artifacts that result from reconstructing posterior projections. From computer simulations, it is found that reconstruction of attenuated projections has a greater effect upon quantitation and image quality than any potential cone-beam reconstruction artifacts resulting from insufficient sampling of cone-beam projections. With nonattenuated projection data, cone-beam reconstruction errors in the heart are shown to be small (errors of at most 2%).
