Similar Literature (20 results)
1.
The EM method that was originally developed for maximum likelihood estimation in the context of mathematical statistics may be applied to a stochastic model of positron emission tomography (PET). The result is an iterative algorithm for image reconstruction that is finding increasing use in PET, due to its attractive theoretical and practical properties. Its major disadvantage is the large amount of computation that is often required, due to the algorithm's slow rate of convergence. This paper presents an accelerated form of the EM algorithm for PET in which the changes to the image, as calculated by the standard algorithm, are multiplied at each iteration by an overrelaxation parameter. The accelerated algorithm retains two of the important practical properties of the standard algorithm, namely the self-normalization and nonnegativity of the reconstructed images. Experimental results are presented using measured data obtained from a hexagonal detector system for PET. The likelihood function and the norm of the data residual were monitored during the iterative process. According to both of these measures, the images reconstructed at iterations 7 and 11 of the accelerated algorithm are similar to those at iterations 15 and 30 of the standard algorithm, for two different sets of data. Important theoretical properties remain to be investigated, namely the convergence of the accelerated algorithm and its performance as a maximum likelihood estimator.
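As a rough illustration of the overrelaxation idea described above, the sketch below (Python/NumPy) scales the change produced by a standard ML-EM step by a parameter omega and clips at zero; the function and parameter names are assumptions, not the authors' implementation.

```python
import numpy as np

def em_overrelaxed(A, y, x0, n_iter=10, omega=1.5, eps=1e-12):
    """Overrelaxed ML-EM sketch: the change computed by a standard EM step is
    multiplied by omega, then clipped at zero to keep the image nonnegative."""
    x = x0.copy()
    sens = A.T @ np.ones_like(y)                      # detector sensitivity image
    for _ in range(n_iter):
        ratio = y / (A @ x + eps)                     # measured / predicted counts
        x_em = x * (A.T @ ratio) / (sens + eps)       # standard EM estimate
        x = np.maximum(x + omega * (x_em - x), 0.0)   # overrelaxed, nonnegative step
    return x
```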

2.
Since the publication of Shepp and Vardi's [14] maximum likelihood reconstruction algorithm for emission tomography (ET), many medical research centers engaged in ET have made an effort to change their reconstruction algorithms to this new approach. Some have succeeded, while others claim they could not adopt this new approach primarily because of limited computing power. In this paper, we discuss techniques for reducing the computational requirements of the reconstruction algorithm. Specifically, the paper discusses the data structures one might use and ways of taking advantage of the geometry of the physical system. The paper also treats some of the numerical aspects of the EM (expectation maximization) algorithm, and ways of speeding up the numerical algorithm using some of the traditional techniques of numerical analysis.
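One concrete data-structure choice along these lines (an assumption for illustration, not necessarily the paper's) is to store only the nonzero pixel-to-tube weights of the system matrix in a compressed sparse format, so each EM forward and backprojection visits only the coincidence tubes that actually intersect a pixel:

```python
import numpy as np
from scipy import sparse

n_tubes, n_pixels = 2000, 4096
A = sparse.random(n_tubes, n_pixels, density=0.01, format="csr")  # toy sparse system matrix
x = np.ones(n_pixels)

proj = A @ x                     # forward projection touches only stored nonzeros
back = A.T @ np.ones(n_tubes)    # backprojection reuses the same sparse structure
```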

3.
The problem of reconstruction in positron emission tomography (PET) is basically estimating the number of photon pairs emitted from the source. Using the concept of the maximum-likelihood (ML) algorithm, the problem of reconstruction is reduced to determining an estimate of the emitter density that maximizes the probability of observing the actual detector count data over all possible emitter density distributions. A solution using this type of expectation maximization (EM) algorithm with a fixed grid size is severely handicapped by the slow convergence rate, the large computation time, and the nonuniform correction efficiency of each iteration, which makes the algorithm very sensitive to the image pattern. An efficient knowledge-based multigrid reconstruction algorithm based on the ML approach is presented to overcome these problems.
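The multigrid idea can be reduced to a simple two-level warm start: run EM cheaply on a coarse grid, interpolate the result to the fine grid, and continue EM there. The sketch below is a generic two-grid illustration under that assumption, not the knowledge-based algorithm of the paper; `A_coarse`, `A_fine`, and `upsample` are hypothetical inputs.

```python
import numpy as np

def mlem(A, y, x0, n_iter, eps=1e-12):
    """Standard ML-EM iterations."""
    x = x0.copy()
    sens = A.T @ np.ones_like(y)
    for _ in range(n_iter):
        x = x * (A.T @ (y / (A @ x + eps))) / (sens + eps)
    return x

def two_grid_mlem(A_coarse, A_fine, y, upsample, n_coarse=20, n_fine=10):
    """Two-level sketch: coarse-grid EM provides a warm start for fine-grid EM."""
    x_coarse = mlem(A_coarse, y, np.ones(A_coarse.shape[1]), n_coarse)
    x_init = upsample(x_coarse)   # e.g. nearest-neighbour interpolation to the fine grid
    return mlem(A_fine, y, x_init, n_fine)
```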

4.
For suppressing noise in positron emission tomography (PET), Bayesian reconstruction, i.e. maximum a posteriori (MAP) estimation, offers advantages over other methods in both reconstructed image quality and convergence. Based on Bayesian theory, this paper proposes a new Markov random field (MRF) quadratic hybrid multi-order (QHM) prior that preserves the convexity of the prior energy function. The QHM prior combines the first-order quadratic membrane (QM) prior and the second-order quadratic plate (QP) prior, and adaptively balances the roles of the QM and QP priors according to the order of each quadratic term and the properties of the surface to be reconstructed. A convergent reconstruction algorithm using the new hybrid prior is also given. Visual and quantitative comparisons on simulated data show that, for PET reconstruction, the prior performs well in suppressing background noise while preserving edges.
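A minimal sketch of what such a hybrid quadratic energy can look like is given below; the exact clique definitions and the adaptive weighting rule of the QHM prior are not reproduced here, and the pixelwise mixing weight `w` is an assumption introduced only for illustration.

```python
import numpy as np

def hybrid_quadratic_energy(img, w):
    """Toy hybrid prior energy: a pixelwise mix of a first-order (membrane, QM)
    quadratic term and a second-order (plate, QP) quadratic term, weight w in [0, 1]."""
    gx, gy = np.gradient(img)                    # first-order differences
    gxx = np.gradient(gx, axis=0)                # second-order differences
    gyy = np.gradient(gy, axis=1)
    gxy = np.gradient(gx, axis=1)
    qm = gx**2 + gy**2                           # membrane (QM) term
    qp = gxx**2 + 2.0 * gxy**2 + gyy**2          # plate (QP) term
    return float(np.sum(w * qm + (1.0 - w) * qp))
```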

5.
An image reconstruction method motivated by positron emission tomography (PET) is discussed. The measurements tend to be noisy and so the reconstruction method should incorporate the statistical nature of the noise. The authors set up a discrete model to represent the physical situation and arrive at a nonlinear maximum a posteriori probability (MAP) formulation of the problem. An iterative approach which requires the solution of simple quadratic equations is proposed. The authors also present a methodology which allows them to experimentally optimize an image reconstruction method for a specific medical task and to evaluate the relative efficacy of two reconstruction methods for a particular task in a manner which meets the high standards set by the methodology of statistical hypothesis testing. The new MAP algorithm is compared with a method that maximizes likelihood and with two variants of the filtered backprojection method.

6.
We develop algorithms for obtaining regularized estimates of emission means in positron emission tomography. The first algorithm iteratively minimizes a penalized maximum-likelihood (PML) objective function. It is based on standard de-coupled surrogate functions for the ML objective function and de-coupled surrogate functions for a certain class of penalty functions. As desired, the PML algorithm guarantees nonnegative estimates and monotonically decreases the PML objective function with increasing iterations. The second algorithm is based on an iteration-dependent, de-coupled penalty function that introduces smoothing while preserving edges. For the purpose of making comparisons, the MLEM algorithm and a penalized weighted least-squares algorithm were implemented. In experiments using synthetic data and real phantom data, it was found that, for a fixed level of background noise, the contrast in the images produced by the proposed algorithms was the most accurate.

7.
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e., it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.

8.
Presents an image reconstruction method for positron-emission tomography (PET) based on a penalized, weighted least-squares (PWLS) objective. For PET measurements that are precorrected for accidental coincidences, the author argues statistically that a least-squares objective function is as appropriate as, if not more appropriate than, the popular Poisson likelihood objective. The author proposes a simple data-based method for determining the weights that accounts for attenuation and detector efficiency. A nonnegative successive over-relaxation (+SOR) algorithm converges rapidly to the global minimum of the PWLS objective. Quantitative simulation results demonstrate that the bias/variance tradeoff of the PWLS+SOR method is comparable to the maximum-likelihood expectation-maximization (ML-EM) method (but with fewer iterations), and is improved relative to the conventional filtered backprojection (FBP) method. Qualitative results suggest that the streak artifacts common to the FBP method are nearly eliminated by the PWLS+SOR method, and indicate that the proposed method for weighting the measurements is a significant factor in the improvement over FBP.
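A bare-bones version of a nonnegative over-relaxed coordinate update for a PWLS objective is sketched below; the weighting, penalty matrix R, and parameter names are placeholders rather than the author's exact formulation.

```python
import numpy as np

def pwls_sor(A, y, w, R, beta, x0, n_iter=20, omega=1.0):
    """Sketch of +SOR for 0.5*(y - A x)^T W (y - A x) + 0.5*beta*x^T R x, with W = diag(w).
    Pixels are updated one at a time and clipped at zero to enforce nonnegativity."""
    H = A.T @ (w[:, None] * A) + beta * R     # Hessian of the quadratic objective
    b = A.T @ (w * y)                         # linear term
    x = x0.copy()
    for _ in range(n_iter):
        for j in range(x.size):
            grad_j = H[j] @ x - b[j]          # j-th component of the gradient
            x[j] = max(0.0, x[j] - omega * grad_j / H[j, j])
    return x
```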

9.
Feasibility of stationary positron emission tomography (PET) using discrete detectors has been investigated by simulation studies. To enable full utilization of detector resolution, a "bank array" of detectors is proposed and an EM algorithm is adopted for image reconstruction. The bank array consists of an odd number of detector banks arranged on a circular ring with a gap equal to one half the detector width. The EM algorithm [11] is used with some modifications for reducing the quantity of computation, improving the convergence speed, and suppressing statistical noise, so as to meet the present purpose. Simulation studies involving several phantoms show that the stationary PET with the new detector array provides image quality which is good enough for clinical applications. For fast dynamic studies with low spatial resolution, the convolution-backprojection method is efficient, but for high-resolution static imaging, resolution enhancement by an iterative method is required. Problems arising in the corrections for attenuation of photons and detector sensitivity, etc., are also discussed. A totally stationary PET avoids the mechanical problems associated with accurate movement of heavy assemblies and is particularly advantageous in gated cardiac imaging or in fast dynamic studies. Elimination of a scan along the detector plane allows a quick scan in the axial direction to achieve three-dimensional imaging with a small number of detector rings.

10.
The expectation maximization method for maximum likelihood image reconstruction in emission tomography, based on the Poisson distribution of the statistically independent components of the image and measurement vectors, is extended to a maximum a posteriori image reconstruction using a multivariate Gaussian a priori probability distribution of the image vector. The approach is equivalent to a penalized maximum likelihood estimation with a special choice of the penalty function. The expectation maximization method is applied to find the a posteriori probability maximizer. A simple iterative formula is derived for a penalty function that is a weighted sum of the squared deviations of image vector components from their a priori mean values. The method is demonstrated to be superior to pure likelihood maximization, in that the penalty function prevents the occurrence of irregular high amplitude patterns in the image with a large number of iterations (the so-called "checkerboard effect" or "noise artifact").
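A sketch of this kind of EM-type MAP update with independent Gaussian priors is shown below; the per-pixel M-step reduces to a quadratic equation whose nonnegative root is taken. The notation (prior means `mu`, precisions `gamma`) is an assumption, not necessarily the paper's exact formula.

```python
import numpy as np

def map_em_gaussian(A, y, x0, mu, gamma, n_iter=30, eps=1e-12):
    """EM-type MAP sketch with an independent Gaussian prior on each pixel
    (prior mean mu_j, precision gamma_j > 0). Each M-step solves a per-pixel
    quadratic and keeps its nonnegative root."""
    x = x0.copy()
    sens = A.T @ np.ones_like(y)
    for _ in range(n_iter):
        e = x * (A.T @ (y / (A @ x + eps)))      # EM "expected counts" per pixel
        b = sens - gamma * mu
        # nonnegative root of gamma*x^2 + b*x - e = 0; approaches e/sens as gamma -> 0
        x = (-b + np.sqrt(b**2 + 4.0 * gamma * e)) / (2.0 * gamma)
    return x
```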

11.
The maximum likelihood (ML) expectation maximization (EM) approach in emission tomography has been very popular in medical imaging for several years. In spite of this, no satisfactory convergent modifications have been proposed for the regularized approach. Here, a modification of the EM algorithm is presented. The new method is a natural extension of the EM for maximizing likelihood with concave priors. Convergence proofs are given.

12.
Three-dimensional (3D) emission computed tomography (ECT) is a simple, efficient, and accurate technique for 3D imaging and measurement of combustion fields, in which the accuracy of the weight matrix determines the accuracy and quality of the tomographic reconstruction. This paper studies a weight-matrix computation method based on ray tracing over a high-density sub-grid: the measured region is divided into a finer sub-grid, and rays are traced according to the camera imaging model to determine the weight factor of each discrete grid cell for each projection pixel. Numerical simulations and flame reconstruction experiments show that the algorithm achieves high accuracy and computational efficiency. This work is of theoretical significance for the practical application of 3D emission tomography.
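The sub-grid weighting idea can be sketched roughly as follows; `ray_hits` is a hypothetical callback standing in for the camera-model ray tracer, and the weight of a coarse cell is taken here as the fraction of its sub-cells traversed by rays from a given projection pixel, which is a simplification of the method described above.

```python
import numpy as np

def subgrid_weights(nx, ny, sub, ray_hits):
    """Toy sub-grid weight computation: each coarse cell is split into sub x sub
    sub-cells; ray_hits(i, j) returns True if a traced ray from the current
    projection pixel passes through sub-cell (i, j)."""
    W = np.zeros((nx, ny))
    for cx in range(nx):
        for cy in range(ny):
            hits = sum(ray_hits(cx * sub + i, cy * sub + j)
                       for i in range(sub) for j in range(sub))
            W[cx, cy] = hits / float(sub * sub)   # fraction of sub-cells hit
    return W
```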

13.
A filtered backprojection reconstruction algorithm was developed for cardiac single photon emission computed tomography with cone-beam geometry. The algorithm reconstructs cone-beam projections collected from 'short scan' acquisitions of a detector traversing a noncircular planar orbit. Since the algorithm does not correct for photon attenuation, it is designed to reconstruct data collected over an angular range of slightly more than 180 degrees with the assumption that the range of angles is oriented so as not to acquire the highly attenuated posterior projections of emissions from cardiac radiopharmaceuticals. This sampling scheme is performed to minimize the attenuation artifacts that result from reconstructing posterior projections. From computer simulations, it is found that reconstruction of attenuated projections has a greater effect upon quantitation and image quality than any potential cone-beam reconstruction artifacts resulting from insufficient sampling of cone-beam projections. With nonattenuated projection data, cone-beam reconstruction errors in the heart are shown to be small (errors of at most 2%).

14.
A novel method of reconstruction from single-photon emission computerized tomography data is proposed. This method builds on the expectation-maximization (EM) approach to maximum likelihood reconstruction from emission tomography data, but aims instead at maximum posterior probability estimation, which takes account of prior belief about smoothness in the isotope concentration. A novel modification to the EM algorithm yields a practical method. The method is illustrated by an application to data from brain scans.

15.
A mathematical method was studied to model the detector response of high spatial-resolution positron emission tomography systems consisting of close-packed small crystals, and to restore the resolution deteriorated due to crystal penetration and/or nonuniform sampling across the field-of-view (FOV). The simulated detector system had 600 bismuth germanate crystals of 3.14 mm width and 30 mm length packed on a single ring of 60 cm diameter. The space between crystals was filled up with lead (i.e., septa). Each crystal was in coincidence with 200 opposite crystals so that the FOV had a radius of 30 cm. The detector response was modeled based on the attenuating properties of the crystals and the septa, as well as the geometry of the detector system. The modeled detector-response function was used to restore the projections from the sinogram of the ring-detector system. The restored projections had a uniform sampling of 1.57 mm across the FOV. The crystal penetration and/or the nonuniform sampling were compensated in the projections. A penalized maximum-likelihood algorithm was employed to accomplish the restoration. The restored projections were then filtered and backprojected to reconstruct the image. A chest phantom with a few small circular "cold" objects (approximately 4 mm diameter) located at the center and near the periphery of the FOV was computer generated and used to test the restoration. The reconstructed images from the restored projections demonstrated resolution improvement off the FOV center, while preserving the resolution near the center.

16.
The use of anatomical information to improve the quality of reconstructed images in positron emission tomography (PET) has been extensively studied. A common strategy has been to include spatial smoothing within boundaries defined from the anatomical data. The authors present an alternative method for the incorporation of anatomical information into PET image reconstruction, in which they use segmented magnetic resonance (MR) images to assign tissue composition to PET image pixels. The authors model the image as a sum of activities for each tissue type, weighted by the assigned tissue composition. The reconstruction is performed as a maximum a posteriori (MAP) estimation of the activities of each tissue type. Two prior functions, defined for tissue-type activities, are considered. The algorithm is tested in realistic simulations employing a full physical model of the PET scanner.
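A stripped-down version of the tissue-composition weighting can be written as below; the composition matrix, the single activity value per tissue type, and the sizes are toy assumptions used only to show how the weighted sum enters the forward model, not the paper's full MAP formulation.

```python
import numpy as np

# Toy tissue-composition model: C[j, t] is the fraction of tissue type t in pixel j
# (taken from segmented MR), a[t] is the activity assigned to tissue type t, and the
# emission image is the composition-weighted sum x = C @ a. Projections of x would
# then be compared with the measured PET data during reconstruction.
n_pixels, n_tissues = 4096, 3
C = np.random.dirichlet(np.ones(n_tissues), size=n_pixels)  # fractions sum to 1 per pixel
a = np.array([4.0, 1.0, 0.2])                               # toy activities per tissue type
x = C @ a                                                   # pixelwise emission image
```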

17.
A Fast Algorithm for POCS Super-Resolution Image Reconstruction
张地, 杜明辉. 《信息技术》, 2004, 28(7): 1-3, 10
Super-resolution image reconstruction recovers a single high-resolution image from multiple low-resolution frames. Because it amounts to a large ill-posed inverse problem whose computational cost rises sharply with the magnification factor, reducing the computational complexity is a pressing issue in super-resolution imaging. This paper proposes a fast POCS-based algorithm for high-resolution image reconstruction: the low-resolution images are regrouped according to the displacements between them, and POCS super-resolution reconstruction is then applied to each group. Experimental results show that the fast algorithm substantially increases the speed of super-resolution image reconstruction.
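For reference, a generic POCS super-resolution loop (not the paper's grouped fast variant) looks like the sketch below; `degrade` and `correct` are hypothetical callables standing in for the observation model and its backprojection onto each frame's data-consistency set.

```python
import numpy as np

def pocs_sr(hr_init, lr_frames, degrade, correct, n_iter=10):
    """Generic POCS super-resolution sketch: repeatedly project the high-resolution
    estimate toward the data-consistency set of each low-resolution frame.
    degrade(x, k) simulates frame k from the current estimate;
    correct(x, residual, k) maps the residual back onto the estimate."""
    x = hr_init.copy()
    for _ in range(n_iter):
        for k, lr in enumerate(lr_frames):
            residual = lr - degrade(x, k)   # violation of the k-th consistency constraint
            x = correct(x, residual, k)     # project back toward consistency
    return x
```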

18.
The image space reconstruction algorithm (ISRA) was proposed as a modification of the expectation maximization (EM) algorithm based on physical considerations for application in volume emission computed tomography. As a consequence of this modification, ISRA searches for least squares solutions instead of maximizing Poisson likelihoods as the EM algorithm does. It is shown that both algorithms may be obtained from a common mathematical framework. This fact is used to extend ISRA for penalized likelihood estimates.
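The ISRA update itself is compact enough to show directly; the sketch below is the standard multiplicative form, without the penalized extension discussed in the paper.

```python
import numpy as np

def isra(A, y, x0, n_iter=50, eps=1e-12):
    """ISRA multiplicative update sketch: x <- x * (A^T y) / (A^T A x).
    It moves toward nonnegative least-squares solutions rather than the
    Poisson ML solution targeted by EM."""
    x = x0.copy()
    Aty = A.T @ y                               # fixed backprojection of the data
    for _ in range(n_iter):
        x = x * Aty / (A.T @ (A @ x) + eps)     # multiplicative, nonnegativity-preserving
    return x
```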

19.
Inclusion of brain tissues with different rates of blood flow and metabolism within a voxel or region of interest is an unavoidable problem with positron emission tomography due to its limited spatial resolution. Because regional cerebral blood flow (rCBF) is higher in gray matter than in white matter, the partial volume effect leads to underestimation of rCBF in gray matter when rCBF in the region as a whole is determined. Furthermore, weighted-average rCBF itself is underestimated if the kinetic model used in the analysis fails to account for the tissue heterogeneity. We have derived a computationally efficient method for estimating both gray matter and weighted-average rCBF in heterogeneous tissues and validated the method in simulation studies. The method is based on a model that represents a heterogeneous tissue as a weighted mixture of two homogeneous tissues. A linear least squares algorithm is used to estimate the model parameters.

20.
Algebraic reconstruction techniques (ART) are iterative procedures for recovering objects from their projections. It is claimed that by a careful adjustment of the order in which the collected data are accessed during the reconstruction procedure and of the so-called relaxation parameters that are to be chosen in an algebraic reconstruction technique, ART can produce high-quality reconstructions with excellent computational efficiency. This is demonstrated by an example based on a particular (but realistic) medical imaging task, showing that ART can match the performance of the standard expectation-maximization approach for maximizing likelihood (from the point of view of that particular medical task), but at an order of magnitude less computational cost.
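A minimal Kaczmarz-style ART sweep showing the two tuning handles the abstract emphasizes, the data-access order and the relaxation parameter, is sketched below; the argument names are assumptions for illustration.

```python
import numpy as np

def art(A, y, x0, n_sweeps=5, relax=0.25, order=None):
    """Kaczmarz-style ART sketch: rows (measurements) are visited in a chosen order
    and each correction is scaled by a relaxation parameter."""
    x = x0.copy()
    rows = np.arange(A.shape[0]) if order is None else np.asarray(order)
    row_norms = np.einsum("ij,ij->i", A, A) + 1e-12   # squared row norms
    for _ in range(n_sweeps):
        for i in rows:
            resid = y[i] - A[i] @ x                   # mismatch for measurement i
            x += relax * resid / row_norms[i] * A[i]  # relaxed projection onto row i
    return x
```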
