Similar Articles
20 similar articles retrieved (search time: 31 ms)
1.
2.
Although confocal microscopes suffer considerably less from out-of-focus light than widefield microscopes, confocal images can still be enhanced mathematically if the optical and data-acquisition effects are accounted for. Several deconvolution algorithms have been proposed for this purpose; in practice, maximum-likelihood algorithms with regularization are often used. However, the choice of regularization parameter is frequently left open, although it has a considerable effect on the deconvolution result. The aims of this work were to find good estimates of the deconvolution parameters and to develop an open-source software package that allows testing of different deconvolution algorithms and is easy to use in practice. Here, the Richardson-Lucy algorithm has been implemented together with total variation regularization in the open-source software package IOCBio Microscope. The influence of total variation regularization on the deconvolution process is governed by a single parameter. We derived a formula to estimate this regularization parameter automatically from the images as the algorithm progresses. To assess the effectiveness of the algorithm, synthetic images were composed on the basis of confocal images of rat cardiomyocytes. From the analysis of the deconvolved results, we determined under which conditions our estimate of the total variation regularization parameter gives good results. The estimated regularization parameter can be monitored during deconvolution and used as a stopping criterion. An inverse relation between the optimal regularization parameter and the peak signal-to-noise ratio of an image is shown. Finally, we demonstrate the use of the developed software by deconvolving images of rat cardiomyocytes with stained mitochondria and sarcolemma obtained with confocal and widefield microscopes.
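As an illustration of the update rule referred to above, here is a minimal 1-D sketch of Richardson-Lucy deconvolution with a total-variation term. This is not the IOCBio Microscope implementation; the fixed regularization weight `lam` stands in for the paper's automatically estimated parameter, and all names are illustrative.

```python
import numpy as np

def richardson_lucy_tv(observed, psf, iterations=100, lam=0.002):
    """Richardson-Lucy deconvolution with a total-variation (TV) term,
    1-D sketch. `lam` is a fixed TV weight (the paper estimates this
    parameter automatically as iterations progress)."""
    eps = 1e-12
    psf_flip = psf[::-1]  # adjoint of the blur operator
    estimate = np.full_like(observed, observed.mean())
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / (blurred + eps)
        correction = np.convolve(ratio, psf_flip, mode="same")
        # TV term: divergence of the normalized gradient of the estimate
        grad = np.gradient(estimate)
        div = np.gradient(grad / (np.abs(grad) + eps))
        estimate = np.clip(estimate * correction / (1.0 - lam * div + eps), 0, None)
    return estimate

# demo: recover two point sources blurred by a Gaussian PSF
truth = np.zeros(64)
truth[20], truth[40] = 1.0, 0.5
psf = np.exp(-0.5 * (np.arange(-7, 8) / 2.0) ** 2)
psf /= psf.sum()
blurred = np.convolve(truth, psf, mode="same")
restored = richardson_lucy_tv(blurred, psf)
```

A real confocal pipeline would operate on 3-D stacks with a measured point-spread function, but the structure of the multiplicative update is the same.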

3.
The width of the emission spectrum of a common fluorophore allows only a limited number of spectrally distinct fluorescent markers in the visible range, which is also the regime in which CCD cameras are used in microscopy. For imaging of cells or tissues, an image is required from which the morphology of the whole cell can be extracted. This is usually achieved by differential interference contrast (DIC) microscopy. DIC images have a pseudo-3D appearance that is easily interpreted by the human brain. In the age of high-throughput and high-content screening, however, manual image processing is not an option. Conventional image-processing algorithms often use threshold-based criteria to identify objects of interest. These algorithms fail for DIC images, whose intensities range from dim to bright with an intermediate level equal to the background, so that no clear object boundary is produced. In this article we compare different reconstruction methods for DIC images up to 100 MB in size and implement a new iterative reconstruction method based on the Hilbert transform that enables identification of cell boundaries with standard threshold algorithms.
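The key building block of such a method is a 1-D Hilbert transform applied along the DIC shear direction, which turns the bipolar bright/dark edge pair into a unipolar bump that standard thresholds can handle. Below is a minimal FFT-based sketch, not the authors' iterative 2-D implementation; all names are illustrative.

```python
import numpy as np

def hilbert_1d(signal):
    """Discrete Hilbert transform of a real 1-D signal via the FFT
    (convention where the Hilbert transform of cos is sin)."""
    n = signal.size
    spectrum = np.fft.fft(signal)
    h = np.zeros(n)          # analytic-signal frequency filter
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * h).imag

def reconstruct_dic_line(line):
    """Map a bipolar DIC profile (bright/dark pair around the background
    level) to a unipolar bump suitable for threshold-based segmentation."""
    return hilbert_1d(line - line.mean())

# demo: a shear-direction cut across a phase object resembles a
# derivative-of-Gaussian; its Hilbert transform peaks at the object
x = np.arange(256.0)
edge = -np.gradient(np.exp(-0.5 * ((x - 128.0) / 6.0) ** 2))
bump = reconstruct_dic_line(edge)
```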

4.
The operating principles of local spatial filtering algorithms and wavelet filtering algorithms are described. Numerous computational experiments are performed on the filtering of two different images distorted by noise of different statistical natures (white, colored, and impulse). Based on an analysis of the results obtained, recommendations on the use of the considered image filtering algorithms are given.

5.
A new algorithm is proposed for estimating optimal threshold values in thresholding-based wavelet filtering of signals and images. Numerous computational experiments compare this threshold with the threshold values widely used in wavelet filtering algorithms. The proposed threshold is shown to substantially reduce the filtering error for both smooth and high-contrast images.
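For context, here is a minimal sketch of the classical scheme such a threshold competes with: a single-level Haar wavelet transform with soft thresholding at the universal threshold σ√(2 ln N). The paper's own threshold formula is not reproduced here; all names are illustrative.

```python
import numpy as np

def haar_dwt(signal):
    """Single-level orthonormal Haar transform (length must be even)."""
    pairs = signal.reshape(-1, 2)
    approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)
    detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt."""
    out = np.empty(approx.size * 2)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

def soft_threshold(coeffs, t):
    """Shrink coefficients toward zero by t (soft thresholding)."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

def denoise(signal, sigma):
    """Haar denoising with the universal threshold sigma*sqrt(2 ln N)."""
    approx, detail = haar_dwt(signal)
    t = sigma * np.sqrt(2.0 * np.log(signal.size))
    return haar_idwt(approx, soft_threshold(detail, t))

# demo: a smooth ramp corrupted by Gaussian noise
rng = np.random.default_rng(0)
clean = np.linspace(0.0, 1.0, 256)
noisy = clean + rng.normal(0.0, 0.1, 256)
denoised = denoise(noisy, 0.1)
```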

6.
Microstructure analysis of polar ice cores is vital to understanding the processes controlling the flow of polar ice on the microscale. This paper presents an automatic image-processing framework for the extraction and parametrization of grain-boundary networks from images of the NEEM deep ice core. As cross-section images are acquired using controlled surface sublimation, grain boundaries and air inclusions appear dark, whereas the interiors of grains appear grey. The initial segmentation step of the software separates possible boundaries of grains and air inclusions from the background. A machine-learning approach is used to obtain automatic, reliable classification, which is required for processing large data sets along deep ice cores. The second step composes the perimeters of grain section profiles from planar sections of the grain surface between triple points. Finally, grain areas, grain boundaries, and triple junctions of the latter are parametrized in various ways. High resolution is achieved, so that small grain sizes and local curvatures of grain boundaries can be investigated systematically.

7.
Based on the imaging characteristics of passive millimeter-wave images, an improved sparse representation with circular center-surround difference (ISR-CSCD) algorithm is proposed to address the weak contrast between dim small targets and the background, and the scarcity of extractable target features, in passive millimeter-wave images. The algorithm performs background suppression and target enhancement through an improved sparse representation method. Using priors on the features of the target and its surrounding background, a circular center-surround difference background-suppression algorithm is proposed to suppress the background of the image under test. The results of the improved sparse representation and of the circular center-surround difference background suppression are then fused to obtain a background-suppressed, target-enhanced image. Finally, dim small targets are detected with a constant false alarm rate (CFAR) detection method. Experiments on millimeter-wave images of different scenes show that, compared with the mainstream algorithms - image sparse representation (SR), the Newton-based robust regularized kernel regression algorithm (NRRKR), spatiotemporal classification sparse representation (STCSR), and the accumulated center-surround difference measure (ACSDM) - the ISR-CSCD algorithm achieves a lower false alarm rate, higher detection accuracy, and stronger robustness. Detection results for dim small millimeter-wave targets under various false alarm rates and signal-to-noise ratios show that the detection rate of ISR-CSCD is on average about 15% higher than that of the other algorithms.
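The final stage mentioned above is a constant false alarm rate (CFAR) test. A minimal 1-D cell-averaging CFAR sketch follows; the paper operates on 2-D millimeter-wave images, and the parameters here are illustrative.

```python
import numpy as np

def ca_cfar(signal, guard=2, train=8, scale=4.0):
    """Cell-averaging CFAR: a cell is a detection if it exceeds `scale`
    times the mean of the surrounding training cells, with guard cells
    around the cell under test excluded from the noise estimate."""
    n = signal.size
    hits = np.zeros(n, dtype=bool)
    for i in range(train + guard, n - train - guard):
        left = signal[i - guard - train:i - guard]
        right = signal[i + guard + 1:i + guard + train + 1]
        noise_level = np.mean(np.concatenate([left, right]))
        hits[i] = signal[i] > scale * noise_level
    return hits

# demo: one bright point target on a flat noise floor
rng = np.random.default_rng(0)
scene = rng.random(200)
scene[100] += 5.0
detections = ca_cfar(scene)
```

Because the threshold scales with the locally estimated noise level, the false alarm rate stays roughly constant as the background varies.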

8.
Traditional ultrasonic-echo time-delay estimation algorithms were developed under the assumption of Gaussian background noise, whereas under real working conditions ultrasonic echoes contain not only Gaussian noise but also impulsive noise (alpha-stable distributed noise), which causes traditional algorithms to fail. To solve this problem, this paper proposes a time-delay estimation algorithm for ultrasonic echoes in mixed noise, in particular noise containing an alpha-stable component: the normalized cyclic correlation time-delay estimation algorithm. First, the theory of normalized cyclic correlation is briefly introduced. Next, the normalized cyclic correlation time-delay estimator is derived and analyzed theoretically. Then, in simulations, conventional cyclic correlation and normalized cyclic correlation time-delay estimation are compared under the same alpha-mixed noise. Finally, the estimation performance of the normalized cyclic correlation algorithm is analyzed at different signal-to-noise ratios. The comparative experiments show that when the noise characteristic exponent approaches 1, the conventional cyclic correlation algorithm can no longer estimate the delay, whereas the error of the normalized cyclic correlation algorithm remains within 0.4 μs; moreover, at a signal-to-noise ratio of -10 dB, its delay estimate still stays within a 10 μs error bound. The proposed normalized cyclic correlation algorithm can accurately estimate ultrasonic-echo time delays in mixed noise, especially noise containing an alpha-stable component, giving it an advantage that traditional algorithms cannot match.
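A simplified stand-in for the idea described above: circular (cyclic) cross-correlation of amplitude-normalized signals, where dividing each sample by its magnitude caps the contribution of impulsive, alpha-stable-like outliers. This is a sketch of the general principle, not the paper's exact estimator; all names are illustrative.

```python
import numpy as np

def delay_estimate(reference, echo):
    """Integer-sample delay via circular cross-correlation of
    amplitude-normalized signals; the per-sample normalization limits
    the influence of impulsive outliers on the correlation peak."""
    eps = 1e-12
    r = reference / (np.abs(reference) + eps)
    e = echo / (np.abs(echo) + eps)
    corr = np.fft.ifft(np.fft.fft(e) * np.conj(np.fft.fft(r))).real
    return int(np.argmax(corr))

# demo: a chirp echo delayed by 37 samples and hit by strong impulses
rng = np.random.default_rng(0)
t = np.arange(1024.0)
reference = np.sin(0.05 * t + 1e-4 * t ** 2)
echo = np.roll(reference, 37) + rng.normal(0.0, 0.1, 1024)
echo[rng.integers(0, 1024, 20)] += rng.normal(0.0, 50.0, 20)
estimated = delay_estimate(reference, echo)
```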

9.
Maximum likelihood algorithms for estimating the area of stochastic images against a stochastic background in the presence of spatial noise are synthesized. A comparison is made of area-estimation algorithms based on the additive and applicative models of interaction between the image and the background. Asymptotic expressions for the characteristics of the area estimators are obtained. The effect of the difference between the statistical characteristics of the background and the image on the accuracy of image-area estimation is studied.

10.
Research on a Multi-Subarray High-Resolution Real-Time Direction-of-Arrival Estimation Algorithm
Commonly used high-resolution direction-of-arrival (DOA) estimation algorithms generally face problems such as the enormous computational cost of inverting a high-dimensional covariance matrix, or the need for prior knowledge of the number of targets, which makes them difficult to apply in practice. Starting from a multi-subarray processing strategy and building on minimum variance distortionless response (MVDR) theory, this paper proposes an adaptive high-resolution real-time DOA estimation algorithm, together with a detailed theoretical derivation and a corresponding performance analysis. The study shows that the DOA estimation error of the proposed algorithm is smaller than that of conventional frequency-domain beamforming. Moreover, whereas commonly used algorithms require an element-level signal-to-noise ratio above about 10 dB to achieve high-resolution DOA estimation, the proposed algorithm requires only 3.5 dB, making it better suited to detecting targets at low signal-to-noise ratios, and it is highly tolerant of non-uniform SNR distributions across the elements of a flank array. Most importantly, because the dimension of the covariance matrix is greatly reduced, the processing speed increases substantially, making high-resolution real-time DOA estimation feasible. Simulations and sea-trial data processing verify the feasibility of the algorithm.
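For reference, the MVDR (Capon) spatial spectrum that the multi-subarray method builds on can be sketched for a single uniform linear array as follows. This textbook form still inverts the full covariance matrix; the paper's multi-subarray contribution, which avoids that, is not reproduced here.

```python
import numpy as np

def mvdr_spectrum(snapshots, d_over_lambda, angles_deg):
    """MVDR (Capon) spatial spectrum for a uniform linear array.
    snapshots: (elements, snapshots) complex array."""
    m, n = snapshots.shape
    r = snapshots @ snapshots.conj().T / n          # sample covariance
    r += 1e-3 * (np.trace(r).real / m) * np.eye(m)  # diagonal loading
    r_inv = np.linalg.inv(r)
    powers = []
    for theta in np.deg2rad(angles_deg):
        a = np.exp(-2j * np.pi * d_over_lambda * np.arange(m) * np.sin(theta))
        powers.append(1.0 / np.real(a.conj() @ r_inv @ a))
    return np.array(powers)

# demo: 8-element half-wavelength array, one source at +20 degrees
rng = np.random.default_rng(1)
m, n = 8, 200
steer = np.exp(-2j * np.pi * 0.5 * np.arange(m) * np.sin(np.deg2rad(20.0)))
waveform = rng.normal(size=n) + 1j * rng.normal(size=n)
data = np.outer(steer, waveform)
data += 0.1 * (rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n)))
angles = np.arange(-60.0, 61.0)
spectrum = mvdr_spectrum(data, 0.5, angles)
```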

11.
Statistically optimal and neural-network algorithms for the joint detection and estimation of brightness-difference parameters in a local observation window are synthesized for solving the problem of object-boundary extraction in digital images in the presence of noise. Comparative results on the performance of these algorithms are given.

12.
A new diagnostic is developed to reconstruct the plasma boundary using visible-wavelength images. Exploiting the plasma's edge-localized and toroidally symmetric emission profile, a new coordinate transform is presented to reconstruct the plasma boundary from a poloidal-view image. The plasma boundary reconstruction is implemented in MATLAB and applied to camera images of Mega-Ampere Spherical Tokamak discharges. The optically reconstructed plasma boundaries are compared with magnetic reconstructions from the offline reconstruction code EFIT, showing very good qualitative and quantitative agreement: average errors are within 2 cm and the correlation is high. In the current software implementation, plasma boundary reconstruction from a single image takes 3 ms. The applicability and system requirements of the new optical boundary reconstruction, called OFIT, for use both in feedback control of plasma position and shape and in offline reconstruction tools are discussed.

13.
Physical principles and algorithms for reconstructing images of the inner structure of an object made of a solid material are considered. They are based on the pulsed echo method of ultrasonic testing with multi-element antenna arrays, focused on each point of the visualized region of the object by spatiotemporal processing of the signals from a combinational sounding of the object with all possible transmit-receive pairs of array elements. A substantial improvement of the image when testing a plane-parallel object is obtained by using signals that are multiply reflected from the object's boundaries; different image-reconstruction algorithms are expedient for different types of discontinuity flaws.

14.
Picture processing using general-purpose instrumentation with specially developed software, and image analysis by means of a Quantimet 720 image analyser with a linear correlator, were employed in the study of rock textures. The boundaries of all individual grains from a portion of a thin section of a rock were distinguished and hand-drafted for digitization and analysis. From the grey-level image of the boundaries of the grain profiles, obtained by hexagonal scanning of line drawings, binary images were produced for each phase and used for textural measurements. Ink drawings extracted from the boundary pattern were also used for the image-analyser experiments. In the picture-processing experiments, programmed structuring elements of different shapes and two-dimensional autocorrelation functions were used to characterize the morphology and spatial distribution of the different phases. Transformations of binary images were programmed at the bit level to reduce storage requirements and computing time. In the Quantimet 720 experiments, linear erosion and size histograms were used to compute the mean dimensions of the grains. Periodic arrangements were studied by means of the covariance function. This contribution uses simple texture measures derived from mathematical morphology and picture-processing methods in order to assess the feasibility of two different technical approaches to the analysis of the same material. The extraction of image information from the analysed material, which had not been attempted before, is cumbersome and technically complex. This difficulty is partly responsible for why little is known about the quantitative characterization of metamorphic textures from thin sections.

15.
Research on Two WSN-Based Gas Source Localization Algorithms
Based on a concentration-attenuation model of a gas pollution source, this paper compares, for the first time, a maximum-likelihood estimation algorithm and a direct trilateration algorithm for localizing a gas pollution source. Simulation experiments compare the localization errors of the two algorithms for different numbers of sensor nodes and different levels of background noise. The simulations show that when the environmental background noise is large, the maximum-likelihood algorithm is more robust than direct trilateration, whereas the direct trilateration algorithm is simpler and remains effective when the background noise is small.
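The direct trilateration branch of the comparison can be sketched as a linearized least-squares solve: subtracting the first range equation from the others yields a linear system in the source coordinates. The maximum-likelihood estimator is not reproduced here; all names are illustrative.

```python
import numpy as np

def trilaterate(anchors, distances):
    """Direct trilateration by linearized least squares: expanding
    |p - p_i|^2 = d_i^2 and subtracting the first equation removes the
    quadratic |p|^2 term, leaving a linear system in p."""
    x0, y0 = anchors[0]
    d0 = distances[0]
    a_rows, b_rows = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        a_rows.append([2.0 * (xi - x0), 2.0 * (yi - y0)])
        b_rows.append(d0 ** 2 - di ** 2 + xi ** 2 - x0 ** 2 + yi ** 2 - y0 ** 2)
    solution, *_ = np.linalg.lstsq(np.array(a_rows), np.array(b_rows), rcond=None)
    return solution

# demo: four sensor nodes at the corners, source at (3, 4)
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
source = np.array([3.0, 4.0])
distances = np.linalg.norm(anchors - source, axis=1)
estimate = trilaterate(anchors, distances)
```

In the gas-source setting, the distances themselves would first be inferred from measured concentrations via the attenuation model, which is where background noise enters.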

16.
Various deconvolution algorithms are often used for restoration of digital images. Image deconvolution is especially needed for the correction of three‐dimensional images obtained by confocal laser scanning microscopy. Such images suffer from distortions, particularly in the Z dimension. As a result, reliable automatic segmentation of these images may be difficult or even impossible. Effective deconvolution algorithms are memory‐intensive and time‐consuming. In this work, we propose a parallel version of the well‐known Richardson–Lucy deconvolution algorithm developed for a system with distributed memory and implemented with the use of Message Passing Interface (MPI). It enables significantly more rapid deconvolution of two‐dimensional and three‐dimensional images by efficiently splitting the computation across multiple computers. The implementation of this algorithm can be used on professional clusters provided by computing centers as well as on simple networks of ordinary PC machines. Microsc. Res. Tech., 2010. © 2009 Wiley‐Liss, Inc.

17.
A new method of abandoned-object detection based on analyzing a sequence of depth images is proposed, and algorithms for real-time determination of the quasi-stationary background and of abandoned objects are developed. The efficiency of the proposed algorithms is compared with that of an algorithm based on analyzing brightness images.
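A minimal sketch of the two ingredients named above: a quasi-stationary depth background (here simply a per-pixel temporal median, one common choice, not necessarily the paper's real-time algorithm) and a static-foreground test for abandoned objects. All names and the tolerance are illustrative.

```python
import numpy as np

def quasi_stationary_background(depth_stack):
    """Per-pixel temporal median over a stack of depth images: pixels
    occluded only briefly keep their background depth."""
    return np.median(depth_stack, axis=0)

def static_foreground(depth_stack, background, tol=0.05):
    """Pixels that deviate from the background in EVERY frame of a
    recent window: candidates for abandoned (static) objects, as
    opposed to transient motion."""
    return (np.abs(depth_stack - background) > tol).all(axis=0)

# demo: flat scene at depth 2.0 m; a person passes through one frame,
# then a box stays put for the whole recent window
history = np.full((11, 8, 8), 2.0)
history[3, 2:4, 2:4] = 1.2                       # transient occlusion
background = quasi_stationary_background(history)
recent = np.full((5, 8, 8), 2.0)
recent[:, 5:7, 5:7] = 1.5                        # persistent box
abandoned = static_foreground(recent, background)
```

Working in depth rather than brightness makes the test insensitive to shadows and illumination changes, which is the motivation for the comparison in the abstract.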

18.
Modern microscopic techniques such as high-content screening (HCS), high-throughput screening, 4D imaging, and multispectral imaging may involve the collection of thousands of images per experiment. Efficient image-compression techniques are indispensable for managing these vast amounts of data. This goal is frequently achieved using lossy compression algorithms such as JPEG and JPEG2000. However, these algorithms are optimized to preserve visual quality, not necessarily the integrity of the scientific data, which are often analyzed in an automated manner. Here, we propose three observer-independent compression algorithms designed to preserve the information contained in the images. These algorithms were constructed using the signal-to-noise ratio (SNR) computed from a single image as a quality measure to establish which image components may be discarded. The compression efficiency was measured as a function of image brightness and SNR. The alterations introduced by compression in biological images were estimated using brightness histograms (the earth mover's distance (EMD) algorithm) and textures (Haralick parameters). Furthermore, a microscope test pattern was used to assess the effect of compression on the effective resolution of microscope images.
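One of the measures mentioned above, the earth mover's distance between brightness histograms, has a closed form in one dimension: for histograms of equal total mass it is the L1 distance between their cumulative distributions. A minimal sketch (illustrative, not the paper's implementation):

```python
import numpy as np

def emd_1d(hist_a, hist_b):
    """Earth mover's distance between two 1-D brightness histograms of
    equal total mass: the L1 distance between their normalized CDFs
    (distance measured in units of bins)."""
    a = np.cumsum(hist_a / hist_a.sum())
    b = np.cumsum(hist_b / hist_b.sum())
    return float(np.abs(a - b).sum())

# demo: shifting all the mass by one bin costs exactly one bin of work
h = np.zeros(10)
h[3] = 1.0
g = np.zeros(10)
g[4] = 1.0
shift_cost = emd_1d(h, g)
```

Unlike a bin-wise difference, the EMD grows with how far mass has moved, which makes it a natural measure of compression-induced brightness shifts.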

19.
Particle Image Velocimetry (PIV) measurement accuracy is lower along the phase boundaries of two-phase flows, because the interrogation windows contain information from both phases. Different seeding densities, background intensities, velocity magnitudes, and flow directions often exist across the boundary, and the cross-correlation-based PIV algorithm selects only the highest correlation peak. The highest correlation peak is either influenced by the wrong phase (across the boundary), or the correctly calculated displacement is erroneously detected as an outlier at a later stage and subsequently replaced. Phase-separated PIV measurements minimize this problem and increase accuracy along the boundary by treating each phase separately. This type of measurement requires, for each time step: (i) accurate detection of the phase boundary in consecutive frames, (ii) generation of dynamic phase masks, (iii) an accurate PIV evaluation of each phase, and (iv) recombination of the flow fields. In this article, we focus on the first step and test a hybrid phase boundary detection (PBD) technique in three different two-phase-flow configurations that pose different challenges: the first is the mixing of two liquids in a magnetic micromixer, the second is a combustion experiment in which a turbulent, pre-mixed, low-swirl, lifted flame is investigated, and the third is a bubble column reactor in which air bubbles rise in a water tank. The PBD implementation uses a three-step procedure: approximate global thresholding, local Otsu thresholding, and discrimination of image gradients. Comparison of results with and without the use of PBD and phase separation indicates significant measurement-accuracy improvements along the boundary.

20.
A problem of digital watermarking of a sequence of TV images is considered. An algorithm for embedding such watermarks, based on combining singular value decomposition with Haar filtering procedures, and a corresponding blind watermark-retrieval algorithm are developed. It is shown that the embedded watermark is robust to the basic distortions arising in the TV path, e.g., added noise, loss of definition, truncation, and compression of the multimedia material by the MPEG2 or AVC algorithms. Owing to noise-immune encoding, the fraction of correctly retrieved watermark bits can reach 100%.
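As a toy stand-in for the SVD part of such a scheme, one bit can be embedded per image block by quantization index modulation of the block's largest singular value, which is stable under mild distortion. The paper's combined SVD/Haar construction and its noise-immune encoding are not reproduced here; all names and parameters are illustrative.

```python
import numpy as np

def embed_bit(block, bit, step=16.0):
    """Embed one bit by quantization index modulation of the block's
    largest singular value: move sigma_1 to the lower (bit 0) or upper
    (bit 1) quarter of its quantization cell of width `step`."""
    u, s, vt = np.linalg.svd(block, full_matrices=False)
    cell = np.floor(s[0] / step)
    s[0] = (cell + (0.25 if bit == 0 else 0.75)) * step
    return u @ np.diag(s) @ vt

def extract_bit(block, step=16.0):
    """Blind retrieval: read the bit from the fractional position of
    sigma_1 within its quantization cell."""
    s = np.linalg.svd(block, compute_uv=False)
    return 0 if (s[0] / step) % 1.0 < 0.5 else 1

# demo: embed into a random 8x8 block and retrieve after mild noise
rng = np.random.default_rng(3)
block = rng.random((8, 8)) * 255.0
marked = embed_bit(block, 1)
noisy_bit = extract_bit(marked + rng.normal(0.0, 0.2, (8, 8)))
```

The watermark survives any distortion that perturbs sigma_1 by less than a quarter of the quantization step.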

