Similar Documents
7 similar documents found (search time: 15 ms)
1.
2.
A new algorithm is presented for the detection of single fluorescence-labeled proteins in images of living cells. It is especially suited to images with only a few (<1 per 10 μm²) fluorescence peaks from individual proteins against high background and noise (signal-to-background ratios as low as 2 and signal-to-noise ratios as low as 10). The analysis uses the peaks-over-threshold method from extreme value theory and requires minimal assumptions about the underlying distributions. A significant advantage of the method over others is that it rarely produces false positives. Examples with simulated and real data are given for comparison. The algorithm is implemented in MATLAB.
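
The abstract does not spell out the detection rule, and the original implementation is in MATLAB; the following Python sketch shows one plausible reading of a peaks-over-threshold detector: fit a generalized Pareto distribution to intensity excesses over a high threshold, then keep only local maxima whose tail probability under the fitted model is very small. The function name, quantile and significance level are illustrative, not the authors' choices.

```python
import numpy as np
from scipy import ndimage, stats

def detect_peaks_pot(image, u_quantile=0.95, alpha=1e-3):
    """Hypothetical peaks-over-threshold detector: flag local maxima whose
    excess over a high threshold is extreme under a fitted GPD."""
    u = np.quantile(image, u_quantile)            # POT threshold
    excesses = image[image > u] - u               # exceedances over u
    # Fit a generalized Pareto distribution to the excesses (location fixed at 0).
    c, _, scale = stats.genpareto.fit(excesses, floc=0.0)
    # Candidate peaks: local maxima that exceed the threshold.
    local_max = (image == ndimage.maximum_filter(image, size=3)) & (image > u)
    # Tail probability of each candidate's excess under the fitted GPD.
    p_tail = stats.genpareto.sf(image[local_max] - u, c, loc=0.0, scale=scale)
    ys, xs = np.nonzero(local_max)
    keep = p_tail < alpha                         # tiny tail probability => real peak
    return np.column_stack([ys[keep], xs[keep]])  # (row, col) peak coordinates
```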

3.
In photon event distribution sampling, an image formation technique for scanning microscopes, the maximum-likelihood position of origin of each detected photon is acquired as a data set rather than the photons being binned into pixels. An intensity-related probability density function, describing the uncertainty associated with each photon position measurement, is then applied to each position, and the individual photon intensity distributions are summed to form an image. Compared to pixel-based images, photon event distribution sampling images exhibit an increased signal-to-noise ratio and comparable spatial resolution. Photon event distribution sampling is superior to pixel-based image formation in recognizing the presence of structured (non-random) photon distributions at low photon counts, and it permits the use of non-raster scanning patterns. A photon event distribution sampling based method for localizing single particles, derived from a multivariate normal distribution, is more precise than statistical (Gaussian) fitting to pixel-based images. Using the multivariate normal distribution method, non-raster scanning and a typical confocal microscope, localizations with 8 nm precision were achieved at 10 ms sampling rates with the acquisition of ~200 photons per frame. Single-nanometre precision was obtained with a greater number of photons per frame. In summary, photon event distribution sampling provides an efficient way to form images when low numbers of photons are involved and permits particle tracking with confocal point-scanning microscopes at nanometre precision deep within specimens.
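
The paper's intensity-related probability density function is not given in the abstract; assuming an isotropic Gaussian kernel per photon, a minimal Python sketch of both the image formation step and the multivariate-normal localization (whose maximum-likelihood centre is simply the mean of the photon positions) might look like this:

```python
import numpy as np

def peds_image(photon_xy, sigma, shape):
    """Sum one isotropic Gaussian kernel per detected photon instead of
    binning photon counts into pixels (sigma models position uncertainty)."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    img = np.zeros(shape, dtype=float)
    for x, y in photon_xy:
        img += np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2.0 * sigma ** 2))
    return img / (2.0 * np.pi * sigma ** 2)  # normalise each kernel to unit mass

def localize(photon_xy):
    """ML estimate of a single particle's centre under an isotropic
    bivariate normal model: the mean of the photon positions."""
    return np.asarray(photon_xy, dtype=float).mean(axis=0)
```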

4.
Image velocimetry for open channels is safe, efficient and environmentally friendly, and large-scale particle image velocimetry (LSPIV) is one of the most widely adopted methods. Robust LSPIV algorithms rely strongly on the error-vector filtering strategy. However, previous work has mainly filtered by median filtering, main-orientation filtering and maximum filtering; these strategies are not stable enough because they fail to take both the visibility and the stability of tracer particles into consideration. Moreover, statistical properties such as the signal-to-noise ratio (SNR) and the peak cross-correlation cannot estimate tracer visibility. To improve accuracy, we propose a robust and effective filtering strategy called PPSR (Peak-Peak-Sidelobe-Ratio), formulated from the image-matching perspective, to ensure the visibility and stability of tracers. A series of experiments on the public Brenta and Tiber datasets demonstrates the effectiveness of the proposed algorithm.
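
The abstract does not reproduce the exact PPSR definition; one common reading of a peak-to-sidelobe criterion is the ratio of the main correlation peak to the second-highest peak outside a small exclusion window, with vectors below a threshold discarded. The sketch below follows that reading only; the window size and threshold are illustrative.

```python
import numpy as np

def ppsr(corr_map, exclude=3):
    """Ratio of the main cross-correlation peak to the highest secondary
    peak outside an exclusion window around the main peak."""
    corr = np.asarray(corr_map, dtype=float)
    py, px = np.unravel_index(corr.argmax(), corr.shape)
    masked = corr.copy()
    masked[max(0, py - exclude):py + exclude + 1,
           max(0, px - exclude):px + exclude + 1] = -np.inf
    return corr[py, px] / masked.max()

def filter_vectors(vectors, corr_maps, threshold=1.5):
    """Keep displacement vectors whose correlation map has a dominant
    main peak, i.e. a PPSR at or above the threshold."""
    return [v for v, c in zip(vectors, corr_maps) if ppsr(c) >= threshold]
```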

5.
In recent years there has been a return to the use of electron probe X-ray microanalysis for biological studies, but this has occurred at a time when the Hall programme, which acted as the mainstay of biological microanalysis, is no longer easily available. Commercial quantitative routines rely on the Cliff-Lorimer method, originally developed for materials science applications. Here, the development of these two main routines for obtaining quantitative data from thin specimens is outlined, and the limitations likely to be met when the Cliff-Lorimer routine is applied to biological specimens are discussed. The effects of specimen preparation on element content are briefly summarized, and the problems encountered when applying quantitative analysis to resin-embedded materials are emphasized.
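
For reference, the standard form of the Cliff-Lorimer thin-film relation expresses the ratio of two element concentrations in terms of their background-corrected characteristic X-ray intensities and an experimentally determined sensitivity factor:

```latex
\frac{C_A}{C_B} = k_{AB}\,\frac{I_A}{I_B}, \qquad C_A + C_B = 1 \quad \text{(two-element system)}
```

where \(C_A, C_B\) are the weight fractions of elements A and B, \(I_A, I_B\) their measured characteristic intensities, and \(k_{AB}\) the Cliff-Lorimer factor.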

6.
Image analysis systems are an essential tool for measuring the size of intraparenchymal tumors or lesions in experimental small-animal models. Conventional image analysis systems are relatively expensive, so we compared the performance of a professional image analysis system with an inexpensive setup by evaluating tumor size in an orthotopic glioma mouse model. The maximum cross-sectional tumor area on H&E-stained brain slides from two groups of mice (treatment and control) was measured by two independent investigators using a professional image analysis system (Leica DM IRB microscope with the Leica Quantimet 500c software) and a low-cost system (Intel QX3 microscope with non-commercial image analysis software). Mean tumor volumes were calculated, and the results were compared across image analysis systems, investigators, and treatment effects. Tumor volumes measured with the low-cost and the professional system differed by -3.7% to +7.5% (P = 0.69-0.99). Measurements made by investigators A and B differed by -7.0% to +3.9% (P = 0.69-0.88). In all cases, treatment significantly reduced tumor volume by 58.4-62.7% (P = 0.0002 or 0.0003), regardless of the investigator or the image analysis system used. We therefore conclude that the QX3 low-cost microscope combined with non-commercial image analysis software is an inexpensive way to reliably measure the size of regions of interest, provided they offer sufficient contrast. However, the low resolution of the low-cost setup clearly limits detailed analysis of histologic features.

7.
Real structures investigated in the material and biological sciences, such as minerals or tissues, can often be reduced to two phases. In a stochastic approach, the components of such binary structures may be considered as the union of grains (random sets implanted with their centres at random points) and their complementary space, called the pore space. The simplest stochastic germ-grain model is the Boolean model of random sets, which we use here instrumentally as a null model (reference model) for comparison with our biological material. After a brief review of the basic properties of the Boolean model and related statistical methods, we introduce centred contact density functions as a new approach. Empirical contact density functions are estimated from the empirical contact distribution functions with an image analyser by dilation of the grain phase. Theoretical contact density functions are then predicted from a set of image parameters under the assumption that the Boolean model holds. A centred contact density function is the difference between the estimated and the predicted contact density function. When the germ-grain model is Boolean, centred contact density functions amount to zero, apart from a random error term, irrespective of the area fraction of the grain phase. Because a section of a spatial Boolean model is a planar Boolean model, the method is also applicable in stereological studies where digitized images are obtained from sections of a three-dimensional structure. Centred contact density functions were determined for mastopathic tissue compared with mammary cancer, and for tumour-free prostatic tissue compared with prostatic cancer. For each category of specimens, 20 cases with 10 images each were analysed. Benign and malignant glandular tissue of the aforementioned types deviates significantly from the Boolean model. Centred contact density functions show that malignant transformation is accompanied by profound geometric remodelling of the pore space.
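
As a minimal sketch of the estimation step (assuming convex grains and ignoring edge effects; estimating the model's intensity and mean grain perimeter from image parameters is omitted here), the empirical spherical contact distribution can be computed with a distance transform, and the Boolean-model prediction follows from the Steiner formula; differencing the two curves and differentiating numerically (e.g. with np.gradient) would yield a centred contact density function:

```python
import numpy as np
from scipy import ndimage

def empirical_contact_cdf(grain_mask, radii):
    """Empirical spherical contact distribution H(r): the fraction of pore
    pixels within distance r of the grain phase (edge effects ignored)."""
    dist_to_grain = ndimage.distance_transform_edt(~grain_mask)
    pore_dist = dist_to_grain[~grain_mask]       # distances of pore pixels
    return np.array([(pore_dist <= r).mean() for r in radii])

def boolean_contact_cdf(radii, intensity, mean_perimeter):
    """Theoretical H(r) for a planar Boolean model with convex grains:
    H(r) = 1 - exp(-lambda * (U_bar * r + pi * r**2))."""
    r = np.asarray(radii, dtype=float)
    return 1.0 - np.exp(-intensity * (mean_perimeter * r + np.pi * r ** 2))
```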
