Similar literature (20 results)
1.
In stereological studies, analysis of sampling variances is used to optimize the sampling design. In organs with a heterogeneous distribution of the phase of interest, the analysis of sampling variances can be undertaken only if the observed variance between sections is partitioned into the fraction due to random variation and the fraction due to the heterogeneity. In the present example (pancreatic islet volume estimated by light microscopic point counting), the density of islets showed a linear increase along the axis of the organ. By analysis of sampling variances it was calculated that the most efficient number of sections (cut perpendicular to the axis of the organ) was considerably lower when only the isolated contribution from the random variation was considered. The total islet volume was obtained as the product of the fractional islet volume and the pancreatic weight. Analysis of sampling variances of the total islet volume was performed by including the variance contribution from the individual pancreatic weights in the variance of the group-mean total islet volume. Owing to a negative correlation between the fractional volume and the organ weight, the total islet volume in the group of animals was more precisely estimated than the fractional islet volume. The methods used for dealing with the heterogeneity of the organ and for estimating sampling variances of total structural quantities generalize to a large number of stereological studies in biology.
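As a loose illustration of the product estimator and of why a negative correlation between fractional volume and organ weight sharpens the group estimate, here is a minimal Python sketch; all animal data are invented and the paper's full variance partitioning is not reproduced:

```python
import numpy as np

# Hypothetical per-animal data: fractional islet volume (Vv, dimensionless)
# and pancreatic weight (mg); the negative correlation is deliberate.
vv = np.array([0.012, 0.010, 0.011, 0.009, 0.013])
weight = np.array([950.0, 1100.0, 1020.0, 1180.0, 900.0])

total = vv * weight              # total islet volume per animal
n = len(total)

# Relative precision (CV) of the group mean for each quantity; forming the
# product animal by animal lets the negative covariance reduce the spread.
cv_mean_vv = (vv.std(ddof=1) / np.sqrt(n)) / vv.mean()
cv_mean_total = (total.std(ddof=1) / np.sqrt(n)) / total.mean()

print(f"corr(Vv, weight):       {np.corrcoef(vv, weight)[0, 1]:+.2f}")
print(f"CV of group-mean Vv:    {cv_mean_vv:.3f}")
print(f"CV of group-mean total: {cv_mean_total:.3f}")
```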

2.
We suggest the use of bootstrap methods for inference from stereological estimates of volume fraction. An informal introduction to stereological estimation of volume fraction and to the principles of bootstrap techniques is given. The bootstrap is a robust, computer-intensive resampling technique based on independent random sampling, with replacement, from a data set. Bootstrap methods were used to estimate confidence intervals for volume fractions and to test for a significant difference between estimated volume fractions from two samples. Two sampling designs are considered: independent replicated samples (visual fields) from a single object, and estimates of volume fraction from multiple independent objects. The methods are presented as worked examples on real data sets obtained from tumour pathology (mammary cancer, pancreatic cancer). The volume fraction of glandular lumina per total volume of the epithelial phase was chosen as the target parameter. It indicates the degree of glandular differentiation in adenocarcinomas and is estimated as a ratio-of-means statistic with a variable denominator within cases. The confidence intervals of the volume fraction estimated by the bootstrap method were slightly narrower than the parametrically calculated confidence intervals for all data sets. The outcomes of significance tests based on the bootstrap technique were unchanged compared with classical tests based on the assumptions of normality and homoscedasticity of the data. Special attention was paid to the reproducibility of the bootstrap technique in replicated trials on the same data.
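A minimal sketch of the kind of resampling described here, assuming the raw data are per-field point counts (points hitting glandular lumina and points hitting the epithelial phase) from a single case; fields are resampled with replacement and a percentile confidence interval is formed. The counts are invented, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical point counts per visual field from one case:
# lumen[i] = points on glandular lumina, epi[i] = points on the epithelial phase.
lumen = np.array([3, 5, 2, 7, 4, 6, 1, 5, 3, 4])
epi   = np.array([20, 28, 17, 31, 25, 27, 15, 26, 22, 24])

def vv_ratio_of_means(lum, ep):
    """Ratio-of-means estimator of volume fraction: sum(P_lumen) / sum(P_epi)."""
    return lum.sum() / ep.sum()

vv_hat = vv_ratio_of_means(lumen, epi)

# Nonparametric bootstrap: resample whole fields with replacement and
# recompute the ratio-of-means each time.
B = 5000
idx = rng.integers(0, len(lumen), size=(B, len(lumen)))
boot = np.array([vv_ratio_of_means(lumen[i], epi[i]) for i in idx])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"Vv estimate: {vv_hat:.3f}, 95% percentile bootstrap CI: ({lo:.3f}, {hi:.3f})")
```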

3.
Considerable residual stresses may form during quenching owing to differential cooling and the increase in volume accompanying the phase transformations. The design of a part may be entirely responsible for the formation of residual stresses at a critical level, and even for cracking, during quenching. Furthermore, a given design may be perfectly safe for one type of steel, or one set of cooling conditions, and unsafe for another. In this study, an experimental procedure to investigate the influence of specimen geometry on the evolution of residual stresses is proposed. Cylindrical specimens with a 30-mm outer diameter were prepared from C60 and 90MnCrV8 steel bars. First, solid cylinders were quenched according to different procedures. Then, the treatment giving the minimum residual stress was applied to hollow cylinders with various hole diameters and degrees of eccentricity. By changing the position of the holes in the cross-section of the specimens, a thickness gradient as a function of the eccentricity ratio was obtained. Thus, for a given transformation behaviour and quenching conditions, the effect of shape becomes more discriminating for the eccentrically drilled specimens. The tangential residual stresses were determined at specified points along the circumference of the cylinders by X-ray diffraction using the d-sin²ψ technique. The microstructures of the specimens were determined by metallographic investigation, supported by hardness values and the respective CCT diagrams. The results are discussed in the light of the microstructural evaluation of the specimens.

4.
To better evaluate the activation and proliferative response of hepatic stellate cells (HSC) in hepatic fibrosis, it is essential to have sound quantitative data for non-pathological conditions. Our aim was to obtain the first precise and unbiased estimate of the total number of HSC in the adult rat by combining the optical fractionator, in a smooth sampling design, with immunocytochemistry against glial fibrillary acidic protein. Moreover, we wanted to verify whether there was sufficiently relevant specimen inhomogeneity to jeopardize the high estimate precision expected when using the smooth fractionator design for HSC. Finally, we wanted to address the question of what sampling scheme would be advisable a priori for future studies. Microscopical observations and quantitative data provided no evidence for inhomogeneity of the tissue distribution of HSC. Under this scenario, we implemented a baseline sampling strategy estimating the number (N̂) of HSC as 207 × 10⁶ (CV = 0.17). The coefficient of error, CE(N̂), was 0.04, as calculated by two formerly proposed approaches. The biological difference among animals contributed ≈ 95% of the observed variability, whereas methodological variance comprised the remaining 5%. We then carried out a halving of the sampling effort, at the level of both sections and fields. On either occasion, the CE(N̂) values were low (≤ 0.05) and the biological variance continued to be far more important than the methodological variance. We concluded that our baseline sampling (counting 650–1000 cells/rat) would be appropriate to assess the lobular distribution and the N̂ of HSC. However, if the latter is the only parameter to be estimated, around half of our baseline sampling (counting 250–600 cells/rat) would still generate precise estimates [CE(N̂) < 0.1], it being more efficient in this case to reduce the number of sections than the number of sampled fields.
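For orientation, the optical fractionator estimates total cell number as the count of cells sampled in the disectors multiplied by the reciprocals of the sampling fractions; a minimal sketch with purely hypothetical fractions and counts (not those of the study):

```python
# Optical fractionator: N_hat = Q * (1/ssf) * (1/asf) * (1/tsf)
# All values below are hypothetical, for illustration only.
q_minus = 800        # cells counted in the optical disectors (Q-)
ssf = 1 / 40         # section sampling fraction (every 40th section)
asf = 0.02           # area sampling fraction (counting-frame area / step area)
tsf = 0.5            # thickness sampling fraction (disector height / section thickness)

n_hat = q_minus * (1 / ssf) * (1 / asf) * (1 / tsf)
print(f"Estimated total number of cells: {n_hat:.3e}")
```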

5.

A double-notched central-hole (DNC) specimen, which contains a central hole with a left-side notch and a right-side notch inclined at a slope α, is proposed in the present study to obtain dual values of stress triaxiality and Lode parameter simultaneously from a single specimen. Analytical and numerical approaches, carried out under reasonable assumptions, characterize the plane-strain behaviour of the DNC specimen as a function of the slope α. Quasi-static tensile tests on conventional specimens and DNC specimens, monitored by digital image correlation, yielded the plastic strain at fracture for each specimen case together with the corresponding stress triaxiality and Lode parameter. It is found that one DNC specimen successfully provides dual values of stress triaxiality and Lode parameter at the same time; it therefore saves up to 50% of the experimental time and labour associated with conventional specimens.
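For reference, stress triaxiality and the Lode parameter are simple functions of the principal stresses. A minimal sketch using one common convention (Lode-parameter definitions vary between authors, and this code is not taken from the paper):

```python
import numpy as np

def triaxiality_and_lode(s1, s2, s3):
    """Stress triaxiality eta = sigma_m / sigma_eq and Lode parameter
    mu = (2*s2 - s1 - s3) / (s1 - s3), for principal stresses s1 >= s2 >= s3."""
    sm = (s1 + s2 + s3) / 3.0                                            # mean (hydrostatic) stress
    seq = np.sqrt(0.5 * ((s1 - s2)**2 + (s2 - s3)**2 + (s3 - s1)**2))    # von Mises stress
    eta = sm / seq
    mu = (2.0 * s2 - s1 - s3) / (s1 - s3)
    return eta, mu

# Uniaxial tension (s1 > 0, s2 = s3 = 0) gives eta = 1/3 and mu = -1:
print(triaxiality_and_lode(300.0, 0.0, 0.0))
```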


6.
A formula is given for the variance of the intersection of two geometric objects, S and T, under uniform, i.e. translation-invariant, randomness. It involves an integral of the product of the point-pair distance distributions of S and T. In systematic sampling, S is the specimen and T is the test system, for example a system of planes, lines or dots in ℝ³ or ℝ². The general n-dimensional integral (or sum) is difficult to use, but for systematic sectioning, i.e. for a test system of parallel hyperplanes (planes in ℝ³ or lines in ℝ²), it can be reduced to a one-dimensional expression: this leads to Matheron's treatment in terms of ‘covariograms’ of the specimens. Under the condition of isotropy, analogous simplifications lead to equations involving the distributions of scalar point-pair distances and to the approach developed by Matérn for sampling with point grids. The equations apply to arbitrary test systems, but they include fluctuating functions that require high precision in the numerical evaluation and make it difficult to predict the undamped variance oscillations of the volume estimator which occur for some specimen shapes but not for others. A generalized Euler method of successive partial integrations removes this difficulty and shows that, for a convex specimen, the undamped oscillations result from discontinuities in its chord-length density. The periodicities equal the ratios of the critical chord lengths to the periodicities of the test system. Analogous relations apply to the covariogram. The formulae for the variance are also extended to the covariance of the volume estimators of paired specimens.
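For the special case of systematic sectioning mentioned here, the one-dimensional reduction is usually written in terms of the covariogram. A commonly quoted form, stated from the general stereological literature rather than verbatim from this paper, is:

```latex
% Variance of the systematic-sections (Cavalieri) volume estimator with section
% spacing T, in terms of the covariogram g(h) of the specimen's area function A(x):
%   g(h) = \int_{-\infty}^{\infty} A(x)\, A(x+h)\, dx
\operatorname{Var}\bigl(\hat{V}\bigr) \;=\; T \sum_{k=-\infty}^{\infty} g(kT) \;-\; \int_{-\infty}^{\infty} g(h)\, dh .
```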

7.
An efficient sampling procedure is presented for estimation of total line length per unit volume, L_V. It involves the following steps: (1) choose a vertical axis in the specimen, and cut the specimen to obtain vertical uniform random (VUR) slices of constant thickness Δ such that the parallel planes of the slices contain the vertical direction; (2) observe the projected image of a vertical slice using transmission microscopy with the beam direction perpendicular to the slice; (3) count the number of intersections of the projected images of the lineal features of interest with cycloid-shaped test lines whose minor axis is perpendicular to the vertical axis. The expected number of intersections per unit test-line length on the projection, (P_L)_proj, is related to L_V by E{(P_L)_proj} = (Δ/2)·L_V. Thus, L_V can be estimated from the measurements performed on the projected images of VUR vertical slices.

8.
9.
The classical methods for estimating the volume of human body compartments in vivo (e.g. skin-fold thickness for fat, radioisotope counting for different compartments, etc.) are generally indirect and rely on essentially empirical relationships; hence they are biased to unknown degrees. The advent of modern non-invasive scanning techniques, such as X-ray computed tomography (CT) and magnetic resonance imaging (MRI), is now widening the scope of volume quantification, especially in combination with stereological methods. Apart from its superior soft-tissue contrast, MRI enjoys the distinct advantage of not using ionizing radiation. By proper landmarking and control of the scanner couch, an adult male volunteer was scanned exhaustively into parallel systematic MR ‘sections’. Four compartments were defined, namely bone, muscle, organs and fat (which included the skin), and their corresponding volumes were easily and efficiently estimated by the Cavalieri method: the total section area of a compartment times the section interval estimates the volume of the compartment without bias. Formulae and nomograms are given to predict the errors and to optimize the design. To estimate an individual's muscle volume with a 5% coefficient of error, 10 sections and less than 10 min of point counting (to estimate the relevant section areas) are required. Bone and fat require about twice as much work. To estimate the mean muscle volume of a population with the same error contribution, from a random sample of six subjects, the workload per subject can be divided by √6, namely 4 min per subject. For a given number of sections, planimetry would be as accurate as, but far more time-consuming than, point counting.
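The arithmetic of the Cavalieri estimator with point counting is shown in the minimal sketch below: each section area is estimated as the area per test point times the points counted, and the volume is the section interval times the summed areas. All numbers are hypothetical, not the study's data:

```python
# Cavalieri estimator with point counting (all numbers hypothetical).
T_mm = 30.0                  # distance between systematic MR sections, mm
a_per_point_cm2 = 16.0       # area associated with each test point, cm^2
points_per_section = [0, 12, 35, 48, 52, 50, 41, 27, 10, 0]    # points hitting muscle

areas_cm2 = [a_per_point_cm2 * p for p in points_per_section]  # estimated section areas
volume_cm3 = (T_mm / 10.0) * sum(areas_cm2)                    # section interval converted to cm
print(f"Estimated muscle volume: {volume_cm3:.0f} cm^3")
```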

10.
A method to determine the volume fraction of a dispersoid phase based on measurements carried out on thin foils is presented. The method involves the reconstruction of the shape and volume of the analysed foil by electron energy-loss spectroscopy (EELS) thickness measurements, and the determination of the volume of the dispersoids using more conventional electron microscopy imaging techniques. Hence the technique does not require the use of stereology theorems. The procedure to measure the total inelastic mean free path, and the linearity of the measurement for thicknesses (t) relative to the mean free path (t/λ) up to t/λ ≈ 4, is described. A method allowing the conversion of a single experimental λ-value to various collection conditions, either graphically or by parameterization, is also outlined. Various imaging methods (CTEM, STEM and chemical mapping) were evaluated for their ability to retrieve the distribution of dispersoids and thus their volume. Artefacts of the technique and of the sample preparation method are also discussed. The possibility of applying such techniques using in-column and post-column imaging filters, and the limitations of such methods, are presented. Although the approach has been applied to a relevant metallurgical system in the aluminium industry, it can be used for any other material provided that values of the mean free path can be obtained.
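Foil-thickness measurement of this kind is commonly based on the EELS log-ratio relation t/λ = ln(I_total / I_zero-loss). A minimal sketch of how such thickness values could feed a volume-fraction estimate; the variable names, numbers and workflow are illustrative assumptions, not the paper's procedure:

```python
import numpy as np

# EELS log-ratio thickness: t/lambda = ln(I_total / I_zero_loss).
lam_nm = 110.0                                   # assumed inelastic mean free path, nm
i_total = np.array([1.0e6, 1.2e6, 0.9e6])        # integrated spectrum intensities
i_zero  = np.array([4.1e5, 4.4e5, 3.9e5])        # zero-loss peak intensities
t_nm = lam_nm * np.log(i_total / i_zero)         # local foil thickness, nm

# Analysed area per measurement and dispersoid volume found by imaging (assumed):
area_nm2 = np.array([2.0e6, 2.0e6, 2.0e6])       # analysed area, nm^2
v_disp_nm3 = np.array([1.1e7, 1.5e7, 0.8e7])     # dispersoid volume from particle sizing, nm^3

foil_volume = np.sum(area_nm2 * t_nm)            # reconstructed analysed foil volume
volume_fraction = np.sum(v_disp_nm3) / foil_volume
print(f"Dispersoid volume fraction: {volume_fraction:.3%}")
```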

11.
In recent years, there have been substantial developments in both magnetic resonance imaging techniques and automatic image analysis software. The purpose of this paper is to develop stereological image sampling theory (i.e. unbiased sampling rules) that can be used by image analysts for estimating geometric quantities such as surface area and volume, and to illustrate its implementation. The methods will ideally be applied automatically on segmented, properly sampled 2D images – although convenient manual application is always an option – and they are of wide applicability in many disciplines. In particular, the vertical sections design to estimate surface area is described in detail and applied to estimate the area of the pial surface and of the boundary between cortex and underlying white matter (i.e. the subcortical surface area). For completeness, cortical volume and mean cortical thickness are also estimated. The aforementioned surfaces were triangulated in 3D with the aid of FreeSurfer software, which provided accurate surface area measures that served as gold standards. Furthermore, software was developed to produce digitized trace curves of the triangulated target surfaces automatically from virtual sections. From such traces, a new method (called the ‘lambda method’) is presented to estimate surface area automatically. In addition, with the new software, intersections could be counted automatically between the relevant surface traces and a cycloid test grid for the classical design. This capability, together with the aforementioned gold standard, enabled us to check thoroughly the performance and the variability of the different estimators by Monte Carlo simulations for studying the human brain. In particular, new methods are offered to split the total error variance into orientation, sectioning and cycloid components. A prediction of the latter component was hitherto unavailable – one is proposed here and checked by way of simulations on a given set of digitized vertical sections with automatically superimposed cycloid grids of three different sizes. Concrete and detailed recommendations are given for implementing the methods.

12.
Variation of roughness parameters on some typical manufactured surfaces
A number of specimen surfaces, including machined surfaces and calibration standards, are examined by a stylus instrument on-line to a microcomputer. For each measurement on each specimen, 14 roughness parameters are computed for each of 10 profiles, and the mean and standard deviation of each parameter are calculated. Variations of up to 15% are found even on calibration standards, and variations of 50% or larger are found on many machined surfaces. Increasing the range setting and decreasing the cut-off are both found to increase the scatter. Using a skid has little effect. Measuring with the lay increases the scatter. Decreasing the sampling interval has no effect on the Ra and Rq roughness values, but increases Rz and similar parameters and alters the ‘shape’-related texture parameters.
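As a reminder of what two of the amplitude parameters measure, here is a minimal sketch computing Ra and Rq from a sampled profile; Rz is omitted because its definition varies between standards, and the profile is synthetic:

```python
import numpy as np

# Synthetic surface profile (heights in micrometres), assumed already levelled and filtered.
rng = np.random.default_rng(1)
z = 0.5 * np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * rng.standard_normal(2000)

z = z - z.mean()                 # reference mean line
ra = np.mean(np.abs(z))          # arithmetic mean deviation, Ra
rq = np.sqrt(np.mean(z**2))      # root-mean-square deviation, Rq
print(f"Ra = {ra:.3f} um, Rq = {rq:.3f} um")
```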

13.
Single molecule detection based on fluorescent labels offers the possibility to gain not only qualitative but also quantitative insight into specific functions of complex biological systems. Fluorescence correlation spectroscopy is one of the favourite techniques for determining concentrations and diffusion constants, as well as the molecular brightness of molecules in the pico- to nano-molar concentration range, with broad applications in biology and chemistry. Although fluorescence correlation spectroscopy in principle has the potential to measure absolute concentrations and diffusion coefficients, the necessity of knowing the exact size and shape of the confocal volume very often hampers the possibility of obtaining quantitative results and mainly restricts fluorescence correlation spectroscopy to relative measurements. The determination of the confocal volume in situ is difficult because it is sensitive to optical alignment and aberrations, optical saturation and variations of the index of refraction, as observed in biological specimens. In the present contribution, we compare different techniques to characterize the confocal volume and to obtain the confocal parameters: fluorescence correlation spectroscopy curve fitting, a fluorescence correlation spectroscopy dilution series, and confocal scanning of fluorescent beads. The results are compared in view of quantitative fluorescence correlation spectroscopy measurement and analysis. We investigate how unavoidable artefacts caused by a non-ideal confocal volume can be experimentally determined and validated.
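A minimal sketch of the standard single-component 3D-diffusion model commonly used in such curve fitting, with the effective confocal volume written in terms of the lateral and axial beam radii. This is the textbook form rather than the specific analysis of the paper, and all parameter values are assumptions:

```python
import numpy as np

def fcs_3d_diffusion(tau, N, tau_d, s):
    """Autocorrelation for free 3D diffusion through a 3D-Gaussian confocal volume.
    N: mean number of molecules in the volume, tau_d: diffusion time,
    s: structure parameter z0/w0 (axial over lateral 1/e^2 radius)."""
    return (1.0 / N) / ((1.0 + tau / tau_d) * np.sqrt(1.0 + tau / (s**2 * tau_d)))

# Assumed confocal parameters, for illustration only:
w0 = 0.25e-6                             # lateral 1/e^2 radius, m
s = 5.0                                  # structure parameter
z0 = s * w0
v_eff = np.pi**1.5 * w0**2 * z0          # effective confocal volume, m^3

N = 4.0                                  # average number of molecules in the volume
NA = 6.022e23
conc_molar = N / (v_eff * 1e3 * NA)      # v_eff * 1e3 converts m^3 to litres
print(f"V_eff = {v_eff * 1e18:.2f} fL, c = {conc_molar * 1e9:.1f} nM")
print(f"G(10 us) = {fcs_3d_diffusion(1e-5, N, tau_d=1e-4, s=s):.3f}")
```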

14.
Formulae of stereology are used to estimate 3D geometrical parameters of cocontinuous structures from 2D micrographs of polymer blends. 3D images of symmetric and nonsymmetric polymer blends made of fluorescently labelled polystyrene and styrene-ran-acrylonitrile copolymer were obtained with laser scanning confocal microscopy. Geometrical parameters of the blend interface, specifically the volume fraction, the surface area per unit volume (S_V) and the average of the local mean curvature, were measured directly from the 3D images and compared with the values estimated from analysis of a number of 2D slices combined with stereological relations. When the total length of phase boundary considered in the analysis of the 2D slices (L_Tot) was at least 6000 times the characteristic length of the microstructure (1/S_V), the standard deviation of all the parameters measured became negligible. However, considerable discrepancies between the average values computed from the 3D and 2D images were observed for all values of L_Tot. The mean curvature distribution was also measured from both the 3D images and the 2D slices. The distribution could be estimated from the 2D slices, but its width was about 2.4 times that of the true distribution obtained from the 3D images.
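The stereological relations referred to here are the classical ones linking measurements on 2D sections to 3D quantities; in standard notation (quoted from the general stereological literature, not from this paper):

```latex
% Classical stereological relations for isotropic uniform random sections
% (volume fraction, surface density, length density):
V_V = A_A = P_P, \qquad
S_V = \tfrac{4}{\pi}\, L_A = 2\, P_L, \qquad
L_V = 2\, Q_A .
```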

15.
Larsen, Gundersen & Nielsen. Journal of Microscopy, 1998, 191(3): 238–248.
Existing design-based direct length estimators require random rotation around at least one axis of the tissue specimen prior to sectioning to ensure isotropy of the test probes. In some tissues it is, however, difficult or even impossible to define the region of interest unless the tissue is sectioned in a specific, non-random orientation. Spatial uniform sampling with isotropic virtual planes circumvents the use of physically isotropic or vertical sections. The structure contained in a thick physical section is investigated with software-randomized isotropic virtual planes, placed in volume probes in systematically sampled microscope fields, using computer-assisted stereological analysis. A fixed volume of 3D space in each uniformly sampled field is probed with systematic random, isotropic virtual planes, represented by a line that moves across the computer screen showing live video images of the microscope field as the test volume is scanned with the focal plane. The intersections between the linear structure and the virtual probes are counted with columns of two-dimensional disectors.
Global spatial sampling with sets of isotropic uniform random virtual planes provides a basis for length density estimates from a set of parallel physical sections of any orientation preferred by the investigator, i.e. the simplest sampling scheme in stereology. Additional virtues include optimal conditions for reducing the estimator variance, the possibility of estimating total length directly using a fractionator design, and the potential to estimate the distribution of directions efficiently from a set of parallel physical sections with arbitrary orientation.
Other implementations of the basic idea (systematic uniform sampling using probes that have full 3D × 4π freedom inside the section, and are therefore independent of the position and the orientation of the physical section) are briefly discussed.
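For reference, the relation underlying plane probes is the classical L_V = 2·Q_A, i.e. twice the number of intersections per unit area of isotropic plane; a short sketch with hypothetical numbers:

```python
# Length density from isotropic (virtual) plane probes: L_V = 2 * Q / A,
# with Q = intersection count and A = total plane area applied within the probed volume.
# Hypothetical numbers for illustration only.
Q = 312                      # intersections between the linear structure and the virtual planes
A_um2 = 5.2e4                # total virtual plane area, um^2

L_v = 2.0 * Q / A_um2        # length per unit volume, um / um^3 = um^-2
print(f"Estimated L_V = {L_v:.4f} um^-2")
```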

16.
A general formulation for the secondary fluorescence correction is presented. It is intended to give an intuitive appreciation for the various factors that influence the magnitude of the secondary fluorescence correction, the specimen geometry in particular, and to serve as a starting point for the derivation of quantitative correction formulae. This formulation is primarily intended for the X-ray microanalysis of electron-transparent specimens in the analytical electron microscope (AEM). The fluoresced intensity, I_YX, is expressed relative to the primary intensity of the fluorescing element, I_Y, rather than to that of the fluoresced element, I_X, as has been customary for microanalysis. The importance of this choice of I_Y as a reference intensity for the electron-transparent specimens examined in the AEM is discussed. The various factors entering the secondary fluorescence correction are grouped into three factors, representing the dependencies of the correction on specimen composition, X-ray fluorescence probability and specimen geometry. In principle, an additional factor should be appended to account for the difference in detection efficiencies of the fluoresced and fluorescing X-rays; however, this factor is shown to be within a few per cent of unity for practical applications of the secondary fluorescence correction. The absorption of secondary X-rays leaving the specimen en route to the detector is also accounted for through a single parameter. In the limit that the absorption of secondary X-rays is negligible, the geometric factor has the simple physical interpretation as the fractional solid angle subtended by the fluoresced volume from the perspective of the analysed volume. Studies of secondary fluorescence in the published literature are compared with this physical interpretation. It is shown to be qualitatively consistent with Reed's expression for secondary fluorescence in the electron probe microanalyser and with the specimen-thickness dependence of the Nockolds expression for the parallel-sided thin foil. This interpretation is also used to show that the ‘sec α’ dependence on specimen tilt in the latter expression is erroneous and should be omitted. The extent to which extrapolation methods can be used to correct for secondary fluorescence is also discussed. The notion that extrapolation methods, by themselves, can be used to correct for secondary fluorescence is refuted. However, extrapolation methods greatly facilitate secondary fluorescence correction for wedge-shaped specimens when used in conjunction with correction formulae.

17.
The influence of the cubic phase modulation parameter on Airy beams is studied from both theoretical and experimental perspectives. Theoretical analysis shows that the spectrum corresponding to a cubic phase is an Airy beam, and a cubic phase modulation parameter a3 is introduced to characterize the rate of phase variation. Experimentally, different cubic phase modulation parameters are applied and their influence on the Airy beam is observed. The results show that the cubic phase modulation parameter affects the lobe size, lobe spacing and energy distribution of the Airy beam, and the optimal value of a3 is determined to lie between 2 and 4.
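The statement that the spectrum of a cubic phase is an Airy beam can be checked numerically: imposing exp(i·a3·u³) on a Gaussian-apodized spectrum and Fourier transforming gives the asymmetric lobe structure of a finite-energy Airy beam. A minimal 1D sketch; the parameter name a3 follows the abstract, while everything else is an illustrative assumption:

```python
import numpy as np

# 1D spectral coordinate; Gaussian-apodized spectrum carrying a cubic phase.
u = np.linspace(-5, 5, 4096)
a3 = 3.0                                    # cubic phase modulation parameter
spectrum = np.exp(-(u / 3.0)**2) * np.exp(1j * a3 * u**3)

# The focal-plane (far-field) amplitude is the Fourier transform of the spectrum.
field = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(spectrum)))
intensity = np.abs(field)**2
intensity /= intensity.max()

# The profile is asymmetric: a main lobe followed by decaying side lobes,
# the signature of a finite-energy Airy beam. Count lobes above 5% of the peak.
interior = intensity[1:-1]
is_local_max = (interior > intensity[:-2]) & (interior > intensity[2:])
lobes = int(np.sum(is_local_max & (interior > 0.05)))
print(f"Lobes above 5% of the main peak: {lobes}")
```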

18.
‘Vertical’ sections are plane sections longitudinal to a fixed (but arbitrary) axial direction. Examples are sections of a cylinder parallel to the central axis, and sections of a flat slab normal to the plane of the slab. Vertical sections of any object can be generated by placing the object on a table and taking sections perpendicular to the plane of the table. The standard methods of stereology assume isotropic random sections and are not applicable to this kind of biased sampling. However, by using specially designed test systems, one can obtain an unbiased estimate of surface area. General principles of stereology for vertical sections are outlined. No assumptions are necessary about the shape or orientation distribution of the structure. Vertical section stereology is valid on the same terms as standard stereological methods for isotropic random sections. The range of structural quantities that can be estimated from vertical sections includes V_V, N_V, S_V and the volume-weighted mean particle volume v̄_V, but not L_V. There is complete freedom to choose the vertical axis direction, which makes the sampling procedure simple and ‘natural’. Practical sampling procedures for implementation of the ideas are described and illustrated by examples.
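A minimal sketch of the surface-density estimator used with vertical sections and cycloid test lines (minor axis parallel to the vertical axis), S_V = 2·I/L_T; the counts and test-line length are hypothetical:

```python
# Vertical-sections estimator of surface density with cycloid test lines:
#   S_V = 2 * I / L_T
# where I is the total number of intersections between the surface traces and the
# cycloids, and L_T is the total cycloid test-line length applied at tissue scale.
# Hypothetical numbers for illustration only.
intersections = 184                      # summed over all fields and sections
test_line_length_mm = 46.0               # total cycloid length, mm

s_v = 2.0 * intersections / test_line_length_mm    # surface area per unit volume, mm^2/mm^3
print(f"Estimated S_V = {s_v:.1f} mm^-1")
```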

19.
At present, a model-free, design-based theory of unbiased estimation, and a model-based theory of linear unbiased estimation of minimum variance, are available for stereology. The main developments rest upon the nested scheme {section (quadrat)}, whence the raw data are expressible in terms of area, length and number. The main aim of this paper is to complete the available model-based theory by introducing the step in which sections are analysed by point counting via coherent test systems (CTSs). Using this development, the stereologist should be able to handle raw point and intersection counts optimally, in order to find the best estimator of a ratio R in a given specimen under a wide range of circumstances. The latter include, for instance, the use of different CTSs on different sections and of double CTSs on each section, as well as the case (not uncommon in electron microscopy) in which different sections from the same sample are observed at slightly different magnifications but analysed by quadrats (via automatic or semi-automatic image analysers, for instance) or by CTSs of fixed size. The main conclusion pertaining to the latter case is that the estimators obtained via section-wise magnification corrections are in general superior to those corrected via a global, average magnification. In order to illustrate the methodology, a synthetic numerical example and a real one are given.
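To illustrate the point about magnification corrections, here is a minimal sketch of a point-count ratio estimator in which each section's counts are weighted by the area represented by one test point at that section's own magnification, compared with a single global average magnification. All counts and magnifications are invented:

```python
import numpy as np

# Hypothetical point counts per section: P = points on the phase of interest,
# Q = points on the reference space, each section imaged at its own magnification M.
P = np.array([34, 41, 28, 37])
Q = np.array([210, 245, 190, 225])
M = np.array([9800.0, 10150.0, 9950.0, 10300.0])   # nominal ~10,000x, section-wise values

d_ref = 1.0                       # test-point spacing on the micrograph (arbitrary units)
a_point = (d_ref / M)**2          # real area represented by one test point on each section

# Section-wise corrected ratio estimator: weight counts by the true area per point.
R_sectionwise = np.sum(P * a_point) / np.sum(Q * a_point)

# Global-average-magnification version for comparison (the common factor cancels,
# so this reduces to the raw pooled ratio sum(P) / sum(Q)).
a_global = (d_ref / M.mean())**2
R_global = np.sum(P * a_global) / np.sum(Q * a_global)

print(f"Section-wise corrected R: {R_sectionwise:.4f}")
print(f"Globally corrected R:     {R_global:.4f}")
```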

20.
In this paper, the electrochemical polishing behavior of duplex stainless steel (DSS) in phosphoric-sulfuric mixed acids with volume ratios of 1:1, 2:1 and 3:1 was studied. The electrochemical polishing was conducted at 70°C using a rotating disc electrode. Before polishing, the steel specimens were heated at 1080°C for 10 min and cooled at different rates to obtain dissimilar microstructures. Experimental results show that a bright surface can be obtained on each DSS specimen by polishing in the mixed acids at 70°C. However, the dissolution rates of the α and γ phases in a DSS specimen differ during potentiostatic polishing in the mixed acids, the rate of the α phase being clearly higher than that of the γ phase. This difference in dissolution rate can be reduced by polishing the DSS specimen in a mixed acid with a high H3PO4 content. Small, round σ-phase particles were found to precipitate along the α/γ interface in the DSS specimen obtained by heating at 1080°C and then cooling in the furnace. The precipitation of the σ phase increased the hardness and the microstructural fraction of the γ phase in the DSS specimen. Moreover, the σ phase can be levelled together with the α and γ phases when polishing in the 3:1 (v/v) mixed acid.
