Related Articles
20 related articles found (search time: 31 ms)
1.
The phase space evolution model of Huizenga and Storchi, of Morawska-Kaczyńska and Huizenga, and of Janssen et al has been modified to (i) allow application on currently available computer equipment with limited memory (128 megabytes) and (ii) allow 3D dose calculations based on 3D computed tomographic patient data. This is a further development aimed at the use of the phase space evolution model in radiotherapy electron beam treatment planning. The first modification concerns the application of depth evolution of the phase space state, combined with an alternative method to transport back-scattered electrons. This depth evolution method requires on the order of 15 times less computer memory than the energy evolution method. Results of the previous and new electron transport methods are compared and show that the new transport method for back-scattered electrons hardly affects the accuracy of the calculated dose distributions. The second modification concerns the simulation of electron transport through tissues with varying densities by applying distributed electron transport through similarly composed media with a limited number of fixed densities. Results of non-distributed and distributed electron transport are compared and show that the distributed method likewise hardly affects the accuracy of the calculated dose distributions. It is also shown that the new dose distribution calculations remain in good agreement with, and require significantly less computation time than, results obtained with the EGS4 Monte Carlo method.

2.
When measured data contain damage events of the structure, it is important to extract as much information about the damage as possible from the data. In this paper, two methods are proposed for this purpose. The first method, based on the empirical mode decomposition (EMD), is intended to extract damage spikes due to a sudden change of structural stiffness from the measured data, thereby detecting the damage time instants and damage locations. The second method, based on EMD and the Hilbert transform, is capable of (1) detecting the damage time instants, and (2) determining the natural frequencies and damping ratios of the structure before and after damage. The two proposed methods are applied to a benchmark problem established by the ASCE Task Group on Structural Health Monitoring. Simulation results demonstrate that the proposed methods provide new and useful tools for the damage detection and evaluation of structures.
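The first method lends itself to a compact illustration. The sketch below (our construction, not the authors' code) uses the PyEMD package to decompose a noisy response, flags samples where the first intrinsic mode function (IMF) exceeds a robust MAD-based threshold as candidate damage instants, and includes a Hilbert-transform helper of the kind the second method needs for instantaneous frequency; the threshold k, the library choice, and the synthetic record are assumptions.

```python
import numpy as np
from PyEMD import EMD            # pip install EMD-signal
from scipy.signal import hilbert

def detect_damage_instants(x, t, k=8.0):
    """Flag times where the first IMF exceeds k robust standard deviations."""
    imf1 = EMD().emd(x, t)[0]    # highest-frequency mode should carry the spike
    dev = np.abs(imf1 - np.median(imf1))
    sigma = 1.4826 * np.median(dev)          # robust std via MAD
    return t[dev > k * sigma]

def instantaneous_frequency(imf, fs):
    """Instantaneous frequency (Hz) of one IMF via the Hilbert transform."""
    phase = np.unwrap(np.angle(hilbert(imf)))
    return np.gradient(phase) * fs / (2.0 * np.pi)

# toy record: a slow structural response plus noise, with a sharp transient at
# t = 2 s standing in for the signature of a sudden stiffness change
fs = 200.0
t = np.arange(0.0, 4.0, 1.0 / fs)
x = np.sin(2.0 * np.pi * t) + 0.05 * np.random.default_rng(0).standard_normal(t.size)
x[int(2.0 * fs)] += 1.0
print(detect_damage_instants(x, t))          # should cluster near t = 2 s
```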

3.
Spectral analysis of velocity signals recorded by acoustic Doppler velocimetry (ADV) and contaminated with intermittent spikes remains a challenging task. In this paper, we propose a new method for reconstructing contaminated time series which integrates two previously developed techniques for detecting and replacing spurious spikes. The spikes are first detected using a modified version of the universal phase-space-thresholding technique and subsequently replaced by the last valid data points. The accuracy of the new approach is evaluated by applying it to identify and remove spikes and reconstruct the spectra of two clean data sets which are artificially contaminated with random spikes: (1) a high-quality hot-wire measurement and (2) a numerically simulated velocity time series with a bimodal probability density distribution. The technique is also applied to reconstruct the spectra obtained from intentionally contaminated ADV measurements, which are compared with ADV spectra at the same point in the flow obtained using proper ADV settings. Special emphasis is placed on testing the ability of the technique to reproduce realistic power spectra in flows with rich coherent dynamics. The results show that the power spectra of the reconstructed time series contain filtered white noise caused by the steps introduced when the last valid data point is held. We show that even for a severely contaminated time series, the proposed method can accurately recover the power spectra up to the frequency corresponding to half the mean sampling rate of the valid data points.
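The detect-and-replace loop can be sketched as follows. This is a simplified, axis-aligned variant of universal phase-space thresholding (each of u, Δu, Δ²u is thresholded separately rather than against the full rotated ellipsoid), with spikes held at the last valid data point as in the paper; the pass count and the synthetic record are our assumptions.

```python
import numpy as np

def despike_last_valid(u, passes=3):
    """Detect spikes in phase space (u, du, d2u) and hold the last valid sample."""
    u = np.asarray(u, dtype=float).copy()
    n = u.size
    lam = np.sqrt(2.0 * np.log(n))                 # universal threshold multiplier
    for _ in range(passes):
        du = np.gradient(u)
        d2u = np.gradient(du)
        bad = np.zeros(n, dtype=bool)
        for s in (u, du, d2u):                     # axis-aligned simplification
            dev = np.abs(s - np.median(s))
            bad |= dev > lam * 1.4826 * np.median(dev)
        if not bad.any():
            break
        idx = np.maximum.accumulate(np.where(bad, 0, np.arange(n)))
        u = u[idx]                                 # hold the last valid data point
    return u

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 2000)
clean = np.sin(2.0 * np.pi * 1.5 * t) + 0.1 * rng.standard_normal(t.size)
dirty = clean.copy()
hit = rng.choice(t.size, 40, replace=False)
dirty[hit] += rng.choice([-5.0, 5.0], 40)          # intermittent spikes
print(np.abs(despike_last_valid(dirty) - clean).max())   # residual error, spikes gone
```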

4.
High frequency Space Shuttle liftoff data are treated by autoregressive (AR) and autoregressive-moving-average (ARMA) digital algorithms. These algorithms provide useful information on the spectral densities of the data. Further, they yield spectral models which lend themselves to incorporation into the concept of the random response spectrum. This concept yields a reasonably smooth power spectrum for the design of structural and mechanical systems when the available data bank is limited. Due to the nonstationarity of the liftoff event, the pertinent data are split into three slices. Each slice is associated with a rather distinguishable phase of the liftoff event, in which stationarity can be expected. The presented results are preliminary in nature; they aim to call attention to the availability of the discussed concepts and to the need to augment the Space Shuttle data bank as more flights are completed.
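For the AR half of the toolchain, a minimal Yule-Walker sketch conveys the idea: fit AR(p) coefficients to one stationary data slice and evaluate the parametric spectral density. The implementation below is a generic textbook version, not the authors' algorithm, and the AR(2) test process is invented.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

def ar_psd(x, p, freqs, fs=1.0):
    """Yule-Walker AR(p) fit, then the parametric power spectral density."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = x.size
    r = np.array([x[: n - k] @ x[k:] for k in range(p + 1)]) / n   # autocovariances
    a = solve_toeplitz(r[:p], r[1 : p + 1])        # Yule-Walker equations
    sigma2 = r[0] - a @ r[1 : p + 1]               # innovation variance
    z = np.exp(-2j * np.pi * np.outer(freqs / fs, np.arange(1, p + 1)))
    return sigma2 / fs / np.abs(1.0 - z @ a) ** 2

# AR(2) test process with a spectral peak near 0.076 cycles/sample
rng = np.random.default_rng(1)
x = np.zeros(4096)
for i in range(2, x.size):
    x[i] = 1.6 * x[i - 1] - 0.81 * x[i - 2] + rng.standard_normal()
f = np.linspace(0.0, 0.5, 512)
print(f[np.argmax(ar_psd(x, p=2, freqs=f))])       # close to the pole frequency
```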

5.
The present article outlines the advantages of the participant event monitoring methodology for the investigation of unpredictable, low-base-rate events in children. Several methods for assessing the quality of participant event monitoring data are advanced, illustrated with a data set in which 61 children and their mothers monitored the children's minor injuries. Child–mother correspondence and debriefing data suggest good accuracy for frequency estimates. Home- and laboratory-based simulations illustrate the participant event monitors' accuracy for major details. Traditional measures of data quality show good overall coder and test–retest reliability, and cross-observer reports show acceptable estimates of validity for objective aspects of the events and the expected lower estimates for the more subjective aspects. Conceptual and pragmatic difficulties of the method are considered, and suggestions for future research are advanced.

6.
We propose a new concept for analyzing EEG/MEG data. The concept is based on a projection of the spatiotemporal signal into the relevant phase space and the interpretation of the brain dynamics in terms of dynamical systems theory. The projection is obtained by a simultaneous determination of spatial modes and coefficients of differential equations. The resulting spatiotemporal model can be characterized by stationary points and corresponding potential field maps. Brain information processing can be interpreted by attraction and repulsion of spatial field distributions given by these stationary points. This allows an objective and quantitative characterization of the brain dynamics. We outline this concept and the underlying algorithm. Results of the application of this method to an event related potential (ERP) study of auditory memory processes are discussed.

7.
Field inhomogeneities or susceptibility variations produce blurring in images acquired using non-2DFT k-space readout trajectories. This problem is more pronounced for sequences with long readout times, such as spiral imaging. Theoretical and practical correction methods based on an acquired field map have been reported in the past. This paper introduces a new correction method based on the existing concept of frequency-segmented correction but which is faster and theoretically more accurate. It consists of reconstructing the data at several frequencies to form a set of base images that are then added together with spatially varying linear coefficients derived from the field map. The new algorithm is applied to phantom and in vivo images acquired with projection reconstruction and spiral sequences, yielding sharply focused images.
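A hedged 1D sketch of frequency-segmented correction, using a Cartesian readout as a stand-in for the spiral and projection trajectories of the paper: the raw data are reconstructed at several constant demodulation frequencies, and the base images are combined per pixel with linear-interpolation weights from the field map (a simplification of the paper's spatially varying linear coefficients). Object, readout timing, and field map are invented.

```python
import numpy as np

def freq_segmented_correction(kdata, t_read, fmap_hz, nbins=8):
    """Base images demodulated at fixed frequencies, combined per pixel with
    linear-interpolation weights taken from the field map."""
    f_bins = np.linspace(fmap_hz.min(), fmap_hz.max(), nbins)
    base = np.array([np.fft.ifft(kdata * np.exp(2j * np.pi * f * t_read))
                     for f in f_bins])
    pos = np.clip((fmap_hz - f_bins[0]) / (f_bins[1] - f_bins[0]),
                  0.0, nbins - 1 - 1e-9)
    lo = pos.astype(int)                           # lower neighbouring bin
    w = pos - lo                                   # interpolation weight
    px = np.arange(fmap_hz.size)
    return (1.0 - w) * base[lo, px] + w * base[lo + 1, px]

# simulate a 1D acquisition with a long readout and strong off-resonance blur
n = 256
obj = np.zeros(n); obj[30:60] = 1.0
t_read = np.arange(n) * 32e-6                      # ~8 ms readout
fmap = np.linspace(-150.0, 150.0, n)               # field map in Hz
xpix = np.arange(n)
enc = np.exp(-2j * np.pi * (np.outer(xpix, xpix) / n + np.outer(t_read, fmap)))
kdata = enc @ obj                                  # encoding incl. off-resonance

print(np.abs(np.fft.ifft(kdata) - obj).max(),                            # blurred
      np.abs(freq_segmented_correction(kdata, t_read, fmap) - obj).max())  # corrected
```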

8.
The molecular interactions implicated in the mammalian G1/S cell cycle phase transition comprise a highly nonlinear network which can produce seemingly paradoxical results and make intuitive interpretations unreliable. A new approach to this problem is presented, consisting of (1) a convention of unambiguous reaction diagrams, (2) a convenient computer simulation method, and (3) a quasi-evolutionary method of probing the functional capabilities of simplified components of the network. Simulations were carried out for a sequence of hypothetical primordial systems, beginning with the simplest plausibly functional system. The complexity of the system was then increased in small steps, such that functionality was added at each step. The results suggested new functional concepts: (1) Rb-family proteins could store E2F in a manner analogous to the way a condenser stores electric charge, and, upon phosphorylation, release a large wave of active E2F; (2) excessive or premature cyclin-dependent kinase activities could paradoxically impair E2F activity during the G1/S transition period. The results show how network simulations, carried out by means of the methods described, can assist in the design and interpretation of experiments probing the control of the G1/S phase transition.

9.
Latent heat storage technology, with phase change materials (PCMs) at its core, plays a key role in accelerating the development of new energy sources and improving energy utilization efficiency. Using calcium oleate as a precursor, hydroxyapatite (HAP) aerogels with a self-supporting network structure were synthesized by a hydrothermal method, and self-supporting HAP composite phase change materials were prepared by impregnation. The morphology, stability, and thermal properties of the prepared composite PCMs were characterized by scanning electron microscopy, Fourier-transform infrared spectroscopy, X-ray diffraction, thermogravimetric analysis, and differential scanning calorimetry. The experimental results show that HAP aerogel composite PCMs loaded with either paraffin or octadecanol exhibit good thermal performance: the measured melting and solidification enthalpies of the 60 wt% paraffin@HAP aerogel composite are 85.10 and 85.30 J·g⁻¹, respectively, with a crystallinity of 81.50%, while those of the 60 wt% octadecanol@HAP aerogel composite are 113.78 and 112.25 J·g⁻¹, with a crystallinity of 86.20%, together with excellent thermal and chemical stability. In addition, the HAP aerogel support is flame retardant, non-corrosive, safe, and environmentally friendly, which effectively broadens the practical application of PCMs in fields such as smart thermal-insulating textiles and building materials.

10.
In diffraction tomography, the spatial distribution of the scattering object is reconstructed from the measured scattered data. For a scattering object that is illuminated with plane-wave radiation, under the condition of weak scattering one can invoke the Born (or the Rytov) approximation to linearize the equation for the scattered field (or the scattered phase) and derive a relationship between the scattered field (or the scattered phase) and the distribution of the scattering object. Reconstruction methods such as the Fourier domain interpolation methods and the filtered backpropagation method have been developed previously. However, the underlying relationships among these methods, and their noise properties, are not evident. We introduce the concepts of ideal and modified sinograms. Analysis of the relationships between, and the noise properties of, the two sinograms reveals infinite classes of methods for image reconstruction in diffraction tomography that include the previously proposed methods as special members. The methods in these classes are mathematically identical, but they respond to noise and numerical errors differently.

11.
MOTIVATION: Hidden Markov models can efficiently and automatically build statistical representations of related sequences. Unfortunately, training sets are frequently biased toward one subgroup of sequences, leading to an insufficiently general model. This work evaluates sequence weighting methods based on the maximum-discrimination idea. RESULTS: One good method scales sequence weights by an exponential that ranges between 0.1 for the best scoring sequence and 1.0 for the worst. Experiments with a curated data set show that while training with one or two sequences performed worse than single-sequence Probabilistic Smith-Waterman (PSW), training with five or ten sequences reduced errors by 20% and 51%, respectively. This new version of the SAM HMM suite outperforms HMMer (17% reduction over PSW for 10 training sequences), Meta-MEME (28% reduction), and unweighted SAM (31% reduction). AVAILABILITY: A WWW server, as well as information on obtaining the Sequence Alignment and Modeling (SAM) software suite and additional data from this work, can be found at http://www.cse.ucsc.edu/research/compbio/sam.html
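The quoted weighting rule is easy to state concretely. In the sketch below (our reading of the abstract, with placeholder numbers standing in for per-sequence HMM scores), weights run exponentially from 0.1 for the best-scoring sequence to 1.0 for the worst, so the hardest-to-model sequences dominate the next training round.

```python
import numpy as np

def max_discrimination_weights(scores):
    """Exponential weights: 1.0 for the worst-scoring sequence, 0.1 for the best."""
    s = np.asarray(scores, dtype=float)        # higher score = better fit to model
    u = (s - s.min()) / (s.max() - s.min())    # 0 at the worst sequence, 1 at the best
    return 0.1 ** u

print(max_discrimination_weights([12.0, 3.5, 7.1, -2.0]))
# -> weight 1.0 for the -2.0 (worst) sequence, 0.1 for the 12.0 (best) one
```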

12.
There are now replicated findings that posttraumatic stress disorder (PTSD) symptoms related to the September 11, 2001, attacks occurred in large numbers of persons who did not fit the traditional definition of exposure to a traumatic event. These data are not explained by traditional epidemiologic "bull's eye" disaster models, which assume the psychological effects are narrowly and geographically circumscribed, or by existing models of PTSD onset. In this article, the authors develop a researchable model to explain these and other terrorism-related phenomena by synthesizing research and concepts from the cognitive science, risk appraisal, traumatic stress, and anxiety disorders literatures. They propose the new term relative risk appraisal to capture the psychological function that is the missing link between the event and subjective response in these and other terrorism-related studies to date. Relative risk appraisal highlights the core notion from cognitive science that human perception is an active, multidimensional process, such that for unpredictable societal threats, proximity to the event is only one of several factors that influence behavioral responses. Addressing distortions in relative risk appraisal effectively could reduce individual and societal vulnerability to a wide range of adverse economic and ethnopolitical consequences of terrorist attacks. The authors present ways in which these concepts and related techniques can be helpful in treating persons with September 11- or terrorism-related distress or psychopathology.

13.
Three experiments examined the incidental remembering of event durations. In each study, subjects engaged in an initial learning phase in which they performed a set of perceptual ratings on events for a varying number of trials. These events consisted of tonal sequences or ecological sounds that varied in their internal structure and ending. Subjects were then given a surprise memory task in which they were asked to recognize the duration of each event (Experiments 1 and 3) or extrapolate its completion (Experiment 2). Results showed that in contrast to irregularly timed events, those filled with regularly timed or continuous pitch information yielded high levels of accuracy that increased with greater learning experience. In addition, durations marked by a nonarbitrary ending were more accurately remembered than those marked by an arbitrary ending, which, in fact, were misremembered as shorter than their actual duration. These findings are discussed in terms of an approach that emphasizes the role of event structure in perceiving and remembering activities.

14.
We present two methods for designing amino acid sequences of proteins that will fold to have good hydrophobic cores. Given the coordinates of the desired target protein or polymer structure, the methods generate sequences of hydrophobic (H) and polar (P) monomers that are intended to fold to these structures. One method designs hydrophobic inside, polar outside; the other minimizes an energy function in a sequence evolution process. The sequences generated by these methods agree at the level of 60-80% of the sequence positions in 20 proteins in the Protein Data Bank. A major challenge in protein design is to create sequences that can fold uniquely, i.e. to a single conformation rather than to many. While an earlier lattice-based sequence evolution method was shown not to design unique folders, our method generates unique folders in lattice model tests. These methods may also be useful in designing other types of foldable polymer not based on amino acids.
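The first design rule ("hydrophobic inside, polar outside") reduces to estimating burial from the target coordinates. A minimal sketch, assuming a contact-count proxy for burial with an 8 Å cutoff and a fixed buried fraction (our choices, not the paper's):

```python
import numpy as np

def hp_design(coords, cutoff=8.0, burial_frac=0.25):
    """Assign H to the most buried fraction of positions, P to the rest."""
    xyz = np.asarray(coords, dtype=float)
    d = np.linalg.norm(xyz[:, None, :] - xyz[None, :, :], axis=-1)
    contacts = (d < cutoff).sum(axis=1) - 1          # neighbours, excluding self
    threshold = np.quantile(contacts, 1.0 - burial_frac)
    return "".join("H" if c >= threshold else "P" for c in contacts)

# toy "structure": a 3x3x3 lattice with 3.8 A spacing, so central positions
# collect many contacts (H) while corners collect few (P)
g = 3.8 * np.arange(3)
coords = np.array([[i, j, k] for i in g for j in g for k in g])
print(hp_design(coords))
```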

15.
In standard time-to-event or survival analysis, occurrence times of the event of interest are observed exactly or are right-censored, meaning that it is only known that the event occurred after the last observation time. There are numerous methods available for estimating the survival curve and for testing and estimation of the effects of covariates in this context. In some situations, however, the times of the events of interest may only be known to have occurred within an interval of time. In clinical trials, for example, patients are often seen at pre-scheduled visits but the event of interest may occur in between visits. These data are interval-censored. Owing to the lack of well-known statistical methodology and available software, a common ad hoc approach is to assume that the event occurred at the end (or beginning or midpoint) of each interval, and then apply methods for standard time-to-event data. However, this approach can lead to invalid inferences, and in particular will tend to underestimate the standard errors of the estimated parameters. The purpose of this tutorial is to illustrate and compare available methods which correctly treat the data as being interval-censored. It is not meant to be a full review of all existing methods, but only those which are available in standard statistical software, or which can be easily programmed. All approaches will be illustrated on two data sets and compared with methods which ignore the interval-censored nature of the data. We hope this tutorial will allow those familiar with the application of standard survival analysis techniques the option of applying appropriate methods when presented with interval-censored data.
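The tutorial's central contrast, midpoint imputation versus a genuinely interval-censored likelihood, can be reproduced in a few lines with the lifelines package, whose parametric fitters expose fit_interval_censoring; the Weibull model, visit schedule, and simulated data below are placeholders, not the tutorial's data sets.

```python
import numpy as np
from lifelines import WeibullFitter

rng = np.random.default_rng(2)
true_times = 10.0 * rng.weibull(1.5, 500)          # latent event times
visits = np.arange(0.0, 40.0, 3.0)                 # pre-scheduled visits
i = np.clip(np.searchsorted(visits, true_times), 1, visits.size - 1)
lower, upper = visits[i - 1], visits[i]            # event known only to an interval

ic = WeibullFitter().fit_interval_censoring(lower, upper)
mid = WeibullFitter().fit((lower + upper) / 2.0)   # common ad hoc midpoint fit

print("interval-censored fit:", ic.lambda_, ic.rho_)
print("midpoint-imputed fit :", mid.lambda_, mid.rho_)
```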

16.
张函  钱权  武星 《工程科学学报》2023,45(7):1232-1237
Because materials are produced and measured under varying environments and conditions, materials data used for machine learning tend to be noisy. Labeling such data requires domain knowledge and skill, so annotation costs are comparatively high. These two factors pose a major challenge for applying machine learning in materials science. To address this challenge, an active regression learning method is proposed, consisting of an outlier detection module, a greedy sampling module, and a minimum-change sampling module. Compared with other active learning methods, this method integrates an outlier detection mechanism that selects high-quality samples while effectively excluding the influence of noisy data, avoiding sunk costs. Comparative experiments against state-of-the-art active regression learning methods on public and proprietary data sets show that, for the same amount of training data, the task models trained with this method improve performance metrics by 15% on average, and that only 30%-40% of the data is needed as a training set to match or even exceed the accuracy of a task model trained on all the data.
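Two of the three modules are straightforward to sketch; the minimum-change sampling module is omitted here, and scikit-learn's IsolationForest is our stand-in for the paper's outlier detector, so this is an illustration of the idea rather than the authors' method:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

def greedy_sample(X, n_queries, seed=0):
    """Filter outliers, then pick points that maximize distance to those chosen."""
    rng = np.random.default_rng(seed)
    keep = IsolationForest(random_state=seed).fit_predict(X) == 1   # drop outliers
    pool = np.where(keep)[0]
    chosen = [pool[rng.integers(pool.size)]]
    for _ in range(n_queries - 1):
        d = np.linalg.norm(X[pool, None, :] - X[None, chosen, :], axis=-1).min(axis=1)
        d[np.isin(pool, chosen)] = -np.inf           # never re-pick a chosen point
        chosen.append(pool[np.argmax(d)])
    return np.array(chosen)

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(size=(200, 2)), [[8.0, 8.0]]])   # one obvious outlier
print(greedy_sample(X, 10))     # index 200 (the outlier) should not be selected
```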

17.
RT O'Neill 《Canadian Metallurgical Quarterly》1998,17(15-16):1851-8; discussion 1859-62
This paper deals with a conceptual discussion of a variety of statistical concepts, methods and strategies that are relevant to the quantitative assessment of risk derived from safety data collected during the pre- and post-marketing phase of a new drug's life cycle. A call is made for the use of more standard approaches to the analysis of safety data that are statistically and epidemiologically rigorous and for attempts to link the strategies for pre-market safety assessment with strategies for post-market safety evaluation. This link may be facilitated by recognizing the limitations and complementary roles played by pre- and post-market safety data collection schemes and by linking the quantitative analyses utilized for either exploratory or confirmatory purposes of risk assessment in each phase of safety data collection. Examples are provided of studies specifically designed to evaluate risk in a post-approval setting, and several available guidelines intended to improve the quality of these studies are discussed.

18.
This paper presents a new approach to analyzing water distribution networks during a contamination event. Previous computer models for predicting the extent of contamination spread in water distribution networks are demand-driven models. The new approach makes use of supervisory control and data acquisition (SCADA) data to create connectivity matrices, which encapsulate the worst-case projection of the potential spread of contamination obtained by combining the effects of all possible scenarios. Two methods for creating connectivity matrices are described, the first based on operating modes, and the second on fundamental paths. Both methods produce identical results, although the method of fundamental paths is more efficient computationally. The connectivity- and hydraulic-based approaches are compared using an example problem.
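One simple reading of the worst-case connectivity matrix can be coded directly: build a directed adjacency matrix per operating mode, take each mode's transitive closure (who can reach whom), OR the closures, and close the union again so contamination handed across mode changes is also captured. The 4-node network and its two modes are invented for illustration, not taken from the paper.

```python
import numpy as np

def closure(A):
    """Boolean reachability matrix (transitive closure) by repeated squaring."""
    R = A | np.eye(A.shape[0], dtype=bool)
    for _ in range(int(np.ceil(np.log2(A.shape[0]))) + 1):
        R = R | (R.astype(int) @ R.astype(int) > 0)
    return R

# two operating modes of a toy 4-node network; entry (i, j) means water flows
# directly from node i to node j while that mode is active
mode1 = np.array([[0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 0],
                  [0, 0, 1, 0]], dtype=bool)
mode2 = np.array([[0, 0, 0, 1],
                  [0, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 0]], dtype=bool)

worst = closure(closure(mode1) | closure(mode2))
print(worst.astype(int))       # worst[i, j] == 1: node j reachable from node i
```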

19.
We propose a new method for detecting conserved RNA secondary structures in a family of related RNA sequences. Our method is based on a combination of thermodynamic structure prediction and phylogenetic comparison. In contrast to purely phylogenetic methods, our algorithm can be used for small data sets of approximately 10 sequences, efficiently exploiting the information contained in the sequence variability. The procedure constructs a prediction only for those parts of sequences that are consistent with a single conserved structure. Our implementation produces reasonable consensus structures without user interference. As an example we have analysed the complete HIV-1 and hepatitis C virus (HCV) genomes as well as the small segment of hantavirus. Our method confirms the known structures in HIV-1 and predicts previously unknown conserved RNA secondary structures in HCV.
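The paper's combination of thermodynamics and phylogenetic comparison is not packaged as a one-liner, but a closely related consensus-folding routine from the same research lineage is exposed by the ViennaRNA Python bindings (assumed installed as the `RNA` module); the toy alignment is invented, and this illustrates consensus folding rather than the paper's exact algorithm.

```python
import RNA   # Python bindings of the ViennaRNA package

# three invented aligned sequences (equal length, '-' marks a gap)
aln = ["GGGAAAUCC-UGAUCCC",
       "GGCAAAUCCAUGAUGCC",
       "GGGAAAUCC-UGAUCCC"]

structure, energy = RNA.alifold(aln)
print(structure, energy)         # consensus structure in dot-bracket notation
```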

20.
A statistical method was developed for reconstructing the nucleotide or amino acid sequences of extinct ancestors, given the phylogeny and sequences of the extant species. A model of nucleotide or amino acid substitution was employed to analyze data of the present-day sequences, and maximum likelihood estimates of parameters such as branch lengths were used to compare the posterior probabilities of assignments of character states (nucleotides or amino acids) to interior nodes of the tree; the assignment having the highest probability was the best reconstruction at the site. The lysozyme c sequences of six mammals were analyzed by using the likelihood and parsimony methods. The new likelihood-based method was found to be superior to the parsimony method. The probability that the amino acids for all interior nodes at a site reconstructed by the new method are correct was calculated to be 0.91, 0.86, and 0.73 for all, variable, and parsimony-informative sites, respectively, whereas the corresponding probabilities for the parsimony method were 0.84, 0.76, and 0.51, respectively. The probability that an amino acid in an ancestral sequence is correctly reconstructed by the likelihood analysis ranged from 91.3 to 98.7% for the four ancestral sequences.
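The reconstruction step itself is a small Bayes computation once branch lengths are estimated. Below is a worked toy instance for one site on a 3-leaf star tree under the Jukes-Cantor model: the posterior of each nucleotide at the interior node is proportional to the prior times the product of branch transition probabilities, and the argmax is the reconstruction. Branch lengths and leaf states are invented; real use would take ML branch lengths from the data, as in the paper.

```python
import numpy as np

BASES = "ACGT"

def jc_prob(b):
    """Jukes-Cantor transition matrix for branch length b (substitutions/site)."""
    e = np.exp(-4.0 * b / 3.0)
    P = np.full((4, 4), 0.25 * (1.0 - e))
    np.fill_diagonal(P, 0.25 + 0.75 * e)
    return P

def marginal_ancestor(leaf_states, branch_lengths):
    """Posterior over the interior node of a star tree, given its leaves."""
    post = np.full(4, 0.25)                        # uniform stationary prior
    for s, b in zip(leaf_states, branch_lengths):
        post = post * jc_prob(b)[:, BASES.index(s)]
    return post / post.sum()

post = marginal_ancestor(["A", "A", "G"], [0.1, 0.2, 0.4])
best = BASES[int(np.argmax(post))]
print(dict(zip(BASES, np.round(post, 3))), "-> reconstruct", best)
# the two short branches agreeing on A make A the highest-posterior state
```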

