Similar Documents
20 similar documents found.
1.
New directions in medical and biomedical sciences have gradually emerged in recent years that will change the way diseases are diagnosed and treated, and they are redirecting medicine toward patient-specific treatments. We refer to these new approaches for studying biomedical systems as predictive medicine, a new version of medical science that involves the use of advanced computer models of biomedical phenomena, high-performance computing, new experimental methods for model data calibration, modern imaging technologies, cutting-edge numerical algorithms for treating large stochastic systems, modern methods for model selection, calibration, validation, verification, and uncertainty quantification, and new approaches for drug design and delivery, all based on predictive models. The methodologies are designed to study events at multiple scales: from genetic data, to sub-cellular signaling mechanisms, to cell interactions, to tissue physics and chemistry, to organs in living human subjects. The present document surveys work on the development and implementation of predictive models of vascular tumor growth, covering aspects of what might be called modeling- and experimentally-based computational oncology. The work described is that of a multi-institutional team, centered at ICES with strong participation by members at the M. D. Anderson Cancer Center and the University of Texas at San Antonio. This exposition covers signaling models, cell and cell-interaction models, tissue models based on multi-species mixture theories, models of angiogenesis, and beginning work on drug effects. A number of new parallel computer codes are presented for implementing finite-element methods, multi-level Markov chain Monte Carlo sampling methods, data classification methods, stochastic PDE solvers, statistical inverse algorithms for model calibration and validation, and models of events at different spatial and temporal scales. Importantly, new methods for model selection in the presence of uncertainty, fundamental to predictive medical science, are described; these are based on the notion of Bayesian model plausibilities. Also, as part of this general approach, new codes are described for determining the sensitivity of model outputs to variations in model parameters; they provide a basis for assessing the importance of model parameters and for controlling and reducing the number of relevant parameters. Model-specific data are to be made accessible through careful, model-specific experimental platforms in the Tumor Engineering Laboratory. We describe parallel computer platforms on which large-scale calculations are run, as well as specific time-marching algorithms needed to treat the stiff systems encountered in some phase-field mixture models. We also cover new non-invasive imaging and data classification methods that provide in vivo data for model validation. The study concludes with a brief discussion of future work and open challenges.
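To make the notion of Bayesian model plausibilities concrete, here is a minimal sketch, not the team's code: two invented one-parameter growth models are ranked by posterior plausibility, computed from their marginal likelihoods (evidences) under equal prior model probabilities. The models, data, and noise level are all illustrative assumptions.

```python
import numpy as np

def log_likelihood(data, prediction, sigma=0.5):
    # i.i.d. Gaussian measurement noise with known standard deviation.
    n = data.size
    return (-0.5 * np.sum((data - prediction) ** 2) / sigma**2
            - n * np.log(sigma * np.sqrt(2.0 * np.pi)))

def evidence(data, model, thetas):
    # Marginal likelihood p(data | model): rectangle-rule quadrature
    # over a uniform prior on the parameter grid.
    liks = np.exp([log_likelihood(data, model(th)) for th in thetas])
    d_theta = thetas[1] - thetas[0]
    prior = 1.0 / (thetas[-1] - thetas[0])
    return np.sum(liks) * prior * d_theta

# Synthetic "tumor volume" observations (made up for illustration).
times = np.linspace(0.0, 2.0, 8)
rng = np.random.default_rng(0)
data = 0.5 * np.exp(0.9 * times) + rng.normal(0.0, 0.1, times.size)

# Two competing one-parameter growth models (hypothetical).
models = {
    "exponential": lambda r: 0.5 * np.exp(r * times),
    "logistic":    lambda r: 10.0 / (1.0 + 19.0 * np.exp(-r * times)),
}

thetas = np.linspace(0.01, 2.0, 500)
ev = {name: evidence(data, m, thetas) for name, m in models.items()}
total = sum(ev.values())  # equal prior plausibility assumed for each model
for name, e in ev.items():
    print(f"posterior plausibility of {name}: {e / total:.3f}")
```

The model with the higher plausibility is the one better supported by the data after averaging over its parameter uncertainty, which is the quantity the model-selection methods above are built around.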

2.
3.
As international interchanges increase in importance, military, corporate, and humanitarian leaders urgently need to capture national and regional differences. Computational modeling promises to be a powerful tool for representing complex tasks and for testing the validity of these representations. Models may also support decision aids and serve as platforms for training. This essay discusses the need to extend current modeling approaches to better accommodate complexity and to include the variation in behavior, roles, values, and cognition that influences international interchanges.

4.
Personalized medicine is an emerging field, considered by many in the biomedical community to be among the most promising upcoming approaches to medical treatment. To embrace this new challenge, physicians need a better understanding of the biological processes in the human body, as well as precise diagnostic tools and patient-specific treatments. In response, the last three decades have witnessed a major shift in tissue engineering development, from treating bone tissue only at the macro scale to treating it at complex multiscale levels. Researchers have begun striving for a better understanding of bone structure and mechanics, and then applying this knowledge in designing new medical treatments and procedures. Today, computational methods, including finite element analyses, are the tools of choice for biomechanical research on bone tissue. Moreover, bone multiscale modeling can become a vital part of a comprehensive computerized diagnostic system for patient-specific treatment of metabolic bone diseases, fractures, and bone cancer. This review paper describes the state of the art in multiscale computational methods used in analyzing bone tissue. The discussed methods and techniques can serve as a base for the creation of such an envisioned diagnostic system.

5.
6.
The aim of this paper is to provide a general review of the computational models of the human foot. The field of computational simulation in biomechanics has advanced significantly in the last three decades, and the medicine and engineering fields increasingly collaborate to analyze biological systems. This study seeks a link between the two areas of knowledge to foster a better understanding between clinicians and researchers. The review includes two-dimensional and three-dimensional, detailed and simplified, partial- and full-shape models of the lower limb, ankle, and foot. Practical issues in computational modeling, tissue constitutive model approaches, and pioneering applications are extensively discussed. Recent challenges and future guidelines in the field of foot computational simulation are outlined. Although this study is focused on foot modeling, the main ideas can be employed to evaluate other parts of the body. The advances in computational foot modeling can aid in reliable simulations and analyses of foot pathologies, which are promising as modern tools of personalized medicine.

7.
8.
This research aims to support collaborative distance learners by demonstrating how a probabilistic machine learning method can be used to model and analyze online knowledge sharing interactions. The approach applies Hidden Markov Models and Multidimensional Scaling to analyze and assess sequences of coded online student interaction. These analysis techniques were used to train a system to dynamically recognize (1) when students are having trouble learning the new concepts they share with each other, and (2) why they are having trouble. The results of this research may assist an instructor or intelligent coach in understanding and mediating situations in which groups of students collaborate to share their knowledge.
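The abstract names Hidden Markov Models over coded interaction sequences. A minimal sketch of the forward algorithm shows how such a model scores a sequence; the states, observation codes, and probabilities below are our own toy choices, not the paper's coding scheme.

```python
import numpy as np

# Toy HMM: hidden states = {on-track, struggling}; observations are coded
# interaction acts, e.g. 0=explain, 1=question, 2=off-topic. All numbers
# below are invented for illustration.
pi = np.array([0.8, 0.2])             # initial state distribution
A  = np.array([[0.9, 0.1],            # state transition matrix
               [0.3, 0.7]])
B  = np.array([[0.6, 0.3, 0.1],       # emission probabilities per state
               [0.2, 0.3, 0.5]])

def forward_log_likelihood(obs):
    """Forward algorithm: log p(obs | model), with per-step scaling
    to avoid numerical underflow on long sequences."""
    alpha = pi * B[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        log_lik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_lik

# Comparing such likelihoods across models trained on "productive" vs.
# "struggling" sessions is one way to flag groups having trouble.
print(forward_log_likelihood([0, 1, 2, 2, 2, 1]))
```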

9.
This paper describes a computational technique for modeling and visualizing the dynamical behaviors of complex systems in phase space. The technique employs a novel idea of flow pipes to model trajectory bundles that exhibit the same qualitative features. It parses the continuous phase space of a dynamical system, consisting of an infinite number of individual trajectories, into a manageable discrete collection of flow pipes that a computer can efficiently reason about. The technique provides a computational way for both machines and humans to visualize and manipulate the dynamics of a physical system. The flow-pipe modeling technique is implemented in a program called MAPS and has been applied to automatic control synthesis, in which programs automatically analyze and design high-performance, global controllers.
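A hedged sketch of the core idea, not the MAPS implementation: integrate a bundle of nearby trajectories of a toy system and group those sharing a qualitative feature (here, the equilibrium basin they settle into) as one "flow pipe". System, parameters, and grouping criterion are our own.

```python
import numpy as np

def damped_pendulum(state, g=9.8, L=1.0, c=0.5):
    # Phase-space vector field: state = (theta, omega).
    theta, omega = state
    return np.array([omega, -c * omega - (g / L) * np.sin(theta)])

def rk4_trajectory(x0, dt=0.01, steps=2000):
    # Fixed-step 4th-order Runge-Kutta integration of one trajectory.
    xs = [np.asarray(x0, dtype=float)]
    for _ in range(steps):
        x = xs[-1]
        k1 = damped_pendulum(x)
        k2 = damped_pendulum(x + 0.5 * dt * k1)
        k3 = damped_pendulum(x + 0.5 * dt * k2)
        k4 = damped_pendulum(x + dt * k3)
        xs.append(x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.array(xs)

# Sample a bundle of initial conditions and bucket trajectories by a crude
# qualitative feature: which stable equilibrium (theta = 2*pi*k) they reach.
rng = np.random.default_rng(1)
pipes = {}
for x0 in rng.uniform([-6.0, -2.0], [6.0, 2.0], size=(50, 2)):
    traj = rk4_trajectory(x0)
    basin = round(traj[-1, 0] / (2 * np.pi))
    pipes.setdefault(basin, []).append(traj)

for basin, trajs in sorted(pipes.items()):
    print(f"flow pipe around theta = {basin} * 2*pi: {len(trajs)} trajectories")
```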

10.
Computational Economics - The knowledge-based economy is the basis of economics in which all businesses and industries benefit from the distribution and application of knowledge in pursuit of their...

11.
In this paper, I argue that computationalism is a progressive research tradition. Its metaphysical assumptions are that nervous systems are computational, and that information processing is necessary for cognition to occur. First, the primary reasons why information processing should explain cognition are reviewed. Then I argue that early formulations of these reasons are outdated. However, by relying on the mechanistic account of physical computation, they can be recast in a compelling way. Next, I contrast two computational models of working memory to show how modeling has progressed over the years. The methodological assumptions of new modeling work are best understood in the mechanistic framework, which is evidenced by the way in which models are empirically validated. Moreover, the methodological and theoretical progress in computational neuroscience vindicates the new mechanistic approach to explanation, which, at the same time, justifies the best practices of computational modeling. Overall, computational modeling is deservedly successful in cognitive (neuro)science. Its successes are related to deep conceptual connections between cognition and computation. Computationalism is not only here to stay; it grows stronger every year.

12.
We characterize the classes of languages over finite alphabets which may be described by P automata, i.e., accepting P systems with communication rules only. Motivated by properties of natural computing systems, and the actual behavior of P automata, we study computational complexity classes with a certain restriction on the use of the available workspace in the course of computations and relate these to the language classes described by P automata. We prove that if the rules of the P system are applied sequentially, then the accepted language class is strictly included in the class of languages accepted by one-way Turing machines with a logarithmically bounded workspace, and if the rules are applied in the maximally parallel manner, then the class of context-sensitive languages is obtained.
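Stated compactly (the shorthand below is ours, not the paper's): writing $\mathcal{L}_{\mathrm{seq}}(\mathrm{PA})$ and $\mathcal{L}_{\mathrm{maxpar}}(\mathrm{PA})$ for the language classes accepted by P automata under sequential and maximally parallel rule application, and $\mathrm{1DLOG}$ for the languages of one-way Turing machines with logarithmically bounded workspace, the two results read

$$\mathcal{L}_{\mathrm{seq}}(\mathrm{PA}) \subsetneq \mathrm{1DLOG}, \qquad \mathcal{L}_{\mathrm{maxpar}}(\mathrm{PA}) = \mathrm{CS},$$

where $\mathrm{CS}$ is the class of context-sensitive languages.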

13.
Journal of Computer and Systems Sciences International - We consider a methodology for organizing hybrid computing structures to simulate abrupt changes in controlled natural processes and...

14.
Machine Intelligence Research - Objective image quality assessment (IQA) plays an important role in various visual communication systems, which can automatically and efficiently predict the...

15.
In studies of the evolution of cell growth in cell biology, the entire growth process of a cell must be recorded, yet interference from the external environment often distorts these recordings. To address this problem, thirty images taken during the growth of Coleochaete (鞘毛藻属) cells are used: the C-V (Chan-Vese) model segments the cells, each image is cropped and normalized about the centroid of the cell region, and a moving-least-squares image deformation method combined with image fusion then produces a series of interpolated images, from which a video simulating the growth process is generated. This image-based evolution yields a fairly realistic model of the complete cell growth process, reduces the workload of recording cell growth, and provides a useful aid to research in cell biology.
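A rough sketch of this pipeline under stated assumptions: we substitute scikit-image's Chan-Vese implementation for the paper's C-V segmentation, and a plain cross-dissolve for the moving-least-squares warp plus fusion step, which we do not reproduce. The file paths are hypothetical.

```python
import numpy as np
from skimage import io
from skimage.segmentation import chan_vese
from skimage.measure import label, regionprops

def centered_crop(img, mask, size=256):
    """Crop a size x size window centered on the segmented cell's centroid."""
    props = regionprops(label(mask.astype(int)))
    cy, cx = max(props, key=lambda p: p.area).centroid
    half = size // 2
    padded = np.pad(img, half, mode="edge")
    cy, cx = int(cy) + half, int(cx) + half
    return padded[cy - half:cy + half, cx - half:cx + half]

def interpolate(frame_a, frame_b, n_inbetween=5):
    """Cross-dissolve stand-in for the MLS warp + fusion interpolation."""
    for t in np.linspace(0, 1, n_inbetween + 2)[1:-1]:
        yield (1 - t) * frame_a + t * frame_b

# 'frames/cell_00.png' ... are hypothetical paths to the thirty recorded images.
paths = [f"frames/cell_{i:02d}.png" for i in range(30)]
normalized = []
for p in paths:
    gray = io.imread(p, as_gray=True)
    mask = chan_vese(gray, mu=0.25)          # C-V segmentation
    normalized.append(centered_crop(gray, mask))

# Interleave recorded frames with in-between frames; any video writer
# can then assemble `video` into the growth-process animation.
video = []
for a, b in zip(normalized, normalized[1:]):
    video.append(a)
    video.extend(interpolate(a, b))
video.append(normalized[-1])
```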

16.
A consumer entering a new bookstore can face more than 250,000 alternatives. The efficiency of compensatory and noncompensatory decision rules for finding a preferred item depends on the efficiency of their associated information operators. At best, item-by-item information operators lead to linear computational complexity; set information operators, on the other hand, can lead to constant complexity. We perform an experiment demonstrating that subjects are approximately rational in selecting between sublinear and linear rules. Many markets are organized by attributes that enable consumers to employ a set-selection-by-aspect rule using set information operations. In cyberspace, decision rules are encoded as decision aids.
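A small illustration of the complexity gap, with a toy catalog and attributes of our own devising: an item-by-item operator inspects every alternative on each query, while a pre-built attribute index answers the same aspect query with a few set intersections.

```python
# Toy catalog: items tagged with attributes (genre, format), invented here.
catalog = {
    f"book{i}": {"genre": "sci-fi" if i % 3 == 0 else "history",
                 "format": "paperback" if i % 2 == 0 else "hardcover"}
    for i in range(250_000)
}

# Item-by-item information operator: O(n) scan per query.
def linear_scan(wanted):
    return [k for k, attrs in catalog.items()
            if all(attrs[a] == v for a, v in wanted.items())]

# Set information operator: attribute index built once; each query is then
# a set intersection rather than a one-by-one inspection of alternatives.
index = {}
for item, attrs in catalog.items():
    for a, v in attrs.items():
        index.setdefault((a, v), set()).add(item)

def aspect_select(wanted):
    sets = [index[(a, v)] for a, v in wanted.items()]
    return set.intersection(*sets)

query = {"genre": "sci-fi", "format": "paperback"}
assert set(linear_scan(query)) == aspect_select(query)
```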

17.
Many programming calculi have been designed to have a Curry-Howard correspondence with a classical logic. We investigate the effect that different choices of logical connective have on such calculi, and the resulting computational content. We identify two connectives, 'if-and-only-if' and 'exclusive or', whose computational content is not well known, and whose cut elimination rules are non-trivial to define. In the case of the former, we define a term calculus and show that the computational content of several other connectives can be simulated. We show this is possible even for connectives not logically expressible with 'if-and-only-if'.

18.
This paper summarizes existing software reliability growth models (SRGMs) described by nonhomogeneous Poisson processes. The SRGMs are classified in terms of the software reliability growth index, the error detection rate per error. Maximum-likelihood estimation based on the SRGMs is discussed for software reliability data analysis and software reliability evaluation. Using actual software error data observed during software testing, application examples of the existing SRGMs are illustrated.
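As one concrete instance of such an NHPP model, here is a hedged sketch fitting the classic Goel-Okumoto SRGM, with mean value function m(t) = a(1 - e^(-bt)), to invented failure times by maximum likelihood. The data, observation window, and starting values are illustrative only, not from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Invented failure times (hours) for illustration -- not real project data.
t = np.array([4.0, 9.0, 15.0, 22.0, 31.0, 43.0, 58.0, 77.0, 103.0, 140.0])
T = 160.0  # end of the observation window

def neg_log_lik(params):
    """NHPP log-likelihood for the Goel-Okumoto model:
    intensity lambda(t) = a*b*exp(-b*t), mean value m(t) = a*(1 - exp(-b*t)),
    log L = sum_i log lambda(t_i) - m(T)."""
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    return -(np.sum(np.log(a * b) - b * t) - a * (1 - np.exp(-b * T)))

res = minimize(neg_log_lik, x0=[20.0, 0.01], method="Nelder-Mead")
a_hat, b_hat = res.x
print(f"estimated total errors a = {a_hat:.1f}, detection rate b = {b_hat:.4f}")
print(f"expected errors remaining after T: {a_hat * np.exp(-b_hat * T):.1f}")
```

Here a estimates the total error content and b the per-error detection rate, the growth index by which the survey classifies the models.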

19.
Excess liquid water in the gas channels of polymer electrolyte fuel cells is responsible for the malfunctioning of these devices. Not only does it decrease their efficiency via partial blockage of reactants and pressure drop, but it can also lead to irreversible damage due to oxygen starvation in the case of complete channel flooding or full coverage of the gas diffusion layer by a liquid film. Liquid water evacuation is carried out via the airflow in the gas channels. Several experimental and computational techniques have been applied to date to analyze the coupled airflow–water behavior in order to understand the impact of fuel cell design and operating regimes on liquid water accumulation. Considerable progress has been achieved with the development of sophisticated computational fluid dynamics (CFD) tools. Nevertheless, the complexity of the problem leaves several issues unresolved. In this paper, analysis techniques applied to liquid water–airflow transport in fuel cell gas channels are reviewed and the most important results are summarized. Computationally efficient, yet strongly simplified, analytical models are discussed. Afterwards, CFD approaches, including conventional fixed-grid (Eulerian) and novel embedded Eulerian–Lagrangian models, are described. A critical comparative assessment of the existing methods is provided at the end of the paper, and the unresolved issues are highlighted.
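As a taste of the strongly simplified analytical models the review discusses, here is a sketch of a textbook Lockhart-Martinelli two-phase pressure-drop estimate for a gas channel. Channel dimensions, superficial velocities, and the Chisholm constant are illustrative assumptions, not values or models from the paper.

```python
import numpy as np

# Illustrative channel and operating conditions (not from the paper).
D_h  = 1.0e-3      # hydraulic diameter of the gas channel [m]
L_ch = 0.10        # channel length [m]
u_g, rho_g, mu_g = 5.0,  1.2,   1.8e-5   # air: superficial velocity, density, viscosity
u_l, rho_l, mu_l = 0.01, 998.0, 1.0e-3   # liquid water, superficial

def dpdz_single_phase(u, rho, mu):
    """Single-phase pressure gradient via Darcy-Weisbach, laminar f = 64/Re."""
    Re = rho * u * D_h / mu
    f = 64.0 / Re
    return f * rho * u**2 / (2.0 * D_h)

dpdz_g = dpdz_single_phase(u_g, rho_g, mu_g)
dpdz_l = dpdz_single_phase(u_l, rho_l, mu_l)

# Lockhart-Martinelli parameter and Chisholm two-phase multiplier.
X = np.sqrt(dpdz_l / dpdz_g)
C = 5.0                            # Chisholm constant, laminar-laminar regime
phi_l2 = 1.0 + C / X + 1.0 / X**2  # two-phase multiplier on the liquid side

dp_two_phase = phi_l2 * dpdz_l * L_ch
print(f"X = {X:.3f}, phi_l^2 = {phi_l2:.1f}, two-phase dp = {dp_two_phase:.1f} Pa")
```

Correlations of this kind trade accuracy for speed; the CFD approaches reviewed in the paper resolve the coupled airflow-water interface that such lumped multipliers only approximate.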

20.