Similar documents
Found 5 similar documents (search time: 0 ms)
1.
The present contribution describes a potential application of Grid Computing in Bioinformatics. High resolution structure determination of biological specimens is critical in the BioSciences for understanding biological function. The problem is computationally intensive, and Distributed and Grid Computing are thus becoming essential. This contribution analyzes the use of Grid Computing and its potential benefits in the field of electron microscope tomography of biological specimens. Jose-Jesus Fernandez, Ph.D.: He received his M.Sc. and Ph.D. degrees in Computer Science from the University of Granada, Spain, in 1992 and 1997, respectively. He was a Ph.D. student at the Bio-Computing unit of the National Center for BioTechnology (CNB) of the Spanish National Council of Scientific Research (CSIC), Madrid, Spain. He became an Assistant Professor in 1997 and, subsequently, Associate Professor in 2000 in Computer Architecture at the University of Almeria, Spain. He is a member of the supercomputing-algorithms research group. His research interests include high performance computing (HPC), image processing and tomography. Jose-Roman Bilbao-Castro: He received his M.Sc. degree in Computer Science from the University of Almeria in 2001. He is currently a Ph.D. student at the BioComputing unit of the CNB (CSIC) through a CSIC Ph.D. grant in conjunction with the Dept. of Computer Architecture at the University of Malaga (Spain). His current research interests include tomography, HPC, and distributed and grid computing. Roberto Marabini, Ph.D.: He received the M.Sc. (1989) and Ph.D. (1995) degrees in Physics from the University Autonoma de Madrid (UAM) and the University of Santiago de Compostela, respectively. He was a Ph.D. student at the BioComputing Unit at the CNB (CSIC). He worked at the University of Pennsylvania and the City University of New York from 1998 to 2002. At present he is an Associate Professor at the UAM.
His current research interests include inverse problems, image processing and HPC. Jose-Maria Carazo, Ph.D.: He received the M.Sc. degree from the University of Granada, Spain, in 1981, and his Ph.D. in Molecular Biology from the UAM in 1984. He left for Albany, NY, in 1986, returning to Madrid in 1989 to set up the BioComputing Unit of the CNB (CSIC). He served in the Spanish Ministry of Science and Technology as Deputy General Director for Research Planning. Currently, he remains engaged in his activities at the CNB, the Scientific Park of Madrid and Integromics S.L. Immaculada Garcia, Ph.D.: She received her B.Sc. (1977) and Ph.D. (1986) degrees in Physics from the Complutense University of Madrid and the University of Santiago de Compostela, respectively. From 1977 to 1987 she was an Assistant Professor at the University of Granada, from 1987 to 1996 an Associate Professor at the University of Almeria, and since 1997 she has been a Full Professor and head of the Dept. of Computer Architecture. She is head of the supercomputing-algorithms research group. Her research interests lie in HPC for irregular problems related to image processing, global optimization and matrix computation.

2.
Research on Grid Computing Security and Its Technical Implementation   (Total citations: 2; self-citations: 0; citations by others: 2)
A grid computing environment must rely on the existing Internet as its communication platform. Because the Internet is inherently open and heterogeneous, grid computing faces a wide variety of security threats, and grid security has therefore become a core issue in grid computing environments. This paper outlines grid security requirements, analyzes grid security technologies, and presents the main security mechanisms used in the Globus project.
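The Globus security mechanisms mentioned above center on the Grid Security Infrastructure (GSI): X.509 certificates for mutual authentication and short-lived proxy certificates for single sign-on and delegation. The following is only a toy sketch of the delegation-chain idea, not the actual GSI API: the hash-based "signature" here is a stand-in for real public-key signatures, and the certificate subjects and secrets are invented for illustration.

```python
import hashlib

def sign(issuer_secret: str, subject: str) -> str:
    """Toy 'signature': hash of the issuer's secret plus the subject (illustration only)."""
    return hashlib.sha256((issuer_secret + subject).encode()).hexdigest()

# A toy delegation chain in the spirit of GSI proxy certificates:
# the CA signs the user's certificate; the user signs a short-lived proxy.
ca_secret, user_secret = "ca-secret", "user-secret"

user_cert = {"subject": "/O=Grid/CN=alice", "issuer": "CA",
             "sig": sign(ca_secret, "/O=Grid/CN=alice")}
proxy_cert = {"subject": "/O=Grid/CN=alice/CN=proxy", "issuer": "/O=Grid/CN=alice",
              "sig": sign(user_secret, "/O=Grid/CN=alice/CN=proxy")}

def verify_chain(proxy: dict, user: dict) -> bool:
    """Accept the proxy only if the user cert chains to the CA and the user signed the proxy."""
    ok_user = user["sig"] == sign(ca_secret, user["subject"])
    ok_proxy = proxy["sig"] == sign(user_secret, proxy["subject"])
    return ok_user and ok_proxy

print(verify_chain(proxy_cert, user_cert))
```

A resource on the grid only needs to walk such a chain back to a trusted root; the user's long-term credential never leaves the user's machine, which is the point of single sign-on.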

3.
Computational modeling in the health sciences is still very challenging, and much of its success has come despite the difficulties involved in integrating all of the technologies, software, and other tools necessary to answer complex questions. Very large-scale problems raise questions of spatio-temporal scale, and of whether physico-chemical complexity is matched by biological complexity. For example, for many reasons, many large-scale biomedical computations today still tend to use rather simplified physics/chemistry compared with the state of knowledge of the actual biology/biochemistry. Modern grid technologies enable new paradigms for computing, providing access to resources that facilitate spanning the biological scale. Wibke Sudholt: She is a postdoc with J. A. McCammon and K. Baldridge at the University of California, San Diego and a fellow of the German Academic Exchange Service (DAAD). She received her diploma (Dipl. Chem.) at the University of Dortmund, Germany in 1996, and her doctoral degree (Dr. rer. nat.) in 2001 at Heinrich-Heine-University Duesseldorf, Germany, with Wolfgang Domcke, on theoretical studies of a charge-transfer process. Her current research interests include the combination of quantum chemistry, molecular mechanics and continuum electrostatics to describe chemical reactions in complex molecular systems. Kim K. Baldridge: She is a theoretical and computational chemist with expertise in the design, development, and application of computational quantum chemical methodology for understanding chemical and biochemical reaction processes of broad interest. Her efforts include the development of computational tools and associated grid technologies for the broader scientific community. She is a Fellow of the APS and AAAS, and was the 2000 Agnes Fay Morgan Awardee for Research Achievement in Chemistry.
She is the Program Director for Integrative Computational Sciences at SDSC, where she has worked since 1989, and additionally holds an adjunct professorship at UCSD. David Abramson: He is currently a professor of Computer Science in the School of Computer Science and Software Engineering (CSSE) at Monash University, Australia. He is a project leader in the Co-operative Research Centre for Distributed Systems Nimrod Project and Chief Investigator on two ARC-funded research projects. He is a co-founder of Active Tools P/L with Dr. Rok Sosic, established to commercialize the Nimrod project, and of Guardsoft, focused on commercializing the Guard project. Abramson's current interests are in high performance computer systems design and software engineering tools for programming parallel, distributed supercomputers. Colin Enticott: He completed a BComp (Hons) degree in mid-2002 at Monash University, Australia. His project, "The Multi Site EnFuzion Client", done under the supervision of Professor David Abramson, dealt with cluster-of-clusters computing, which led him into Grid computing. He is currently employed by DSTC (Distributed Systems Technology Centre, Melbourne, Australia), working on the user front-end of Nimrod (the Nimrod Portal) and cluster implementations. Slavisa Garic: He completed a Bachelor of Computer Science (Hons) degree at Monash University, Australia, in November 2001. His project, "Suburban Area Networks: Security", involved working on security aspects of wireless community and suburban networks. At the beginning of 2002, he joined the Distributed Systems Technology Centre, Melbourne, Australia, where he currently works as a core Nimrod/G developer.

4.
With advances in remote-sensing technology, and especially the arrival of high-resolution images, the large volumes of data produced can no longer be analyzed efficiently and rapidly. The development of image-processing technology is an urgent and complex problem for computer and geo-science experts, involving knowledge not only of remote sensing but also of computing and networking. Remotely sensed images need to be processed rapidly and effectively in a distributed and parallel processing environment. Grid computing is a new form of distributed computing, providing an advanced computing and sharing model to solve large and computationally intensive problems. Following the basic principles of grid computing, we construct a distributed processing system for remotely sensed images. This paper focuses on the implementation of such a distributed computing and processing model based on the theory of grid computing. Firstly, problems in the field of remotely sensed image processing are analyzed. Then, the distributed (and parallel) computing model design, based on grid computing, is applied. Finally, implementation methods using middleware technology are discussed in detail. From a test analysis of our system, TARIES.NET, the whole image-processing system is evaluated, and the results show the feasibility of the model design and the efficiency of the remotely sensed image distributed and parallel processing system.
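The core of such a distributed model is decomposing a large image into tiles and dispatching them to independent workers. As a minimal sketch of that idea (not the TARIES.NET implementation: the tile size, the per-tile statistic, and the use of local worker processes in place of grid nodes are all illustrative assumptions):

```python
from multiprocessing import Pool

def split_into_tiles(image, tile_size):
    """Split a 2-D image (a list of pixel rows) into tiles for distribution."""
    tiles = []
    for r in range(0, len(image), tile_size):
        for c in range(0, len(image[0]), tile_size):
            tiles.append([row[c:c + tile_size] for row in image[r:r + tile_size]])
    return tiles

def process_tile(tile):
    """Stand-in per-tile job: the mean pixel value of the tile."""
    flat = [px for row in tile for px in row]
    return sum(flat) / len(flat)

if __name__ == "__main__":
    # A small synthetic 8x8 'image'; local worker processes stand in for grid nodes.
    image = [[(r * 8 + c) % 256 for c in range(8)] for r in range(8)]
    tiles = split_into_tiles(image, 4)
    with Pool(4) as pool:
        results = pool.map(process_tile, tiles)
    print(len(tiles), results)
```

In a real grid deployment the `pool.map` step would be replaced by the middleware's job-submission interface, but the split/dispatch/gather structure is the same.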

5.
Grid computing connects heterogeneous resources to achieve the illusion of being a single available entity. Charging for these resources based on demand is often referred to as utility computing, where resource providers lease computing power with varying costs based on processing speed. Consumers using this resource have time and cost constraints associated with each job they submit. Determining the optimal way to divide the job among the available resources with regard to the time and cost constraints is tasked to the Grid Resource Broker (GRB). The GRB must use an optimization algorithm that returns an accurate result in a timely manner. The genetic algorithm and the simulated annealing algorithm can both be used to achieve this goal, although simulated annealing outperforms the genetic algorithm for use by the GRB. Determining optimal values for the variables used in each algorithm is often achieved through trial and error, and success depends upon the solution domain of the problem.
Sanjay P. Ahuja (Corresponding author) Email:
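The broker problem described in the abstract above — divide a job among resources of differing speed and price, under time and cost constraints — maps naturally onto simulated annealing. The sketch below is only an illustration of that technique, not the paper's algorithm: the resource speeds, prices, deadline, penalty weight, and cooling schedule are all invented for the example.

```python
import math
import random

# Hypothetical resources: (work units per hour, cost per hour) -- illustrative numbers.
RESOURCES = [(10, 5.0), (20, 12.0), (5, 1.0)]
TOTAL_UNITS = 100          # size of the consumer's job
DEADLINE = 6.0             # hours

def evaluate(alloc):
    """Monetary cost of an allocation; allocations that miss the deadline are penalized."""
    cost, makespan = 0.0, 0.0
    for units, (speed, price) in zip(alloc, RESOURCES):
        hours = units / speed
        cost += hours * price
        makespan = max(makespan, hours)
    return cost + 1000.0 * max(0.0, makespan - DEADLINE)

def neighbour(alloc):
    """Move one unit of work between two randomly chosen resources."""
    a = list(alloc)
    i, j = random.sample(range(len(a)), 2)
    if a[i] > 0:
        a[i] -= 1
        a[j] += 1
    return a

def anneal(steps=5000, temp=50.0, cooling=0.999):
    random.seed(0)                              # deterministic run for the example
    alloc = [TOTAL_UNITS // len(RESOURCES)] * len(RESOURCES)
    alloc[0] += TOTAL_UNITS - sum(alloc)        # start from an even split
    cur, cur_e = alloc, evaluate(alloc)
    best, best_e = cur, cur_e
    for _ in range(steps):
        cand = neighbour(cur)
        cand_e = evaluate(cand)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if cand_e < cur_e or random.random() < math.exp((cur_e - cand_e) / temp):
            cur, cur_e = cand, cand_e
            if cur_e < best_e:
                best, best_e = cur, cur_e
        temp *= cooling
    return best, best_e
```

The trial-and-error tuning the abstract mentions corresponds to choosing `temp`, `cooling`, and `steps`; a genetic-algorithm broker would tune population size and mutation rate instead.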

