  Subscription full text   69 papers
  Free   1 paper
Chemical Industry   7 papers
Metalworking   1 paper
Machinery & Instruments   1 paper
Building Science   5 papers
Energy & Power   2 papers
Light Industry   1 paper
Hydraulic Engineering   1 paper
Radio & Electronics   6 papers
General Industrial Technology   18 papers
Metallurgical Industry   6 papers
Automation Technology   22 papers
  2024   3 papers
  2022   1 paper
  2021   2 papers
  2020   1 paper
  2019   1 paper
  2018   2 papers
  2016   3 papers
  2014   1 paper
  2013   5 papers
  2012   3 papers
  2011   2 papers
  2010   2 papers
  2009   6 papers
  2008   3 papers
  2007   2 papers
  2006   6 papers
  2005   3 papers
  2004   3 papers
  2003   2 papers
  2002   1 paper
  1999   1 paper
  1998   2 papers
  1997   2 papers
  1996   1 paper
  1995   3 papers
  1994   2 papers
  1993   1 paper
  1992   3 papers
  1987   1 paper
  1983   1 paper
  1976   1 paper
Sort by: 70 results found, search time 15 ms
51.
The cardiac bidomain model is a popular approach to studying the electrical behavior of tissue and simulating interactions between cells by solving partial differential equations. This iterative, data-parallel model is an ideal match for the parallel architecture of Graphics Processing Units (GPUs). In this study, we evaluate the effectiveness of architecture-specific optimizations and fine-grained parallelization strategies, completely port the model to the GPU, and evaluate the performance of single-GPU and multi-GPU implementations. Simulating one action potential duration (350 msec of real time) for a 256×256×256 tissue takes 453 hours on a high-end general-purpose processor, but only 664 seconds on a four-GPU system, including the communication and data-transfer overhead. This drastic improvement (a factor of about 2460×) will allow clinicians to extend the time scale of simulations from milliseconds to seconds and minutes, and to evaluate hypotheses in time frames that were not previously feasible.
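The speedup quoted in the abstract can be sanity-checked with simple arithmetic (a minimal sketch; the runtimes are the figures reported above):

```python
# Sanity-check the reported ~2460x speedup:
# 453 CPU-hours versus 664 GPU-seconds for the same simulation.
cpu_seconds = 453 * 3600      # high-end CPU runtime, converted to seconds
gpu_seconds = 664             # four-GPU runtime, incl. communication overhead

speedup = cpu_seconds / gpu_seconds
print(round(speedup))         # ~2456, consistent with the quoted ~2460x
```

Rounded to two significant figures, the ratio matches the factor of 2460× stated in the abstract.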
52.
Impact analysis of faults and attacks in large-scale networks   (cited 2 times in total; 0 self-citations, 2 citations by others)
Monitoring and quantifying component behavior is key to making networks reliable and robust. The agent-based architecture presented here continuously monitors network-vulnerability metrics, providing new ways to measure the impact of faults and attacks.
53.
54.
A method of designing testable systolic architectures is proposed in this paper. Testing systolic arrays involves mapping an algorithm into a specific VLSI systolic architecture and then modifying the design to achieve concurrent testing. In our approach, redundant computations are introduced at the algorithmic level by deriving two versions of a given algorithm. The transformed dependency matrix (TDM) of the first version is a valid transformation matrix, while the second version is obtained by rotating the first TDM by 180 degrees about any of the indices that represent the spatial component of the TDM. A concurrent error detection (CED) systolic array is constructed by merging the corresponding systolic arrays of the two versions of the algorithm. The merging method attempts to obtain the self-testing systolic array at minimal cost in terms of area and speed. It is based on rescheduling input data, rearranging data flow, and increasing the utilization of the array cells. The resulting design can detect all single permanent and temporary faults and, with high probability, the majority of multiple fault patterns. The design method is applied to an algorithm for matrix multiplication in order to demonstrate the generality and novelty of our approach to designing testable VLSI systolic architectures. This work has been supported by a grant from the Natural Sciences and Engineering Research Council of Canada.
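The core idea of algorithm-level CED can be illustrated in software: compute the same matrix product under two different evaluation orders (a toy stand-in for the two TDM-derived systolic schedules) and flag any disagreement as a detected fault. This is a hedged sketch, not the paper's hardware design; all function names are illustrative.

```python
# Toy sketch of concurrent error detection (CED) by algorithmic redundancy:
# two versions of the same computation are run and their results compared.

def matmul_forward(A, B):
    # Plain matrix product, accumulating over k in increasing order.
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matmul_reversed(A, B):
    # Same product, accumulating over k in reverse order -- a toy analogue
    # of the 180-degree-rotated second version of the algorithm.
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in reversed(range(n)))
             for j in range(n)] for i in range(n)]

def ced_matmul(A, B):
    C1, C2 = matmul_forward(A, B), matmul_reversed(A, B)
    if C1 != C2:                      # disagreement => a fault was detected
        raise RuntimeError("CED mismatch: fault detected")
    return C1
```

With integer inputs the two schedules agree exactly, so direct comparison works; floating-point data would need a tolerance.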
55.
The brain is being evaluated as a de novo source of cytokines. Because recent evidence indicates that interleukin-6 (IL-6) may influence blood-brain barrier function and vascular permeability, we have sought to determine whether mechanical injury can directly induce in situ cerebral IL-6 production. Adult human astrocyte cultures were subjected to mechanical injury by the in vitro method of fluid percussion barotrauma, developed in our laboratory. Serial supernatant samples were collected for 8 h and evaluated for IL-6 activity using a proliferation assay employing the IL-6-dependent B cell hybridoma line B9. At optimum injury, the IL-6 level became significantly (P < 0.0001, analysis of variance) elevated from baseline 2 h after trauma and continued to increase over the observation period. Our study shows that following mechanical injury human astrocytes produce IL-6, which may contribute to post-traumatic cerebrovascular dysfunction. Elucidating the precise role of intracerebral cytokines is essential to our understanding of the mechanism responsible for post-traumatic cerebrovascular dysfunction.
56.
57.
Soil tensile strength (qt) plays an important role in controlling cracks and tensile failures, particularly in the design of foundations, which usually fail under tensile stresses at the bottom of the treated layer. Soil-cement mixtures are used in many engineering applications, including stabilized pavement bases and canal linings. The splitting tensile test (STT) is one of the common methods for indirect determination of qt. Because determining the qt of artificially cemented soils from the STT, especially for samples with long curing times, is relatively costly and time-consuming, there is a need for empirical models that can estimate this property simply. The current study analyzes whether a Group Method of Data Handling (GMDH)-type Neural Network (NN) is suitable for predicting the qt of sands stabilized with zeolite and cement. For this purpose, an STT program considering three distinct porosity ratios, four cement contents, and six different percentages of cement replacement by zeolite at 42, 56, and 90 days of curing time was performed. Active particle (AP) is introduced as a new parameter for modeling the GMDH-type NN. The performance of the proposed models reveals that GMDH is a reliable and accurate approach for predicting the qt of sands stabilized by a zeolite-cement mixture. From an equation proposed in this study, AP can be interpreted as one of the key parameters for predicting the qt of zeolite-cemented sands. A sensitivity analysis of the best-performing GMDH model shows that the predicted qt is considerably influenced by cement-content variations.
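The building block of a GMDH-type network is a two-input quadratic neuron fitted by least squares; layers of such neurons are grown and pruned by validation error. The sketch below fits a single such node. It assumes nothing from the paper beyond the general GMDH structure: the inputs stand in for mixture descriptors such as cement content and the proposed AP parameter, and the data is synthetic and purely illustrative.

```python
import numpy as np

# One GMDH (Ivakhnenko) polynomial neuron:
#   y ≈ a0 + a1*x1 + a2*x2 + a3*x1^2 + a4*x2^2 + a5*x1*x2,
# with coefficients found by ordinary least squares.

def fit_gmdh_node(x1, x2, y):
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def predict_gmdh_node(coeffs, x1, x2):
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    return X @ coeffs

# Synthetic data: x1 ~ normalized cement content, x2 ~ normalized AP
# (illustrative stand-ins, not the paper's dataset).
rng = np.random.default_rng(0)
x1 = rng.uniform(0.0, 1.0, 50)
x2 = rng.uniform(0.0, 1.0, 50)
y = 0.5 + 2.0 * x1 + 0.8 * x1 * x2    # synthetic "tensile strength"
coeffs = fit_gmdh_node(x1, x2, y)
```

A full GMDH network would fit one such node per input pair, keep the best-performing nodes on held-out data, and feed their outputs into the next layer.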
58.
59.
60.
Distributed computing systems are attractive due to their potential improvements in availability, fault tolerance, performance, and resource sharing. Modeling and evaluation of such computing systems is an important step in the design process of distributed systems. We present a two-level hierarchical model to analyze the availability of distributed systems. At the higher level (user level), the availability of the tasks (processes) is analyzed using a graph-based approach. At the lower level (component level), detailed Markov models are developed to analyze the component availabilities. These models take into account hardware/software failures, congestion and collisions in communication links, allocation of resources, and the redundancy level. A systematic approach is developed to apply the two-level hierarchical model to evaluate the availability of the processes and the services provided by a distributed computing environment. This approach is then applied to analyze some of the distributed processes of a real distributed system, the Unified Workstation Environment (UWE), that is currently being implemented at AT&T Bell Laboratories.
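The two-level idea can be sketched in a few lines: a two-state up/down Markov model yields each component's steady-state availability, and the user level combines components in series (all required) or in parallel (redundant). This is a hedged illustration of the general technique, not the paper's UWE models; the failure and repair rates below are invented for the example.

```python
# Lower level: steady-state availability of a 2-state Markov model,
#   A = mu / (lambda + mu)   (lambda = failure rate, mu = repair rate).
def component_availability(failure_rate, repair_rate):
    return repair_rate / (failure_rate + repair_rate)

# Higher level: graph-style series/parallel composition of availabilities.
def series(*avails):
    # Task is up only if every component is up.
    p = 1.0
    for a in avails:
        p *= a
    return p

def parallel(*avails):
    # Redundant set is up if at least one component is up.
    q = 1.0
    for a in avails:
        q *= (1.0 - a)
    return 1.0 - q

# Illustrative rates (per hour), not from the paper:
host = component_availability(failure_rate=0.001, repair_rate=0.1)
link = component_availability(failure_rate=0.01, repair_rate=0.5)

# A task that needs one of two redundant hosts plus a single link:
task = series(parallel(host, host), link)
```

Replacing the closed-form two-state model with a richer Markov chain (covering software failures, link congestion, and so on) changes only the lower level; the series/parallel composition at the user level stays the same.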

Copyright © Beijing Qinyun Technology Development Co., Ltd.   京ICP备09084417号