Sorted results: 281 records found in total (search time: 15 ms)
2.
A polymer synthesis method is presented in which chain growth driven by an exothermic reaction stimulates a gradual chain collapse. The globular precipitates in such systems can be restrained from coalescing by polymerizing in a quiescent environment. A time-resolved small-angle scattering study of methacrylic acid polymerization kinetics in a quiescent system above its lower critical solution temperature (LCST) in water reveals the following features of this method: (a) growing oligomers remain rigid chains until a critical chain length is reached, at which point they undergo chain collapse; (b) the radius of gyration increases linearly with time until a critical conversion is reached; and (c) the radius of gyration remains constant after the critical conversion, even while conversion gradually increases. Following this self-stabilizing growth mechanism, we show that nanoparticles can be directly synthesized by polymerizing N-isopropylacrylamide above its LCST in water. The average size of nanoparticles obtained from a polymer–solvent system is expected to be the maximum extent of reaction spread at that monomer concentration. This hypothesis was then verified by polymerizing N-isopropylacrylamide above its LCST in water, but initiating the reaction with X-rays shielded by a mask. The microfabricated patterns conform well to the size and shape of the mask used, confirming that the growing chains do not propagate beyond the exposed regions as long as the reaction temperature is maintained above the LCST. © 2006 Wiley Periodicals, Inc. J Appl Polym Sci 102: 429–425, 2006
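The reported growth kinetics (a radius of gyration that increases linearly with time and then plateaus at the critical conversion) can be sketched as a simple piecewise function; the parameter values below are hypothetical illustrations, not values taken from the study:

```python
def radius_of_gyration(t, growth_rate, t_critical):
    # Piecewise kinetics described in the abstract: Rg grows linearly with
    # time until the critical conversion time, then remains constant even
    # as conversion continues to increase.
    return growth_rate * min(t, t_critical)

# Hypothetical parameters: growth rate 2 nm/min, critical time 10 min
print(radius_of_gyration(5, 2.0, 10))   # linear regime: 10.0
print(radius_of_gyration(30, 2.0, 10))  # plateau at Rg_max: 20.0
```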
3.
Data refinement in a state-based language such as Z is defined using a relational model in terms of the behaviour of abstract programs. Downward and upward simulation conditions form a sound and jointly complete methodology for verifying relational data refinements, which can be checked on an event-by-event basis rather than per trace. In models of concurrency, refinement is often defined in terms of sets of observations, which can include the events a system is prepared to accept or refuse, or depend on explicit properties of states and transitions. By embedding such concurrent semantics into a relational framework, eventwise verification methods for such refinement relations can be derived. In this paper, we continue our programme of deriving simulation conditions for process-algebraic refinement by defining further embeddings into our relational model: traces, completed traces, failure traces and extension. We then extend our framework to include various notions of automata-based refinement.
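To illustrate the event-by-event flavour of such checks, the following sketch tests a downward-simulation-style condition on finite labelled transition systems. It is a deliberate simplification (downward simulation in Z also involves initialization and applicability conditions), and the transition systems and relation `R` are invented for the example:

```python
def is_downward_simulation(R, abs_steps, con_steps):
    # Event-by-event check on finite labelled transition systems: for every
    # related pair (a, c) in R, each concrete step c -lbl-> c2 must be
    # matched by an abstract step a -lbl-> a2 with (a2, c2) in R.
    for a, c in R:
        for src, lbl, dst in con_steps:
            if src == c and not any(
                    s == a and m == lbl and (d, dst) in R
                    for s, m, d in abs_steps):
                return False
    return True

# Invented example: abstract system A0 -x-> A1, two candidate concrete systems
abs_steps = [("A0", "x", "A1")]
good = [("C0", "x", "C1")]
bad = [("C0", "x", "C1"), ("C0", "y", "C2")]  # unmatched concrete y-step
R = {("A0", "C0"), ("A1", "C1")}
print(is_downward_simulation(R, abs_steps, good))  # True
print(is_downward_simulation(R, abs_steps, bad))   # False
```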
4.
In the context of nonlinear dynamic system identification for Hammerstein systems, Rollins et al. (2003a) studied the information efficiency of two competing experimental design approaches: statistical design of experiments (SDOE) and pseudo-random sequence design (PRSD). This study focuses on the Wiener system and evaluates SDOE against PRSD under D-optimal efficiency. Three cases are evaluated, and the results strongly support SDOE as the better approach.
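D-optimal efficiency compares designs via the determinant of the information matrix X'X, normalized by the number of model parameters. A minimal sketch, with hypothetical designs for an intercept-plus-two-input linear model (the factorial and pseudo-random sequences below are invented, not taken from the study):

```python
def det3(m):
    # Determinant of a 3x3 matrix by cofactor expansion.
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

def gram(X):
    # Information matrix X'X for a design matrix given as a list of rows.
    p = len(X[0])
    return [[sum(row[i]*row[j] for row in X) for j in range(p)] for i in range(p)]

def d_efficiency(X1, X2):
    # Relative D-efficiency: (det(X1'X1) / det(X2'X2)) ** (1/p);
    # values above 1 mean design X1 carries more information.
    p = len(X1[0])
    return (det3(gram(X1)) / det3(gram(X2))) ** (1.0 / p)

# Hypothetical replicated 2-level factorial (SDOE-style), columns:
# intercept, input 1, input 2
factorial = [[1, -1, -1], [1, -1, 1], [1, 1, -1], [1, 1, 1]] * 2
# A fixed pseudo-random +/-1 input sequence of the same size (PRSD-style)
prs = [[1, 1, 1], [1, 1, 1], [1, 1, -1], [1, -1, -1],
       [1, 1, 1], [1, -1, 1], [1, 1, 1], [1, 1, -1]]
print(d_efficiency(factorial, prs))  # > 1: the factorial design wins here
```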
5.
The goal of this work is to present a causation modeling methodology able to accurately infer blood glucose levels from a large set of highly correlated noninvasive input variables over an extended period of time. Such models can provide insight for improving glucose monitoring and, through advanced model-based control technologies, glucose regulation. The efficacy of this approach is demonstrated using real data from a type 2 diabetic (T2D) subject collected under free-living conditions over 25 consecutive days. The model was identified and tested using eleven variables, including three food variables and several activity and stress variables. It was trained on 20 days of data and validated on the remaining 5 days, giving a fitted correlation coefficient of 0.70 and an average absolute error (AAE, the average of the absolute differences between measured and modeled glucose concentration) of 13.3 mg/dL on the validation data. This AAE was significantly better than the subject's personal glucose meter AAE of 15.3 mg/dL for replicated measurements.
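The AAE metric used above is simply the mean absolute difference between measured and modeled glucose. A minimal sketch with made-up readings (not the study's data):

```python
def average_absolute_error(measured, modeled):
    # AAE: mean of |measured - modeled| over paired glucose readings (mg/dL).
    return sum(abs(m, ) if False else abs(m - p) for m, p in zip(measured, modeled)) / len(measured)

# Made-up glucose readings, mg/dL
measured = [110, 145, 160, 132, 120]
modeled = [118, 140, 150, 140, 115]
print(average_absolute_error(measured, modeled))  # (8+5+10+8+5)/5 = 7.2
```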
6.
We report the excitation of surface plasmons in a gold-coated, side-polished D-shaped microstructured optical fiber (MOF). As the leaky evanescent field from the fiber core becomes highly localized by the plasmon wave, its intensity is also significantly amplified. Here we demonstrate an efficient use of this intensified field as the excitation source in fluorescence spectroscopy. The resulting plasmon-enhanced fluorescence emission from Rhodamine B has been investigated experimentally. First, the plasmonic effect alone was found to provide an immediate fluorescence enhancement factor of two. Second, the experimental results show good agreement with theoretical modeling. Strong evanescent field generation and surface enhancement with a simple metallic coating make this fiber-based device a good candidate for compact fluorescence spectroscopy.
7.
Volunteer computing uses free resources in Internet and intranet environments for large-scale computation and storage. Currently, 70 applications draw over 12 PetaFLOPS of computing power from such platforms. However, these platforms are at present limited to embarrassingly parallel applications. In an effort to broaden the set of applications that can leverage volunteer computing, we focus on the problem of predicting whether a group of resources will be continuously available for a relatively long period. Ensuring the collective availability of volunteer resources is challenging because of their inherent volatility and autonomy, yet collective availability is essential for enabling parallel applications and workflows on volunteer computing platforms. We evaluate our predictive methods using real availability traces gathered from hundreds of thousands of hosts in the SETI@home volunteer computing project, and show that our methods can reliably guarantee the availability of collections of volunteer resources, which is particularly useful for service deployments over volunteer computing environments.
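One naive way to frame collective-availability prediction (a baseline sketch, not the paper's actual method, which is evaluated on SETI@home traces) is to flag a group as collectively available only if every member's historical availability fraction clears a threshold:

```python
def availability_fraction(trace):
    # Fraction of observed intervals in which a host was up (1) vs down (0).
    return sum(trace) / len(trace)

def predict_collective(traces, group, threshold=0.9):
    # Naive predictor: flag the group "collectively available" only if every
    # member's historical availability fraction meets the threshold.
    return all(availability_fraction(traces[h]) >= threshold for h in group)

# Invented availability traces (1 = available in that interval)
traces = {
    "hostA": [1, 1, 1, 1, 1, 1, 1, 1, 1, 1],  # fraction 1.0
    "hostB": [1, 1, 0, 1, 1, 1, 1, 1, 1, 1],  # fraction 0.9
    "hostC": [1, 0, 0, 1, 0, 1, 0, 1, 0, 1],  # fraction 0.5
}
print(predict_collective(traces, ["hostA", "hostB"]))  # True
print(predict_collective(traces, ["hostA", "hostC"]))  # False
```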
8.
This paper is concerned with methods for refining specifications written in a combination of Object-Z and CSP. Such a combination has proved to be a suitable vehicle for specifying complex systems that involve both state and behaviour, and several proposals exist for integrating these two languages. The integration in this paper is based on a semantics of Object-Z classes that is identical to that of CSP processes, which allows classes specified in Object-Z to be combined using CSP operators. It has been shown that this semantic model allows state-based refinement relations to be used on the Object-Z components of an integrated Object-Z/CSP specification. However, the current refinement methodology does not allow the structure of a specification to be changed during refinement, whereas a full methodology would, for example, allow concurrency to be introduced during the development life-cycle. In this paper, we tackle these concerns and discuss refinements of Object-Z/CSP specifications in which the structure of the specification changes as part of the refinement. In particular, we develop a set of structural simulation rules that allow single components to be refined to more complex specifications involving CSP operators. The soundness of these rules is verified against the common semantic model, and they are illustrated via a number of examples.
9.
Hemodialysis catheter (HDC) dysfunction due to thrombosis is common, with a dysfunction incidence of up to 50% within 1 year of use. Although administration of intraluminal alteplase (tissue plasminogen activator [tPA]) is the standard of practice for pharmacologically restoring HDC function, there are no evidence-based guidelines concerning the optimal tPA dose. The purpose of this study was to compare the efficacy of 1.0-mg vs. 2.0-mg tPA dwell protocols in restoring function to thrombotically dysfunctional catheters. A retrospective, single-center study was conducted on two independent cohorts of patients; the first (n = 129) received 2.0 mg tPA/catheter lumen, while the second (n = 108) received 1.0 mg tPA/catheter lumen. Kaplan–Meier and Cox regression analyses were performed to compare catheter survival time between patients who received 1.0 mg tPA and those who received 2.0 mg tPA. Catheter removal occurred in 25 (19.4%) of the catheters treated with 1.0 mg tPA compared with 11 (10.2%) of those treated with 2.0 mg tPA (P = 0.05). The hazard ratio (HR) for catheter removal was 2.75 (95% confidence interval [95%CI] = 1.25–6.04) for the 1.0-mg tPA cohort relative to the 2.0-mg tPA cohort. Female gender (HR = 2.51; 95%CI = 1.20–5.27) and age (HR = 0.96; 95%CI = 0.94–0.98) were also associated with catheter survival. Our findings suggest that treating dysfunctional HDCs with 2.0-mg tPA dwells is superior to 1.0-mg tPA dwells. (Correction added on 3 December 2012, after first online publication: the tPA cohort values were changed.)
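Kaplan–Meier survival curves of the kind used in the study come from the product-limit estimator: at each distinct event time, the running survival probability is multiplied by (1 - d/n), where d is the number of removals at that time and n the number of catheters still at risk. A sketch with invented follow-up data (not the study's patients):

```python
from collections import Counter

def kaplan_meier(times, events):
    # Product-limit estimate: at each distinct removal time t the running
    # survival S is multiplied by 1 - d/n, where d catheters are removed at t
    # and n are at risk just before t; censored exits only shrink the risk set.
    removals = Counter(t for t, e in zip(times, events) if e == 1)
    exits = Counter(times)
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(exits):
        d = removals.get(t, 0)
        if d:
            surv *= 1 - d / at_risk
            curve.append((t, surv))
        at_risk -= exits[t]
    return curve

# Invented follow-up times in weeks (event 1 = removal, 0 = censored)
times = [5, 8, 8, 12, 15, 20]
events = [1, 1, 0, 1, 0, 0]
print(kaplan_meier(times, events))
```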
10.
The ability to reverse-engineer models of software behaviour is valuable for a wide range of software maintenance, validation and verification tasks. Current reverse-engineering techniques focus either on control-specific behaviour (e.g., in the form of Finite State Machines) or on data-specific behaviour (e.g., as pre-/post-conditions or invariants). However, typical software behaviour is a product of the two, and models must combine both aspects to fully represent the software's operation. Extended Finite State Machines (EFSMs) provide such a model. Although attempts have been made to infer EFSMs, these have been problematic: the inferred models can be non-deterministic, and the inference algorithms can be inflexible and applicable only to traces with specific characteristics. This paper presents a novel EFSM inference technique that addresses the problems of inflexibility and non-determinism. It also adapts an experimental technique from the field of Machine Learning to evaluate EFSM inference techniques, and applies it to three diverse software systems.
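An EFSM extends an FSM with data: transitions carry guards over state variables and update functions. The sketch below (a hypothetical vending-machine model, not the paper's inference algorithm) shows the representation; overlapping guards on the same (state, event) pair are exactly where inferred EFSMs can become non-deterministic, and here determinism is forced by firing the first enabled guard:

```python
def step(efsm, state, var, event):
    # Fire the first transition whose guard holds on the data variable.
    # Overlapping guards are a source of non-determinism in inferred EFSMs;
    # taking the first enabled one forces a deterministic interpretation.
    for guard, update, nxt in efsm.get((state, event), []):
        if guard(var):
            return nxt, update(var)
    raise ValueError(f"no enabled transition for {event!r} in {state!r}")

# Hypothetical vending-machine EFSM over a single credit variable:
# (state, event) -> list of (guard, update, next_state)
efsm = {
    ("idle", "coin"): [(lambda v: True, lambda v: v + 1, "idle")],
    ("idle", "vend"): [(lambda v: v >= 2, lambda v: v - 2, "done")],
}
state, credit = "idle", 0
for ev in ["coin", "coin", "vend"]:
    state, credit = step(efsm, state, credit, ev)
print(state, credit)  # done 0
```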
Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号