71.
Operational risk is commonly analyzed in terms of the distribution of aggregate yearly losses. Risk measures can then be computed as statistics of this distribution that focus on the region of extreme losses. Assuming independence among the operational risk events and between the likelihood that they occur and their magnitude, separate models are made for the frequency and for the severity of the losses. These are then combined to estimate the distribution of aggregate losses. While the detailed form of the frequency distribution does not significantly affect the risk analysis, the choice of model for the severity often has a significant impact on operational risk measures. For heavy-tailed distributions these measures are dominated by extreme losses, whose probability cannot be reliably extrapolated from the available data. With limited empirical evidence, it is difficult to distinguish among alternative models that produce very different values of the risk measures. Furthermore, the estimates obtained can be unstable and overly sensitive to the presence or absence of single extreme events. Setting a bound on the maximum amount that can be lost in a single event reduces the dependence on the distributional assumptions and improves the robustness and stability of the risk measures, while preserving their sensitivity to changes in the risk profile. This bound should be determined by expert assessment on the basis of economic arguments and validated by the regulator, so that it can be used as a control parameter in the risk analysis.
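As an illustration of the loss-aggregation scheme described in this abstract, the following minimal Python sketch (not taken from the paper; all parameter values are hypothetical) simulates yearly aggregate losses with a Poisson frequency model and a lognormal severity model, with and without a cap on the single-event loss, and compares a tail risk measure (the 99.9% quantile, or VaR).

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_aggregate_losses(n_years, lam, mu, sigma, cap=None):
    """Monte Carlo estimate of the yearly aggregate loss distribution.

    Frequency: Poisson(lam) events per year.
    Severity:  lognormal(mu, sigma) per event, optionally capped at `cap`.
    """
    counts = rng.poisson(lam, size=n_years)
    totals = np.empty(n_years)
    for i, n in enumerate(counts):
        losses = rng.lognormal(mean=mu, sigma=sigma, size=n)
        if cap is not None:
            losses = np.minimum(losses, cap)   # bound on the single-event loss
        totals[i] = losses.sum()
    return totals

# Hypothetical parameters: about 20 events per year, heavy-tailed severity.
uncapped = simulate_aggregate_losses(100_000, lam=20, mu=10.0, sigma=2.0)
capped   = simulate_aggregate_losses(100_000, lam=20, mu=10.0, sigma=2.0, cap=5e6)

# Operational risk capital is typically read off the 99.9% quantile (VaR).
print("99.9% VaR, uncapped:", np.quantile(uncapped, 0.999))
print("99.9% VaR, capped:  ", np.quantile(capped, 0.999))
```

In such an experiment the capped estimate is markedly less sensitive to the few largest simulated events, which is the stabilising effect the abstract argues for.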
72.
In this paper, we present a heterogeneous parallel solver based on a high-frequency, single-level Fast Multipole Method (FMM) for the Helmholtz equation applied to acoustic scattering. The developed solution uses multiple GPUs to tackle the compute-bound steps of the FMM (aggregation, disaggregation, and near interactions), while the CPU handles the memory-bound translation step using OpenMP. The performance of the proposed solver is measured on a workstation with two GPUs (NVIDIA GTX 480) and compared with that of a distributed-memory solver run on a cluster of 32 nodes (HP BL465c) connected by an InfiniBand network. Some energy-efficiency results are also presented in this work.
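A purely conceptual Python sketch of the task split described above is shown below. It is not the authors' CUDA/OpenMP code: the kernels are trivial stand-ins and the function names (`aggregate`, `translate`, `near_interactions`, `disaggregate`) are hypothetical. It only illustrates one way the memory-bound translation step could run on the CPU while independent "GPU" work (here, the near-field step) proceeds concurrently.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

# Trivial stand-in kernels; the real steps involve multipole expansions
# for the Helmholtz kernel. Comments mark the assignment used in the paper.
def aggregate(sources):         return np.fft.fft(sources)         # GPU in the paper
def translate(multipoles):      return multipoles[::-1].copy()     # CPU (OpenMP) in the paper
def near_interactions(sources): return sources * 2.0               # GPU in the paper
def disaggregate(locals_):      return np.fft.ifft(locals_).real   # GPU in the paper

sources = np.random.default_rng(0).random(1 << 18)
multipoles = aggregate(sources)
with ThreadPoolExecutor(max_workers=2) as pool:
    # Near-field work is independent of the far-field pipeline, so it can
    # overlap the memory-bound translation step.
    near_future = pool.submit(near_interactions, sources)
    locals_ = translate(multipoles)
    near = near_future.result()
result = disaggregate(locals_) + near
```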
73.
Two key aspects of the Knowledge Society are the interconnection between the actors involved in decision-making processes and the importance of the human factor, particularly the citizen's continuous learning and education. This paper presents a new module devoted to knowledge extraction and diffusion that has been incorporated into PRIOR-Web, a previously developed Internet-based decision-making tool for the multicriteria selection of a discrete number of alternatives. Quantitative and qualitative procedures using data and text mining methods have been employed for knowledge extraction, and graphical visualisation tools have been incorporated into the diffusion stage of the suggested methodological approach for decision making in the Knowledge Society. The resulting collaborative platform is being used as the methodological support for the cognitive democracy known as e-cognocracy.
74.
In this paper we present the "R&W Simulator" (version 3.0), a Java simulator of Rescorla and Wagner's prediction-error model of learning. It can run whole experimental designs, compute and display the associative values of elemental and compound stimuli simultaneously, and use extra configural cues when generating compound values; it also allows the US parameters to be changed across phases. The simulator produces both numerical and graphical output, and includes a function to export the results to a spreadsheet. It is user-friendly and built with a graphical interface designed to allow neuroscience researchers to input the data in their own "language". It is a cross-platform simulator, so it does not require any special equipment, operating system or support program, and needs no installation. The "R&W Simulator" (version 3.0) is available free.
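For readers unfamiliar with the underlying model, the classic Rescorla-Wagner prediction-error update can be written in a few lines. The sketch below is a generic Python illustration (not the Java simulator itself), with hypothetical salience and learning-rate values; compound stimuli are treated as the sum of their elements, as in the elemental model.

```python
def rescorla_wagner_trial(V, present, alpha, beta, lam):
    """One Rescorla-Wagner trial.

    V:       dict of associative strengths per stimulus
    present: stimuli present on the trial (a compound is several elements)
    alpha:   per-stimulus saliences
    beta:    US-dependent learning rate
    lam:     asymptote set by the US (e.g. 1.0 if reinforced, 0.0 if not)
    """
    v_total = sum(V[s] for s in present)   # compound value = sum of elements
    error = lam - v_total                  # prediction error
    for s in present:
        V[s] += alpha[s] * beta * error    # elemental update
    return V

# Hypothetical design: 50 reinforced AB compound trials.
V = {"A": 0.0, "B": 0.0}
alpha = {"A": 0.3, "B": 0.3}
for _ in range(50):
    rescorla_wagner_trial(V, ["A", "B"], alpha, beta=0.5, lam=1.0)
print(V)  # with equal saliences, each element approaches lam / 2
```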
75.
In this paper, an IMS LD engine is presented that is based on a Petri net model representing the operational semantics of units of learning conforming to this specification. The Petri nets of this engine, which is called OPENET4LD, satisfy the structural properties that are desirable for a learning flow and also make it easier to adapt the engine should changes to the IMS LD specification be proposed. Furthermore, OPENET4LD has an open and flexible architecture based on a set of ontologies that describe both the semantics of Petri net execution and the semantics of each learning-flow component of IMS LD. The implementation of this architecture has been exhaustively validated with a number of UoLs that comply with levels A and B of IMS LD.
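As a minimal illustration of the place/transition semantics on which such an engine rests (this is a generic sketch, not OPENET4LD), the Python fragment below fires enabled transitions of a small net; a learning flow would map activities to transitions and completion states to places. The two-activity flow used here is hypothetical.

```python
class PetriNet:
    def __init__(self, marking, transitions):
        # marking: tokens per place; transitions: name -> (input places, output places)
        self.marking = dict(marking)
        self.transitions = transitions

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical learning flow: start -> activity1 -> activity2 -> done.
net = PetriNet(
    marking={"start": 1},
    transitions={
        "activity1": (["start"], ["a1_done"]),
        "activity2": (["a1_done"], ["done"]),
    },
)
net.fire("activity1")
net.fire("activity2")
print(net.marking)  # {'start': 0, 'a1_done': 0, 'done': 1}
```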
76.
In microalloyed steels, static recrystallisation is temporarily inhibited by precipitation occurring at the same time. A large number of microalloyed steels containing various combinations of carbon, nitrogen and precipitate-forming elements such as V, Nb and Ti were recrystallised at different temperatures and strain rates. From these results, recrystallisation-precipitation-time-temperature (RPTT) diagrams were established, and the influence of grain size and strain rate on the RPTT diagrams was studied. The precipitation kinetics were described mathematically for isothermal conditions and converted to cooling conditions, which enables application to hot rolling. Under cooling conditions, completion of recrystallisation is prevented, especially for Nb-alloyed steels.
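The conversion from isothermal to continuous-cooling conditions mentioned above is commonly done with a Scheil-type additivity rule applied to the isothermal precipitation-start times. The sketch below is a generic Python illustration of that idea with made-up Arrhenius constants; it is not the fitted model of the paper.

```python
import numpy as np

def isothermal_start_time(T, Q=270e3, A=1e-10):
    """Hypothetical isothermal precipitation-start time (s) at temperature T (K).
    The Arrhenius-type form and the constants are illustrative only."""
    R = 8.314
    return A * np.exp(Q / (R * T))

def additivity_start(T_of_t, dt=0.1, t_max=100.0):
    """Scheil additivity rule: under continuous cooling, precipitation is taken to
    start when the accumulated sum of dt / t_start(T(t)) reaches 1."""
    consumed, t = 0.0, 0.0
    while t < t_max:
        consumed += dt / isothermal_start_time(T_of_t(t))
        t += dt
        if consumed >= 1.0:
            return t
    return None  # start not reached within t_max

# Linear cooling from 1100 degC (1373 K) at 5 K/s.
cooling = lambda t: 1373.0 - 5.0 * t
print("precipitation start after cooling for", additivity_start(cooling), "s")
```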
77.
78.
A multistep iterative calibration methodology for the opto-mechanical system introduced in Part I is proposed. The methodology makes use of a monoview coplanar set of control points, whose number has been determined on the basis of both geometrical considerations and the results of a statistical analysis aimed at assessing the stability of the procedure with noisy image data. The calibration is carried out by comparing the theoretical and observed images of the calibration pattern. Both synthetic and real data have been used to test the calibration procedure, which proved to be accurate and efficient. The experimental results achieved by the calibrated system are satisfactory in terms of measurement precision.
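In the simplest pinhole setting, comparing theoretical and observed images of a coplanar pattern amounts to fitting a plane-to-image mapping and examining the reprojection residuals. The Python sketch below is a generic illustration of that idea (a Direct Linear Transform homography fit on synthetic noisy data), not the multistep procedure of Part I; all point coordinates and noise levels are hypothetical.

```python
import numpy as np

def estimate_homography(world_xy, image_uv):
    """Direct Linear Transform for a plane-to-image homography (3x3, up to scale)."""
    rows = []
    for (x, y), (u, v) in zip(world_xy, image_uv):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)   # right singular vector of smallest singular value

def reproject(H, world_xy):
    pts = np.column_stack([world_xy, np.ones(len(world_xy))]) @ H.T
    return pts[:, :2] / pts[:, 2:3]

# Hypothetical coplanar control points and a known homography used to
# generate noisy "observed" image points.
rng = np.random.default_rng(1)
world = np.array([[x, y] for x in range(5) for y in range(5)], dtype=float)
H_true = np.array([[100.0, 5.0, 320.0], [2.0, 95.0, 240.0], [0.001, 0.0005, 1.0]])
observed = reproject(H_true, world) + rng.normal(scale=0.2, size=(len(world), 2))

H_est = estimate_homography(world, observed)
residuals = reproject(H_est, world) - observed
print("RMS reprojection error (px):", np.sqrt((residuals ** 2).mean()))
```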
79.
The single-objective optimization of structures whose parameters are assigned as fuzzy numbers or fuzzy relations is presented in this paper as a particular case of the random set theory and evidence theory approach to uncertainty. Some basic concepts of these theories are reviewed, and the relationships among interval analysis, convex modeling, possibility theory and probability theory are pointed out. In this context a frequentistic view of fuzzy sets makes sense, and it becomes possible to calculate bounds on the probability that the solution satisfies the constraints. Some special but useful cases illustrate in detail the meaning of the proposed approach and its links with a recent formulation conceived within the context of convex modeling. Several theorems allow a very efficient computational procedure to be set up in many real design situations. Two numerical examples illustrate the model presented.
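In the random-set reading sketched above, each alpha-cut of a fuzzy parameter becomes a focal interval carrying a mass, and the probability that a constraint is satisfied is bracketed by belief and plausibility. The toy Python example below illustrates that computation for a single fuzzy parameter and a one-sided constraint; the triangular fuzzy number, the discretization into ten cuts, and the 12 kN capacity are all made up for illustration.

```python
import numpy as np

def focal_elements_from_fuzzy(support, membership, n_cuts=10):
    """Discretize a fuzzy number into nested alpha-cut intervals with equal masses."""
    alphas = (np.arange(n_cuts) + 0.5) / n_cuts
    focals = []
    for a in alphas:
        inside = support[membership >= a]
        focals.append((inside.min(), inside.max(), 1.0 / n_cuts))
    return focals

def belief_plausibility(focals, constraint):
    """Bel = mass of intervals entirely in the safe set, Pl = mass of intervals
    that intersect it. For a one-sided constraint, checking endpoints suffices."""
    bel = sum(m for lo, hi, m in focals if constraint(lo) and constraint(hi))
    pl = sum(m for lo, hi, m in focals if constraint(lo) or constraint(hi))
    return bel, pl

# Hypothetical triangular fuzzy load (kN): core at 10, support [8, 13];
# the constraint is that the load stays below a 12 kN capacity.
x = np.linspace(8.0, 13.0, 501)
mu = np.where(x <= 10, (x - 8) / 2, (13 - x) / 3)
focals = focal_elements_from_fuzzy(x, mu)
bel, pl = belief_plausibility(focals, lambda load: load <= 12.0)
print(f"P(load <= 12 kN) is bracketed by [{bel:.2f}, {pl:.2f}]")
```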
80.
Searching for information on the Internet often requires users to contact several digital libraries separately, use each library's interface to author the query, analyze the retrieval results and merge them with the results returned by other libraries. Such a task can be simplified by a centralized server that acts as a gateway between the user and several distributed repositories: the centralized server receives the user query, forwards it to the federated repositories (possibly translating it into the specific format required by each repository) and fuses the retrieved documents for presentation to the user. To accomplish these tasks efficiently, the centralized server must perform several major operations: resource selection, query transformation and data fusion. In this paper we report on some aspects of MIND, a system for managing distributed, heterogeneous multimedia libraries (MIND, 2001, http://www.mind-project.org). In particular, the paper focuses on the issue of fusing results returned by different image repositories. The proposed approach is based on the normalization of the matching scores assigned to retrieved images by the individual libraries. Experimental results on a prototype system show the potential of the proposed approach with respect to traditional solutions.
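A common baseline for the score-normalization fusion step described above is to min-max normalize each repository's scores and then sum the normalized scores per image (CombSUM-style). The sketch below is a generic Python illustration of that idea, not MIND's actual fusion algorithm; the repository names and raw scores are hypothetical.

```python
def min_max_normalize(results):
    """Map one repository's raw matching scores to [0, 1]."""
    scores = list(results.values())
    lo, hi = min(scores), max(scores)
    if hi == lo:
        return {doc: 1.0 for doc in results}
    return {doc: (s - lo) / (hi - lo) for doc, s in results.items()}

def fuse(result_lists):
    """CombSUM-style fusion: sum each image's normalized scores across repositories."""
    fused = {}
    for results in result_lists:
        for doc, score in min_max_normalize(results).items():
            fused[doc] = fused.get(doc, 0.0) + score
    return sorted(fused.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical raw scores from two image repositories on very different scales.
repo_a = {"img1": 0.92, "img2": 0.40, "img3": 0.15}
repo_b = {"img2": 310.0, "img4": 150.0, "img1": 95.0}
print(fuse([repo_a, repo_b]))
```

Normalization matters here because the two repositories score on incompatible scales; summing raw scores would let one repository dominate the fused ranking.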