121.
The lumped-parameter/complex-plane analysis technique revealed several contributions to the terminal admittance of the ZnO–Bi2O3-based varistor grain-boundary ac response. The terminal capacitance was elucidated via multiple trapping phenomena, barrier-layer polarization, and a resonance effect in the frequency range 10⁻² ≤ f ≤ 10⁹ Hz. Characterization of the trapping relaxation behavior near ∼10⁵ Hz (∼10⁻⁶ s) provided a better understanding of a previously reported loss peak. A possible nonuniformity in this trapping activity, associated with its conductance term and observed via the depression angle of a semicircular relaxation in the complex capacitance (C*) plane, is postulated.
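As a hedged illustration (not part of the paper), the depression angle of such a semicircular relaxation is commonly estimated by fitting a Cole-Cole-type expression C*(ω) = C∞ + (C₀ − C∞)/(1 + (jωτ)^(1−α)) to the complex-capacitance data; the depression angle is then απ/2. The data and starting values below are synthetic assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def cole_cole(omega, c_inf, c0, tau, alpha):
    """Cole-Cole relaxation C*(w) = C_inf + (C0 - C_inf)/(1 + (jw*tau)**(1-alpha)),
    returned as stacked [real parts, -imag parts] for curve_fit."""
    c = c_inf + (c0 - c_inf) / (1.0 + (1j * omega * tau) ** (1.0 - alpha))
    return np.concatenate([c.real, -c.imag])

# Synthetic "measurement" spanning the paper's 10^-2..10^9 Hz range.
rng = np.random.default_rng(0)
omega = 2 * np.pi * np.logspace(-2, 9, 60)
data = cole_cole(omega, 1e-10, 5e-10, 1e-6, 0.15)  # illustrative parameters
data = data + rng.normal(0, 1e-13, data.size)      # measurement noise

popt, _ = curve_fit(cole_cole, omega, data, p0=(1e-10, 4e-10, 5e-7, 0.1))
print("depression angle:", np.degrees(popt[3] * np.pi / 2), "deg")
```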
122.
The temperature dependence of the diffusion coefficient of ethanol-soluble substances from ground cloves (particle size 250 μm) during extraction was estimated by fitting batch extraction data at several temperatures (27.8, 40, 50, and 60°C) to a previously developed mass-transfer model based on spherical particle geometry. Nonlinear regression analysis was used to develop an equation describing the diffusivity as a function of temperature. The temperature dependence of D_A was of the Arrhenius type.
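A hedged sketch of the Arrhenius analysis (only the temperatures come from the abstract; the diffusivity values are invented solely to demonstrate the regression): with D_A = D₀ exp(−E_a/(RT)), a straight-line fit of ln D_A against 1/T yields E_a and D₀.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

# Temperatures from the abstract; diffusivities are hypothetical stand-ins.
T = np.array([27.8, 40.0, 50.0, 60.0]) + 273.15      # K
D = np.array([1.2e-11, 2.1e-11, 3.2e-11, 4.8e-11])   # m^2/s, made up

# ln D = ln D0 - (Ea/R)*(1/T): ordinary linear least squares.
slope, intercept = np.polyfit(1.0 / T, np.log(D), 1)
Ea = -slope * R          # activation energy, J/mol
D0 = np.exp(intercept)   # pre-exponential factor, m^2/s
print(f"Ea = {Ea / 1000:.1f} kJ/mol, D0 = {D0:.2e} m^2/s")
```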
123.
With the wide availability of digital video content on the internet, users need more assistance in accessing digital videos. Considerable research has been done on video summarization and semantic video analysis to help satisfy these needs. These works develop condensed versions of a full-length video stream by identifying the most important and pertinent content within the stream. Most existing work in these areas focuses on event mining. Event mining from video streams improves the accessibility and reusability of large media collections, and it has been an active area of research with notable recent progress. It covers a wide range of multimedia domains such as surveillance, meetings, broadcast, news, sports, documentaries, and films, as well as personal and online media collections. Given the variety and abundance of event mining techniques, in this paper we propose an analytical framework to classify event mining techniques and to evaluate them against important functional measures. This framework can support empirical and technical comparison of event mining methods and the development of more efficient structures in the future.
124.
Robot manufacturers will be required to demonstrate objectively that all reasonably foreseeable hazards have been identified in any robotic product design that is to be marketed commercially. This is problematic for autonomous mobile robots because conventional methods, which were developed for automatic systems, do not help safety analysts identify non-mission interactions: interactions with environmental features that are not directly associated with the robot's design mission, and which may comprise the majority of the required tasks of autonomous robots. In this paper we develop a new variant of preliminary hazard analysis that is explicitly aimed at identifying non-mission interactions by means of new sets of guidewords not normally found in existing variants. We develop the required features of the method and describe its application to several small trials conducted at Bristol Robotics Laboratory in the 2011–2012 period.
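A minimal sketch of how a guideword-driven preliminary hazard analysis can be mechanized; the guidewords and environmental features below are illustrative placeholders, not the paper's actual word sets:

```python
from itertools import product

# Placeholder word sets; the paper's non-mission guidewords differ.
guidewords = ["approaches", "blocks", "collides with", "ignores"]
features = ["pedestrian", "pet", "staircase", "glass door"]

def candidate_hazards(guidewords, features):
    """Enumerate guideword x feature prompts for the analyst to review."""
    for gw, feat in product(guidewords, features):
        yield f"Robot {gw} {feat} outside its design mission"

for prompt in candidate_hazards(guidewords, features):
    print(prompt)
```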
125.
Cloud computing techniques take the form of distributed computing, using multiple computers to execute computations simultaneously on the service side. To process the growing volume of multimedia data, numerous large-scale multimedia data storage and computing techniques have been developed for cloud computing. Among these, Hadoop plays a key role: a computing cluster built from low-priced hardware, Hadoop can perform parallel computation over petabytes of multimedia data, and it features high reliability, high efficiency, and high scalability. These large-scale multimedia data computing techniques include not only the core components, Hadoop and MapReduce, but also data collection techniques such as File Transfer Protocol and Flume, together with distributed system configuration allocation, automatic installation, and monitoring platform building and management. Only with the integration of all these techniques can a reliable large-scale multimedia data platform be offered. In this paper, we show how cloud computing can make a breakthrough by proposing a multimedia social network dataset on the Hadoop platform and implementing a prototype version. Detailed specifications and design issues are discussed as well. An important finding of this article is that multimedia social network analysis completes in less time on a cloud Hadoop platform than on a single computer. The advantages of cloud computing over traditional data processing practices are fully demonstrated, applicable framework designs and tools for large-scale data processing are proposed, and the experimental multimedia data, including data sizes and processing times, are reported.
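A hedged illustration of the MapReduce style the abstract builds on: a generic Hadoop Streaming mapper/reducer pair that counts interactions per user in a hypothetical tab-separated social-network log (not the authors' actual pipeline):

```python
#!/usr/bin/env python3
# mapper.py -- emit "user\t1" for each log line of the form "user<TAB>item".
import sys

for line in sys.stdin:
    user = line.rstrip("\n").split("\t")[0]
    if user:
        print(f"{user}\t1")
```

```python
#!/usr/bin/env python3
# reducer.py -- sum counts per user (Hadoop delivers input sorted by key).
import sys

current, total = None, 0
for line in sys.stdin:
    user, count = line.rstrip("\n").split("\t")
    if user != current:
        if current is not None:
            print(f"{current}\t{total}")
        current, total = user, 0
    total += int(count)
if current is not None:
    print(f"{current}\t{total}")
```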
126.
A new variant of Differential Evolution (DE), called ADE-Grid, is presented in this paper; it adapts the mutation strategy, crossover rate (CR), and scale factor (F) during the run. In ADE-Grid, learning automata (LA), which are powerful decision-making machines, are used to adaptively determine the proper values of the parameters CR and F and the suitable strategy for constructing a mutant vector for each individual. The proposed automata-based DE is able to maintain diversity among the individuals and encourage them to move toward several promising areas of the search space as well as the best found position. Numerical experiments are conducted on a set of twenty-four well-known benchmark functions and one real-world engineering problem. The performance comparison between ADE-Grid and other state-of-the-art DE variants indicates that ADE-Grid is a viable approach for optimization. The results also show that the proposed ADE-Grid improves the performance of DE in terms of both convergence speed and quality of the final solution.
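For orientation, a minimal classic DE/rand/1/bin loop is sketched below; ADE-Grid differs by letting learning automata adapt F, CR, and the mutation strategy per individual, which this sketch deliberately omits:

```python
import numpy as np

def de_rand_1_bin(f, bounds, pop_size=30, F=0.5, CR=0.9, iters=200, seed=0):
    """Classic DE/rand/1/bin with fixed F and CR (ADE-Grid adapts them)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = lo.size
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([f(x) for x in pop])
    for _ in range(iters):
        for i in range(pop_size):
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)        # mutation
            cross = rng.random(dim) < CR                     # binomial crossover
            cross[rng.integers(dim)] = True                  # force one gene
            trial = np.where(cross, mutant, pop[i])
            ft = f(trial)
            if ft < fit[i]:                                  # greedy selection
                pop[i], fit[i] = trial, ft
    return pop[fit.argmin()], fit.min()

# Example: minimize the sphere function in 5 dimensions.
best, val = de_rand_1_bin(lambda x: np.sum(x**2), [(-5, 5)] * 5)
print(best, val)
```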
127.
The increasing demand for the execution of large-scale Cloud workflow applications, which need a robust and elastic computing infrastructure, usually leads to the use of high-performance Grid computing clusters. As the owners of Cloud applications expect the Grid environment to fulfill the requested Quality of Service (QoS), an adaptive scheduling mechanism is needed that can distribute a large number of related tasks with different computational and communication demands across multi-cluster Grid computing environments. Addressing the problem of scheduling large-scale Cloud workflow applications onto a multi-cluster Grid environment under the QoS constraints declared by the application's owner is the main contribution of this paper. Heterogeneity of resource types (service types) is one of the most important issues that significantly affect workflow scheduling in a Grid environment. On the other hand, a Cloud application workflow usually consists of different tasks needing different resource types to complete, which we call heterogeneity in the workflow. The main idea underlying all the algorithms and techniques introduced in this paper is to match the heterogeneity in the Cloud application's workflow to the heterogeneity in the Grid clusters. To achieve this objective, a new bi-level advance reservation strategy is introduced, based on the idea of first performing global scheduling and then conducting local scheduling. Global scheduling is responsible for dynamically partitioning the received DAG into multiple sub-workflows, realized by two collaborating algorithms: (1) the Critical Path Extraction algorithm (CPE), which proposes a new dynamic task overall-criticality strategy based on the DAG's specification and the requested resource type's QoS status to determine the criticality of each task; and (2) the DAG Partitioning algorithm (DAGP), which introduces a novel dynamic score-based approach to extract sub-workflows along critical paths, using a new Fuzzy Qualitative Value Calculation System to evaluate the environment. Local scheduling is responsible for scheduling tasks on suitable resources by utilizing a new Multi-Criteria Advance Reservation algorithm (MCAR), which simultaneously meets high reliability and QoS expectations when scheduling distributed Cloud-based applications. We used simulation to evaluate the performance of the proposed mechanism against four well-known approaches. The results show that the proposed algorithm outperforms the other approaches on different QoS-related terms.
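A hedged sketch of the critical-path idea underlying CPE, using plain task durations as weights (the paper's overall-criticality value also incorporates requested QoS status, which is omitted here):

```python
from graphlib import TopologicalSorter

def critical_path(tasks, deps):
    """Longest path through a DAG of weighted tasks.
    tasks: {name: duration}; deps: {name: set of predecessors}."""
    order = TopologicalSorter(deps).static_order()
    dist, prev = {}, {}
    for t in order:
        preds = deps.get(t, set())
        best = max(preds, key=lambda p: dist[p], default=None)
        dist[t] = tasks[t] + (dist[best] if best is not None else 0)
        prev[t] = best
    end = max(dist, key=dist.get)
    total, path = dist[end], []
    while end is not None:                 # backtrack along predecessors
        path.append(end)
        end = prev[end]
    return path[::-1], total

# Toy workflow: durations and precedence constraints.
tasks = {"A": 3, "B": 2, "C": 4, "D": 1}
deps = {"B": {"A"}, "C": {"A"}, "D": {"B", "C"}}
print(critical_path(tasks, deps))          # (['A', 'C', 'D'], 8)
```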
128.
In the present study, a Group Method of Data Handling (GMDH) network was utilized to predict the scour depth below pipelines. The GMDH network was developed using back propagation. Input parameters considered effective for the scour depth included sediment size, pipeline geometry, and approaching-flow characteristics. Training and testing of the GMDH networks were carried out using nondimensional data sets collected from the literature. These data sets correspond to the two main situations in pipeline scour experiments, namely clear-water and live-bed conditions. The testing performance was compared with support vector machines (SVM) and existing empirical equations. The GMDH network with back propagation produced lower scour depth prediction error than the SVM and the empirical equations. The effects of the input parameters on the scour depth were also investigated.
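A hedged sketch of the GMDH building block: one polynomial neuron fit by least squares on synthetic stand-in data (the paper's network stacks and selects many such neurons layer by layer and refines them with back propagation, which is omitted):

```python
import numpy as np

def fit_gmdh_neuron(x1, x2, y):
    """Least-squares fit of the classic GMDH quadratic polynomial
    y = a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1**2 + a5*x2**2."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, A @ coef

# Synthetic stand-ins for two nondimensional inputs and scour depth.
rng = np.random.default_rng(1)
x1, x2 = rng.uniform(0, 1, 200), rng.uniform(0, 1, 200)
y = 0.4 + 0.8 * x1 * x2 - 0.3 * x2**2 + rng.normal(0, 0.01, 200)

coef, y_hat = fit_gmdh_neuron(x1, x2, y)
print("coefficients:", np.round(coef, 3))
print("RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```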
129.
In this paper, a novel algorithm for image encryption based on a hash function is proposed. In our algorithm, a 512-bit external secret key is used as the input value of the Salsa20 hash function. First, the hash function is modified to generate a key stream more suitable for image encryption. Then the final encryption key stream is produced by correlating the key stream with the plaintext, yielding both key sensitivity and plaintext sensitivity. The scheme achieves high sensitivity, high complexity, and high security through only two rounds of diffusion. In the first round, the original image is partitioned horizontally into an array of 1,024 sections of size 8 × 8; in the second round, the same operation is applied vertically to the transpose of the resulting array. The main idea of the algorithm is to use the average of the image data for encryption: to encrypt each section, the average of the other sections is employed, so the algorithm uses different averages when encrypting different input images (even with the same hash-based sequence). This significantly increases the resistance of the cryptosystem against known/chosen-plaintext and differential attacks. It is demonstrated that the 2D correlation coefficient (CC), peak signal-to-noise ratio (PSNR), encryption quality (EQ), entropy, mean absolute error (MAE), and decryption quality satisfy the security and performance requirements (CC < 0.002177, PSNR < 8.4642, EQ > 204.8, entropy > 7.9974, and MAE > 79.35). Number of pixel change rate (NPCR) analysis reveals that when only one pixel of the plain image is modified, almost all cipher pixels change (NPCR > 99.6125%), and the unified average changing intensity is high (UACI > 33.458%). Moreover, the proposed algorithm is very sensitive to small changes (e.g., modification of only one bit) in the external secret key (NPCR > 99.65%, UACI > 33.55%). The algorithm is shown to yield better security performance than results obtained with other algorithms.
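A hedged toy sketch of the plaintext-dependent diffusion idea, with SHA-512 standing in for the paper's modified Salsa20 core; note that a real scheme must also make the averages recoverable for decryption, a bookkeeping step omitted here:

```python
import hashlib
import numpy as np

def encrypt_blocks(img, key: bytes):
    """Toy diffusion: XOR each 8x8 block with a keystream derived from
    the secret key, the block index, and the mean of all the OTHER
    blocks, making the stream plaintext-dependent. Assumes image
    dimensions divisible by 8."""
    h, w = img.shape
    blocks = img.reshape(h // 8, 8, w // 8, 8).swapaxes(1, 2)
    flat = blocks.reshape(-1, 64).astype(np.int64)
    totals = flat.sum(axis=1)
    grand = totals.sum()
    out = np.empty_like(flat, dtype=np.uint8)
    for i, block in enumerate(flat):
        other_avg = (grand - totals[i]) // (64 * (len(flat) - 1))
        seed = key + i.to_bytes(4, "big") + int(other_avg).to_bytes(2, "big")
        stream = np.frombuffer(hashlib.sha512(seed).digest(), np.uint8)
        out[i] = block.astype(np.uint8) ^ stream[:64]   # 64-byte digest = one block
    return out.reshape(blocks.shape).swapaxes(1, 2).reshape(h, w)

# Example on a random 256x256 8-bit image with a 512-bit key.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, (256, 256), dtype=np.uint8)
cipher = encrypt_blocks(img, b"\x42" * 64)
print(cipher.shape, cipher.dtype)
```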