By access type:
Paid full text: 2965 articles
Free: 274 articles
Free (domestic): 27 articles

By subject area:
Electrical engineering: 74 articles
General: 14 articles
Chemical industry: 814 articles
Metalworking: 86 articles
Machinery and instruments: 163 articles
Building science: 136 articles
Mining engineering: 9 articles
Energy and power: 193 articles
Light industry: 357 articles
Water conservancy engineering: 57 articles
Petroleum and natural gas: 47 articles
Radio and electronics: 251 articles
General industrial technology: 455 articles
Metallurgical industry: 77 articles
Atomic energy technology: 23 articles
Automation technology: 510 articles

By publication year:
2024: 16 articles
2023: 75 articles
2022: 106 articles
2021: 231 articles
2020: 210 articles
2019: 249 articles
2018: 285 articles
2017: 264 articles
2016: 247 articles
2015: 137 articles
2014: 231 articles
2013: 346 articles
2012: 206 articles
2011: 204 articles
2010: 129 articles
2009: 100 articles
2008: 54 articles
2007: 32 articles
2006: 33 articles
2005: 15 articles
2004: 21 articles
2003: 13 articles
2002: 7 articles
2001: 3 articles
2000: 5 articles
1999: 5 articles
1998: 6 articles
1997: 8 articles
1996: 6 articles
1995: 4 articles
1994: 3 articles
1992: 2 articles
1991: 4 articles
1989: 2 articles
1988: 1 article
1987: 1 article
1985: 1 article
1980: 1 article
1977: 2 articles
1974: 1 article

A total of 3266 results were found (search time: 31 ms).
21.

Today, XML is the de facto standard for broadcasting data over mobile wireless networks. In these networks, mobile clients send their XML queries over a wireless broadcast channel and receive the desired XML data from the channel. However, downloading the whole XML document is a challenge, because the mobile devices used by clients are small, battery-powered devices with limited resources. To meet this challenge, the XML data should be indexed so that the desired data can be located easily and only that data, rather than the whole document, is downloaded by the mobile clients. Several indexing methods have been proposed for selectively accessing XML data over an XML stream; however, the existing methods increase the size of the stream because they embed extra information in it. In this paper, a new XML stream structure is proposed for disseminating XML data over a broadcast channel by grouping and summarizing the structural information of XML nodes. Summarizing this information reduces the size of the XML stream and therefore the latency of retrieving the desired XML data over a wireless broadcast channel. The proposed stream structure also contains indexes for skipping the irrelevant parts of the stream, which reduces the energy consumed by mobile devices when downloading the results of XML queries. In addition, the proposed structure can process different types of XML queries, and experimental results show that it improves the performance of XML query processing over the data stream compared with existing work in terms of access and tuning times.

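The index-and-skip access pattern described above can be illustrated with a minimal Python sketch. The segment layout (a length prefix plus a one-byte group tag), the field names, and the matches_query predicate are hypothetical stand-ins, not the stream structure proposed in the paper:

```python
import io
import struct

def read_matching_segments(stream, matches_query):
    """Scan a broadcast stream of length-prefixed segments.

    Hypothetical layout per segment: 4-byte big-endian payload length,
    1-byte group tag, then the payload. The index information (here just
    the length prefix and tag) lets a client download only the relevant
    payloads and skip the rest, saving tuning time and energy.
    """
    results = []
    while True:
        header = stream.read(5)
        if len(header) < 5:                 # end of the broadcast cycle
            break
        length, tag = struct.unpack(">IB", header)
        if matches_query(tag):              # relevant group: download payload
            results.append(stream.read(length))
        else:                               # irrelevant group: skip over it
            stream.seek(length, io.SEEK_CUR)
    return results

# Example with an in-memory stream standing in for the broadcast channel:
# data = struct.pack(">IB3s", 3, 7, b"abc") + struct.pack(">IB2s", 2, 9, b"xy")
# read_matching_segments(io.BytesIO(data), lambda tag: tag == 7)  # -> [b"abc"]
```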
22.
Mahdi M. Microsystem Technologies, 2021, 27(8): 2913-2917
In this paper, we propose a simple design for a heating device with ultra-low power consumption. The device is composed of a micro-heater made of Nichrome (20/80)...
23.
Combining accurate neural networks (NNs) in an ensemble with negative error correlation greatly improves generalization ability. Mixture of experts (ME) is a popular combining method that employs a special error function for the simultaneous training of NN experts so as to produce negatively correlated experts. Although ME can produce negatively correlated experts, it lacks an explicit control parameter, such as the one in negative correlation learning (NCL), for adjusting the degree of correlation. In this study, an approach is proposed to introduce this advantage of NCL into the training algorithm of ME, called the mixture of negatively correlated experts (MNCE). In the proposed method, the control parameter of NCL is incorporated into the error function of ME, which enables the training algorithm to strike a better balance in the bias-variance-covariance trade-off and thus improves generalization ability. The proposed hybrid ensemble method, MNCE, is compared with its constituent methods, ME and NCL, on several benchmark problems. The experimental results show that the proposed ensemble method significantly outperforms the original ensemble methods.
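For context, the negative correlation penalty that MNCE borrows from NCL can be written down directly. The numpy sketch below shows the standard NCL loss with its explicit control parameter (here lambda_); it is an illustration of NCL, not the authors' exact MNCE error function:

```python
import numpy as np

def ncl_loss(expert_outputs, target, lambda_=0.5):
    """Standard negative correlation learning loss for a regression ensemble.

    expert_outputs: array of shape (n_experts, n_samples), one row per expert.
    target:         array of shape (n_samples,), ground-truth values.
    lambda_:        explicit control parameter; 0 trains experts independently,
                    larger values push the experts' errors to be negatively
                    correlated around the ensemble mean.
    Returns one scalar loss per expert.
    """
    ensemble_mean = expert_outputs.mean(axis=0)              # f_bar
    accuracy_term = 0.5 * (expert_outputs - target) ** 2
    deviations = expert_outputs - ensemble_mean
    # penalty p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar)
    penalty = deviations * (deviations.sum(axis=0) - deviations)
    return (accuracy_term + lambda_ * penalty).mean(axis=1)
```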
24.
In this article, we consider the project critical path problem in an environment with hybrid uncertainty, in which activity durations are modeled as random fuzzy variables that have probabilistic and fuzzy natures simultaneously. To obtain a robust critical path under this kind of uncertainty, a chance-constrained programming model is used. This model is converted to a deterministic model in two stages. In the first stage, the uncertain model is converted to a model with interval parameters using the alpha-cut method and distribution-function concepts. In the second stage, the interval model is converted to a deterministic model using robust optimization and the min-max regret criterion, and finally a genetic algorithm together with a proposed exact algorithm is applied to solve the resulting model. Numerical examples are given to show the efficiency of the solution procedure.
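As a small aside on the interval stage of this procedure: once the alpha-cut turns each activity duration into an interval, the optimistic and pessimistic project makespans follow from an ordinary CPM forward pass. The sketch below shows only that forward pass (the activity names and dictionary-based graph encoding are illustrative); it does not implement the paper's min-max regret model or the genetic algorithm:

```python
from graphlib import TopologicalSorter

def project_duration_bounds(durations, predecessors):
    """CPM forward pass with interval activity durations.

    durations:    dict activity -> (lo, hi) duration bounds.
    predecessors: dict activity -> list of predecessor activities.
    Returns the (optimistic, pessimistic) project makespan, i.e. the longest
    path length through the precedence DAG under lower and upper bounds.
    """
    graph = {act: predecessors.get(act, []) for act in durations}
    finish_lo, finish_hi = {}, {}
    for act in TopologicalSorter(graph).static_order():
        start_lo = max((finish_lo[p] for p in graph[act]), default=0.0)
        start_hi = max((finish_hi[p] for p in graph[act]), default=0.0)
        lo, hi = durations[act]
        finish_lo[act] = start_lo + lo
        finish_hi[act] = start_hi + hi
    return max(finish_lo.values()), max(finish_hi.values())

# Example: A precedes B and C, which both precede D.
# durations = {"A": (2, 3), "B": (4, 6), "C": (1, 5), "D": (2, 2)}
# predecessors = {"B": ["A"], "C": ["A"], "D": ["B", "C"]}
# project_duration_bounds(durations, predecessors)  # -> (8.0, 11.0)
```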
25.
Electrospinning with a collector consisting of two electrically conductive substrates separated by a gap has been used to prepare uniaxially aligned PAN nanofibers. A solution of 15 wt % PAN in DMF was used tentatively for electrospinning. The effects of gap width and applied voltage on the degree of alignment were investigated with an image-processing technique based on the Fourier power spectrum method. The electrospinning conditions giving the best alignment of nanofibers for 10-15 wt % solution concentrations were determined experimentally. Bundles resembling multifilament yarns of uniaxially aligned nanofibers were prepared using a new, simple method. After-treatments of these bundles were carried out in boiling water under tension, and a comparison was made between the crystallinity and mechanical behavior of post-treated and untreated bundles. © 2006 Wiley Periodicals, Inc. J Appl Polym Sci 101: 4350-4357, 2006
26.
Improper maintenance, repair, and operations of societally critical structures can lead to catastrophic failures that drastically affect the global economy, the environment, and everyday life. Because these structures are remote, cramped, and highly irregular environments, routine manual procedures and operations can be tedious, dangerous, and hazardous for humans. Automating maintenance, repair, and operations spares human workers from crawling through highly cluttered and constrained spaces and breathing stale air mixed with welding fumes or particulates from repair work, and it provides higher reliability and consistency in the repair work. This paper introduces SHeRo, a scalable hexapod robot designed for maintenance, repair, and operations within remote, inaccessible, irregular, and hazardous environments. The scalable design extends traditional hexapod robot designs by incorporating two prismatic joints into each leg. A detailed discussion of the design and realization of SHeRo is provided. An analysis of the stability and workspace of SHeRo is presented, and a dynamic criterion is developed that integrates the concepts of robot stability and constant-orientation workspace into a stable workspace. The analytical solution of the lateral stable workspace of SHeRo is derived, along with a metric for comparing stable workspaces between different robot configurations. A simulated demonstration and two physical experimental demonstrations show the advantage of introducing scalability into the hexapod robot design, along with the workspace enhancement and flexibility of the scalable hexapod robot.
27.
In real scheduling problems, disruptions and unexpected events may occur. These disruptions can quickly render the initial schedule infeasible and non-optimal, so an appropriate rescheduling method is needed. In this paper, a new approach is proposed to achieve a stable and robust schedule despite uncertain processing times and unexpected arrivals of new jobs. The approach is a proactive-reactive method with a two-step procedure. In the first step, an initial robust solution is produced proactively against uncertain processing times using a robust optimization approach; this initial solution is less sensitive to future fluctuations in processing times. In the second step, when an unexpected disruption occurs, an appropriate reactive method is adopted to deal with it: the reactive approach determines the best modified sequence after any unexpected disruption based on the classical objective and performance measures. A robustness measure is incorporated into the reactive approach to increase the performance of the realized schedule after disruption. Computational results indicate that this method produces better solutions than four classical heuristic approaches in terms of effectiveness and solution performance.
28.
In this paper, a novel algorithm for image encryption based on a hash function is proposed. In our algorithm, a 512-bit external secret key is used as the input to the Salsa20 hash function. First, the hash function is modified to generate a key stream that is more suitable for image encryption. Then the final encryption key stream is produced by correlating the key stream with the plaintext, yielding both key sensitivity and plaintext sensitivity. The scheme achieves high sensitivity, high complexity, and high security with only two rounds of diffusion. In the first diffusion round, the original image is partitioned horizontally into an array of 1,024 sections of size 8 × 8; in the second round, the same operation is applied vertically to the transpose of the resulting array. The main idea of the algorithm is to use the average of the image data for encryption: to encrypt each section, the average of the other sections is employed. The algorithm therefore uses different averages when encrypting different input images (even with the same hash-based key sequence), which significantly increases the resistance of the cryptosystem against known/chosen-plaintext and differential attacks. It is demonstrated that the 2D correlation coefficient (CC), peak signal-to-noise ratio (PSNR), encryption quality (EQ), entropy, mean absolute error (MAE), and decryption quality satisfy the security and performance requirements (CC < 0.002177, PSNR < 8.4642, EQ > 204.8, entropy > 7.9974, and MAE > 79.35). The number of pixels change rate (NPCR) analysis reveals that when only one pixel of the plain image is modified, almost all of the cipher pixels change (NPCR > 99.6125 %), and the unified average changing intensity is high (UACI > 33.458 %). Moreover, the proposed algorithm is very sensitive to small changes (e.g., modification of a single bit) in the external secret key (NPCR > 99.65 %, UACI > 33.55 %). The algorithm yields better security performance in comparison with the results obtained from other algorithms.
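The NPCR and UACI figures quoted above follow standard definitions, so they are straightforward to reproduce independently; a minimal numpy sketch (not code from the paper) is:

```python
import numpy as np

def npcr_uaci(c1, c2):
    """Differential-attack metrics for two 8-bit cipher images of equal shape.

    NPCR: percentage of pixel positions whose values differ.
    UACI: mean absolute intensity difference normalized by 255, in percent.
    """
    c1 = np.asarray(c1, dtype=np.float64)
    c2 = np.asarray(c2, dtype=np.float64)
    npcr = 100.0 * np.mean(c1 != c2)
    uaci = 100.0 * np.mean(np.abs(c1 - c2) / 255.0)
    return npcr, uaci
```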
29.
Latexes of carboxylated styrene-butadiene rubber were prepared via batch emulsion copolymerization with different amounts of acrylic acid in the absence of an emulsifier. The effect of the acid monomer on particle formation and growth was investigated. It was observed that the amount of acrylic acid strongly affected particle formation: the number of particles, and thus the polymerization rate, increased with increasing acid content, whereas there was no significant difference in the polymerization rate per particle across the experiments. The results show that, in this case, the particle growth process depends less on the amount of acrylic acid than the nucleation stage, and hence the particle number, does. Several parameters, such as the polymerization rate and the number of latex particles per unit volume of the aqueous phase, were calculated, and an attempt was made to evaluate the average number of growing chains per particle. The average particle diameter of these carboxylated SBR latexes was also obtained, for the first time, by calculation from direct measurement of the average particle diameter in the swollen state using a light-scattering technique.
30.
Projection functions have been widely used for facial feature extraction and optical/handwritten character recognition because of their simplicity and efficiency. Since these transformations are not one-to-one, they may map distinct points to the same point and consequently lose detailed information. Here, we solve this problem by defining an N-dimensional space to represent a single image and then proposing a one-to-one transformation in this new image space. The proposed method, which we refer to as the Linear Principal Transformation (LPT), uses eigen-analysis to extract the vector with the highest eigenvalue; the extrema of this vector are then analyzed to extract the features of interest. To evaluate the proposed method, we performed two sets of experiments on facial feature extraction and optical character recognition on three different data sets. The results show that the proposed algorithm outperforms the algorithms considered in the paper by 1.4 % up to 14 % in accuracy, while having comparable time complexity and efficiency.
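The eigen-analysis step described above (take the eigenvector with the largest eigenvalue, then analyze its extrema) can be sketched generically in numpy. This is a hypothetical illustration of that step only, not the paper's actual LPT construction, whose N-dimensional image space the abstract does not specify:

```python
import numpy as np

def principal_vector_extrema(image):
    """Return indices of local extrema of the principal eigenvector.

    The image's column covariance matrix is decomposed, the eigenvector with
    the largest eigenvalue is taken as a 1D profile, and positions that are
    strictly larger or smaller than both neighbors are reported as candidate
    feature locations.
    """
    data = np.asarray(image, dtype=np.float64)
    cov = np.cov(data, rowvar=False)                 # covariance across columns
    eigenvalues, eigenvectors = np.linalg.eigh(cov)  # symmetric matrix -> eigh
    principal = eigenvectors[:, np.argmax(eigenvalues)]
    interior = principal[1:-1]
    is_max = (interior > principal[:-2]) & (interior > principal[2:])
    is_min = (interior < principal[:-2]) & (interior < principal[2:])
    return np.where(is_max | is_min)[0] + 1          # shift back to full indices
```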