11.
Combining accurate neural networks (NNs) whose errors are negatively correlated greatly improves the generalization ability of an ensemble. Mixture of experts (ME) is a popular combining method that employs a special error function to train NN experts simultaneously so that they become negatively correlated. Although ME can produce negatively correlated experts, it lacks an explicit control parameter, such as the one provided by the negative correlation learning (NCL) method, for adjusting the degree of correlation. In this study, an approach is proposed to introduce this advantage of NCL into the training algorithm of ME, yielding the mixture of negatively correlated experts (MNCE). In the proposed method, an NCL-style control parameter is incorporated into the error function of ME, which enables the training algorithm to strike a better balance in the bias-variance-covariance trade-off and thus improves generalization. The proposed hybrid ensemble method, MNCE, is compared with its constituent methods, ME and NCL, on several benchmark problems. The experimental results show that the proposed ensemble method significantly outperforms the original ensemble methods.
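For reference, the error function of standard NCL illustrates the explicit control parameter the abstract refers to; this is the conventional NCL formulation, not the MNCE objective itself, which embeds an analogous parameter in the ME error function:

```latex
% NCL objective for expert i on sample n; \lambda is the explicit control parameter.
e_i(n) = \tfrac{1}{2}\bigl(F_i(n) - d(n)\bigr)^2 + \lambda\, p_i(n),
\qquad
p_i(n) = \bigl(F_i(n) - \bar{F}(n)\bigr)\sum_{j \neq i}\bigl(F_j(n) - \bar{F}(n)\bigr)
```

Here F̄(n) is the ensemble output and d(n) the target; λ = 0 reduces to independent training of the experts, while larger λ pushes their errors toward negative correlation.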
12.
In this article, we consider the project critical path problem in an environment with hybrid uncertainty. In this environment, activity durations are modeled as random fuzzy variables that simultaneously exhibit probabilistic and fuzzy natures. To obtain a robust critical path under this kind of uncertainty, a chance-constrained programming model is used. This model is converted to a deterministic model in two stages. In the first stage, the uncertain model is converted to a model with interval parameters using the alpha-cut method and distribution-function concepts. In the second stage, the interval model is converted to a deterministic model via robust optimization and the min-max regret criterion, and finally a genetic algorithm combined with a proposed exact algorithm is applied to solve the resulting model. Numerical examples are given to show the efficiency of the solution procedure.
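As a minimal sketch of the deterministic computation that the interval model ultimately reduces to, the snippet below runs a critical-path (CPM) forward pass on a toy activity network and evaluates it at the lower and upper bounds of each duration interval. The network, the interval values, and the bound-scenario evaluation are illustrative assumptions; they do not reproduce the paper's chance-constrained or min-max regret formulation.

```python
# Minimal CPM forward pass on a DAG with interval activity durations.
# Illustrative only: evaluating each interval's bounds gives optimistic and
# pessimistic scenarios; the paper's min-max regret model is more involved.
from collections import defaultdict

# activity -> (lo, hi) duration interval; edges: predecessor -> successors
durations = {"A": (2, 4), "B": (3, 6), "C": (1, 2), "D": (4, 5)}
edges = {"A": ["C"], "B": ["C", "D"], "C": [], "D": []}
preds = defaultdict(list)
for u, vs in edges.items():
    for v in vs:
        preds[v].append(u)

def earliest_finish(scenario):
    """Longest-path (critical path) forward pass for one duration scenario."""
    finish = {}
    def ef(v):
        if v not in finish:
            start = max((ef(p) for p in preds[v]), default=0)
            finish[v] = start + scenario[v]
        return finish[v]
    return max(ef(v) for v in durations)

pessimistic = {a: hi for a, (lo, hi) in durations.items()}
optimistic = {a: lo for a, (lo, hi) in durations.items()}
print("project duration interval:",
      (earliest_finish(optimistic), earliest_finish(pessimistic)))
```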
13.
Electrospinning with a collector consisting of two pieces of electrically conductive substrate separated by a gap has been used to prepare uniaxially aligned PAN nanofibers. A 15 wt % solution of PAN in DMF was initially used for electrospinning. The effects of gap width and applied voltage on the degree of alignment were investigated with an image-processing technique based on the Fourier power spectrum method. The electrospinning conditions giving the best alignment of nanofibers for solution concentrations of 10–15 wt % were determined experimentally. Bundles of uniaxially aligned nanofibers, resembling multifilament yarns, were prepared using a new, simple method. After-treatments of these bundles were carried out in boiling water under tension. The crystallinity and mechanical behavior of post-treated and untreated bundles were compared. © 2006 Wiley Periodicals, Inc. J Appl Polym Sci 101: 4350–4357, 2006
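As a rough sketch of the Fourier power spectrum technique named above, the snippet below computes the angular distribution of 2-D spectral power for a grayscale fiber image; well-aligned fibers concentrate spectral power in a narrow angular band. The synthetic test image, bin count, and the simple arg-max readout are illustrative assumptions, not the authors' implementation.

```python
# Quantify fiber alignment from a grayscale micrograph via the 2-D Fourier
# power spectrum: aligned fibers concentrate power along one direction, so a
# sharply peaked angular distribution indicates good alignment.
import numpy as np

def angular_power_distribution(image, n_bins=180):
    """Sum the 2-D power spectrum into angular bins covering 0-180 degrees."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    power = np.abs(spectrum) ** 2
    h, w = image.shape
    y, x = np.indices((h, w))
    angles = np.degrees(np.arctan2(y - h / 2, x - w / 2)) % 180.0
    bins = np.minimum((angles / 180.0 * n_bins).astype(int), n_bins - 1)
    return np.bincount(bins.ravel(), weights=power.ravel(), minlength=n_bins)

# Illustrative use with a synthetic image of horizontal stripes ("fibers").
img = np.sin(np.linspace(0, 40 * np.pi, 256))[:, None] * np.ones((256, 256))
dist = angular_power_distribution(img)
dominant = np.argmax(dist[1:]) + 1  # skip the DC-dominated bin 0
print("spectral power peaks at (degrees, perpendicular to fiber axis):", dominant)
```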
14.
In this paper, a novel algorithm for image encryption based on a hash function is proposed. In our algorithm, a 512-bit external secret key is used as the input to the Salsa20 hash function. First, the hash function is modified to generate a key stream that is more suitable for image encryption. The final encryption key stream is then produced by correlating the key stream with the plaintext, yielding both key sensitivity and plaintext sensitivity. The scheme achieves high sensitivity, high complexity, and high security through only two rounds of diffusion. In the first round, the original image is partitioned horizontally into an array of 1,024 sections of size 8 × 8. In the second round, the same operation is applied vertically to the transpose of the resulting array. The main idea of the algorithm is to use the average of the image data for encryption: each section is encrypted using the average of the other sections. The algorithm therefore uses different averages when encrypting different input images, even with the same hash-based key stream, which significantly increases the resistance of the cryptosystem against known/chosen-plaintext and differential attacks. It is demonstrated that the 2D correlation coefficients (CC), peak signal-to-noise ratio (PSNR), encryption quality (EQ), entropy, mean absolute error (MAE) and decryption quality satisfy security and performance requirements (CC < 0.002177, PSNR < 8.4642, EQ > 204.8, entropy > 7.9974 and MAE > 79.35). Number of pixel change rate (NPCR) analysis reveals that when only one pixel of the plain image is modified, almost all cipher pixels change (NPCR > 99.6125 %), and the unified average changing intensity is high (UACI > 33.458 %). Moreover, the proposed algorithm is very sensitive to small changes (e.g., modification of a single bit) in the external secret key (NPCR > 99.65 %, UACI > 33.55 %). This algorithm yields better security performance than the results obtained from other algorithms.
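The section-average diffusion idea can be illustrated roughly as follows, assuming an 8-bit grayscale image and a precomputed key stream. This simplified single (horizontal) round only demonstrates how every cipher section comes to depend on the whole plaintext; it omits the modified Salsa20 key-stream generation, the second vertical round, and everything needed for actual decryption.

```python
# Simplified sketch of one diffusion round driven by section averages:
# each 8x8 section is masked with the key stream and with the mean of all
# *other* sections, so every cipher section depends on the whole plaintext.
import numpy as np

def diffuse_sections(image, keystream, section=8):
    """One horizontal diffusion round over an 8-bit grayscale image."""
    h, w = image.shape
    blocks = image.reshape(h // section, section, w // section, section)
    blocks = blocks.swapaxes(1, 2).reshape(-1, section, section).astype(np.int64)
    total = blocks.sum()
    n = blocks.shape[0]
    ks = keystream.reshape(n, section, section).astype(np.int64)
    cipher = np.empty_like(blocks)
    for i in range(n):
        # average over all sections except the current one
        other_avg = (total - blocks[i].sum()) // ((n - 1) * section * section)
        cipher[i] = (blocks[i] + other_avg + ks[i]) % 256
    return cipher.astype(np.uint8)

rng = np.random.default_rng(0)
plain = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # 1,024 sections of 8x8
keystream = rng.integers(0, 256, size=plain.size, dtype=np.uint8)
enc = diffuse_sections(plain, keystream)
print(enc.shape, enc.dtype)
```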
15.
The ability to handle very large amounts of image data is important for image analysis, indexing and retrieval applications. Sadly, in the literature, scalability aspects are often ignored or glossed over, especially with respect to the intricacies of actual implementation details. In this paper we present a case study showing how a standard bag-of-visual-words image indexing pipeline can be scaled across a distributed cluster of machines. To achieve scalability, we investigate the optimal combination of hybridisations of the MapReduce distributed computation framework that allows the components of the analysis and indexing pipeline to be effectively mapped onto and run on modern server hardware. We then demonstrate the scalability of the approach in practice with a set of image analysis and indexing tools built on top of the Apache Hadoop MapReduce framework. The tools used for our experiments are freely available as open-source software, and the paper fully describes the nuances of their implementation.
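A minimal map/reduce-style sketch of the indexing step, written in plain Python rather than with the Apache Hadoop tooling the paper uses: the mapper quantises each image's local descriptors to visual-word IDs, and the reducer aggregates the emitted pairs into an inverted index. The nearest-neighbour quantiser, codebook size, and toy data are illustrative assumptions.

```python
# Map/reduce-style sketch of bag-of-visual-words indexing: the mapper emits
# (visual_word, image_id) pairs, the reducer groups them into an inverted
# index mapping each visual word to the images (and counts) containing it.
from collections import Counter, defaultdict

import numpy as np

def quantise(descriptor, codebook):
    """Assign a local feature descriptor to its nearest visual word."""
    return int(np.argmin(np.linalg.norm(codebook - descriptor, axis=1)))

def mapper(image_id, descriptors, codebook):
    for d in descriptors:
        yield quantise(d, codebook), image_id

def reducer(pairs):
    index = defaultdict(Counter)
    for word, image_id in pairs:
        index[word][image_id] += 1
    return index

# Toy run: a random codebook of 32 visual words, two "images" of SIFT-like vectors.
rng = np.random.default_rng(0)
codebook = rng.normal(size=(32, 128))
images = {"img1": rng.normal(size=(50, 128)), "img2": rng.normal(size=(40, 128))}
pairs = [p for img, desc in images.items() for p in mapper(img, desc, codebook)]
inverted = reducer(pairs)
print(len(inverted), "visual words indexed")
```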
16.
Adsorption isotherms for activated carbon made from pecan shells have been obtained at 25 °C and an approximate pH of 3 for a number of metal-ion solutes. It was found that the Sips and Freundlich equations described the experimental data satisfactorily. The correlation of metal-ion adsorption with two solute parameters, the metal-ion electronegativity and the first stability constant of the metal hydroxide, was investigated. For most of the metal ions studied, higher electronegativities and stability constants corresponded to higher adsorption levels onto the activated carbon. A correlation was developed that predicts the constants of the Freundlich equation from the selected parameters of the metal ions, and thus can predict the adsorption isotherms at constant pH. The developed correlation gives results with acceptable deviations from the experimental data. A procedure is proposed for obtaining similar correlations for different conditions (temperature, pH, carbon type and dosage). The ratio of equivalent metal ions adsorbed to protons released is calculated for the studied metal ions over a range of concentrations. In most cases, particularly at low concentrations, this ratio is close to one, confirming that ion exchange of one proton for one equivalent of metal ion is the dominant reaction mechanism.
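For reference, the two isotherm equations named above in their standard forms (symbols follow common usage rather than the paper's notation):

```latex
% Freundlich isotherm: adsorbed amount q_e versus equilibrium concentration C_e
q_e = K_F\, C_e^{\,1/n}
% Sips (Langmuir-Freundlich) isotherm
q_e = \frac{q_m \,(K_s C_e)^{1/n}}{1 + (K_s C_e)^{1/n}}
```

with K_F, K_s, q_m and the heterogeneity exponent 1/n fitted to the experimental data.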
17.
Cadmium, a highly toxic metal, is released into the environment through paper production, metal processing, phosphate fertilizers, insecticides, and wastewater treatment. Cadmium also interferes with bodily functions and is particularly toxic to the kidneys and other organs. In the current study, a zinc-based metal–organic framework, zeolitic imidazolate framework-8 (ZIF-8), was synthesized and modified with dimethylethylenediamine (ZIF-8-mmen) for the removal of cadmium. To optimize the experiments, response surface methodology with a central composite design was applied to three variables: pH, adsorbent dosage, and contact time. The optimum conditions for pH, dosage, and time were 2, 0.1 g, and 89 min, respectively, with a removal efficiency of 85.38%. The Langmuir isotherm (q_m = 1000 mg/g) indicates monolayer adsorption. The kinetic studies reveal that the Lagergren model was predominant and that cadmium was not chemisorbed. Thermodynamic parameters indicate a spontaneous, endothermic physisorption process.
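For reference, the Langmuir isotherm and the Lagergren (pseudo-first-order) kinetic model cited above, in their standard forms (symbols follow common usage):

```latex
% Langmuir isotherm (monolayer adsorption)
q_e = \frac{q_m K_L C_e}{1 + K_L C_e}
% Lagergren pseudo-first-order model, linearised form
\log\left(q_e - q_t\right) = \log q_e - \frac{k_1}{2.303}\, t
```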
18.

Mapping vulnerability to Saltwater Intrusion (SWI) in coastal aquifers is studied in this paper using the GALDIT framework, with the novelty of transforming the concept of vulnerability indexing into risk indexing. GALDIT is an acronym for six data layers that are combined, by consensus, to express the vulnerability of freshwater aquifers to saltwater intrusion. It is a scoring system of prescribed rates, which account for local variations, and prescribed weights, which account for the relative importance of each data layer; both suffer from subjectivity. Another novelty of the paper is the use of fuzzy logic to learn the rate values and catastrophe theory to learn the weight values; together these are implemented as the Fuzzy-Catastrophe Scheme (FCS). The GALDIT data layers are divided into two groups, Passive Vulnerability Indices (PVI) and Active Vulnerability Indices (AVI), whose sum is the Total Vulnerability Index (TVI), equivalent to GALDIT. Two additional data layers (pumping and water-table decline) are introduced to serve as the Risk Actuation Index (RAI). The product of TVI and RAI yields the Risk Index (RI). The paper applies these concepts to a study area subject to groundwater decline and a possible saltwater-intrusion problem. The results provide a proof of concept for PVI, AVI, RAI and RI by studying their correlation with groundwater quality samples using the fraction of saltwater (fsea), Groundwater Quality Indices (GQI) and the Piper diagram. Significant correlations between the appropriate values are found, providing new insight into the study area.
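The index arithmetic described above can be summarised as follows; the weighted-rate form is the standard GALDIT index, while the grouping and the risk product follow the abstract (exactly how PVI and AVI are normalised within the sum is not specified there):

```latex
% Standard GALDIT vulnerability index over the six layers, weights W_i and rates R_i
\mathrm{GALDIT} = \frac{\sum_{i=1}^{6} W_i R_i}{\sum_{i=1}^{6} W_i}
% Grouping and risk indexing as described in the abstract
\mathrm{TVI} = \mathrm{PVI} + \mathrm{AVI}, \qquad \mathrm{RI} = \mathrm{TVI} \times \mathrm{RAI}
```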
19.
In this work, the transport properties of gaseous penetrants through several dense glassy polymeric membranes are studied. The nonequilibrium lattice fluid (NELF) model, in conjunction with the modified Fick's law and the dual-mode sorption model, was used to simulate gas transport in glassy polymeric membranes. The approach is based on the sorption-diffusion mechanism, in which solubility is calculated from the NELF model and the diffusion coefficient is obtained as the product of a thermodynamic coefficient and the molecular mobility. The governing equation is solved by the finite element method using COMSOL Multiphysics software. The developed model for the gas permeability of glassy polymeric membranes can be applied over a wide range of pressures and temperatures. Comparison of the calculated gas permeability and solubility with experimental data demonstrates the capability of the developed model. Increasing the feed gas temperature increases the gas permeability while lowering the gas solubility in the glassy polymeric membranes. The effects of feed temperature and pressure on permeability and solubility are investigated, and experimental data from the literature are described by the developed model, which predicts them well over the considered conditions.
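Two standard relations underlying the sorption-diffusion treatment described above, in their conventional forms (not the paper's specific NELF working equations):

```latex
% Permeability as the product of diffusivity and solubility (sorption-diffusion)
P = D \times S
% Dual-mode sorption: Henry's-law dissolution plus Langmuir "hole filling"
C = k_D\, p + \frac{C'_H\, b\, p}{1 + b\, p}
```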