128 query results in total (search time: 15 ms)
61.
This paper presents a transmission-line fault-location model based on an Elman recurrent network (ERN) for balanced and unbalanced short-circuit faults. All fault situations, with different inception times, are implemented on a 380-kV prototype power system. The wavelet transform (WT) is used to select distinctive features from the faulty signals. The system has the advantage of using single-end measurements of both voltage and current signals. The ERN can determine the location of a fault on the transmission line rapidly and correctly, making it an important alternative to standard feedforward back-propagation networks (FFNs) and radial basis function (RBF) neural networks.
62.
The aim of this paper is to estimate fault locations on transmission lines quickly and accurately. Faulty current and voltage signals obtained from a simulation are decomposed by the wavelet packet transform (WPT), and the extracted features are fed to an artificial neural network (ANN) that estimates the fault location. As data sets grow in size, their analysis becomes more complicated and time consuming, so energy and entropy criteria are applied to the wavelet packet coefficients to reduce the size of the feature vectors. The ANN test results demonstrate that applying the energy criterion to current signals after the WPT is a powerful and reliable method for reducing the size of the data sets, and hence for estimating fault locations on transmission lines quickly and accurately.
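The energy and entropy criteria mentioned in this abstract can be sketched in a few lines. This is a minimal illustration of the general idea, not the paper's implementation: each wavelet-packet node's coefficient vector is collapsed to two scalars (energy and Shannon entropy of the normalized energy distribution), so the feature vector length no longer depends on the number of coefficients. The function names are my own.

```python
import math

def energy(coeffs):
    """Energy of a coefficient vector: sum of squared magnitudes."""
    return sum(c * c for c in coeffs)

def shannon_entropy(coeffs):
    """Shannon entropy of the normalized coefficient-energy distribution."""
    e = energy(coeffs)
    if e == 0:
        return 0.0
    probs = [(c * c) / e for c in coeffs if c != 0]
    return -sum(p * math.log(p) for p in probs)

def reduce_features(packet_nodes):
    """Collapse each wavelet-packet node to a 2-value (energy, entropy) descriptor."""
    return [(energy(node), shannon_entropy(node)) for node in packet_nodes]
```

In practice the `packet_nodes` lists would come from an actual WPT decomposition of the fault signal (e.g. via a wavelet library); here they are plain coefficient lists.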
63.
With advances in computer technology, virtual screening of small molecules has come into use in drug discovery. Because thousands of compounds are examined in the early phase of drug discovery, a fast classification method that can distinguish active from inactive molecules is needed for screening large compound collections. In this study, we used Support Vector Machines (SVM) for this classification task. SVM is a powerful classification tool that is becoming increasingly popular in machine-learning applications. The data sets consist of 631 compounds for the training set and 216 compounds for a separate test set. In the data pre-processing step, Pearson's correlation coefficient was used as a filter to eliminate redundant features. After application of the correlation filter, a single SVM was applied to the reduced data set. We also investigated the performance of SVM with different feature selection strategies: SVM-Recursive Feature Elimination, a Wrapper Method, and Subset Selection. All feature selection methods generally performed better than a single SVM, with Subset Selection outperforming the other strategies. Having tested SVM on a real-life drug discovery problem, our results indicate that it can be a useful classification method in the early phase of drug discovery.
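The correlation-filter pre-processing step described here is easy to illustrate. The sketch below is an assumption about the general approach, not the authors' code: for each pair of feature columns, if the absolute Pearson correlation exceeds a threshold, only the first of the pair is kept. The threshold value and function names are hypothetical.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def correlation_filter(features, threshold=0.95):
    """Drop any feature highly correlated with an already-kept feature.

    features: list of equal-length columns (one list per feature).
    Returns the indices of the retained features.
    """
    kept = []
    for i, col in enumerate(features):
        if all(abs(pearson(col, features[j])) < threshold for j in kept):
            kept.append(i)
    return kept
```

For example, a feature that is an exact multiple of an earlier one (r = 1) would be discarded, while weakly correlated features survive.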
64.
Divergent thinking (DT) tests are widely used as an estimate of creativity. However, DT tests may be biased by experience: scores may depend on the amount and types of experiences of examinees. This investigation was designed to determine the degree to which personal and social experiences influence DT scores. Two different tasks were administered: a Uses task and Problem Generation (PG). Fluency and originality scores were calculated for each. Analyses indicated that the impact of experience was similar in the PG and Uses tasks. Personal and social experience explained 44% and 30% of fluency scores for the PG and Uses tasks, respectively, and 65% of originality scores for both. The differences between uncorrected scores (all ideas, including those reflecting experience) and corrected scores (ideas tied to personal or social experiences eliminated) were statistically significant, with the largest discrepancy in Uses fluency and the smallest in Uses originality. The findings support the claim that divergent thinking tests may depend heavily on experience. Alternatives for using DT tests without an experiential bias are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
65.
A mixed integer programming model that selects cells able to handle demand variability is presented. Valid inequalities are added to the model's linear programming relaxation, significantly improving the lower bound on the model's optimal objective function value. An efficient heuristic procedure that generates good solutions to this model is also presented.
66.
Complex media fusion operations can be costly in the time they need to process input objects. If data arrive at fusion nodes faster than the nodes can consume them, some input objects will go unprocessed. In this paper, we develop load shedding mechanisms that take into account both data quality and the expensive nature of media fusion operators. In particular, we present quality assessment models for objects and multistream fusion operators, and highlight that such quality assessments may impose partial orders on objects. We show that the most effective load control approach for fusion operators sheds combinations of objects, not individual input objects. Yet identifying suitable combinations in real time is impossible without efficient combination selection algorithms. We therefore develop efficient combination selection schemes for scenarios with different quality assessment and target characteristics: first for fusion operators with unambiguously monotone semantics, and then for the more general ambiguously monotone case. Experimental results show the performance gains of quality-aware, combination-based load shedding strategies under the various fusion scenarios.
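The core idea of combination-based shedding can be sketched abstractly. The following is a simplified illustration under assumptions not in the abstract: object quality is a single float, fusion quality is the product of member qualities (a monotone score: raising any input never lowers the fused score), and the selection is exhaustive, which is viable only for small stream windows. The paper's actual models and algorithms are more general.

```python
from itertools import product

def fusion_quality(combo):
    """Placeholder monotone fusion score: product of member quality values."""
    q = 1.0
    for obj_quality in combo:
        q *= obj_quality
    return q

def select_combinations(streams, budget):
    """Keep the `budget` highest-scoring cross-stream combinations; shed the rest.

    streams: list of streams, each a list of per-object quality scores.
    Exhaustive enumeration -- suitable only for small windows.
    """
    combos = list(product(*streams))
    combos.sort(key=fusion_quality, reverse=True)
    return combos[:budget]
```

Note that shedding by combinations differs from shedding objects: an object with modest quality may still appear in a kept combination if its partners are strong.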
67.
This article reports chromium(VI) removal from water using polyacrylonitrile-co-poly(2-ethyl hexylacrylate) (PAN(92)-co-P2EHA(8)) and polyaniline (PANI) nanoporous membranes prepared at various PANI loadings. Chromium(VI) rejection by the nanoporous membranes was found to be highly concentration and pH dependent. Almost complete chromium removal (99.9%) with high flux values (120–177 L m⁻² h⁻¹) was observed. The nanoporous membranes were also shown to be fouling resistant: total flux loss was low, and the portion attributed to reversible flux loss causes no permanent hysteresis and is easily overcome by simple washing. Scanning electron microscopy (SEM) studies were performed to characterize the cross-sectional morphology. The results indicate that pore size should be small enough for filtration yet optimized for high flux, and that pores should be functionalized for rejection. The chemical structure, swelling ratios, sheet resistivity, and fracture morphologies of the nanoporous membranes are reported. POLYM. ENG. SCI., 2012. © 2012 Society of Plastics Engineers
68.
This study evaluated the permeability of the basalts and pyroclastics and the maximum depth of grout injection at the Atasu dam site, Turkey, using the equations of Kiraly and of Hoek and Bray with values obtained from Lugeon tests. To evaluate maximum discharge values and injection depths, seepage analyses were performed using the finite element technique at 10 m intervals down to 100 m. The results indicate that, to establish an impermeable zone at the dam site, the injection depth should be taken as 50 m for the left and right slopes and 40 m for the river bed.
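To give a sense of the kind of calculation involved, here is a hedged sketch of a Hvorslev-type constant-head packer-test expression of the form commonly quoted in Hoek and Bray's rock engineering texts, together with the rule-of-thumb Lugeon conversion. I cannot confirm these are the exact equations the study used; the function names, parameter choices, and the ~1.3e-7 m/s conversion factor are standard approximations, not taken from the abstract.

```python
import math

def permeability_packer(Q, L, H, r):
    """Approximate permeability k (m/s) from a constant-head packer test,
    using a Hvorslev-type expression:
        k = Q * ln(L / r) / (2 * pi * L * H)   (valid for L >= 10 r)

    Q: steady flow rate (m^3/s), L: test-section length (m),
    H: applied head (m of water), r: borehole radius (m).
    """
    if L < 10 * r:
        raise ValueError("expression assumes L >= 10 r")
    return Q * math.log(L / r) / (2 * math.pi * L * H)

def lugeon_to_m_per_s(lugeon):
    """Rule-of-thumb conversion: 1 Lugeon is commonly taken as ~1.3e-7 m/s."""
    return lugeon * 1.3e-7
```

For example, a 5 m test section of radius 0.05 m passing 1e-4 m^3/s under 100 m of head gives k on the order of 1.5e-7 m/s, i.e. roughly 1 Lugeon.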
69.
We introduce a robust image segmentation method based on a variational formulation using edge flow vectors. We demonstrate the nonconservative nature of this flow field, a property that helps segment objects with concavities. A multiscale version of the method is developed and shown to improve the localization of object boundaries. We compare and contrast the proposed method with well-known state-of-the-art methods. Detailed experimental results on both synthetic and natural images demonstrate that the proposed approach is quite competitive.
70.
Pregabalin is an anticonvulsant drug used for neuropathic pain and as an adjunct therapy for partial seizures, with or without secondary generalization, in adults. In conventional therapy the recommended dose of pregabalin is 75 mg twice daily or 50 mg three times a day, with a maximum dosage of 600 mg/d. To achieve maximum therapeutic effect with a low risk of adverse effects and to reduce frequent drug dosing, modified-release preparations such as microspheres may be helpful. However, most microencapsulation techniques have been developed for lipophilic drugs, since hydrophilic drugs like pregabalin show low loading efficiency and rapid dissolution into the aqueous continuous phase. The purpose of this study was to improve the loading efficiency of a water-soluble drug, to modulate its release profile, and to test the efficiency of the prepared microspheres in animal model studies. Pregabalin, a water-soluble drug, was encapsulated within anionic acrylic resin (Eudragit S 100) microspheres by the water-in-oil-in-oil (w/o/o) double emulsion solvent diffusion method. Dichloromethane and corn oil were chosen as the primary and secondary oil phases, respectively. The internal water phase was necessary to form stable emulsion droplets and accelerated the hardening of the microspheres. Tween 80 and Span 80 were used as surfactants to stabilize the water and corn oil phases, respectively; the optimum concentrations were 0.25% (v/v) for Tween 80 and 0.02% (v/v) for Span 80. The volume of the continuous phase affected the size of the microspheres: as the volume increased, the microsphere size decreased. All microsphere formulations were evaluated by in vitro characterization.
Microsphere formulations (P1–P5) exhibited entrapment efficiencies between 57.00 ± 0.72% and 69.70 ± 0.49%, yields between 80.95 ± 1.21% and 93.05 ± 1.42%, and mean particle sizes between 136.09 ± 2.57 and 279.09 ± 1.97 µm. The pregabalin microspheres with the best results among all formulations (Table 3) were chosen for further studies, including differential scanning calorimetry, Fourier transform infrared analysis, and dissolution studies. In the last step, the best pregabalin microsphere formulation (P3) was chosen for in vivo animal studies. The pregabalin-loaded microspheres (P3) and conventional pregabalin capsules, applied orally to rats for three days, produced clinical improvement of cold allodynia, an indicator of peripheral neuropathy. Evaluated together with the serum pregabalin levels and in vitro release studies, this result suggests that pregabalin microspheres prepared by the w/o/o double emulsion solvent diffusion method can be an alternative form for neuropathic pain therapy. In conclusion, a drug delivery system was successfully developed that showed modified release for up to 10 h and could potentially overcome the frequent-dosing problems of the conventional pregabalin dosage form.
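The entrapment efficiency and yield figures reported above follow standard pharmaceutics definitions. The small sketch below states those conventional formulas; the abstract does not give them explicitly, so treat this as the commonly used convention rather than the authors' exact calculation, and the function names are mine.

```python
def entrapment_efficiency(actual_drug_mg, theoretical_drug_mg):
    """Percent of drug actually encapsulated vs. the amount initially charged."""
    return 100.0 * actual_drug_mg / theoretical_drug_mg

def production_yield(microsphere_mass_mg, total_solids_mg):
    """Percent of input solids (drug + polymer) recovered as microspheres."""
    return 100.0 * microsphere_mass_mg / total_solids_mg
```

For instance, recovering 57 mg of encapsulated drug from a 100 mg charge corresponds to the 57.00% lower bound of the reported entrapment range.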

Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号