Full-text access type
Paid full text | 3384 articles |
Free | 87 articles |
Free (domestic) | 16 articles |
Subject category
Electrical engineering | 55 articles |
General | 8 articles |
Chemical industry | 657 articles |
Metalworking | 119 articles |
Machinery and instrumentation | 103 articles |
Building science | 47 articles |
Mining engineering | 15 articles |
Energy and power | 177 articles |
Light industry | 183 articles |
Hydraulic engineering | 30 articles |
Petroleum and natural gas | 6 articles |
Radio engineering | 549 articles |
General industrial technology | 831 articles |
Metallurgy | 340 articles |
Atomic energy technology | 28 articles |
Automation technology | 339 articles |
Publication year
2024 | 13 articles |
2023 | 42 articles |
2022 | 79 articles |
2021 | 160 articles |
2020 | 81 articles |
2019 | 84 articles |
2018 | 121 articles |
2017 | 86 articles |
2016 | 101 articles |
2015 | 73 articles |
2014 | 110 articles |
2013 | 206 articles |
2012 | 130 articles |
2011 | 181 articles |
2010 | 144 articles |
2009 | 154 articles |
2008 | 141 articles |
2007 | 110 articles |
2006 | 110 articles |
2005 | 85 articles |
2004 | 64 articles |
2003 | 67 articles |
2002 | 55 articles |
2001 | 61 articles |
2000 | 64 articles |
1999 | 53 articles |
1998 | 105 articles |
1997 | 58 articles |
1996 | 52 articles |
1995 | 50 articles |
1994 | 62 articles |
1993 | 59 articles |
1992 | 54 articles |
1991 | 57 articles |
1990 | 45 articles |
1989 | 28 articles |
1988 | 29 articles |
1987 | 32 articles |
1986 | 30 articles |
1985 | 23 articles |
1984 | 32 articles |
1983 | 20 articles |
1982 | 27 articles |
1981 | 13 articles |
1980 | 24 articles |
1979 | 21 articles |
1978 | 10 articles |
1977 | 18 articles |
1976 | 23 articles |
1973 | 8 articles |
Sort order: 3,487 results found (search time: 15 ms)
41.
In this paper, we present feed-forward neural network (FFNN) and recurrent neural network (RNN) models for predicting Boolean function complexity (BFC). To acquire training data for the neural networks (NNs), we conducted experiments on a large number of randomly generated single-output Boolean functions (BFs) and derived simulated curves of the number of min-terms against the BFC for different numbers of variables. For NN model (NNM) development, we evaluated three data-transformation techniques for pre-processing the NN training and validation data. The trained NNMs are used to estimate the complexity of Boolean logic expressions with a given number of variables and sum-of-products (SOP) terms. Both FFNNs and RNNs were evaluated against the ISCAS benchmark results, with which our FFNN and RNN predictions of BFC correlate at 0.811 and 0.629, respectively.
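A minimal sketch of the regression idea in this abstract, assuming scikit-learn's `MLPRegressor` stands in for the FFNN and `StandardScaler` for one of the data transformations; the toy (variables, SOP terms) → complexity pairs below are entirely hypothetical, since the paper's actual training data and architectures are not given here.

```python
# Sketch: regress Boolean function complexity (BFC) from
# (number of variables, number of SOP terms). Data is illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# hypothetical (n_vars, n_sop_terms) -> complexity pairs
X = np.array([[4, 3], [4, 8], [6, 10], [6, 30], [8, 50], [8, 120]], dtype=float)
y = np.array([5.0, 9.0, 14.0, 28.0, 45.0, 90.0])

scaler = StandardScaler().fit(X)  # one possible pre-processing transform
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(scaler.transform(X), y)

pred = model.predict(scaler.transform([[6, 20]]))[0]
print(f"estimated complexity for 6 vars, 20 SOP terms: {pred:.1f}")
```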
42.
M. Adam Khan N. Ram Prasad S. Navaneetha Krishnan S. Karthic Raja J. T. Winowlin Jappes Muthukannan Duraiselvam 《Materials and Manufacturing Processes》2017,32(14):1635-1641
Recent research in biocompatible materials has been useful in replacing and supporting fractured natural human bones and joints. Under some conditions, adverse reactions, such as the release of ions from bare metal into body fluid, lead to corrosion. In this paper, the biocompatibility of laser surface-modified austenitic stainless steel (SS316L) and a nickel-based superalloy (Inconel 718) was studied. The laser-modified surfaces were evaluated through electrochemical polarization analysis in simulated body fluid (SBF). The samples subjected to electrochemical polarization were characterized by optical image analysis, SEM, EDS, and XRD. It was inferred that laser surface modification provided enhanced corrosion resistance, and that the bare nickel alloy is more susceptible to corrosion by SBF.
43.
Balasubramanian Mythili Gnanamangai Ponnusamy Ponmurugan Sengottaiyan Eashwari Jeeva Kolandasamy Manjukarunambika Vishwanathan Elango Kolandaivel Hemalatha Jyoti Prasad Kakati Rajamanickam Mohanraj Somasundaram Prathap 《IET nanobiotechnology / IET》2017,11(8):917
Tea leaves are of economic importance in the preparation of the world's popular beverage, tea. Bird's eye spot disease of tea leaves causes significant revenue loss in the tea trade of many tea-cultivating countries. Management of this disease using silver (AgNps) and copper (CuNps) nanoparticles biosynthesised by efficient antagonists was studied. Biocontrol agents, including Pseudomonas fluorescens, Trichoderma atroviride and Streptomyces sannanensis, were evaluated for nanoparticle synthesis against Cercospora theae isolates KC10, MC24 and VC38. The freshly prepared extracellular AgNps initially showed high disease control (59.42–79.76%), but the stability of the antagonistic property in stored nanoparticles was significantly higher for CuNps (58.71–73.81%). Greenhouse studies of the various treatments also showed reduced disease incidence of 13.4, 7.57 and 10.11% when treated with CuNps synthesised by P. fluorescens, T. atroviride and S. sannanensis, respectively. Field treatment schedules suggested the use of Bionanocopper at 1.5 ppm for the highest yield (3743 kg/ha) with 66.1% disease prevention. The results suggest the use of CuNps biosynthesised by S. sannanensis for controlling the tea plant pathogens causing foliar disease, with higher stability in releasing antagonistic activity during sporadic incidence of bird's eye spot disease in tea plantations.
44.
Text mining has become a major research topic, within which text classification is an important task for finding relevant information in new documents. Accordingly, this paper presents a semantic word-processing technique for text categorization that utilizes semantic keywords instead of treating the keywords in the documents as independent features, so that the dimensionality of the search space can be reduced. A Back Propagation Lion (BPLion) algorithm is also proposed to overcome the problem of updating neuron weights. The proposed text classification methodology is evaluated on two data sets, 20 Newsgroups and Reuters. The performance of BPLion is analysed in terms of sensitivity, specificity, and accuracy, and compared with existing works. The results show that the proposed BPLion algorithm with semantic processing classifies documents with less training time and a higher classification accuracy of 90.9%.
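A toy sketch of the semantic-keyword idea the abstract describes: mapping surface words to a shared semantic key so that synonymous features collapse into one dimension. The synonym table here is hypothetical; the paper derives semantic keywords rather than listing them by hand.

```python
# Collapse synonymous tokens into one semantic feature dimension.
from collections import Counter

# Hypothetical word -> semantic keyword table (the paper's is learned, not manual).
SEMANTIC_KEY = {
    "car": "vehicle", "automobile": "vehicle", "truck": "vehicle",
    "profit": "finance", "revenue": "finance", "earnings": "finance",
}

def semantic_features(tokens):
    """Count features after mapping each token to its semantic keyword."""
    return Counter(SEMANTIC_KEY.get(t, t) for t in tokens)

doc = ["car", "automobile", "revenue", "growth"]
print(semantic_features(doc))  # 'car' and 'automobile' share one dimension
```

Collapsing synonyms this way is what reduces the dimensionality of the feature space before the classifier is trained.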
45.
The lossy nature of JPEG compression leaves traces that forensic analysts use to identify local tampering in an image. In this paper, an anti-forensic method is proposed to remove the traces left by JPEG compression in both the spatial domain and the discrete cosine transform domain. A novel Least Cuckoo Search algorithm is devised for the proposed anti-forensic compression scheme, and a new fitness function called histogram deviation is formulated for the optimization algorithm. The proposed scheme is evaluated on uncompressed images from the UCID database and compared with existing methods using PSNR, MSE, and classification accuracy as measures. The experiments yielded promising results (accuracy of 0.97, PSNR of 44.34 dB, and MSE of 0.1789), demonstrating the efficacy of the proposed method.
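A sketch of what a "histogram deviation" style fitness term could look like: the distance between the coefficient histogram of a candidate image and a reference, never-compressed histogram. The exact formulation in the paper may differ; the L1 distance over normalized histograms used here is an assumption.

```python
# Fitness term: deviation between a candidate histogram and a reference one.
import numpy as np

def histogram_deviation(coeffs, reference_coeffs, bins=64, rng=(-32, 32)):
    h1, _ = np.histogram(coeffs, bins=bins, range=rng, density=True)
    h2, _ = np.histogram(reference_coeffs, bins=bins, range=rng, density=True)
    return float(np.abs(h1 - h2).sum())

rng = np.random.default_rng(0)
ref = rng.laplace(0, 4, 10_000)     # smooth, uncompressed-like coefficients
quantized = np.round(ref / 8) * 8   # comb-shaped histogram left by quantization
print(histogram_deviation(quantized, ref))
```

An optimizer such as the cuckoo search variant in the paper would minimize this deviation to make the modified image's histogram resemble an uncompressed one.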
46.
Both unit and integration testing are crucial for almost any software application, since each examines the product through a distinct process. Due to resource constraints, when software is subjected to modifications, the drastic increase in the number of test cases forces testers to adopt a test optimization strategy. One such strategy is test case prioritization (TCP). Existing works have proposed various methodologies that re-order system-level test cases to boost either fault detection capability or coverage efficacy as early as possible. Nonetheless, reliance on a single objective function and a lack of diversity among the re-ordered test sequences have limited the effectiveness of these approaches. Considering these gaps, and scenarios in which rapid and continuous software updates make intensive unit and integration testing more fragile, this study introduces a memetics-inspired methodology for TCP. The proposed structure is first embedded with diverse parameters, and then the traditional steps of the shuffled frog leaping algorithm (SFLA) are followed to prioritize test cases at the unit and integration levels. On five standard test functions, a comparative analysis between established algorithms and the proposed approach shows that the latter enhances the coverage rate and fault detection of the re-ordered test sets. Results for the mean average percentage of faults detected (APFD) confirm that the proposed approach exceeds the memetic, basic multi-walk, PSO, and optimized multi-walk approaches by 21.7%, 13.99%, 12.24%, and 11.51%, respectively.
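The APFD metric used for the comparison above has a standard form: APFD = 1 − (TF₁ + … + TFₘ)/(n·m) + 1/(2n), where n is the number of tests, m the number of faults, and TFᵢ the 1-based position of the first test revealing fault i. A small sketch (the test/fault data is hypothetical, and it assumes every fault is revealed by some test):

```python
# Compute APFD for a prioritized test order.
def apfd(order, fault_matrix):
    """order: list of test ids; fault_matrix: {test_id: set of fault ids}."""
    n = len(order)
    faults = set().union(*fault_matrix.values())
    m = len(faults)
    first = {}  # fault -> position of first revealing test
    for pos, test in enumerate(order, start=1):
        for f in fault_matrix.get(test, ()):
            first.setdefault(f, pos)
    return 1 - sum(first[f] for f in faults) / (n * m) + 1 / (2 * n)

faults = {"t1": {1}, "t2": {2, 3}, "t3": {3}}
print(apfd(["t2", "t1", "t3"], faults))  # fault-exposing tests first score higher
```

Higher APFD means faults are detected earlier in the re-ordered sequence, which is exactly what a TCP technique tries to maximize.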
47.
Yossi Azar Uriel Feige Iftah Gamzu Thomas Moscibroda Prasad Raghavendra 《Theory of Computing Systems》2011,49(4):738-756
We consider buffer management of unit packets with deadlines for a multi-port device with reconfiguration overhead. The goal is to maximize the throughput of the device, i.e., the number of packets delivered by their deadline. For a single port or with free reconfiguration, the problem reduces to the well-known packet scheduling problem, where the celebrated earliest-deadline-first (EDF) strategy is optimal (1-competitive). However, EDF is not 1-competitive when there is a reconfiguration overhead. We design an online algorithm that achieves a competitive ratio of 1−o(1) when the ratio between the minimum laxity of the packets and the number of ports tends to infinity. This is one of the rare cases where one can design an almost 1-competitive algorithm. One ingredient of our analysis, which may be interesting in its own right, is a perturbation theorem on EDF for the classical packet scheduling problem. Specifically, we show that a small perturbation in the release and deadline times cannot significantly degrade the optimal throughput. This implies that EDF is robust in the sense that its throughput is close to the optimum even when the deadlines are not precisely known.
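A sketch of the classical single-port case the paper builds on, assuming unit packets, integer time steps, and deadlines meaning "must start before time d": at each step, EDF transmits the pending packet with the earliest deadline and discards expired ones. This is an illustrative baseline, not the paper's multi-port algorithm.

```python
# Earliest-deadline-first for unit packets on a single port.
import heapq

def edf_throughput(packets):
    """packets: list of (release, deadline). Returns packets delivered on time."""
    packets = sorted(packets)                    # by release time
    heap, i, t, done = [], 0, 0, 0
    while i < len(packets) or heap:
        if not heap and packets[i][0] > t:
            t = packets[i][0]                    # idle until the next release
        while i < len(packets) and packets[i][0] <= t:
            heapq.heappush(heap, packets[i][1])  # pending, keyed by deadline
            i += 1
        d = heapq.heappop(heap)                  # earliest deadline first
        if d > t:                                # still within its deadline
            done += 1
            t += 1                               # one unit packet per step
        # otherwise the packet has expired and is discarded
    return done

print(edf_throughput([(0, 2), (0, 1), (1, 3)]))  # all three fit
```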
48.
Computer-aided design (CAD) is a ubiquitous tool that today's students will be expected to use proficiently for numerous engineering purposes. Taking full advantage of the features available in modern CAD programs requires that models be created in a manner that allows others to easily understand how they are organized and to alter them efficiently and robustly. The results of a class-based exercise are presented to examine the role of model attributes in model creation, alteration, and student perception. Two popular CAD programs were used for the exercise, SolidWorks and Pro|Engineer, and general results from both are reported. Using fewer, more complex features was found to be correlated with reduced modeling time. Simple features were positively correlated with the number of features retained without change, while more complex features were negatively correlated with the number of new features. Student perceptions of model quality and intuitiveness were positively correlated with the amount of feature reuse, and student survey data showed a preference for simpler features, the naming of features, and the use of reference geometry. The results do not support prescribing a generic approach to feature complexity. Overall, properly conveying design intent was positively correlated with design retention and negatively correlated with alteration time.
49.
Bhim Bali Prasad Deepak Kumar Rashmi Madhuri Mahavir Prasad Tiwari 《Sensors and actuators. B, Chemical》2011,160(1):418
A new kind of molecularly imprinted polymer-modified graphite electrode was fabricated by a “grafting-to” approach incorporating a sol–gel technique, for detecting acute deficiency in the serum ascorbic acid level (SAAL), which manifests as hypovitaminosis C. The modified electrode exhibited ascorbic acid (AA) oxidation at a less positive potential (0.0 V) than earlier reported methods, resulting in a limit of detection as low as 6.13 ng mL−1 (RSD = 1.2%, S/N = 3). The diffusion coefficient (1.096 × 10−5 cm2 s−1), rate constant (7.308 s−1), and Gibbs free energy change (−12.59 kJ mol−1) due to analyte adsorption were also calculated to explore the kinetics of AA oxidation. The proposed sensor substantially enhanced sensitivity, detecting ultra-trace levels of AA in the presence of other biologically important compounds (dopamine, uric acid, etc.), without cross-interference or matrix complications from biological fluids and pharmaceutical samples.
50.
Sriraam Natarajan Prasad Tadepalli Thomas G. Dietterich Alan Fern 《Annals of Mathematics and Artificial Intelligence》2008,54(1-3):223-256
Many real-world domains exhibit rich relational structure and stochasticity, motivating the development of models that combine predicate logic with probabilities. These models describe probabilistic influences between attributes of objects that are related to each other through known domain relationships. To keep these models succinct, each such influence is considered independent of the others, which is called the assumption of “independence of causal influences” (ICI). In this paper, we describe a language that consists of quantified conditional influence statements and captures most relational probabilistic models based on directed graphs. The influences due to different statements are combined using a set of combining rules such as Noisy-OR. We motivate and introduce multi-level combining rules, where the lower-level rules combine the influences due to different ground instances of the same statement, and the upper-level rules combine the influences due to different statements. We present algorithms and empirical results for parameter learning in the presence of such combining rules. Specifically, we derive and implement algorithms based on gradient descent and expectation maximization for different combining rules, and evaluate them on synthetic data and on a real-world task. The results demonstrate that the algorithms are able to learn both the conditional probability distributions of the influence statements and the parameters of the combining rules.
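The Noisy-OR rule named in this abstract has a standard form: independent causal influences with probabilities p_i combine as P(effect) = 1 − ∏(1 − p_i). A minimal sketch of applying it at two levels, as the multi-level scheme suggests (the probability values are made up for illustration):

```python
# Noisy-OR combination of independent causal influences.
from math import prod

def noisy_or(probs):
    return 1 - prod(1 - p for p in probs)

# Lower level: combine ground instances of each influence statement.
within_stmt1 = noisy_or([0.3, 0.4])
within_stmt2 = noisy_or([0.5])

# Upper level: combine the per-statement results.
print(noisy_or([within_stmt1, within_stmt2]))
```

Note that when Noisy-OR is used at both levels, the nested combination coincides with a flat Noisy-OR over all ground influences; distinct lower- and upper-level rules (e.g. mean at one level, Noisy-OR at the other) would not collapse this way.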