5,120 query results; search took 15 ms.
81.
In this paper, we propose a source localization algorithm based on a sparse Fast Fourier Transform (FFT)-based feature extraction method and spatial sparsity. We represent the sound source positions as a sparse vector by discretely segmenting the space with a circular grid. The location vector is related to the microphone measurements through a linear equation, which can be estimated at each microphone. For this linear dimensionality reduction, we use Compressive Sensing (CS) together with a two-level FFT-based feature extraction method that combines two sets of audio signal features, covering both the short-time and long-time properties of the signal. The proposed feature extraction method leads to a sparse representation of audio signals and thus to a significant reduction in their dimensionality. Compared with state-of-the-art methods, the proposed method improves accuracy while, in some cases, also reducing complexity.
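The abstract does not specify which CS recovery algorithm the paper uses. As a hedged illustration of the general idea of recovering a sparse location vector from linear microphone measurements, here is a minimal Orthogonal Matching Pursuit sketch; the grid size, random measurement matrix, and sparsity level are invented for the demo and are not the paper's settings:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse x with y ~= A @ x."""
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # Grid cell (column of A) most correlated with the current residual.
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # Least-squares refit on the current support, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

# Toy demo: one active source cell on a 40-cell circular grid, 20 measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 40))
x_true = np.zeros(40)
x_true[17] = 1.0          # hypothetical active grid cell
y = A @ x_true
x_hat = omp(A, y, k=2)
```

With enough random measurements relative to the sparsity level, the recovered support coincides with the true source cell.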
82.
Almost all binarization methods have a few parameters that require setting, yet they do not usually reach their upper-bound performance unless those parameters are individually tuned for each input document image. In this work, a learning framework for optimizing binarization methods is introduced, designed to determine the optimal parameter values for a given document image. The framework, which works with any binarization method, has a standard structure and performs three main steps: (i) extracting features, (ii) estimating optimal parameters, and (iii) learning the relationship between features and optimal parameters. First, an approach is proposed to generate numerical feature vectors from 2D data: the statistics of various maps are extracted and then combined, in a nonlinear way, into a final feature vector. The optimal behavior is learned using support vector regression (SVR). Although the framework works with any binarization method, two methods are considered as typical examples in this work: the grid-based Sauvola method, and Lu's method, which placed first in the DIBCO'09 contest. Experiments on the DIBCO'09 and H-DIBCO'10 datasets, as well as combinations of the two, yield promising results.
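The abstract names the grid-based Sauvola method; the standard Sauvola local threshold is T = m * (1 + k * (s/R - 1)), where m and s are the window mean and standard deviation, and k is exactly the kind of per-image parameter such a framework would tune. A naive sketch (the window size and toy image are made up; a real implementation would use integral images for speed):

```python
import numpy as np

def sauvola_threshold(img, window=3, k=0.2, R=128.0):
    """Naive per-pixel Sauvola threshold: T = m * (1 + k * (s / R - 1))."""
    h, w = img.shape
    r = window // 2
    T = np.empty_like(img, dtype=float)
    for i in range(h):
        for j in range(w):
            patch = img[max(0, i - r):i + r + 1, max(0, j - r):j + r + 1]
            m, s = patch.mean(), patch.std()
            T[i, j] = m * (1 + k * (s / R - 1))
    return T

# Toy 3x3 "document": bright background, darker ink-like values.
img = np.array([[200.0, 210.0, 60.0],
                [205.0,  50.0, 55.0],
                [ 60.0,  58.0, 52.0]])
binary = (img > sauvola_threshold(img, window=3, k=0.2)).astype(int)
```

Larger k pushes the threshold down wherever the local contrast is below R, which is why tuning k per image matters.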
83.
Hubs are special facilities that act as switching, transshipment, and sorting points in various distribution systems. Since hub facilities concentrate and consolidate flows, disruptions at hubs can have large effects on the performance of a hub network. In this paper, we formulate the multiple-allocation p-hub median problem under intentional disruptions as a bi-level game model. The follower's objective is to identify those hubs whose loss would most diminish service efficiency, while the leader's objective is to identify the set of hubs to open so as to minimize expected transportation cost under both normal and failure conditions. We apply two algorithms based on simulated annealing to solve the problem, calibrating them with the Taguchi method. Computational experiments on different instances indicate that the proposed algorithms are efficient in practice.
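The paper's two calibrated algorithms are not detailed in the abstract; the sketch below only shows the generic simulated annealing loop they are based on, applied to a toy hub-selection instance. The cost function, neighborhood move, cooling schedule, and node layout are all placeholders, not the paper's model:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, T0=1.0, alpha=0.95, iters=2000, seed=1):
    """Generic SA: accept a worse move with probability exp(-delta / T), cooling T."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    T = T0
    for _ in range(iters):
        y = neighbor(x, rng)
        fy = cost(y)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / T):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        T *= alpha
    return best, fbest

# Toy instance: pick p=2 hubs among 8 nodes on a line, minimizing the total
# distance from each node to its nearest hub (a stand-in for routing cost).
nodes = list(range(8))

def cost(hubs):
    return sum(min(abs(n - h) for h in hubs) for n in nodes)

def neighbor(hubs, rng):
    hubs = list(hubs)
    hubs[rng.randrange(len(hubs))] = rng.choice([n for n in nodes if n not in hubs])
    return tuple(hubs)

best, fbest = simulated_annealing(cost, neighbor, x0=(0, 1))
```

For this toy line instance the optimum cost is 8 (two hubs, each covering two nodes at distance 1 and one at distance 2).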
84.
Mohammad Hossein, Reza. Pattern Recognition, 2008, 41(8): 2571-2593
This paper investigates the use of time-adaptive self-organizing map (TASOM)-based active contour models (ACMs) for detecting the boundary of the human eye sclera and tracking its movements in a sequence of images. The task begins with extracting the head boundary based on a skin-color model. The eye strip is then located, with acceptable accuracy, using a morphological method. Eye features such as the iris center and eye corners are detected from iris edge information, and a TASOM-based ACM is used to extract the inner boundary of the eye. Finally, by tracking changes in the neighborhood characteristics of the eye-boundary-estimating neurons, the eyes are tracked effectively. The original TASOM algorithm is found to have several weaknesses in this application: undesired twists forming in the neuron chain, holes in the boundary, an overly long chain of neurons, and low speed. These weaknesses are overcome by introducing a new method for finding the winning neuron, a new definition of unused neurons, and a new method of feature selection and application to the network. Experimental results show very good performance for the proposed method in general, and better performance than the gradient vector field (GVF) snake-based method.
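The paper's improved winner-finding rule is not given in the abstract; for orientation, here is only the textbook SOM step that a TASOM-based contour builds on: find the winning neuron closest to an input point, then pull it and its chain neighbors toward that point. The chain coordinates, learning rate, and neighborhood width are invented for the demo:

```python
import numpy as np

def som_step(neurons, x, lr=0.5, sigma=1.0):
    """One update of a 1-D neuron chain (as in a contour model):
    move the winner and its chain neighbors toward input point x."""
    d = np.linalg.norm(neurons - x, axis=1)
    win = int(np.argmin(d))                 # winning neuron: closest to x
    idx = np.arange(len(neurons))
    h = np.exp(-((idx - win) ** 2) / (2 * sigma ** 2))  # neighborhood kernel
    return neurons + lr * h[:, None] * (x - neurons)

# Tiny chain of three neurons pulled toward an edge point at (1, 1).
chain = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
updated = som_step(chain, np.array([1.0, 1.0]))
```

In a TASOM the learning rate and neighborhood width above would themselves adapt over time, which is the "time-adaptive" part.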
85.
Duality properties have been investigated by many researchers in the recent literature. In this paper, they are introduced for a fully fuzzified version of the minimal cost flow problem (MCFP), a basic model in network flow theory. The model describes the least-cost shipment of a commodity through a capacitated network, where imprecisely known supplies available at certain nodes must be transmitted to fulfil uncertain demands at other nodes. First, we review the most relevant results on fuzzy duality concepts to facilitate the discussion. By applying Hukuhara's difference, approximated and exact multiplication, and Wu's scalar production, we model the flow in the network. We then use combinatorial algorithms on a reduced problem, derived from the fully fuzzified MCFP, to obtain fuzzy optimal flows. To establish the duality theorems, we utilize a total order on fuzzy numbers based on the level of risk, and derive optimality conditions that yield efficient combinatorial algorithms. Finally, we compare our results with previous work to demonstrate the efficiency of our scheme and the reasonableness of our solutions in actual decision-making problems.
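The abstract relies on Hukuhara's difference for fuzzy numbers. A minimal sketch for the common triangular representation (left, peak, right): the Hukuhara difference c of a and b is the fuzzy number with a = b + c, which exists only when a's spread dominates b's. The example values are invented:

```python
from dataclasses import dataclass

@dataclass
class TriFuzzy:
    """Triangular fuzzy number (left endpoint, peak, right endpoint)."""
    l: float
    m: float
    r: float

    def __add__(self, other):
        # Fuzzy addition is componentwise for triangular numbers.
        return TriFuzzy(self.l + other.l, self.m + other.m, self.r + other.r)

def hukuhara_diff(a, b):
    """Hukuhara difference c with a = b + c; defined only if the
    componentwise difference is itself a valid triangular number."""
    c = TriFuzzy(a.l - b.l, a.m - b.m, a.r - b.r)
    if not (c.l <= c.m <= c.r):
        raise ValueError("Hukuhara difference does not exist")
    return c

a = TriFuzzy(1.0, 4.0, 8.0)
b = TriFuzzy(0.0, 2.0, 5.0)
c = hukuhara_diff(a, b)   # satisfies b + c == a
```

Note the contrast with ordinary subtraction: a - b defined endpoint-against-opposite-endpoint would widen the spread, while the Hukuhara difference keeps a = b + c exact.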
86.
87.
In recent years, classification learning for data streams has become an important and active research topic. A major challenge posed by data streams is that their underlying concepts can change over time, requiring current classifiers to be revised accordingly and in a timely manner. To detect concept change, a common methodology is to observe the online classification accuracy: if accuracy drops below some threshold value, a concept change is deemed to have taken place. An implicit assumption behind this methodology is that any drop in classification accuracy can be interpreted as a symptom of concept change. Unfortunately, this assumption is often violated in the real world, where data streams carry noise that can also significantly reduce classification accuracy. Compounding the problem, traditional noise-cleansing methods are unsuitable for data streams: they normally need to scan the data multiple times, whereas learning from data streams can afford only a one-pass scan because of the data's high speed and huge volume. Another open problem in data stream classification is how to deal with missing values. When new instances containing missing values arrive, how a learning model should classify them, and how it should update itself accordingly, remains largely unexplored. To solve these problems, this paper proposes a novel classification algorithm, the flexible decision tree (FlexDT), which extends fuzzy logic to data stream classification. The advantages are three-fold. First, FlexDT offers a flexible structure to handle concept change effectively and efficiently. Second, FlexDT is robust to noise, so noise does not interfere with classification accuracy and any accuracy drop can be safely attributed to concept change. Third, it deals with missing values in an elegant way.
Extensive evaluations compare FlexDT with representative existing data stream classification algorithms using a large suite of data streams and various statistical tests. Experimental results suggest that FlexDT offers a significant benefit to data stream classification in real-world scenarios where concept change, noise, and missing values coexist.
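For reference, the "common methodology" the abstract critiques (not FlexDT itself) can be sketched as a sliding-window accuracy monitor that flags drift when accuracy falls below a threshold; the window size, threshold, and toy stream are invented:

```python
from collections import deque

class AccuracyDriftDetector:
    """Flags concept change when sliding-window accuracy drops below a threshold.
    Cannot tell drift from noise, which is exactly the weakness noted above."""

    def __init__(self, window=50, threshold=0.7):
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def update(self, correct):
        self.window.append(1 if correct else 0)
        if len(self.window) < self.window.maxlen:
            return False  # not enough evidence yet
        return sum(self.window) / len(self.window) < self.threshold

det = AccuracyDriftDetector(window=10, threshold=0.7)
stream = [True] * 10 + [False] * 5   # accuracy collapses mid-stream
flags = [det.update(c) for c in stream]
```

A burst of label noise would trigger the same flag as genuine drift, which motivates drift handling that is itself noise-robust.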
88.
Strategic reasoning about business models is an integral part of service design. In fast-moving markets, businesses must be able to recognize and respond strategically to disruptive change. They have to answer questions such as: what are the threats and opportunities in emerging technologies and innovations? How should they target customer groups? Who are their real competitors? How will competitive battles take shape? In this paper we define a strategic modeling framework to help understand and analyze the goals, intentions, roles, and rationale behind strategic actions in a business environment. This understanding is necessary in order to improve existing services or design new ones. The key component of the framework is a strategic business model ontology for representing and analyzing business models and strategies, built on the i* agent- and goal-oriented methodology. The ontology introduces a strategy layer that reasons about alternative strategies realized in the operational layer. The framework is evaluated using a retroactive example, drawn from the literature, of disruptive technology in the telecommunication services sector.
89.
When a system's performance is inadequate, the concept of availability importance can be used to improve it. The availability of an item depends on the combined aspects of its reliability and maintainability, and in a system consisting of many subsystems, the availability of some subsystems matters more to system performance than that of others. An availability-importance measure prioritizes subsystems accordingly. Most researchers consider only operation time and ignore the influence of the operating environment, so their estimates are not accurate enough. In contrast to previous research, we focus on the influence of the operating environment on system and subsystem characteristics, with a view to prioritizing them by availability importance. The paper considers part of the mining fleet system of the Sungun copper mine, including the wagon drill, loader, bulldozer, and dump truck subsystems. We identify an ordered list of opportunities for availability improvement and suggest changes or remedial actions for each item, either to reduce its failure rate or to reduce the time required to repair it.
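The reliability/maintainability trade-off above rests on the standard steady-state availability formula A = MTBF / (MTBF + MTTR). A minimal sketch of ranking the four subsystems mentioned; the MTBF/MTTR figures are entirely hypothetical, not the Sungun mine's data:

```python
def availability(mtbf, mttr):
    """Steady-state availability: uptime fraction = MTBF / (MTBF + MTTR)."""
    return mtbf / (mtbf + mttr)

# Hypothetical (MTBF, MTTR) in hours for the four subsystems named above.
fleet = {
    "wagon drill": (120.0, 8.0),
    "loader":      (200.0, 6.0),
    "bulldozer":   (150.0, 12.0),
    "dump truck":  (90.0, 10.0),
}

# Ascending availability: the first entry is the first candidate for
# improvement, via a lower failure rate (raise MTBF) or faster repair (cut MTTR).
ranked = sorted(fleet, key=lambda name: availability(*fleet[name]))
```

Because A depends on both terms, the ranking can change under a different operating environment even if operation time is unchanged, which is the point the paper makes.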
90.
Cancer diagnosis and patient monitoring require sensitive and simultaneous measurement of multiple cancer biomarkers, since single-biomarker analysis provides inadequate information on the underlying biological transformations. The development of sensitive and selective assays for multiple-biomarker detection could therefore improve clinical diagnosis and expedite treatment. Herein, a microfluidic platform for the rapid, sensitive, and parallel detection of multiple cancer-specific protein biomarkers from complex biological samples is presented. The approach uses alternating-current electrohydrodynamic-induced surface shear forces, which provide exquisite control over fluid flow, thereby enhancing target–sensor interactions and minimizing non-specific binding. Further, surface-enhanced Raman scattering-based spectral encoding, with individual barcodes for different targets, enables specific and simultaneous detection of the captured protein biomarkers. Using this approach, the specific and sensitive detection of clinically relevant biomarkers, including human epidermal growth factor receptor 2 (HER2); Mucin 1, cell surface associated (MUC1); epidermal growth factor receptor; and Mucin 16, cell surface associated (MUC16), at concentrations as low as 10 fg mL−1 in patient serum is demonstrated. Successful detection from patient samples further demonstrates the potential of this approach for clinical diagnosis, suggesting a path toward clinical translation for rapid and sensitive appraisal of clinical samples in cancer diagnostics.