91.
End-of-life disassembly has developed into a major research area within the sustainability paradigm, giving rise to several algorithms and frameworks that propose heuristic techniques such as Genetic Algorithms (GA), Ant Colony Optimization (ACO), and Neural Networks (NN). The performance of these methodologies depends heavily on the accuracy and flexibility of the algorithms in accommodating several factors, such as preserving precedence relationships during disassembly while obtaining near-optimal or optimal solutions. This paper improves a previously proposed Genetic Algorithm model for disassembly sequencing by employing a faster metaheuristic, Tabu Search, to obtain the optimal solution. The objectives of the proposed algorithm are to minimize (1) the distance traveled by the robotic arm, (2) the number of disassembly method changes, and (3) the number of robotic arm travels, by grouping identical-material components together and thereby eliminating unnecessary disassembly operations. In addition to improving the quality of optimum sequence generation, a comprehensive statistical analysis comparing the previous Genetic Algorithm with the proposed Tabu Search algorithm is also included.
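The abstract does not include the authors' implementation; as a rough illustration of the tabu search idea it builds on, here is a minimal generic sketch in Python. The pairwise-swap neighborhood, the travelled-distance objective in the usage note, and all identifiers are our own illustrative choices, not the paper's.

```python
def tabu_search(init, objective, neighbors, tenure=7, iters=200):
    """Generic tabu search: keep a short-term memory of recently visited
    solutions and always move to the best non-tabu neighbor, which lets
    the search escape local optima that trap plain hill climbing."""
    current = best = init
    best_cost = objective(init)
    tabu = []  # FIFO list of recently visited solutions
    for _ in range(iters):
        candidates = [n for n in neighbors(current) if n not in tabu]
        if not candidates:
            break
        current = min(candidates, key=objective)
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)  # forget the oldest move once tenure is exceeded
        if objective(current) < best_cost:
            best, best_cost = current, objective(current)
    return best, best_cost

def swap_neighbors(seq):
    """All sequences reachable by swapping one pair of positions."""
    out = []
    for i in range(len(seq)):
        for j in range(i + 1, len(seq)):
            s = list(seq)
            s[i], s[j] = s[j], s[i]
            out.append(tuple(s))
    return out
```

For a disassembly problem, `objective` would also have to encode precedence constraints (e.g., by penalizing infeasible orders) and the neighborhood would be restricted to precedence-preserving moves.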
92.
This paper presents a modular system for both abnormal event detection and categorization in videos. Complementary normalcy models are built both globally at the image level and locally within pixel blocks. Three features are analyzed: (1) the spatio-temporal evolution of binary motion, where foreground pixels are detected using an enhanced background subtraction method that keeps track of temporarily static pixels; (2) optical flow, using a robust pyramidal KLT technique; and (3) temporal derivatives of motion. At the local level, a normalcy MOG model is built for each block and each flow feature and is made more compact using PCA. The activity is then analyzed qualitatively using a set of compact hybrid histograms embedding both optical flow orientation (or temporal gradient orientation) and foreground statistics. A compact binary signature of at most 13 bits is extracted from these features for event characterization. The performance of the system is illustrated on several datasets of videos recorded with static cameras. The experiments show that anomalies are well detected even though the method is not dedicated to any one of the addressed scenarios.
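The paper's background subtraction additionally tracks temporarily static pixels; as a simpler baseline illustrating per-pixel normalcy modeling, here is a running single-Gaussian background subtraction sketch in NumPy (all names and parameter values are illustrative, not taken from the paper):

```python
import numpy as np

def background_subtract(frames, alpha=0.05, k=2.5):
    """Running single-Gaussian background model per pixel: a pixel is
    foreground when it deviates more than k sigma from the running mean;
    mean and variance adapt only where the pixel is labeled background."""
    mean = frames[0].astype(float)
    var = np.full_like(mean, 25.0)  # initial variance guess
    masks = []
    for f in frames[1:]:
        f = f.astype(float)
        fg = np.abs(f - mean) > k * np.sqrt(var)
        bg = ~fg
        mean[bg] = (1 - alpha) * mean[bg] + alpha * f[bg]
        var[bg] = (1 - alpha) * var[bg] + alpha * (f[bg] - mean[bg]) ** 2
        masks.append(fg)
    return masks
```

A MOG model as used in the paper generalizes this by maintaining several weighted Gaussians per pixel instead of one.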
93.
Associative classification has been shown to provide interesting results when used to classify data. With the increasing complexity of new databases, retrieving valuable information and classifying incoming data has become a compelling issue. The evidential database is a new type of database that represents imprecision and uncertainty. In this respect, extracting pertinent information such as frequent patterns and association rules is a task of paramount importance. In this work, we tackle the problem of extracting pertinent information from an evidential database. A new data mining approach, denoted EDMA, is introduced that extracts frequent patterns while overcoming the limits of pioneering works in the literature. A new classifier based on evidential association rules is then introduced. The obtained association rules, as well as their respective confidence values, are studied and weighted with respect to their relevance. The proposed methods are thoroughly evaluated on several synthetic evidential databases and show performance improvements.
94.
95.
An image processing algorithm is implemented to detect crystal grain boundaries in Scanning Electron Microscopy (SEM) images. This paper presents a method for edge detection in color images based on the Sobel and Canny operators and the discrete wavelet transform; these methods are effective and fast. Filtering is another approach to removing noise from an image. SEM, which has been used to inspect semiconductor materials and devices for several decades, continues to grow in importance. Noise removal is an important step in the image restoration process, but image de-noising has remained a challenging problem in recent image processing research. De-noising removes noise from corrupted images while retaining edges and other detailed features, which is an essential requirement.
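For reference, the 3x3 Sobel operator mentioned above can be sketched in a few lines of NumPy; zero padding and the explicit correlation loop here are our own illustrative choices, not the paper's implementation.

```python
import numpy as np

def sobel_magnitude(img):
    """Gradient magnitude via the 3x3 Sobel kernels (zero-padded,
    computed as a cross-correlation over all nine kernel offsets)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T  # vertical-gradient kernel is the transpose
    padded = np.pad(img.astype(float), 1)
    gx = np.zeros(img.shape, dtype=float)
    gy = np.zeros(img.shape, dtype=float)
    for i in range(3):
        for j in range(3):
            window = padded[i:i + img.shape[0], j:j + img.shape[1]]
            gx += kx[i, j] * window
            gy += ky[i, j] * window
    return np.hypot(gx, gy)
```

A Canny detector builds on these gradients with Gaussian smoothing, non-maximum suppression, and hysteresis thresholding.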
96.
Vertical handover has gained significant importance due to the enhancements in mobility models introduced by Fourth Generation (4G) technologies. However, these enhancements are limited to specific scenarios and hence do not support generic mobility. Similarly, various schemes have been proposed based on these mobility models, but most of them suffer from high packet loss, frequent handovers, too-early and too-late handovers, inappropriate network selection, etc. To address these challenges, a generic vertical handover management scheme for heterogeneous wireless networks is proposed in this article. The proposed scheme works in three phases. In the first phase, a handover triggering approach is designed to identify the appropriate point for initiating handover, based on the estimated coverage area of a WLAN access point or cellular base station. In the second phase, a fuzzy rule-based system is designed to eliminate inappropriate networks before deciding on an optimal network for handover. In the third phase, a network selection scheme is developed based on the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) decision mechanism. Various parameters such as delay, jitter, Bit Error Rate (BER), packet loss, communication cost, response time, and network load are considered when selecting an optimal network. The proposed scheme is tested in a mobility scenario with mobile node speeds ranging from very low to very high. The simulation results are compared with existing decision models used for network selection and handover triggering. The proposed scheme outperforms these schemes in terms of energy consumption, handover delay and time, packet loss, goodput, etc.
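TOPSIS itself is a standard multi-criteria decision method; a compact sketch follows, with the criteria matrix and weights left to the caller. The exact weighting and normalization the paper uses are not given in the abstract, so this is only the textbook form of the technique.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.
    matrix: (m alternatives x n criteria); weights: length n;
    benefit: length-n booleans, True where larger is better."""
    m = np.asarray(matrix, dtype=float)
    # 1. Vector-normalize each criterion column.
    norm = m / np.linalg.norm(m, axis=0)
    # 2. Apply criterion weights.
    v = norm * np.asarray(weights, dtype=float)
    benefit = np.asarray(benefit)
    # 3. Ideal and anti-ideal points per criterion.
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    # 4. Euclidean distances and closeness coefficient.
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)  # higher closeness = better network
```

Here, cost criteria such as delay, BER, packet loss, and communication cost would be flagged with `benefit=False`, and throughput-like criteria with `benefit=True`; the network with the highest closeness coefficient is selected.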
97.
This paper is concerned with bifurcation control for a delayed fractional-order network involving two neurons. Delay-dependent stability conditions and the bifurcation point are established by analyzing the associated characteristic equation of the proposed network. A delayed feedback controller is then designed to stabilize the Hopf bifurcation, and the desired dynamics are achieved. It is shown that the designed controller is highly effective and can postpone the onset of bifurcation through careful selection of the feedback gain. Finally, simulation results are given to verify the theoretical results.
98.
Triggered Updates for Temporal Consistency in Real-Time Databases   (Total citations: 1; self-citations: 0; citations by others: 1)
A real-time database system has temporal consistency constraints in addition to timing constraints. The timing constraints require a transaction to be completed by a specified deadline, and the temporal consistency constraints require that temporal data read by a transaction be up-to-date. If a transaction reads out-of-date data, it will become temporally inconsistent. A real-time database system consists of different types of temporal data objects, including derived objects. The value of a derived object is computed from a set of other objects, known as the read-set of the derived object. The derived object may not always reflect the current state of its read-set; a derived object can become out-of-date even if its read-set is up-to-date. Any subsequent transaction reading the derived object will then become temporally inconsistent. In this case, in order to read up-to-date objects, a transaction will have to wait until some other transaction updates the out-of-date object. However, in doing so, the waiting transaction may miss its deadline, particularly if the update is not periodic but instead arrives randomly. We propose to update the outdated objects so that not only is temporal consistency improved, but the number of missed deadlines also does not increase significantly, resulting in an overall improvement in the performance of the system. We propose, implement, and study a novel approach, known as triggered updates, to improve temporal consistency in firm real-time database systems when updates are not periodic. We identify properties of triggered updates and explain how they work through both an intuitive and a probabilistic analysis. We present strategies for generating triggered updates, discuss their suitability in various contexts, and perform a detailed simulation study to evaluate their performance. Results show that it is possible to improve temporal consistency without significantly degrading the timeliness of real-time database systems.
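As a toy illustration of the staleness problem and of eagerly refreshing derived objects when their read-set changes (this is not the paper's firm real-time scheduler or its probabilistic triggering strategies; the class and method names are ours):

```python
class TemporalDB:
    """In-memory store of (value, timestamp) pairs where derived
    objects are recomputed whenever a read-set member is written,
    so readers never block waiting for an update."""

    def __init__(self):
        self.objects = {}   # name -> (value, timestamp)
        self.derived = {}   # name -> (compute_fn, read_set)
        self.clock = 0

    def write(self, name, value):
        self.clock += 1
        self.objects[name] = (value, self.clock)
        # Triggered update: refresh every derived object whose
        # read-set includes the object just written.
        for d, (fn, rs) in self.derived.items():
            if name in rs:
                vals = [self.objects[r][0] for r in rs]
                self.clock += 1
                self.objects[d] = (fn(*vals), self.clock)

    def define_derived(self, name, fn, read_set):
        self.derived[name] = (fn, read_set)
        vals = [self.objects[r][0] for r in read_set]
        self.clock += 1
        self.objects[name] = (fn(*vals), self.clock)

    def read(self, name):
        value, ts = self.objects[name]
        if name in self.derived:
            _, rs = self.derived[name]
            # Temporal consistency check: the derived value must not
            # predate any member of its read-set.
            assert all(self.objects[r][1] <= ts for r in rs), "stale read"
        return value
```

The paper's contribution lies in deciding *when* such updates are worth generating so that deadline misses do not increase; this sketch only shows the consistency invariant being maintained.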
99.
In this study, the effect of centrifugal forces on the eigenvalue solution obtained using two different nonlinear finite element formulations is examined. Both formulations can correctly describe arbitrary rigid-body displacements and can be used in large deformation analysis. The first formulation is based on the geometrically exact beam theory, which assumes that the cross section does not deform in its own plane and remains planar after deformation. The second formulation, the absolute nodal coordinate formulation (ANCF), relaxes this assumption and introduces modes that couple the deformation of the cross section with the axial and bending deformations. In the absolute nodal coordinate formulation, four different models are developed: a beam model based on a general continuum mechanics approach, a beam model based on an elastic line approach, a beam model based on an elastic line approach combined with the Hellinger–Reissner principle, and a plate model based on a general continuum mechanics approach. The use of the general continuum mechanics approach leads to a model that includes the ANCF coupled deformation modes. Because of these modes, the continuum mechanics model differs from the models based on the elastic line approach. In both the geometrically exact beam and the absolute nodal coordinate formulations, the centrifugal forces are formulated in terms of the element nodal coordinates. The effect of the centrifugal forces on the flap and lag modes of the rotating beam is examined, and the results obtained using the two formulations are compared for different values of the beam angular velocity. The numerical comparative study presented in this investigation shows that when the effect of some ANCF coupled deformation modes is neglected, the eigenvalue solutions obtained using the geometrically exact beam and absolute nodal coordinate formulations are in good agreement. The results also show that as the effect of the centrifugal forces, which tend to increase the beam stiffness, grows, the effect of the ANCF coupled deformation modes on the computed eigenvalues becomes less significant. It is also shown that when the effect of the Poisson ratio is neglected, the eigenvalue solution obtained using the absolute nodal coordinate formulation based on a general continuum mechanics approach is in good agreement with the solution obtained using the geometrically exact beam model.
100.
In this paper, we consider minimax games for stochastic uncertain systems in which the pay-off is a nonlinear functional of the uncertain measure and the uncertainty is measured in terms of the relative entropy between the uncertain and the nominal measure. The maximizing player is the uncertain measure, while the minimizer is the control, which induces a nominal measure. Existence and uniqueness of minimax solutions are derived on suitable spaces of measures. Several examples illustrating the results are presented. The results are then applied to controlled stochastic differential equations on Hilbert spaces. Based on an infinite-dimensional extension of Girsanov's measure transformation, martingale solutions are used to establish the existence and uniqueness of minimax strategies. Moreover, some basic properties of the relative entropy of measures on infinite-dimensional spaces are presented and applied to uncertain systems described by a stochastic differential inclusion on Hilbert space. An explicit expression for the worst-case measure representing the maximizing player (the adversary) is found.