9,055 results found (search time: 0 ms)
31.
A. Csabai, I. Stroud, P.C. Xirouchakis 《Computer-Aided Design》 2002, 34(13): 1011-1035
Although there have been many advances in computer-aided modelling techniques and representations of mechanical parts, there are areas where exact modelling is a handicap. One of these is 3D layout design. Here, simpler models are useful for initial design sketches to verify kinematic behaviour and organise product structure before the detailed component design phase begins. A commitment to exact, or closely approximated, geometry too early can imply a commitment to form before functionality has been finalised. This paper describes a system for top-down 3D layout design based on simple conceptual elements which can be used as a basis for visualisation, discussion, definition of product structure and kinematic functionality in the conceptual design phase, before embodiment or detailing begins. This tool forms a bridge between the abstract nature of the conceptual design phase and the geometric nature of the embodiment phase. The 3D layout module uses design spaces with simple geometry and kinematic connections to represent a product. The design spaces act as containers or envelopes within which the final component design is to be realised. The kinematic connections allow the behaviour of the product to be simulated to gain more information (such as overall component dimensions and areas of potential collisions) for the detailed design phase. In addition, the paper describes the design process based on the proposed 3D layout design system and contrasts it with the traditional design process. An industrial case study illustrates two advantages of the proposed approach: (i) the design process proceeds faster, because unnecessary modifications of layout parameters and constraints are avoided when kinematic functionality is verified before detailed design; and (ii) the design process can produce better designs, since alternative solution principles can be explored early in the design process. Theoretical issues are discussed concerning kinematic constraint inheritance during design space decomposition and computer support for non-rigid design spaces.
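To make the design-space idea concrete, here is a minimal sketch (not the paper's data model; the names DesignSpace and find_collisions are hypothetical) in which design spaces are axis-aligned bounding boxes and potential collisions are flagged by a pairwise overlap test.

```python
# Illustrative sketch only: design spaces as axis-aligned bounding boxes (AABBs),
# with a simple pairwise overlap test to flag areas of potential collision.
# Names (DesignSpace, find_collisions) are hypothetical, not from the paper.
from dataclasses import dataclass
from itertools import combinations

@dataclass
class DesignSpace:
    name: str
    min_corner: tuple   # (x, y, z) lower corner of the envelope
    max_corner: tuple   # (x, y, z) upper corner of the envelope

def overlaps(a: DesignSpace, b: DesignSpace) -> bool:
    """AABB overlap test along each axis."""
    return all(a.min_corner[i] <= b.max_corner[i] and b.min_corner[i] <= a.max_corner[i]
               for i in range(3))

def find_collisions(spaces):
    """Return pairs of design spaces whose envelopes intersect."""
    return [(a.name, b.name) for a, b in combinations(spaces, 2) if overlaps(a, b)]

if __name__ == "__main__":
    layout = [
        DesignSpace("motor",   (0.0, 0.0, 0.0), (2.0, 1.0, 1.0)),
        DesignSpace("gearbox", (1.8, 0.0, 0.0), (3.0, 1.0, 1.0)),   # overlaps the motor envelope
        DesignSpace("housing", (5.0, 0.0, 0.0), (6.0, 1.0, 1.0)),
    ]
    print(find_collisions(layout))   # [('motor', 'gearbox')]
```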
32.
Alain Bretto, Hocine Cherifi, Driss Aboutajdine 《Pattern Recognition》 2002, 35(3): 651-658
Hypergraph theory, as originally developed by Berge (Hypergraphe, Dunod, Paris, 1987), is a theory of finite combinatorial sets that models many problems in operational research and combinatorial optimization. This framework turns out to be very interesting for many other applications, in particular for computer vision. In this paper, we survey the relationship between combinatorial sets and image processing. More precisely, we propose an overview of different applications of image hypergraph models to image analysis. The survey focuses on the combinatorial representation of an image and shows the effectiveness of this approach for low-level image processing, in particular segmentation, edge detection and noise cancellation.
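One simple way to attach a hypergraph to an image, used here only as an illustration of the kind of model such surveys discuss, is a neighbourhood hypergraph: every pixel is a vertex, and each pixel generates a hyperedge containing its adjacent pixels whose grey levels are close to its own. The sketch below (the function name and the specific thresholding rule are assumptions) builds such a structure.

```python
# Illustrative image-adjacency hypergraph: vertices are pixels; the hyperedge
# generated by pixel p contains p and its 4-neighbours whose grey level differs
# from p's by at most `beta`. This is one simple instance of an image hypergraph
# model, not necessarily the construction used in the surveyed papers.
import numpy as np

def neighbourhood_hyperedges(img: np.ndarray, beta: int = 10):
    h, w = img.shape
    hyperedges = {}
    for y in range(h):
        for x in range(w):
            edge = {(y, x)}
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):   # 4-connectivity
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and abs(int(img[ny, nx]) - int(img[y, x])) <= beta:
                    edge.add((ny, nx))
            hyperedges[(y, x)] = frozenset(edge)
    return hyperedges

if __name__ == "__main__":
    img = np.array([[10, 12, 200],
                    [11, 13, 210],
                    [90, 95, 205]], dtype=np.uint8)
    edges = neighbourhood_hyperedges(img, beta=10)
    print(edges[(0, 0)])   # pixel (0, 0) groups only with its similar neighbours
```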
33.
Peter B. Goldsmith 《Automatica》 2002, 38(4): 703-708
The goal of iterative learning control (ILC) is to improve the accuracy of a system that repeatedly follows a reference trajectory. This paper proves that for each causal linear time-invariant ILC, there is an equivalent feedback that achieves the ultimate ILC error with no iterations. Remarkably, this equivalent feedback depends only on the ILC operators and hence requires no plant knowledge. This equivalence is obtained whether or not the ILC includes current-cycle feedback. If the ILC system is internally stable and converges to zero error, there exists an internally stabilizing feedback that approaches zero error at high gain. Since conventional feedback requires no iterations, there is no reason to use causal ILC.
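A minimal numerical sketch of the fixed-point argument behind this equivalence, assuming a scalar first-order ILC update of the form u_{k+1} = Q u_k + L e_k (the update form, the plant and the gains are illustrative assumptions, not the paper's formulation): at convergence u = Q u + L e, so u = L e / (1 − Q), and a feedback gain built only from the ILC operators, C = L / (1 − Q), reproduces the same ultimate error without iterating.

```python
# Scalar illustration: ILC u_{k+1} = Q*u_k + L*e_k on a static plant y = P*u,
# tracking reference r. Compare the converged ILC error with the error of the
# one-shot feedback gain C = L / (1 - Q), which uses only the ILC operators.
# All numbers are illustrative assumptions.
P, Q, L, r = 2.0, 0.5, 0.2, 1.0

u = 0.0
for _ in range(200):                 # run the ILC to (near) convergence
    e = r - P * u                    # tracking error of the current iteration
    u = Q * u + L * e                # first-order ILC update

C = L / (1.0 - Q)                    # equivalent feedback gain: depends on Q, L only
u_fb = C * r / (1.0 + C * P)         # closed-loop input of the static feedback loop
e_fb = r - P * u_fb

print("ultimate ILC error      :", r - P * u)
print("equivalent feedback error:", e_fb)   # matches the ILC's ultimate error
```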
34.
Ilya A. Shkolnikov, Yuri B. Shtessel 《Automatica》 2002, 38(5): 837-842
A method of asymptotic output tracking in a class of causal nonminimum-phase uncertain nonlinear systems is considered. Local asymptotic stability of the output tracking-error dynamics is provided for the specified class of systems, whose nonlinear internal dynamics are hyperbolic at the origin and are forced by a reference output profile and by external disturbances generated by a known linear exosystem. The nonlinear vector field of the internal dynamics is expanded in a power series about a selected operating point in the internal-dynamics state space. The presented technique employs linear-algebraic methods and a sliding mode control approach. The solution is a complete constructive algorithm.
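As a small illustration of one ingredient mentioned above, the power-series expansion of the internal dynamics about an operating point, the sketch below computes the first-order (Jacobian) term by finite differences for an invented two-state vector field; the vector field and the operating point are assumptions, not taken from the paper.

```python
# First-order power-series (Jacobian) expansion of a nonlinear vector field
# about a chosen operating point, via central finite differences.
# The vector field f and the operating point eta0 are illustrative assumptions.
import numpy as np

def f(eta):
    """A made-up internal-dynamics vector field, eta_dot = f(eta)."""
    return np.array([eta[1], 2.0 * eta[0] - 0.3 * eta[0] ** 3 + 0.1 * eta[1]])

def jacobian(func, x0, h=1e-6):
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    J = np.zeros((n, n))
    for j in range(n):
        step = np.zeros(n)
        step[j] = h
        J[:, j] = (func(x0 + step) - func(x0 - step)) / (2 * h)
    return J

eta0 = np.array([0.0, 0.0])       # selected operating point
A_int = jacobian(f, eta0)         # linear term of the expansion
print(A_int)
print(np.linalg.eigvals(A_int))   # one stable, one unstable eigenvalue: hyperbolic at the origin
```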
35.
P. Remagnino, G.A. Jones 《Pattern Recognition》 2004, 37(4): 675-689
The latest advances in hardware technology and the state of the art of computer vision and artificial intelligence research can be employed to develop autonomous and distributed monitoring systems. The paper proposes a multi-agent architecture for understanding scene dynamics by merging the information streamed by multiple cameras. A typical application would be the monitoring of a secure site, or any visual surveillance application deploying a network of cameras. Modular software (the agents) within such an architecture controls the different components of the system and incrementally builds a model of the scene by merging the information gathered over extended periods of time. The role of distributed artificial intelligence, composed of separate and autonomous modules, is justified by the need for scalable designs capable of co-operating to infer an optimal interpretation of the scene. Decentralizing intelligence means creating more robust and reliable sources of interpretation, but it also allows easy maintenance and updating of the system. Results are presented to support the choice of a distributed architecture, and to show that a scene interpretation can be built incrementally and efficiently by modular software.
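The following skeleton is meant only to illustrate the modular, incremental-merging idea (camera agents report sightings, a scene agent folds them into a shared model); the class names and data layout are hypothetical and are not the paper's architecture.

```python
# Illustrative (not the paper's) skeleton of modular agents: each camera agent
# reports object sightings, and a scene agent incrementally merges them into a
# single scene model keyed by object identity.
from collections import defaultdict

class CameraAgent:
    def __init__(self, camera_id):
        self.camera_id = camera_id

    def observe(self, detections):
        """detections: list of (object_id, (x, y)) ground-plane positions."""
        return [(obj_id, pos, self.camera_id) for obj_id, pos in detections]

class SceneAgent:
    def __init__(self):
        self.model = defaultdict(list)   # object_id -> list of (position, camera_id)

    def merge(self, reports):
        """Incrementally fold new reports into the scene model."""
        for obj_id, pos, cam in reports:
            self.model[obj_id].append((pos, cam))

cam1, cam2, scene = CameraAgent("cam1"), CameraAgent("cam2"), SceneAgent()
scene.merge(cam1.observe([("person_7", (3.0, 4.5))]))
scene.merge(cam2.observe([("person_7", (3.1, 4.4))]))   # same person, second view
print(dict(scene.model))
```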
36.
M. Carroll 《Computers & Electrical Engineering》 2004, 30(5): 331-345
The paper reports on experiments undertaken at the University of Wollongong to characterise the fading profiles and delay parameters of indoor wireless channels in the 5 GHz U-NII bands. The measurements were taken at different locations around the campus, with results recorded for post-processing to calculate the Rician K-factor, the level crossing rate and the average fade duration, as well as the mean excess delay, rms delay spread, and coherence bandwidth of the channel. The presented measurement results can be useful in developing a Markov-chain-based model of the transport channel for IEEE 802.11a or HIPERLAN/2 networks. The results also indicate scenarios where the coherence bandwidth of the channel is smaller than the width of the OFDM sub-carrier channels in either of the mentioned systems.
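For readers unfamiliar with the reported delay parameters, the sketch below computes the mean excess delay, rms delay spread, and an approximate coherence bandwidth from a power delay profile. The profile values are invented, and the 1/(5·σ_τ) rule is the usual textbook approximation for roughly 0.5 frequency correlation, not a figure taken from the measurements.

```python
# Mean excess delay, rms delay spread, and approximate coherence bandwidth
# from a power delay profile (PDP). The PDP below is an invented example;
# Bc ~ 1/(5*sigma_tau) is the common rule of thumb for ~0.5 correlation.
import numpy as np

delays_ns = np.array([0.0, 30.0, 70.0, 120.0])    # path delays (ns)
powers_db = np.array([0.0, -3.0, -9.0, -15.0])    # relative path powers (dB)

p = 10 ** (powers_db / 10.0)                      # linear power weights
tau = delays_ns * 1e-9                            # seconds

mean_excess_delay = np.sum(p * tau) / np.sum(p)
second_moment = np.sum(p * tau ** 2) / np.sum(p)
rms_delay_spread = np.sqrt(second_moment - mean_excess_delay ** 2)
coherence_bw = 1.0 / (5.0 * rms_delay_spread)     # ~0.5 correlation approximation

print(f"mean excess delay : {mean_excess_delay * 1e9:.1f} ns")
print(f"rms delay spread  : {rms_delay_spread * 1e9:.1f} ns")
print(f"coherence BW      : {coherence_bw / 1e6:.1f} MHz")
# If this coherence bandwidth falls below the OFDM sub-carrier channel width,
# the channel is frequency selective within a sub-carrier -- the scenario the
# paper points out for 802.11a / HIPERLAN/2.
```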
37.
This paper investigates fault detection and isolation of linear parameter-varying (LPV) systems using parameter-varying (C,A)-invariant subspaces and parameter-varying unobservability subspaces. The so-called “detection filter” approach, formulated as the fundamental problem of residual generation (FPRG) for linear time-invariant (LTI) systems, is extended to a class of LPV systems. The question of stability is addressed in terms of Lyapunov quadratic stability using linear matrix inequalities. The results are applied to the model of a generic small commercial aircraft.
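As background on the geometric notion used here: a subspace V is (C,A)-invariant (conditioned invariant) when A(V ∩ ker C) ⊆ V. The sketch below checks this condition numerically with a standard rank test; the matrices are invented for illustration and the code is not from the paper.

```python
# Numerical check that a subspace V is (C,A)-invariant: A*(V ∩ ker C) ⊆ V.
# Matrices are invented for illustration.
import numpy as np
from scipy.linalg import null_space

def is_CA_invariant(A, C, V, tol=1e-9):
    """V: matrix whose columns span the candidate subspace."""
    # Basis of the intersection V ∩ ker C: vectors V @ y with C @ (V @ y) = 0.
    W = V @ null_space(C @ V)
    if W.shape[1] == 0:
        return True                    # trivial intersection: condition holds vacuously
    # Containment A @ W ⊆ span(V): the rank must not grow when A @ W is appended.
    return np.linalg.matrix_rank(np.hstack([V, A @ W]), tol=tol) == np.linalg.matrix_rank(V, tol=tol)

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 4.0]])
C = np.array([[0.0, 1.0, 0.0]])
V = np.column_stack([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0]])   # candidate subspace span{e1, e2}
print(is_CA_invariant(A, C, V))          # True: A maps V ∩ ker C = span{e1} into V
```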
38.
Decentralized overlapping feedback laws are designed for a formation of unmanned aerial vehicles. The dynamic model of the formation, with an information structure constraint in which each vehicle except the leader detects only the vehicle directly in front of it, is treated as an interconnected system with overlapping subsystems. Using the mathematical framework of the inclusion principle, the interconnected system is expanded into a higher-dimensional space in which the subsystems appear to be disjoint. Then, at each subsystem, a static state feedback controller is designed to robustly stabilize the perturbed nominal dynamics of the subsystem. The design procedure is based on convex optimization tools involving linear matrix inequalities. As a final step, the decentralized controllers are contracted back to the original interconnected system for implementation.
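The toy simulation below illustrates only the information-structure constraint (each follower measures just the vehicle directly ahead and applies a local spacing law); it is not the paper's LMI-based overlapping design, and all gains and parameters are invented.

```python
# Toy illustration of the information-structure constraint only: a chain of
# vehicles in which each follower measures just the vehicle directly ahead and
# applies a local PD-like law on the spacing error. This is NOT the paper's
# LMI-based overlapping design; gains and parameters are invented.
import numpy as np

n, dt, steps = 4, 0.05, 400
desired_gap, kp, kd = 10.0, 1.0, 2.0

pos = np.array([30.0, 18.0, 9.0, 0.0])        # leader first, then followers
vel = np.zeros(n)
leader_speed = 5.0

for _ in range(steps):
    acc = np.zeros(n)
    vel[0] = leader_speed                      # leader drives at constant speed
    for i in range(1, n):                      # follower i sees only vehicle i-1
        spacing_error = (pos[i - 1] - pos[i]) - desired_gap
        acc[i] = kp * spacing_error + kd * (vel[i - 1] - vel[i])
    vel += acc * dt
    pos += vel * dt

print(np.round(pos[:-1] - pos[1:], 2))         # inter-vehicle gaps, should approach 10
```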
39.
Exploratory data analysis methods are essential for gaining insight into data. Identifying the most important variables and detecting quasi-homogeneous groups of data are problems of interest in this context. Solving such problems is a difficult task, mainly due to the unsupervised nature of the underlying learning process. Unsupervised feature selection and unsupervised clustering can be successfully approached as optimization problems by means of global optimization heuristics if an appropriate objective function is considered. This paper introduces an objective function capable of efficiently guiding the search for significant features and, simultaneously, for the respective optimal partitions. Experiments conducted on complex synthetic data suggest that the proposed function is unbiased with respect to both the number of clusters and the number of features.
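The paper's objective function is not reproduced in the abstract, so the sketch below only shows the generic wrapper formulation it fits into: score candidate (feature subset, number of clusters) pairs with an internal criterion and keep the best. The silhouette score is used as a stand-in criterion here; unlike the proposed function, it is not claimed to be unbiased with respect to the number of clusters or features.

```python
# Generic wrapper formulation only: evaluate candidate (feature subset, k) pairs
# with an internal clustering criterion and keep the best. The silhouette score
# is a stand-in objective; the paper proposes its own objective function, which
# is not reproduced here.
import itertools
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
X_informative, _ = make_blobs(n_samples=200, centers=3, n_features=2, random_state=0)
X = np.hstack([X_informative, rng.normal(size=(200, 3))])   # append 3 noise features

best = None
features = range(X.shape[1])
for size in (1, 2, 3):
    for subset in itertools.combinations(features, size):
        for k in (2, 3, 4):
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X[:, subset])
            score = silhouette_score(X[:, subset], labels)
            if best is None or score > best[0]:
                best = (score, subset, k)

print("best score %.3f with features %s and k=%d" % best)
```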
40.
Vidroha Debroy, W. Eric Wong 《Journal of Systems and Software》 2011, 84(4): 587-602
Test set size, in terms of the number of test cases, is an important consideration when testing software systems. Using too few test cases might result in poor fault detection, while using too many might be very expensive and suffer from redundancy. We define the failure rate of a program as the fraction of test cases in an available test pool that result in execution failure on that program. This paper investigates the relationship between failure rates and the number of test cases required to detect the faults. Our experiments, based on 11 sets of C programs, suggest that an accurate estimate of the failure rates of the potential fault(s) in a program can provide a reliable estimate of the test set size adequate for fault detection, and that this should therefore be one of the factors considered during test set construction. Furthermore, the proposed model is fairly robust to incorrect estimates of failure rates and can still provide good predictive quality. Experiments are also performed to observe the relationship between multiple faults present in the same program using the concept of a failure rate. When predicting effectiveness against a program with multiple faults, the results indicate that not knowing the number of faults in the program is not a significant concern, as the predictive quality is typically not affected adversely.
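The abstract does not give the estimation model itself, so as an illustration of the basic failure-rate versus test-set-size relationship the sketch below uses a simple geometric model: with failure rate θ, n randomly chosen tests detect the fault with probability 1 − (1 − θ)^n, so roughly n ≥ ln(1 − c)/ln(1 − θ) tests are needed for confidence c. This model is an assumption for illustration, not necessarily the paper's.

```python
# Illustrative geometric model of the failure-rate / test-set-size relationship:
# if a fault's failure rate is theta (the fraction of the test pool that fails),
# then n randomly chosen tests miss it with probability (1 - theta)**n.
# This simple model is an illustration, not necessarily the paper's estimator.
import math

def tests_needed(theta: float, confidence: float = 0.95) -> int:
    """Smallest n with detection probability >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - theta))

def detection_probability(theta: float, n: int) -> float:
    return 1.0 - (1.0 - theta) ** n

for theta in (0.20, 0.05, 0.01):
    n = tests_needed(theta)
    print(f"failure rate {theta:.2f}: {n:4d} tests give "
          f"{detection_probability(theta, n):.3f} detection probability")
```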