991.
This paper reviews and extends a technique to detect weak coupling (one-way coupling or complete decoupling) among elements of a dynamic system model, and to partition and reduce models in which weak coupling is found. The ability to partition a model increases the potential for physical-domain model reduction, and allows parallel simulation of smaller individual submodels that can reduce computation time. Negligible constraint-equation terms are identified and eliminated in a bond graph by converting inactive power bonds to modulated sources. If separate bond graphs result, between which all modulating signals move from a “driving” subgraph to a “driven” one, then one-way coupling exists in the model and it can be separated into driving and driven partitions; information flow between the subgraphs is one-way. In this paper the algorithm is extended to models in which two-way information flow from modulating signals precludes complete partitioning. It is shown for several classes of modulating signals that, under certain conditions, the signal is “weak” and can therefore be eliminated. Removal of weak signals allows partitioning of the longitudinal and pitch dynamics of a medium-duty truck model. The intensity of dynamic coupling and the potential for model reduction are shown to depend on the magnitude of system parameters and the severity of inputs such as road roughness.
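A minimal illustrative sketch of the partitioning idea (not the paper's bond-graph procedure): coupling between submodels is assumed to be summarized as a strength matrix, signals below a user-chosen threshold are treated as weak and dropped, and the remaining coupling graph is tested for one-way (acyclic) structure. The matrix, threshold, and all names are assumptions for illustration.

    # Sketch: weak-signal removal followed by a one-way-coupling test.
    # Assumption: C[i, j] is the strength of the modulating signal from
    # submodel i to submodel j; "weak" means below a chosen threshold.
    import numpy as np

    def partition_by_weak_coupling(C, threshold):
        """Drop weak signals, then test whether the remaining coupling is one-way.

        Returns (driving, driven) index lists if the pruned coupling graph is
        acyclic (one-way coupling), otherwise None.
        """
        n = C.shape[0]
        strong = C > threshold                 # keep only strong modulating signals
        np.fill_diagonal(strong, False)

        # Kahn's algorithm: a topological order exists iff coupling is one-way.
        indegree = strong.sum(axis=0).astype(int)
        order, queue = [], [i for i in range(n) if indegree[i] == 0]
        while queue:
            i = queue.pop()
            order.append(i)
            for j in range(n):
                if strong[i, j]:
                    indegree[j] -= 1
                    if indegree[j] == 0:
                        queue.append(j)
        if len(order) < n:
            return None                        # two-way coupling remains: no partition

        # In this simplified two-block view, submodels receiving no strong
        # signal form the "driving" partition; the rest are "driven".
        driving = [i for i in range(n) if not strong[:, i].any()]
        driven = [i for i in range(n) if i not in driving]
        return driving, driven

    # Example: submodel 0 drives 1 and 2; the feedback 2 -> 0 is weak.
    C = np.array([[0.0, 0.9, 0.7],
                  [0.0, 0.0, 0.0],
                  [0.05, 0.0, 0.0]])
    print(partition_by_weak_coupling(C, threshold=0.1))  # ([0], [1, 2])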
992.
Carrier-grade networks of the future are currently being standardized and designed under the umbrella name of Next Generation Network (NGN). The goal of NGN is to provide a more flexible network infrastructure that supports not just data and voice traffic routing, but also higher-level services and interfaces for third-party enhancements. Within this paper, opportunities to integrate grid and cloud computing strategies and standards into NGN are considered. The importance of standardized interfaces and the interoperability testing demanded by carrier-grade networks is discussed. Finally, a proposal for how the testing methods developed at the European Telecommunications Standards Institute (ETSI) can be applied to improve the quality of standards and implementations is presented.
993.
By the end of 1943, US Navy mathematician/codebreaker Marshall Hall Jr. had developed a system of statistical weights to align JN-25 messages in depth. Although then-current methods of aligning JN-25 messages in depth were working satisfactorily, Hall developed his method “just in case.” On 1 December 1943, the Japanese changed the method of numbering the lines and columns of additives on pages of the JN-25 additive book, and Hall’s weights, which had been developed “just in case,” were needed immediately. This paper discusses both the mathematical idea that was the foundation of Hall’s weights and the construction of the weights. It also explores the navy’s use of the weights as well as their use at Bletchley Park. At the same time, the navy was exploring the use of two other systems of weights to align JN-25 messages in depth, and those systems of weights are also described.
994.
We present two techniques for simplifying the list processing required in standard iterative refinement approaches to shape quality mesh generation. The goal of these techniques is to gain simplicity of programming, efficiency in execution, and robustness of termination. ‘Shape quality’ for a mesh generation method usually means that, under suitable conditions, a mesh with all angles exceeding a prescribed tolerance is generated. The methods introduced in this paper are truncated versions of such methods. They depend on the shape improvement properties of the terminal-edge LEPP-Delaunay refinement technique; we refer to them as approximate shape quality methods. They are intended for geometry-based preconditioning of coarse initial meshes for subsequent refinement to meet data representation needs. One technique is an algorithm re-organization to avoid maintaining a global list of triangles to be refined. The re-organization uses a recursive triangle processing strategy. Truncating the recursion depth results in an approximate method. Based on this, we argue that the refinement process can be carried out using a static list of the triangles to be refined that can be identified in the initial mesh. Comparisons of approximate to full shape quality meshes are provided.
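The following toy sketch shows only the control-flow ideas described above — recursive processing along a neighbour chain, truncation of the recursion depth, and a static work list fixed from the initial mesh. The geometry of LEPP-Delaunay refinement is deliberately replaced by a per-triangle quality score and a precomputed neighbour index, so this is not the authors' algorithm, only its organization.

    # Structural sketch: truncated recursion over a static work list.
    # Assumptions: quality is a stand-in for the minimum angle, neighbour is a
    # stand-in for the longest-edge neighbour, and the "+= 10" step is a
    # placeholder for an actual bisection/insertion that improves quality.
    from dataclasses import dataclass
    from typing import Optional, List

    @dataclass
    class Tri:
        quality: float                   # stand-in for the minimum angle (degrees)
        neighbour: Optional[int] = None  # index of longest-edge neighbour, if any

    MIN_QUALITY = 30.0   # prescribed angle tolerance
    MAX_DEPTH = 3        # truncating the recursion gives the "approximate" method

    def refine(tris: List[Tri], i: int, depth: int = 0) -> None:
        if depth >= MAX_DEPTH or tris[i].quality >= MIN_QUALITY:
            return
        j = tris[i].neighbour
        if j is not None and tris[j].quality < MIN_QUALITY:
            refine(tris, j, depth + 1)   # process the neighbour chain first
        tris[i].quality += 10.0          # placeholder for one refinement step

    def approximate_shape_quality(tris: List[Tri]) -> None:
        # Static work list fixed from the initial mesh; no global dynamic list.
        # Some poor triangles may remain, which is the "approximate" trade-off.
        for i in [k for k, t in enumerate(tris) if t.quality < MIN_QUALITY]:
            refine(tris, i)

    tris = [Tri(12.0, neighbour=1), Tri(22.0), Tri(40.0)]
    approximate_shape_quality(tris)
    print([t.quality for t in tris])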
995.
Metamodels for Computer-based Engineering Design: Survey and recommendations
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today’s engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper, we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning and kriging. We survey their existing application in engineering design, and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations, and how common pitfalls can be avoided.
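As a concrete illustration of one of the surveyed techniques, the sketch below fits a second-order polynomial response surface by least squares to samples of a stand-in "expensive" function. The test function, sample design, and names are assumptions for illustration only, not the paper's examples.

    # Sketch: a quadratic response-surface metamodel fitted by least squares.
    import numpy as np

    def expensive_analysis(x1, x2):
        # Placeholder for the costly computer analysis code being approximated.
        return np.sin(x1) + 0.5 * x2**2 + 0.2 * x1 * x2

    # Design of experiments: a small grid of sample points.
    x1, x2 = np.meshgrid(np.linspace(-2, 2, 5), np.linspace(-2, 2, 5))
    x1, x2 = x1.ravel(), x2.ravel()
    y = expensive_analysis(x1, x2)

    # Response surface: y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    # The metamodel is now a cheap surrogate for new design points.
    def metamodel(a, b):
        return np.array([1, a, b, a**2, b**2, a * b]) @ beta

    print(metamodel(0.5, 1.0), expensive_analysis(0.5, 1.0))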
996.
Two methods adapted from biological microscopy are described for a new application in imaging the morphology of rubbery latex particles. In the first method, a drop of latex is frozen in liquid nitrogen, sectioned with a diamond knife and vapour-stained with osmium tetroxide, then viewed by transmission electron microscopy. When applied to latexes made by emulsion polymerization of methyl methacrylate in a natural rubber latex seed, inclusions are clearly visible. A chemical fixation method is then described for imaging the morphology of such rubbery latex particles. Glutaraldehyde is added to the latex, followed by osmium tetroxide. The sample is then dehydrated in ethanol, epoxy resin added, and the sample cured, ultramicrotomed, and imaged with transmission electron microscopy. An inclusion morphology is again clearly seen.
997.
Water Rights     
Chris Perry & Geoff Kite, Water International, 2013, 38(4): 341-347
Abstract

Water is becoming increasingly scarce in many parts of the world. In most such areas, definition and enforcement of water rights have yet to be put in place, while experience shows that these steps are fundamentally important to ensure productive and efficient use of water. Often, as competition intensifies, the data required to assess appropriate allocations and rules become more contentious and difficult to access. New technologies are available, primarily based on remotely sensed satellite data used in conjunction with a minimum set of ground measurements, to generate hydrological data. Such data can readily be correlated with streamflow measurements and also can be used for “what if” analyses of specific climate situations or changes in ground cover. The satellite data are freely available over the Internet, as are other data required.
998.
Immersive visualisation is increasingly being used for comprehensive and rapid analysis of objects in 3D and object dynamic behaviour in 4D. Challenges are therefore presented to provide natural user interaction to enable effortless virtual object manipulation. Presented in this paper is the development and evaluation of an immersive human–computer interaction system based on stereoscopic viewing and natural hand gestures. For the development, it is based on the integration of a back-projection stereoscopic system for object and hand display, a hybrid inertial and ultrasonic tracking system to provide the absolute positions and orientations of the user’s head and hands, as well as a pair of high degrees-of-freedom data gloves to provide the relative positions and orientations of digit joints and tips on both hands. For the evaluation, it is based on a two-object scene with a virtual cube and a CT (computed tomography) volume created for demonstration of real-time immersive object manipulation. The system is shown to provide a correct user view of objects and hands in 3D with depth, as well as to enable a user to use a number of simple hand gestures to perform basic object manipulation tasks involving selection, release, translation, rotation and scaling. Also included in the evaluation are some quantitative tests of the system performance in terms of speed and latency.
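A minimal sketch of the basic manipulation tasks mentioned (translation, rotation and scaling) expressed as homogeneous 4x4 transforms applied to a selected object's pose. The gesture input format and all names are illustrative assumptions, not the paper's tracking or glove interface.

    # Sketch: composing one frame of translation, rotation and scaling.
    # Assumption: gesture recognition has already produced a translation vector,
    # a rotation angle about z, and a scale factor for the selected object.
    import numpy as np

    def translate(t):
        T = np.eye(4); T[:3, 3] = t; return T

    def rotate_z(theta):
        c, s = np.cos(theta), np.sin(theta)
        R = np.eye(4); R[:2, :2] = [[c, -s], [s, c]]; return R

    def scale(s):
        S = np.eye(4); S[:3, :3] *= s; return S

    object_pose = np.eye(4)   # pose of the selected virtual object
    gesture = {"translation": [0.1, 0.0, 0.0], "rotation_z": np.pi / 8, "scale": 1.2}
    object_pose = (translate(gesture["translation"])
                   @ rotate_z(gesture["rotation_z"])
                   @ scale(gesture["scale"])
                   @ object_pose)
    print(object_pose.round(3))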