69 search results in total.
31.
The primary objectives of the present exposition are to: (i) provide a generalized unified mathematical framework and setting leading to the unique design of computational algorithms for structural dynamic problems encompassing the broad scope of linear multi‐step (LMS) methods and within the limitation of the Dahlquist barrier theorem (Reference [3], G. Dahlquist, BIT 1963; 3: 27), and also leading to new designs of numerically dissipative methods with optimal algorithmic attributes that cannot be obtained employing existing frameworks in the literature; (ii) provide a meaningful characterization of various numerically dissipative/non‐dissipative time integration algorithms, both new and existing in the literature, based on the overshoot behavior of algorithms, leading to the notion of algorithms by design; and (iii) provide design guidelines on the selection of algorithms for structural dynamic analysis within the scope of LMS methods. For structural dynamics problems, the so‐called linear multi‐step (LMS) methods are first proven to be spectrally identical to a newly developed family of generalized single‐step single‐solve (GSSSS) algorithms. The design, synthesis and analysis of the unified framework of computational algorithms based on the overshooting behavior, together with additional algorithmic properties such as second‐order accuracy and unconditional stability with numerically dissipative features, yield three sub‐classes of practical computational algorithms: (i) zero‐order displacement and velocity overshoot (U0‐V0) algorithms; (ii) zero‐order displacement and first‐order velocity overshoot (U0‐V1) algorithms; and (iii) first‐order displacement and zero‐order velocity overshoot (U1‐V0) algorithms (the remainder, involving higher orders of overshooting behavior, are not considered competitive from practical considerations). Within each sub‐class of algorithms, a further distinction is made between the designs leading to optimal numerically dissipative and dispersive algorithms and the continuous acceleration and discontinuous acceleration algorithms that are subsets, corresponding to the designed placement of the spurious root at the low‐frequency limit or the high‐frequency limit, respectively. The conclusions and design guidelines are finally drawn, demonstrating that the U0‐V1 algorithms are suitable only for problems with given initial velocity, the U1‐V0 algorithms are suitable only for problems with given initial displacement, and the U0‐V0 algorithms are ideal for either or both cases of given initial displacement and initial velocity. For the first time, the design leading to optimal algorithms in the context of a generalized single‐step single‐solve framework and within the limitation of the Dahlquist barrier that maintains second‐order accuracy and unconditional stability with/without numerically dissipative features is described for structural dynamics computations, thereby providing closure to the class of LMS methods. Copyright © 2003 John Wiley & Sons, Ltd.
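For orientation, the sketch below advances a single-degree-of-freedom structural dynamics equation with the classical Newmark single-step scheme, a familiar non-dissipative (U0-V0 type) member of the single-step/LMS family discussed above; it is not the authors' GSSSS formulation, and the function name, parameters, and example values are illustrative only.

```python
import numpy as np

def newmark_sdof(m, c, k, f, u0, v0, dt, n_steps, beta=0.25, gamma=0.5):
    """Advance m*a + c*v + k*u = f(t) with the Newmark single-step scheme.

    With beta = 1/4, gamma = 1/2 this is the non-dissipative
    average-acceleration member of the single-step family; other parameter
    choices trade accuracy for numerical dissipation.
    """
    u, v = u0, v0
    a = (f(0.0) - c * v - k * u) / m          # consistent initial acceleration
    hist = [(0.0, u, v, a)]
    for n in range(1, n_steps + 1):
        t = n * dt
        # predictors built from the known state at the previous step
        u_pred = u + dt * v + 0.5 * dt**2 * (1.0 - 2.0 * beta) * a
        v_pred = v + dt * (1.0 - gamma) * a
        # one linear solve per step ("single solve") for the new acceleration
        lhs = m + gamma * dt * c + beta * dt**2 * k
        a = (f(t) - c * v_pred - k * u_pred) / lhs
        u = u_pred + beta * dt**2 * a
        v = v_pred + gamma * dt * a
        hist.append((t, u, v, a))
    return hist

# free vibration of a unit-mass oscillator released from u0 = 1
out = newmark_sdof(m=1.0, c=0.0, k=(2 * np.pi)**2, f=lambda t: 0.0,
                   u0=1.0, v0=0.0, dt=0.01, n_steps=200)
```

With beta = 1/4 and gamma = 1/2 the scheme is second-order accurate and unconditionally stable, the same pair of properties the optimal designs described in the abstract are required to retain.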
32.
33.
34.
The success of resin transfer molding (RTM) depends upon the complete wetting of the fiber preform. Effective mold designs and process modifications facilitating improved impregnation of the preform have a direct impact on the successful manufacturing of parts. Race tracking, caused by variations in permeability around bends and corners in liquid composite molding (LCM) processes such as RTM, has traditionally been considered undesirable, while related processes such as vacuum assisted RTM (VARTM) and injection molding have employed flow channels to improve the resin distribution. In this paper, the effect of flow channels in RTM is explored through process simulation studies involving flow analysis of the resin when channels are present. The flow in the channels is modeled and characterized based on equivalent permeabilities: it is taken to be Darcian, as in the fiber preform, and process modeling and simulation tools for RTM are employed to study the flow and pressure behavior when channels are present. Simulation studies based on a flat plate indicated that pressures in the mold are reduced by channels, and the results have been compared with experimental data and equivalent permeability models. The experimental comparisons validate the reduction in pressure with channels and the use of equivalent permeability models. Numerical simulation studies show the positive effect of the channels in improving flow impregnation and reducing mold pressures. The studies also include geometrically complex parts to demonstrate the advantages of flow channels in RTM.
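As a rough illustration of the equivalent-permeability idea, the sketch below treats an open channel as a Darcian medium using the standard lubrication-theory estimate K = h²/12 for a gap of height h; the specific estimate and all numerical values are assumptions for illustration and are not taken from the paper.

```python
def channel_equivalent_permeability(h):
    """Lubrication-theory estimate for flow between parallel plates of gap h:
    K_eq = h**2 / 12, so the channel can be treated as a Darcian medium."""
    return h**2 / 12.0

def darcy_velocity(K, mu, dp_dx):
    """Darcy's law: superficial velocity driven by a pressure gradient."""
    return -(K / mu) * dp_dx

# illustrative numbers (not from the paper): a 1 mm channel vs. a typical preform
K_channel = channel_equivalent_permeability(1.0e-3)   # ~8.3e-8 m^2
K_preform = 1.0e-10                                    # assumed preform permeability, m^2
mu = 0.1                                               # resin viscosity, Pa.s
dp_dx = -1.0e5 / 0.5                                   # 1 bar drop over 0.5 m

print(darcy_velocity(K_channel, mu, dp_dx) / darcy_velocity(K_preform, mu, dp_dx))
# the channel carries flow roughly K_channel / K_preform times faster, which is
# why channels redistribute resin and lower the required injection pressure
```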
35.
Ontology Management in Enterprises
In today's increasingly competitive business market, many large organisations have begun to research and develop flexible enterprises in order to be able to quickly respond to market opportunities. They are seeking ways to maximise the power of information assets stored across hundreds of their databases and application programs by bringing them into open interoperable environments. However, this effort has been seriously hindered by various kinds of heterogeneity. Each system has its own domain model for its environment and efficient task operations. However, in order for systems to communicate and co-operate effectively, a shared domain model is required. The semantics of the common, and each local, model must be captured explicitly and formally to enable meaningful information exchange. The semantics of diverse information resources are captured by ontologies — definitions of terms as used in data sources, i.e. concepts and the relationships between concepts. When defining the relationships between data sources, we rely on ontologies to make the meaning of the different vocabularies used explicit. This paper explores the role of ontologies in enterprises, and proposes a methodology for managing enterprise ontology resources and a suite of support tools.
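A minimal sketch of the idea, assuming nothing about the paper's actual tools or notation: a shared ontology holds concepts and the relationships between them, and each local system maps its own vocabulary onto the shared concepts so that terms can be translated between systems.

```python
# A toy shared ontology: concepts plus typed relationships between them.
# All names here are illustrative; the paper does not prescribe a format.
shared_ontology = {
    "concepts": {"Customer", "Order", "Product"},
    "relations": [("Customer", "places", "Order"),
                  ("Order", "contains", "Product")],
}

# Each local system keeps its own vocabulary; mappings to the shared
# ontology make the semantics of the local terms explicit.
local_mappings = {
    "crm_db":  {"client": "Customer", "purchase": "Order"},
    "erp_app": {"account_holder": "Customer", "sales_order": "Order",
                "item": "Product"},
}

def translate(term, source, target):
    """Translate a term from one system's vocabulary to another's via the
    shared concept, returning None when no common concept exists."""
    concept = local_mappings[source].get(term)
    reverse = {v: k for k, v in local_mappings[target].items()}
    return reverse.get(concept)

print(translate("client", "crm_db", "erp_app"))   # -> 'account_holder'
```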
36.
The manufacture of large, geometrically complex components of fibre‐reinforced composite materials by resin transfer molding (RTM) involves the injection of resin into a mold cavity filled with porous fibre preforms. The overall success of the RTM manufacturing process depends on the complete impregnation of the fibre mat by the polymer resin, the prevention of polymer gelation during filling, and the subsequent avoidance of dry spots. Since a cold resin is injected into a hot mold, the associated physics encompasses a moving boundary value problem in conjunction with the multi‐disciplinary study of flow/thermal behavior and cure kinetics inside the mold cavity. Although experimental validations are indispensable, routine manufacture of large, complex structural geometries can only be enhanced via computational simulations, thus eliminating costly trial runs and helping the designer in the set‐up of the manufacturing process. This study describes the computational developments towards formulating an effective simulation‐based design methodology using the finite element method. The specific application is thin shell‐like geometries, with the thickness much smaller than the other dimensions of the part. Due to the highly advective nature of the non‐isothermal conditions involving thermal and polymerization reactions, special computational considerations and stabilization techniques are also proposed. Validations and comparisons with experimental results are presented wherever available. Copyright © 2001 John Wiley & Sons, Ltd.
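As a back-of-the-envelope companion to the filling problem described above, the sketch below uses the classical one-dimensional constant-pressure Darcy result t_fill = φμL²/(2KΔP) to check whether a mold fills before an assumed resin gel time; all property values are illustrative assumptions, not data from the paper.

```python
def fill_time_1d(L, K, phi, mu, dP):
    """1-D constant-pressure RTM fill time from Darcy's law:
    t_fill = phi * mu * L**2 / (2 * K * dP)."""
    return phi * mu * L**2 / (2.0 * K * dP)

# illustrative values, not taken from the paper
t_fill = fill_time_1d(L=0.5, K=1e-10, phi=0.5, mu=0.1, dP=2e5)   # ~312 s
t_gel  = 600.0    # assumed resin gel time at the mold temperature, s
print(t_fill, "s; fills before gelation:", t_fill < t_gel)
```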
37.
A standardized formal theory of the development/evolution, characterization and design of a wide variety of computational algorithms emanating from a generalized time weighted residual philosophy for dynamic analysis is first presented, with subsequent emphasis on detailed formulations of a particular class relevant to the so‐called time integration approaches, which belong to the much broader classification of time discretized operators. Of fundamental importance in the present exposition is the evolution of the theoretical design and the subsequent characterization encompassing a wide variety of time discretized operators; the proposed developments are new and significantly different from the way traditional modal-type approaches and a wide variety of step‐by‐step time integration approaches have been developed and described in the research literature and in standard text books over the years. The theoretical ideas and basis leading to the generalized methodology and formulations are explained via a generalized time weighted philosophy encompassing single‐field and two‐field forms of representation of the semi‐discretized dynamic equations of motion. Therein, the developments first lead to integral operators in time; the resulting consequences then systematically lead to and explain a wide variety of generalized time integration operators, of which the family of single‐step time integration operators and various widely recognized and new algorithms are subsets, together with the associated multi‐step time integration operators and a class of finite element in time integration operators, and their relationships are particularly addressed. The generalized formulations not only encompass and explain a wide variety of time discretized operators and recover various original methods of algorithmic development, but also naturally inherit features that provide new avenues which have not been explored and/or exploited to date, and permit time discretized operators to be uniquely characterized by algorithmic markers. The resulting, so‐called discrete numerically assigned [DNA] markers not only serve as a prelude towards a standardized formal theory of development of time discretized operators and a forum for selecting and identifying time discretized operators, but also permit lucid communication when referring to various time discretized operators. The characterization of time discretized operators rests on these DNA algorithmic markers, which essentially comprise both (i) the weighted time fields introduced for enacting the time discretization process, and (ii) the corresponding conditions these weighted time fields impose (dictate) upon the approximations (if any) for the dependent field variables in the theoretical development of time integrators and the associated updates of the time discretized operators. Furthermore, a single analysis code which permits a variety of choices to the analyst is now feasible for performing structural dynamics computations on modern computing platforms. Copyright © 2001 John Wiley & Sons, Ltd.
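Purely to illustrate the notion of tagging a time discretized operator with identifying markers, the sketch below records an integrator's defining weight parameters and the conditions it satisfies; the paper's actual DNA marker notation (weighted time fields plus the conditions they impose) is its own, and the record shown here is only a loose analogy with invented field names.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TimeOperatorMarkers:
    """Illustrative 'marker' record for a single-step time operator: the
    weight parameters that define it plus the conditions it satisfies.
    This is not the paper's DNA notation, only a sketch of the idea of
    identifying an integrator by such a tag."""
    name: str
    weights: dict          # parameters of the weighted time fields
    conditions: tuple      # properties / constraints the operator satisfies

trapezoidal = TimeOperatorMarkers(
    name="Newmark (average acceleration)",
    weights={"beta": 0.25, "gamma": 0.5},
    conditions=("second-order accurate", "unconditionally stable",
                "no numerical dissipation"),
)
```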
38.
Some noteworthy historical perspectives and an overview of macroscale and microscale heat transport behavior in materials and structures are presented. The topic of heat waves is also discussed. The significance of constitutive models for both macroscale and microscale heat conduction is described, in conjunction with generalizations drawn concerning the physical relevance and the role of the relaxation and retardation times emanating from the Jeffreys-type heat flux constitutive model, with consequences for the Cattaneo heat flux model and subsequently the Fourier heat flux model. Both macroscopic model formulations for applications to macroscopic heat conduction problems and two-step models for use in specialized applications to account for microscale heat transport mechanisms are overviewed, with emphasis on the proposition of a Generalized Two-Step relaxation/retardation time-based heating model. So as to bring forth a variety of issues in a single forum, illustrative numerical applications are overviewed, including some of relevance to thermo-mechanical interactions.
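The hierarchy of heat flux models referred to above can be written compactly as follows; the symbols (relaxation time τ_q, retardation time τ_T, conductivity k) are a common notation chosen here rather than the paper's own.

```latex
% Jeffreys-type heat flux model with relaxation time \tau_q and
% retardation time \tau_T:
\mathbf{q} + \tau_q \frac{\partial \mathbf{q}}{\partial t}
  = -k\left(\nabla T + \tau_T \frac{\partial (\nabla T)}{\partial t}\right)

% Setting \tau_T = 0 recovers the Cattaneo model (finite-speed heat waves):
\mathbf{q} + \tau_q \frac{\partial \mathbf{q}}{\partial t} = -k\,\nabla T

% Setting \tau_T = \tau_q = 0 recovers the classical Fourier model:
\mathbf{q} = -k\,\nabla T
```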
39.
Stabilized flux-based finite element representations for steady two-dimensional incompressible flow/thermal problems, with emphasis on subsequently applying such techniques to convectively cooled structures, are described in this article. First, the discretized equations are derived from a mixed formulation using both primary and flux variables in conjunction with the Streamline-Upwind/Petrov-Galerkin (SUPG) and Pressure-Stabilizing/Petrov-Galerkin (PSPG) features that are used to stabilize the solutions. The constitutive equations are then introduced into the discretized representations, and the equations are finally solved for the primary variables. Equal-order linear quadrilateral interpolation functions are used for the velocities, pressure, and temperature. Numerical results are presented for a variety of situations, and finally emphasis is placed on applications to convectively cooled structures that are subjected to intense localized heating.
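For concreteness, one widely used form of the SUPG stabilization parameter is sketched below; the paper may define its stabilization terms differently, so this is an illustrative assumption rather than the article's formulation.

```python
import math

def supg_tau(u, h, kappa):
    """One common SUPG stabilization parameter for an element of size h,
    advection speed u and diffusivity kappa (1-D 'optimal' form):
    tau = h/(2|u|) * (coth(Pe) - 1/Pe), with Pe = |u| h / (2 kappa)."""
    if abs(u) < 1e-14:
        return h**2 / (12.0 * kappa)        # diffusion-dominated limit
    Pe = abs(u) * h / (2.0 * kappa)
    return h / (2.0 * abs(u)) * (1.0 / math.tanh(Pe) - 1.0 / Pe)

# The stabilizing contribution added element-wise to the Galerkin weak form
# weights the equation residual with tau * (u . grad w), which suppresses
# the spurious oscillations of the plain Galerkin method at high Peclet number.
print(supg_tau(u=1.0, h=0.05, kappa=1e-3))
```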
40.
Abstract

The present article introduces a new and effective virtual-pulse (VIP) time integral methodology of computation for linear transient heat transfer analysis and serves to lay down the theoretical basis for subsequent applications to general heat transfer problems. For expository purposes, attention is purposely restricted to linear models. For this class of problems, the proposed methodology is explicit, unconditionally stable, and possesses second-order accuracy for general heat loading situations. Unlike past approaches and ongoing practices, the methodology offers several computationally attractive yet accurate features and promises to be an attractive alternative for heat transfer analysts.
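The class of problems targeted is the semi-discretized linear system C dT/dt + K T = Q(t). As a conventional point of reference only (this is not the virtual-pulse method, whose formulation is not given here), the sketch below advances that system with the trapezoidal (Crank-Nicolson) rule, which shares the unconditional stability and second-order accuracy mentioned above but requires an implicit solve; the matrices and loading are assumed toy values.

```python
import numpy as np

def crank_nicolson_heat(C, K, Q, T0, dt, n_steps):
    """Baseline scheme (NOT the virtual-pulse method of the paper):
    advance the semi-discretized linear heat equation C dT/dt + K T = Q(t)
    with the trapezoidal rule, which is likewise unconditionally stable and
    second-order accurate but implicit."""
    T = np.array(T0, dtype=float)
    A = C + 0.5 * dt * K        # constant system matrix; could be pre-factored
    B = C - 0.5 * dt * K
    for n in range(n_steps):
        rhs = B @ T + 0.5 * dt * (Q(n * dt) + Q((n + 1) * dt))
        T = np.linalg.solve(A, rhs)
    return T

# two-node toy example with assumed matrices and a constant heat load
C = np.eye(2)
K = np.array([[2.0, -1.0], [-1.0, 2.0]])
print(crank_nicolson_heat(C, K, lambda t: np.array([1.0, 0.0]),
                          T0=[0.0, 0.0], dt=0.05, n_steps=200))
```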