Similar Documents
20 similar documents found (search time: 15 ms).
1.
The impact of steady-state multiplicities on the control of a simulated industrial-scale methyl acetate reactive distillation (RD) column is studied. At a fixed reflux rate, output multiplicity, with multiple output values for the same reboiler duty, causes the column to drift to an undesirable steady state under open-loop operation. This drift is avoided under a fixed reflux ratio policy. Input multiplicity, where multiple input values give the same output, leads to “wrong” control action under feedback control, severely compromising control-system robustness. A new metric, rangeability, is defined to quantify the severity of input multiplicity in a steady-state input–output (IO) relation. Rangeability is used in conjunction with conventional sensitivity analysis to design robust control structures for the RD column. Results for the two synthesized control structures show that controlling the most sensitive reactive tray temperature gives poor robustness: its low rangeability causes “wrong” control action for large disturbances. Controlling a reactive tray temperature with acceptable sensitivity but larger rangeability gives better robustness. It is also shown that controlling the difference in temperature between two suitably chosen reactive trays further improves the robustness of both structures, as input multiplicity is avoided altogether. The article highlights the importance of IO relations for control-system design and for understanding the complex dynamic behavior of RD systems.
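A minimal sketch of how input multiplicity can be flagged from a tabulated steady-state IO relation: a curve that both rises and falls attains some output level at more than one input value. This simple monotonicity check is an illustrative assumption, not the paper's rangeability metric.

```python
import numpy as np

def has_input_multiplicity(outputs):
    """True if the tabulated steady-state IO curve is non-monotonic,
    i.e. some output level is attained at more than one input value."""
    dy = np.diff(np.asarray(outputs, dtype=float))
    # A curve that both rises and falls somewhere revisits output levels.
    return bool(np.any(dy > 0) and np.any(dy < 0))

# Monotone IO curve: every output corresponds to a unique input.
print(has_input_multiplicity([0.0, 1.0, 2.0, 3.0]))   # False
# Humped IO curve: outputs on the way down repeat outputs from the way up.
print(has_input_multiplicity([0.0, 2.0, 1.0, 0.5]))   # True
```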

2.
Assembly line balancing is a classic ill-structured problem for which total enumeration is infeasible and optimal solutions are uncertain at industrial scale. A quantitative approach to classifying problem difficulty and solution quality is therefore important. Two existing measures of difficulty, order strength and the West ratio, are compared to a new compound expression of difficulty, the project index. The project index is based on separate assessments of precedence (precedence index) and task time (task time index); its current working definition is given. Early criteria for judging assembly lines use balance delay and the smoothness index; both are flawed as criteria. Line efficiency and balance efficiency are developed as more appropriate alternatives. The project index, line efficiency and balance efficiency are illustrated for a published test case examined with the “ALine” balancing package. The potential for a “learning” approach, selecting models to suit problems using the measures of difficulty, forms part of the conclusions of this paper.
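For reference, the classical criteria the paper criticizes can be stated concretely. The sketch below, with hypothetical station data, uses the textbook definitions of line efficiency, balance delay, and smoothness index; the paper's own project index and balance efficiency are defined in the paper and are not reproduced here.

```python
import math

def line_metrics(station_times, cycle_time):
    """Classic line-balancing criteria for a set of workstations.

    Textbook formulas only; illustrative, not the paper's new measures.
    """
    n = len(station_times)
    total_work = sum(station_times)
    t_max = max(station_times)
    # Line efficiency: fraction of available station time doing useful work.
    line_efficiency = total_work / (n * cycle_time)
    # Balance delay: the idle fraction, i.e. the complement of efficiency.
    balance_delay = 1.0 - line_efficiency
    # Smoothness index: RMS-style deviation from the most loaded station.
    smoothness = math.sqrt(sum((t_max - t) ** 2 for t in station_times))
    return line_efficiency, balance_delay, smoothness

# Hypothetical 4-station line with a 10-minute cycle time.
eff, delay, smooth = line_metrics([9, 8, 10, 7], cycle_time=10)
print(eff, delay, smooth)  # 0.85, 0.15, sqrt(14) ≈ 3.742
```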

3.
Software co-evolution can be characterised as a way to “adjust” any given software implementation to a change (“shift”) in the software requirements. In this paper, we propose a formal definition of evolution complexity to precisely quantify the cost of adjusting a particular implementation to such a shift. As a validation, we show that this definition formalises intuition about the evolvability of design patterns.

4.
Population learning in dynamic economies with endogenous network formation has traditionally been studied in basic settings where agents face quite simple and predictable strategic situations (e.g. coordination). In this paper, we instead begin to explore economies where the payoff landscape is very complicated (rugged). We propose a model where the payoff to any agent changes in an unpredictable way as soon as any small variation occurs in the strategy configuration within its network. We study population learning where agents: (i) are allowed to periodically adjust both the strategy they play in the game and their interaction network; (ii) employ simple criteria (e.g. statistics such as MIN, MAX and MEAN) to myopically form expectations about their payoff under alternative strategy and network configurations. Computer simulations show that: (i) allowing for endogenous networks yields higher average payoffs than static networks; (ii) populations learn by employing network updating as a “global learning” device, while strategy updating is used for “fine tuning”; (iii) the statistic employed to evaluate payoffs strongly affects the efficiency of the system, i.e. whether it converges to a unique steady state or to multiple ones; (iv) for some classes of statistics (e.g. MIN or MAX), the likelihood of efficient population learning depends strongly on whether agents are change-averse when discriminating between options associated with the same expected payoff.
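A minimal sketch of the expectation-formation step in point (ii) above: each candidate (strategy, network) configuration is scored by reducing its sampled payoffs with a simple statistic. The option labels and payoff samples are illustrative assumptions, not the paper's calibration; note that Python's `max` keeps the first of tied options, which is one crude way to model a change-averse agent that sticks with its current configuration on ties.

```python
# Map each statistic name to a reduction over a list of sampled payoffs.
STATISTICS = {
    "MIN": min,
    "MAX": max,
    "MEAN": lambda xs: sum(xs) / len(xs),
}

def best_option(options, stat="MEAN"):
    """Pick the option whose payoff sample scores highest under `stat`.

    `options` maps an option label to a list of payoffs sampled for it.
    Ties resolve to the first-listed option (e.g. the current one).
    """
    reduce_fn = STATISTICS[stat]
    return max(options, key=lambda o: reduce_fn(options[o]))

# Hypothetical agent comparing keeping its network vs. rewiring it.
payoffs = {"keep": [0.4, 0.6], "rewire": [0.1, 0.9]}
print(best_option(payoffs, "MIN"))  # cautious agent stays: "keep"
print(best_option(payoffs, "MAX"))  # optimistic agent moves: "rewire"
```

The choice of statistic clearly changes behaviour even on this two-option example, which is the mechanism behind finding (iii) above.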

5.
The socio-economic dimensions of ICT-driven educational change
This paper analyses the varied socio-economic implications of ICT-based educational change. Drawing on a rich, three-year research project with 20 secondary schools throughout Europe, it discusses the social, human, professional, institutional, and economic costs of building the school of tomorrow in close alliance with ICT. The aim of this paper is to show the real costs involved in such a comprehensive model of educational change, which cannot be reduced to the cost of installing computers in classrooms. Rather, it must capture the varied long-term requirements of educational change in conjunction with ICT. Great emphasis is placed on questions concerning the very sustainability of innovation and the necessity of adopting a long-term perspective that provides a realistic socio-economic evaluation. We argue that the real costs of educational change only become apparent when short-term improvements have been converted into sustainable changes that last beyond a project’s lifetime. Key aspects of lasting contributions are identified, among which “network building” and applying a “bottom-up strategy” for change are given particular importance.

6.
This paper deals with a human-assisted knowledge extraction method for extracting “if…then…” rules from a small set of machining data. The presented method utilizes both probabilistic reasoning and fuzzy logical reasoning to benefit from the machining data as well as from the judgment and preferences of a machinist. Using the extracted rules, one can determine the values of operational parameters (feed, cutting velocity, etc.) that ensure the desired machining performance (e.g., keeping surface roughness within a stipulated range such as “moderate”). The usefulness of the method is demonstrated by applying it in a real-life machining knowledge extraction situation and comparing it with an inductive-learning-based knowledge extraction method (ID3). As the concept of manufacturing automation shifts toward “how to support humans by computers”, the presented method provides valuable hints to the developers of future computer-integrated manufacturing systems.

7.
An approach to the discrete location problem is presented. The formulation is based on the set partitioning problem, with emphasis placed on the weighted objective function and on the importance of the “value” or “attractiveness” of the potential “covers”, i.e. the number of centres served by a specific location. FORTRAN programs were written to generate the potential subsets of centres to be served, for a specific county in northern Greece, and a mathematical programming package (APEX III) was used in its integer programming mode to provide solutions for various sets of subsets of centres generated according to certain criteria.

8.
Extraction of geometric characteristics for manufacturability assessment
One of the advantages of feature-based design is that it provides data, defined as parameters of features, in readily available forms for tasks from design through manufacturing. It can thus facilitate the integration of CAD and CAM. However, not all design features are features required in downstream applications, and not all parameters or data can be predefined in the features. A significant example is a property formed by feature interactions: the interaction of a positive feature and a negative feature, for instance, induces a wall-thickness change that might cause defects in a part. Identifying such wall-thickness changes by detecting the feature interaction is therefore required in moldability assessment.

The work presented in this paper deals with the extraction of geometric characteristics in feature-based design for manufacturability assessment. We focus on the manufacturability assessment of discrete parts, with emphasis on a net shape process: injection molding. The definition, derivation and representation of the spatial relationships between features are described. The geometric characteristics formed by feature interactions are generalized into significant items, such as “depth”, “thickness” and “height”, based on a generalization of feature shapes. Reasoning about feature interactions and extraction of geometric characteristics is treated as a refinement procedure. High-level spatial relationships (“is_in”, “adjacent_to” and “coplanar”) as well as their geometric details are first derived. The significant items formed from feature interactions are then computed based on the detailed spatial relationships. This work was implemented in a computer-aided concurrent product and process development environment to support molding product design assessment.

9.
FLASK-SG, a chemical equilibrium calculation program for metamorphic petrology, was written for Unix variants (Linux, IRIX, Tru64 UNIX) and has also been ported to Windows 95/98. The user specifies a temperature, a pressure, and substance amounts (in moles of any chemical formula in the C–H–O–Si–Al–Ti–Fe–Mn–Mg–Ca–Na–K system); the program then calculates the stable mineral assemblage, mineral amounts, and gas composition under the given conditions by Gibbs free energy minimization with the Holland and Powell (1990) data set. The search algorithm for the stable mineral assemblage is the Metropolis Monte Carlo method. The coding language is C++, and an experimental object-oriented programming style is adopted so that the main program part forms a class library. Model-dependent functions such as fugacity coefficients and activities are implemented as virtual methods of the “systems” class, so they can easily be changed by overriding them in classes inherited from the “systems” class. These characteristics are aimed at a future “simulation kit”.
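The Metropolis Monte Carlo acceptance rule mentioned above is standard and can be sketched briefly. This is a generic Python illustration of the criterion, not FLASK-SG's actual C++ code; the use of molar Gibbs energy differences and an ideal gas constant is an assumption for the example.

```python
import math
import random

def metropolis_accept(delta_g, temperature, rng=random.random):
    """Metropolis criterion: always accept a move that lowers the Gibbs
    free energy; otherwise accept with probability exp(-dG / RT).

    delta_g in J/mol, temperature in K. Generic sketch only.
    """
    R = 8.314  # gas constant, J/(mol*K)
    if delta_g <= 0.0:
        return True  # downhill moves are always accepted
    # Uphill moves are accepted with Boltzmann probability, which lets
    # the search escape local minima of the free-energy surface.
    return rng() < math.exp(-delta_g / (R * temperature))

print(metropolis_accept(-100.0, 800.0))  # downhill: always True
```

In a full minimizer this test would sit inside a loop that proposes random changes to the mineral assemblage and tracks the lowest-energy state seen.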

10.
As digital interfaces increasingly mediate our access to information, their design becomes increasingly important. Designing digital interfaces requires writers to make rhetorical choices that are sometimes technical in nature and often correspond to principles taught in the computer science subfield of human-computer interaction (HCI). We propose that an HCI-informed writing pedagogy can complicate, for both writing and computer science students, the important role audience should play when designing traditional and digital interfaces. Although it is a subtle shift in many ways, this pedagogy seemed to complicate student understanding of the relationship between audience and the texts/interfaces they created: it was not just the “human” (beliefs, attitudes, values, demographics) or the “computer” (the software, hardware, or other types of mediation) that mattered, but rather the “interaction” between the two. First we explore some of the ways in which writing code and writing prose have merged and paved the way for an HCI-informed writing pedagogy. Next we examine some parallels between human-computer interaction principles and composition principles. Finally, we refer to assignments, student responses, and anecdotal evidence from our classes where an HCI-informed writing pedagogy drew—or could have drawn—student attention more acutely to various audience-related technical and rhetorical interface design choices.

11.
Efficient constrained local model fitting for non-rigid face alignment
Active appearance models (AAMs) have demonstrated great utility when employed for non-rigid face alignment/tracking. The “simultaneous” algorithm for fitting an AAM achieves good non-rigid face registration performance but poor real-time performance (2–3 fps). The “project-out” algorithm for fitting an AAM achieves faster-than-real-time performance (>200 fps) but suffers from poor generic alignment performance. In this paper we introduce an extension to a discriminative method for non-rigid face registration/tracking referred to as a constrained local model (CLM). Our proposed method achieves performance superior to the “simultaneous” AAM algorithm along with real-time fitting speeds (35 fps). To gain this performance, we improve upon the canonical CLM formulation in a number of ways by employing: (i) linear SVMs as patch experts, (ii) a simplified optimization criterion, and (iii) a composite rather than additive warp update step. Most notably, our simplified optimization criterion for fitting the CLM divides the problem of finding a single complex registration/warp displacement into that of finding N simple warp displacements. From these N simple warp displacements, a single complex warp displacement is estimated using a weighted least-squares constraint. Another major advantage of this simplified optimization stems from its ability to be parallelized, a step which we also explore theoretically in this paper. We refer to our approach for fitting the CLM as the “exhaustive local search” (ELS) algorithm. Experiments were conducted on the CMU MultiPIE database.
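The weighted least-squares combination step described above can be sketched generically: N independent local displacement estimates are fused into a single global warp update. The linear warp model, variable names, and weighting scheme below are illustrative assumptions, not the paper's exact ELS formulation.

```python
import numpy as np

def combine_displacements(J, dx, w):
    """Solve argmin_p || W^(1/2) (J p - dx) ||^2 for warp parameters p.

    J  : (2N, m) Jacobian mapping warp parameters to landmark motion
    dx : (2N,)   stacked local (patch-expert) displacement estimates
    w  : (2N,)   per-measurement confidence weights

    Normal equations of weighted least squares: (J^T W J) p = J^T W dx.
    """
    W = np.diag(w)
    A = J.T @ W @ J
    b = J.T @ W @ dx
    return np.linalg.solve(A, b)

# Trivial sanity check: identity Jacobian and unit weights recover dx.
p = combine_displacements(np.eye(2), np.array([1.0, 2.0]), np.ones(2))
print(p)  # [1. 2.]
```

In practice each of the N local searches can run independently, which is what makes the estimation step parallelizable.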

12.
An essential prerequisite to constructing a manifold trihedral polyhedron from a given natural (or partial-view) sketch is the solution of the “wireframe sketch from a single natural sketch” (WSS) problem, which is the subject of this paper. Published solutions view WSS as an “image-processing”/“computer-vision” problem, placing emphasis on analyzing the given input (natural sketch) using various heuristics. This paper proposes a new WSS method based on robust tools from graph theory, solid modeling and Euclidean geometry. The focus is on producing a minimal wireframe sketch that corresponds to a topologically correct polyhedron.

13.
This paper provides a case study of a multilingual knowledge management system for a large organization. In doing so, we elicit what it means for a system to be “multilingual” and how that changes some previous research on knowledge management. Some researchers have taken multilingual to mean a multilingual user interface. However, that is only a small part of the story. In this case we find that “multilingual” also refers to a broad range of capabilities, including multilingual knowledge resources, multilingual feedback from users, multilingual search, multilingual ontologies and other concerns.

14.
This two-part paper is concerned with the analysis and achievement of human-like behavior by robot arms (manipulators). The analysis involves three issues: (i) the resolution of the inverse kinematics problem of redundant robots, (ii) the separation of the end-effector's motion into two components, a smooth (low-acceleration) component and a fast (accelerated) component, and (iii) the fatigue of the motors (actuators) of the robot joints. In the absence of fatigue, human-like performance is achieved by partitioning the robot joints into “smooth” and “accelerated” ones (called distributed positioning, DP). Actuator fatigue is represented by the so-called “virtual fatigue” (VF) concept. When fatigue sets in, human-like performance is achieved by engaging the less-fatigued joints (motors) more heavily, as the human arm does. Part I of the paper provides the theoretical basis of the above approach, while Part II applies it to the handwriting task and provides extensive simulation results that support the theoretical expectations.
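One standard way to realize such a partitioning of joints is a weighted pseudoinverse resolution of the kinematic redundancy, sketched below. The weighting scheme and names are illustrative assumptions, not the paper's DP or VF formulation: joints assigned a larger weight penalty move less, mimicking the “smooth” (or more fatigued) joints.

```python
import numpy as np

def weighted_ik_step(J, x_dot, weights):
    """Joint velocities minimizing q'^T W q' subject to J q' = x_dot.

    Uses the weighted right pseudoinverse  W^-1 J^T (J W^-1 J^T)^-1.
    J       : (k, n) manipulator Jacobian (n joints, k task DOFs, n > k)
    x_dot   : (k,)   desired end-effector velocity
    weights : (n,)   per-joint penalties; larger weight => joint moves less
    """
    W_inv = np.diag(1.0 / np.asarray(weights, dtype=float))
    return W_inv @ J.T @ np.linalg.solve(J @ W_inv @ J.T, x_dot)

# Planar 2-joint arm tracking a 1-DOF task: equal weights split the
# motion evenly; penalizing joint 1 shifts the motion onto joint 2.
J = np.array([[1.0, 1.0]])
print(weighted_ik_step(J, np.array([1.0]), [1, 1]))  # [0.5 0.5]
print(weighted_ik_step(J, np.array([1.0]), [4, 1]))  # [0.2 0.8]
```

With equal weights this reduces to the ordinary pseudoinverse; raising a joint's weight over time is one simple way to model growing fatigue.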

15.
Defining operational semantics for a process algebra is often based either on labeled transition systems that account for interaction with a context or on so-called reduction semantics: we assume a representation of the whole system and compute unlabeled reduction transitions (leading to a distribution over states in the probabilistic case). In this paper we consider mixed models with states where the system is still open (towards interaction with a context) and states where the system is already closed. The idea is that (open) parts of a system “P” can be closed via an operator “PG” that turns already-synchronized actions whose “handle” is specified inside “G” into prioritized reduction transitions (and, therefore, states performing them into closed states). We show that the operator “PG” can express multi-level priorities and external probabilistic choices (by assigning weights to handles inside G), and that, by considering reduction transitions as the only unobservable τ transitions, the proposed technique is compatible, for process algebras with general recursion, with both standard (probabilistic) observational congruence and a notion of equivalence which aggregates reduction transitions in a (much more aggregating) trace-based manner. We also observe that the trace-based aggregated transition system can be obtained directly in operational semantics, and we present this “aggregating” semantics. Finally, we discuss how the open/closed approach can also be used to express discrete and continuous (exponential probabilistic) time, and we show that, in such timed contexts, the trace-based equivalence aggregates more than traditional lumping-based equivalences over Markov chains.

16.
17.
Towards an algebraic theory of information integration
Information integration systems provide uniform interfaces to a variety of heterogeneous information sources. For query answering in such systems, the current generation of query answering algorithms in local-as-view (source-centric) information integration systems all produce what has been thought of as “the best obtainable” answer, given that the source-centric approach introduces incomplete information into the virtual global relations. However, this “best obtainable” answer does not include all information that can be extracted from the sources, because it does not allow partial information. Neither does the “best obtainable” answer allow for composition of queries, meaning that querying the result of a previous query is not equivalent to the composition of the two queries. In this paper, we provide a foundation for information integration based on the algebraic theory of incomplete information. Our framework allows us to define the semantics of partial facts and introduce the notion of the exact answer, that is, the answer that includes partial facts. We show that querying under the exact-answer semantics is compositional. We also present two methods for actually computing the exact answer. The first method is tableau-based and generalizes the “inverse-rules” approach. The second, much more efficient, method generalizes the rewriting approach and is based on the partial containment mappings introduced in the paper.

18.
Both the hardware and the software available for digital geological mapping (DGM) have advanced considerably in recent years. Mobile computers have become cheaper, lighter, faster and more power-efficient. Global Positioning System (GPS) receivers have become cheaper, smaller and more accurate, and software specifically designed for geological mapping has become available. These advances have now reached a stage where it is effective to replace traditional paper-based mapping techniques with DGM methodologies. This paper assesses and evaluates two currently available DGM systems for geological outcrop mapping: one based on a Personal Digital Assistant (PDA) running ESRI “ArcPad”, and the second based on a Tablet PC running “Map IT” software. Evaluation was based on field assessment during mapping of a well-exposed coastal section of deformed Carboniferous and Permian rocks at N. Tynemouth in NE England. Prior to the field assessment, several key criteria were identified as essential attributes of an effective DGM system; these criteria formed the basis of the assessment and evaluation process. Our findings suggest that the main concerns presented by sceptics opposed to DGM have largely been resolved.

In general, DGM systems using a Tablet PC were found to be most suitable for a wide range of geological data collection tasks, including detailed outcrop mapping. In contrast, PDA-based systems, with their small screens and limited processing power, were best suited to more basic mapping and simple data collection tasks. PDA-based systems can, however, be particularly advantageous for mapping projects in remote regions, where power supply is limited or where the total weight of equipment is an important consideration.

19.
Online analytical processing (OLAP) has become a primary component of today’s pervasive decision support systems. As the underlying databases grow into the multi-terabyte range, however, single-CPU OLAP servers are being stretched beyond their limits. In this paper, we present a comprehensive model for a fully parallelized OLAP server. Our multi-node platform consists of a series of largely independent sibling servers that are “glued” together with a lightweight MPI-based Parallel Service Interface (PSI). Physically, we target the commodity-oriented, “shared nothing” Linux cluster, an architecture that provides an extremely cost-effective alternative to the “shared everything” commercial platforms often used in high-end database environments. Experimental results demonstrate both the viability and the robustness of the design.

20.
This research contributes to the theoretical basis for the appropriate design of computer-based, integrated planning information systems and provides a framework for integrating relevant knowledge, theory, methods, and technology. Criteria for appropriate system design are clarified. The requirements for a conceptual system design are developed based on “diffusion of innovation” theory, lessons learned in the adoption and use of existing planning information systems, current information-processing technology (including expert system technology), and methodology for evaluating mitigation strategies for disaster events. Research findings focus on the assessment of new information systems technology. Chief among these findings are the utility of case-based reasoning for discovering and formalizing the meta-rules needed by expert systems, the role of “diffusion of innovation” theory in establishing design criteria, and the definition of the client interests served by integrated planning information systems. The work concludes with the selection of a prototyping exercise; the prototype is developed in a forthcoming technical paper (Masri & Moore, 1994).


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号