Similar documents
Found 20 similar documents (search time: 15 ms)
1.
The development of complex models can be greatly facilitated by the utilization of libraries of reusable model components. In this paper we describe an object-oriented module specification formalism (MSF) for implementing archivable modules in support of continuous spatial modeling. This declarative formalism provides the high level of abstraction necessary for maximum generality, provides enough detail to allow a dynamic simulation to be generated automatically, and avoids the “hard-coded” implementation of space-time dynamics that makes procedural specifications of limited usefulness for specifying archivable modules. A set of these MSF modules can be hierarchically linked to create a parsimonious model specification, or “parsi-model”. The parsi-model exists within the context of a modeling environment (an integrated set of software tools which provide the computer services necessary for simulation development and execution), which can offer simulation services that are not possible in a loosely-coupled “federated” environment, such as graphical module development and configuration, automatic differentiation of model equations, run-time visualization of the data and dynamics of any variable in the simulation, transparent distributed computing within each module, and fully configurable space-time representations. We believe this approach has great potential for bringing the power of modular model development into the collaborative simulation arena.

2.
Environmental modelling is done more and more by practising ecologists rather than computer scientists or mathematicians. This is because there is a broad spectrum of development tools available that allow graphical coding of complex models of dynamic systems and help to abstract from the mathematical issues of the modelled system and the related numerical problems of estimating solutions. In this contribution, we study how different modelling tools treat a test system, a highly non-linear predator–prey model, and how the numerical solutions vary. We show that solutions (a) differ if different development tools are chosen but the same numerical procedure is selected; (b) depend on undocumented implementation details; (c) vary even for the same tool across different versions; and (d) are generated with no notification of numerical problems even when these could be identified. We conclude that improved documentation of the numerical methods used in modelling software is essential to ensure that process-based models formulated in these modelling packages do not become “black box” models due to uncertainty in integration methods.
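The divergence described in this abstract is easy to reproduce. The sketch below (not the paper's test system; coefficients are invented for illustration) integrates one predator–prey model with two fixed-step schemes, forward Euler and classical RK4, and measures how far each drifts from the system's conserved quantity:

```python
import math

def lotka_volterra(state, a=1.0, b=0.5, c=0.5, d=2.0):
    """Derivatives for a classic predator-prey system."""
    prey, pred = state
    return (a * prey - b * prey * pred,
            c * prey * pred - d * pred)

def euler_step(f, state, h):
    dx, dy = f(state)
    return (state[0] + h * dx, state[1] + h * dy)

def rk4_step(f, state, h):
    def add(s, k, w):
        return (s[0] + w * k[0], s[1] + w * k[1])
    k1 = f(state)
    k2 = f(add(state, k1, h / 2))
    k3 = f(add(state, k2, h / 2))
    k4 = f(add(state, k3, h))
    return (state[0] + h / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            state[1] + h / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

def integrate(step, state, h=0.02, steps=500):
    for _ in range(steps):
        state = step(lotka_volterra, state, h)
    return state

def invariant(state, a=1.0, b=0.5, c=0.5, d=2.0):
    """Quantity conserved exactly by the true Lotka-Volterra flow."""
    x, y = state
    return c * x - d * math.log(x) + b * y - a * math.log(y)

v0 = invariant((1.0, 1.0))
euler_end = integrate(euler_step, (1.0, 1.0))
rk4_end = integrate(rk4_step, (1.0, 1.0))
gap = abs(euler_end[0] - rk4_end[0])
euler_drift = abs(invariant(euler_end) - v0)
rk4_drift = abs(invariant(rk4_end) - v0)
```

With identical equations and step size, the two "tools" disagree on the final state, and Euler's drift from the invariant dwarfs RK4's, which is precisely the kind of undocumented difference the abstract warns about.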

3.
Integrated assessment and its inherent platform, integrated modelling, present an opportunity to synthesize diverse knowledge, data, methods and perspectives into an overarching framework to address complex environmental problems. However, to be successful for assessment or decision making purposes, all salient dimensions of integrated modelling must be addressed with respect to its purpose and context. The key dimensions include: issues of concern; management options and governance arrangements; stakeholders; natural systems; human systems; spatial scales; temporal scales; disciplines; methods, models, tools and data; and sources and types of uncertainty. This paper aims to shed light on these ten dimensions, and on how integration of the dimensions fits into the four main phases of the integrated assessment process: scoping, problem framing and formulation, assessing options, and communicating findings. We provide examples of participatory processes and modelling tools that can be used to achieve integration.

4.
The introduction of new technologies and concepts has redefined the relative positioning of information systems (IS) and decision technologies in a corporate context. Corporate IS have been extended to include not only transaction processing databases but also analytical databases, often known as Data Warehouses. On-line analytical processing (OLAP), as introduced by Codd et al. [E.F. Codd, S.B. Codd, C.T. Salley, Providing On-Line Analytical Processing to User–Analysts: An IT Mandate, E.F. Codd and Associates, 1993], is capable of capturing the structure of real world data in the form of multidimensional tables, known as 'datacubes' to management information systems (MIS) and statistical systems specialists. Manipulation and presentation of such information through multidimensional views and graphical displays provide invaluable support for the decision-maker. We illustrate the natural coupling which exists between the data modelling, symbolic modelling and 'What if' analysis phases of a decision support system (DSS). In particular, we explore the power of the roll-up and drill-down features of OLAP and show how these translate into aggregation and disaggregation of the underlying decision models. Our approach sets out a paradigm for analysing the data, applying DSS tools and progressing through the information value chain to create organisational knowledge.
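The roll-up and drill-down operations central to this abstract can be sketched in a few lines. Below is a toy "datacube" held as a list of records; roll-up aggregates the measure over the dimensions that are dropped, and drill-down is simply a request for a finer dimension set. The field names (region, quarter, product, revenue) are invented for the demo, not taken from the paper:

```python
from collections import defaultdict

sales = [
    {"region": "North", "quarter": "Q1", "product": "A", "revenue": 100},
    {"region": "North", "quarter": "Q1", "product": "B", "revenue": 150},
    {"region": "North", "quarter": "Q2", "product": "A", "revenue": 120},
    {"region": "South", "quarter": "Q1", "product": "A", "revenue": 80},
    {"region": "South", "quarter": "Q2", "product": "B", "revenue": 90},
]

def roll_up(records, dims, measure="revenue"):
    """Aggregate `measure` over every dimension not listed in `dims`."""
    cube = defaultdict(int)
    for r in records:
        key = tuple(r[d] for d in dims)
        cube[key] += r[measure]
    return dict(cube)

by_region = roll_up(sales, ["region"])                      # coarse view
by_region_quarter = roll_up(sales, ["region", "quarter"])   # drilled down
```

Note that the grand total is preserved across views; this invariant is what lets an OLAP front end translate a user's roll-up into a consistent aggregation of the underlying decision model.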

5.
Newer approaches for modelling travel behaviour require a new approach to integrated spatial economic modelling. Travel behaviour modelling is increasingly disaggregate, econometric, dynamic, and behavioural. A fully dynamic approach to urban system modelling is described, where interactions are characterized as two agents interacting through discrete events labelled as “offer” or “accept”. This leads to a natural partition of an integrated urban model into submodels based on the category of what is being exchanged, the type of agent, and the time and place of interaction. Where prices (or price-like signals such as congested travel times) exist to stimulate supply and/or to suppress demand, the dynamic change in prices can be represented either behaviourally, as individual agents adjust their expectations in response to their personal history and the history of the modelled region, or with an “auctioneer” from micro-economic theory, who adjusts average prices. When no auctioneers are used, the modelling system can use completely continuous representations of both time and space. Two examples are shown. The first is a demonstration of a continuous-time continuous-space transaction simulation with simple agents representing businesses and households. The second shows how an existing model—the Oregon TLUMIP project for statewide land-use and transport modelling—can be adapted into the paradigm.
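The offer/accept event partition described above can be illustrated with a minimal discrete-event loop. This is a hedged sketch, not the paper's model: the goods, prices, and times are arbitrary demo values, and a real urban simulation would have many agents and matching rules.

```python
import heapq

events = []  # priority queue of (time, kind, payload), ordered by time

def schedule(time, kind, payload):
    heapq.heappush(events, (time, kind, payload))

# A seller offers a dwelling at t=1.0; a buyer tries to accept at t=2.5
# and transacts only if the asking price fits its budget.
schedule(1.0, "offer", {"good": "dwelling", "price": 200})
schedule(2.5, "accept", {"good": "dwelling", "budget": 250})

offers = {}  # outstanding offers by good
log = []     # trace of what happened, in simulation time order

while events:
    t, kind, payload = heapq.heappop(events)
    if kind == "offer":
        offers[payload["good"]] = payload["price"]
        log.append((t, "offer", payload["good"]))
    elif kind == "accept":
        price = offers.get(payload["good"])
        if price is not None and price <= payload["budget"]:
            log.append((t, "transaction", payload["good"]))
```

Because events carry their own timestamps, nothing forces time onto a fixed grid, which is the property the abstract exploits for fully continuous time representations.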

6.
Quantitative design is crucial to ICT and it is therefore important to integrate performance modelling techniques into support environments that facilitate the correct construction of computer systems. We consider Performance Modelling Interchange Formats (PMIFs), which allow models to be specified in a uniform way and ported to a number of tools that solve them. We focus on extending the class of models describable in a PMIF that can be solved analytically – specifically, yielding a product-form solution for their equilibrium state probabilities. We use an extension of an established theorem, called the ‘reversed compound agent theorem’ (RCAT) as the basis of the analytical modelling tool into which the extended PMIF feeds models. We describe the RCAT methodology in practical terms, how it is integrated into an extended PMIF, and illustrate our methodology with three examples.
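For readers unfamiliar with product-form solutions, the sketch below shows the simplest case such theorems cover: two M/M/1 queues in tandem, whose joint equilibrium distribution factorises into per-queue geometric terms. The rates are arbitrary illustrative values (stability requires the arrival rate to be below both service rates); this is background for the abstract, not the paper's RCAT machinery.

```python
lam = 1.0            # external arrival rate
mu1, mu2 = 2.0, 4.0  # service rates of the two queues in tandem
rho1, rho2 = lam / mu1, lam / mu2  # per-queue utilisations, both < 1

def pi(n1, n2):
    """Product-form joint equilibrium probability pi(n1, n2):
    the product of the two marginal geometric distributions."""
    return (1 - rho1) * rho1**n1 * (1 - rho2) * rho2**n2

# Sanity check: the joint probabilities over a generously truncated
# state space sum to ~1, and pi(0, 0) is the product of the two
# per-queue empty probabilities.
total = sum(pi(i, j) for i in range(60) for j in range(60))
p_empty = pi(0, 0)
```

The practical payoff of a product form is exactly this factorisation: equilibrium probabilities come from closed-form per-queue terms instead of solving the full joint state space numerically.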

7.
Supply chain excellence has a major impact on business strategy. Building supply chains (SCs) as flexible systems represents one of the most exciting opportunities to create value (e.g., seamless SCs). This requires integrated decision making amongst autonomous chain partners, with effective decision knowledge sharing among them. The key to success lies in knowing which decisions have the greatest impact on supply chain performance. Knowledge sharing has immense potential to create expedient opportunities and thus retain greater value for supply chains. In this context, knowledge management (KM) can be used as an effective approach to achieve knowledge sharing and decision synchronization among supply chain partners. To maximize competitive advantage, the concept of seamless supply chains is emerging with KM as a key enabler. There is thus a need to develop demonstration models that can encourage chain members towards collaborative knowledge sharing in SCs. This paper presents the application of one such model, based on decision knowledge sharing (DKS), for improved supply chain management. We study the impact of DKS (both partial and full DKS configurations in the SC) and compare its performance with information sharing (IS) and forecasting. Better performance can be achieved by exploiting DKS and flexibility in supply chain structures. The paper develops demonstration models for various supply chain scenarios (1st-, 2nd- and 3rd-stage SCs; forecasting; IS; and full and partial DKS). The partial and full DKS-based flexibility configurations of SCs are considered for simulation experimentation, and a simulation model of a supply chain based on a flexible framework is developed for demonstration purposes. The key results are highlighted along with their industry implications. Our research is continuing in this direction.

8.
Exploring the inter-linkages of water, soil and waste resources and advancing an integrated management (or Nexus) approach requires integrated modelling tools. Numerous models are available dealing with specific environmental processes, covering certain spatial and temporal scales and applying different mathematical process-describing relationships. However, finding and selecting the most appropriate (suite of) model(s) for a particular purpose is challenging, since current inventories do not allow any interactive filtering or model comparison. Therefore, we developed an interactive web-based platform, called the Nexus Tools Platform (NTP), for inter-model comparison of existing modelling tools, which provides detailed information and allows for advanced filtering based on real-time visualizations. The alpha version of NTP (September 2015) comprises 73 models and covers a wide range of model types and environmental processes. We present selected NTP application examples showing how to find and select the most appropriate modelling tools for a specific application using meta-analysis.

9.
10.
Scientific research involves mathematical modelling in the context of an interactive balance between theory, experiment and computation. However, computational methods and tools are still far from being appropriately integrated into the high school and university curricula in science and mathematics. In this paper, we discuss the relevance of mathematical modelling and illustrate how a computer modelling tool (Modellus, a free tool available on the Internet and developed at FCTUNL) can be used to embed modelling in high school and undergraduate courses. Modellus allows students to create and explore mathematical models using functions, differential and iterative equations, and to visualize the behaviour of mathematical objects.
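As background for the kind of iterative model the abstract mentions, the sketch below shows what a student might build in such a tool: a discrete logistic growth equation iterated step by step, so the trajectory toward the carrying capacity can be tabulated or plotted. This is a generic classroom example, not taken from Modellus itself; the parameter values are arbitrary.

```python
def iterate_logistic(x0, r, k, steps):
    """Iterative model x_{n+1} = x_n + r * x_n * (1 - x_n / k):
    growth at rate r, saturating at carrying capacity k."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x + r * x * (1 - x / k))
    return xs

# A population starting at 10 grows toward the carrying capacity 100.
trajectory = iterate_logistic(x0=10.0, r=0.3, k=100.0, steps=50)
```

The pedagogical point is that the iteration rule is the model: students can change r or k, re-run, and immediately see how the behaviour of the trajectory responds.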

11.
12.
A theory of application service provider (ASP) use from a client perspective
Since the late 1990s, “application service providers” (ASPs) have offered telecommunication-based application services. ASP use can be considered a type of IS outsourcing as well as the creation of an inter-organizational system, such as is created for electronic data interchange (EDI). We developed a theory of ASP adoption and use from the client’s perspective based on analysis of primary and secondary data on ASP use and analysis of literature on IS outsourcing and EDI. We present a model and highlight similarities and differences between conventional IS outsourcing, ASP use, and EDI.

13.
The evolution of web-based optimisation: From ASP to e-Services
In many application domains, optimisation and other analytic models are often embedded as a decision engine within the respective business processes. In this paper, we study recent trends in the provision of optimisation tools and optimisation-based decision support systems (DSS) as web-enabled distributed applications. We analyse the evolution from the Application Service Provision (ASP) model to the e-Services model, and we illustrate the importance of distributed optimisation components in the effective deployment of embedded “business analytics”. We finally provide an overview of the OSP and the WEBOPT projects, which deliver optimisation-based applications and optimisation components in a distributed environment.

14.
A large number of modelling tools exist for the construction and solution of mathematical models of chemical processes. Each (chemical) process modelling tool provides its own model representation and model definition functions as well as its own solution algorithms, which are used for performing computer-aided studies for the process under consideration. However, in order to support reusability of existing models and to allow for the combined use of different modelling tools for the study of complex processes, model integration is needed. This paper presents a concept for an integration platform that allows for the integration of modelling tools, combining their models to build up a process model and performing computer-aided studies based on this integrated process model. In order to illustrate the concept without getting into complicated algorithmic issues, we focus on steady-state simulation using models comprising only algebraic equations. The concept is realized in the component-based integration platform CHEOPS, which focuses on integrating and solving existing models rather than providing its own modelling capabilities.
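The steady-state integration idea can be sketched in miniature. Below, two black-box "modules" (an ideal mixer and a splitter, stand-ins for models owned by different tools; all names and numbers are invented, and this is not the CHEOPS API) are coupled through a recycle stream, and the integration layer iterates the torn stream to the coupled steady state:

```python
def mixer(feed, recycle):
    """Module 1: ideal mixer, outlet flow = sum of inlet flows."""
    return feed + recycle

def separator(inlet, split=0.3):
    """Module 2: splits the inlet; `split` fraction is recycled."""
    recycle = split * inlet
    product = inlet - recycle
    return product, recycle

def solve_flowsheet(feed, tol=1e-10, max_iter=200):
    """Integration layer: fixed-point iteration on the torn recycle
    stream until the coupled algebraic system converges."""
    recycle = 0.0  # initial guess
    for _ in range(max_iter):
        inlet = mixer(feed, recycle)
        product, new_recycle = separator(inlet)
        if abs(new_recycle - recycle) < tol:
            return product, new_recycle
        recycle = new_recycle
    raise RuntimeError("flowsheet did not converge")

product, recycle = solve_flowsheet(feed=100.0)
```

At steady state the overall mass balance must close (product equals feed), which is a useful convergence check: the integration platform needs no knowledge of the module internals, only their stream interfaces.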

15.
Data modelling reveals the internal structure of an information system, abstracting away from details of the physical representation. We show that entity-relationship modelling, a well-tried example of a data-modelling technique, can be applied to both interactive and noninteractive information artifacts in the domain of HCI. By extending the conventional ER notation slightly (to give ERMIA, Entity-Relationship Modelling for Information Artifacts) it can be used to describe differences between different representations of the same information, differences between users' conceptual models of the same device, and the structure and update requirements of distributed information in a worksystem. It also yields symbolic-level estimates of Card, Pirolli and Mackinlay's index of “cost-of-knowledge” in an information structure, plus a novel index, the “cost-of-update”; these symbolic estimates offer a useful complement to the highly detailed analyses of time costs obtainable from GOMS-like models. We conclude that, as a cheap, coarse-grained, and easy-to-learn modelling technique, ERMIA usefully fills a gap in the range of available HCI analysis techniques.

16.
Time series data mining (TSDM) techniques permit exploring large amounts of time series data in search of consistent patterns and/or interesting relationships between variables. TSDM is becoming increasingly important as a knowledge management tool, where it is expected to reveal knowledge structures that can guide decision making in conditions of limited certainty. Human decision making in problems related to the analysis of time series databases is usually based on perceptions like “end of the day”, “high temperature”, “quickly increasing”, “possible”, etc. Though many effective TSDM algorithms have been developed, the integration of TSDM algorithms with human decision making procedures is still an open problem. In this paper, we consider the architecture of a perception-based decision making system for time series database domains, integrating perception-based TSDM, computing with words and perceptions, and expert knowledge. The new tasks which should be solved by perception-based TSDM methods to enable their integration in such systems are discussed. These tasks include: precisiation of perceptions, shape pattern identification, and pattern retranslation. We show how different methods developed so far in TSDM for manipulation of perception-based information can be used for the development of a fuzzy perception-based TSDM approach. This approach is grounded in computing with words and perceptions, which permits the formalization of human perception-based inference mechanisms. The discussion is illustrated by examples from economics, finance, meteorology, medicine, etc.
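The "precisiation of perceptions" task named in the abstract is commonly handled with fuzzy membership functions. The sketch below turns the perception "high temperature" into a graded membership; the breakpoints (25 °C and 35 °C) are invented for the demo and would in practice come from experts or calibration, as the paper's framework suggests for its own perceptions:

```python
def high_temperature(t_celsius):
    """Membership in the fuzzy set 'high temperature':
    0 at or below 25 C, 1 at or above 35 C, linear in between."""
    if t_celsius <= 25.0:
        return 0.0
    if t_celsius >= 35.0:
        return 1.0
    return (t_celsius - 25.0) / 10.0

readings = [20.0, 28.0, 33.0, 40.0]
memberships = [high_temperature(t) for t in readings]
```

Once perceptions are precisiated this way, a crisp time series can be scanned for episodes where, say, membership stays above 0.8, which is the bridge between human phrasing and algorithmic pattern search.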

17.
Diffuse nutrient emissions from agricultural land are among the major sources of pollution for groundwater, rivers and coastal waters. The quantification of pollutant loads requires mathematical modelling of water and nutrient cycles. The deterministic simulation of nitrogen dynamics, represented by complicated, highly non-linear processes, requires the application of detailed models with many parameters and large associated databases. The operation of those models within integrated assessment tools or decision support systems for large regions is often not feasible. Fuzzy rule based modelling provides a fast, transparent and parameter-parsimonious alternative. In addition, it allows regionalisation and integration of results from different models and measurements at a higher, generalised level and enables explicit consideration of expert knowledge. In this paper an algorithm for the assessment of fuzzy rules for fuzzy modelling using simulated annealing is presented. The fuzzy rule system is applied to simulate nitrogen leaching for selected agricultural soils within the 23,687 km² Saale River Basin. The fuzzy rules are defined and calibrated using results from simulation experiments carried out with the deterministic modelling system SWIM. Monthly aggregated time series of simulated water balance components (e.g. percolation and evapotranspiration), fertilization amounts, resulting nitrogen leaching and crop parameters are used for the derivation of the fuzzy rules. The 30-year simulation period was divided into 20 years for training and 10 years for validation, with the latter taken from the middle part of the period. Three specific fuzzy rule systems were created from the simulation experiments, one for each selected soil profile. Each rule system includes 15 rules as well as one prescribed rule from expert knowledge, and 7 input variables.
The performance of the fuzzy rule system is satisfactory for the assessment of nitrate leaching on annual to long-term time steps. The approach allows rapid scenario analysis for large regions and has the potential to become part of decision support systems for generalised integrated assessment of water and nutrients in macroscale regions.

18.
Real-world tasks are often complex, uncertain, and constrained. The complexity arises from the need to consider numerous factors that are of varying degrees of relevance to the problem at hand. The uncertainty springs from imperfect information concerning the state of the world, the repertoire of feasible alternatives, and the consequences of each action. The constraints are attributable to time, money, and computational resources as well as individual tastes and societal values. Despite the rich nature of practical tasks, previous work in decision making—whether in engineering, statistics, management, or economics—has focused solely on partial aspects of the problem. This state of affairs is reflected in the nomenclature, which involves categories such as “constrained optimization” or “decisions under uncertainty”. If real-world tasks are to be addressed in a coherent fashion, it is imperative to develop a systematic framework providing an integrated view. The framework may then serve as the foundation for a general theory of decision making that can capture the full richness of realistic problems. This paper explores how these goals might be achieved. Algebraic and stochastic models of innovative decision making are presented. This is followed by an examination of idea generation in product design. Finally, suggestions are made for extending the work along both theoretical and empirical lines.

19.
The formulation of a problem may be defined as a process of acquisition and organization of knowledge related to a given situation, on which a decision maker projects some action. The assistance in problem formulation that we may expect within decision support systems is difficult to design and to implement. This is mainly due to the frequent lack of attention to a sufficiently formalized conceptual framework which would consider the decision with a more cognitive-science-oriented approach. In the first part, we present an instrumental model for the study of decision processes as an attempt to simulate the cognitive process of knowledge acquisition and organization carried out by a decision maker facing a problematic situation. Considering its epistemological foundations, this model can be named a “cognitivist model”. Within this model, the decision is defined as a cognitive construction which we call a “decisional construct”. It consists of the elaboration of one or several abstract representations of the problematic situation (formulation phase), and the design of operational models (solving phase). In the second part, we present the COGITA project, which consists of the design and realization of an environment for the development of problem formulation assistance systems. The modelling and simulation of cognitive processes call for relevant techniques originating either in artificial intelligence or in connectionism. We show the main characteristics, potentials, limits and complementarity of these techniques, and why their integration is fundamental and necessary to the simulation of the cognitive process associated with formulation. COGITA is a hybrid system currently under development which integrates symbolic artificial intelligence techniques and connectionist models in a cooperative hybridization, the general architecture of which is presented.

20.
The Rule-Based (RB) and the Artificial Neural Network (ANN) approaches to expert systems development have each demonstrated some specific advantages and disadvantages. These two approaches can be integrated to exploit the advantages and minimize the disadvantages of each method used alone. An RB/ANN integrated approach is proposed to facilitate the development of an expert system which provides a “high-performance” knowledge-based network, an explanation facility, and an input/output facility. In this case study an expert system designed to assist managers in forecasting the performance of stock prices is developed to demonstrate the advantages of this integrated approach and how it can enhance support for managerial decision making.
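The division of labour in such a hybrid can be sketched as follows. This is an illustration of the general RB/ANN pattern, not the case-study system: rules fire on clear-cut cases (and supply a crisp explanation), while a network scores the ambiguous middle ground. Here a single logistic unit with hand-set weights stands in for a trained network, and all thresholds and weights are invented:

```python
import math

def rule_based(pe_ratio, earnings_growth):
    """Return 'buy'/'sell' when a rule fires decisively, else None."""
    if pe_ratio < 8 and earnings_growth > 0.10:
        return "buy"   # cheap and growing: rule fires
    if pe_ratio > 40 and earnings_growth < 0.0:
        return "sell"  # expensive and shrinking: rule fires
    return None        # ambiguous: defer to the network

def ann_score(pe_ratio, earnings_growth):
    """Logistic unit standing in for a trained network in this sketch."""
    z = -0.05 * pe_ratio + 8.0 * earnings_growth + 0.5
    return 1.0 / (1.0 + math.exp(-z))  # probability-like score in (0, 1)

def hybrid_decision(pe_ratio, earnings_growth):
    verdict = rule_based(pe_ratio, earnings_growth)
    if verdict is not None:
        return verdict, 1.0  # rule fired: decision plus crisp rationale
    score = ann_score(pe_ratio, earnings_growth)
    return ("buy" if score >= 0.5 else "sell"), score

d1 = hybrid_decision(6.0, 0.20)   # clear-cut case: a rule decides
d2 = hybrid_decision(20.0, 0.05)  # ambiguous case: the network decides
```

The explanation facility the abstract mentions comes almost for free on the rule path (the fired rule is the explanation), while the network path contributes graded judgement where no rule applies.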


Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号