Similar Documents
Found 20 similar documents (search time: 0 ms)
1.
2.
We present three strong arguments against the ontic interpretation of quantum states. We then show that the appropriate alternative is not an epistemic interpretation, but viewing quantum states as representing the available knowledge about the potentialities of a quantum system from the perspective of a particular point in space. Unlike ordinary knowledge, which requires a knower, available knowledge can be assumed to be present regardless of a knower. The relationship between “perspectives on potentialities” and “the potentialities themselves” is clarified.

3.
Malin proposes a solution to some of the conceptual problems of the foundations of quantum mechanics within the framework of Alfred North Whitehead’s “Philosophy of Organism”. Standard quantum dynamics, governed by the time-dependent Schrödinger equation, does not provide for the reduction of superpositions of physical states and hence does not account for the occurrence of observational data. If consciousness is invoked to explain the results of measurements, it would appear that quantum mechanics is given an anthropocentric interpretation. Reduction of superpositions is achieved without anthropocentrism, according to Malin, by accepting Whitehead’s ontology of “actual occasions”, which are protomental entities independent of and presumably antedating human beings. Furthermore, Whitehead’s philosophy has the great virtue of offering a plausible solution to the profound problem of relating minds to material systems. Shimony is sympathetic to Whitehead’s world view, but with the reservation that it leaves an immense unexplained and unexplored gap between the conjectured “experience” of actual occasions and the high-level experience of the human mind.

4.
This paper analyzes the application of Moran’s index and Geary’s coefficient to the characterization of lung nodules as malignant or benign in computerized tomography images. The characterization method is based on a process that verifies which combination of the proposed measures best discriminates between benign and malignant nodules, using stepwise discriminant analysis. A linear discriminant analysis procedure was then performed on the selected features to evaluate their ability to predict the classification of each nodule. To verify this application, we also describe tests carried out on a sample of 36 nodules: 29 benign and 7 malignant. A leave-one-out procedure was used to provide a less biased estimate of the linear discriminator’s performance. The two analyzed functions and their combinations provided above 90% accuracy and an area under the receiver operating characteristic (ROC) curve above 0.85, which indicates promising potential for use as nodule signature measures. The preliminary results of this approach are very encouraging for characterizing nodules using the two functions presented.
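The two spatial autocorrelation measures named above have standard textbook definitions. The sketch below implements those standard formulas in plain NumPy (it is not the paper's pipeline; the weight matrix and data are illustrative):

```python
import numpy as np

def morans_i(values, weights):
    """Moran's index I = n * sum_ij(w_ij * z_i * z_j) / (sum_ij(w_ij) * sum_i(z_i^2)).

    values:  1-D array of observations (e.g. voxel intensities in a nodule).
    weights: n x n spatial weight matrix, w[i][j] > 0 if i and j are neighbours.
    I > 0 indicates positive spatial autocorrelation.
    """
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = x.size
    z = x - x.mean()
    return n * (w * np.outer(z, z)).sum() / (w.sum() * (z @ z))

def gearys_c(values, weights):
    """Geary's coefficient C = (n-1) * sum_ij(w_ij * (x_i - x_j)^2) / (2 * sum_ij(w_ij) * sum_i(z_i^2)).

    C < 1 indicates positive spatial autocorrelation; C is more sensitive
    to local pairwise differences than Moran's I.
    """
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = x.size
    z = x - x.mean()
    diff2 = (x[:, None] - x[None, :]) ** 2
    return (n - 1) * (w * diff2).sum() / (2.0 * w.sum() * (z @ z))
```

For a monotonically increasing sequence on a path graph, both measures report positive autocorrelation (I > 0, C < 1), as expected.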

5.
In Part B of this paper, planar collision theories, counterparts of the theory associated with Newton’s hypothesis described in Part A, are developed in connection with Poisson’s and Stronge’s hypotheses. First, expressions for the normal and tangential impulses, the normal and tangential velocities of separation, and the change of the system mechanical energy are written for five types of collision. These, together with Routh’s semigraphical method and Coulomb’s coefficient of friction, are used to show that the algebraic signs of the four parameters introduced in Part A span the same five cases of system configuration as in Part A. For each, α determines the type of collision, which, once found, allows the evaluation of the normal and tangential impulses and ultimately the changes in the motion variables. The analysis of the indicated cases shows that for Poisson’s hypothesis a solution always exists which is unique, coherent and energy-consistent. The same applies to Stronge’s hypothesis, though over a narrower range of application. It is thus concluded that Poisson’s hypothesis is superior to Newton’s and Stronge’s hypotheses.

6.
Supply chains are multifaceted structures focusing on the integration of all the factors involved in the overall process of production and distribution of end products to the customers. Growing interest in supply chain systems has highlighted the need to adopt appropriate approaches that can ensure the efficient management of their complexity, enormity and broadness of scope. With the main aim of supply chain management being to optimise the performance of supply chains, attention is mainly drawn to the development of modelling frameworks that can be utilised to analyse and comprehend the dynamic behaviour of supply chains. Since only a few supply chain modelling attempts have been reported in the literature, this paper proposes a modelling framework that is used to simulate the operation of a supply chain network of moderate complexity. The proposed model comprises four echelons and is built around a central medium-sized manufacturing company operating as a typical Make-to-Order (MTO) system. The model was built using a system dynamics (SD) approach. The operations performed within a supply chain are a function of a great number of key variables which often have strong interrelationships. The ability to understand the network as a whole, analyse the interactions between the various components of the integrated system and eventually supply feedback without decomposing it makes system dynamics an ideal methodology for modelling supply chain networks. The objective of the paper is to model the operation of the supply chain network under study and obtain a true reflection of its behaviour. The modelling framework is also used to study the performance of the system under the initial conditions considered and compare it with that obtained by running the system under eight different scenarios concerning commonly addressed real-life operational conditions.
The modelling effort has focused on measuring the supply chain system performance in terms of key metrics such as inventory, WIP levels, backlogged orders and customer satisfaction at all four echelons. The study concludes with the analysis of the obtained results and the conclusions drawn from contrasting the system’s performance under each investigated scenario to that of the benchmark model.
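System dynamics models of the kind described here reduce to stocks integrated over flows. The sketch below is a minimal single-echelon stock-and-flow model with an anchor-and-adjust ordering rule, Euler-integrated in plain Python; all parameter names and values are illustrative, not taken from the paper's four-echelon model:

```python
def simulate_inventory(periods=60, dt=1.0, demand=100.0,
                       target_cover=2.0, adjust_time=4.0, lead_time=2.0):
    """Euler-integrated stock-and-flow model of one supply chain echelon.

    Stocks: inventory, and the supply line of orders placed but not yet received.
    Flows:  order completions (inflow) and shipments (outflow).
    Returns the inventory trajectory over `periods` steps.
    """
    inventory = demand * target_cover          # start at the target stock level
    supply_line = demand * lead_time           # start with the pipeline full
    history = []
    for _ in range(periods):
        shipments = min(inventory / dt, demand)
        target = demand * target_cover
        # anchor-and-adjust heuristic: order expected demand plus a fraction
        # of the gap between target and current inventory
        orders = max(0.0, demand + (target - inventory) / adjust_time)
        completions = supply_line / lead_time  # first-order delay
        inventory += dt * (completions - shipments)
        supply_line += dt * (orders - completions)
        history.append(inventory)
    return history
```

Started at equilibrium, the model holds inventory at its target; perturbing demand or lead time mid-run is how scenario experiments like the paper's eight operational scenarios are typically set up.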

7.
This paper proposes an integrated modelling framework for the analysis of manufacturing systems that can increase the capacity of modelling tools for rapidly creating a structured database with multiple detail levels and thus obtain key performance indicators (KPIs) that highlight possible areas for improvement. The method combines five important concepts: hierarchical structure, quantitative/qualitative analysis, data modelling, manufacturing database and performance indicators. It makes it possible to build a full information model of the manufacturing system, from the shopfloor functional structure down to the basic production activities (operations, transport, inspection, etc.). The proposed method is based on a modified IDEF model that stores all kinds of quantitative and qualitative information. A computer-based support tool has been developed to connect with the IDEF model, automatically creating a relational database through a set of algorithms. This manufacturing data warehouse is oriented towards obtaining a rapid global view of the system through multiple indicators. The developed tool has been provided with different scorecard panels that use KPIs to decide the best actions for continuous improvement. To demonstrate and validate both the proposed method and the developed tools, a case study has been carried out for a complex manufacturing system.

8.
Traditional component manufacturing systems have been optimized for either small-scale craft production or for mass production of a small variety of high-volume parts. Trends towards intermediate volumes and a larger variety of parts have exposed the need for intelligently embedding flexibility in manufacturing systems and processes. The literature offers only a few attempts to value component fabrication flexibility in a systematic way. In this article a 5-step framework for valuing flexibility and ranking manufacturing processes under uncertainty is developed. A discrete-time simulation is used to predict profit, remaining tool value and machine utilization as a function of three probabilistic demand and specification scenarios. A case study demonstrates the simulation and contrasts a high-volume (automotive) and a low-volume (aerospace) market situation across six different processes ranging from punching to laser cutting. It is found that for intermediate, uncertain production volumes, alternative manufacturing processes that embed flexibility carefully in one or more dimensions can outperform traditional processes that are either completely non-flexible (e.g., stamping) or completely flexible (e.g., laser cutting). It is also shown that flexibility in parts manufacturing is a complex topic because flexibility can be embedded in the parts themselves, in tooling or in the process parameters.

9.
This paper deals with collision with friction. In Part A, equations governing a one-point collision of planar, simple nonholonomic systems are generated. Expressions for the normal and tangential impulses, the normal and tangential velocities of separation of the colliding points, and the change of the system mechanical energy are written for three types of collision (i.e., forward sliding, sticking, etc.). These, together with Routh’s semigraphical method and Coulomb’s coefficient of friction, are used to show that the algebraic signs of four newly-defined, configuration-related parameters, not all independent, span five cases of system configuration. For each, the ratio between the tangential and normal components of the velocity of approach, called α, determines the type of collision, which, once found, allows the evaluation of the associated normal and tangential impulses and ultimately the changes in the motion variables. The analysis of these cases indicates that the calculated mechanical energy may increase if sticking or reverse sliding occurs. In Part B, theories based on Poisson’s and Stronge’s hypotheses are presented, with more encouraging results.
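The three restitution hypotheses compared across Parts A and B differ in how the coefficient of restitution e is defined. In the standard textbook notation (not necessarily the symbols used in the paper), with v_n the normal relative velocity, P the normal impulse and W the work of the normal impulse, split at the instant of maximum compression into compression (c) and restitution (r) phases:

```latex
\underbrace{e = -\frac{v_n^{+}}{v_n^{-}}}_{\text{Newton (kinematic)}}, \qquad
\underbrace{e = \frac{P_r}{P_c}}_{\text{Poisson (kinetic)}}, \qquad
\underbrace{e^{2} = -\frac{W_r}{W_c}}_{\text{Stronge (energetic)}}
```

The three definitions coincide for frictionless or central collisions but can disagree under friction, which is why Newton's kinematic definition can produce the energy-inconsistent results reported for Part A.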

10.
Recent developments in industry involve the deployment of digital twins for both long- and short-term decision making, such as supply chain management and production planning and control. Modern production environments are frequently subject to disruptions and consequent modifications. As a result, the development of digital twins of manufacturing systems cannot rely solely on manual operations. Recent contributions have proposed approaches that exploit data for the automated generation of models. However, the resulting representations can be excessively detailed and may also describe activities that are not significant for estimating the system performance. Generating models with an appropriate level of detail avoids useless effort and long computation times, while allowing for easier understanding and reusability. This paper proposes a method to automatically discover manufacturing systems and generate adequate digital twins. The relevant characteristics of a production system are automatically retrieved from data logs. The proposed method has been applied to two test cases and a real manufacturing line. The experimental results prove its effectiveness in generating digital models that can correctly estimate the system performance.

11.
Integration is currently a key element of enterprise informatization. Starting from the development trends of enterprise integration technology, this paper analyses the technical characteristics and advantages of the two current mainstream integration technologies, service-oriented architecture (SOA) and business process management (BPM), discusses an integration platform scheme that combines the two and its key technologies, and proposes a path and method for implementing such an integration platform.

12.
(Cited by: 4; self-citations: 0, by others: 4)
Increasing competition in globalising markets requires effective means for development and production planning for innovative products. One efficient approach is integrated product and process design, following the idea of concurrent engineering. Initialising process design at an early stage of product development allows for balancing the product characteristics and the process capabilities. In this connection, the architecture of process planning operations has to reflect the information certainty derived from the development process. At the Laboratory for Machine Tools and Production Engineering (WZL), methods and electronic data processing (EDP) tools for a continuous, product-related process design in different planning and optimisation phases have been developed. The procedure and the efficiency of the developed, modularly designed methods and EDP tools are illustrated in this paper.

13.
Research on a Petri-Net-Based BPR Modelling Method   (Cited by: 2; self-citations: 0, by others: 2)
This article discusses a Petri-net-based business process model and process modelling method. After more than 40 years of research and application, Petri nets have matured in many fields such as automatic control, communications and computer science, but they have not yet been widely used as a process modelling tool supporting BPR implementation. The article proposes a Petri-net-based process modelling method, BPM; after appropriate attributes are added, processes designed with BPM can be used directly for simulation, yielding the data needed for process optimisation. Finally, a simple example is used to verify the effectiveness of BPM.
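The core of any Petri-net-based process model is the token-game semantics: a transition is enabled when each input place holds enough tokens, and firing it moves tokens from inputs to outputs. The sketch below is a minimal place/transition net in plain Python; the two-step review process it models is a hypothetical example, not taken from the article:

```python
class PetriNet:
    """Minimal place/transition net with token-based firing semantics."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # transition name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        """inputs/outputs map place names to arc weights (token counts)."""
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# hypothetical business process: a document is received, reviewed, then archived
net = PetriNet({"received": 1, "reviewed": 0, "archived": 0})
net.add_transition("review", {"received": 1}, {"reviewed": 1})
net.add_transition("archive", {"reviewed": 1}, {"archived": 1})
net.fire("review")
net.fire("archive")
```

Attaching timing or cost attributes to transitions, as the article suggests for its BPM method, turns replaying the token game into a simulation that yields process-optimisation data.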

14.
Computer Simulation Research of Production Systems   (Cited by: 3; self-citations: 0, by others: 3)
贾启君, 王凤岐, 郭伟. Computer Simulation (《计算机仿真》), 2003, 20(10): 111-113, 138
Many problems must be solved during the design and operation of production systems. Computer simulation, as an effective means of verifying the soundness of a production system design and supporting production management decisions, has attracted increasing attention in China in recent years. This paper systematically summarises the important role of computer simulation for production systems, and builds and analyses a computer simulation model of a simple production process using the Ithink 5.0 software.

15.
The arguments given in this paper suggest that Grover’s and Shor’s algorithms are more closely related than one might at first expect. Specifically, we show that Grover’s algorithm can be viewed as a quantum algorithm which solves a non-abelian hidden subgroup problem (HSP). But we then go on to show that the standard non-abelian quantum hidden subgroup (QHS) algorithm cannot find a solution to this particular HSP. This leaves open the question of whether some modification of the standard non-abelian QHS algorithm is equivalent to Grover’s algorithm.
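For readers who want the concrete object being reinterpreted here: Grover's algorithm alternates an oracle phase flip with a "diffusion about the mean". The sketch below is a plain-NumPy statevector simulation of the standard algorithm for a single marked item (it does not touch the paper's hidden-subgroup reformulation):

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Statevector simulation of Grover's algorithm for one marked basis state.

    Runs the canonical ~(pi/4)*sqrt(N) iterations and returns the index with
    the highest measurement probability.
    """
    N = 2 ** n_qubits
    state = np.full(N, 1.0 / np.sqrt(N))        # uniform superposition |s>
    oracle = np.ones(N)
    oracle[marked] = -1.0                        # phase flip on the marked item
    s = np.full(N, 1.0 / np.sqrt(N))
    iterations = int(round(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        state = oracle * state                   # oracle step
        state = 2.0 * s * (s @ state) - state    # diffusion: 2|s><s| - I
    return int(np.argmax(state ** 2))
```

After the optimal number of iterations the marked index dominates the output distribution, e.g. `grover_search(4, 11)` recovers index 11.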

16.
This paper reports on a study of the “underbody front” automated welding cell at Opel Belgium, a major automobile manufacturer of General Motors International Operations. It employs simulation in an experimental design framework to identify potential improvements in average daily output through management of buffer sizes at key buffer locations within the cell. Many practical applications of animated computer simulation stop at the modeling and displaying of the process under study. Simulation as a tool for process reengineering or enhancement can only reach its full potential if incorporated in a comprehensive statistical study, so as to attain statistically significant results. The paper also reports on the reactions of, and issues raised by, management when the experimental design methodology was presented as a tool for process enhancement and productivity improvement.
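The buffer-sizing question at the heart of this study can be illustrated with a toy model: a two-station serial line where a finite buffer decouples station failures. All numbers below are illustrative, not Opel's actual cell data; a real study would, as the paper argues, wrap many replications of such a model in a designed experiment:

```python
import random

def throughput(buffer_size, cycles=10000, p_fail=0.05, seed=1):
    """Toy two-station serial line: station 1 fills a finite buffer, station 2 drains it.

    Each cycle either station may fail with probability p_fail; the buffer
    decouples the stations, so larger buffers should raise average output.
    Returns finished parts per cycle.
    """
    rng = random.Random(seed)
    buffer, finished = 0, 0
    for _ in range(cycles):
        if rng.random() > p_fail and buffer < buffer_size:
            buffer += 1           # station 1 produces into the buffer
        if rng.random() > p_fail and buffer > 0:
            buffer -= 1           # station 2 consumes and finishes a part
            finished += 1
    return finished / cycles
```

Comparing `throughput(1)` against `throughput(10)` already shows the decoupling benefit; the paper's contribution is doing such comparisons rigorously, with replications and significance tests, at the real cell's key buffer locations.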

17.
We provide sharp estimates for the probabilistic behaviour of the main parameters of the Euclid algorithms, both on polynomials and on integers. We study in particular the distribution of the bit-complexity, which involves two main parameters: digit-costs and lengths of remainders. We first show that an asymptotic Gaussian law holds for the length of remainders at a fraction of the execution, which exhibits a deep regularity phenomenon. Then, we study in each framework—polynomials (P) and integers (I)—two gcd algorithms: the standard one (S), which only computes the gcd, and the extended one (E), which also computes the Bezout pair and is widely used for computing modular inverses. The extended algorithm is more regular than the standard one, and this explains why our results are more precise for it: we exhibit an asymptotic Gaussian law for the bit-complexity of the extended algorithm in both cases (P) and (I). We also prove an asymptotic Gaussian law for the bit-complexity of the standard gcd in case (P), but we have not succeeded in obtaining a similar result in case (I). The integer study is more involved than the polynomial study, as is usually the case. In the polynomial case, we deal with the central tools of the distributional analysis of algorithms, namely bivariate generating functions. In the integer case, we are led to dynamical methods, which heavily use the dynamical system underlying the Euclidean algorithm on integers, and its transfer operator. Baladi and Vallée (J. Number Theory 110(2):331–386, 2005) have recently designed a general framework for “distributional dynamical analysis”, where they exhibited asymptotic Gaussian laws for a large family of parameters. However, this family contains neither the bit-complexity cost nor the size of remainders, and we have to extend their methods to obtain our results.
Even though these dynamical methods are not necessary in case (P), we explain how the polynomial dynamical system can also be used to prove our results. This provides a common framework for both analyses, which clearly explains the similarities and differences between the two cases (P) and (I), for the algorithms themselves and also for their analysis. An extended abstract of this paper can be found in Lhote and Vallée (Proceedings of LATIN’06, Lecture Notes in Computer Science, vol. 3887, pp. 689–702, 2006).
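The two integer algorithms analysed above are standard. The sketch below shows the extended Euclidean algorithm (E) on integers and its main application named in the abstract, modular inversion; the standard algorithm (S) is the same loop without the Bezout bookkeeping:

```python
def extended_gcd(a, b):
    """Extended Euclidean algorithm: returns (g, u, v) with u*a + v*b == g == gcd(a, b).

    Maintains the Bezout coefficients alongside the remainder sequence, so the
    per-step digit-costs are those of the extended algorithm (E).
    """
    old_r, r = a, b
    old_u, u = 1, 0
    old_v, v = 0, 1
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_u, u = u, old_u - q * u
        old_v, v = v, old_v - q * v
    return old_r, old_u, old_v

def mod_inverse(a, m):
    """Modular inverse of a mod m via the extended algorithm, the use case cited above."""
    g, u, _ = extended_gcd(a % m, m)
    if g != 1:
        raise ValueError("inverse does not exist: a and m are not coprime")
    return u % m
```

The bit-complexity studied in the paper is, roughly, the sum over iterations of the cost of computing each quotient `q` and updating the remainders and coefficients, which is why the distribution of remainder lengths is the key parameter.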

18.
This paper looks into a new area for knowledge-based system application, that of manufacturing modelling. Manual generation of IDEF0 models of manufacturing systems is time-consuming and inconsistent. However, the process can be automated to improve timeliness and consistency. In this paper, a knowledge-based manufacturing modelling system for the automatic generation of IDEF0 models is proposed. The system will not only greatly reduce the IDEF0 modelling time but will also eliminate the inconsistency problem of conventional IDEF0 modelling systems. The paper explains the knowledge-based approach and identifies the kinds of domain knowledge that are required for the construction of the knowledge-based manufacturing modelling system.

19.
20.
A key ingredient in system and organization modeling is modeling business processes that involve the collaborative participation of different teams within and outside the organization. Recently, the use of the Unified Modeling Language (UML) for collaborative business modeling has been increasing, thanks to its human-friendly visual representation of a rich set of structural and behavioral views, despite its unclear semantics. In the meantime, the use of the Web Ontology Language (OWL) has also been emerging, thanks to its clearly defined semantics, which make it amenable to automatic analysis and reasoning, although it is less human-friendly than, and perhaps not as rich as, the UML notation, especially concerning processes, or activities. In this paper, we view the UML and the OWL as complementary to each other, and exploit their relative strengths. We provide a mapping between the two, through a set of mapping rules, which allow for the capture of UML activity diagrams in an OWL ontology. This mapping, which results in a formalization of collaborative processes, also sets a basis for the subsequent construction of executable models using the Colored Petri Nets (CPN) formalism. For this purpose, we also provide appropriate mappings from OWL-based ontological elements into CPN elements. A case study of a mortgage granting system is described, along with the potential benefits and limitations of our proposal.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号