Similar documents
1.
2.
This paper proposes an approach for facilitating systems interoperability in a manufacturing environment. It is based on the postulate that an ontological model of a product can act as a facilitator for interoperation among all the application software that shares information during the physical product lifecycle. The many applications involved in a manufacturing enterprise may then refer to the knowledge embedded in the product itself, with all of its technical data stored according to a common model. Standardisation initiatives (ISO and IEC) address the problem of managing heterogeneous information scattered within organisations by formalising the knowledge related to product technical data. The aim of this approach is to formalise all the technical data and concepts contributing to the definition of a Product Ontology, embedded into the product itself and making it interoperable with applications, thus minimising loss of semantics.

3.
Usually the data generation rate of a data stream is unpredictable, and some data elements cannot be processed in real time if the generation rate exceeds the capacity of the data stream processing algorithm. To handle this situation gracefully, a load shedding technique is recommended. This paper proposes a frequency-based load shedding technique over a data stream of tuples. In many data stream processing applications, such as mining frequent patterns, data elements with high frequency can be considered more significant than those with low frequency. Based on this observation, the proposed technique processes only the frequent elements of a data stream in real time, while the others are trimmed. The decision whether to shed load from the data stream is controlled automatically by the data generation rate, so the proposed technique avoids unnecessary load shedding operations.
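The core idea of the abstract above can be sketched in a few lines: shed load only when a window overloads capacity, and then keep only elements that are currently frequent. This is a minimal illustration, not the paper's algorithm; the window-based processing, the `capacity` and `min_support` parameters, and the class name are assumptions.

```python
from collections import Counter

class FrequencyShedder:
    """Sketch of frequency-based load shedding: under overload,
    only elements seen frequently so far survive."""

    def __init__(self, capacity, min_support=3):
        self.capacity = capacity        # tuples processable per window
        self.min_support = min_support  # frequency needed to survive shedding
        self.counts = Counter()

    def process_window(self, window):
        self.counts.update(window)
        if len(window) <= self.capacity:  # no overload: shed nothing
            return list(window)
        # overload: keep only elements that are currently frequent
        return [x for x in window if self.counts[x] >= self.min_support]

shedder = FrequencyShedder(capacity=4)
kept = shedder.process_window(["a", "a", "a", "b", "a", "c"])  # overloaded window
```

Because shedding is gated on the window size exceeding capacity, a stream whose rate stays within capacity is never trimmed, matching the abstract's claim that unnecessary shedding is avoided.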

4.
To survive the cut-throat competition in the manufacturing industry, many companies have introduced digital manufacturing technology, which not only shortens product development cycle times but also improves the precision of engineering simulation. However, building the virtual objects needed for a digital manufacturing environment requires skilled human resources and is costly and time-consuming; a high-precision simulation likewise requires a high-precision environment with similar resources. In this paper, we propose a method of constructing a mixed reality-based digital manufacturing environment. The method integrates real objects, such as real images, with the virtual objects of a virtual manufacturing system. This integration minimizes the cost of implementing virtual objects and enhances the user's sense of reality. We studied several methods, derived a general framework for the system, and developed the idea into a virtual factory layout planning system. To assign the pose and position of real objects in virtual space, we applied a circle-based tracking method which uses a safety sign instead of the planar square marker generally used for registration. Furthermore, we developed the framework to encapsulate simulation data from legacy data and to process data for visualization based on mixed reality.

5.
A data stream is a continuous, rapid, time-varying sequence of data elements which should be processed in an online manner. These matters are researched in Data Stream Management Systems (DSMSs). Single-processor DSMSs cannot properly satisfy the requirements of data stream applications; the main shortcomings are tuple latency, tuple loss, and throughput. In our previous publications, we introduced parallel execution of continuous queries to overcome these problems via performance improvement, especially in terms of tuple latency. We scheduled operators in an event-driven manner, which reduced system performance in the periods between consecutive scheduling instances. In this paper, a continuous scheduling method (dispatching) is presented that is more compatible with the continuous nature of data streams and queries, improving system adaptivity and performance. In a multiprocessing environment, the dispatching method forces processing nodes (logical machines) to send partially-processed tuples to the next machine with minimum workload, which executes the next operator on them. Operator scheduling is thus done continuously and dynamically for each tuple processed by each operator. The dispatching method is described, formally presented, and proved correct. It is also modeled in Petri nets and evaluated via simulation. Results show that the dispatching method significantly improves system performance in terms of tuple latency, throughput, and tuple loss. Furthermore, the fluctuation of system performance parameters (against variation of system and stream characteristics) diminishes considerably, leading to high adaptivity to the underlying system.
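The per-tuple dispatching decision described above (send each partially-processed tuple to the node with minimum current workload) can be sketched with a min-heap of node loads. This is only an illustration under strong assumptions: unit processing cost per tuple and hypothetical node names; the paper's actual method is continuous and operator-aware.

```python
import heapq

def dispatch(tuples, node_loads):
    """Assign each tuple to the processing node with the least
    current workload (illustrative unit-cost-per-tuple model)."""
    heap = [(load, node) for node, load in node_loads.items()]
    heapq.heapify(heap)
    assignment = {}
    for t in tuples:
        load, node = heapq.heappop(heap)        # least-loaded node
        assignment[t] = node
        heapq.heappush(heap, (load + 1, node))  # account for the new work
    return assignment

# n1 starts idle, n2 already has two units of work queued
assignment = dispatch(["t1", "t2", "t3"], {"n1": 0, "n2": 2})
```

Because the load estimate is updated after every assignment, scheduling adapts tuple by tuple rather than at fixed scheduling instants, which is the contrast the abstract draws with event-driven scheduling.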

6.
Precision z-level contour machining is important for various computer-aided manufacturing (CAM) applications such as pocket machining and high-speed machining (HSM). This paper presents a new z-level contour tool-path generation algorithm for NC machining of triangulated surface models. Traditional approaches of z-level machining rely on the creation of accurate CL (cutter location) surfaces by surface offsetting or high-density z-map generation, which is computationally expensive and memory demanding. In contrast, this paper presents a novel approach to the generation of CL data directly from the section polygon of a triangulated surface model. For each polygon vertex of the contour, the offset direction is determined by the normal to the edge, while the offset distance is not fixed but is determined from the cutter shape and the part surface. An interference-free tool-path computation algorithm using fillet endmills is developed. Since there is no need to create a complete CL surface or high-density z-map grids, this proposed method is highly efficient and more flexible, and can be directly applied to triangulated surfaces either tessellated from CAD models, or reconstructed from 3D scanned data for reverse engineering (RE) applications.

7.
With rapid advances in new-generation information technologies, the digital twin (DT), and cyber-physical systems, smart assembly has become a core focus for intelligent manufacturing in the fourth industrial revolution. Deep integration between the information and physical worlds is a key phase in developing smart assembly process design that bridges the gap between product assembly design and manufacturing. This paper presents a digital twin reference model for smart assembly process design and proposes a three-layer application framework for DT-based smart assembly. Product assembly station components are detailed in the physical space layer; two main modules, communication connection and data processing, are introduced in the interaction layer; and the working mechanisms of assembly process planning, simulation, prediction, and control management in the virtual space layer are discussed in detail. A case study applies the proposed approach to an experimental, simplified satellite assembly using the DT-based assembly application system (DT-AAS) to verify the effectiveness of the proposed application framework and method.

8.
We present a method for the classification of multi-labeled text documents explicitly designed for data stream applications that require processing a virtually infinite sequence of data using constant memory and constant processing time. Our method is composed of an online procedure used to efficiently map text into a low-dimensional feature space, and a partition of this space into a set of regions for which the system extracts and keeps statistics used to predict multi-label text annotations. Documents are fed into the system as a sequence of words, mapped to a region of the partition, and annotated using the statistics computed from the labeled instances colliding in the same region. This approach is referred to as clashing. We illustrate the method on real-world text data, comparing the results with those obtained using other text classifiers. In addition, we provide an analysis of the effect of the representation space dimensionality on the predictive performance of the system. Our results show that the online embedding indeed approximates the geometry of the full corpus-wise TF and TF-IDF space. The model obtains competitive F measures with respect to the most accurate methods, using significantly fewer computational resources. In addition, the method achieves a higher macro-averaged F measure than methods with similar running time. Furthermore, the system is able to learn faster than the other methods from partially labeled streams.
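The pipeline described above (hash words into a low-dimensional vector, map the document to a region, and predict from label statistics of colliding documents) can be sketched roughly as follows. Everything here is an assumption for illustration: the dimensionality, the region construction, and the function names are not the paper's actual design.

```python
from collections import defaultdict, Counter

DIM = 16  # illustrative dimensionality of the reduced feature space

def region(doc_words):
    """Map a document to a region id by hashing words into a small
    term-frequency vector (rough stand-in for the online embedding)."""
    vec = [0] * DIM
    for w in doc_words:
        vec[hash(w) % DIM] += 1
    # region id: the three most active buckets (illustrative choice)
    return tuple(sorted(range(DIM), key=lambda i: -vec[i])[:3])

stats = defaultdict(Counter)  # region -> label counts of colliding docs

def learn(doc_words, labels):
    stats[region(doc_words)].update(labels)

def predict(doc_words, k=2):
    # annotate with the k most common labels seen in the same region
    return [lab for lab, _ in stats[region(doc_words)].most_common(k)]

learn(["stream", "mining", "frequent"], ["data-mining"])
pred = predict(["stream", "mining", "frequent"])
```

The memory footprint is bounded by the number of regions and labels, not the stream length, which is what makes this style of method viable on unbounded streams.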

9.
With more and more real deployments of wireless sensor network applications, we envision that their success is ultimately determined by whether the sensor networks can provide a high-quality stream of data over a long period. In this paper, we propose a consistency-driven data quality management framework called Orchis that integrates data quality into an energy-efficient sensor system design. Orchis consists of four components: data consistency models, adaptive data sampling and processing protocols, consistency-driven cross-layer protocols, and flexible APIs to manage data quality. Together these support the goals of high data quality and energy efficiency. We first formally define a consistency model which not only includes temporal and numerical consistency, but also considers the application-specific requirements of the data and the data dynamics in the sensing field. Next, we propose an adaptive, lazy, energy-efficient data collection protocol which adapts the data sampling rate to the data dynamics in the sensing field and remains lazy while data consistency is maintained. Finally, we conduct a comprehensive evaluation of the proposed protocol based on both a TOSSIM-based simulation and a real prototype implementation using MICA2 motes. The results from both simulation and prototype show that our protocol reduces the number of delivered messages, improves the quality of collected data, and in turn extends the lifetime of the whole network. Our analysis also implies that the tradeoff between data consistency requirements and energy saving should be set carefully, based on the specific requirements of different applications.

10.
This work focuses on additive manufacturing by Directed Energy Deposition (DED) using a 6-axis robot. The objective is to generate an optimized trajectory in the joint space, taking into account axis redundancy for parts of revolution produced with a coaxial deposition system. To achieve this goal, a new layer-by-layer method coupled with a trajectory constrained optimization is presented. The optimization results are theoretically compared to a non-optimized trajectory and a point-by-point optimized trajectory. The layer-by-layer generation of optimized trajectories is validated experimentally on a 6-axis robot using a PLA extrusion system. Experimental results show that the layer-by-layer trajectory optimization strategy applied to parts of revolution provides better geometrical accuracy while improving the efficiency of the manufacturing device compared to non-optimized solutions.

11.
In a cloud manufacturing system, manufacturing services form a service network and a collaboration network, and the performance of these networks affects the cloud manufacturing system as a whole. Researchers can study the behavior of enterprises, services, resources, and other elements in cloud manufacturing through a simulation platform. In this paper, service agents in the simulation platform drive manufacturing services to form a self-organizing network, so that transactions and cooperation arise spontaneously. The service network and the collaboration network of service agents are thus formed, and models of the two networks are presented using the theory of Set Pair Analysis (SPA). In addition, an evaluation method for the two network models is proposed. Simulation data for the networks are produced by the data generation method of the cloud manufacturing simulation platform, and the evaluation methods are finally validated against these simulation data.

12.
Advanced manufacturing is one of the core national strategies in the US (AMP), Germany (Industry 4.0), and China (Made in China 2025). The emergence of the Cyber-Physical System (CPS) concept and of big data enables manufacturing to become smarter and more competitive among nations. Many researchers have proposed new solutions with big data enabling tools for manufacturing applications in three directions: product, production, and business. Big data has been a fast-changing research area with many new opportunities for applications in manufacturing. This paper presents a systematic literature review of the state of the art of big data in manufacturing. Six key drivers of big data applications in manufacturing are identified: system integration, data, prediction, sustainability, resource sharing, and hardware. Based on the requirements of manufacturing, nine essential components of the big data ecosystem are captured: data ingestion, storage, computing, analytics, visualization, management, workflow, infrastructure, and security. Several research domains driven by the available capabilities of the big data ecosystem are identified. Five future directions of big data applications in manufacturing are presented, ranging from modelling and simulation to real-time big data analytics and cybersecurity.

13.
The goal of this paper is to propose an approach to enhance interoperability between manufacturing applications using the Core Manufacturing Simulation Data Information Model (CMSDIM) in order to streamline design and manufacturing activities throughout the product life cycle. To this end, a system framework required to facilitate such interoperability is first presented. The proposed approach, architecture, and developed translators are then illustrated and demonstrated using two separate case studies. The first case study facilitates design for manufacturing and assembly improvements for the development of new products, allowing for part of a discrete event simulation model of a downstream manufacturing and assembly process to be automatically generated from corresponding product assembly information contained in the lean design engineering software. Conceptual design and development of this case study, which extracts outputs from Design Profit™ lean design software and generates a corresponding discrete event simulation model in ProModel™ for a Nikon® L-100 Camera, is then discussed. The second case study demonstrates interoperability of three applications (order and inventory system, Gantt chart scheduler, and discrete event simulation) for a generic job shop operation. Using the considered case studies, this paper also details and demonstrates the benefits of interoperability enhancement using the CMSDIM, which is an important consideration in any product life cycle. Finally, we discuss how future research opportunities integrating additional manufacturing applications can be used to address intellectual challenges present in our current approach.

14.
We present a decision-making assistant tool for an integrated product and process design environment for manufacturing applications. Specifically, we target microwave modules that use electro-mechanical components and require optimal solutions to reduce cost, improve quality, and shorten time to market. This tool will assist product and process designers in improving their productivity and enable them to cooperate and coordinate their designs through a common design interface. We consider a multiobjective optimization model that determines components and processes for a given conceptual design of a microwave module. This model outputs a set of solutions that are Pareto optimal with respect to cost, quality, and other metrics. In addition, we identify system integration issues for manufacturing applications and propose an architecture that will serve as a building block for our continuing research in virtual manufacturing applications.
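A set of Pareto-optimal solutions, as output by the model described above, contains exactly those candidates not dominated by any other candidate on every objective. The sketch below illustrates this filtering on two minimized objectives; the design names and metrics are hypothetical, not taken from the paper.

```python
def pareto_front(solutions):
    """Return names of solutions not dominated by any other solution.
    Each solution is (name, cost, defect_rate); lower is better on both."""
    front = []
    for name, cost, defects in solutions:
        dominated = any(
            c <= cost and d <= defects and (c < cost or d < defects)
            for _, c, d in solutions
        )
        if not dominated:
            front.append(name)
    return front

# Hypothetical candidate designs: (name, cost, defect_rate)
designs = [("A", 10, 0.05), ("B", 8, 0.07), ("C", 12, 0.06), ("D", 8, 0.05)]
best = pareto_front(designs)
```

Here design D is at least as good as every alternative on both metrics and strictly better on at least one, so it alone survives; in general the front contains several incomparable trade-offs for the designer to choose among.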

15.
This paper discusses the application of hyperspace data mining to materials manufacturing. We introduce an innovative hyperspace method whereby data are separated into subspaces, features are selected according to data patterns, and control is rendered in the original feature space. This technique has three major advantages: no equipment is added, no experiment is needed, and no interruption occurs to production. A number of proprietary algorithms have been built into a software product, MasterMiner™, for use in materials design and manufacturing. Examples are given to show the efficacy of the proposed method and the MasterMiner tool.

16.
Data stream processing frameworks process stream data based on event time to ensure that requests can be responded to in real time. In reality, streaming data usually arrives out of order due to factors such as network delay, and frameworks commonly adopt the watermark mechanism to address this disorderedness. A watermark is a special kind of data element, carrying a timestamp, inserted into the data stream; it helps the framework decide whether received data is late and should therefore be discarded. Traditional watermark generation strategies are periodic; they cannot dynamically adjust the watermark distribution to balance responsiveness and accuracy. This paper proposes an adaptive watermark generation mechanism based on a time series prediction model to address this limitation. The mechanism dynamically adjusts the frequency and timing of watermark distribution using the disordered-data ratio and other lateness properties of the data stream, improving system responsiveness while ensuring acceptable result accuracy. We implement the proposed mechanism on top of Flink and evaluate it with real-world datasets. The experimental results show that our mechanism is superior to existing watermark distribution strategies in terms of both system responsiveness and result accuracy.
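The adaptation idea above can be illustrated with a toy rule: widen the allowed lateness when the observed out-of-order ratio is high, and keep it tight otherwise. This is a deliberately simplified sketch; the thresholds, the adjustment rule, and the function name are assumptions, not the paper's time-series prediction model.

```python
def next_watermark(event_times, base_delay=2, late_ratio_limit=0.1):
    """Emit a watermark behind the max seen event time; the lag adapts
    to the fraction of out-of-order arrivals (illustrative rule)."""
    if not event_times:
        return None
    out_of_order = sum(1 for a, b in zip(event_times, event_times[1:]) if b < a)
    ratio = out_of_order / max(len(event_times) - 1, 1)
    # heavily disordered stream -> allow more lateness before closing windows
    delay = base_delay * 2 if ratio > late_ratio_limit else base_delay
    return max(event_times) - delay

wm = next_watermark([1, 2, 5, 4, 8, 7, 10])  # two out-of-order arrivals
```

Data with a timestamp at or below the returned watermark would be treated as late; a smaller lag makes the system more responsive, while a larger lag admits more stragglers and improves accuracy, which is exactly the trade-off the paper's mechanism tunes automatically.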

17.
With the increasing availability of modern mobile devices and location acquisition technologies, massive trajectory data of moving objects are collected continuously in a streaming manner. Clustering streaming trajectories facilitates finding the representative paths or common moving trends shared by different objects in real time. Although data stream clustering has been studied extensively in the past decade, little effort has been devoted to dealing with streaming trajectories. The main challenge lies in the strict space and time complexities of processing the continuously arriving trajectory data, combined with the difficulty of concept drift. To address this issue, we present two novel synopsis structures to extract the clustering characteristics of trajectories, and develop an incremental algorithm for the online clustering of streaming trajectories (called OCluST). It contains a micro-clustering component to cluster and summarize the most recent sets of trajectory line segments at each time instant, and a macro-clustering component to build large macro-clusters based on micro-clusters over a specified time horizon. Finally, we conduct extensive experiments on four real data sets to evaluate the effectiveness and efficiency of OCluST, and compare it with other congeneric algorithms. Experimental results show that OCluST can achieve superior performance in clustering streaming trajectories.
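Synopsis structures of the kind described above typically keep constant-space additive statistics so that points can be absorbed and summaries merged incrementally. The sketch below shows this classic clustering-feature pattern for 2D points; it is a generic illustration of the idea, not OCluST's actual trajectory-segment synopsis.

```python
class MicroCluster:
    """Constant-space synopsis of a group of 2D points: count,
    linear sums, and squared sums, updated incrementally."""

    def __init__(self):
        self.n = 0
        self.ls = [0.0, 0.0]  # linear sum of (x, y)
        self.ss = [0.0, 0.0]  # squared sum of (x, y), for spread estimates

    def absorb(self, x, y):
        self.n += 1
        self.ls[0] += x; self.ls[1] += y
        self.ss[0] += x * x; self.ss[1] += y * y

    def centroid(self):
        return (self.ls[0] / self.n, self.ls[1] / self.n)

mc = MicroCluster()
for p in [(0, 0), (2, 0), (1, 3)]:
    mc.absorb(*p)
```

Because the sums are additive, two micro-clusters can be merged by adding their fields, which is what lets a macro-clustering pass operate over micro-clusters for any time horizon without revisiting the raw stream.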

18.
An efficient model for communication between CAD, CAPP, and CAM applications in a distributed manufacturing planning environment has been seen as a key ingredient for CIM. Integration of the design model with process and scheduling information in real time is necessary to increase product quality, reduce cost, and shorten the product manufacturing cycle. This paper describes an approach to integrating key product realization activities using a neutral data representation. The representation is based on established standards for product data exchange and serves as a prototype implementation of these standards. The product and process models are based on an object-oriented representation of geometry, features, and the resulting manufacturing processes. Relationships between objects are explicitly represented in the model (for example, feature precedence relations, process sequences, etc.). The product model is developed using an XML-based representation of the product data required for process planning, and the process model likewise uses an XML representation of the data required for scheduling and FMS control. The procedures for writing and parsing the XML representations have been developed in an object-oriented approach, such that each object from the object-oriented model is responsible for storing its own data in XML format. A similar approach is adopted for reading the model: parsing is performed by a stack of XML handlers, each corresponding to a particular object in the XML hierarchical model. This allows for a very flexible representation in which only a portion of the model (for example, only feature data, or only the part of the process plan for a single machine) may be stored and successfully parsed into another application. This is a very useful approach for direct distributed applications, in which data are passed in the form of XML streams to allow real-time online communication.
The feasibility of the proposed model is verified in several scenarios for distributed manufacturing planning that involve feature mapping from a CAD file, process selection for several part designs integrated with scheduling, and simulation of the FMS model using alternative routings.
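The handler-stack parsing style described in the abstract above can be sketched with Python's event-driven SAX API: a stack tracks the current position in the XML hierarchy, and each element of interest collects its own data. The tag and attribute names below are illustrative assumptions, not the paper's actual product-model schema.

```python
import xml.sax

class FeatureHandler(xml.sax.ContentHandler):
    """Sketch of stack-based XML parsing: push/pop element names
    while harvesting data from the elements we care about."""

    def __init__(self):
        super().__init__()
        self.stack = []     # open elements, innermost last
        self.features = []  # harvested feature data

    def startElement(self, name, attrs):
        self.stack.append(name)
        if name == "feature":
            self.features.append(dict(attrs))

    def endElement(self, name):
        self.stack.pop()

handler = FeatureHandler()
xml.sax.parseString(
    b'<part name="bracket">'
    b'<feature type="hole" diameter="8"/>'
    b'<feature type="slot" width="4"/>'
    b'</part>',
    handler,
)
```

Because the handler only reacts to the elements it knows, a document containing just a fragment of the full model (for instance, only feature data) parses with the same code, which mirrors the partial-model flexibility the abstract emphasizes.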

19.
X. F. Zha   《Knowledge》2002,15(8):493-506
Multi-agent modeling has emerged as a promising discipline for dealing with decision-making processes in distributed information system applications. One such application is the modeling of distributed design or manufacturing processes, which can link various design or manufacturing processes to form a virtual consortium on a global basis. This paper proposes a novel knowledge-intensive multi-agent cooperative/collaborative framework for concurrent intelligent design and assembly planning, which integrates product design, design for assembly, assembly planning, assembly system design, and assembly simulation subject to econo-technical evaluations. An AI-protocol-based method is proposed to facilitate the integration of intelligent agents for the assembly design, planning, evaluation, and simulation process. A unified class of knowledge-intensive Petri nets is defined using the object-oriented knowledge-based Petri net approach and used as an AI protocol for handling both the integration and the negotiation problems among multiple agents. The detailed cooperative/collaborative mechanisms and algorithms are given based on the knowledge-object cooperation formalisms. As such, the assembly-oriented design system can easily be implemented under the multi-agent-based knowledge-intensive Petri net framework with concurrent integration of multiple cooperative knowledge sources and software. Thus, product design and assembly planning can be carried out simultaneously and intelligently in an entirely computer-aided concurrent design and assembly planning system.

20.
《Computer Communications》2001,24(3-4):319-333
This paper presents an approach for automatic executable test case and test sequence generation for a protocol modeled as an SDL system. Our methodology uses a unified method which tests an Extended Finite State Machine (EFSM) based system using control and data flow techniques. To test an SDL system, it extracts an EFSM from each process; the system is then tested by incrementally computing a partial product for each EFSM C, taking into account only transitions which influence (or are influenced by) C, and generating test cases for it. This process ends when the coverage achieved by the generated test cases is satisfactory or when the partial products for all EFSMs have been tested. Experimental results show that this method can be applied to systems of practical size.
