20 similar documents found (search time: 15 ms)
1.
2.
Causal analysis is an integral part of product quality problem-solving (QPS). Quality management within the manufacturing industry has generated a considerable amount of QPS data; while this implies a historical and extensive body of QPS experience, these valuable empirical data are not being fully utilised. Therefore, the current study proposes a method by which to mine know-why from historical empirical data, and it develops an approach for constructing digital cause-and-effect diagrams (CEDs). The K-means algorithm is first adopted to cluster the problems and causes. The random forest classifier is then selected to classify cause text into the main cause categories, which manifest as ‘rib branches’ in the CED. Based on the clustering and classification results, we obtain an abstract cause-and-effect diagram (ACED) and a detailed cause-and-effect diagram (DCED). We use the quality data of an automotive company to validate the method, and we additionally undertake a pilot run of the Fishbone Next system to demonstrate how users can obtain these two CEDs to support causal analysis in QPS. The results show that the proposed approach efficiently constructs a digital CED and thus provides quality management problem-solvers with decision support to derive the potential causes of problems, thereby improving the efficiency and effectiveness of their causal analysis initiatives.
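The K-means clustering step described above can be illustrated in miniature. The sketch below runs a minimal k-means on toy two-dimensional 'cause' vectors; the data, k and vector representation are illustrative assumptions, not the paper's actual pipeline, and the random-forest classification stage is omitted:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: returns final centroids and cluster assignments."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    assign = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        for idx, p in enumerate(points):
            assign[idx] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
        # Update step: each centroid becomes the mean of its assigned points.
        for c in range(k):
            members = [points[i] for i in range(len(points)) if assign[i] == c]
            if members:
                centroids[c] = tuple(sum(dim) / len(members) for dim in zip(*members))
    return centroids, assign

# Two well-separated toy groups of "cause" vectors.
data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centroids, labels = kmeans(data, k=2)
```

In the paper's setting, each point would instead be a text embedding of a cause description, and the resulting clusters would seed the rib branches of the diagram.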
3.
Using a keyword mining approach, this paper explores the interdisciplinary and integrative dynamics in five nano research fields. We argue that the general trend of integration in nano research fields is converging in the long run, although the degree of this convergence depends greatly on the indicators one chooses. Our results show that the nano technologies applied in the five studied fields become more diverse over time, and each field increasingly draws on related technologies from the others. The publication and citation analysis also indicates that nano technology has developed to a relatively mature stage and has become a standardized and codified technology.
4.
Yung-Chi Lee Chad Dalton Brian Regler David Harris 《Drug development and industrial pharmacy》2018,44(9):1551-1556
Lipid-based drug delivery systems have been intensively investigated as a means of delivering poorly water-soluble drugs. Upon ingestion, the lipases in the gastrointestinal tract digest lipid ingredients, mainly triglycerides, within the formulation into monoglycerides and fatty acids. While numerous studies have addressed the solubility of drugs in triglycerides, comparatively few publications have addressed the solubility of drugs in fatty acids, which are the end product of digestion and responsible for the solubility of drug within mixed micelles. The objective of this investigation was to explore the solubility of a poorly water-soluble drug in fatty acids and raise the awareness of the importance of drug solubility in fatty acids. The model API (active pharmaceutical ingredient), a weak acid, is considered a BCS II compound with an aqueous solubility of 0.02 µg/mL and predicted partition coefficient >7. The solubility of API ranged from 120 mg/mL to over 1 g/mL in fatty acids with chain lengths across the range C18 to C6. Hydrogen bonding was found to be the main driver of the solubilization of API in fatty acids. The solubility of API was significantly reduced by water uptake in caprylic acid but not in oleic acid. This report demonstrates that solubility data generated in fatty acids can provide an indication of the solubility of the drug after lipid digestion. This report also highlights the importance of measuring the solubility of drugs in fatty acids in the course of lipid formulation development.
5.
Techniques for measurement of higher-order aberrations of a projection optical system in photolithographic exposure tools have been established. Even-type and odd-type aberrations are independently obtained from printed grating patterns on a wafer by three-beam interference under highly coherent illumination. Even-type aberrations, i.e., spherical aberration and astigmatism, are derived from the best focus positions of vertical, horizontal, and oblique grating patterns by an optical microscope. Odd-type aberrations, i.e., coma and trefoil, are obtained by detection of relative shifts of a fine grating pattern to a large pattern by an overlay inspection tool. Quantitative diagnosis of lens aberrations with a krypton fluoride (KrF) excimer laser scanner is demonstrated.
6.
Rational design of a polymer specific for microcystin-LR using a computational approach
Chianella I Lotierzo M Piletsky SA Tothill IE Chen B Karim K Turner AP 《Analytical chemistry》2002,74(6):1288-1293
A computational approach for the design of a molecularly imprinted polymer (MIP) specific for Cyanobacterial toxin microcystin-LR is presented. By using molecular modeling software, a virtual library of functional monomers was designed and screened against the target toxin, employed as a template. The monomers giving the highest binding energy were selected and used in a simulated annealing (molecular dynamics) process to investigate their interaction with the template. The stoichiometric ratio observed from the simulated annealing study was used in MIP preparation for microcystin-LR. The monomers were copolymerized with a cross-linker in the presence of the template. A control (blank) polymer was prepared under the same conditions but in the absence of template. A competitive assay with microcystin-horseradish peroxidase conjugate was optimized and used to evaluate the affinity and cross-reactivity of the polymer. The performance of the artificial receptor was compared to the performance of monoclonal and polyclonal antibodies raised against the toxin. The results indicate that imprinted polymer has affinity and sensitivity comparable to those of polyclonal antibodies (the detection limit for microcystin-LR using the MIP-based assay was found to be 0.1 µg L-1), while superior chemical and thermal stabilities were obtained. Moreover, cross-reactivity to other toxin analogues was very low for the imprinted polymer, in contrast to the results achieved for antibodies. It is anticipated that the polymer designed could be used in assays, sensors, and solid-phase extraction.
7.
I. Overview
"Green lighting" is a term that emerged internationally in the early 1990s to describe lighting systems that save electricity and protect the environment. Major developed countries such as the United States, the United Kingdom, France and Japan, along with some developing countries, have successively formulated "Green Lighting Project" programmes and achieved notable results.
8.
Scientometrics - Peer reviews play a vital role in academic publishing. Authors have various feelings towards peer reviews. This study analyzes the experiences shared by authors in Scirev.org to...
9.
In the big data era, firms are inundated with customer data, which are valuable in improving services, developing new products, and identifying new markets. However, it is not clear how companies apply data-driven methods to facilitate customer knowledge management when developing innovative new products. Studies have investigated the specific benefits of applying data-driven methods in customer knowledge management but have not systematically examined the mechanics by which firms realise these benefits. Accordingly, this study proposes a systematic approach to link customer knowledge with innovative product development in a data-driven environment. To mine customer needs, this study adopts the Apriori algorithm and C5.0, applying association rule and decision tree methodologies for data mining. It provides a systematic and effective method for managers to extract knowledge ‘from’ and ‘about’ customers to identify their preferences, enabling firms to develop the right products and gain competitive advantages. The findings indicate that the knowledge-based approach is effective, and the extracted knowledge takes the form of a set of rules that can be used to identify useful patterns for both innovative product development and marketing strategies.
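The Apriori step mentioned in this abstract can be illustrated with a minimal, self-contained implementation. The basket data and support threshold below are illustrative assumptions, and the C5.0 decision-tree stage is omitted:

```python
def apriori(transactions, min_support):
    """Return every itemset whose support (fraction of transactions
    containing it) is at least min_support."""
    n = len(transactions)
    transactions = [frozenset(t) for t in transactions]
    # Level-1 candidates: every individual item.
    candidates = {frozenset([item]) for t in transactions for item in t}
    frequent = {}
    size = 1
    while candidates:
        supports = {c: sum(1 for t in transactions if c <= t) / n for c in candidates}
        level = {c: s for c, s in supports.items() if s >= min_support}
        frequent.update(level)
        # Candidate generation: join frequent k-sets into (k+1)-sets.
        prev = list(level)
        candidates = {a | b for a in prev for b in prev if len(a | b) == size + 1}
        size += 1
    return frequent

baskets = [
    {"camera", "tripod", "bag"},
    {"camera", "bag"},
    {"camera", "tripod"},
    {"lens", "bag"},
]
frequent_itemsets = apriori(baskets, min_support=0.5)
```

In practice the frequent itemsets would then be turned into association rules (e.g. camera → bag) and combined with the decision-tree output to profile customer preferences.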
10.
In this work, we propose a new computational technique to solve the protein classification problem. The goal is to predict the functional family of novel protein sequences based on their motif composition. In order to improve the results obtained with other known approaches, we propose a new data mining technique for protein classification based on Bayes' theorem, called highest subset probability (HiSP). To evaluate our proposal, datasets extracted from Prosite, a curated protein family database, are used as experimental datasets. The computational results have shown that the proposed method outperforms other known methods for all tested datasets and looks very promising for problems with characteristics similar to the problem addressed here. In addition, our experiments suggest that HiSP performs well on highly imbalanced datasets.
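The abstract does not give HiSP's internals, so the following is a plain Bayes'-theorem classifier over motif composition, a simpler stand-in for the family-prediction task it describes. The family labels, motif names and add-one smoothing scheme are illustrative assumptions, not the published HiSP algorithm:

```python
from collections import defaultdict

def train(labeled_sequences):
    """Estimate family frequencies and per-family motif counts."""
    family_counts = defaultdict(int)
    motif_counts = defaultdict(lambda: defaultdict(int))
    motifs = set()
    for family, seq_motifs in labeled_sequences:
        family_counts[family] += 1
        for m in seq_motifs:
            motif_counts[family][m] += 1
            motifs.add(m)
    return family_counts, motif_counts, motifs

def predict(family_counts, motif_counts, motifs, query):
    """Pick the family maximizing P(family) * prod P(motif | family),
    with add-one smoothing so unseen motifs do not zero out a family."""
    total = sum(family_counts.values())
    best, best_score = None, -1.0
    for family, fc in family_counts.items():
        score = fc / total
        denom = fc + len(motifs)
        for m in query:
            score *= (motif_counts[family][m] + 1) / denom
        if score > best_score:
            best, best_score = family, score
    return best

# Hypothetical labeled sequences: (family, set of Prosite-style motif IDs).
data = [
    ("kinase", {"PKINASE", "ATP_BIND"}),
    ("kinase", {"PKINASE", "SH2"}),
    ("protease", {"TRYPSIN", "HIS_TRIAD"}),
]
fc, mc, ms = train(data)
```

A query sequence is then classified by the family whose motif profile it matches best.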
11.
Ehsan Mohammadi 《Scientometrics》2012,92(3):593-608
Nanoscience and technology (NST) is a relatively new interdisciplinary scientific domain, and scholars from a broad range of different disciplines are contributing to it. However, there is an ambiguity in its structure and in the extent of multidisciplinary scientific collaboration of NST. This paper investigates the multidisciplinary patterns of Iranian research in NST based on a selection of 1,120 ISI??indexed articles published during 1974?C2007. Using text mining techniques, 96 terms were identified as the main terms of the Iranian publications in NST. Then the scientific structure of the Iranian NST was mapped through multidimensional scaling, based upon the co-occurrence of the main terms in the academic publications. The results showed that the NST domain in Iranian publications has a multidisciplinary structure which is composed of different fields, such as pure physics, analytical chemistry, chemistry physics, material science and engineering, polymer science, biochemistry and new emerging topics. 相似文献
12.
This study aims to improve the effectiveness of quality function deployment (QFD) in handling vague, subjective and limited information. QFD has long been recognised as an efficient planning and problem-solving tool that can translate customer requirements (CRs) into the technical attributes of a product or service. In traditional QFD analysis, however, vague and subjective information often leads to inaccurate priorities. To solve this problem, a novel group decision approach for more rationally prioritising the technical attributes is proposed. Two stages of analysis are described: the computation of CR importance, and the prioritisation of the technical attributes with a hybrid approach based on rough set theory (RST) and grey relational analysis (GRA). The approach integrates the strength of RST in handling vagueness with little a priori information and the merit of GRA in structuring an analytical framework and discovering the necessary information about data interactions. Finally, an application in industrial service design for a compressor rotor is presented to demonstrate the potential of the approach.
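The grey relational analysis (GRA) stage can be sketched as follows. This minimal illustration computes grey relational grades only; the reference profile, alternatives and distinguishing coefficient rho = 0.5 are assumptions, and the rough-set stage is omitted:

```python
def grey_relational_grades(reference, alternatives, rho=0.5):
    """Grey relational grade of each alternative against the reference.

    Sequences are assumed already normalised to a comparable scale."""
    # Absolute deviations of each alternative from the reference sequence.
    deltas = [[abs(r - x) for r, x in zip(reference, alt)] for alt in alternatives]
    flat = [d for row in deltas for d in row]
    d_min, d_max = min(flat), max(flat)
    grades = []
    for row in deltas:
        # Grey relational coefficient per component, then the mean grade.
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

ref = [1.0, 1.0, 1.0]   # ideal technical-attribute profile (hypothetical)
alts = [
    [0.9, 1.0, 0.8],    # technical attribute A
    [0.4, 0.5, 0.3],    # technical attribute B
]
grade_a, grade_b = grey_relational_grades(ref, alts)
```

Attributes are then ranked by grade, so A would be prioritised over B here.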
13.
Marwa Saidi Olfa Mannai Houcemeddine Hermassi Rhouma Rhouma Safya Belghith 《The Imaging Science Journal》2019,67(5):237-253
In this work, we propose a new adaptive chaotic steganographic method based on the Discrete Cosine Transform (DCT) and a reversible mapping function. The mapping function is used to map the secret bits into their corresponding symbols. This mapping technique has to preserve the same dynamics, properties and distribution as the original DCT coefficients. The novelty of our approach lies in the adaptive selection phase of embedding spots. This selection is established through a blindness condition applied over each image of the database. The proposed embedding scheme within the middle DCT coefficients shows a lower probability of detection and higher flexibility in extraction. We evaluate the detectability of our method using Ensemble Classifiers and a set of frequency- and spatial-domain feature extractors, such as the Spatial domain Rich Model (SRM) features, Chen et al.'s 486-dimensional inter- and intra-block Markov-based features and Liu's 216-dimensional adaptive steganography-based features.
14.
“Sequential pattern mining” is a prominent method for extracting knowledge from large databases. Common sequential pattern mining algorithms handle static databases. In practice, however, databases grow continuously, which motivates the design of mining algorithms that accommodate updates. Once the database is updated, the previous mining result becomes incorrect, and the entire mining process must otherwise be restarted on the updated sequential database. Incremental mining of sequential patterns avoids this rescanning of the entire database. Previous approaches are mostly Apriori-based frameworks. We propose an algorithm called STISPM for incremental mining of sequential patterns using a sequence tree as the storage structure. STISPM uses a depth-first approach along with backward tracking and a dynamic lookahead pruning strategy that removes infrequent patterns. Each path from the root node to a leaf node represents a sequential pattern in the database. This structural characteristic makes the sequence tree well suited to incremental sequential pattern mining. Because the sequence tree stores every sequential pattern together with its support count, whenever the support threshold changes, our algorithm can retrieve all the sequential patterns without mining the database again.
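The depth-first, tree-structured enumeration underlying such miners can be illustrated in miniature. This sketch covers only support counting and depth-first pattern growth, not STISPM's incremental machinery; the toy database and threshold are assumptions:

```python
def occurs_in(pattern, sequence):
    """True if pattern is an order-preserving subsequence of sequence."""
    pos = 0
    for event in sequence:
        if pos < len(pattern) and event == pattern[pos]:
            pos += 1
    return pos == len(pattern)

def frequent_sequences(db, min_support):
    """Depth-first enumeration of all frequent sequential patterns."""
    alphabet = sorted({e for seq in db for e in seq})
    results = {}

    def grow(prefix):
        for e in alphabet:
            candidate = prefix + (e,)
            support = sum(1 for seq in db if occurs_in(candidate, seq))
            if support >= min_support:
                results[candidate] = support
                grow(candidate)  # extend this branch of the sequence tree

    grow(())
    return results

db = [
    ["a", "b", "c"],
    ["a", "c", "b"],
    ["a", "b", "b"],
]
patterns = frequent_sequences(db, min_support=2)
```

Each recursive call corresponds to descending one edge of the sequence tree; an incremental miner would keep this tree (with its counts) between database updates instead of rebuilding it.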
15.
16.
Optimization and Engineering - Palletization, a core activity in warehousing and distribution, involves the solution of a three-dimensional bin packing problem with side constraints. This problem...
17.
Abderrahim HA Aoust T Malambu E Sobolev V Van Tichelen K De Bruyn D Maes D Haeck W Van den Eynde G 《Radiation protection dosimetry》2005,116(1-4 PT 2):433-441
Since 1998, SCK·CEN, in partnership with IBA s.a. and many European research laboratories, has been designing a multipurpose accelerator driven system (ADS) for Research and Development (R&D) applications, MYRRHA, and conducting an associated R&D support programme. MYRRHA is an ADS under development at Mol in Belgium, aiming to serve as a basis for the European experimental ADS providing protons and neutrons for various R&D applications. It consists of a proton accelerator delivering a 350 MeV × 5 mA proton beam to a liquid Pb-Bi spallation target, which in turn couples to a Pb-Bi cooled, subcritical fast core. In the first stage, the project focuses mainly on demonstration of the ADS concept, safety research on sub-critical systems and nuclear waste transmutation studies. In a later stage, the device will also be dedicated to research on structural materials, nuclear fuel, liquid metal technology and associated aspects, and on sub-critical reactor physics. Subsequently, it will be used for research on applications such as radioisotope production. A first preliminary conceptual design file of MYRRHA was completed by the end of 2001 and has been reviewed by an International Technical Guidance Committee, which concluded that there are no show stoppers in the project, even though some topics, such as the safety studies and fuel qualification, need to be addressed in more depth. In this paper, we report on the state of the art of the MYRRHA project at the beginning of 2004, and in particular on the radiation shielding assessment and the specific radiation protection aspects of a remote handling approach adopted to minimise personnel exposure to radiation.
18.
19.
Machine vision systems, which are extensively used in intelligent transportation applications such as traffic monitoring and automatic navigation, suffer from image instability caused by unstable environmental conditions. At the same time, the growing use of home video cameras, whose footage often contains unwanted camera movement caused by the operator's shaking hands, has drawn attention to video stabilisation algorithms. The video stabilisation process consists of three essential phases: global motion estimation, intentional motion estimation and motion compensation. Motion estimation is the main time-consuming part of the global motion estimation phase. Using motion vectors extracted directly from MPEG compressed video, instead of any special feature, increases the algorithm's generality. It also makes it possible to integrate the video stabilisation and video compression subsystems and to remove the block matching phase from the video stabilisation procedure. Eliminating iterative outlier-removal preprocessing and adaptively selecting motion vectors increase the speed of the algorithm. Although deterministic approaches are faster than the related probabilistic methods, they have essential problems escaping from local optima. For this reason, particle filters, which perform well on non-linear systems with non-Gaussian noise, are utilised. Setting the parameters of the particle filter with a fuzzy control system reduces incorrect removal of intentional camera motion. The proposed method is simulated and applied to the video stabilisation problem, and its high performance on various video sequences is demonstrated.
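The particle-filter idea can be illustrated with a minimal one-dimensional bootstrap filter tracking a single scalar state (e.g. a horizontal camera offset) from noisy observations. This is a generic sketch, not the paper's fuzzy-tuned stabiliser; the state model, noise levels and data are assumptions:

```python
import math
import random

def particle_filter(observations, n_particles=500, obs_std=1.0, seed=1):
    """Minimal bootstrap particle filter for a 1-D state with
    Gaussian observation noise and random-walk process noise."""
    rng = random.Random(seed)
    particles = [rng.uniform(-10.0, 10.0) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Predict: propagate each particle with small process noise.
        particles = [p + rng.gauss(0.0, 0.2) for p in particles]
        # Weight: Gaussian likelihood of the observation under each particle.
        weights = [math.exp(-((z - p) ** 2) / (2 * obs_std ** 2)) for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Estimate: posterior mean, then resample proportional to weight.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

true_offset = 3.0
obs_rng = random.Random(7)
observations = [true_offset + obs_rng.gauss(0.0, 1.0) for _ in range(30)]
estimates = particle_filter(observations)
```

In a stabiliser, the observations would be MPEG motion vectors and the fuzzy controller would adapt the noise parameters online.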
20.
Jos J. Amador 《International journal of imaging systems and technology》2003,13(4):201-207
This article presents a tractable and empirically accurate algorithm realizing a midlevel visual process for pattern recognition. The algorithm takes advantage of hypotheses provided by a high-level visual process, thereby attempting to extract a region in an image based on these hypotheses. The main focus is to recognize quadrilateral as well as arbitrarily shaped objects from synthetic and real-world images. The novel approach is based on a study of the Hough Transform and its generalized version. To show the overall usefulness of the algorithm, an extensive series of experiments was performed. In particular, occlusion and multiple object instances were tested, indicating the effectiveness of this work's approach.
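The standard Hough Transform that this approach builds on can be sketched as a voting procedure in (theta, rho) line-parameter space. The point set, resolution and vote threshold below are illustrative assumptions, and the generalized version is not shown:

```python
import math

def hough_lines(points, n_theta=180, rho_res=1.0, threshold=4):
    """Minimal Hough transform: each point votes for every (theta, rho)
    line through it; cells reaching the threshold are returned."""
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            key = (t, round(rho / rho_res))  # quantised accumulator cell
            acc[key] = acc.get(key, 0) + 1
    return {k: v for k, v in acc.items() if v >= threshold}

# Five collinear points on the vertical line x = 2, plus one outlier.
pts = [(2, 0), (2, 1), (2, 2), (2, 3), (2, 4), (7, 9)]
peaks = hough_lines(pts, threshold=5)
```

The accumulator peak at theta index 0, rho bin 2 recovers the line x = 2 despite the outlier; occlusion robustness comes from the fact that missing points merely lower a peak rather than destroy it.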