Similar Documents
20 similar documents found.
1.
2.
3.
The new composite material Textile Reinforced Concrete (TRC) is a promising development that may open up entirely new fields of application for concrete as a construction material. The more filigree structures with high-quality surfaces that TRC makes possible are attractive to architects and give engineers more freedom in design. However, the use of TRC requires design rules, which are currently being developed at RWTH Aachen University, Germany. This article describes recent experimental results as well as modeling techniques.

4.
Lu Lu, Quality Engineering, 2019, 31(1): 129-132
Abstract

We begin by congratulating the authors on insightful and interesting research on the design of reliability experiments. The authors are particularly effective in characterizing the differences between reliability experiments and classical experiments, namely the presence of non-normal responses and censored observations. Although these characteristics complicate the analysis of reliability experiments, the authors show convincingly that ideas from the design of classical experiments can be borrowed to obtain well-performing methods. As indicated in Section 4 of Dr. Freeman's article, incorporating degradation data is useful in reliability analysis. In our discussion, we aim to amplify this point, with a particular focus on degradation tests, an important class of reliability tests for estimating lifetime distributions.
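For illustration only (not the authors' method), a minimal degradation-test sketch in Python: fit a straight-line degradation path to each unit, extrapolate the time at which the fitted path crosses an assumed failure threshold, and treat the resulting pseudo-failure times as lifetime data. The measurements, threshold, and linear path model below are all hypothetical.

```python
import numpy as np

# Hypothetical degradation data: each row is one unit's degradation
# measurement (e.g., percent wear) at the shared inspection times.
times = np.array([100.0, 200.0, 300.0, 400.0])          # hours
paths = np.array([[1.1, 2.3, 3.2, 4.4],                  # unit 1
                  [0.9, 1.8, 2.9, 3.7],                  # unit 2
                  [1.4, 2.6, 3.9, 5.1]])                 # unit 3
threshold = 10.0  # assumed soft-failure threshold on the degradation scale

pseudo_failure_times = []
for y in paths:
    # Fit a straight-line degradation path y(t) = a + b*t by least squares.
    b, a = np.polyfit(times, y, 1)
    # Pseudo-failure time: when the fitted path crosses the threshold.
    pseudo_failure_times.append((threshold - a) / b)

# The pseudo-failure times can now be treated as complete lifetime data,
# e.g., fitted with a lognormal or Weibull distribution.
print(np.round(pseudo_failure_times, 1))
```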

5.
Abstract

The mixture-of-mixtures (MoM) experiment is different from the classical mixture experiment in that the mixture component in MoM experiments, known as the major component, is made up of subcomponents, known as the minor components. In this article, we propose an additive heredity model (AHM) for analyzing MoM experiments. The proposed model considers an additive structure to inherently connect the major components with the minor components. To enable a meaningful interpretation for the estimated model, the hierarchical and heredity principles are applied by using the nonnegative garrote technique for model selection. The performance of the AHM was compared to several conventional methods in both unconstrained and constrained MoM experiments. The AHM was then successfully applied in two real-world problems studied previously in the literature. Supplementary materials for this article are available online.
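As a rough, generic illustration of the nonnegative garrote used for model selection (a sketch of the technique, not the AHM itself): each ordinary least-squares coefficient is shrunk by a nonnegative factor chosen under a budget constraint, and terms whose factors reach zero drop out; heredity can then be enforced by tying an interaction's factor to those of its parent terms. The data and the budget below are made up.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical data: the response depends on x0 and x1 but not x2.
n = 60
X = rng.uniform(0.0, 1.0, size=(n, 3))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.standard_normal(n)

beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
Z = X * beta_ols          # column j is x_j scaled by its OLS estimate

def garrote_loss(c):
    r = y - Z @ c
    return r @ r

budget = 2.0              # assumed garrote budget; tune by cross-validation
res = minimize(garrote_loss, x0=np.ones(3), method="SLSQP",
               bounds=[(0.0, None)] * 3,
               constraints=[{"type": "ineq",
                             "fun": lambda c: budget - c.sum()}])

c = res.x                 # shrinkage factors; (near-)zero entries drop terms
print(np.round(c, 3), np.round(c * beta_ols, 3))
```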

6.
Abstract

The nearest neighbor (NN) search problem has wide applications. In vector quantization (VQ), both the codebook generation phase and the encoding phase (using the codebook just generated) often need NN search. A poorly designed search algorithm makes the complexity grow quickly as the vector dimensionality k or the codebook size N increases. In this paper, a fast NN search method is proposed, which can then accelerate the LBG codebook generation process for VQ design. The method modifies and improves the LAESA method: unlike LAESA, it uses k/2 "fixed" points (allocated far from the data) together with the origin as the k/2+1 reference points to reduce the search area. The memory overhead is only linearly proportional to N and k, and the time complexity, including the overhead, is of order O(kN). According to our experiments, the proposed algorithm reduces the time burden while the distortion remains identical to that of the full search.
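The LAESA-style pruning that underlies such methods can be sketched generically (an illustration, not the paper's exact algorithm): precompute the distance from every codeword to a few reference points, then skip a codeword whenever the triangle-inequality lower bound max over r of |d(q, r) - d(c, r)| already meets or exceeds the best distance found so far. The reference-point placement below is arbitrary.

```python
import numpy as np

def build_tables(codebook, refs):
    # Precompute d(c, r) for every codeword c and reference point r.
    return np.linalg.norm(codebook[:, None, :] - refs[None, :, :], axis=2)

def nn_search(query, codebook, refs, ref_table):
    dq = np.linalg.norm(refs - query, axis=1)      # d(q, r) for each r
    best_i, best_d = 0, np.linalg.norm(codebook[0] - query)
    for i in range(1, len(codebook)):
        # Triangle inequality: d(q, c) >= |d(q, r) - d(c, r)| for any r,
        # so if the bound beats best_d the full distance is never computed.
        if np.max(np.abs(dq - ref_table[i])) >= best_d:
            continue
        d = np.linalg.norm(codebook[i] - query)
        if d < best_d:
            best_i, best_d = i, d
    return best_i, best_d

rng = np.random.default_rng(1)
codebook = rng.standard_normal((256, 16))          # N = 256, k = 16
refs = 10.0 * np.eye(16)[:8]                       # 8 reference points, far out
table = build_tables(codebook, refs)
print(nn_search(rng.standard_normal(16), codebook, refs, table))
```

Because the bound can never discard the true nearest neighbor, the result matches a full search while many distance computations are skipped.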

7.
Abstract

Many practical situations involve both a quality characteristic and a reliability characteristic, with the goal of finding an appropriate compromise in the optimum conditions. Standard analyses of quality and reliability characteristics in designed experiments usually assume a completely randomized design. However, many experiments involve restrictions on randomization, e.g., subsampling, blocking, or split-plot structures.

This article considers an experiment involving both a quality characteristic and a reliability characteristic (lifetime) within a subsampling protocol. The particular experiment uses Type I censoring for the lifetime. Previous work on analyzing reliability data within a subsampling protocol assumed Type II censoring; this article extends the analysis to Type I censoring. The method then uses a desirability function approach combined with the Pareto front to obtain a trade-off between the quality and reliability characteristics. A case study illustrates the methodology.
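The desirability-function step can be pictured with a toy sketch (made-up targets and predicted responses, not the case study's data): each response is mapped onto a 0-1 desirability scale, and candidate settings, e.g., points on the Pareto front, are ranked by the geometric mean of their individual desirabilities.

```python
import numpy as np

def d_larger_is_better(y, lo, hi):
    # Derringer-Suich style one-sided desirability: 0 below lo, 1 above hi.
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0)

# Hypothetical predicted responses at three candidate settings:
# (mean quality characteristic, mean lifetime in hours).
candidates = {"A": (82.0, 950.0), "B": (90.0, 700.0), "C": (86.0, 860.0)}

for name, (quality, life) in candidates.items():
    d_q = d_larger_is_better(quality, lo=75.0, hi=95.0)
    d_l = d_larger_is_better(life, lo=600.0, hi=1000.0)
    overall = np.sqrt(d_q * d_l)   # geometric mean of the two desirabilities
    print(name, round(float(overall), 3))
```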

8.
Abstract

When modeling the reliability of a system or component, it is not uncommon for more than one expert to provide very different prior estimates of the expected reliability as a function of an explanatory variable such as age or temperature. Our goal is to incorporate all information from the experts when choosing a design for which units to test. Bayesian design of experiments has been shown to be very successful for generalized linear models, including logistic regression models. We use this approach to develop methodology for the case where several potentially non-overlapping priors are under consideration. While multiple priors have been used for analysis in the past, they have not previously been used in a design context. The Weighted Priors method performs well across a broad range of true underlying model parameters and is more robust than other reasonable design choices. We illustrate the method through multiple scenarios and a motivating example. Additional figures for this article are available in the online supplementary information.
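A toy version of a weighted-priors design criterion (sketched under assumed normal priors, not the authors' exact formulation): for a simple logistic regression, draw parameters from each expert's prior, average a log-determinant (D-optimality) score over the draws, and combine the per-expert averages with weights to rank candidate test plans.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_det_info(design_x, beta):
    # Fisher information for logistic regression with model matrix [1, x].
    X = np.column_stack([np.ones_like(design_x), design_x])
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = p * (1.0 - p)
    M = X.T @ (W[:, None] * X)
    return np.linalg.slogdet(M)[1]

def weighted_score(design_x, priors, weights, n_draws=500):
    score = 0.0
    for (mean, sd), w in zip(priors, weights):
        draws = rng.normal(mean, sd, size=(n_draws, 2))
        score += w * np.mean([log_det_info(design_x, b) for b in draws])
    return score

# Two experts with non-overlapping priors on (intercept, slope).
priors = [(np.array([-2.0, 1.0]), np.array([0.3, 0.2])),
          (np.array([1.0, -0.5]), np.array([0.3, 0.2]))]
weights = [0.5, 0.5]

# Compare a spread-out design against a two-level design.
for design in (np.linspace(-1, 1, 8), np.array([-1.0] * 4 + [1.0] * 4)):
    print(round(weighted_score(design, priors, weights), 3))
```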

9.
Abstract

Quality Function Deployment (QFD) is a systematic process for capturing and integrating the voice of the customer into every aspect of the design and delivery of products and services. Understanding customer wants and needs is crucial to the successful design and development of new products and services. QFD uses customer demand information to design products or services that will fulfill a client's mission, and it prioritizes and deploys these customer-driven characteristics throughout product or service development to meet customer needs, wants, and expectations. QFD determines effective development targets for the prioritized product and service characteristics. The QFD process has been used and documented extensively in product development; the service industry, however, lags in applying it. The purpose of this article is to show practitioners and researchers how this process, in its entirety, can be used as a planning process to link customer requirements and service characteristics in the service industry. A case study was developed in which QFD was applied to develop recommendations for the American Society for Engineering Management (ASEM) in an effort to increase customer satisfaction and identify opportunities to improve member benefits. The results of this study are applicable to any organization seeking to improve the design and delivery of products and services, regardless of industry.

10.
Abstract

Implementation of lean practices results in significant changes in an organization. Current research in lean production primarily focuses on examining relationships between the implementation of lean production (LP) and organizational performance; however, there is also a need to assess the implications of large-scale changes such as lean production for work design characteristics and employee outcomes. The purpose of this article is to study the impact of lean production on work design characteristics, such as autonomy, task identity, and skill variety, and on employee outcomes. The article proposes a conceptual framework that identifies key LP practices and their influence on work characteristics and employee outcomes.

Current research in LP identifies transformations in lean models inspired by sociotechnical principles. This article presents a comprehensive literature review evaluating LP against sociotechnical design principles. This evaluation contributes to developing insight into the proposed framework of work design characteristics in a lean environment. Finally, a causal loop diagram is used to derive theoretical implications of the proposed conceptual framework.

11.
Abstract

Low frequency dielectric spectroscopy (LFDS) is an analytical technique that has found considerable application in the study of pharmaceutical systems. This article outlines the theoretical and practical aspects of the method and discusses the advantages and disadvantages of the technique. Examples are given of how LFDS may be used in the analysis of pharmaceutical systems, including studies on solid dispersions, inter-batch variation, liposome suspensions, and cyclodextrins.

12.
Abstract

Engineers use reliability experiments to determine the factors that drive product reliability, build robust products, and predict reliability under use conditions. This article uses recent testing of a howitzer to illustrate the challenges in designing reliability experiments for complex, repairable systems. We review research in complex system reliability models, failure-time experiments, and experimental design principles. We highlight the need for reliability experiments that account for various intended uses and environments. We leverage lessons learned from current research and propose methods for designing an experiment for a complex, repairable system.

13.
Abstract

During the last decades, we have evolved from measuring a few process variables at sparse intervals to a situation in which a multitude of variables are measured at high speed. This evidently provides opportunities to extract more information from processes and to pinpoint out-of-control situations, but transforming the large data streams into valuable information remains a challenging task. In this contribution we focus on the analysis of time-dependent processes, since this is the scenario most often encountered in practice, owing to high-frequency sampling systems and the natural dynamics of many real-life applications. The modeling and monitoring challenges that statistical process monitoring (SPM) techniques face in this situation are described, and possible routes forward are provided. Simulation results as well as a real-life data set are used throughout the article.
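One standard route for monitoring such autocorrelated data, shown here as a minimal sketch with simulated data rather than the article's specific proposal, is to fit a time-series model on an in-control reference period and chart the one-step-ahead residuals with an EWMA:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate an AR(1) process with a small mean shift after t = 150.
phi, n = 0.7, 300
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal() + (0.8 if t >= 150 else 0.0)

# Estimate phi from a clean reference period, then chart the residuals.
ref = x[:100]
phi_hat = np.dot(ref[1:], ref[:-1]) / np.dot(ref[:-1], ref[:-1])
resid = x[1:] - phi_hat * x[:-1]

lam, sigma = 0.2, resid[:99].std(ddof=1)
limit = 3.0 * sigma * np.sqrt(lam / (2.0 - lam))   # asymptotic EWMA limit
z, alarms = 0.0, []
for t, r in enumerate(resid, start=1):
    z = lam * r + (1.0 - lam) * z
    if abs(z) > limit:
        alarms.append(t)
print(alarms[:5])   # first alarm times, expected shortly after t = 150
```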

14.

This article describes an emerging approach to the design of human-machine systems referred to as 'neuroadaptive interface technology'. A neuroadaptive interface is an ensemble of computer-based displays and controls whose functional characteristics change in response to meaningful variations in the user's cognitive and/or emotional states. Variations in these states are indexed by corresponding central nervous system activity, which controls functionally adaptive modifications to the interface. The purpose of these modifications is to promote safer and more effective human-machine system performance. While fully functional adaptive interfaces of this type do not currently exist, promising steps are being taken toward their development, and there is great potential value in doing so: value that corresponds directly to, and benefits from, a neuroergonomic approach to systems development. Specifically, it is argued that the development of these systems will greatly enhance overall human-machine system performance by providing more symmetrical communications between users and computer-based systems than currently exist. Furthermore, their development will promote a greater understanding of the relationship between nervous system activity and human behaviour (specifically work-related behaviour), and as such may serve as an exemplary paradigm for neuroergonomics. A number of current research and development areas related to neuroadaptive interface design are discussed, and challenges associated with the development of this technology are described.

15.
Abstract

Product management as an organizational design has been around for decades as companies continually strive for competitive advantage. The goal is to create a system whereby new products are developed, and existing products managed, by an individual (or team) whose primary responsibility is the business success of those products. The person with the title product manager is expected to understand the technical side of the business as it relates to product design and manufacture, as well as the business side as it relates to satisfying customer needs profitably. However, many people are hired into product manager positions without a clear understanding of the skills and abilities required to be successful. Five specific product manager competencies (along with 18 subcategories) are described in this article, with a final tool for developing a product manager scorecard. The intent of this article is to help engineers who are moving into product management positions better understand the requirements of the new job.

16.
Abstract

Future applications are envisioned in which a single human operator manages multiple heterogeneous unmanned vehicles (UVs) by working together with an autonomy teammate that consists of several intelligent decision-aiding agents/services. This article describes recent advancements in developing a new interface paradigm that will support human-autonomy teaming for air, ground, and surface (sea craft) UVs in defence of a military base. Several concise and integrated candidate control station interfaces are described by which the operator determines the role of autonomy in UV management using an adaptable automation control scheme. An extended play-calling-based control approach is used to support human-autonomy communication and teaming in managing how UV assets respond to potential threats (e.g., asset allocation, routing, and execution details). The design process for the interfaces is also described, including analysis of a base defence scenario used to guide this effort, consideration of ecological interface design constructs, and generation of UV and task-related pictorial symbology.

17.
Abstract

In feedback stabilizing compensator design for linear systems, mere BIBO (bounded-input-bounded-output) stability is not sufficient, since it provides no margin on the decay rate; e.g., the compensated feedback system may still be insufficiently stable. In this paper, we define the notion of σ-stability and give a complete parametrization of all σ-stabilizing compensators in terms of a free design parameter via the fractional representation approach. Any σ-stabilizing compensator guarantees a decay rate margin σ for the compensated system. We also use state-space techniques to obtain the required factors in this parametrization of all σ-stabilizing compensators.
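For orientation, σ-stability is commonly defined along the following lines (a standard textbook-style definition sketched here for context; the paper's precise fractional-representation setting may differ):

```latex
% \sigma-stability: all poles lie strictly to the left of the line
% Re(s) = -\sigma, so every mode decays at least as fast as e^{-\sigma t}.
\[
  G \text{ is } \sigma\text{-stable}
  \;\Longleftrightarrow\;
  \operatorname{Re}(p) < -\sigma \ \text{for every pole } p \text{ of } G
  \;\Longleftrightarrow\;
  G(s-\sigma) \text{ is stable.}
\]
```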

18.
The multi-objective optimization of multiple geostationary spacecraft refuelling is investigated in this article. A servicing spacecraft (SSc) and a propellant depot (PD), both parked initially in geostationary Earth orbit (GEO), are used to refuel multiple GEO targets of known propellant demand. The capacitated SSc rendezvouses with fuel-deficient GEO targets to refuel them, or with the PD to be refuelled itself. The multiple geostationary spacecraft refuelling problem is treated as a multi-variable combinatorial optimization problem with the principal objectives of minimizing propellant consumption and mission duration. A two-level optimization model is built, in which the design variables are the refuelling order X, the refuelling time T, and the binary decision variable S. The non-dominated sorting genetic algorithm is employed to solve the upper-level optimization problem; for the lower-level optimization, an exact algorithm is proposed. Finally, numerical simulations are presented to illustrate the effectiveness and validity of the proposed approach.
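The non-dominated sorting at the heart of NSGA-style algorithms can be illustrated in a few lines (a generic sketch with made-up objective values, not the article's two-level model):

```python
def dominates(a, b):
    # a dominates b if it is no worse in every objective and better in one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    fronts, remaining = [], list(range(len(points)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Hypothetical (propellant consumption, mission duration) pairs.
objs = [(120.0, 30.0), (100.0, 45.0), (130.0, 28.0), (110.0, 40.0), (125.0, 33.0)]
print(non_dominated_sort(objs))   # index lists, best (Pareto) front first
```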

19.
Abstract

For a few years, a methodology for tablet formulation has been available, based on experimental planning of rational development work using an instrumented tablet machine. It is also possible to organize scientific studies mathematically so that the best result is reached as fast as possible, i.e., by carrying out an optimal number of experiments yielding the maximum amount of information: this is the application of statistical experimental designs. The aim of this study is to combine these two concepts.

The main principle is to study the variation of well-chosen responses (tablet crushing strength, etc.) as a function of the percentages of the vehicles chosen as variables (diluent, disintegration agent, etc.) while the mixing methodology remains fixed. An experimental design is built, a well-defined number of experiments is carried out, and the responses are then expressed as polynomial functions of the vehicle percentages. The best mathematical models are determined statistically by multilinear regression and used to plot the response surfaces. This mathematical treatment, combined with the pharmacist's objectives, allows the optimal formula to be determined. By then carrying out the theoretically optimal experiment, the agreement between experiment and model can be verified.

A first approach with wet granulation revealed the difficulty of fixing all the parameters, but it already demonstrated the advantages of experimental designs and showed that the technique can be very helpful in building the experimental plan.

With direct compression, two experimental designs were built to determine the optimal formula as fast as possible: a Scheffé design and a McLean and Anderson design.

When the constraints on vehicle percentages lead to a triangular experimental region, Scheffé designs are recommended. For any other configuration, McLean and Anderson designs are useful.

The responses were:

1. Y1/D, where Y1 is the maximal pressure measured at the upper punch and D the crushing strength of the tablet.

2. (V10 - V500), where V10 and V500 are the volumes of 100 grams of the final powder mix after 10 and 500 tamping taps given by a standardized apparatus; their difference should not exceed 20 ml.

3. td, the disintegration time of the tablets, measured by the European Pharmacopoeia test.

In both cases, the results were excellent, judging by the differences between the response values given by the mathematical models and the corresponding experimental values.
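For illustration, a Scheffé {3, 2} simplex-lattice design and the corresponding quadratic mixture polynomial can be generated as follows (a generic mixture-design sketch; the response values are invented, not the study's data):

```python
import itertools
import numpy as np

def simplex_lattice(q, m):
    # Scheffé {q, m} simplex-lattice: proportions are multiples of 1/m summing to 1.
    pts = [np.array(c) / m
           for c in itertools.product(range(m + 1), repeat=q)
           if sum(c) == m]
    return np.array(pts)

design = simplex_lattice(3, 2)   # 6 runs for a quadratic model in 3 components

def quadratic_mixture_matrix(X):
    # Scheffé quadratic model: sum_i b_i x_i + sum_{i<j} b_ij x_i x_j (no intercept).
    cross = [X[:, i] * X[:, j]
             for i, j in itertools.combinations(range(X.shape[1]), 2)]
    return np.column_stack([X] + cross)

# Hypothetical measured response (e.g., tablet crushing strength) at each run.
y = np.array([4.2, 3.1, 5.0, 3.8, 4.9, 4.4])
Z = quadratic_mixture_matrix(design)
coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
print(design)
print(np.round(coef, 2))        # fitted blending and interaction coefficients
```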

20.
Tao Wen, Journal of the Chinese Institute of Engineers, 2013, 36(6): 523-529
Abstract

Large-gap magnetic suspension is a conceptual experimental design that can be used to investigate technical issues associated with magnetic suspension and accurate suspended-load control over large gaps. The traditional linear control strategy has significant limits to its control region, usually around the chosen operating points. As the suspension gap increases, the nonlinearity and time delay of the system become more serious. Additionally, the interaction of an estimation scheme and the corresponding controller has not been fully examined in the literature. This paper examines a compound control method, Smith-predictor compensation combined with proportional-integral-derivative (PID) control, as a means of advancing the achievable performance of magnetic suspension. We designed and built a laboratory testbed to determine the feasibility of using the proposed methods in these types of applications. The analytical results are supported by simulations and experiments, showing our approach to be fairly successful at providing satisfactory motion-control performance under conditions involving a large air gap.
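The compound scheme can be pictured with a minimal discrete-time sketch of Smith-predictor dead-time compensation wrapped around a PID loop; the first-order plant, delay, and gains below are invented for illustration, not the paper's testbed model.

```python
# Hypothetical first-order plant with dead time: x[k+1] = a*x[k] + b*u[k-d].
a, b, d, dt = 0.95, 0.05, 10, 0.01
kp, ki, kd = 2.0, 1.0, 0.1

setpoint, n = 1.0, 600
u_hist = [0.0] * d                 # input buffer modelling the transport delay
u_model_hist = [0.0] * d
x = x_model = x_model_delayed = 0.0
integ, prev_err = 0.0, 0.0

for k in range(n):
    # Smith predictor: feed back the undelayed model output plus the
    # mismatch between the real (delayed) output and the delayed model.
    feedback = x_model + (x - x_model_delayed)
    err = setpoint - feedback
    integ += err * dt
    u = kp * err + ki * integ + kd * (err - prev_err) / dt
    prev_err = err

    # Real plant (with delay) and internal model (with and without delay).
    u_hist.append(u)
    x = a * x + b * u_hist.pop(0)
    u_model_hist.append(u)
    x_model = a * x_model + b * u
    x_model_delayed = a * x_model_delayed + b * u_model_hist.pop(0)

print(round(x, 3))   # should approach the setpoint despite the dead time
```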
