Similar Documents
20 similar documents found (search time: 46 ms)
1.
We have developed a taxonomy that classifies those needs of a corporation that impact product design. We call these needs corporate requirements. In contrast to the consumer or end-user requirements, corporate requirements come from internal sources such as marketing, finance, manufacturing, and service. This taxonomy allows for an organized method of gathering, managing, and retrieving the requirements. The taxonomy also helps to facilitate a broader, clearer form of Quality Function Deployment. Generic in nature, this taxonomy provides a template with which to create taxonomies for a given product within a given company or industry. We include an industrial case study to demonstrate this concept.

2.
In this study, a chemoresistive sensor was fabricated by the chemical polymerization and coating of either polyaniline (PANI), poly[2-methoxy-5-(2-ethyloxy)-p-phenylenevinylene], or commercial poly(methyl methacrylate) on MWNTs. We investigated the resistance response of the multilayer samples to simulated chemical warfare agents, including dimethyl methyl phosphonate (DMMP) and dichloromethane (DCM), as well as to organic agents such as chloroform, tetrahydrofuran, methyl ethyl ketone, and xylene. The MWNTs-PANI film was characterized by SEM and FT-IR, and the resistivity values for the six solvents were measured at different temperatures. We observed that the MWNTs-PANI sensing film exhibited high sensitivity, excellent selectivity, and good reproducibility in detecting all of the aforementioned agent vapors. In addition, we used atomic force microscopy to demonstrate the absorption of DMMP vapor by the MWNTs-PANI film, which exhibited a swelling phenomenon in which the film thickness increased from 0.8 to 1.3 μm. Finally, we used principal component analysis to evaluate the performance of the sensor in detecting DMMP, DCM, and the aforementioned organic agent vapors.
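The abstract names principal component analysis but gives no implementation. As a hedged sketch (the response values and analyte labels below are invented, not the paper's data), PCA on sensor-response vectors can be computed via SVD of the mean-centered response matrix:

```python
import numpy as np

# Hypothetical resistance-response vectors (rows: analytes, columns:
# sensing conditions). All values are invented for illustration.
X = np.array([
    [0.82, 0.10, 0.05, 0.30],   # DMMP
    [0.20, 0.75, 0.40, 0.10],   # DCM
    [0.15, 0.20, 0.70, 0.60],   # xylene
])

# PCA via singular value decomposition of the mean-centered data.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                      # sample coordinates in PC space
explained = s**2 / np.sum(s**2)         # variance fraction per component

print(scores[:, :2])                    # 2-D map used for cluster plots
print(explained)
```

The scores on the two leading components would then be plotted to see whether the agent vapors separate into distinct clusters.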

3.
This paper questions the prevailing notions that firms within industrial clusters have privileged access to “tacit knowledge” that is unavailable—or available only at high cost—to firms located elsewhere, and that such access provides competitive advantages that cause the growth and development of both firms and regions. It outlines a model of cluster dynamics emphasizing two mutually interdependent processes: the concentration of specialized and complementary epistemic communities, on the one hand, and entrepreneurship and a high rate of new firm formation on the other.

4.
Decellularized heart valve scaffolds possess many desirable properties for valvular tissue engineering. However, their current applications are limited by poor durability, susceptibility to structural dysfunction, and residual immunogenicity. Although crosslinking with chemical reagents such as glutaraldehyde (GA) enhances the mechanical properties, the low long-term stability and cytotoxicity of the resulting scaffolds remain potential problems. Nordihydroguaiaretic acid (NDGA) is a bioactive natural product that crosslinks collagen and has proven effective in preparing scaffolds for tendon tissue engineering. In this paper, NDGA-crosslinked decellularized heart valve scaffolds demonstrated higher tensile strength, enzymatic hydrolysis resistance, and storage stability than non-crosslinked ones. Their mechanical properties and cytocompatibility were superior to those of the GA-crosslinked heart valve matrix. Below a concentration of 10 μg/ml, NDGA had no visible cytotoxic effect on either endothelial cells (ECs) or valvular interstitial cells (VICs), and its cytotoxicity was much lower than that of GA. The LC50 (50% lethal concentration) values of NDGA on ECs and VICs were 32.6 μg/ml and 47.5 μg/ml, respectively, almost 30 times higher than the corresponding values for GA (P < 0.05). ECs attached to and maintained normal morphology on the surface of NDGA-crosslinked valvular scaffolds but not on GA-crosslinked ones. This study demonstrates that NDGA crosslinking of decellularized valvular matrix is a promising approach for preparing heart valve tissue engineering scaffolds.

5.
A major challenge facing the development of tissue-engineered vascular grafts (TEVGs), promising living replacements for diseased vascular structures, is enhancing angiogenesis. To promote rapid vascularization, endothelial cells (ECs) were co-cultured with smooth muscle cells (SMCs) in decellularized small intestinal submucosa scaffolds to regenerate angiogenic TEVGs (A-TEVGs). Observation of the A-TEVGs at 1 month post-implantation revealed that a rich network of neocapillaries lining the blood vessel wall had developed; that the ECs of the neovasculature had been derived from previously seeded ECs and from later-invading ECs of the host’s vascular bed; that tissue vascularization had not significantly impaired mechanical properties; and that the maximal tensile strength of the A-TEVGs was of the same order of magnitude as that of native porcine femoral arteries. These results indicate that co-culturing ECs with SMCs can enhance vascularization of TEVGs in vivo, possibly increasing graft perfusion and host integration.

6.
Internet technology is an indispensable tool in scientific research. Prior research confirms the importance of professional activities, professional networks, scientific collaboration and the internet among scientists, academics and researchers. In other words, professional activities, networks and collaboration are relevant epistemic strategies in both the short- and long-term objectives of knowledge production. Variations in these strategies are possible across different categories such as race and gender. Involving academics and scientists (n = 204) from sampled institutions in post-apartheid South Africa, this study examines how the use of technology by people in different racial categories influences their epistemic strategies of professional activities, networks and scientific collaboration.

7.
Quantifying the importance degree of engineering characteristics (ECs) is an essential problem in quality function deployment. In real-world scenarios, it is sometimes difficult to evaluate the correlation degree between ECs and customer requirements (CRs) directly because the ECs are too abstract. The target ECs therefore have to be further decomposed into several more detailed basic ECs organised in a multi-level hierarchical structure. This paper investigates the quantification problem for the importance degree of such target ECs and tackles two critical issues. The first is how to deal with the uncertainties, including fuzziness and incompleteness, involved in the evaluation process. A fuzzy evidential reasoning algorithm-based approach is proposed to tackle this issue and derive the correlation degree between each basic EC and the whole set of CRs. The second is how to deal with the interactions among the basic ECs decomposed from the same target EC during the aggregation process. A λ-fuzzy measure and fuzzy discrete Choquet integral-based approach is proposed to tackle this issue and aggregate these basic ECs. The final importance degrees of the target ECs can then be obtained. A case study is presented at the end of the paper to verify the feasibility and effectiveness of the proposed method.
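The paper's fuzzy evidential reasoning and fuzzy Choquet formulation is not reproduced in the abstract. The sketch below shows only the crisp core of the aggregation step it names: a λ-fuzzy (Sugeno) measure and a discrete Choquet integral, with invented densities and correlation scores standing in for the basic ECs:

```python
from functools import reduce

def solve_lambda(densities, tol=1e-10):
    """Solve 1 + lam = prod(1 + lam*g_i) for the Sugeno parameter lam."""
    def f(lam):
        return reduce(lambda p, g: p * (1 + lam * g), densities, 1.0) - (1.0 + lam)
    s = sum(densities)
    if abs(s - 1.0) < tol:
        return 0.0                         # additive measure
    lo, hi = (-1.0 + tol, -tol) if s > 1 else (tol, 1e6)
    for _ in range(200):                   # bisection on the nonzero root
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

def choquet(scores, densities):
    """Discrete Choquet integral of scores w.r.t. the lambda-fuzzy measure."""
    lam = solve_lambda(densities)
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    total, g = 0.0, 0.0
    for i in order:                        # add criteria in descending score
        g_new = g + densities[i] + lam * g * densities[i]
        total += scores[i] * (g_new - g)
        g = g_new
    return total

# Invented example: three basic ECs with interaction (densities sum < 1).
print(choquet([0.9, 0.4, 0.6], [0.3, 0.4, 0.2]))
```

Because the densities sum to less than 1, λ is positive and the measure rewards criteria that score well jointly; the resulting value always lies between the minimum and maximum score.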

8.
Pure iron is a candidate material for coronary artery stents because of its biodegradable and nontoxic properties. However, the degradation characteristics of pure iron in vivo and in vitro are not yet clear. The purpose of the work described here was to determine the degradation rate of pure iron in vitro and to characterize the interaction of individual corrosion products from biocorrodible iron stents with endothelial cells (ECs) from the adjacent tissue. Pure iron was immersed in simulated body fluid (SBF) solution and the mass loss was measured. The response of human ECs to various concentrations of ferrous ions was investigated using the WST-8 assay. The results demonstrate that the mean degradation rate of iron in vitro is about 20.4 μg/(cm²·h). Lower iron concentrations (<10 μg/ml) may have a favorable effect on the metabolic activity of ECs, whereas very high iron ion concentrations (>50 μg/ml) may be cytotoxic to ECs.
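As a worked arithmetic check of how a mean degradation rate of this form is obtained from immersion testing (the specimen area, immersion time, and total mass loss below are invented; only the ~20.4 μg/(cm²·h) figure comes from the abstract):

```python
# Mass-loss degradation rate: rate = Δm / (A · t).
mass_loss_ug = 9792.0    # assumed total mass loss over the test [μg]
area_cm2 = 2.0           # assumed exposed specimen area [cm²]
time_h = 240.0           # assumed immersion time: 10 days [h]

rate = mass_loss_ug / (area_cm2 * time_h)
print(rate)              # → 20.4 μg/(cm²·h)
```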

9.
In this paper, we examine the role of lies in human social relations by implementing some salient characteristics of deceptive interactions into an opinion formation model, so as to describe the dynamical behaviour of a social network more realistically. In this model, we take into account such basic properties of social networks as the dynamics of the intensity of interactions, the influence of public opinion and the fact that in every human interaction it might be convenient to deceive or withhold information depending on the instantaneous situation of each individual in the network. We find that lies shape the topology of social networks, especially the formation of tightly linked, small communities with loose connections between them. We also find that agents with a larger proportion of deceptive interactions are the ones that connect communities of different opinion, and, in this sense, they have substantial centrality in the network. We then discuss the consequences of these results for the social behaviour of humans and predict the changes that could arise due to a varying tolerance for lies in society.

10.
Optimization leads to specialized structures which are not robust to disturbance events such as unanticipated abnormal loading or human error. Typical reliability-based and robust optimization mainly address objective aleatory uncertainties. To date, the impact of subjective epistemic uncertainties on optimal design has not been comprehensively investigated. In this paper, we use an independent parameter, the latent failure probability, to investigate the effects of epistemic uncertainties on optimal design. Reliability-based and risk-based truss topology optimization are addressed. It is shown that optimal risk-based designs can be divided into three groups: (A) when epistemic uncertainty is small (in comparison to aleatory uncertainty), the optimal design is indifferent to it and yields isostatic structures; (B) when aleatory and epistemic uncertainties are both relevant, optimal design is controlled by epistemic uncertainty and yields hyperstatic but nonredundant structures, for which the expected costs of direct collapse are controlled; and (C) when epistemic uncertainty becomes too large, the optimal design becomes redundant as a way to control increasing expected costs of collapse. The three regions are separated by hyperstatic and redundancy thresholds. The redundancy threshold is the point at which the structure needs to become redundant so that its reliability exceeds the latent reliability of the simplest isostatic system. Simple truss topology optimization is considered herein, but the conclusions have immediate relevance to the optimal design of realistic structures subject to aleatory and epistemic uncertainties.

11.
Impaired mass transfer characteristics of blood-borne vasoactive species such as adenosine triphosphate in regions such as an arterial bifurcation have been hypothesized as a prospective mechanism in the aetiology of atherosclerotic lesions. Arterial endothelial cells (ECs) and smooth muscle cells (SMCs) respond differentially to altered local haemodynamics and produce coordinated macro-scale responses via intercellular communication. Using a computationally designed arterial segment comprising large populations of mathematically modelled coupled ECs and SMCs, we investigate their response to spatial gradients of blood-borne agonist concentrations and the effect of micro-scale-driven perturbation on the macro-scale. Altering homocellular (between same cell type) and heterocellular (between different cell types) intercellular coupling, we simulated four cases of normal and pathological arterial segments experiencing an identical gradient in agonist concentration. Results show that heterocellular calcium (Ca2+) coupling between ECs and SMCs is important in eliciting a rapid response when the vessel segment is stimulated by the agonist gradient. In the absence of heterocellular coupling, homocellular Ca2+ coupling between SMCs is necessary for the axial propagation of Ca2+ waves from downstream to upstream cells. Desynchronized intracellular Ca2+ oscillations in coupled SMCs are mandatory for this propagation. Upon decoupling of the heterocellular membrane potential, the arterial segment loses the inhibitory effect of ECs on the Ca2+ dynamics of the underlying SMCs. The full system comprises hundreds of thousands of coupled nonlinear ordinary differential equations simulated on the massively parallel Blue Gene architecture. The use of massively parallel computational architectures shows the capability of this approach to address macro-scale phenomena driven by elementary micro-scale components of the system.

12.
Epistemic uncertainty analysis is an essential feature of any model application subject to ‘state of knowledge’ uncertainties. Such analysis is usually carried out by means of a Monte Carlo simulation that samples the epistemic variables and performs the corresponding model runs. In situations where aleatory uncertainties are also present in the model, however, an adequate treatment of both types of uncertainty would require a two-stage nested Monte Carlo simulation, i.e. sampling the epistemic variables (‘outer loop’) and nested sampling of the aleatory variables (‘inner loop’). It is clear that for complex and long-running codes the computational effort to perform all the resulting model runs may be prohibitive. Therefore, an approximate epistemic uncertainty analysis is suggested which is based solely on two simple Monte Carlo samples: (a) joint sampling of both epistemic and aleatory variables simultaneously, and (b) sampling of the aleatory variables alone with the epistemic variables held fixed at their reference values. The applications of this approach to dynamic reliability analyses presented in this paper look quite promising and suggest that performing such an approximate epistemic uncertainty analysis is preferable to not performing any.
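A minimal sketch of the two sampling schemes contrasted in the abstract, using a toy model (the model, distributions, reference value, and sample sizes are invented for illustration): the full two-stage nested simulation, and the approximate scheme built from one joint sample plus one aleatory-only sample at reference epistemic values:

```python
import random

def model(e, a):
    """Toy response: one epistemic parameter e, one aleatory variable a."""
    return e + a

def nested_mc(n_epi=200, n_ale=200, seed=0):
    """Full two-stage nested scheme: epistemic outer loop, aleatory inner
    loop; one aleatory-mean estimate per epistemic sample, at the cost of
    n_epi * n_ale model runs."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_epi):
        e = rng.uniform(0.5, 1.5)                    # epistemic sample
        inner = [model(e, rng.gauss(0.0, 1.0)) for _ in range(n_ale)]
        means.append(sum(inner) / n_ale)
    return means

def approximate_mc(n=200, e_ref=1.0, seed=1):
    """Approximate scheme from the abstract: (a) one joint sample over
    epistemic and aleatory variables; (b) one aleatory-only sample with
    the epistemic variable fixed at its reference value."""
    rng = random.Random(seed)
    joint = [model(rng.uniform(0.5, 1.5), rng.gauss(0.0, 1.0)) for _ in range(n)]
    fixed = [model(e_ref, rng.gauss(0.0, 1.0)) for _ in range(n)]
    return joint, fixed
```

Comparing the spread of `joint` against `fixed` indicates how much of the total output uncertainty is attributable to the epistemic variable, at roughly 2n model runs instead of n_epi × n_ale.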

13.
Error and uncertainty in modeling and simulation
This article develops a general framework for identifying error and uncertainty in computational simulations that deal with the numerical solution of a set of partial differential equations (PDEs). A comprehensive, new view of the general phases of modeling and simulation is proposed, consisting of the following phases: conceptual modeling of the physical system, mathematical modeling of the conceptual model, discretization and algorithm selection for the mathematical model, computer programming of the discrete model, numerical solution of the computer program model, and representation of the numerical solution. Our view incorporates the modeling and simulation phases that are recognized in the systems engineering and operations research communities, but it adds phases that are specific to the numerical solution of PDEs. In each of these phases, general sources of uncertainty, both aleatory and epistemic, and error are identified. Our general framework is applicable to any numerical discretization procedure for solving ODEs or PDEs. To demonstrate this framework, we describe a system-level example: the flight of an unguided, rocket-boosted, aircraft-launched missile. This example is discussed in detail at each of the six phases of modeling and simulation. Two alternative models of the flight dynamics are considered, along with aleatory uncertainty of the initial mass of the missile and epistemic uncertainty in the thrust of the rocket motor. We also investigate the interaction of modeling uncertainties and numerical integration error in the solution of the ordinary differential equations for the flight dynamics.

14.
This paper focuses on sensitivity analysis of results from computer models in which both epistemic and aleatory uncertainties are present. Sensitivity is defined in the sense of “uncertainty importance” in order to identify and to rank the principal sources of epistemic uncertainty. A natural and consistent way to arrive at sensitivity results in such cases would be a two-dimensional or double-loop nested Monte Carlo sampling strategy in which the epistemic parameters are sampled in the outer loop and the aleatory variables are sampled in the nested inner loop. However, the computational effort of this procedure may be prohibitive for complex and time-demanding codes. This paper therefore suggests an approximate method for sensitivity analysis based on particular one-dimensional or single-loop sampling procedures, which require substantially less computational effort. From the results of such sampling one can obtain approximate estimates of several standard uncertainty importance measures for the aleatory probability distributions and related probabilistic quantities of the model outcomes of interest. The reliability of the approximate sensitivity results depends on the effect of all epistemic uncertainties on the total joint epistemic and aleatory uncertainty of the outcome. The magnitude of this effect can be expressed quantitatively and estimated from the same single-loop samples; the higher it is, the more accurate the approximate sensitivity results will be. A case study, which shows that the results from the proposed approximate method are comparable to those obtained with the full two-dimensional approach, is provided.

15.
Improved attachment, adhesion, and proliferation of surrounding mature endothelial cells (ECs) and circulating endothelial progenitor cells (EPCs) are of primary importance for realizing rapid in situ re-endothelialization of cardiovascular stents. To achieve this, a combinatorial coating of synthesized mussel adhesive polypeptide mimics and anti-CD34 antibody was constructed on the devices through a novel adsorption method in this study. To immobilize the polypeptide and target antibody effectively, polycaprolactone (PCL) was first spin-coated onto the substrate as an intermediate layer. The immobilization of the polypeptide and antibody was confirmed by changes in water contact angles and by the attachment and growth of ECs and EPCs on the substrates, respectively. The results showed that after immobilization of the adhesive polypeptide or/and antibody, the hydrophilicity of the coated PCL substrate (PCLS) was markedly improved. The amount of immobilized antibody, determined by an enzyme-linked immunosorbent assay (ELISA), increased with antibody concentration over the range 5 to 25 μg/ml. After BSA blocking, the coatings prevented nonspecific protein adsorption, as monitored by fluorescence microscopy. In vitro cell culture showed that, compared with the PCLS, the polypeptide/anti-CD34 antibody coating effectively enhanced the attachment, growth, and adhesion of ECs and EPCs, in particular EPCs. A platelet adhesion experiment revealed that the blood compatibility of the PCLS was also markedly improved after polypeptide/anti-CD34 antibody coating. These results indicate that surface modification with adhesive polypeptide and anti-CD34 antibody is a promising coating technique for intravascular prostheses requiring rapid re-endothelialization.

16.
The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations in epistemic uncertainty are ignored or worst-case scenarios are postulated. These strategies tend to produce sub-optimal decisions. We develop a general framework based on Bayesian decision theory and exemplify it for the case of seismic design of buildings. When temporal fluctuations of the epistemic uncertainties and regulatory safety constraints are included, the optimal level of seismic protection exceeds the normative level at the time of construction. Optimal Bayesian decisions do not depend on the aleatory or epistemic nature of the uncertainties, but only on the total (epistemic plus aleatory) uncertainty and how that total uncertainty varies randomly during the lifetime of the project.

17.
Quality Function Deployment (QFD) is a powerful tool that translates the Voice of the Customer (VoC) into Engineering Characteristics (ECs), the properties that can be modified in order to meet the desires of the customer. A main objective of QFD is the determination of target values for the ECs; however, conventional QFD determines these targets only empirically, which makes it difficult to ensure that they are optimal. This paper proposes a novel method for determining optimum targets in QFD. Fuzzy numbers are used to represent the imprecise nature of the judgements and to define more appropriately the relationships between ECs and Customer Attributes (CAs). Constraints such as cost, technical difficulty, and market position are considered. An example of a car door is presented to show the application of the method.
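The paper's optimization formulation is not given in the abstract. As an illustrative sketch of the fuzzy arithmetic typically used in fuzzy QFD (the linguistic scale, CA weights, and ratings below are invented), CA-to-EC relationships can be represented as triangular fuzzy numbers and aggregated by a fuzzy weighted sum:

```python
# Triangular fuzzy numbers are tuples (l, m, u): lower, modal, upper value.

def tfn_scale(term):
    """Map a linguistic relationship strength to a triangular fuzzy number."""
    return {"weak": (0.0, 0.1, 0.3),
            "moderate": (0.3, 0.5, 0.7),
            "strong": (0.7, 0.9, 1.0)}[term]

def fuzzy_weighted_sum(weights, ratings):
    """Component-wise weighted sum of TFNs with crisp CA weights."""
    return tuple(sum(w * r[k] for w, r in zip(weights, ratings))
                 for k in range(3))

def defuzzify(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    return sum(tfn) / 3.0

ca_weights = [0.5, 0.3, 0.2]      # relative importance of three CAs
ec_ratings = [tfn_scale(t) for t in ("strong", "moderate", "weak")]

importance = fuzzy_weighted_sum(ca_weights, ec_ratings)
print(importance)                  # fuzzy importance of the EC
print(defuzzify(importance))       # crisp value used for ranking ECs
```

The crisp defuzzified value would then feed the target-setting optimization under the cost and difficulty constraints the paper mentions.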

18.
This paper generally describes research just initiated on the socio-behavioral aspects of disasters resulting from chemical agents, and reports preliminary findings from the first phase of study. These initial observations concern community and organizational preparations and planning for acute chemical hazard disasters. The results are drawn from data gathered on disaster preparedness in 14 communities and on six major threats or actual disasters involving chemical agents in American society. A model for describing and analysing community and organizational disaster planning is outlined. Some initial observations are stated about how communities rank the probability of different kinds of disasters, including chemical ones. We then present in general terms a series of findings about community and organizational perceptions and reactions with respect to chemical threats, the resources to deal with such threats, the social organization of emergency-related groups using such resources, and the social climate in which the emergency groups operate. Some implications for planning are then indicated.

19.
Uncertainty quantification (UQ) is the process of determining the effect of input uncertainties on response metrics of interest. These input uncertainties may be characterized as either aleatory uncertainties, which are irreducible variabilities inherent in nature, or epistemic uncertainties, which are reducible uncertainties resulting from a lack of knowledge. When both aleatory and epistemic uncertainties are mixed, it is desirable to maintain a segregation between aleatory and epistemic sources such that it is easy to separate and identify their contributions to the total uncertainty. Current production analyses for mixed UQ employ the use of nested sampling, where each sample taken from epistemic distributions at the outer loop results in an inner loop sampling over the aleatory probability distributions. This paper demonstrates new algorithmic capabilities for mixed UQ in which the analysis procedures are more closely tailored to the requirements of aleatory and epistemic propagation. Through the combination of stochastic expansions for computing statistics and interval optimization for computing bounds, interval-valued probability, second-order probability, and Dempster-Shafer evidence theory approaches to mixed UQ are shown to be more accurate and efficient than previously achievable.

20.
In order to explore new scientific and innovative communities, analyses based on a technological infrastructure and its related tools, for example the ‘Web of Science’ database for scientometric analysis, are necessary. However, there is little systematic documentation of social media data and webometric analysis in relation to Korean and broader Asian innovation communities. In this short communication, we present (1) webometric techniques to identify communication processes on the Internet, such as social media data collection and analysis using an API-based application; (2) experimentation with new types of data visualization using NodeXL, such as social and semantic network analysis; and (3) calculation of entropy values for trilateral relationships. Our research data are drawn from the social networking site Twitter, and we also examine the overlap between innovation communities in terms of their shared members.
