Similar Documents
20 similar documents found.
1.
Quality assurance techniques used in software development and hardware maintenance/reliability help ensure that data in a computerized information management system are maintained well. However, information workers may not know the quality of data resident in their information systems. Knowledge of the quality of information and data in an enterprise provides managers with important facts for managing and improving the processes that impact information quality. This paper presents a quality assessment methodology to assist information workers in planning and implementing an effective assessment of their information and data quality. The areas covered include: identifying appropriate information quality indicators; developing assessment procedures; conducting information quality assessments; reporting information assessment results; tracking improvements in information quality.
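A minimal sketch of how such quality indicators might be computed over a set of records; the field names and validation rules are illustrative assumptions, not the paper's actual indicator set:

```python
# Hypothetical sketch: two simple information-quality indicators
# (completeness and validity) computed over a list of records.

def completeness(records, required_fields):
    """Share of records in which every required field is present and non-empty."""
    ok = sum(all(r.get(f) not in (None, "") for f in required_fields) for r in records)
    return ok / len(records) if records else 0.0

def validity(records, field, is_valid):
    """Share of records whose value in `field` passes the validation rule."""
    ok = sum(1 for r in records if is_valid(r.get(field)))
    return ok / len(records) if records else 0.0

records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "", "age": 210},
]
print(completeness(records, ["id", "email", "age"]))                     # 0.5
print(validity(records, "age", lambda v: v is not None and 0 <= v <= 120))  # 0.5
```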

2.
It is often said that the aim of risk assessment is to faithfully represent and report the knowledge of some defined experts in the field studied. The analysts' job is to elicit this knowledge, synthesise it and report the results as integrated uncertainty assessments, for example, expressed through a set of probability distributions. Analysts' judgements beyond these tasks should not be incorporated in the uncertainty assessments (distributions). The purpose of the present paper is to discuss the rationale of this perspective. To conduct a risk assessment in practice the analysts need to make a number of judgements related to, for example, the choice of methods and models that to a large extent influence the results. And often the analysts are the real experts on many of the issues addressed in the assessments, in particular, when it comes to understanding how various phenomena and processes interact. Would it then not be more appropriate to fully acknowledge the role of the analysts as uncertainty assessors and probability assigners, and see the results of the risk assessments as their judgements based on input from the experts? The discussion is illustrated by two examples.

3.
Tomas, Technology in Society, 2009, 31(3): 325–331
This paper presents a framework for understanding risk from the perspective of technological innovation and change. Special focus is put on systemic technological change, which tends to affect several dimensions of society at the same time. By drawing on innovation theory, and exemplifying by reference to the OECD futures project on Emerging Systemic Risk, the article elaborates a framework for technology assessment whose central elements are ubiquitous technological change and risk. Several key dimensions for technology assessment of this kind are identified, including increased mobility of people and goods, the magnitude and concentration of human populations, the speed and depth of change in the risk landscape, public-to-private shifts in the ‘ownership’ of risk, and the role played by expectations and perceptions of risk. The article ends by suggesting a number of new norms for risk and technology assessment, coupled with new risk methodologies for further investigation.

4.
Abstract

The Sb–Sn system has been assessed in terms of thermodynamic models for the individual phases. Expressions for the Gibbs energy as a function of temperature and composition are obtained and both phase diagram and thermochemical data are calculated and found to be in good agreement with the experimental data. The compound phase SbSn is treated by means of a Bragg–Williams–Gorsky approach.

MST/421
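A minimal sketch of the kind of composition- and temperature-dependent molar Gibbs energy expression used in such assessments, here a solution phase with Redlich–Kister excess terms; the pure-element Gibbs energies and interaction parameters below are placeholders, not the assessed Sb–Sn values:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

def gibbs_solution(x_sn, T, g_sb, g_sn, L_params):
    """Molar Gibbs energy of a binary (Sb,Sn) solution phase:
    reference term + ideal mixing + Redlich-Kister excess terms.
    g_sb, g_sn: Gibbs energies of the pure elements at T (placeholders).
    L_params: Redlich-Kister interaction parameters L0, L1, ... (placeholders)."""
    x_sb = 1.0 - x_sn
    g_ref = x_sb * g_sb + x_sn * g_sn
    g_ideal = R * T * (x_sb * np.log(x_sb) + x_sn * np.log(x_sn))
    g_excess = x_sb * x_sn * sum(L * (x_sb - x_sn) ** k for k, L in enumerate(L_params))
    return g_ref + g_ideal + g_excess

# Illustrative call with made-up parameters
print(gibbs_solution(0.4, 700.0, -5000.0, -6000.0, [-3000.0, 500.0]))
```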

5.
A quantitative complexity theory of human–computer interaction is presented and validated by means of laboratory experiments. Based on the seminal work of Grassberger in theoretical physics, a complexity measure is introduced. The measure is termed the Effective Measure Complexity and has three main advantages. First, it relies solely on information-theoretic quantities, which are intimately connected with the concept of complexity and not randomness. Second, it is model independent and can be estimated efficiently from data. Third, the estimates can be derived from behavioural patterns in terms of observable interaction events alone. Subjective ratings or psychophysiological measurements can be included but are not mandatory. In order to explain the theory, a simple and easily generalisable example in mobile human–computer interaction is presented. Furthermore, the external validity of the complexity measure is studied in laboratory experiments. The experimental task was to search for multiple targets on an electronic chart display and information system (ECDIS). ECDIS is an integral component of modern ship bridge concepts and therefore the experiments were carried out in a marine simulator. Thirty users participated. The platform motion (with or without motion) and the workplace illumination (800 lux or 30 lux) were varied systematically and the complexity effects were studied. The results show that the complexity of the visual search processes is significantly lower when the simulator reproduces sea-state motion and the users face straining motion forces. In addition, interaction complexity is significantly lowered when illuminance is reduced from the daylight level of 800 lux to the twilight level of 30 lux. Therefore, the complexity measure seems to be valid for the quantitative assessment of human–computer interaction.
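A minimal sketch of estimating an Effective Measure Complexity-style quantity from a sequence of discrete interaction events, using the usual block-entropy construction EMC = sum over L of (h_L − h) with h_L = H(L) − H(L−1); the truncation at a small L_max and the use of h at L_max as the entropy-rate estimate are simplifying assumptions of this sketch, not the paper's estimator:

```python
from collections import Counter
from math import log2

def block_entropy(seq, L):
    """Shannon entropy (bits) of the empirical distribution of length-L blocks."""
    blocks = [tuple(seq[i:i + L]) for i in range(len(seq) - L + 1)]
    counts = Counter(blocks)
    n = len(blocks)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def emc_estimate(seq, L_max=4):
    H = [0.0] + [block_entropy(seq, L) for L in range(1, L_max + 1)]
    h = [H[L] - H[L - 1] for L in range(1, L_max + 1)]  # conditional block entropies
    h_rate = h[-1]                                      # crude entropy-rate estimate
    return sum(hL - h_rate for hL in h)

events = list("ABABABCABABABCABABABC")  # toy interaction-event log
print(emc_estimate(events, L_max=3))
```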

6.
This illustrated commentary looks at two very different types of integrated assessment models in context. On one hand the current generation of integrated climate models has achieved a significant role in environmental policy. On the other, integrated models for urban and regional systems have declined in their relevance for the policy process. However, the global models should be closely linked to urban models, which provide a very significant part of their inputs. This leads us to consider the modelling paradigm and future directions in technical tools which link science and policy. A city-region case study provides one example of a response—a deliberately simple and transparent scenario-building model, as a practical link in the problem-solution cycle.

7.
Driver’s collision avoidance performance has a direct link to the collision risk and crash severity. Previous studies demonstrated that distracted driving, such as using a cell phone while driving, disrupts the driver’s performance on the road. This study aimed to investigate the manner and extent to which cell phone use and driver’s gender affected driving performance and collision risk in a rear-end collision avoidance process. Forty-two licensed drivers completed the driving simulation experiment in three phone use conditions: no phone use, hands-free, and hand-held, in which the drivers drove in a car-following situation with potential rear-end collision risks caused by the leading vehicle’s sudden deceleration. Based on the experiment data, a rear-end collision risk assessment model was developed to assess the influence of cell phone use and driver’s gender. Cell phone use and driver’s gender were found to be significant factors affecting braking performance in the rear-end collision avoidance process, including the brake reaction time, the deceleration adjusting time and the maximum deceleration rate. The minimum headway distance between the leading vehicle and the simulator during the rear-end collision avoidance process was the final output variable, which could be used to measure the rear-end collision risk and judge whether a collision occurred. The results showed that although drivers using a cell phone adopted some compensatory behaviors in the collision avoidance process to reduce their mental workload, the collision risk in the cell phone use conditions was still higher than that without phone use. More importantly, the results showed that the hands-free condition did not eliminate the safety problem associated with distracted driving because it impaired driving performance as much as the use of hand-held phones. In addition, the gender effect indicated that although female drivers had longer reaction times than male drivers in critical situations, they braked more quickly with a larger maximum deceleration rate, and they tended to keep a larger safety margin from the leading vehicle compared to male drivers. The findings shed some light on the further development of advanced collision avoidance technologies and targeted intervention strategies concerning cell phone use while driving.
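A minimal kinematic sketch (constant-deceleration assumption, illustrative parameter values, not the experiment's data) of how a minimum headway distance during a rear-end braking manoeuvre could be computed from a brake reaction time and deceleration rates:

```python
# Lead vehicle brakes suddenly; the following driver reacts after a brake
# reaction time and then decelerates at a constant rate. All values are illustrative.

def min_headway(v0=25.0, gap0=30.0, a_lead=6.0, a_follow=5.0, t_react=1.2, dt=0.01):
    """Return the minimum gap (m) during the manoeuvre; a negative value means a collision."""
    x_lead, x_fol = gap0, 0.0
    v_lead, v_fol = v0, v0
    t, min_gap = 0.0, gap0
    while v_lead > 0 or v_fol > 0:
        v_lead = max(0.0, v_lead - a_lead * dt)
        v_fol = max(0.0, v_fol - (a_follow * dt if t >= t_react else 0.0))
        x_lead += v_lead * dt
        x_fol += v_fol * dt
        t += dt
        min_gap = min(min_gap, x_lead - x_fol)
    return min_gap

print(min_headway(t_react=0.9))   # shorter reaction time -> larger minimum gap
print(min_headway(t_react=1.5))   # delayed reaction (e.g. distraction) -> smaller gap
```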

8.
9.
This paper studies the problem of supplier selection and order allocation in a retail supply chain (comprising suppliers, a central purchasing unit and outlets) under disruption risk. The final demand is deterministic. Suppliers are located in different geographic areas, and supplies are subject to a positive probability of disruption. Different capacities and failure probabilities for each supplier are considered. Our analysis focuses on the insurance versus profitability trade-off faced by a supply manager who buys from suppliers for the outlets. Instead of determining optimal decisions given an objective function and the risk sensitivity of the decision-maker, we use a mixed integer linear programming approach to provide decision-making support that shows a supply manager the ‘elasticity of (expected) losses versus (expected) profits’. Under this model, and depending on the profit-and-loss targets, a supply manager of known risk sensitivity (i.e. risk aversion and loss aversion) can make better decisions when choosing suppliers. Moreover, taking into account the impact of the share of fixed costs that must be covered by the operation, we consider the net values of expected profit and loss. We discuss the potential influence of the level of the firm’s fixed costs on the supply strategy. In particular, we show how the minimum value of the gross margin needed for the strategy’s profitability affects that strategy. A numerical application is conducted to illustrate the contribution of our decision-making support mechanism, and several managerial insights are obtained.
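A minimal sketch (hypothetical data, single period, expected-cost objective) of the kind of mixed integer linear programme that underlies such a supplier selection and order allocation decision; this is not the paper's actual formulation, and disruption risk is folded in only crudely here:

```python
# Hypothetical MILP sketch with PuLP: allocate a deterministic demand to suppliers
# with limited capacity and different unit costs, paying a fixed cost per supplier
# used. Expected unit cost is inflated by the supplier's failure probability.
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary

suppliers = {           # name: (capacity, unit_cost, fixed_cost, failure_prob)
    "S1": (60, 10.0, 100.0, 0.05),
    "S2": (80, 12.0,  80.0, 0.02),
    "S3": (50,  9.0, 120.0, 0.10),
}
demand = 100

prob = LpProblem("supplier_selection", LpMinimize)
q = {s: LpVariable(f"q_{s}", lowBound=0) for s in suppliers}    # order quantities
y = {s: LpVariable(f"y_{s}", cat=LpBinary) for s in suppliers}  # supplier used?

prob += lpSum(suppliers[s][1] / (1 - suppliers[s][3]) * q[s]    # expected purchase cost
              + suppliers[s][2] * y[s] for s in suppliers)      # plus fixed costs
prob += lpSum(q[s] for s in suppliers) == demand                # meet demand
for s in suppliers:
    prob += q[s] <= suppliers[s][0] * y[s]                      # capacity and linking

prob.solve()
print({s: q[s].value() for s in suppliers})
```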

10.
This article reports for the first time the state of science and technology on the African continent on the basis of two scientometric indicators — the number of research publications and the number of patents awarded. Our analysis shows that Africa produced 68,945 publications over the 2000–2004 period, or 1.8% of the world’s publications. In comparison, India produced 2.4% and Latin America 3.5% of the world’s research. More detailed analysis reveals that research in Africa is concentrated in just two countries — South Africa and Egypt. These two countries produce just above 50% of the continent’s publications, and the top eight countries produce above 80% of the continent’s research. Disciplinary analysis reveals that few African countries have the minimum number of scientists required for the functioning of a scientific discipline. Examination of the continent’s inventive profile, as manifested in patents, indicates that Africa produces fewer than one thousand of the world’s inventions. Furthermore, 88% of the continent’s inventive activity is concentrated in South Africa. The article recommends that African governments should pay particular attention to developing their national research systems.

11.
Struvite crystals were precipitated by the reaction of magnesium chloride hexahydrate and ammonium dihydrogen phosphate using different concentrations of citric acid as the additive (100, 300, and 500 ppm). The structure, morphology, functional groups and particle size of the crystals were evaluated experimentally by scanning electron microscopy, X-ray diffraction, Fourier transform infrared spectroscopy (FTIR) and particle size analysis. The experimental results demonstrated that citric acid exerted a significant influence on the struvite precipitation and that the crystal morphology changed from rod-like to tubular shaped with a larger size and hollow bodies. The average particle size changed from 17.60 to 33.60 μm with increasing citric acid concentration. The FTIR results suggested that the citric acid adsorbed on the crystal surface. Following the characterization of the crystals prepared using different concentrations of citric acid, response surface methodology coupled with a Box-Behnken design was applied as a statistical tool to determine the effects of the key parameters affecting the precipitation process (temperature, pH and additive concentration) on the responses (namely, particle size and specific cake resistance of struvite). Second-order polynomial equations for both responses were developed to correlate the parameters. Analysis of variance (ANOVA) showed a significant quadratic regression model with high coefficient of determination values. The optimum conditions for particle size were found to be 60 °C, pH 8 and 500 ppm additive concentration.
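A minimal sketch of fitting a full second-order (quadratic with interactions) response-surface polynomial for three factors by least squares; the data values below are made up for illustration and are not the study's measurements:

```python
import numpy as np

def quadratic_design(X):
    """Design matrix of a full second-order model for 3 factors."""
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

# Factors: temperature (deg C), pH, additive concentration (ppm); response: particle size (um).
X = np.array([[30, 7, 100], [30, 9, 500], [60, 7, 500], [60, 9, 100],
              [45, 8, 300], [45, 8, 300], [30, 8, 300], [60, 8, 300],
              [45, 7, 100], [45, 9, 500], [30, 7, 500], [60, 9, 500]], float)
y = np.array([18.1, 24.5, 31.0, 22.4, 25.3, 25.9, 20.2, 29.8, 19.4, 28.7, 22.0, 33.5])

coef, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)
print(coef)  # b0, b1..b3, b11..b33, b12, b13, b23
```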

12.
Current research by the developers of rapid prototyping systems is generally focused on improvements in cost, speed and materials to create truly economical and practical rapid manufacturing (RM) machines. In addition to being potentially smarter/faster/cheaper replacements for existing manufacturing technologies, the next generation of these machines will provide opportunities not only for the design and fabrication of products without traditional constraints but also for organizing manufacturing activities in new, innovative and previously undreamt of ways. This paper outlines a novel devolved manufacturing (DM) ‘factory-less’ approach to e-manufacturing, which integrates mass customization (MC) concepts, RM technologies and the communication opportunities of the Internet/World Wide Web; describes two case studies of different DM implementations and discusses the limitations and appropriateness of each; and, finally, draws some conclusions about the technical, manufacturing and business challenges involved.

13.
14.
Social media marketing is an essential tool for start-up firms, which can help them remedy their marketing limitations easily and at relatively low cost. Predicting start-up firms’ social media engagement level can allow them to gauge the effectiveness of their social media marketing efforts and can provide numerous benefits related to strategic marketing processes. This study focuses on developing a methodology involving data science processes and machine learning models to account for the ongoing advancement of business intelligence methodologies. This study gathered data on 8,434 start-up firms from Twitter, generated social media-based features, and created machine learning models to predict the social media engagement level of each firm. The results show that deep learning provides the best accuracy in predicting the engagement levels. The results also show that the number of tweets by the firms, the number of retweets received, and the number of likes received have the most significance in determining the effectiveness of social media marketing activities.
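A minimal sketch of predicting an engagement level from tweet-activity features; the data are synthetic, the three features only mirror those named in the abstract, and a random forest stands in for the study's deep learning model:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic firm-level features: tweets sent, retweets received, likes received.
rng = np.random.default_rng(0)
n = 2000
tweets = rng.poisson(50, n)
retweets = rng.poisson(tweets * 0.4)
likes = rng.poisson(tweets * 0.8)
X = np.column_stack([tweets, retweets, likes])
engagement = np.digitize(retweets + likes, bins=[40, 100])   # 0=low, 1=medium, 2=high

X_tr, X_te, y_tr, y_te = train_test_split(X, engagement, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(accuracy_score(y_te, model.predict(X_te)))
print(dict(zip(["tweets", "retweets", "likes"], model.feature_importances_)))
```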

15.
The first-order reliability method (FORM) is well recognized as an efficient approach for reliability analysis. Rooted in considering the reliability problem as a constrained optimization of a function, the traditional FORM makes use of gradient-based optimization techniques to solve it. However, the gradient-based optimization techniques may result in local convergence or even divergence for the highly nonlinear or high-dimensional performance function. In this paper, a hybrid method combining the Salp Swarm Algorithm (SSA) and FORM is presented. In the proposed method, a Lagrangian objective function is constructed by the exterior penalty function method to facilitate meta-heuristic optimization strategies. Then, SSA with strong global optimization ability for highly nonlinear and high-dimensional problems is utilized to solve the Lagrangian objective function. In this regard, the proposed SSA-FORM is able to overcome the limitations of FORM including local convergence and divergence. Finally, the accuracy and efficiency of the proposed SSA-FORM are compared with two gradient-based FORMs and several heuristic-based FORMs through eight numerical examples. The results show that the proposed SSA-FORM can be generally applied for reliability analysis involving low-dimensional, high-dimensional, and implicit performance functions.
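A minimal sketch of the general idea: the FORM problem min ||u|| subject to g(u) = 0 is converted into an unconstrained problem with an exterior penalty term and handed to a global optimizer. SciPy's differential evolution stands in here for the Salp Swarm Algorithm, and the limit-state function g is a simple illustrative example, not one of the paper's test cases:

```python
import numpy as np
from scipy.optimize import differential_evolution

def g(u):
    """Illustrative limit-state function in standard normal space."""
    return 3.0 - u[0] ** 2 - u[1]

def penalized(u, r=1e3):
    """Exterior penalty formulation: minimize ||u|| while penalizing violations of g(u) = 0."""
    return np.linalg.norm(u) + r * g(u) ** 2

res = differential_evolution(penalized, bounds=[(-6, 6), (-6, 6)], seed=0, tol=1e-8)
beta = np.linalg.norm(res.x)   # reliability index = distance of the MPP from the origin
print(beta, res.x)
```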

16.
Impact assessment (IA) has become one of the most prevalent environmental policy instruments today. Its introduction under the National Environmental Policy Act (US) in 1969 was revolutionary. Perhaps it is not surprising, then, that such a widely used tool has received its share of criticism, including that it fails to meet some of its fundamental goals. Over the last fifty years, IA has broadened in scope and application and embraced new techniques. It has evolved, but has not changed fundamentally.

We believe that IA must continue to change to meet the societal and environmental challenges of the 21st century. But will it be enough for IA to progress through incremental change (evolution), or is a complete overhaul of impact assessment (revolution) needed? We provide some ideas as to what ‘evolution’ and ‘revolution’ may look like, but rather than offering a definitive way forward now, we invite stakeholders to present their thoughts and suggestions at the IAIA19 Annual Conference in Brisbane, which carries the same theme as the title of this article.

17.
Biodegradable and biocompatible materials are the basis for tissue engineering. As an initial step in developing bone tissue engineering scaffolds, the in vitro biocompatibility of degradable and bioactive composites consisting of polyhydroxybutyrate-co-hydroxyvalerate (PHBV) and wollastonite (W) was studied by culturing osteoblasts on the PHBV/W substrates, and cell adhesion, morphology, proliferation, and alkaline phosphatase (ALP) activity were evaluated. The results showed that the incorporation of wollastonite benefited osteoblast adhesion, and the osteoblasts cultured on the PHBV/W composite substrates spread better than those on the pure PHBV after culturing for 3 h. With prolonged incubation time, the osteoblasts cultured on the PHBV/W composite substrates revealed a higher proliferation and differentiation rate than those on the pure PHBV substrates. In addition, an increase in proliferation and differentiation rate was observed when the wollastonite content in the PHBV/W composites increased from 10 to 20 wt%. All of the results showed that the addition of wollastonite to PHBV could stimulate osteoblasts to proliferate and differentiate, and that the PHBV/W composites with wollastonite content up to 20 wt% were more compatible than the pure PHBV materials for bone repair and bone tissue engineering.

18.
The study of science at the individual scholar level requires the disambiguation of author names. The creation of an author’s publication oeuvre involves matching the list of unique author names to names used in publication databases. Despite recent progress in the development of unique author identifiers, e.g., ORCID, VIVO, or DAI, author disambiguation remains a key problem when it comes to large-scale bibliometric analysis using data from multiple databases. This study introduces and tests a new methodology called seed + expand for semi-automatic bibliographic data collection for a given set of individual authors. Specifically, we identify the oeuvre of a set of Dutch full professors during the period 1980–2011. In particular, we combine author records from the Dutch National Research Information System (NARCIS) with publication records from the Web of Science. Starting with an initial list of 8,378 names, we identify ‘seed publications’ for each author using five different approaches. Subsequently, we ‘expand’ the set of publications using three different approaches. The different approaches are compared and the resulting oeuvres are evaluated on precision and recall using a ‘gold standard’ dataset of authors for which verified publications in the period 2001–2010 are available.
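A minimal sketch of the evaluation step described above: comparing a reconstructed publication oeuvre against a verified 'gold standard' set in terms of precision and recall. The record identifiers are hypothetical:

```python
def precision_recall(retrieved, gold):
    """Precision and recall of a retrieved set of publication IDs against a gold-standard set."""
    retrieved, gold = set(retrieved), set(gold)
    tp = len(retrieved & gold)
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

oeuvre = ["WOS:0001", "WOS:0002", "WOS:0005", "WOS:0009"]   # seed + expand result
gold = ["WOS:0001", "WOS:0002", "WOS:0003", "WOS:0009"]     # verified publications
print(precision_recall(oeuvre, gold))   # (0.75, 0.75)
```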

19.
In the assessment of welded joints subjected to multiaxial loading, the calculation method applied, independently of the concept (nominal, structural, hot-spot or local), must primarily consider the material's ductility. While proportional loading can be assessed by von Mises, the principal stress hypothesis, the Findley method or the Gough–Pollard relationship, using any of the mentioned concepts, difficulties occur when the loading is non-proportional, i.e. the principal stress (strain) direction changes. This causes a significant fatigue life reduction for ductile steel welds, but an indifferent behaviour for semi-ductile aluminium welds. This different response to non-proportional loading can be assessed when ductility-related mechanisms of fatigue failure, i.e. the mean value of plane-oriented shear stresses for ductile materials and a combination of shear and normal stresses for semi-ductile materials, are properly considered. However, as these methods require good expertise in multiaxial fatigue, simpler but sound calculation methodologies are required for design codes used by non-fatigue experts. The evaluation of known fatigue data obtained with multiaxial constant and variable amplitude (spectrum) loading in the range N > 10^4 cycles suggests the application of the modified interaction algorithm of Gough–Pollard. In the case of variable amplitude loading, constant normal and shear stresses are replaced by modified reference normal and shear stresses of the particular spectrum. The modification of the reference stresses is based on the consideration of the real Palmgren–Miner damage sum of D_PM = 0.5 (for spectra with constant mean loads) and the modification of the Gough–Pollard algorithm by consideration of the multiaxial damage parameter D_MA = 1.0 or 0.5, which depends on the material's ductility and on whether the multiaxial loading is proportional or non-proportional. This method is already part of the IIW recommendations for the fatigue design of welded joints and can also be applied using hot-spot or local stresses.
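A minimal sketch of a Gough–Pollard type interaction check of the kind described above: the normal and shear stress amplitudes are compared with their allowable values and the interaction sum is checked against a multiaxial damage parameter D_MA of 1.0 or 0.5. The stress values are illustrative, and the exact modified algorithm of the IIW recommendations is not reproduced here:

```python
def gough_pollard_ok(sigma_a, tau_a, sigma_allow, tau_allow, D_MA=1.0):
    """Interaction sum (sigma_a/sigma_allow)^2 + (tau_a/tau_allow)^2 compared with D_MA."""
    interaction = (sigma_a / sigma_allow) ** 2 + (tau_a / tau_allow) ** 2
    return interaction, interaction <= D_MA

# Ductile steel weld under non-proportional loading: D_MA = 0.5 (illustrative stresses in MPa)
print(gough_pollard_ok(sigma_a=60.0, tau_a=40.0, sigma_allow=100.0, tau_allow=80.0, D_MA=0.5))
```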

20.
This paper studies the long-term (20,000 exposure hours) behavior of the titanium and Ti–5Al–4V alloy—Carter–Brugirard saliva interface and the short-term (500 exposure hours) resistance of the titanium and Ti–5Al–4V alloy—Tani&Zucchi saliva interface. The potentiodynamic polarization method was applied for the determination of the main electrochemical parameters. Linear polarization measurements were used to obtain the corrosion rates. Long-term monitoring of the open circuit potentials (E_oc) permitted the calculation of the potential gradients due to pH changes, ΔE_oc(pH), and to saliva composition changes, ΔE_oc(c), which can appear under "in vivo" conditions and can generate local corrosion. Atomic force microscopy (AFM) was used to analyze the surface roughness. Ion release was studied by atomic absorption spectroscopy (AAS). In Carter–Brugirard saliva, both titanium and the Ti–5Al–4V alloy present very stable passive films, long-term stability, "very good" resistance and low values of the open circuit potential gradients, which cannot generate local corrosion. In Tani&Zucchi artificial saliva, pitting corrosion and noble pitting protection potentials (which cannot be reached in the oral cavity) were registered; titanium ion release is very low; the surface roughness increases with time and in the presence of fluoride ions, denoting some increase in the anodic activity.
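A minimal sketch of the Stern–Geary relation commonly used to convert a linear polarization resistance into a corrosion current density; the Tafel slopes and polarization resistance below are illustrative assumptions, not the study's measured values:

```python
def corrosion_current_density(R_p, beta_a=0.12, beta_c=0.12):
    """Stern-Geary: R_p in ohm*cm^2, Tafel slopes in V/decade -> i_corr in A/cm^2."""
    B = (beta_a * beta_c) / (2.303 * (beta_a + beta_c))
    return B / R_p

i_corr = corrosion_current_density(R_p=5.0e5)   # a highly passive surface (illustrative)
print(f"{i_corr:.2e} A/cm^2")
```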
