Similar documents
20 similar documents found (search time: 11 ms)
1.
Dr. Ted Trainer's paper in this issue contends that “de-materialisation” (decreasing energy and material inputs per unit of output) is a “myth” that must now be dropped from arguments against the “limits to growth” thesis. His specific arguments against de-materialisation are questioned in this commentary. This paper goes on to argue that even if de-materialisation has not taken place, it does not follow that near-term “zero growth” becomes necessary. On the contrary, the “limits to growth” position rests on erroneous Malthusian projections, and if the scarcity and spillover effects of growth are appropriately priced, conservation and substitution will be induced. Economic growth will facilitate technological and economic solutions to pollution and depletion. Institutional arrangements that will structure incentives, such as making better use of markets to set appropriate prices, are at the heart of the sustainability problem.

2.
We present the creation and use of a generalized cost function methodology based on costlets for automated optimization for conformal and intensity modulated radiotherapy treatment plans. In our approach, cost functions are created by combining clinically relevant “costlets”. Each costlet is created by the user, using an “evaluator” of the plan or dose distribution which is incorporated into a function or “modifier” to create an individual costlet. Dose statistics, dose-volume points, biological model results, non-dosimetric parameters, and any other information can be converted into a costlet. A wide variety of different types of costlets can be used concurrently. Individual costlet changes affect not only the results for that structure, but also all the other structures in the plan (e.g., a change in a normal tissue costlet can have large effects on target volume results as well as the normal tissue). Effective cost functions can be created from combinations of dose-based costlets, dose-volume costlets, biological model costlets, and other parameters. Generalized cost functions based on costlets have been demonstrated, and show potential for allowing input of numerous clinical issues into the optimization process, thereby helping to achieve clinically useful optimized plans. In this paper, we describe and illustrate the use of the costlets in an automated planning system developed and used clinically at the University of Michigan Medical Center. We place particular emphasis on the flexibility of the system, and its ability to discover a variety of plans making various trade-offs between clinical goals of the treatment that may be difficult to meet simultaneously.
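The costlet construction described in this abstract can be sketched in a few lines of Python. Everything below (the plan representation, evaluators, modifiers, and goal values) is invented for illustration; it is not the actual planning-system code:

```python
# Sketch: a costlet wraps an "evaluator" of the plan (a dose statistic)
# inside a "modifier" (a penalty function); the total cost sums all costlets.

def make_costlet(evaluator, modifier):
    """Combine a plan evaluator with a modifier into a single costlet."""
    return lambda plan: modifier(evaluator(plan))

# Example evaluators: simple dose statistics on a dict-based "plan" (hypothetical).
def mean_target_dose(plan):
    return sum(plan["target"]) / len(plan["target"])

def max_cord_dose(plan):
    return max(plan["cord"])

# Example modifiers: quadratic penalties below/above a clinical goal.
def underdose_penalty(goal):
    return lambda d: max(0.0, goal - d) ** 2

def overdose_penalty(limit):
    return lambda d: max(0.0, d - limit) ** 2

costlets = [
    make_costlet(mean_target_dose, underdose_penalty(60.0)),  # want >= 60 Gy
    make_costlet(max_cord_dose, overdose_penalty(45.0)),      # want <= 45 Gy
]

def total_cost(plan):
    return sum(c(plan) for c in costlets)

plan = {"target": [58.0, 61.0, 60.0], "cord": [40.0, 47.0]}
print(total_cost(plan))
```

Note how a change to one costlet (e.g., tightening the cord limit) shifts the optimum for every structure, which is the coupling the abstract describes.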

3.
“Grey-box” modelling combines the use of first-principle based “white-box” models and empirical “black-box” models, offering particular benefits when: (a) there is a lack of fundamental theory to describe the system or process modelled; (b) there is a scarcity of suitable experimental data for validation; or (c) there is a need to decrease the complexity of the model. The grey-box approach has been used, for example, to create mathematical models to predict the shelf life of chilled products or the thermal behaviour of imperfectly mixed fluids, or to create models that combine artificial neural networks and dynamic differential equations for control-related applications. This paper discusses the main characteristics of white-box and black-box models and their integration into grey-box models, the requirements and sourcing of accurate data for model development, and important validation concepts and measures.
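A minimal sketch of the grey-box idea, assuming a Newtonian-cooling white-box model and a low-order polynomial residual correction as the black-box part (all data here are synthetic, not from the paper):

```python
import numpy as np

def white_box(t, T0=80.0, T_env=20.0, k=0.05):
    """First-principles part: Newtonian cooling of a product (illustrative)."""
    return T_env + (T0 - T_env) * np.exp(-k * t)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 60.0, 30)
# Synthetic "measurements" with a systematic effect the white box misses.
T_meas = white_box(t) + 0.03 * t + rng.normal(0.0, 0.1, t.size)

# Black-box part: fit a straight line to the residuals.
residual = T_meas - white_box(t)
black_box = np.polynomial.Polynomial.fit(t, residual, deg=1)

def grey_box(t):
    """Grey-box prediction = white-box structure + data-driven correction."""
    return white_box(t) + black_box(t)

rmse_white = np.sqrt(np.mean((T_meas - white_box(t)) ** 2))
rmse_grey = np.sqrt(np.mean((T_meas - grey_box(t)) ** 2))
print(rmse_white, rmse_grey)
```

The grey-box model keeps the physical structure but absorbs the systematic error the pure white-box model cannot represent.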

4.
Graphite isotope ratio method (GIRM) is a technique that uses measurements and computer models to estimate total plutonium (Pu) production in a graphite-moderated reactor. First, isotopic ratios of trace elements in graphite samples extracted from the target reactor are measured. Then, computer models of the reactor relate those ratios to Pu production. Because Pu is controlled under non-proliferation agreements, an estimate of total Pu production is often required, and a declaration of total Pu might need to be verified through GIRM. In some cases, reactor information (such as core dimensions, coolant details, and operating history) is so well documented that computer models can predict total Pu production without the need for measurements. However, in most cases, reactor information is imperfectly known, so a measurement- and model-based method such as GIRM is essential. Here, we focus on GIRM's estimation procedure and its associated uncertainty. We illustrate a simulation strategy for a specific reactor that estimates GIRM's uncertainty and determines which inputs, including inputs to the computer models, contribute most to that uncertainty. These models include a “local” code that relates isotopic ratios to local Pu production, and a “global” code that predicts the Pu production shape over the entire reactor. This predicted shape is combined with other 3D basis functions to provide a “hybrid basis set” that is used to fit the local Pu production estimates. The fitted shape can then be integrated over the entire reactor to estimate total Pu production. This GIRM evaluation provides a good example of several techniques of uncertainty analysis and introduces new reasons to fit a function using basis functions when evaluating the impact of uncertainty in the true 3D shape.
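The basis-function fitting and integration step can be illustrated with a simplified 1D sketch; the code-predicted shape (a cosine), the sample locations, and the noise level below are invented stand-ins for the reactor codes and the measured local estimates:

```python
import numpy as np

def basis(z):
    """Hybrid basis set: a code-predicted shape plus simple corrections."""
    return np.column_stack([np.cos(np.pi * z / 2), np.ones_like(z), z ** 2])

def true_shape(z):
    return 1.1 * np.cos(np.pi * z / 2) + 0.05 * z ** 2

# Noisy local Pu-production estimates at the sample locations (normalized axis).
z_samples = np.linspace(-0.9, 0.9, 12)
rng = np.random.default_rng(1)
local_pu = true_shape(z_samples) + rng.normal(0.0, 0.02, z_samples.size)

# Least-squares fit of the local estimates with the hybrid basis.
coef, *_ = np.linalg.lstsq(basis(z_samples), local_pu, rcond=None)

# Integrate the fitted shape over the core to estimate total production.
z_grid = np.linspace(-1.0, 1.0, 2001)
vals = basis(z_grid) @ coef
dz = z_grid[1] - z_grid[0]
total = float(np.sum((vals[:-1] + vals[1:])) * dz / 2)  # trapezoid rule
print(total)
```

In the paper the same pattern runs in 3D, and repeating the fit under perturbed inputs is what yields the uncertainty on total Pu production.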

5.
Material heterogeneities and discontinuities such as porosity, second phase particles, and other defects at meso/micro/nano scales determine the fatigue life, strength, and fracture behavior of aluminum castings. In order to achieve better performance of these alloys, a design-centered, computer-aided approach is proposed. Here, the term “design-centered” is used to distinguish the new approach from the traditional trial-and-error design approach by formulating a clear objective, offering a scientific foundation, and developing an effective computer-aided tool for alloy development. A criterion for tailoring “child” microstructure, obtained from “parent” microstructure through statistical correlation, is proposed for fatigue design at the initial stage. A dislocation pileup model has been developed. This dislocation model, combined with an optimization analysis, provides an analytical small-scale solution for silicon particles and dendrite cells to enhance both fatigue performance and strength for pore-controlled castings. It can also be used to further tailor microstructures. In addition, a conceptual damage sensitivity map for fatigue life design is proposed. In this map there are critical pore sizes above which fatigue life is controlled by pores; otherwise it is controlled by other mechanisms such as silicon particles and dendrite cells. In the latter case, the proposed criteria and the dislocation model are the foundations of a guideline in the design-centered approach to maximize both the fatigue life and strength of lightweight Al-Si-based alloys.

6.
Functional block diagrams (FBDs) and their equivalent event trees are introduced as logical models in the quantification of occupational risks. Although an FBD is similar to an influence diagram or a belief network, it provides a framework for introducing, in compact form, the logic of the model through the partition of the paths of the equivalent event tree. This is achieved by considering an overall event whose outcomes are the outermost consequences defining the risk under analysis. This event is decomposed into simpler events, the outcome space of which is partitioned into subsets corresponding to the outcomes of the initial joint event. The simpler events can be further decomposed, creating a hierarchy where the events in a given level (parents) are decomposed into a number of simpler events (children) in the next level of the hierarchy. The partitioning of the outcome space is transferred from level to level through logical relationships corresponding to the logic of the model.

Occupational risk is modeled through a general FBD where the undesirable health consequence is decomposed into “dose” and “dose/response”; “dose” is decomposed into “center event” and “mitigation”; “center event” is decomposed into “initiating event” and “prevention”. This generic FBD can be transformed into activity-specific FBDs which, together with their equivalent event trees, are used to delineate the various accident sequences that might lead to injury or death.

The methodology and the associated algorithms have been computerized in a program with a graphical user interface (GUI) which allows the user to input the functional relationships between parent and child events and the probabilities for events of the lowest level, and to obtain the quantified simplified event tree.

The methodology is demonstrated with an application to the risk of falling from a mobile ladder. This type of accident has been analyzed as part of the Workgroup Occupational Risk Model (WORM) project in the Netherlands, which aims at the development and quantification of models for a full range of potential risks from accidents in the workplace.
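The generic FBD decomposition can be quantified with a toy event-tree calculation. All probabilities below are invented for the ladder example and bear no relation to the actual WORM results:

```python
# Generic FBD sketch: consequence <- dose x dose/response;
# dose <- center event x mitigation; center event <- initiating event x prevention.

p_initiating = 1e-3        # hypothetical: loss of balance per ladder use
p_prevention_fails = 0.2   # prevention barrier fails
p_mitigation_fails = 0.5   # mitigation (e.g., a colleague steadying the ladder) fails
p_injury_given_dose = 0.3  # dose/response: injury given an unmitigated fall

p_center = p_initiating * p_prevention_fails   # a fall occurs
p_dose = p_center * p_mitigation_fails         # an unmitigated fall
p_injury = p_dose * p_injury_given_dose        # injury consequence per use
print(p_injury)
```

Each factor corresponds to one level of the parent-child hierarchy, so refining any barrier probability propagates directly to the top-level risk.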

7.
The aim of this paper is to provide an overview of different statistical analyses from patent and literature databases that, in combination, are helpful for a variety of mostly strategic decision settings in firms. For the case of optoelectronics we assess the patenting and publishing activity of firms and individuals and their citation frequency.

The analyses identified leading players in the field, revealed technological dependencies, and showed the existence of patent clusters as patenting strategies. Co-citation analysis highlighted technological similarities between two firms involved in patent litigation trials. In this science-based technology field, individuals combining the characteristics of key inventors (a high activity and citation level in patenting) and core scientists (a high activity and citation level in publishing) – therefore labelled “R&D dualists” – successfully bridge the gap between science and technology, but are exceptionally rare. Citation-weighted patent counts demonstrated the pivotal impact of one “R&D dualist” in an industrial R&D laboratory, whose departure severely affected the laboratory's output. An increasing level of R&D cooperation in particular technological subfields was found after the individual's departure. However, patent analysis did not find evidence for long-term competence transfer in these subfields.

8.
The freezing process is widely used in the food industry. In the 1970s, French regulatory authorities, in collaboration with the food industry, created the concept of the «surgélation» process with the objective of improving the image of high quality frozen foods. The process of “surgélation”, which could be translated as “super freezing”, corresponds to a freezing process for which a final temperature of −18 °C must be reached “as fast as possible”. This concept was proposed in opposition to a conventional “freezing” process, for which no specific freezing rate is expected and the final storage temperature can be as high as −12 °C. The objective of this work is to propose a methodology to evaluate the mean amount of ice in a complex food as a function of temperature and to deduce a target temperature below which the food may be considered “frozen”. Based on the definition proposed by the IIF-IIR red book, this target temperature has been defined as the temperature at which 80% of the freezable water is frozen. A case study is presented with a model food made of two constituents.
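A minimal sketch of the 80% criterion, assuming the simple freezing-point-depression relation x_ice(T) ≈ 1 − T_if/T (temperatures in °C). This is not the paper's model for a complex food; it only illustrates how a target temperature follows from the criterion:

```python
def ice_fraction(T, T_if=-1.0):
    """Fraction of freezable water frozen at temperature T (deg C),
    using the simple relation x = 1 - T_if/T below the initial
    freezing point T_if (an assumed model, not the paper's)."""
    if T >= T_if:
        return 0.0
    return 1.0 - T_if / T

def target_temperature(frac=0.80, T_if=-1.0):
    """Temperature at which `frac` of the freezable water is frozen."""
    return T_if / (1.0 - frac)

# For an initial freezing point of -1 deg C, 80% of the freezable
# water is frozen at -5 deg C under this model.
print(target_temperature())
```

The same inversion applies to whatever ice-content model is used: fix the 80% criterion, then solve for the temperature.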

9.
Rodney W., Technology in Society, 2007, 29(4): 369–377
Around the world every year, nations urgently need assistance to cope with natural disasters, refugee crises, and famines. Such chronic urgencies for “foreign aid” tend to drive out actions aimed at achieving crucial goals for long-term economic development. Just as these pressures affect all donors of foreign assistance, they undermine the capacity-building essential in all developing countries. The program of the US Agency for International Development (AID) is a prime example of the distortions that result. Past priorities in foreign assistance on enhancing science and technology, and on nurturing human capital, now receive much less attention. Yet progress in S&T is central to economic growth, and historical trends show that the path to innovation demands multiple incentives rewarding autonomy, diversity, and experimentation within the private sector. Further, development must be bolstered, over decades, by patiently reinforcing and building the educational and technological institutions of the recipient of “aid.” Accordingly, this article proposes that AID appoint an S&T Adviser and establish a $50 million R&D effort. It is also imperative to restore an emphasis on human capital throughout AID's strategy. To do this well means conducting rigorous evaluations of results and responding thoughtfully to the priorities seen by the recipients of aid.

10.
The expression “ethics of family planning,” it is argued, has no firm meaning, and should not be taken to imply that a full set of moral rules and principles governing family planning has been or is likely to be established. A survey is made of recent views on population and economic and social development, and it is argued that, although there is indeed no “universal problem” of population, the optimistic, as well as the pessimistic, view of this relationship is open to doubt. It is further argued that “ethics” cannot be imposed on the subject matter of population from without: the very identification of a “problem” of population is evaluative from the start. A scheme of analysis to appraise the ethical status of measures to arrest or promote population growth is proposed, and a number of such measures are critically analyzed.

11.
Summary: Although its use in informetrics dates back at least to 1987, data analysed in a recent paper by Shan et al. (2004) have rekindled interest in the generalized Waring distribution (GWD). The purpose of this note is to show that for many purposes the distribution is best motivated via a familiar informetric scenario of a population of “sources” producing “items” over time, leading to a stochastic process from which the univariate, bivariate and multivariate forms of the GWD are natural consequences. Earlier work and possible future applications are highlighted. Many of the results are due to Irwin and Xekalaki, while much of the material on the Waring process has previously been available in an unpublished research report by the author (Burrell, 1991).

12.
Since the early 1960s, the OECD has been an important “think tank” in the area of science and technology policy. To the OECD, we owe the development of various statistical analyses, standards and norms for evaluating and, more importantly, for ranking countries on their scientific and technological performances. This paper traces the origins of this practice of ranking countries to the debate over technological gaps between the United States and Western Europe in the late 1960s. It shows how the OECD documented these gaps in a series of statistics that would form the basis for its later work on best practices, benchmarking exercises and scoreboards of indicators.

13.
A clear and appropriate role is presented for the federal government and the national laboratories with respect to technology development, technology utilization and competitiveness. The selective use of federal policy tools and resources for enhancing economic competitiveness and for providing “sustainable” economic development is proposed. A novel approach to a coherent national R&D strategy is advocated with three major components: tax credits, technology facilitation, and federal investment directed towards sustainable competitive participation. Sustainable competitive participation is based on the blending of the concepts of economic competitiveness and sustainable development. Four sectors of technology development and competitiveness are considered for this comprehensive national science and technology strategy. Several examples illustrate the appropriate federal government and national laboratory role in sustainable competitive participation.

14.
Black Tournai “marble”, a fine-grained Lower Carboniferous (Tournaisian) limestone able to take a good polish has been widely used in the Flanders region (Belgium). Highly crafted baptismal fonts and tombslabs were also exported to England, France and elsewhere during the Middle Ages. Such objects are particularly valuable since their distribution aids the dating of historical events and the reconstruction of medieval trade. Similar black “marble” was extracted in the Meuse valley (Belgium) in the Middle Ages, and there are exploited sources in the UK, Ireland and elsewhere. Thus, it is not straightforward to determine the provenance of black “marble”. Based on geological, stylistic and historical evidence, this paper shows the likelihood that a black “marble” tombslab found in Nidaros Cathedral in Trondheim (Central Norway) was extracted and crafted in Tournai and shipped northwards around 1160, possibly for the grave of the first Norwegian archbishop, Jon Birgerson. The tombslab represents the first known crafted stone imported to Norway from the European continent/British Isles and is thus unique in a historical context. The properties of the Trondheim tombslab match those of black Tournai “marble”: It is a silicified, bioclastic packstone loaded with crinoids, featuring bryozoa and fragments of brachiopods and ostracods. The high silica content and absence of foraminifers distinguish the stone from the Viséan black “marble” quarried in the Meuse valley.

15.
Recently, several manufacturers of domestic refrigerators have introduced models with “quick thaw” and “quick freeze” capabilities. In this study, the time required for freezing and thawing different meat products was determined for five different models of household refrigerators. Two refrigerators had “quick thaw” compartments and three refrigerators had “quick freeze” capabilities. It was found that some refrigerator models froze and thawed foods significantly faster than others (P<0.05). The refrigerators with the fastest freezing and thawing times were those with “quick thaw” and “quick freeze” capabilities. Heat transfer coefficients ranged from 8 to 15 W m−2 K−1 during freezing, and overall heat transfer coefficients ranged from 5 to 7 W m−2 K−1 during thawing. Mathematical predictions of freezing and thawing times in the refrigerators gave results similar to those obtained in experiments. With the results described, manufacturers can improve the design of refrigerators with quick thawing and freezing functions.
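As a rough illustration of how the convective heat transfer coefficient h drives freezing time, Plank's classical equation for an infinite slab can be used. This is not the study's prediction model, and the property values below are generic textbook-style numbers, not the paper's:

```python
def plank_freezing_time(h, d=0.02, rho=1050.0, L=250e3, k=1.5,
                        T_f=-1.0, T_air=-25.0):
    """Plank freezing time (s) for a slab of thickness d (m) cooled from
    both sides: t = rho*L/(T_f - T_air) * (P*d/h + R*d^2/k),
    with shape factors P = 1/2 and R = 1/8 for an infinite slab.
    All property values here are illustrative assumptions."""
    P, R = 0.5, 0.125
    return rho * L / (T_f - T_air) * (P * d / h + R * d ** 2 / k)

# A higher h (e.g., a "quick freeze" compartment with forced air)
# shortens the freezing time.
for h in (8.0, 15.0):
    print(h, plank_freezing_time(h) / 3600.0, "h")
```

The h-dependent term dominates for thin products, which is why the measured range of 8 to 15 W m−2 K−1 translates into clearly different freezing times.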

16.
The oxidation kinetics of an AlN–SiC–TiB2 composite in air has been investigated through a theoretical analysis combined with experimental data. The effects of temperature and heating rate on the oxidation reaction are discussed using the “characteristic oxidation time”. The results calculated by our model agree well with the experimental data. This study shows that the “characteristic oxidation time” is a useful parameter for comparing the oxidation resistance of different composites.
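If the oxidation is assumed to be thermally activated, a characteristic oxidation time can be sketched with an Arrhenius form; the pre-factor and activation energies below are invented, and this may differ from the authors' actual model:

```python
import math

R_GAS = 8.314  # gas constant, J mol^-1 K^-1

def characteristic_time(T, tau0, Ea):
    """Characteristic oxidation time (s) at absolute temperature T (K),
    assuming Arrhenius behaviour: tau = tau0 * exp(Ea / (R*T))."""
    return tau0 * math.exp(Ea / (R_GAS * T))

# Two hypothetical composites compared at 1000 deg C (1273 K):
# a larger tau means better oxidation resistance.
tau_A = characteristic_time(1273.0, tau0=1e-4, Ea=250e3)
tau_B = characteristic_time(1273.0, tau0=1e-4, Ea=220e3)
print(tau_A, tau_B, tau_A > tau_B)
```

Ranking composites by tau at a common service temperature is exactly the kind of comparison the abstract says the parameter enables.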

17.
An innovative photoelectric technology is presented in this paper for detecting frost, for defrost control in refrigerator and freezer applications. Experiments were conducted with a small-scale laboratory test section under natural convection conditions. Two desirable properties of the new technology, “on–off” and “linear” behaviour, were demonstrated by the experiments. The first property provides an effective trigger for the defrost-control strategy, while the second is suitable for developing an accurate measurement of the frost height. The factors (electric current, environment temperature, metal surface temperature, light intensity and sensor location) that affect these properties were investigated for the development of this technology.
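The use of the “linear” property for frost-height measurement, and of an “on–off” threshold for defrost triggering, can be sketched as a simple calibration. The signal values, units, and threshold below are synthetic, not the paper's sensor characteristics:

```python
import numpy as np

# Synthetic calibration data: photoelectric signal (mV) vs known frost height (mm).
signal = np.array([5.0, 12.0, 19.0, 26.5, 33.0])
height = np.array([0.0, 0.5, 1.0, 1.5, 2.0])

# "Linear" property: fit a straight-line calibration signal -> height.
slope, intercept = np.polyfit(signal, height, 1)

def frost_height(s):
    """Frost height (mm) estimated from the sensor signal (mV)."""
    return slope * s + intercept

# "On-off" use: a threshold on the estimated height triggers defrost.
DEFROST_THRESHOLD_MM = 1.2  # invented trigger level
s_meas = 28.0
h = frost_height(s_meas)
print(h, h > DEFROST_THRESHOLD_MM)
```

The same signal thus serves both roles: a continuous height estimate and a binary defrost decision.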

18.
A Capped Drucker–Prager (CDP) model was used to simulate the deformation-load response of low density (150–250 kg/m3) snow loaded at high strain rates (i.e., strain rates associated with vehicle passage) in the temperature range of −1 to −10 °C. The range of the appropriate model parameters was determined from experimental data. The parameters were refined by running finite-element models of a radially confined uniaxial compression test and a plate sinkage test and comparing the results with corresponding laboratory and field experiments. This effort resulted in two sets of model parameters for low density snow: one applicable to weak or “soft” snow and a second representative of stronger or “hard” (aged or sintered) snow. Together, these models provide a prediction of the upper and lower bounds of the macroscale snow response in this density range. Furthermore, the modeled snow compaction density agrees well with measured data. These models were used to simulate a tire rolling through new fallen snow and showed good agreement with the available field data over the same depth and density range.
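The shear part of a Drucker–Prager yield check can be sketched as follows; the friction angle and cohesion values are invented and the cap surface is omitted, so this only illustrates the model family, not the paper's calibrated snow parameters:

```python
import math

def dp_shear_yield(p, q, beta_deg=30.0, d=5.0e3):
    """Drucker-Prager shear yield function f = q - p*tan(beta) - d (Pa).
    f >= 0 means shear yielding at mean pressure p and Mises stress q.
    beta (friction angle) and d (cohesion) are illustrative values;
    the compaction cap used in the paper's CDP model is omitted here."""
    return q - p * math.tan(math.radians(beta_deg)) - d

# "Soft" snow (small d) yields at a stress state where "hard" snow
# (large d) is still elastic.
print(dp_shear_yield(p=10e3, q=12e3))           # soft: positive, yields
print(dp_shear_yield(p=10e3, q=12e3, d=20e3))   # hard: negative, elastic
```

The two calibrated parameter sets in the paper play exactly this role, bounding the snow response between a weak and a strong yield surface.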

19.
The objective of the Arbeitsgemeinschaft Deutscher Patentinformationszentren e.V. (The Association of German Patent Information Centres) is to encourage the ongoing development of the centres and to achieve broader public dissemination of industrial property rights information. Since use of the Internet has steadily increased amongst wide sections of the public in recent years, and since patent offices have taken steps to give the public free access to patent information, the association has set up a Germany-wide network known as PIZnet (“PIZ” standing for “patent information centres” in German) in co-operation with the Federal Printing Office. This network includes a presentation of all German patent information centres and the services they provide; it answers questions on the patent system, publishes offers of licences, and covers much else besides.

20.
Granular segregation in a rotating tumbler occurs due to differences in either particle size or density, which are often varied individually while the other is held constant. Both cases present theoretical challenges; even more challenging, however, is the case where density and size segregation may compete or reinforce each other. The number of studies addressing this situation is small. Here we present an experimental study of how the combination of size and density of the granular material affects mixing and segregation. Digital images are obtained of experiments performed in a half-filled quasi-2D circular tumbler using a bi-disperse mixture of equal volumes of different sizes of steel and glass beads. For particle size and density combinations where percolation and buoyancy both contribute to segregation, either radial streaks or a “classical” core can occur, depending on the particle size ratio. For particle combinations where percolation and buoyancy oppose one another, there is a transition between a core composed of denser beads to a core composed of smaller beads. Mixing can be achieved instead of segregation if the denser beads are also bigger and if the ratio of particle size is greater than the ratio of particle density. Temporal evolution of these segregated patterns is quantified in terms of a “segregation index” (based on the area of the segregated pattern) and a “shape index” (based on the area and perimeter of the segregated pattern).
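The two pattern measures named at the end of the abstract can be illustrated on a small binary image. The definitions below (an area fraction for the segregation index, a perimeter²/area circularity for the shape index) are plausible simplifications for the sketch, not necessarily the authors' exact formulas:

```python
import numpy as np

def area(mask):
    """Number of pixels in the segregated region."""
    return int(mask.sum())

def perimeter(mask):
    """Count exposed pixel edges of the region (4-connectivity)."""
    padded = np.pad(mask, 1)
    p = 0
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        # Pixels that are True where the shifted image is False lie on
        # the boundary facing that direction; sum over all 4 directions.
        p += int((padded & ~np.roll(padded, shift, axis=(0, 1)))[1:-1, 1:-1].sum())
    return p

# Toy binary image of the tumbler cross-section: a compact 4x4 "core".
mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:6] = True

seg_index = area(mask) / mask.size                         # segregated area fraction
shape_index = perimeter(mask) ** 2 / (4 * np.pi * area(mask))  # 1 for a disc
print(seg_index, shape_index)
```

Tracking these two numbers frame by frame over the digital images is what yields the temporal evolution the abstract describes: a compact core gives a shape index near 1, while radial streaks push it well above 1.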

