Similar Documents
20 similar documents found.
1.
Fatigue crack growth is represented using the fracture mechanics parameters ΔK and Kmax. Environmental effects that depend on time and stress affect the fatigue behavior predominantly through the Kmax parameter. The superimposed effects of environment and stress are seemingly complex. We have developed a methodology for classifying and separating the effects of environment on fatigue crack growth. A “crack growth trajectory map” is constructed from the behavior of ΔK versus Kmax for various constant crack growth rate curves. A “pure fatigue” behavior is defined in terms of environment-free behavior, such as in high vacuum. Deviation from this “pure fatigue” reference of the trajectory map is associated either with a monotonic mode of fracture or with superimposed environmental effects on crack growth. Using such an approach, called the “Unified Damage Approach”, we classify the environmental effects in almost all materials into only five types. Each of these types shows the combination of time and stress affecting the crack tip driving force, and thus ΔK and Kmax. The trajectory map depicts the changing material resistance due to the changing crack growth mechanisms with increasing crack growth rate, as reflected in terms of the applied stress intensities ΔK and Kmax. Thus the trajectory map provides a useful tool to separate the contributions of pure fatigue and superimposed monotonic modes, and to identify the governing crack growth mechanisms as a function of load ratio, crack growth rate, and environment. Understanding and quantifying the governing mechanisms would help in developing a more fundamental and reliable life prediction method.
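For readers who want to reproduce the basic geometry of such a map, the sketch below (not the paper's data; the load ratios, ΔK values, and growth-rate level are hypothetical) converts crack growth data measured at several load ratios into (Kmax, ΔK) pairs using the standard relation ΔK = (1 − R)·Kmax for R ≥ 0, and collects the points belonging to one constant growth rate into a single trajectory.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical (R, delta_K) pairs, each giving the same constant crack
# growth rate da/dN -- illustrative values only, not the paper's data.
data_at_constant_rate = [
    (0.1, 12.0),   # (load ratio R, delta_K in MPa*sqrt(m))
    (0.3, 10.5),
    (0.5, 9.0),
    (0.7, 7.0),
    (0.8, 5.5),
]

# For R >= 0, Kmax = delta_K / (1 - R); one point per load ratio.
trajectory = [(dk / (1.0 - R), dk) for R, dk in data_at_constant_rate]
kmax, dk = np.array(trajectory).T

plt.plot(kmax, dk, "o-", label="constant da/dN (illustrative)")
plt.xlabel("Kmax (MPa m$^{1/2}$)")
plt.ylabel("$\\Delta$K (MPa m$^{1/2}$)")
plt.legend()
plt.show()
```

Repeating this for several growth-rate levels, and comparing the curves against the same data measured in high vacuum, gives the "pure fatigue" reference and the deviations described in the abstract.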

2.
Recently, several manufacturers of domestic refrigerators have introduced models with “quick thaw” and “quick freeze” capabilities. In this study, the time required for freezing and thawing different meat products was determined for five different models of household refrigerators. Two refrigerators had “quick thaw” compartments and three refrigerators had “quick freeze” capabilities. It was found that some refrigerator models froze and thawed foods significantly faster than others (P<0.05). The refrigerators with the fastest freezing and thawing times were those with “quick thaw” and “quick freeze” capabilities. Heat transfer coefficients ranged from 8 to 15 W m−2 K−1 during freezing, and the overall heat transfer coefficients ranged from 5 to 7 W m−2 K−1 during thawing. Mathematical predictions of freezing and thawing times in the refrigerators gave results similar to those obtained in experiments. With the results described, manufacturers can improve the design of refrigerators with quick thawing and freezing functions.
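As a rough illustration of the kind of freezing-time prediction the abstract refers to, the sketch below applies Plank's classical equation for a slab-shaped product. The product properties and temperatures are assumed placeholder values, not the paper's; only the 8–15 W m−2 K−1 range for the surface heat transfer coefficient comes from the abstract.

```python
# Plank's equation for the freezing time of an infinite slab:
#   t_f = (rho * L_f / (T_f - T_a)) * (P * a / h + R * a**2 / k)
# with shape factors P = 1/2 and R = 1/8 for a slab.

rho = 1050.0      # frozen product density, kg/m^3 (assumed)
L_f = 250e3       # latent heat of freezing, J/kg (assumed)
T_f = -1.5        # initial freezing point, deg C (assumed)
T_a = -20.0       # freezer air temperature, deg C (assumed)
a = 0.03          # slab thickness, m (assumed)
k = 1.6           # thermal conductivity of frozen product, W/m/K (assumed)

def plank_freezing_time(h):
    """Freezing time in hours for surface heat transfer coefficient h (W/m^2/K)."""
    P, R = 0.5, 0.125
    t_seconds = rho * L_f / (T_f - T_a) * (P * a / h + R * a**2 / k)
    return t_seconds / 3600.0

for h in (8.0, 15.0):   # range reported in the abstract for freezing
    print(f"h = {h:4.1f} W/m^2/K  ->  freezing time ~ {plank_freezing_time(h):.1f} h")
```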

3.
Cross-ply laminates made of the carbon/epoxy IM7/977-2 system are investigated. The fatigue study is confined to ambient temperature conditions and a zero loading ratio. Damage is characterized by the transverse crack density ρ in the central 90°-layer. The family of experimental fatigue cracking curves (ρ versus N, where N is the number of cycles, for each tensile test maximum stress amplitude) can be replaced with a set of “iso-damage curves”, i.e. contour curves of constant ρ in the σ–log (N) plane. The iso-damage curves approximately constitute a fan of straight lines that intersect at a common point (σe, log (Ns)), where Ns is a very large number of cycles beyond which no more cracks appear, and σe is a fatigue limit. Our aim is to propose a simple method to predict fatigue cracking at an arbitrary maximum stress level by using data stemming from a constant strain rate test. This method essentially rests upon the construction of the above “iso-damage” curves, using very simple assumptions.
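The fan-of-lines geometry lends itself to a compact interpolation scheme. The sketch below is a reading of the abstract rather than the authors' method: the common point, the slopes, and their calibration are all assumed placeholder values (the paper calibrates from a constant strain rate test, which is not reproduced here). Each iso-damage level ρ is treated as a straight line through (σe, log Ns) in the σ–log N plane, and the line is solved for the number of cycles at which ρ is reached under a given maximum stress.

```python
# Assumed common point of the fan (placeholder values, not the paper's):
sigma_e = 600.0      # fatigue limit, MPa
log10_Ns = 7.0       # log10 of the "no more cracking" cycle count

# Assumed slopes d(sigma)/d(log10 N) for a few crack-density levels
# (cracks/mm); in a real application these would be calibrated from test data.
slopes = {0.1: -70.0, 0.3: -90.0, 0.5: -120.0}

def cycles_to_reach(rho, sigma_max):
    """Cycles needed to reach crack density rho at maximum stress sigma_max,
    assuming the iso-damage line sigma = sigma_e + m(rho) * (log10 N - log10 Ns)."""
    m = slopes[rho]
    log10_N = log10_Ns + (sigma_max - sigma_e) / m
    return 10.0 ** log10_N

for rho in sorted(slopes):
    print(f"rho = {rho:.1f} /mm reached after ~{cycles_to_reach(rho, 800.0):.2e} cycles at 800 MPa")
```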

4.
The presence of high electric fields at the drain junction in polycrystalline silicon (polysilicon) thin film transistors (TFTs) enhances several undesired effects, such as hot-carrier related instabilities and the kink effect. In order to reduce the drain electric field, a non-self-aligned (NSA) device architecture can be adopted. In this case, dopant activation and active layer crystallization are achieved at the same time by excimer laser annealing, resulting in substantial lateral dopant diffusion. The gradual doping profile provides not only a reduction of the drain electric field, but also a channel length shortening. Therefore, an effective channel length (Leff) has to be determined in such devices in order to successfully design circuit applications. In this work, Leff and parasitic resistance (Rp) modulation effects have been investigated in both n- and p-channel NSA polysilicon TFTs. Three different parameter extraction methods, originally proposed for crystalline MOSFET technology, have been used and compared in order to extract Leff and Rp: the “channel resistance” method, the “paired Vg” method, and the “shift and ratio” method. These methods indicate a channel length reduction of up to 1 μm and a non-negligible parasitic resistance effect. The reliability of the results of the three methods is discussed in terms of the applicability of the underlying assumptions to polysilicon TFTs, and numerical simulations are used to support the analysis.
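To illustrate the first of the three extraction techniques, the sketch below implements the basic "channel resistance" method under simple assumptions: for each gate voltage, the measured linear-region on-resistance is fitted as a straight line versus drawn channel length, and the common intersection of the fitted lines gives the channel-length offset ΔL (so Leff = L − ΔL) and the parasitic resistance Rp. The measurement data here are synthetic placeholders, not the paper's.

```python
import numpy as np

# Synthetic linear-region on-resistance data R_on(L, Vg) for NSA TFTs.
# Generated from an assumed "true" dL = 0.8 um and Rp = 2 kOhm, plus noise.
rng = np.random.default_rng(0)
L_drawn = np.array([2.0, 4.0, 6.0, 10.0])          # drawn channel lengths, um
Vg_list = [8.0, 12.0, 16.0]                        # gate voltages, V
true_dL, true_Rp = 0.8, 2.0e3
r_ch_per_um = {8.0: 3.0e3, 12.0: 2.0e3, 16.0: 1.5e3}   # Ohm per um of channel

R_on = {Vg: true_Rp + r_ch_per_um[Vg] * (L_drawn - true_dL)
             + rng.normal(0, 50.0, L_drawn.size)
        for Vg in Vg_list}

# Channel-resistance method: fit R_on = a(Vg) * L + b(Vg) for each Vg,
# then find the common intersection of the lines, located at (L = dL, R = Rp).
fits = {Vg: np.polyfit(L_drawn, R_on[Vg], 1) for Vg in Vg_list}

pts = []
for i, Vg1 in enumerate(Vg_list):
    for Vg2 in Vg_list[i + 1:]:
        a1, b1 = fits[Vg1]
        a2, b2 = fits[Vg2]
        L_x = (b2 - b1) / (a1 - a2)        # intersection of the two fitted lines
        pts.append((L_x, a1 * L_x + b1))
dL_est = np.mean([p[0] for p in pts])
Rp_est = np.mean([p[1] for p in pts])
print(f"extracted dL ~ {dL_est:.2f} um, Rp ~ {Rp_est:.0f} Ohm  (Leff = L_drawn - dL)")
```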

5.
6.
The mechanical behaviour of high-purity nickel under uniaxial and loading/unloading tensile tests is studied experimentally for specimens with different numbers of grains across the thickness. The specimens have a constant 500 μm thickness, and the mean number of grains across the thickness (i.e., thickness “t” to grain size “d” ratio) lies between 0.9 and 15. An extensive microstructural study shows no change of the microstructure with a modification of t/d. The experimental results show that the t/d ratio affects the hardening stages, flow stress, and intragranular and intergranular backstress of the samples. For specimens with few grains across the thickness, the flow stress is reduced due to a decrease in the intragranular backstress. The main explanation of these results is a delay in the generalization of cross-slip for the lowest t/d ratio specimens due to surface effects.

7.
Parallel mappings of the intellectual and cognitive structure of Software Engineering (SE) were conducted using Author Cocitation Analysis (ACA), PFNet analysis, and card sorting, a Knowledge Elicitation (KE) method. Cocitation counts for 60 prominent SE authors over the period 1990–1997 were gathered from SCISEARCH. Forty-six software engineers provided similar data by sorting authors’ names into labeled piles. At the 8-cluster level, ACA and KE identified similar author clusters representing key areas of SE research and application, though the KE labels suggested some differences between the way the authors’ works were used and how they were perceived by respondents. In both maps, the clusters were arranged along a horizontal axis moving from “micro” to “macro” level R&D activities (correlation of X axis coordinates = 0.73). The vertical axes of the two maps differed (correlation of Y axis coordinates = -0.08). The Y axis of the ACA map pointed to a continuum of high to low formal content in published work, whereas the Y axis of the KE map was anchored at the bottom by “generalist” authors and at the top by authors identified with a single, highly specific and consistent specialty. The PFNet of the raw ACA counts identified Boehm, Basili, and Booch as central figures in subregions of the network, with Boehm being connected directly or through a single intervening author to just over 50% of the author set. The combination of ACA and KE provides a richer picture of the knowledge domain and useful cross-validation.
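For readers unfamiliar with the mechanics, the short sketch below shows how raw author cocitation counts are typically assembled from document reference lists, and how the agreement between two map layouts can be summarized by correlating their axis coordinates, as in the 0.73 / -0.08 figures quoted above. The reference lists and map coordinates are invented placeholders, not the study's data.

```python
from itertools import combinations
from collections import Counter
import numpy as np

# Each document cites a set of authors (placeholder data).
documents = [
    {"Boehm", "Basili", "Booch"},
    {"Boehm", "Basili"},
    {"Booch", "Rumbaugh"},
    {"Boehm", "Rumbaugh", "Basili"},
]

# Cocitation count of an author pair = number of documents citing both.
cocitation = Counter()
for refs in documents:
    for pair in combinations(sorted(refs), 2):
        cocitation[pair] += 1
print(dict(cocitation))

# Agreement between two 1-D map layouts (e.g. X coordinates of the ACA
# and KE maps) via Pearson correlation. Coordinates are placeholders.
aca_x = np.array([0.1, 0.4, 0.9, 0.7])
ke_x = np.array([0.2, 0.5, 0.8, 0.9])
print("correlation of X coordinates:", np.corrcoef(aca_x, ke_x)[0, 1].round(2))
```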

8.
Common-cause failures (CCF) are one of the more critical and challenging issues for system reliability and risk analyses. Academic interest in modeling CCF, and more broadly in modeling dependent failures, has steadily grown over the years, both in the number of publications and in the sophistication of the analytical tools used. In the past few years, several influential articles have cast doubt on the relevance of redundancy, arguing that “redundancy backfires” through common-cause failures, and that the latter dominate unreliability, thus defeating the purpose of redundancy. In this work, we take issue with some of the results of these publications. In their stead, we provide a nuanced perspective on the (contingent) value of redundancy subject to common-cause failures. First, we review the incremental reliability and MTTF provided by redundancy subject to common-cause failures. Second, we introduce the concept and develop the analytics of the “redundancy–relevance boundary”: we propose this redundancy–relevance boundary as a design-aid tool that answers the following question: what level of redundancy is relevant or advantageous given a varying prevalence of common-cause failures? We investigate the conditions under which different levels of redundancy provide an incremental MTTF over that of a single component in the face of common-cause failures. Recognizing that redundancy comes at a cost, we also conduct a cost–benefit analysis of redundancy subject to common-cause failures, and demonstrate how this analysis modifies the redundancy–relevance boundary. We show how the value of redundancy is contingent on the prevalence of common-cause failures, the redundancy level considered, and the monadic cost–benefit ratio. Finally, we argue that general unqualified criticism of redundancy is misguided, and that efforts are better spent, for example, on understanding and mitigating the potential sources of common-cause failures rather than deriding the concept of redundancy in system design.
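The kind of MTTF comparison the abstract describes can be illustrated with the widely used beta-factor model (an assumption here; the paper's own parametrization may differ): a fraction β of each component's failure rate λ is attributed to a common cause that fails all redundant units simultaneously. The sketch below compares the MTTF of a two-unit parallel (1-out-of-2) arrangement with that of a single component as β varies, showing how the incremental benefit of redundancy shrinks as common-cause failures become more prevalent.

```python
# Beta-factor model for a 1-out-of-2 parallel system with exponential failures.
# Independent failure rate per unit: (1 - beta) * lam ; common-cause rate: beta * lam.
# R_sys(t) = exp(-lam_c * t) * [2 exp(-lam_i * t) - exp(-2 lam_i * t)]
# MTTF_sys = 2 / (lam_c + lam_i) - 1 / (lam_c + 2 lam_i)

def mttf_1oo2(lam, beta):
    lam_i = (1.0 - beta) * lam
    lam_c = beta * lam
    return 2.0 / (lam_c + lam_i) - 1.0 / (lam_c + 2.0 * lam_i)

lam = 1.0e-4          # component failure rate per hour (placeholder value)
mttf_single = 1.0 / lam

for beta in (0.0, 0.05, 0.1, 0.2, 0.5):
    gain = mttf_1oo2(lam, beta) / mttf_single
    print(f"beta = {beta:4.2f}:  MTTF_1oo2 / MTTF_single = {gain:.2f}")
```

With β = 0 the familiar factor of 1.5 over a single component is recovered; as β grows the ratio tends toward 1, which is the behaviour the "redundancy–relevance boundary" quantifies.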

9.
The electrostatic separator VASSILISSA is used in the exploration of fusion reactions. A 37° magnetic dipole was installed downstream of the second quadrupole triplet of the separator for the mass identification of evaporation residues. Mass determination is an additional method for the identification of new isotopes when traditional methods are insufficient. With the combination “VASSILISSA + 37° dipole magnet”, a mass resolution Δm/m of better than 1% was achieved in test reactions; for single events, Δm/m < 2% is expected.

10.
Robert B.  Technology in Society, 2003, 25(4): 513–516
Three tasks must be included when considering the broad topic of urban security. The first is to define the term “critical infrastructure.” Second, security must be viewed from a systems perspective when looking at cities and the infrastructure that serves them. Third, careful scrutiny must be given to heretofore unconsidered vulnerabilities that exist in every major city.

In the hours and days immediately following the attacks on September 11, everything from foot bridges to tall buildings was considered to be critical infrastructure. But, clearly, not everything in such a broad definition can be defended. So then, what is today’s definition of critical infrastructure? One answer might be a new version of the “3 R’s”—resist, respond, recover. In those terms, “critical infrastructure” could be defined as: (a) systems whose rapid failure would lead to a catastrophic loss of life; (b) systems whose failure or significant degradation would lead to unacceptable economic consequences; (c) systems whose rapid failure would significantly impact rescue and response efforts; and (d) systems whose significant degradation would severely impact recovery efforts.

Resist? It would be impossible for a city to resist everything, everywhere. The ability to respond to some events would require efforts that are above and beyond the realistic capability of any city. That moves the scenario to recovery and rebuilding.

11.
In recent years, interest in cooling machines or heat pumps combining the principles of compression and sorption technology has been increasing. The reason is that both technologies have specific drawbacks which can be overcome by the combination. Our discussion is centred around absorption cycles which use a compressor and, consequently, an input of a significant amount of mechanical work in addition to heat. In most publications, cycles of this kind are discussed in terms of one single COP, as is usual in the refrigeration industry. This, however, is wrong from a thermodynamic point of view, and misleading from a technical and economic one. In order to highlight the need for a strict thermodynamic approach, a fundamental difference between distinct kinds of work input, namely “recoverable work”, “dissipative work” and “heat transformation work”, is discussed in the first part of the paper. In the second part, it is shown how the input of both work and heat into an energy conversion system has to be handled with both a mechanical and a thermal COP. The method is thermodynamically sound and straightforward, technically feasible and easy to apply, and most readily translated into economic terms. In the third part, a practical example of a compression–absorption hybrid is investigated.
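The separation into a mechanical and a thermal COP can be made concrete with a simple energy-balance sketch. The operating figures below are placeholder values for a generic compression–absorption hybrid, not the paper's example: the cooling effect is referred separately to the mechanical work drawn by the compressor and to the heat supplied to the generator, instead of lumping both inputs into one figure.

```python
# Placeholder operating point of a compression-absorption hybrid chiller.
Q_evap = 100.0   # cooling capacity at the evaporator, kW
W_mech = 18.0    # compressor shaft work input, kW (assumed)
Q_gen  = 70.0    # heat input to the generator/desorber, kW (assumed)

# Separate figures of merit, as argued for in the abstract:
COP_mechanical = Q_evap / W_mech     # cooling per unit of mechanical work
COP_thermal    = Q_evap / Q_gen      # cooling per unit of driving heat

# A single lumped COP hides the very different thermodynamic value of the two inputs:
COP_lumped = Q_evap / (W_mech + Q_gen)

print(f"mechanical COP = {COP_mechanical:.2f}")
print(f"thermal COP    = {COP_thermal:.2f}")
print(f"lumped COP     = {COP_lumped:.2f}  (conflates work and heat)")
```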

12.
The oxidation kinetics of an AlN–SiC–TiB2 composite in air has been investigated by means of a theoretical analysis combined with experimental data. The effects of temperature and heating rate on the oxidation reaction are discussed using the “characteristic oxidation time”. The calculated results from our model agree well with the experimental data. This study shows that the “characteristic oxidation time” is a very useful parameter for comparing the oxidation resistance of different composites.

13.
This paper presents experimental results from a prototype ammonia chiller with an air-cooled condenser and a plate evaporator. The main objectives were charge reduction and compactness of the system. The charge is reduced to 20 g/kW (2.5 oz/Ton), which is lower than that of any air-cooled ammonia chiller currently available on the market. The major contribution comes from the use of microchannel aluminum tubes. Two aluminum condensers were evaluated in the chiller: one with a parallel tube arrangement between headers and “microchannel” tubes (hydraulic diameter Dh = 0.7 mm), and the other with a single serpentine “macrochannel” tube (Dh = 4.06 mm). The performance of the chiller and condensers is compared with that of other available ammonia chillers on the basis of various criteria. This prototype was built and tested at the Air Conditioning and Refrigeration Center of the University of Illinois at Urbana-Champaign in 1998.

14.
Characteristics of the discrete vibration levels of a substitutional impurity are calculated for frequencies belonging to the gap between the acoustic and optical bands (deuterium in PdH) as well as for regions outside the phonon spectrum (hydrogen in PdD). Neutron diffraction data are used to determine the force constants of pure palladium as well as of PdH solid solutions. Diverse configurations of vacancies close to the impurity atoms are discussed. With the help of the calculated frequencies and intensities of the local and “gap” vibrations in the “isotope defect + vacancies” system, it is possible to analyze the behavior of certain characteristics in disordered solid solutions PdH(D)x<1 and to obtain information about the vacancy structure of the considered compounds. PACS numbers: 63.20.Mt, 63.20.Pw, 63.50.+x.

15.
The time evolution of the local fields BL(t) has been measured on the surface of a superconducting bulk disk magnetized by a two-stage pulse-field magnetizing technique, called the modified multi-pulse technique combined with stepwise cooling (MMPSC), and the magnetic flux movement and flux trapping have been investigated. The optimum concave (“M-shaped”) trapped field profile, which is a necessary condition at the first stage to enhance the final trapped field BT, produces a larger magnetic gradient (dB/dx) at the bulk periphery during the ascending stage of the applied magnetic pulse at the second stage, due to the large viscous force Fv. The magnetic fluxes that stay at the bulk periphery start to flow toward the center of the bulk after the applied pulse field reaches its maximum, at which point the flux velocity v is nearly zero and Fv then decreases. As a result, a large number of magnetic fluxes are trapped at the bulk center. The effect of the “M-shaped” profile at the first stage of MMPSC on the enhancement of BT is discussed.

16.
Design seismic forces depend on the peak ground acceleration (PGA) and on the shape of the design spectrum curves dictated by building codes. There is now no doubt that it is necessary to construct so-called “site- and region-specific” design input ground motions reflecting the influence of events of different magnitudes at different distances that may occur during a specified time period. A unified approach to ground motion parameter estimation is described. A collection of ground motion recordings of small to moderate (3.0–3.5 ≤ ML ≤ 6.5) earthquakes obtained during the execution of the Taiwan Strong Motion Instrumentation Program (TSMIP) since 1991 was used to study the source scaling model, attenuation relations, and site effects in the Taiwan region. A stochastic simulation technique was applied to predict PGA and response spectra for the Taipei basin. “Site- and region-dependent” uniform hazard response spectra were estimated for various geological conditions in the Taipei basin using probabilistic seismic hazard analysis.
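A compressed illustration of the probabilistic step is sketched below. It uses a generic attenuation relation of the form ln PGA = c0 + c1·M − c2·ln(R + c3) with entirely hypothetical coefficients (not the TSMIP-derived relations of the paper) and sums annual exceedance rates over a small set of magnitude–distance scenarios, which is the essence of a hazard estimate underlying a uniform hazard spectrum.

```python
import math

# Hypothetical attenuation coefficients -- NOT the Taiwan relations of the paper.
c0, c1, c2, c3, sigma_ln = -3.5, 1.0, 1.2, 10.0, 0.6

def median_pga(M, R_km):
    """Median PGA in g from a generic ln-linear attenuation relation."""
    return math.exp(c0 + c1 * M - c2 * math.log(R_km + c3))

def prob_exceed(pga_target, M, R_km):
    """P(PGA > target | M, R) assuming lognormal scatter with sigma_ln."""
    z = (math.log(pga_target) - math.log(median_pga(M, R_km))) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Scenario list: (annual occurrence rate, magnitude, distance in km) -- placeholders.
scenarios = [(0.10, 5.0, 30.0), (0.02, 6.0, 40.0), (0.005, 6.5, 25.0)]

target = 0.15   # g
annual_rate = sum(rate * prob_exceed(target, M, R) for rate, M, R in scenarios)
print(f"annual rate of PGA > {target} g  ~  {annual_rate:.4f}")
```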

17.
Although joined together by their commitment to inquiry, in their pursuit of seemingly divergent goals science and the humanities sometimes appear to be in tension. This article suggests that the public humanities programs sponsored by state humanities councils, the independent nonprofit state affiliates of the National Endowment for the Humanities, serve as vehicles for reconciling the differing concerns of science and the humanities. The article highlights a variety of thoughtful, successful community-focused science and humanities programs offered by state humanities councils, including a series of targeted programs supported through a special initiative jointly sponsored by NEH and NSF in the mid-1990s, and invites consideration of opportunities for future collaboration.

Whether as informed inquiry or organized skepticism, the process of questioning represents a crucial connection between science and the humanities. The importance of this connection was especially significant to the scholars, educators, and politicians who helped establish the National Endowment for the Humanities (NEH) in 1965. They were concerned about perceptions that the humanities and science were at odds, and they were anxious about the apparent advantageous position of science, as reflected in the 15-year existence of the National Science Foundation, and magnified by major increases in federal support for science following the 1957 launch of Sputnik and the ensuing “space race”.

These worries no doubt also motivated the group’s focus on the similarities between science and the humanities “as systematic approaches to knowledge and understanding”, further buttressing their argument that the humanistic disciplines were a legitimate national concern [1]. The 1964 Report of the Commission on the Humanities [2], which laid the foundation for the NEH, noted, “if the interdependence of science and the humanities were more generally understood, men would be more likely to become masters of their technology and not its unthinking servants” (p. 2).

In the years following the 1964 Report, we are still pursuing a broader understanding of the “interdependence of science and the humanities”. Throughout the 20th and into the 21st century, science and technology have emerged as preeminent forces that shape and define nearly every aspect of life. From the microchip revolution to the prospect of human cloning, from smart bombs to smart highways, from the fanciful notion of a universe defined by string theory to the rhetoric of the Unabomber’s infamous Manifesto [3], the challenges posed by science and technology—their impacts on our lives, our institutions, and our basic understanding of the world—have been profound.

Despite this intertwining of science and human experience, many Americans, while readily endorsing increased funding for science, believe the culture of science to be inaccessible. They make light of their capacity to derive satisfaction from science literature or learning and, more significantly, retreat from discussions of public policy issues involving science and technology. Their withdrawal from such policy debate threatens the long-term health of our democratic society.

It is illuminating to examine ways in which the public humanities, particularly the community-based work of state humanities councils, make it possible to reconcile the two strands of potentially divergent thought defining the relationship between science and the humanities and to facilitate meaningful connections.
One approach, self-reflective in its analysis and based on successfully attracting resources for work done in the respective fields, considers the pursuit of knowledge in science and the humanities as having far different ends—and scientific ends being of greater utility (i.e., “two cultures”, one of greater significance).

A recent example of this approach can be seen in the Final Report of the Roundtable on Scholarly Communication in the Humanities and Social Sciences [4]. The Roundtable was convened by the Association of Research Libraries, the National Humanities Alliance, and the Knight Collaborative, with support from NEH, for the purpose of considering the future dissemination of scholarly findings in the humanities and social sciences. Reflecting on the publishing challenges confronting the “disciplines that are rooted in a non-profit ethos” at a time of rising costs and changing technologies, the authors of the report observed that in thinking about
...the predilections of the humanists and social scientists thus assembled, we talked about ends more than means—about the purposes of discourse and discovery, and only subsequently about the dissemination of results. In the fields that were the primary focus of “To Publish and Perish”, principally science, medicine, and technology, the issues were really ones of access, cost, and control. While these concerns matter to humanists and social scientists, the more central issues of audience, style and purpose often overshadow them. (p. 2)
The authors continue, “The societal tendency through the latter half of the 20th century, however, has been to distinguish between kinds of knowledge—and to value the practical advances in science, medicine, and technology over scholarship in such areas as literature, languages, history, philosophy, politics, and art.” (p. 3) Finally, in a statement apropos to a discussion of making a public case, the report notes,
No scholar in the humanities and social sciences can fail to perceive the difference between the kind of external support provided to the scientific fields and that which the work in his or her own discipline attracts. ...Through the past two decades, the scientific disciplines have proven remarkably successful in building public support for research in apparently inscrutable domains, deploying the popular media to help communicate both the excitement and value of scientific discovery. (p. 5)
Although this visceral appeal to the public is significant, when it is complemented with humanistic inquiry, there is a far more important additional benefit: deepening public understanding of the moral complexity of science and technology. This is the goal of programs sponsored by several state humanities councils seeking to bridge the gap between the practicality and apparent certainty of science and the often-frustrating ambiguity of the humanities. For example, during the period 2000–2002, the Texas Council on the Humanities issued a Request for Proposals (RFP) for projects directed at the theme “Science and Human Values” [5]. The RFP asked:
Is the universe a vast yet ultimately predictable machine? Or is it an infinitely dynamic process rendered unpredictable by countless random events? What does either model have to do with the way we go about the daily business of living? Are there certain assumptions implicit in the worldview of Newtonian physics, quantum mechanics, or chaos theory that impact the human imagination and influence human interaction? How do new technologies affect the way we relate to one another and form communities? These are a few of the questions driving important conversations between the sciences and the humanities. Exploding scientific discoveries and rapidly developing technologies are affecting the way we interpret our experience and the way we live, taking us always, as Jacob Bronowski says, “to the brink of what is known”. TCH invites proposals for projects that will provide opportunities for Texans to consider and discuss issues such as:
• Web of Human Relationships: New Technologies, New Communities
• From Revolutions in Science to Evolutions in Human Thought
• Technologies of Life: Health Care, Genetics, and Medical Ethics
• The Self and the Laws of Science
• Artificial Intelligence and the Nature of Knowledge
• New Theories in Education and Business
• The History of Science and Society
Much of the discussion in these and other humanities-based programs involves examining assumptions that shape the work of those pursuing science and technology, or in some cases, considering how significant outcomes may be overlooked or disregarded when the question of purpose is ignored. In the case of a successful project initiated by the Maine Humanities Council and now carried out by several New England councils, entitled “Literature & Medicine: Humanities at the Heart of Health Care,” health care providers participate in a series of reading and discussion programs that encourage them to connect the world of medicine with the world of lived experience. A family physician who attended the seminars in Maine for three years stated, “we use literature to help strip away the assumptions we bring to work, and improve our understanding of our patients and each other.”

In the mid-1990s, the Virginia Foundation for the Humanities brought together scientists, humanities scholars, and members of the public, especially teachers, in an “Initiative on Science, Technology and Society” [6]. The project, comprising a public discussion series and a teachers’ institute that attracted teachers in a broad range of disciplines from across the state, was designed to help those attending become familiar with the goals of science by considering the implications for the average individual.

The council described the impetus for the program as follows:
For many people, science and the benefits that result from technological innovation are inseparable from the idea of human progress. Others believe that science may have gone too far, or at least progressed too quickly, raising issues that society cannot answer and moral dilemmas that individuals can barely comprehend, much less address. Meanwhile, increasingly sophisticated technologies born of new scientific discoveries are continually reshaping the fields of health care, education, transportation, communication, agriculture, and a host of other activities from sexuality and human reproduction to personal banking. But at the same time, very few non-scientists become actively engaged in making judgments about what science should do. Likewise, few opportunities exist at present for so-called ordinary people to question and interpret the work of science and the applications of new research, or to participate in structured discussions about the impact of technology on their lives. [p. 33]
The council noted further:
Our goal is to continue to provide a framework and a stimulus for new programs that erode the customary distinctions between scientists and humanists; programs that foster a renewed public interest in the work that scientists do and a greater understanding of the complex relationship between science, culture, government, and the marketplace; programs that encourage informed debate about the directions of scientific research and the applications of technology in light of their practical and moral consequences. [p. 34-35]
Their public discussion series, “Science and Society: Toward a New Understanding of the Covenant,” involved five lectures on the benefits and challenges presented by science in our democratic society, and all five lectures were broadcast and downlinked to six remote sites.

Information about the Virginia program was drawn from material the council submitted in connection with “Nature, Technology and Human Understanding”, a joint initiative of the NSF and NEH [7]. From 1993 to 1995, the NSF/NEH ran a competition for state councils which was designed to promote greater public understanding of the interrelationships between science and the humanities. The guidelines for the first year of the initiative provided that it would support “public programs designed to inform and stimulate discussion about the interrelations of science, technology and the humanities... It is also hoped that the project will have long-term benefits for public understanding of the sciences and humanities...” (p. 1). Suggested topics included:
• conception and definition of science
• understanding nature and mind
• science, engineering, and social change
• history of science and evolution of engineering
• science and its cultural context.
It is interesting to note the subtle shift in emphasis from 1993 to 1995, apparent in the guidelines for the 1995 funding cycle:
NEH and NSF expect that the projects will have long-term benefits for public understanding of the sciences, engineering, and humanities, and of the ways in which these systematic approaches to knowledge form part of our daily lives. The agencies also hope that, through these projects, the public will come to have a deeper understanding of the basic character of humanistic inquiry and scientific methods as well as a heightened awareness of the socio-political aspects of scientific institutions, including the interaction of science and technology with democratic processes, and the philosophical issues raised by the practice of science and engineering in particular social contexts. (p. 1)
Here we see the emphasis not on science and the humanities as two cultures with distinctive goals, but rather as complementary, interrelated systems of knowledge. The shift is reflected in the slightly different thrust of the topics suggested in the 1995 guidelines:
• the social context of science
• political culture and science
• understanding nature and mind
• approaches to knowledge
• science and its cultural context in the USA.
The Virginia council was one of several state humanities councils that developed a diverse array of projects which were subsequently supported through the NSF/NEH initiative. Several councils drew upon the integrative model of the relationship between science and technology and the humanities. The Georgia council sponsored a series of public lectures and discussion sessions on “Technology and the African American Experience”; the Kentucky council sponsored lectures and discussion sessions on the theme, “Science in Our Lives”, which examined the state’s transition from an agrarian lifestyle and an economy based on tobacco farming and coal mining to more technologically sophisticated alternatives; the Nevada council sponsored a seven-part lecture series, “Nevada in the Nuclear Age”, which explored growing scientific, philosophical, and social concerns associated with the nuclear era.

Like its counterpart in Virginia, the New Hampshire Humanities Council pursued a fairly comprehensive model. Indeed, by the time of the launch of the NEH/NSF initiative, the New Hampshire council had already developed its unique “Scientist as Humanist Project”, which for several years (1990–1994) brought together science and humanities teachers from New Hampshire schools for resident summer teacher institutes in which they explored connections between science and the humanities. In describing the success of this project in their application for a grant under the 1993 NEH/NSF special initiative, the council reported “dramatic results in integrating sciences and the humanities in the classroom, in breaking down the perceived barriers between the sciences and the humanities, and in redefining the notions of insight, creativity, and categories of knowledge.” To their earlier initiative they added “Of Apples and Origins: Stories about Life on Earth”, which featured reading discussions exploring the history of great ideas in science and philosophy, a series of public lectures, and a closing conference where audiences were offered new insights into 20th century science and its implications for everyday life.

Responding to the tremendous success of these programs, the New Hampshire council sought and received special funding from NEH for a second phase of the project, “Of Apples and Origins II: The Brain, The Mind and Human Meaning”, designed to explore new ideas about human consciousness and the brain. Following the same model as its predecessor, the program culminated in a two-day public conference focused on the new brain/mind science and its impact on our understanding of the self.

The appeal of this approach is its relevance to the everyday life of the average individual. State humanities council programs in this area seek to make science and technology accessible by connecting theoretical knowledge with its practical application. Thus, during 1997–1999, the Pennsylvania Humanities Council, building on the NSF/NEH-supported project they had created on new communication technologies and their effect on American society, pursued a project entitled “Technology, Communications and Community”. With support from the Howard Heinz Endowment, this project gave participants opportunities, through community forums, read-and-discuss groups, Internet training, and demonstrations, to discuss the impacts on American communities of 20th-century technologies such as radio, television, and the computer.
A subsequent project, “Technology and Community”, brought humanities scholars together with the public in face-to-face and online conversations about the proliferation of the Internet in society. Other similar programs supported by the council included a panel discussion and in-person and online lectures on DNA research and its impact on the work of family historians conducted by the Genealogical Society of Pennsylvania, and a project sponsored by the Franklin Institute which gathered an audiovisual collection of oral histories that document how scientific and technological innovation in the past century had affected and shaped the lives of local citizens.

Today, almost a decade later, there are still areas where science and the humanities clearly share concerns and find opportunities for joint problem-solving. Examples include:
—The commoditization of knowledge. The rise of market-driven, corporate-financed research at universities has been noted with increasing alarm by scholars and lamented by public commentators. A major source of concern—the compromising of openness and sharing which traditionally characterized and gave distinction to the nonprofit knowledge enterprise represented by the university—is a question for the humanities.
—The growing divide between pure and applied research. The increasingly pervasive emphasis on connecting funding to immediately demonstrable, utilitarian results and/or economic benefits (reflected in skewed federal support and private funding) mirrors the phenomenon discussed earlier in this essay regarding the “divide” between science and the humanities.
—The persistence of non-scientific, and widespread growth of anti-scientific, views.
The public humanities provide a means for resisting the push to commoditize knowledge and willful science illiteracy. Public programs, reading and discussion programs, and teacher institutes supported by state humanities councils offer a context for public conversations about issues that require scientific expertise. Such community-based activity is a powerful complement to the efforts of beleaguered communities of humanities academics, which one would expect to be among those leading the charge against trends toward commercialization on the one hand and the questioning of scientific knowledge on the other. Unfortunately, the post-modern tendency to delegitimize knowledge has led many university-based humanities scholars to retreat from public debate about right and wrong and to adopt a relativistic approach to all knowledge. This runs the risk of abandoning an intellectually bereft public to the easy pickings of a market without morality or to a backward-looking ideology of denial bent on resisting the onslaught of modernity.

State humanities councils have a rich history of helping citizens feel they can participate in conversations about science and technology. Council programs and projects take advantage of the unique capacity of the humanities to explore and explain complexity in human life. These are the vitally important occasions where science and the humanities join together to make their public case.

18.
Creep deformation and failure of E911/E911 and P92/P92 similar weld-joints
This paper deals with the characterisation of the microstructure and creep behaviour of similar weld-joints of the advanced 9% Cr ferritic steels E911 and P92. The microstructures of the investigated weld-joints exhibit significant variability between the different weld-joint regions, namely the weld metal (WM), the heat-affected zone (HAZ), and the base metal (BM). Cross-weld creep tests were carried out at 625 °C with initial applied stresses of 100 and 120 MPa. Both weld-joints ruptured by the “type IV cracking” failure mode in their fine-grained heat-affected zones (FG-HAZ). The creep fracture location with the smallest precipitate density corresponds well with the smallest measured cross-weld hardness. The welds of P92 steel exhibit better creep resistance than those of E911 steel. Whereas the microstructure of the P92 weld still contains laths after creep, the microstructure of the E911 weld is clearly recrystallized. The creep stress exponents are 14.5 and 8 for the E911 and P92 weld-joints, respectively. These n-values indicate “power-law creep” with a dislocation-controlled deformation mechanism for both investigated weld-joints.
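The stress exponent n quoted above is the slope of the minimum creep rate versus stress on a log–log plot (Norton's law, creep rate = A·σⁿ). The sketch below shows the two-point estimate at the stresses used in the cross-weld tests (100 and 120 MPa); the creep-rate values themselves are hypothetical placeholders, since the abstract does not report them.

```python
import math

# Norton power-law creep: strain_rate = A * sigma**n, so
#   n = ln(rate2 / rate1) / ln(sigma2 / sigma1)
sigma1, sigma2 = 100.0, 120.0          # MPa, stresses used in the cross-weld creep tests

# Hypothetical minimum creep rates (1/h), chosen only to illustrate the calculation;
# rate2 is constructed so that the E911 value n = 14.5 is recovered by construction.
rate1 = 2.0e-6
rate2 = rate1 * (sigma2 / sigma1) ** 14.5

n = math.log(rate2 / rate1) / math.log(sigma2 / sigma1)
print(f"estimated stress exponent n = {n:.1f}")
```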

19.
The assessment of system unreliability is usually accomplished through well-known tools such as block diagrams, fault trees, Monte Carlo simulation and others. These methods imply knowledge of the failure probability density function of each component “k” (pdf pk). For this reason, possibly, the system failure probability density function (psys) has never been explicitly derived. The present paper fills this gap, achieving an enlightening formulation which explicitly gives psys as the sum of (positive) terms representing the complete set of transitions leading the system from an operating to a failed configuration due to the failure of “a last” component. As a matter of fact, these are all the independent sequences leading the system to failure. In our opinion, this formulation is important from both a methodological and a practical point of view. From the methodological one, a clear insight into system-versus-component behaviour can be gained, and, in general, the explicit link between psys and pk seems to be a notable result. From a practical point of view, psys allows a rigorous derivation of Monte Carlo algorithms and suggests a systematic tool for investigating the system failure sequences.
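For the simplest non-trivial case, a two-component parallel system, the "failure of a last component" decomposition can be written out directly: the system fails at time t if one component fails at t while the other has already failed. The sketch below evaluates that sum of two (positive) transition terms and checks it against the derivative of the system unreliability. The exponential component pdfs are placeholders for illustration only; the paper's formulation is general and does not assume them.

```python
import math

# Two-component parallel system (both must fail for the system to fail).
lam1, lam2 = 1.0e-3, 2.0e-3                 # placeholder exponential failure rates (1/h)

p = lambda lam, t: lam * math.exp(-lam * t)         # component failure pdf
F = lambda lam, t: 1.0 - math.exp(-lam * t)         # component failure CDF

def p_sys(t):
    """System failure pdf as the sum of 'last-component' transition terms:
    component 1 fails last (2 already failed) + component 2 fails last."""
    return p(lam1, t) * F(lam2, t) + p(lam2, t) * F(lam1, t)

# Numerical check: p_sys should equal d/dt [F1(t) * F2(t)].
t, dt = 500.0, 1.0e-3
numerical = (F(lam1, t + dt) * F(lam2, t + dt) - F(lam1, t) * F(lam2, t)) / dt
print(f"p_sys(t)      = {p_sys(t):.6e}")
print(f"d/dt F1*F2    = {numerical:.6e}")
```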

20.
This review examines six different mechanisms of forming solid structures by freezing: solidification-driven extrusion (SDE), hydrodynamic sputtering, laser-induced periodic surface structures (LIPSS), capillary waves, the Mullins–Sekerka instability, and laser zone texturing. Particular emphasis is placed on how these mechanisms relate to structures formed after melting of surfaces by laser irradiation, even though several of these mechanisms also operate for more conventional melts. The Bally–Dorsey model of SDE explains a mechanism for making spikes of materials that expand upon melting, and it scales from the centimeter regime down to the nanoscale. Capillary waves are often said to “cause” or be “responsible for” the formation of structures with periodicities Λ ranging anywhere from the wavelength of the incident light to over 10× the wavelength. Here it is shown that while capillary waves formed in conjunction with femtosecond to nanosecond pulsed laser irradiation above the melting threshold support structures with 150 nm ≤ Λ ≤ 5 μm, they do not cause the structures to form, and some other stimulus is required to select the dominant capillary wave. Capillarity actually inhibits the formation of the smallest structures. Features with larger periodicities can be formed during laser irradiation, but they require mass transport that is achieved by, e.g., thermocapillarity or the pressure of the laser ablation plume.
