Similar Documents
1.
2.
This study proposes a methodology that integrates sociotechnical systems (STS) theory and media big-data analysis using text mining for a new, real-time technology assessment (TA). The essential steps of the method are data collection using a cultural map, analysis of trends and patents, and synthesis using media big data. By applying this methodology to artificial organs, we first show that STS can be applied to biosocial technical systems beyond the sustainability transition. The results reveal, in an objective way, media discourse structures in which eight countries began to form socio-technical regimes around the technologies in which each is strongest. Each technology corresponded to the vested interests in each country's socio-technical regimes. These discourse structures helped us to identify substitution, two types of transformation, and reconfiguration as transition pathways. More importantly, our results also show that the methodology helps to overcome the anticipation dilemma, saving the time and resources required for TA: the integrated methodology achieved similar results using 23% of the budget, 25% of the time, and 14% of the work hours of an official TA. Lastly, the "objectivity" and "agenda setting" of this methodology can provide a breakthrough in overcoming the control dilemma.

3.
The fundamentals of the retention equilibrium in reversed-phase liquid chromatography (RPLC) are studied on the basis of enthalpy-entropy compensation (EEC). First, retention data were acquired and the influence of the nature of the compounds, organic solvent modifier, and temperature on these data was assessed. Then, the data were analyzed according to the four different methods proposed by Krug et al., and an EEC was formally established. Linear correlations were observed between the logarithm of the adsorption equilibrium constants under the different RPLC conditions, suggesting linear free energy relationships (LFERs). Finally, the variations of the retentions with the experimental conditions are shown to be quantitatively explained by a new model based on EEC. This model affords a comprehensive interpretation of the variations of retention originating from changes of either one parameter alone or several simultaneously. The slope and intercept of the LFER that relates two equilibrium systems are accounted for by the new model. The parameters of this model are the changes of enthalpy and entropy associated with the retention, the compensation temperatures, and the experimental conditions.
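
The compensation mechanism behind these results can be sketched numerically. In the sketch below (a hedged illustration with synthetic values, not the authors' model or data), a family of hypothetical solutes obeys exact enthalpy-entropy compensation, ΔH = ΔH₀ + T_c·ΔS; the van't Hoff relation then produces a linear free energy relationship between ln K at two temperatures, and regressing ΔH on ΔS recovers the assumed compensation temperature T_c:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def vant_hoff_lnk(dH, dS, T):
    """ln K from the van't Hoff relation: ln K = -dH/(R T) + dS/R."""
    return -dH / (R * T) + dS / R

# Hypothetical solute family obeying exact enthalpy-entropy compensation
# dH = dH0 + Tc * dS; Tc and dH0 are assumed values for illustration only.
Tc, dH0 = 450.0, -12.0e3           # K, J/mol (hypothetical)
dS = np.linspace(-40.0, -5.0, 8)   # J/(mol K)
dH = dH0 + Tc * dS                 # J/mol

# An EEC family yields an LFER: ln K at one temperature is an exactly
# linear function of ln K at another temperature.
lnk_T1 = vant_hoff_lnk(dH, dS, 298.0)
lnk_T2 = vant_hoff_lnk(dH, dS, 318.0)
slope, intercept = np.polyfit(lnk_T1, lnk_T2, 1)

# Recover the compensation temperature as the slope of dH versus dS.
Tc_est = np.polyfit(dS, dH, 1)[0]
```

Because ΔH is exactly linear in ΔS here, the LFER fit is exact; with real retention data the scatter around both regressions is what Krug et al.'s tests are designed to assess.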

4.
Feature recognition (FR) for machining planning has most often focused on extracting simple-shaped form features such as holes, slots, rectangular pockets, etc. In Part I of this paper, it was proposed that it is possible to search for more complex shaped features - provided that machining planning for such shapes can be performed automatically and without loss of total machining time. Part II describes the greedy tool heuristic, which can be used as the basis for computationally tractable search strategies to mill complex profiled 2.5D pockets in near-optimal time. A machining planner based on this strategy has been developed. Its successful integration with the FR system previously described justifies the premise that distinguishes the present system from previous ones: maximize the proportion of the delta volume consumed by generalized pockets. It will be shown that machining plans of complex pockets using this method are more efficient than plans based on FR systems using fixed form features. It is believed that this approach for integrated FR and machining planning will lead to more robust process planning systems for practical parts.

5.
In order to describe better the probability of failure of structures, crack growth data of the underlying material is required. In particular, it is well known that to calculate accurately the probability of failure of a structure that has sustained low loads for long periods, crack growth data of very low rates of growth are necessary. There exists a multitude of experimental techniques, incorporating various fracture systems, which have been utilized in the past to measure crack growth data. One such unique system is the Hertzian fracture system. However, in the past this system could not be successfully employed to measure accurately crack growth rates, over a broad range. Recently, it has been shown that limitations associated with the Hertzian fracture system can be overcome, and thus, demonstrate that this system is well suited for measuring very low rates of crack growth: as low as 10⁻¹⁵ m s⁻¹. It is the purpose of this publication to review the past work associated with the application of the Hertzian fracture system for measuring crack growth data. In particular, the relevant issues associated with crack growth measurement using this system are highlighted and previously unpublished scanning electron micrographs of Hertzian cone crack fracture surfaces are presented.

6.
The effect of aliasing on sampled noise power measurements is investigated. It is shown that although aliasing does not bias noise power measurements, it can significantly increase the measurement variance. The equivalent noise bandwidth (ENBW) of a sampled data system is defined by equating the mean and variance of the noise power measurement to those of a system with a rectangular spectral response. With this definition, which is shown to be consistent with that for analog systems, it follows that the ENBW is bounded by the smaller of the analog ENBW or half the sampling frequency. Practical examples of sampled-data systems are used to demonstrate the increased variance, integration time, and noise spectral density that accompany aliasing.
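
The ENBW computation underlying such noise-power statements can be illustrated with the standard discrete-window formula, ENBW (in DFT bins) = N·Σw² / (Σw)². This is a generic sketch of the quantity, not the paper's derivation:

```python
import numpy as np

def enbw_bins(w):
    """Equivalent noise bandwidth of a window (or FIR impulse response),
    in DFT bins: ENBW = N * sum(w**2) / (sum(w))**2.
    Multiply by fs/N to express the result in Hz."""
    w = np.asarray(w, dtype=float)
    return len(w) * np.sum(w**2) / np.sum(w)**2

N = 1024
enbw_rect = enbw_bins(np.ones(N))     # rectangular window: exactly 1 bin
enbw_hann = enbw_bins(np.hanning(N))  # Hann window: ~1.5 bins
```

A wider ENBW means more noise power passes the filter, so for a fixed measurement length the variance of a noise-power estimate rises accordingly; aliasing folds out-of-band noise into the measurement band with the same effect.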

7.
BATCH SCHEDULING TO MINIMIZE CYCLE TIME, FLOW TIME, AND PROCESSING COST
We consider a multi-stage batch processing system in which a group of identical units, requiring the same set of operations, is manufactured. In this system, work on the batch must be continuous at each operation, but the batch can be split into sub-batches for transfer between consecutive operations. We examine the case in which operations can be performed in any sequence, using cycle time, total flow time (static and dynamic), and processing cost as measures of performance. It is shown that these problems are closely related to a traveling-salesman problem with a special cost matrix. Optimal scheduling rules are developed for all measures of performance.
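
The reduction to a traveling-salesman problem can be sketched with a brute-force search over operation sequences. The cost matrix below is arbitrary; the paper derives its special cost matrix from batch-transfer timing, which is not reconstructed here:

```python
from itertools import permutations

def best_sequence(cost, start=0):
    """Brute-force search for the cheapest open tour (operation sequence)
    beginning at `start`, where cost[i][j] is the cost of performing
    operation j immediately after operation i. Illustrative only:
    feasible for small instances, O(n!) in general."""
    n = len(cost)
    rest = [i for i in range(n) if i != start]
    best = None
    for perm in permutations(rest):
        seq = (start,) + perm
        total = sum(cost[a][b] for a, b in zip(seq, seq[1:]))
        if best is None or total < best[0]:
            best = (total, seq)
    return best

# Tiny hypothetical 3-operation instance.
cost = [[0, 4, 9],
        [4, 0, 2],
        [9, 2, 0]]
total, seq = best_sequence(cost)
```

For cost matrices with the special structure identified in the paper, the optimal sequence follows from scheduling rules rather than exhaustive search.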

8.
A general approach to the dimensional reduction of non‐linear finite element models of solid dynamics is presented. For the Newmark implicit time‐discretization, the computationally most expensive phase is the repeated solution of the system of linear equations for displacement increments. To deal with this, it is shown how the problem can be formulated in an approximation (Ritz) basis of much smaller dimension. Similarly, the explicit Newmark algorithm can be also written in a reduced‐dimension basis, and the computation time savings in that case follow from an increase in the stable time step length. In addition, the empirical eigenvectors are proposed as the basis in which to expand the incremental problem. This basis achieves approximation optimality by using computational data for the response of the full model in time to construct a reduced basis which reproduces the full system in a statistical sense. Because of this ‘global’ time viewpoint, the basis need not be updated as with reduced bases computed from a linearization of the full finite element model. If the dynamics of a finite element model is expressed in terms of a small number of basis vectors, the asymptotic cost of the solution with the reduced model is lowered and optimal scalability of the computational algorithm with the size of the model is achieved. At the same time, numerical experiments indicate that by using reduced models, substantial savings can be achieved even in the pre‐asymptotic range. Furthermore, the algorithm parallelizes very efficiently. The method we present is expected to become a useful tool in applications requiring a large number of repeated non‐linear solid dynamics simulations, such as convergence studies, design optimization, and design of controllers of mechanical systems. Copyright © 2001 John Wiley & Sons, Ltd.
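
The empirical-eigenvector (reduced-basis) idea can be sketched in a static linear setting, a deliberate simplification of the paper's dynamic case with entirely synthetic matrices: the basis is taken from the SVD of a snapshot matrix, and a Galerkin projection replaces the n×n solve with a k×k solve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic full "model": a symmetric positive definite system K u = f.
n, k = 60, 4
K = np.diag(np.linspace(1.0, 10.0, n))

# Snapshot matrix: responses of the full model that happen to lie in a
# k-dimensional subspace spanned by `modes` (the idealized case).
modes = rng.standard_normal((n, k))
snapshots = modes @ rng.standard_normal((k, 25))

# Empirical eigenvectors: leading left singular vectors of the snapshots.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
Phi = U[:, :k]                       # reduced (Ritz) basis, orthonormal

# A load whose exact solution lies in the snapshot subspace.
u_true = modes @ rng.standard_normal(k)
f = K @ u_true

# Galerkin projection: k x k reduced system instead of n x n.
Kr = Phi.T @ K @ Phi
u_red = Phi @ np.linalg.solve(Kr, Phi.T @ f)

err = np.linalg.norm(u_red - u_true) / np.linalg.norm(u_true)
```

When the true response leaves the snapshot subspace, `err` measures the approximation error instead of vanishing; the paper's point is that a basis built from full-model response data keeps that error small without re-linearization.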

9.
Production management is concerned with the efficient transformation of inputs into desired outputs. Usually the operations are carried out under conditions of uncertain and fluctuating customers’ demand, varying procurement lead times for different inputs, and the necessity to fix the production schedules long before the magnitude and the mix of the orders that must be filled from the resulting output can be determined. Under certain conditions it is possible to view production of this kind as a closed, self-regulating system which may be studied using some common techniques and procedures.

In this paper we define such an operation in terms of several difference equations which describe the behaviour of a simple coupled production inventory system. It is shown that the coupling arises from the fact that the equations are both functionally and time-wise inter-dependent.

A single equation of motion is developed to describe the dynamic behaviour of the system. Analysis of the equation shows that production and inventories are dynamically related. Furthermore: (a) neither the strength of control nor the time lag can be ignored; (b) as the time lag increases, the strength of control must be reduced to provide the same level of stability; (c) a ‘quick responding system’ is not desirable when the system contains time lags; (d) production stability is enhanced by employing a target inventory level which does not vary, and by utilizing forecasts which are statistically unbiased and stable.
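
Point (b) can be illustrated with a minimal difference-equation simulation. All parameters below are hypothetical; the model (production set from inventory information observed `lag` periods ago) is a simplified stand-in for the coupled system in the paper:

```python
def simulate(k, lag, steps=60, demand=10.0, target=100.0):
    """Toy production-inventory loop: production = demand plus a correction
    proportional (gain k) to the inventory shortfall observed `lag`
    periods ago. Returns the inventory trajectory."""
    inv = [80.0] * (lag + 1)          # inventory history, starts below target
    for _ in range(steps):
        observed = inv[-1 - lag]      # delayed inventory information
        production = demand + k * (target - observed)
        inv.append(inv[-1] + production - demand)
    return inv

# Same control strength, different information lags.
no_lag = simulate(k=0.8, lag=0)
lagged = simulate(k=0.8, lag=2)

dev_no_lag = abs(no_lag[-1] - 100.0)                    # converges
dev_lagged = max(abs(x - 100.0) for x in lagged[-10:])  # oscillates/grows
```

With zero lag this gain converges smoothly, while the same gain with a two-period lag produces growing oscillations; restoring stability requires reducing k, which is exactly conclusion (b).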

10.
A system was developed in 2008 to calculate patient doses using Radiology Information System (RIS) data and present these data as a patient dose audit. One of the issues with this system was the quality of user-entered data. It has been shown that Digital Imaging and Communication in Medicine (DICOM) header data can be used to perform dose audits with a high level of data accuracy. This study aims to show that using RIS data for dose audits is not only a viable alternative to using DICOM header data but one with advantages of its own. A new system was developed to pull header data from DICOM images easily and was installed on a workstation within a hospital department. Data were recovered for a common set of examinations using both RIS and DICOM header data. The data were compared on a result-by-result basis to check the consistency of fields common to RIS and DICOM, and to assess the value of the data fields unique to each system. The study shows that whilst RIS is not as accurate as DICOM, it provides sufficiently accurate data and has other advantages over a DICOM approach. These results suggest that a 'best of both worlds' may be achievable using Modality Performed Procedure Step (MPPS).

11.
Some road safety problems have persisted for a long time in nearly all motorised countries, suggesting that they are not easily solved. This paper documents the persistence over time of five such problems: the high risk of accidents involving young drivers; the high risk of injury run by unprotected road users; risks attributable to incompatibility between different types of vehicles and groups of road users; differences in risk between different types of traffic environment; and speeding. A taxonomy of road safety problems is developed in order to identify characteristics of problems that can make them difficult to solve. It is argued that if a problem is not perceived as a problem, is attributable to a misguided confidence in road user rationality, involves social dilemmas, or is closely related to the physics of impacts, then it is likely to be difficult to solve. Problems to which biological factors contribute are also likely to be difficult to solve. The characteristics that can make a problem difficult to solve are to some extent present for all five problems shown to be persistent in this paper.

12.
This is a terse review of recent results on isochronous dynamical systems, namely systems of (first-order, generally nonlinear) ordinary differential equations (ODEs) featuring an open set of initial data (which might coincide with the entire set of all initial data), from which emerge solutions all of which are completely periodic (i.e. periodic in all their components) with a fixed period (independent of the initial data, provided they are within the isochrony region). A leitmotif of this presentation is that 'isochronous systems are not rare'. Indeed, it is shown how any (autonomous) dynamical system can be modified or extended so that the new (also autonomous) system thereby obtained is isochronous with an arbitrarily assigned period T, while its dynamics, over time intervals much shorter than the period T, mimics closely that of the original system, or even, over an arbitrarily large fraction of its period T, coincides exactly with that of the original system. It is pointed out that this fact raises the issue of developing criteria providing, for a dynamical system, some kind of measure associated with a finite time scale of the complexity of its behaviour (while the current, standard definitions of integrable versus chaotic dynamical systems are related to the behaviour of a system over infinite time).

13.
A novel scheme using spatial data stream multiplexing (SDSM) in the upcoming multiple-input multiple-output (MIMO)-based IEEE 802.11n physical layer is proposed. It is shown that with SDSM, the same data rate can be achieved using fewer transmit and receive antennas; reducing the number of antennas in turn reduces mutual-coupling effects, hardware costs, and implementation complexity. The maximum data rate achievable with a 2×2 MIMO system is 270 Mbps, and with a 4×4 MIMO system, 540 Mbps. The same data rates can be achieved using the SDSM technique, which reduces the 2×2 MIMO system to a 1×1 SISO system and the 4×4 MIMO system to a 2×2 MIMO system.

14.
Earlier studies have shown that by using cross-sectional data for a group of developing countries, a significant relationship can be established between fatality rates and vehicle ownership levels. This paper updates relationships established in earlier years and identifies whether or not the slope of the regression line has continued to increase (and suggests that for the group of countries as a whole, there is a worsening in the safety situation). Similar relationships are also established for casualty rates. A detailed analysis is made of the relationship between fatality rates and parameters which describe, in part, the social, physical and economic characteristics of the developing countries. These include vehicle ownership, gross national product per capita, road density, vehicle density (per kilometre of road), population per physician and population per hospital bed. Again, comparisons are made with results obtained on earlier studies.

15.
A system-identification method based on the Hilbert-Huang transform (HHT) and the free-vibration response of a structure needs the response measured at only one point to obtain the modal frequencies and damping ratios of every order; analysing the responses at all points yields the mode shapes, from which the stiffness and damping matrices of the structure are obtained. Hammer-impact tests were carried out on a model of a four-storey reinforced-concrete frame with two bays and two spans. The HHT method was used to identify its modal frequencies, mode shapes, and damping ratios, and the stiffness and damping matrices were then computed. The identified stiffness matrix shows that a member-system storey model that accounts for joint rotations and applies static condensation reflects the vibration of the frame structure fairly accurately. Analysis of the test data also shows that the damping ratio of the frame is time-varying when the initial amplitude is large, decreasing as the amplitude decreases.

16.
The information analysis process includes a cluster analysis or classification step associated with an expert validation of the results. In this paper, we propose new measures of Recall/Precision for estimating the quality of cluster analysis. These measures derive both from the Galois lattice theory and from the Information Retrieval (IR) domain. As opposed to classical measures of inertia, they present the main advantages of being independent both of the classification method and of the difference between the intrinsic dimension of the data and that of the clusters. We present two experiments on the basis of the MultiSOM model, which is an extension of Kohonen's SOM model, as a cluster analysis method. Our first experiment on patent data shows how our measures can be used to compare viewpoint-oriented classification methods, such as MultiSOM, with a global cluster analysis method, such as WebSOM. Our second experiment, carried out as part of the EICSTES EEC project, is an original Webometrics experiment that combines content and link classification starting from a large non-homogeneous set of web pages. This experiment highlights the fact that break-even points between our different measures of Recall/Precision can be used to determine an optimal number of clusters for web data classification. The contents of the clusters obtained using different break-even points are compared to determine the quality of the resulting maps. This revised version was published online in June 2006 with corrections to the Cover Date.
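
The general flavour of Recall/Precision for clusterings can be sketched with a simple micro-averaged variant in which each cluster is scored against its majority reference label. This is a generic IR-style proxy for illustration, not the lattice-based measures the paper actually proposes:

```python
from collections import Counter

def cluster_precision_recall(clusters, labels):
    """Micro-averaged Precision/Recall of a clustering against reference
    labels: each cluster contributes the count of its majority label.
    clusters: list of lists of item indices; labels: label per item."""
    tp = 0
    for members in clusters:
        counts = Counter(labels[i] for i in members)
        tp += counts.most_common(1)[0][1]   # size of the majority label
    n_assigned = sum(len(m) for m in clusters)
    precision = tp / n_assigned             # purity of the clusters
    recall = tp / len(labels)               # coverage of the items
    return precision, recall

# Hypothetical toy data: 6 items, 3 reference categories, 3 clusters.
labels = ["a", "a", "a", "b", "b", "c"]
clusters = [[0, 1, 3], [2, 4], [5]]
p, r = cluster_precision_recall(clusters, labels)
```

When every item is assigned to exactly one cluster (as here), precision and recall coincide; the paper's measures differ precisely in order to separate these two views of cluster quality.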

17.
Two models are presented, describing the development of traffic and traffic safety. Traffic volumes, measured by the total amount of vehicle kilometers per year, are expected to follow a sigmoid saturation curve over time. The logistic function is used to model this development. The fatality rate, the number of fatalities per vehicle kilometer, is chosen to measure safety. The (negative) exponential function is selected to model the fatality rates over time. It is argued that these two aspects of the traffic system are fundamental and that the development of the number of fatalities results by multiplication. Given this assumption, the fall in the number of fatalities, noticed in almost all developed countries after a steady increase until 1970, does not need a special explanation. It follows from the combination of the monotonically increasing traffic volumes and the monotonically decreasing fatality rates. The two parsimonious models fit the data fairly well for six developed countries. The parameters differ substantially between countries, but also show common features. It is found from the parameters of the logistic function, that for all countries the points of maximum increase in traffic volume coincide just after 1970, the moment of the energy crisis. It is concluded from this finding that the energy crisis was caused by the cumulating demands of the oil-consuming countries, resulting in a reaction of the oil-producing countries. From the parameters of the exponential function, it is found that there also is a common point of intersection for fatality rates around 1980. It is shown that the development of safety is directly related to the development of traffic. The ten-year delay is interpreted as the time necessary for planning and implementation of safety measures. 
Finally, a striking relation is found between the volume parameters and the fatality-rate parameters, suggesting that the number of fatalities is a function of the derivative of the amount of traffic in the mathematical sense.
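
The central multiplication argument, fatalities = (logistic traffic volume) × (exponentially declining fatality rate), can be sketched directly. All parameter values below are invented for illustration and are not fitted to any country's data:

```python
import math

def volume(t, v_max=150e9, a=0.12, t0=1970.0):
    """Traffic volume (vehicle km/year): logistic saturation curve.
    Hypothetical parameters."""
    return v_max / (1.0 + math.exp(-a * (t - t0)))

def rate(t, r0=60e-9, b=0.05, t_ref=1950.0):
    """Fatalities per vehicle km: negative exponential decline.
    Hypothetical parameters."""
    return r0 * math.exp(-b * (t - t_ref))

years = list(range(1950, 2011))
fatalities = [volume(t) * rate(t) for t in years]
peak_year = max(years, key=lambda t: volume(t) * rate(t))
```

The product rises while volume growth outpaces the rate decline and falls afterwards, producing an interior peak without any special explanation, which is exactly the paper's point about the post-1970 downturn in fatalities.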

18.
Inter-cell interference is a major factor limiting the performance of future cellular mobile communication systems, and overcoming it is central to system design. Coordinated multi-point (CoMP) transmission controls the interference between neighbouring cells and turns what would otherwise be interfering signals into useful information, theoretically breaking through the interference-limited capacity of single-point, non-cooperative systems and improving both link reliability and transmission rate. CoMP transmission is regarded as a more fundamental and effective technique for reducing inter-cell interference and raising cell-edge and overall system throughput. TDD cellular mobile communication systems have a natural advantage in implementing CoMP transmission: the large amount of channel state information that CoMP requires can easily be obtained from the reciprocity of the uplink and downlink channels. This paper analyses schemes for implementing CoMP transmission in TDD systems and the problems that may arise, and system-level simulation results demonstrate the performance gain that the combination can deliver.

19.
Bhattacharya, Sujit; Pal, Chandra; Arora, Jagdish. Scientometrics, 2000, 47(1): 131-142
In an earlier study, a methodology was described for identifying Frontier Areas in a research field, i.e., areas which experienced in a particular time period a significant increase in research output in comparison to a preceding time period. The application of this methodology was shown by identifying Frontier Areas of research in Physics in 1995, with comparison made against the outputs in different areas in 1990. Profiles of countries active in the identified Frontier Areas were then constructed. In this paper, an attempt is made to reveal the active research topics/themes within these Frontier Areas in 1990 and 1995. The active research topics which are uncovered are classified as Frontier Topics. Countries active in these Frontier Topics are distinguished in each time period. Associations among countries and Frontier Topics are observed using the multivariate technique of correspondence analysis. Dynamics are observed by analysing the changes in the profiles of the countries in the two time periods. Results and implications of this study for decision-making and as a policy tool are highlighted.

20.
This paper discusses the analysis of a single-server finite queue with Poisson arrivals and arbitrary service time distribution, wherein the state-dependent arrival rates may be all distinct or all equal, and service times are conditioned on the system length at the moment of service initiation. An analytic analysis of the queue is carried out and the final results are presented in the form of recursive equations which can be easily programmed on any PC to obtain the distributions of the number of customers in the system at arbitrary, departure and pre-arrival epochs. It is shown that the method works for all service time distributions, including non-phase-type ones, and for both low and high values of the model parameters. Some performance measures, and relations among the state probabilities at arbitrary, departure and pre-arrival epochs, are also discussed. Furthermore, it is shown that results for a number of queueing models can be obtained from this model as special cases. To demonstrate the effectiveness of the method, some numerical examples are presented.
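
The recursive flavour of such state-probability computations can be sketched for the simplest special case, exponential service with state-dependent Poisson arrivals (a birth-death chain); the paper's recursions cover general service times, which this sketch does not:

```python
def state_probabilities(arrival, mu, N):
    """Recursive state probabilities for a single-server finite queue with
    state-dependent Poisson arrivals and exponential service rate mu
    (capacity N). arrival[n] = arrival rate when n customers are present.
    Birth-death recursion: p[n+1] = p[n] * arrival[n] / mu, then normalise."""
    p = [1.0]                          # unnormalised p0
    for n in range(N):
        p.append(p[-1] * arrival[n] / mu)
    total = sum(p)
    return [x / total for x in p]

# All arrival rates equal: reduces to the classical M/M/1/N distribution.
N, lam, mu = 5, 2.0, 3.0
p = state_probabilities([lam] * N, mu, N)
```

With all rates equal the recursion reproduces the textbook M/M/1/N result p₀ = (1-ρ)/(1-ρ^(N+1)) with ρ = λ/μ, illustrating how known models drop out as special cases.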

