Similar Literature
Found 20 similar records (search time: 546 ms)
1.
This paper aims to be a comprehensive reference for stakeholders, policy makers, and scholars interested in analyzing the efficiency, effectiveness, and impacts of rail transport systems in a sound empirical way, paying specific attention to passenger transport services. The paper combines different analytical frameworks (engineering, economics, impacts), systematic review techniques, and advanced mappings. Framing economic efficiency studies within a transport planning perspective makes it possible to move from efficiency to effectiveness issues. In addition, including impacts allows a critical discussion of the existing empirical studies, relating them to the main methodological approaches used. This analysis can be useful for those interested in developing new techniques for the evaluation of this sector. The critical analysis developed in this paper provides a catalog of inputs, outputs, external factors, possible impacts to account for, data, and approaches, which allows us to identify areas in which new methodological developments and new approaches are needed to address the relevant societal challenges of the rail transport sector.

2.
The analysis of large graphs plays a prominent role in various fields of research and is relevant in many important application areas. Effective visual analysis of graphs requires appropriate visual presentations in combination with respective user interaction facilities and algorithmic graph analysis methods. How to design appropriate graph analysis systems depends on many factors, including the type of graph describing the data, the analytical task at hand, and the applicability of graph analysis methods. The most recent surveys of graph visualization and navigation techniques cover techniques introduced up to 2000 or concentrate only on graph layouts published up to 2002. Recently, new techniques have been developed covering a broader range of graph types, such as time-varying graphs. Also, as ever-growing amounts of graph-structured data become available, the inclusion of algorithmic graph analysis and interaction techniques becomes increasingly important. In this State-of-the-Art Report, we survey available techniques for the visual analysis of large graphs. Our review first considers graph visualization techniques according to the type of graphs supported. The visualization techniques form the basis for the presentation of interaction approaches suitable for visual graph exploration. As an important component of visual graph analysis, we discuss various graph algorithmic aspects useful for the different stages of the visual graph analysis process. We also present the main open research challenges in this field.

3.
Context: Formal methods, and particularly formal verification, are becoming more feasible to use in the engineering of large, highly dependable software-based systems, but have so far had little rigorous empirical study. Their artefacts and activities differ from those of conventional software engineering, and the nature and drivers of productivity for formal methods are not yet understood.

Objective: To develop a research agenda for the empirical study of productivity in software projects using formal methods, and in particular formal verification. To this end we aim to identify research questions about productivity in formal methods, survey the existing literature on these questions to establish their face validity, and identify metrics and data sources relevant to these questions.

Method: We define a space of GQM goals as an investigative framework, focusing on productivity from the perspective of managers of projects using formal methods. We then derive questions for these goals using Easterbrook et al.'s (2008) taxonomy of research questions. To establish face validity, we document the literature to date that reflects on these questions and then explore possible metrics related to them. Extensive use is made of literature concerning the L4.verified project completed within NICTA, as it is one of the few projects to achieve code-level formal verification for a large-scale, industrially deployed software system.

Results: We identify more than thirty research questions on the topic in need of investigation. These questions arise not just out of the new type of project context, but also because of the different artefacts and activities in formal methods projects. Prior literature supports the need for research on the questions in our catalogue, but as yet provides little evidence about them. We identify the metrics that would be needed to investigate these questions. Thus, although it is obvious that at the highest level concepts such as size, effort, and rework are common to all software projects, in the case of formal methods, measurement of these concepts at the micro level will exhibit significant differences.

Conclusions: Empirical software engineering for formal methods is a large open research field. For the empirical software engineering community, our paper provides a view into the entities and research questions in this domain. For the formal methods community, we identify some of the benefits that empirical studies could bring to the effective management of large formal methods projects, and list some basic metrics and data sources that could support empirical studies. Understanding productivity is important in its own right for efficient software engineering practice, but can also support future research on the cost-effectiveness of formal methods and on the emerging field of Proof Engineering.

4.
The purpose of this paper is to estimate the efficiencies of, and to discuss the managerial implications for, 12 international airports in the Asia–Pacific region based on data from the period 1998–2006. We applied data envelopment analysis (DEA) and stochastic frontier analysis (SFA) to compute efficiency estimates, and the empirical results are discussed in terms of management perspectives and mathematical analysis. From the management perspective, we suggest that airports should focus more on investment than on human resources. In addition, we found that inefficiency effects associated with the production functions of airports increased over the investigated period. From the perspective of mathematical analysis, we determined that deviations from the efficient frontiers of production functions are largely attributable to technical inefficiency. Finally, the empirical results imply that exercising discretion to adjust the scale size of the production function appears to improve efficiency. The main contribution of the paper is in showing how DEA and SFA can be used together to complement each other.
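The DEA used in this abstract scores each decision-making unit against an empirical best-practice frontier. As a hedged illustration only (the airport names and figures below are invented, not from the paper): in the special case of a single input and a single output, the CCR efficiency score reduces to each unit's output/input ratio normalized by the best ratio in the sample.

```python
# Minimal single-input, single-output DEA sketch with hypothetical data:
# input = staff headcount, output = passengers handled (millions).
airports = {
    "A": (1000, 20.0),
    "B": (1500, 24.0),
    "C": (800, 18.0),
}

# Output/input ratio for each unit, then normalize by the frontier (best) ratio.
ratios = {name: out / inp for name, (inp, out) in airports.items()}
best = max(ratios.values())
efficiency = {name: r / best for name, r in ratios.items()}
# A unit with efficiency 1.0 lies on the best-practice frontier.
```

With several inputs and outputs, the same idea requires solving one linear program per unit, which is what full DEA implementations do.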

5.
In this paper we focus on a new computational procedure that permits efficient calculation within the classical auxiliary field methodology. As has been previously reported, the method suffers from a sign problem typically encountered in methodologies based on a field-theoretical approach. To ameliorate its statistical convergence, efforts have so far concentrated exclusively on the development of efficient analytical integral transformation techniques, such as the method of Gaussian equivalent representation of Efimov et al. In the present work we reformulate the classical auxiliary field methodology according to the concepts of the stationary phase Monte Carlo method of Doll et al., a numerical strategy originally developed for simulations with real-time path integrals. The procedure, employed here for the first time for auxiliary field computation, uses an importance sampling strategy to identify the regions of configuration space that contribute most strongly to the functional integral averages. Its efficiency is compared to that of the method of Gaussian equivalent representation.
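The importance sampling strategy mentioned in the abstract can be illustrated with a generic sketch (this is not the authors' auxiliary-field code): samples are drawn from a proposal density concentrated where the integrand contributes most, and the integrand is reweighted by that density. Here a standard normal proposal estimates a simple Gaussian integral whose exact value is known.

```python
import math
import random

# Importance sampling: estimate I = ∫ exp(-x^2) dx = sqrt(pi)
# using samples from a standard normal proposal q(x).
random.seed(42)

def q_pdf(x):
    """Standard normal density, the proposal distribution."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def f(x):
    """Integrand of the target integral."""
    return math.exp(-x * x)

n = 200_000
# Average of the importance weights f(x)/q(x) over x ~ q.
estimate = sum(f(x) / q_pdf(x) for x in (random.gauss(0.0, 1.0) for _ in range(n))) / n
# estimate should be close to sqrt(pi) ≈ 1.7725
```

The variance of the estimator shrinks as the proposal better matches the regions where the integrand is large, which is the rationale for the stationary phase strategy described above.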

6.
Based on a literature survey of the development of concepts, supplemented by a selective literature review for the years 2000 to 2010, this article aims to trace the dynamic development of the field of design management, a cross-disciplinary research field seeking to establish itself in its own right. The framework of this research is evolutionary theory, and our analysis is based on two prime design management journals, followed by a comparative review of two adjunct journals, The Design Journal and Creativity and Innovation Management. The challenge is to avoid being overwhelmed by the established paradigms and logics dominating the field of management research. As far as this limited study of adjunct journals indicates, design management as a field of study has not yet been able to attract attention outside the design research community.

7.
This review article provides an extensive literature survey of research progress on dielectric resonator antennas (DRAs) in the millimeter-wave frequency band, including DRA concepts, empirical formulae, and design methodologies for differently shaped DRAs at the 60 GHz band. Cylindrical, rectangular, hexagonal, and octagonal DRAs at 60 GHz are designed, simulated, and analyzed using the CST Microwave Studio solver. The −10 dB impedance bandwidths of the cylindrical, rectangular, hexagonal, and octagonal DRAs are 52.7 to 62.8 GHz, 57 to 62.2 GHz, 55.8 to 64.2 GHz, and 54.2 to 63.5 GHz, respectively. The broad impedance bandwidth results from the use of a double-layer substrate with different permittivities (εr1 = 4 and εr2 = 11.9). Empirical formulae are deduced for the hexagonal and octagonal DRAs by studying the analogy of the dielectric resonator geometry. The modes of the differently shaped DRAs, namely HEM111 and TE111, are investigated through the electric and magnetic field distributions. With these analyses, a comprehensive research review covering the last two decades is carried out, investigating various techniques targeted at realized gain, circular polarization, and impedance bandwidth. The state of the art for differently shaped DRAs in the mm-wave frequency band is also reported.

8.
The purpose of this paper is to analyze the productivity and efficiency of the information technology industries of fourteen OECD countries. IT productivity has been studied for years, and most papers have confirmed IT's contribution to productivity; some have even found that it outperforms other factors. The impact of IT on the economy depends not only on productivity but also on the size of IT production. Thus, analyzing the productivity and efficiency of IT industries is important: they are the drivers of economic growth in the new economy. Although productivity has been analyzed and discussed in the information systems field for years, little research has been done on the efficiency of IT. This paper analyzes the productivity and productive efficiency of fourteen OECD countries and compares them to each other. The models used in this paper follow Kumbhakar's 1989 models [Rev. Econ. Stat. (1989) 595] and other econometric models from productive efficiency studies. These models allowed us to estimate three different types of (in)efficiency (technical, allocative, and scale) and the percentage loss due to inefficiency.

9.
Construction and Presentation of Semantic Web Ontologies Based on Formal Concepts   (total citations: 4; self-citations: 0; citations by others: 4)
As the foundation of the Semantic Web, an ontology is an explicit, formal specification of a shared conceptual model; it provides a way for computers to exchange, search, and agree on textual information. Effectively constructing and presenting ontologies is therefore a key issue in their application, yet existing construction methods all suffer from limitations of one kind or another. After comparative analysis, this paper adopts formal concept analysis to build the ontology hierarchy and remedy these shortcomings, and combines it with a probabilistic model to present the ontology, expressing the correlations among concepts and between concepts and documents. Ranking documents by their relevance to concepts helps users find the most relevant information, effectively improving retrieval efficiency. A worked example demonstrates the construction and presentation of an ontology.

10.
With the rapid development of process mining, the number of process mining algorithms has grown quickly, and existing survey articles no longer cover them comprehensively. In response, this paper systematically analyzes and summarizes the main process mining algorithms to date. It first gives an overall analysis of the current state of process mining algorithms, then, according to their characteristics, divides them into two broad classes: traditional process mining algorithms and process mining algorithms based on computational intelligence and machine learning, briefly introducing representative algorithms...

11.
12.
This paper discusses methods for content-based image retrieval (CBIR) systems based on relevance feedback, according to two active learning paradigms, named greedy and planned. In greedy methods, the system aims to return the most relevant images for a query at each iteration. In planned methods, the most informative images are returned during a few iterations and the most relevant ones are only presented afterward. In the past, we proposed a greedy approach based on optimum-path forest (OPF) classification and demonstrated its gain in effectiveness with respect to a planned method based on support vector machines and another greedy approach based on multi-point query. In this work, we introduce a planned approach based on the OPF classifier and demonstrate its gain in effectiveness over all the methods above using more image databases. In our tests, the most informative images are better obtained from images that are classified as relevant, which differs from the original definition. The results also indicate that both OPF-based methods require less user involvement (efficiency) to satisfy the user's expectation (effectiveness), and provide interactive response times.

13.
14.
In Search of What We Experimentally Know about Unit Testing   (total citations: 2; self-citations: 0; citations by others: 2)
Gathering evidence in any discipline is a lengthy procedure, requiring experimentation and empirical confirmation to transform information from mere opinion to undisputed fact. Software engineering is a relatively young field and experimental SE is even younger, so undisputed facts are few and far between. Nevertheless, ESE's relevance is growing because experimental results can help practitioners make better decisions. We have aggregated results from unit-testing experiments with the aim of identifying information with some experimental basis that might help practitioners make decisions. Most of the experiments focus on two important characteristics of testing techniques: effectiveness and efficiency. Some other experiments study the quality of test-case sets according to different criteria.

15.
To gain an intuitive view of the current state and research frontiers of artificial intelligence, analyze the similarities and differences between Chinese and international research, and support AI research in China, this study draws on journal articles from 2008–2019 in the Web of Science and CNKI databases, using CiteSpace to construct scientific knowledge maps and perform visual analysis. The data and knowledge maps show that after 2016, AI research entered a new boom, with China and the United States as the two leading players; by publication quality, North America is currently the region with the highest level of AI research; universities remain the main force in AI research, and an integrated industry-academia-research system has not yet formed; and research topics carry clear hallmarks of the era, with artificial neural networks, algorithms, big data, robotics, computer vision, and legal ethics as current hot topics. Finally, based on the evolution of the AI research landscape and high-frequency burst terms, three research frontiers are identified: deep reinforcement learning, "AI+", and intelligent social science, offering directions for future AI research.

16.
Along with vast non-fossil potential and significant expertise, there is a question of whether Asian nations are attaining efficient consumption and exploitation of renewable resources. From this perspective, the paper aims to evaluate the efficiency of renewable energy consumption in 14 Asian countries with high renewable potential over the six-year period 2014–2019. In analyzing the performance of the renewable energy sector, data envelopment analysis (DEA) with an undesirable-output model has been widely utilized to measure the efficiency of peer units against the best-practice frontier. We consider four inputs and two outputs in a DEA-based efficiency model. Labor force, total energy consumption, share of renewable energy, and total renewable energy capacity are the inputs. The outputs consist of CO2 emissions as an undesirable output and gross domestic product as a desirable output. The results show that the United Arab Emirates, Saudi Arabia, Japan, and South Korea consistently outperform the other countries, achieving perfect efficiency scores throughout the research period. Uzbekistan is found to have the lowest average efficiency of renewable energy utilization.

17.
In the last 15 years, software architecture has emerged as an important software engineering field for managing the development and maintenance of large, software-intensive systems. The software architecture community has developed numerous methods, techniques, and tools to support the architecture process (analysis, design, and review). Historically, most advances in software architecture have been driven by talented people and industrial experience, but there is now a growing need to systematically gather empirical evidence about the advantages or otherwise of tools and methods, rather than relying on promotional anecdotes or rhetoric. The aim of this paper is to promote and facilitate the application of the empirical paradigm to software architecture. To this end, we describe the challenges and lessons learned when assessing software architecture research that used controlled experiments, replications, expert opinion, systematic literature reviews, observational studies, and surveys. Our research will support the emergence of a body of knowledge consisting of the more widely accepted and well-formed software architecture theories.

18.
Summary: This paper covers important developments in the use of computers for quantitative research in cultural anthropology, particularly in areas which (unlike statistics) are uniquely anthropological. These fall into statistical topics and topics in scaling and measurement. By far the largest single use of computers by cultural anthropologists is for statistical summaries of field data and for simple statistical tests, such as the chi-squared test, for the analysis of field data or for cross-cultural studies. As the discipline develops, this situation will remain the same. In fact, the proportion of people who use the computer primarily for contingency tables, frequency counts, and correlation analysis may very well increase, since there are many potential users who would fall into this category and only a few who would perform other operations such as multi-dimensional scaling or simulation. The few other computer techniques that would be relevant to anthropology, and for which the technology already exists, include linear regression as practiced by economists, and linear programming (also practiced by economists), both of which could be extremely useful in the study of peasant economy. Careful research with such models could dispel some of the controversy which has been hindering the development of economic anthropology for the last fifteen years. The training of anthropologists who can understand the relevance of such models to their work may be far in the future, since the majority of them are still skeptical of most formal methods and of the computers which make them work.
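The chi-squared test on a contingency table mentioned above is computed directly from observed and expected cell counts. A minimal sketch with a hypothetical 2×2 table (invented counts, e.g. presence of a cultural trait across two villages):

```python
# Observed counts: rows = villages, columns = trait present / absent (hypothetical).
table = [[30, 10],
         [20, 40]]

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
grand = sum(row_totals)

# Chi-squared statistic: sum over cells of (observed - expected)^2 / expected,
# where expected = row_total * col_total / grand_total under independence.
chi2 = 0.0
for i, row in enumerate(table):
    for j, observed in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand
        chi2 += (observed - expected) ** 2 / expected
# chi2 is then compared against the chi-squared distribution with
# (rows - 1) * (cols - 1) = 1 degree of freedom.
```

For this table the statistic is 50/3 ≈ 16.67, far above the 3.84 critical value at the 0.05 level for one degree of freedom, so independence would be rejected.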

19.
This article describes the analysis of emotional state and work productivity using a Web-based Biometric Computer Mouse Advisory System to Analyze a User's Emotions and Work Productivity (hereafter the Advisory system) developed by this paper's authors. The Advisory system determines the level of emotional state and work productivity integrally by employing three main biometric techniques (physiological, psychological, and behavioral). By using these three biometric techniques, the Advisory system can analyze eleven states of a person's being (including stress, work productivity, mood, and interest in work) and seven emotions (self-control, happiness, anger, fear, sadness, surprise, and anxiety) within a realistic timeframe. Furthermore, to raise the reliability of the Advisory system even more, it also integrates data supplied by the Biometric Finger (blood pressure and pulse rates). Researchers worldwide have conducted in-depth studies on different and very important aspects of biometric mouse systems; however, existing biometric mouse systems cannot generate recommendations. The Advisory system determines a user's physiological, psychological, and behavioral/movement parameters based on that user's real-time needs and existing situation. It then generates thousands of alternative stress management recommendations based on the compiled Maslow's Pyramid Tables and selects the most rational of these for the user's specific situation. The information compiled for Maslow's Pyramid Tables consists of a collection of respondent surveys and analyses of the best global practices. Maslow's Pyramid Tables were developed for an employee working with a computer in a typical organization. The Advisory system provides a user with a real-time assessment of his or her own productivity and emotional state. This article presents the Advisory system, a case study, and a scenario used to test and validate the developed Advisory system and its composite parts, demonstrating its validity, efficiency, and usefulness.

20.
Formal concept analysis has been widely applied in many areas of computer science. Direct construction of fuzzy concept lattices remains one of the main problems in the field, and the construction process has exponential time complexity. To improve the efficiency of fuzzy concept lattice construction, this paper parallelizes a serial fuzzy concept construction algorithm: the combinatorial search space of fuzzy sets is mapped to an interval of natural numbers, simplifying the representation, partitioning, and traversal of the search space, yielding a parallel fuzzy concept construction algorithm (Parallel Fuzzy NextClosure, ParaFuNeC). The algorithm partitions the search space uniformly into mutually independent sub-spaces, avoiding synchronization and communication costs between parallel tasks and thus improving the efficiency of fuzzy concept construction. Time-complexity analysis and experimental results show that for large computational tasks, speedup grows in proportion to the degree of parallelism. Moreover, the serial-fraction metric shows that ParaFuNeC scales better on large computational tasks.
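The NextClosure enumeration that ParaFuNeC parallelizes can be illustrated in its classical serial, crisp form (the paper's algorithm handles fuzzy concepts and partitions the search space across workers; the toy context below is hypothetical):

```python
# Serial, crisp NextClosure sketch: enumerate all concept intents of a
# formal context in lectic order. Objects and attributes are hypothetical.
objects = {
    "o1": {"a", "b"},
    "o2": {"b", "c"},
    "o3": {"a", "b", "c"},
}
attributes = ["a", "b", "c"]  # fixed linear order for the lectic enumeration

def closure(b):
    """Attributes shared by every object possessing all attributes in b."""
    extent = [attrs for attrs in objects.values() if b <= attrs]
    if not extent:
        return set(attributes)
    result = set(attributes)
    for attrs in extent:
        result &= attrs
    return result

def next_closure(b):
    """Smallest closed attribute set lectically greater than b, or None."""
    b = set(b)  # work on a copy
    for i in range(len(attributes) - 1, -1, -1):
        m = attributes[i]
        if m in b:
            b.discard(m)
        else:
            candidate = closure(b | {m})
            # lectic condition: candidate may not add any attribute before m
            if all(a in b for a in attributes[:i] if a in candidate):
                return candidate
    return None

intents = [closure(set())]
while (nxt := next_closure(intents[-1])) is not None:
    intents.append(nxt)
# intents now lists every concept intent exactly once, in lectic order
```

Because each lectic sub-interval can be enumerated independently, the loop above is what a parallel variant can split across workers without synchronization, which is the idea the abstract describes.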


Copyright © Beijing Qinyun Technology Development Co., Ltd.  ICP license: 京ICP备09084417号