Similar Literature
20 similar documents found.
1.
Hedge A. Ergonomics, 2000, 43(7): 1019-1029
This paper briefly reviews research studies of interest to environmental ergonomists. It includes some recent work on the health effects of office lighting, especially the effects of daylighting, fluorescent lighting and full-spectrum lighting. It also covers studies of indoor air quality in offices, especially investigations of localized air filtration and the sick building syndrome. It argues for the value of a systematic ergonomics approach to designing the built environment.

2.
3.
Technology leads the future, but what kind of technology do we actually need? Seeing the title of this article, many readers will recall the "Where do you want to go today?" slogan that Microsoft ran when Windows 95 was launched. Indeed, this article starts from Microsoft. In the short ten years since Windows 95 was released, the IT industry has changed enormously, and the scope of what we think of as "IT" …

4.
Where are linear feature extraction methods applicable?
A fundamental problem in computer vision and pattern recognition is to determine where and, most importantly, why a given technique is applicable. This is necessary not only because it helps us decide which technique to apply at any given time; knowing why current algorithms cannot be applied also facilitates the design of new algorithms that are robust to such problems. In this paper, we report on a theoretical study that demonstrates where and why generalized eigen-based linear equations do not work. In particular, we show that when the smallest angle between the ith eigenvector given by the metric to be maximized and the first i eigenvectors given by the metric to be minimized is close to zero, the results are not guaranteed to be correct. Several properties of such models are also presented. For illustration, we concentrate on the classical applications of classification and feature extraction. We also show how we can use our findings to design more robust algorithms. We conclude with a discussion of the broader impacts of our results.
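To make the angle condition concrete, here is a minimal NumPy sketch (not the authors' code; the function name and the toy matrices are illustrative). Given a metric A to be maximized and a metric B to be minimized, it computes, for each i, the principal angle between the i-th leading eigenvector of A and the span of the first i leading eigenvectors of B; angles close to zero flag the degenerate case described above.

# A hedged sketch, not the paper's implementation.
import numpy as np

def smallest_principal_angles(A, B):
    """For each i, the angle (degrees) between the i-th leading eigenvector of A
    (metric to be maximized) and the span of the first i leading eigenvectors of B
    (metric to be minimized)."""
    _, Va = np.linalg.eigh(A)
    Va = Va[:, ::-1]                  # reorder so leading directions come first
    _, Vb = np.linalg.eigh(B)
    Vb = Vb[:, ::-1]
    angles = []
    for i in range(1, A.shape[0] + 1):
        v = Va[:, i - 1]
        Q, _ = np.linalg.qr(Vb[:, :i])            # orthonormal basis of the span
        cos_theta = np.linalg.norm(Q.T @ v)       # cosine of the principal angle
        angles.append(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))
    return np.array(angles)

# Toy example with two random symmetric positive-definite matrices.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 5))
Y = rng.standard_normal((5, 5))
print(smallest_principal_angles(X @ X.T, Y @ Y.T))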

5.
6.
The 0–1 knapsack problem (KP) is a well-known intractable optimization problem with a wide range of applications. Harmony Search (HS) is one of the most popular metaheuristic algorithms for successfully solving 0–1 KPs. Nevertheless, metaheuristic algorithms are generally compute-intensive and slow when implemented in software. In this paper, we present an FPGA-based pipelined hardware accelerator that reduces the computation time for solving large-dimension 0–1 KPs using the Binary Harmony Search algorithm. The proposed architecture exploits the intrinsic parallelism of population-based metaheuristic algorithms and the flexibility and parallel-processing capabilities of FPGAs to perform the computation concurrently, thus enhancing performance. To validate the efficiency of the proposed hardware accelerator, experiments were conducted using a large number of 0–1 KPs. Comparative analysis of the experimental results reveals that the proposed approach offers promising speedups of 51×–111× compared with a software implementation and 2×–5× compared with a hardware implementation of the Binary Particle Swarm Optimization algorithm.
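For context, the following is a plain software sketch of Binary Harmony Search applied to a 0-1 knapsack instance, my own simplified illustration rather than the paper's FPGA design; the parameter names (hms, hmcr, par), the penalty scheme and the toy instance are assumptions.

# A hedged software reference sketch of Binary Harmony Search for 0-1 KP.
import random

def binary_harmony_search(values, weights, capacity,
                          hms=20, hmcr=0.9, par=0.3, iters=5000, seed=1):
    random.seed(seed)
    n = len(values)

    def fitness(x):
        w = sum(wi for wi, xi in zip(weights, x) if xi)
        v = sum(vi for vi, xi in zip(values, x) if xi)
        return v if w <= capacity else 0          # simple penalty: infeasible solutions score 0

    # Initialise the harmony memory with random binary vectors.
    memory = [[random.randint(0, 1) for _ in range(n)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for j in range(n):
            if random.random() < hmcr:            # memory consideration
                bit = random.choice(memory)[j]
                if random.random() < par:         # pitch adjustment: flip the bit
                    bit ^= 1
            else:                                 # random selection
                bit = random.randint(0, 1)
            new.append(bit)
        worst = min(range(hms), key=lambda i: fitness(memory[i]))
        if fitness(new) > fitness(memory[worst]):
            memory[worst] = new                   # replace the worst harmony
    best = max(memory, key=fitness)
    return fitness(best), best

# Toy instance: the optimum (value 13) takes the last two items.
print(binary_harmony_search([6, 5, 8], [3, 2, 4], 6))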

7.
A key word with regard to a sub-corpus is a word whose frequency in that sub-corpus is significantly higher than expected under the hypothesis that its use and the variable part of the corpus are mutually independent. A study in literary statistics almost invariably includes a chapter devoted to key words. However, a strong attack has recently been launched on the way stylometry has modelled texts since the classical works of Herdan, Guiraud or Muller. In fact, statistical modelling seems as valid in stylistics as in any other field of the humanities and social sciences. What is questionable is that many studies in literary statistics are more satisfied with the easy identification of monsters, i.e. literary phenomena unexplained by wrong models, than with the laborious search for models that fit the textual data well. A short examination of this controversy and a quantitative analysis of an example provided by Laclos' novel Les Liaisons dangereuses endeavour to support this argument. Christian Delcourt is a senior lecturer in the Department of Romance Philology at the University of Liège.
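For readers unfamiliar with the underlying test, here is a short sketch of the widely used two-term log-likelihood (G2) keyness statistic, which formalizes "significantly higher than expected under independence"; the function and the counts in the example are illustrative and not taken from the paper.

# A hedged illustration of the standard log-likelihood keyness test.
import math

def keyness_g2(freq_sub, size_sub, freq_rest, size_rest):
    """G2 statistic for one word; values above about 3.84 are significant at
    p < 0.05 (1 degree of freedom), about 6.63 at p < 0.01."""
    total = size_sub + size_rest
    expected_sub = size_sub * (freq_sub + freq_rest) / total
    expected_rest = size_rest * (freq_sub + freq_rest) / total
    g2 = 0.0
    for observed, expected in ((freq_sub, expected_sub), (freq_rest, expected_rest)):
        if observed > 0:
            g2 += observed * math.log(observed / expected)
    return 2.0 * g2

# Hypothetical counts: a word occurring 60 times in a 10,000-token sub-corpus
# versus 200 times in the remaining 90,000 tokens.
print(round(keyness_g2(60, 10_000, 200, 90_000), 2))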

8.
To explore virtual environments that are larger than the available physical tracking space by real walking, it is necessary to use so-called redirected walking. Redirection techniques allow the user to explore an unlimited virtual environment in a limited tracking space by introducing a small mismatch between a user’s real and virtual movement, thus preventing the user from colliding with the physical walls of the tracking space. Steering algorithms are used to select the most suitable redirection technique at any given time, depending on the geometry of the real and virtual environment. Together with prediction of a user’s future walking path, these algorithms select the best redirection strategy by an optimal control scheme. In this paper, a new approach for the prediction of a person’s locomotion target is presented. We use various models of human locomotion together with a set of possible targets to create a set of expected paths. These paths are then compared to the real path the user already traveled to calculate the probability of a certain target being the one the user is heading for. A new approach for comparing paths with each other is introduced and is compared to three others. For describing the human’s path to a given target, four different models are used and compared. To gather data for the comparison of the models against the real path, a user study was conducted. Based on the results of the user study, the paper concludes with a discussion on the prediction performance of the different approaches.
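A heavily simplified sketch of the prediction idea, assuming a straight-line locomotion model and a mean closest-point distance as the path-comparison measure (the paper evaluates several models and measures); all names and numbers below are illustrative.

# A hedged sketch: probability of each candidate target given the path walked so far.
import numpy as np

def target_probabilities(walked, targets, n_samples=50, beta=2.0):
    """walked: (k, 2) array of positions already traveled.
    targets: (m, 2) array of candidate target positions.
    Returns one probability per target; beta controls how sharply
    smaller path distances are favoured."""
    start = walked[0]
    distances = []
    for t in targets:
        # Expected path under the straight-line model: evenly sampled segment.
        expected = np.linspace(start, t, n_samples)
        # Mean distance from each walked point to its closest expected point.
        d = np.mean([np.min(np.linalg.norm(expected - p, axis=1)) for p in walked])
        distances.append(d)
    weights = np.exp(-beta * np.array(distances))   # smaller distance -> larger weight
    return weights / weights.sum()

# Toy example: the user has walked roughly toward the target at (5, 0).
walked = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.0], [3.0, -0.1]])
targets = np.array([[5.0, 0.0], [0.0, 5.0], [-5.0, 0.0]])
print(target_probabilities(walked, targets).round(3))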

9.
Indirect Immunofluorescence (IIF) images are increasingly being used for the diagnosis of autoimmune diseases. However, the analysis of this kind of image has so far reached a comparatively low level of automation compared with other medical imaging techniques. The Special Issue on the Analysis and Recognition of Indirect Immunofluorescence Images of the Pattern Recognition journal aims at providing a comprehensive evaluation of the state of the art for the staining pattern classification problem, through the adoption of a common experimental protocol and the testing of all the methods on a publicly available dataset.

10.
The built environment sector has a significant impact on communities. At the same time, it is the sector with the highest cost- and environmental-saving potential, provided effective strategies are implemented. The emerging Semantic Web promises new opportunities for the efficient management of information and knowledge about various domains. While other domains, particularly bioinformatics, have fully embraced the Semantic Web, knowledge about how it has been applied to the built environment is sketchy. This study investigates the development and trend of Semantic Web applications in the built environment. Understanding the different applications of the Semantic Web is essential for evaluation, improvement and the opening of new research directions. A review of over 120 refereed articles on built environment Semantic Web applications has been conducted. A classification of the different Semantic Web applications in relation to their year of application is presented to highlight the trend. Two major findings have emerged. Firstly, despite limited research on easy-to-use applications, progress is being made from the often too-common ontological concepts towards more innovative concepts such as Linked Data. Secondly, a shift from traditional construction applications to Semantic Web sustainable-construction applications is gradually emerging. To conclude, research challenges, potential future developments and research directions are discussed.

11.
The term Classroom Proxemics refers to how teachers and students use classroom space, and the impact of this and the spatial design on learning and teaching. This study addresses the divide between, on the one hand, substantial work on proxemics based on classroom observations and, on the other hand, emerging work to design automated feedback that helps teachers identify salient patterns in their use of the classroom space. This study documents how digital analytics were designed in service of a senior teacher's practice-based inquiry into classroom proxemics. Indoor positioning data from four teachers were analysed, visualized and used as evidence to compare three distinct learning designs enacted in a physics classroom. This study demonstrates how teachers can make effective use of such visualizations, to gain insight into their classroom practice. This is evidenced by (a) documenting teachers' reflections on visualizations of positioning data, both their own and that of peers and (b) identifying the types of indicator (operationalized as analytical metrics) that foreground the most useful information for teachers to gain insight into their practice.

12.
Research with computer systems and musical grammars into improvisation as found in the tabla drumming system of North India has indicated that certain musical sentences comprise (a) variable prefixes, and (b) fixed suffixes (or cadences) identical with those of their original rhythmic themes. It was assumed that the cadence functioned as a kind of target in linear musical space, and yet experiments showed that defining what exactly constituted the cadence was problematic. This paper addresses the problem of the status of cadential patterns, and demonstrates the need for a better understanding and formalization of ambiguity in musico-cognitive processing. It would appear from the discussion that the cadence is not a discrete unit in itself, but just part of an ever-present underlying framework comprising the entire original rhythmic theme. Improvisations (variations), it is suggested, merely break away from and rejoin this framework at important structural points. This endorses the theory of simultaneity. However, the general cognitive implications are still unclear, and further research is required to explore musical ambiguity and the interaction of musical, linguistic, and spatio-motor grammars.

13.
The paper presents an empirical study of user involvement in developing a technical standard for a scientific community's information system project. The case illustrates how multiple perspectives are involved when considering the user role in practice. The case presents a situation where both developers and users were pre‐defined in the design and development phases of the standard as homogeneous groups of actors. Groups of actors changed to become more heterogeneous and ‘fluid’ in the deployment and implementation phases, thus forming ‘webs of developers’ and ‘webs of users’. Detailed analysis of the process in its entirety shows the blurredness of boundaries between ‘developer’ and ‘user’ categories and roles, and reveals challenges at social and organizational levels. Three models pertaining to the system development process are presented in order to illuminate differing perspectives on the user and on the development process itself. The paper draws theoretically from information systems, social informatics, and science and technology studies. The research contributes to a deeper, interdisciplinary understanding of ‘the’ user, of multiple roles in systems development, and of dynamic sets of user–developer relations.

14.
This paper studies a group of basic state-reduction-based dynamic programming (DP) algorithms for the multi-objective 0–1 knapsack problem (MKP), which are related to the backward reduced-state DP space (BRDS) and the forward reduced-state DP space (FRDS). The BRDS is widely ignored in the literature because it imposes a disadvantage for the single-objective knapsack problem (KP) in terms of memory requirements. The FRDS-based DP algorithm is, in a general sense, related to state dominance checking, which can be time-consuming for the MKP although it can be done efficiently for the KP. Consequently, no algorithm purely based on the FRDS with state dominance checking has ever been developed for the MKP. In this paper, we attempt to gain some insight into the state reduction techniques that are efficient for the MKP. We first propose an FRDS-based algorithm with local state dominance checking for the MKP. We then evaluate the relative advantages of the BRDS- and FRDS-based algorithms by analyzing their computational time and memory requirements for the MKP. Finally, different combinations of the BRDS- and FRDS-based algorithms are developed on this basis. Numerical experiments on bi-objective KP instances are conducted to compare these algorithms systematically with the recently developed BRDS-based DP algorithm as well as the existing FRDS-based DP algorithm without state dominance checking.
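To make the notion of state dominance checking concrete, here is a generic forward DP sketch for a bi-objective 0-1 knapsack that keeps only non-dominated (weight, value1, value2) states; it illustrates the principle only and is not the paper's BRDS/FRDS algorithms.

# A hedged, generic sketch of forward DP with Pareto state dominance checking.
def bi_objective_knapsack(weights, values1, values2, capacity):
    """A state is kept only if no other state has weight <= and both objective
    values >=, with at least one strict inequality."""
    def dominates(a, b):
        return a != b and a[0] <= b[0] and a[1] >= b[1] and a[2] >= b[2]

    states = {(0, 0, 0)}
    for w, v1, v2 in zip(weights, values1, values2):
        extended = set(states)
        for (cw, cv1, cv2) in states:
            if cw + w <= capacity:
                extended.add((cw + w, cv1 + v1, cv2 + v2))
        # Dominance check: discard any state dominated by another one.
        states = {s for s in extended
                  if not any(dominates(t, s) for t in extended)}
    # Pareto front in objective space (drop the weight component).
    front = {(v1, v2) for (_, v1, v2) in states}
    return sorted(s for s in front
                  if not any(o != s and o[0] >= s[0] and o[1] >= s[1] for o in front))

# Toy bi-objective instance.  Expected Pareto front: [(3, 5), (4, 2), (5, 1)].
print(bi_objective_knapsack(weights=[2, 3, 4], values1=[3, 4, 5],
                            values2=[5, 2, 1], capacity=4))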

15.
IEEE Software, 2001, 18(3): 41-45
We often hear that it is difficult to get software measurement into practice. Traditional measurement addresses the decisions that support increased quality, increased programmer productivity, and reduced costs: key elements for organizations strategically focused on operational excellence. But what if the organization's highest priority isn't operational excellence? The article shows that such organizations have different measurement needs and presents ideas on how to address those needs, thereby making measurement more appealing. While the disparity discussed here involves measurement, it applies to all areas of software process improvement. For example, the Software Engineering Institute's Capability Maturity Model for Software is silent on two of the three strategies of high-performing organizations: customer intimacy and product innovation. Like traditional measurement, the Capability Maturity Model applies only to organizations wanting to be operationally excellent.

16.

University students, whether adult, distance learners, or of traditional age, need access to university services and quick, accurate answers to their questions beyond traditional “business” hours. Students' busy schedules and changing life patterns dictate that university services meet their needs. At DePaul University in Chicago, student focus groups repeatedly pointed to the need for one central place to get an answer or solve a problem. DePaul Central was created as an information and referral service to satisfy that student need, at the same time capitalizing on the value of the librarian skill set.

17.
Data Processing, 1984, 26(6): 4-6
The article offers an update on the current state of videotex. Topics covered are: industry standards, Prestel, current trends and future trends.

18.
We are concerned with a variation of the knapsack problem, the bi-objective max–min knapsack problem (BKP), in which the values of items differ under two possible scenarios. We give a heuristic algorithm and an exact algorithm to solve this problem. In particular, we introduce a surrogate relaxation to derive upper and lower bounds very quickly, and apply a pegging test to reduce the size of the BKP. We also exploit this relaxation to obtain an upper bound in the branch-and-bound algorithm that solves the reduced problem. To further reduce the problem size, we propose a ‘virtual pegging’ algorithm and solve the BKP to optimality. As a result, for problems with up to 16,000 items, we obtain a very accurate approximate solution in less than a few seconds. Except for some instances, exact solutions can also be obtained in less than a few minutes on ordinary computers. However, the proposed algorithm is less effective for strongly correlated instances.
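The role of the surrogate relaxation can be illustrated with a small sketch (my own simplification, not the paper's implementation): merging the two value vectors with a multiplier lam in [0, 1] yields an ordinary knapsack whose optimum is an upper bound on the max–min value, while evaluating min(v1·x, v2·x) at the surrogate solution gives a lower bound.

# A hedged sketch of surrogate-relaxation bounds for the max-min knapsack.
def knapsack_01(values, weights, capacity):
    """Standard DP for a single-objective 0-1 knapsack; returns (value, items)."""
    n = len(values)
    dp = [[0.0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for c in range(capacity + 1):
            dp[i][c] = dp[i - 1][c]
            if weights[i - 1] <= c:
                dp[i][c] = max(dp[i][c], dp[i - 1][c - weights[i - 1]] + values[i - 1])
    # Backtrack the chosen items.
    items, c = [], capacity
    for i in range(n, 0, -1):
        if dp[i][c] != dp[i - 1][c]:
            items.append(i - 1)
            c -= weights[i - 1]
    return dp[n][capacity], sorted(items)

def surrogate_bounds(v1, v2, weights, capacity, lam=0.5):
    surrogate = [lam * a + (1 - lam) * b for a, b in zip(v1, v2)]
    upper, items = knapsack_01(surrogate, weights, capacity)   # upper bound
    lower = min(sum(v1[i] for i in items), sum(v2[i] for i in items))  # lower bound
    return upper, lower, items

# Toy instance -> (15.5, 14, [0, 1, 3]): upper bound, lower bound, surrogate solution.
print(surrogate_bounds(v1=[6, 5, 8, 3], v2=[4, 7, 2, 6],
                       weights=[3, 2, 4, 1], capacity=6))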

19.
The objective of the multi-dimensional knapsack problem (MKP) is to find a subset of items with maximum value that satisfies a number of knapsack constraints. Solution methods for MKP, both heuristic and exact, have been researched for several decades. This paper introduces several fast and effective heuristics for MKP that are based on solving the LP relaxation of the problem. Improving procedures are proposed to strengthen the results of these heuristics. Additionally, the heuristics are run with appropriate deterministic or randomly generated constraints imposed on the linear relaxation that allow generating a number of good solutions. All algorithms are tested experimentally on a widely used set of benchmark problem instances to show that they compare favourably with the best-performing heuristics available in the literature.
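A rough sketch of the general LP-relaxation-and-rounding idea follows (not the paper's specific heuristics or improving procedures; the rounding threshold, repair rule and toy instance are assumptions), using SciPy's linprog for the relaxation.

# A hedged LP-rounding sketch for the multi-dimensional knapsack problem.
import numpy as np
from scipy.optimize import linprog

def lp_rounding_mkp(values, weights, capacities):
    """values: (n,), weights: (m, n) constraint matrix, capacities: (m,)."""
    values = np.asarray(values, float)
    weights = np.asarray(weights, float)
    capacities = np.asarray(capacities, float)
    n = values.size
    # LP relaxation: maximise values.x subject to weights @ x <= capacities, 0 <= x <= 1.
    res = linprog(-values, A_ub=weights, b_ub=capacities,
                  bounds=[(0, 1)] * n, method="highs")
    x = (res.x >= 0.5).astype(int)                # round the LP solution
    # Repair: while infeasible, drop the selected item with the worst
    # value-to-total-weight ratio.
    ratio = values / weights.sum(axis=0)
    while np.any(weights @ x > capacities):
        selected = np.flatnonzero(x)
        x[selected[np.argmin(ratio[selected])]] = 0
    # Improve: greedily add unselected items (best ratio first) that still fit.
    slack = capacities - weights @ x
    for j in np.argsort(-ratio):
        if x[j] == 0 and np.all(weights[:, j] <= slack):
            x[j] = 1
            slack -= weights[:, j]
    return values @ x, x

# Toy MKP with two knapsack constraints; the optimum here is 21 (items 0 and 1).
values = [12, 9, 7, 5]
weights = [[5, 4, 3, 2],      # resource 1
           [4, 3, 5, 2]]      # resource 2
print(lp_rounding_mkp(values, weights, capacities=[10, 9]))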

20.
The execution time of a parallel algorithm can be optimized by using an analytical cost-model function that represents the running time. Typically, the cost function includes a set of parameters that model the behavior of the system and the algorithm. To reach an optimal execution, some of these parameters must be fitted according to the input problem and the target architecture. An optimization problem can therefore be stated in which the modeled execution time of the algorithm is used to estimate the parameters. Because of the large number of variable parameters in the model, analytical minimization techniques are discarded. Exhaustive search techniques can be used to solve the optimization problem, but when the number of parameters or the size of the computational system increases, the method becomes impracticable due to time restrictions. The use of approximation methods to guide the search is an alternative; however, their dependence on the algorithm being modeled and the poor quality of the solutions caused by the presence of many local optima in the objective function are also drawbacks of these techniques. The problem becomes particularly difficult in complex systems hosting a large number of heterogeneous processors solving non-trivial scientific applications. The use of metaheuristics allows valid approaches to be developed for general problems with a large number of parameters. A well-known advantage of metaheuristic methods is their ability to obtain high-quality solutions at low running times while maintaining generality. We propose combining the parameterized analytical cost-model function with metaheuristic minimization methods, which constitutes a real new alternative for minimizing the parallel execution time in complex systems. The success of the proposed approach is shown with two different algorithmic schemes on parallel heterogeneous systems. Furthermore, the development of a general framework allows us to easily develop and experiment with different metaheuristics in order to adjust them to particular problems.
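The overall approach can be illustrated with a self-contained toy: the cost model below is invented for illustration (it is not the paper's model), and a simple simulated annealing metaheuristic searches the parameter space instead of exhaustive search.

# A hedged toy: metaheuristic minimisation of a parameterised cost model.
import math
import random

def modeled_time(p, b, n=10_000_000, t_flop=1e-9, t_comm=1e-6):
    """Toy cost model: computation divided among p processors, plus a
    communication term growing with the number of blocks (n / b) and,
    logarithmically, with p.  All constants are purely illustrative."""
    compute = n * t_flop / p
    communicate = (n / b) * t_comm * math.log2(p + 1)
    return compute + communicate

def anneal(steps=20_000, t0=1.0, cooling=0.9995, seed=0):
    random.seed(seed)
    p, b = 1, 64                                   # initial parameter guess
    current = modeled_time(p, b)
    best = (current, p, b)
    temp = t0
    for _ in range(steps):
        # Neighbour move: double or halve one of the two parameters.
        np_, nb = p, b
        if random.random() < 0.5:
            np_ = min(1024, p * 2) if random.random() < 0.5 else max(1, p // 2)
        else:
            nb = min(65_536, b * 2) if random.random() < 0.5 else max(1, b // 2)
        cand = modeled_time(np_, nb)
        # Accept improvements always, worse moves with a temperature-dependent probability.
        if cand < current or random.random() < math.exp((current - cand) / max(temp, 1e-12)):
            p, b, current = np_, nb, cand
            if current < best[0]:
                best = (current, p, b)
        temp *= cooling
    return best

print(anneal())    # (estimated minimal modeled time, processors, block size)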
