Similar Literature
20 similar records found
1.
The ADA-method is an attempt to integrate work environment issues into a usability evaluation method. The intention is to provide a method that can be used for the analysis of computer systems that are used by skilled professionals as a major part of their work. An ADA-analysis is performed as a semi-structured observation interview. The objectives of the ADA-method are (1) to identify usability and cognitive work environment problems in a computer supported work situation, and (2) to be a basis for further analysis and discussions concerning improvements of the system. The method was designed to suit the needs of occupational health specialists as a complement to their traditional methods for investigating physical and psychosocial work environments. However, the method has a more general applicability as it can be taught to any usability expert to facilitate work environment considerations in their analysis and evaluation work. Furthermore, the paper reports on the use of the method in several different settings and the results thereof.

2.
A model‐based method for assessing the usability of graphical, direct‐manipulation style interfaces was developed. The method involves collecting and integrating verbal protocol data, history logs, and videotapes of the system display. Then, an analyst familiar with the task, the data, and Norman's (1986) user activity model reviews the data and determines what they mean in terms of the model. An encoding scheme is next applied to the integrated data, to structure the Human Computer Interaction (HCI) process at a detailed interaction level. The structured data then support the application of quantitative methods and the identification of meaningful patterns and frequencies that highlight potential usability problems or instances of indirectness. Error encodings reflect user‐system interface difficulties not only in the execution stage but also in the psychological stages. The method was used to evaluate the usability of a military airspace scheduling system; the types of usability problems identified and the advantages of the method are discussed.

3.
Biological sex has become a common variable in studies analyzing participation levels in both traditional oral and computer classrooms. This article, however, argues for the overlay of the biological sex variable and one that measures a socially constructed gender (such as the Bem Sex-Role Inventory). This article reports on a study of male and female students' participation in class discussions (measured in word counts), which found that students (in general) participated more frequently in electronic than in face-to-face discussions. Overall, students participated more frequently in face-to-face discussions after they participated in Daedalus interchange sessions, but socially constructed variables such as gender led some students to participate less frequently in traditional oral discussions after using interchange. These findings indicate that although the computer environment may not promote egalitarian discourse in which everyone participates equally, it does tend to produce more democratic discourse, in which everyone has an equal opportunity to participate.

4.
We compared the effectiveness of lab testing, beta testing, and forum testing at identifying software usability problems. Thirty participants were involved in the experiment, with ten participants in each of the three test conditions. The lab test involved participants performing prescribed scenarios with the software in a controlled lab environment, while human factors engineers recorded participants' problems. The beta test method had participants use the software in their own environment to perform their real-world work and record their own problems. The forum test was similar to the beta test, except that the software was made available on a company-wide computer bulletin board and the participants selected themselves. Findings show that the beta test method was as effective as the lab test method in the number of problem types identified. The lab test uncovered a larger proportion of serious usability problems than did the beta test. The beta test method was the most cost-effective method. The forum test method found the fewest problem types and was the least cost-effective. Thus, the results of this study broaden the current literature by showing that the beta test method may be a cost-effective alternative to the traditional lab test.

5.
Ergonomics, 2012, 55(7): 609–625
In-vehicle information systems (IVIS) can be controlled by the user via direct or indirect input devices. In order to develop the next generation of usable IVIS, designers need to be able to evaluate and understand the usability issues associated with these two input types. The aim of this study was to investigate the effectiveness of a set of empirical usability evaluation methods for identifying important usability issues and distinguishing between the IVIS input devices. A number of usability issues were identified and their causal factors have been explored. These were related to the input type, the structure of the menu/tasks and hardware issues. In particular, the translation between inputs and on-screen actions and a lack of visual feedback for menu navigation resulted in lower levels of usability for the indirect device. This information will be useful in informing the design of new IVIS, with improved usability.

Statement of Relevance: This paper examines the use of empirical methods for distinguishing between direct and indirect IVIS input devices and identifying usability issues. Results have shown that the characteristics of indirect input devices produce more serious usability issues than direct devices and can have a negative effect on the driver–vehicle interaction.

6.
Context: Uncertainty is an unavoidable issue in software engineering and an important area of investigation. This paper studies the impact of uncertainty on total duration (i.e., make-span) for implementing all features in operational release planning. Objective: The uncertainty factors under investigation are: (1) the number of new features arriving during release construction, (2) the estimated effort needed to implement features, (3) the availability of developers, and (4) the productivity of developers. Method: An integrated method is presented combining Monte-Carlo simulation (to model uncertainty in the operational release planning (ORP) process) with process simulation (to model the ORP process steps and their dependencies as well as an associated optimization heuristic representing an organization-specific staffing policy for make-span minimization). The method allows for evaluating the impact of uncertainty on make-span. The impact of the uncertainty factors, both in isolation and in combination, is studied at three different pessimism levels through comparison with a baseline plan. Initial evaluation of the method is done by an explorative case study at Chartwell Technology Inc. to demonstrate its applicability and its usefulness. Results: The impact of uncertainty on release make-span increases – both in terms of magnitude and variance – with an increase of the pessimism level as well as with an increase of the number of uncertainty factors. Among the four uncertainty factors, we found that the strongest impact stems from the number of new features arriving during release construction. We have also demonstrated that for any combination of uncertainty factors, their combined (i.e., simultaneous) impact is greater than the sum of their individual impacts. Conclusion: The added value of the presented method is that managers are able to study the impact of uncertainty on existing (i.e., baseline) operational release plans proactively.
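As an illustration of the kind of Monte-Carlo layer this abstract describes, the sketch below samples the four uncertainty factors, pushes each sample through a simple greedy staffing rule (a stand-in for the organization-specific optimization heuristic, which the abstract does not detail), and compares the resulting make-span against a baseline plan. All numbers and the scheduling rule are invented for illustration; none of them come from the Chartwell Technology case study.

```python
import numpy as np

rng = np.random.default_rng(42)

def makespan(efforts, n_devs, availability, productivity):
    """Greedy list scheduling: give each feature to the developer who frees up first.
    Effective capacity shrinks with the availability and productivity factors."""
    free_at = np.zeros(n_devs)
    for e in sorted(efforts, reverse=True):
        d = np.argmin(free_at)
        free_at[d] += e / (availability * productivity)
    return free_at.max()

base_efforts = [8, 5, 13, 3, 21, 8, 5]          # person-days per feature (invented)
baseline = makespan(base_efforts, n_devs=3, availability=1.0, productivity=1.0)

samples = []
for _ in range(5000):
    # (2) effort estimation uncertainty (multiplicative noise, clipped to stay positive)
    efforts = [e * max(0.1, rng.normal(1.0, 0.25)) for e in base_efforts]
    # (1) new features arriving during release construction
    n_new = rng.poisson(2)
    efforts += list(rng.uniform(3, 15, size=n_new))
    # (3) developer availability and (4) developer productivity
    availability = rng.uniform(0.7, 1.0)
    productivity = max(0.5, rng.normal(1.0, 0.15))
    samples.append(makespan(efforts, 3, availability, productivity))

delays = np.array(samples) - baseline
print(f"mean make-span increase: {delays.mean():.1f} days, "
      f"90th percentile: {np.percentile(delays, 90):.1f} days")
```

Repeating the sampling with wider noise distributions would mimic the three pessimism levels and reproduce the kind of magnitude-and-variance comparison the abstract reports.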

7.
This paper analyzes the relation between usability and aesthetics. In a laboratory study, 80 participants used one of four different versions of the same online shop, differing in interface-aesthetics (low vs. high) and interface-usability (low vs. high). Participants had to find specific items and rate the shop before and after usage on perceived aesthetics and perceived usability, which were assessed using four validated instruments. Results show that aesthetics does not affect perceived usability. In contrast, usability has an effect on post-use perceived aesthetics. Our findings show that the “what is beautiful is usable” notion, which assumes that aesthetics enhances the perception of usability, can be reversed under certain conditions (here: strong usability manipulation combined with a medium to large aesthetics manipulation). Furthermore, our results indicate that the user’s affective experience with the usability of the shop might serve as a mediator variable within the aesthetics–usability relation: the frustration of poor usability lowers ratings on perceived aesthetics. The significance of the results is discussed in the context of the existing research on the relation between aesthetics and usability.

8.
Cognition, Technology & Work - Navigating a ship is a complex task that requires close interaction between navigators and technology available on the ship’s bridge. The quality of this...

9.
Computers & Education, 1986, 10(1): 89–96
This paper describes the development of a computer based teaching package for use at upper secondary school level. The project was funded by Understanding Electricity, which is the educational service of the Electricity Council. The principles of electricity supply are integral to several subject areas in the school curriculum. The purpose of this package is firstly to provide well-resourced teaching material to cover these curriculum needs and secondly to provide an industrial context in which to apply the technical skills acquired in individual subject disciplines. The package is based on a computer simulation that enables the user to experience the problems of running an electricity supply system. The use of the computer in this way enables a powerful investigative approach to be adopted in the classroom. The paper begins by discussing the background to joint industry/education projects and the procedural models that have emerged. It continues by presenting the project history, the development team model used and the educational concepts of electricity supply that provide the background to the computer simulation. Attention is drawn to the ways in which the finished package matches the industrial resource material to curriculum needs. Teachers were involved from the beginning, both in the development of the computer software and in the production of curriculum material. They also undertook the design of investigations in specific subject areas—economics, physics, mathematics and geography. This paper presents the results of field trials and discusses the problems of evaluating and marketing educational material. Although the detail of this paper deals specifically with educational material developed for the Electricity Council, the principles discussed have significance for approaches to the production of educational software generally.

10.
Over the past twenty years industry and academia have been working to develop computer systems to increase work groups' productivity, commonly referred to as groupware. Groupware encompasses a broad spectrum of research and development including group support systems, computer-supported collaborative work, group decision support systems, and computer-mediated collaboration. Applications arising out of these efforts included concurrent multi-user authoring systems, computer conferencing, integrated computer/video meeting systems, electronic voting, brainstorming, and workflow systems. The papers in this special issue are some of the best from over 100 papers submitted to the GROUP'97 conference sponsored by the ACM Special Interest Group on Supporting Group Work. They represent work conducted by researchers on four continents from both industry and academia. As a group the authors present a blend of theory, practice, and technological innovation from the groupware research arena. This paper is intended to serve as an introduction to the area of groupware research and development. In it we explore the evolution of groupware and expose some of its effects on organizations and society.

11.
To address concerns raised regarding the use of online course‐based summative assessment methods, a quasi‐experimental design was implemented in which students who completed a summative assessment either online or offline were compared on performance scores when using their self‐reported preferred or non‐preferred modes. Performance scores were found not to differ depending on whether the assessment was completed in the preferred or non‐preferred mode. These findings provide preliminary support for the validity of online assessment methods. Future studies could help determine the extent to which this finding generalizes beyond the assessment procedures and type of sample used here. Suggestions for follow‐up studies are offered, including exploring the validity of more complex computer‐related online assessment tasks and investigating the impact of using preferred and non‐preferred modes upon the quality of the student experience.

12.
Is current research on computing by older adults simply looking at a short-term problem? Or will the technology problems that plague the current generation also be problematic for today’s tech-savvy younger generations when they become “old”? This paper considers age-related and experience-related issues that affect the ability to use new technology. Without more consideration of the skills of older users, it is likely that applications and devices 20 years from now will have changed such that this “older” generation finds itself confronting an array of technologies that it little understands and finds generally inaccessible. Recent evidence suggests that older adults bring specific strengths to Web browsing. A fuller investigation of these strengths and how to design to optimize for the strengths of older users has the potential to address the need for usable technology for this increasingly important demographic.

13.
The lattice Boltzmann method is being increasingly employed in the field of computational fluid dynamics due to its computational efficiency. Floating-point operations in the lattice Boltzmann method involve local data and therefore allow easy cache optimization and parallelization. Due to this, the cache-optimized lattice Boltzmann method has superior computational performance over traditional finite difference methods for solving unsteady flow problems. When solving steady flow problems, the explicit nature of the lattice Boltzmann discretization limits the time step size and therefore the efficiency of the lattice Boltzmann method for steady flows. To quantify the computational performance of the lattice Boltzmann method for steady flows, a comparison study between the lattice Boltzmann method (LBM) and the alternating direction implicit (ADI) method was performed using the 2-D steady Burgers’ equation. The comparison study showed that the LBM performs comparatively poorly on high-resolution meshes due to smaller time step sizes, while on coarser meshes where the time step size is similar for both methods, the cache-optimized LBM performance is superior. Because flow domains can be discretized with multiblock grids consisting of coarse and fine grid blocks, the cache-optimized LBM can be applied on the coarse grid block while the traditional implicit methods are applied on the fine grid blocks. This paper finds the coupled cache-optimized lattice Boltzmann-ADI method to be faster by a factor of 4.5 over the traditional methods while maintaining similar accuracy.
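To make the locality argument concrete, here is a minimal single-relaxation-time (BGK) D2Q9 collide-and-stream step. This is a generic textbook sketch, not the cache-optimized solver or the Burgers'-equation discretization used in the study, and the array layout is chosen for brevity rather than for cache performance.

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights (standard values).
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    """Second-order BGK equilibrium distributions."""
    feq = np.empty((9,) + rho.shape)
    usq = ux*ux + uy*uy
    for i in range(9):
        cu = c[i, 0]*ux + c[i, 1]*uy
        feq[i] = w[i] * rho * (1 + 3*cu + 4.5*cu*cu - 1.5*usq)
    return feq

def lbm_step(f, tau):
    """One explicit LBM time step: local collision, then nearest-neighbour streaming."""
    # Macroscopic moments -- computed from the nine populations of the same node only.
    rho = f.sum(axis=0)
    ux = np.tensordot(c[:, 0], f, axes=1) / rho
    uy = np.tensordot(c[:, 1], f, axes=1) / rho
    # BGK collision: purely local, hence easy to tile for cache reuse or to parallelize.
    f -= (f - equilibrium(rho, ux, uy)) / tau
    # Streaming: each population shifts one lattice link (periodic boundaries here).
    for i in range(9):
        f[i] = np.roll(f[i], shift=(c[i, 0], c[i, 1]), axis=(0, 1))
    return f

# Example: 64x64 periodic domain initialised at rest with unit density.
f = equilibrium(np.ones((64, 64)), np.zeros((64, 64)), np.zeros((64, 64)))
for _ in range(100):
    f = lbm_step(f, tau=0.8)
```

Because the moment evaluation and collision read and write only a single node's populations, and streaming touches only nearest neighbours, the update can be blocked for cache or distributed across processors; this is the property the comparison with the ADI method hinges on.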

14.
A method is developed for application of the Tsypkin criterion and the Jury and Lee stability criteria for certain classes of non-linear sampled data systems. The method, which uses root-locus techniques, is based on the root-locus interpretation of the Popov criterion for continuous data non-linear systems. This method was developed by Ramapriyan et al. (1966).
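For orientation, the frequency-domain conditions these criteria build on can be stated as follows. The sector and multiplier assumptions given here are the commonly quoted ones; the specific system classes treated in the paper may differ.

```latex
% Popov criterion (continuous time): a memoryless nonlinearity confined to the
% sector [0, K] in feedback with G(s) is absolutely stable if some q >= 0 gives
\[
  \operatorname{Re}\bigl[(1 + j\omega q)\,G(j\omega)\bigr] + \frac{1}{K} > 0
  \qquad \text{for all } \omega \ge 0 .
\]
% Tsypkin's sampled-data counterpart (commonly stated without a multiplier),
% with G(z) the pulse transfer function of the linear part and T the sampling period:
\[
  \operatorname{Re}\bigl[G(e^{j\omega T})\bigr] + \frac{1}{K} > 0
  \qquad \text{for } 0 \le \omega T \le \pi .
\]
```

The root-locus reading of the continuous-time Popov condition is what the paper carries over to the sampled-data criteria.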

15.
Government legislation and calls for greater levels of oversight and transparency are leading public bodies to publish their raw datasets online. Policy makers and elected officials anticipate that the accessibility of open data through online Government portals for citizens will enable public engagement in policy making through increased levels of fact-based content elicited from open data. The usability and benefits of such open data are being argued as contributing positively towards public sector reforms, which are under extreme pressures driven by extended periods of austerity. However, there are very few scholarly studies that have attempted to empirically evaluate the performance of government open data websites and the acceptance and use of these data from a citizen perspective. Given this research void, an adjusted diffusion of innovation model based on Rogers’ diffusion of innovations theory (DOI) is proposed and used in this paper to empirically determine the predictors influencing the use of public sector open data. A good understanding of these predictors affecting the acceptance and use of open data will likely assist policy makers and public administrations in determining the policy instruments that can increase the acceptance and use of open data through an active promotion campaign to engage-contribute-use.

16.
This work explores the feasibility of proposing universal design guidelines for E-training modules considering aging differences as an important factor. A controlled experiment was designed and conducted to evaluate the effects of module design characteristics on information recall, satisfaction, disorientation, and task workload, and the implications for E-training. Sixteen Web modules with two different lesson content types were developed for this study, considering different independent variables such as camera focus, environment simulator, video size, and instructor’s gender. The experimental results suggest that an interface that ensures high levels of satisfaction and information recall as well as low levels of disorientation and task workload could be accomplished only partially if young and aging participants were to be targeted simultaneously with the same type of training module. Based on the results of this study, the specific design preferences suggest that an interface providing narrative-type information, displaying a large video against a realistic background, and using text larger than 18-point font while avoiding colored text is preferred over other combinations of design variables.

17.
The lattice Boltzmann method (LBM) and traditional finite difference methods have separate strengths when solving the incompressible Navier–Stokes equations. The LBM is an explicit method with a highly local computational nature that uses floating-point operations that involve only local data and thereby enables easy cache optimization and parallelization. However, because the LBM is an explicit method, smaller grid spacing requires smaller numerical time steps during both transient and steady state computations. Traditional implicit finite difference methods can take larger time steps as they are not limited by the CFL condition, but only by the need for time accuracy during transient computations. To take advantage of the strengths of both methods, a multiple solver, multiple grid block approach was implemented and validated for the 2-D Burgers’ equation in Part I of this work. Part II implements the multiple solver, multiple grid block approach for the 2-D backward step flow problem. The coupled LBM–VSM solver is found to be faster by a factor of 2.90 (2.87 and 2.93 for Re = 150 and Re = 500, respectively) on a single processor than the VSM for the 2-D backward step flow problem while maintaining similar accuracy.
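The time-step limitation mentioned in this abstract is the usual advective CFL restriction on explicit schemes; the bound below is a generic statement added for illustration, not the specific stability limit derived in the paper.

```latex
% Explicit schemes must keep the Courant number of order one:
\[
  \Delta t \;\le\; C\,\frac{\Delta x}{|u|_{\max}}, \qquad C = \mathcal{O}(1),
\]
% so refining the grid forces proportionally smaller time steps (and more of them
% to reach steady state), whereas an implicit scheme's step is bounded only by the
% accuracy required during transients.
```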

18.
Much research has investigated computer self-efficacy. Despite these efforts, the relation between efficacy beliefs concerning the task being performed on the computer and beliefs dealing with the computer application remains overlooked. In this study, we apply associationism to show how task-specific self-efficacy beliefs (TSE) positively influence computer-specific self-efficacy (CSE) judgments. We also show that this relation might be more complex than first thought: the degree of match between the novelty of the task and the novelty of the application moderates this relation. That is, when both the task and the application are novel (or not), the influence of TSE on CSE is greater than when one is novel and the other is not. Furthermore, we show that CSE positively influences perceptions of usefulness, and as such, CSE represents one of the building blocks of the formation of beliefs about computer applications. Finally, several implications for practice and future research are discussed.

19.

Quality is a rather slippery concept, and its assessment in subtitling can be a challenging task, as its appreciation can easily vary depending on the different stakeholders involved in the production and reception of subtitles. In this paper, we evaluate quality indicators in subtitling as perceived by professional subtitlers and viewers. After exploring the various subtitle parameters that can have an impact on the quality of the end product (such as line breaks, synchronisation, display rates), we present the results of two qualitative studies conducted with professional subtitlers and subtitle viewers with different audiovisual backgrounds. The results yield some similarities and discrepancies, particularly in the way in which the strategy of condensation is perceived by the two groups, and they also help delineate the subtitle parameters that should be taken into consideration in order to improve the creative process as well as the reception of subtitles.


20.
To determine if the contribution of slipperiness to occupational slip, trip and fall (STF)-related injuries could be isolated from injury surveillance systems in the USA, the UK and Sweden, six governmental systems and one industrial system were consulted. The systems varied in their capture approaches and the degree of documentation of exposure to slipping. The burden of STF-related occupational injury ranged from 20 to 40% of disabling occupational injuries in the developed countries studied. The annual direct cost of fall-related occupational injuries in the USA alone was estimated to be approximately US$6 billion. Slipperiness or slipping was found to contribute to between 40 and 50% of fall-related injuries. Slipperiness was more often a factor in same level falls than in falls to lower levels. The evaluation of the burden of slipperiness was hampered by design limitations in many of the data systems utilized. The resolution of large-scale injury registries should be improved by collecting more detailed incident sequence information to better define the full scope and contribution of slipperiness to occupational STF-related injuries. Such improvements would facilitate the allocation of prevention resources towards reduction of first-event risk factors such as slipping.
