Similar Documents
20 similar documents found (search time: 15 ms)
1.
《Ergonomics》2012,55(5):674-679
The objective of this study was to examine the relationship between slip, trip and fall injuries and obesity in a population of workers at the Idaho National Laboratory (INL) in Idaho Falls, Idaho. INL is an applied engineering facility dedicated to supporting the US Department of Energy's mission. An analysis was performed on injuries reported to the INL Medical Clinic to determine whether obesity was related to an increase in slip, trip and fall injuries. Records were analysed that spanned a 6-year period (2005–2010) and included 8581 employees (mean age, 47 ± 11 years; body mass index [BMI], 29 ± 5 kg/m²; 34% obesity rate). Of the 189 people who reported slip, trip and fall injuries (mean age, 48 ± 11 years), 51% were obese (P < 0.001 compared with uninjured employees), and their mean BMI was 31 ± 6 kg/m² (P < 0.001). Obesity in this population was associated with a greater rate of slip, trip and fall injuries.
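The reported P-values suggest a comparison of obesity prevalence between injured and uninjured employees. As a hedged illustration (the paper's exact statistical test is not stated here), a two-proportion z-test on counts rounded from the abstract's percentages can be sketched as follows:

```python
import math

def two_proportion_z(c1, n1, c2, n2):
    """Two-sample z-statistic for proportions, pooled standard error."""
    p1, p2 = c1 / n1, c2 / n2
    p_pool = (c1 + c2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Counts reconstructed from the abstract's percentages (approximate):
injured_n = 189
injured_obese = round(0.51 * injured_n)            # 51% of injured
total_obese = round(0.34 * 8581)                   # 34% of all employees
uninjured_n = 8581 - injured_n
uninjured_obese = total_obese - injured_obese

z = two_proportion_z(injured_obese, injured_n, uninjured_obese, uninjured_n)
print(f"z = {z:.2f}")   # well beyond the 0.001 significance threshold
```

A z-statistic around 5 corresponds to P < 0.001, consistent with the abstract's claim.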

2.
《Automatica》1970,6(4):581-589
Changes in raw-material quality, inaccurate dosing of energy and material streams, and sudden oscillations of the mass flows are the main factors that cause losses in sugar production. To minimize these losses, an optimizing control structured as a multilayer system is proposed. The first layer consists of a stabilizing control; the set points of its controllers are determined by local static optimizers. Losses across the whole technological line are minimized by a global static optimizer that coordinates the work of the local optimizers. The desired production rate is obtained as the solution to a dynamic optimization problem, the scheduling problem. If the desired production rate is temporarily limited for any reason, a mass flow coordination system is invoked to control flow rates in different parts of the technological line and thus minimize the loss. When the limitation is over, the coordination system returns the line to the previous optimal steady state under the control of the static optimizers.
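The two lowest layers of such a scheme can be sketched in miniature: an upper-layer static optimizer picks the set point that minimizes a loss model, and a lower-layer PI controller stabilizes the process around it. Everything below (the loss model, the first-order plant, the gains) is invented for illustration, not taken from the paper:

```python
def optimal_setpoint(loss, candidates):
    """Upper layer: local static optimization over candidate operating points."""
    return min(candidates, key=loss)

def stabilize(setpoint, y=0.0, kp=2.0, ki=1.0, dt=0.05, steps=600):
    """Lower layer: PI controller tracking the set point on a
    first-order process y' = -y + u (forward-Euler integration)."""
    integral = 0.0
    for _ in range(steps):
        e = setpoint - y
        integral += e * dt                 # integral action removes offset
        u = kp * e + ki * integral         # PI control law
        y += dt * (-y + u)                 # toy plant dynamics
    return y

loss = lambda sp: (sp - 3.2) ** 2 + 0.5    # invented loss model
sp = optimal_setpoint(loss, [2.0, 2.5, 3.0, 3.5, 4.0])
y_final = stabilize(sp)
print(sp, round(y_final, 3))
```

In the paper's full structure a global optimizer would additionally coordinate several such local loops; this sketch shows only one.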

3.
The organization of talent in online communities has been pivotal for the development of open source software. We are currently witnessing a related phenomenon that is at least of equal importance: the ‘open-sourcing’ of digital content through a dramatic increase in user-generated content and the development of appropriate licenses for users to share their works and build on each other's creativity. This article compares and contrasts (a) the objectives of software development vis-à-vis the development of new media content, (b) the organizational forms that have developed in respective online communities, and (c) the role that licensing plays in the production of ‘functional’ vis-à-vis ‘cultural’ goods.

4.
《Computers & chemistry》1995,19(3):299-301
The application of the near-infrared (NIR) technique on a large scale has become possible thanks to rapid and inexpensive computers, and it facilitates both quantitative and qualitative analyses. The great advantages of NIR are rapid analysis, its non-destructive character, and the possibility of measuring many different components at the same time. The NIR method was used to study changes in the chemical composition of sugar-beet leaves under the influence of various fertilizers. The results obtained show that this method is an effective tool for predicting protein, nitrogen and saccharides.
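NIR calibration of this kind typically maps absorbances at selected wavelengths to a reference-measured component via regression. A minimal sketch with invented data (the paper's actual wavelengths and chemometric model are not given here):

```python
import numpy as np

rng = np.random.default_rng(0)
true_coef = np.array([1.5, -0.8, 0.3])       # made-up spectral weights
X = rng.normal(size=(40, 3))                 # absorbances at 3 wavelengths
y = X @ true_coef + 12.0 + rng.normal(scale=0.05, size=40)  # % protein

# Calibrate: ordinary least squares with an intercept column
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict protein content for a new leaf spectrum
x_new = np.array([0.2, 0.1, -0.3, 1.0])
print("predicted protein (%):", round(float(x_new @ coef), 2))
```

Real NIR work usually needs many more wavelengths and methods such as PLS regression to cope with collinear spectra; OLS is used here only to keep the sketch short.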

5.
6.
7.
Mesken J  Lajunen T  Summala H 《Ergonomics》2002,45(7):469-483
The aim of the present study was to replicate the distinction between errors, lapses and violations, and to distinguish aggressive violations from ordinary highway-code violations. Furthermore, the relationship of these behaviours with road traffic accidents was examined. A total of 1126 Finnish drivers completed a questionnaire containing the Driver Behaviour Questionnaire (DBQ) with an extended violations scale, together with questions on background information such as age, gender and mileage, and on previous accidents and fines. Factor analysis showed that a four-factor structure was more appropriate than the earlier established three-factor structure. The four factors were errors, lapses, speeding violations and interpersonal violations. The two types of violations result from different motives and seem to be associated with different kinds of affect. Both interpersonal and speeding violations were reported most by young males, consistent with earlier findings. Logistic regression analyses indicated that errors predicted active accident involvement after partialling out the effects of demographic variables, whereas interpersonal violations were positively related to involvement in passive accidents, presumably due to different reporting tendencies of respondents. Speeding tickets were predicted by speeding violations, interpersonal violations and lapses; penalties for speeding were predicted by both kinds of violations and by errors; penalties for parking and other offences were predicted by interpersonal violations. The implications of these results are discussed.
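The logistic-regression step can be illustrated generically. The sketch below uses entirely invented data (an 'errors' score and an age covariate) and a minimal gradient-descent fit, not the DBQ data or the authors' analysis software:

```python
import math, random

def fit_logistic(X, y, lr=0.5, epochs=3000):
    """Minimal batch-gradient-descent logistic regression; w[0] is the intercept."""
    n = len(X)
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))       # predicted probability
            g = (p - yi) / n
            grad[0] += g
            for j, xj in enumerate(xi):
                grad[j + 1] += g * xj
        for j in range(len(w)):
            w[j] -= lr * grad[j]
    return w

# Invented data: accident involvement driven mainly by an 'errors' score,
# with age as a covariate to be partialled out.
random.seed(1)
X, y = [], []
for _ in range(300):
    errors, age = random.gauss(0, 1), random.gauss(0, 1)
    p = 1 / (1 + math.exp(-(1.5 * errors + 0.1 * age)))
    X.append([errors, age])
    y.append(1 if random.random() < p else 0)

w = fit_logistic(X, y)
print("errors coefficient:", round(w[1], 2))   # positive: higher accident odds
```

A positive coefficient on the errors score, after including the covariate, mirrors the kind of result the abstract reports for active accident involvement.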

8.
9.
This empirical paper shows how free/libre open source software (FLOSS) contributes to mutual and collaborative learning in an educational environment. Unlike proprietary software, FLOSS allows extensive customisation of software to better support the needs of local users. This also allows users to participate more proactively in the development and implementation process of a FLOSS-based system. In this paper, we observe how implementing FLOSS in an Italian high school challenges the conventional relationships between end users themselves (e.g. teachers and students) and between users and developers. The findings shed some light on the social aspects of FLOSS-based computerization, including the role of FLOSS in social and organizational change in educational environments and the ways that the social organization of FLOSS is influenced by social forces and social practices.

10.
General formulas are proposed to quantify the effects of changing the model parameters in the so-called BCMP network [F. Baskett et al., J. ACM 22 (2) (April 1975) 248–260]. These formulas relate the derivative of the expectation of any function of both the state and the parameters of the network with respect to any model parameter (i.e., arrival rate, mean service demand, service rate, visit ratio, traffic intensity) to known functions of the state variables. Applications of our results to sensitivity analysis and optimization problems are given.
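The paper derives exact sensitivity formulas for BCMP networks; as a much simpler stand-in, the idea of differentiating a performance measure with respect to an arrival rate can be illustrated on a single M/M/1 queue, where the mean number in system is L = ρ/(1 − ρ) with ρ = λ/μ:

```python
def mean_in_system(lam, mu):
    """Mean number of customers in an M/M/1 queue (requires lam < mu)."""
    rho = lam / mu
    return rho / (1.0 - rho)

def dL_dlam(lam, mu):
    """Analytic sensitivity of L with respect to the arrival rate:
    dL/dlam = (1/mu) / (1 - rho)^2."""
    rho = lam / mu
    return (1.0 / mu) / (1.0 - rho) ** 2

# Check the analytic derivative against a central finite difference
lam, mu, h = 0.6, 1.0, 1e-6
fd = (mean_in_system(lam + h, mu) - mean_in_system(lam - h, mu)) / (2 * h)
print(dL_dlam(lam, mu), fd)
```

At ρ = 0.6 the sensitivity is 1/(1 − 0.6)² = 6.25; the finite difference agrees to several decimal places. The BCMP formulas generalize this kind of derivative to multi-class, multi-station networks.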

11.
Soil organic matter dynamics are essential for terrestrial ecosystem functions, as they affect biogeochemical cycles and thus the provision of plant nutrients and the release of greenhouse gases to the atmosphere. Most of the processes involved are driven by microorganisms. To investigate and understand these processes, individual-based models allow the behavior of complex microbial systems to be analyzed on the basis of rules and conditions for individual entities within those systems, taking into account local interactions and individual variation. Here, we present a streamlined, user-friendly and open version of the individual-based model INDISIM-SOM, which describes the mineralization of soil carbon and nitrogen. It was implemented in NetLogo, a widely used and easily accessible software platform especially designed for individual-based simulation models. Including powerful means to observe model behavior and standardized documentation, this increases INDISIM-SOM's range of potential uses and users, and facilitates exchange among soil scientists as well as between different modeling approaches.
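INDISIM-SOM itself is implemented in NetLogo; purely to illustrate the individual-based idea, here is a toy Python sketch (with invented rules, not those of INDISIM-SOM) in which each microbe takes up soil carbon, mineralizes part of it as CO₂, and divides once its biomass doubles:

```python
import random

random.seed(42)

class Microbe:
    def __init__(self, biomass=1.0):
        self.biomass = biomass

def step(microbes, soil_c, uptake=0.05, yield_frac=0.6):
    """One time step: every microbe takes up carbon; a fixed fraction
    becomes biomass, the rest is mineralized as CO2."""
    co2, offspring = 0.0, []
    for m in microbes:
        take = min(uptake * m.biomass, soil_c)   # uptake limited by soil pool
        soil_c -= take
        m.biomass += yield_frac * take           # growth
        co2 += (1 - yield_frac) * take           # mineralized carbon
        if m.biomass >= 2.0:                     # division rule
            m.biomass /= 2
            offspring.append(Microbe(m.biomass))
    microbes.extend(offspring)
    return soil_c, co2

microbes, soil_c, total_co2 = [Microbe() for _ in range(10)], 100.0, 0.0
for _ in range(200):
    soil_c, co2 = step(microbes, soil_c)
    total_co2 += co2
print(len(microbes), round(soil_c, 1), round(total_co2, 1))
```

Even this toy version keeps carbon mass balance: soil carbon lost equals biomass gained plus CO₂ released, which is the kind of bookkeeping an individual-based mineralization model must maintain.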

12.
Traditionally, computer and software applications have been used by economists to off-load otherwise complex or tedious tasks onto technology, freeing up time and intellect to address other, intellectually more rewarding, aspects of research. On the negative side, this increasing dependence on computers has resulted in research that has become increasingly difficult to replicate. In this paper, we propose some basic standards to improve the production and reporting of computational results in economics for the purpose of accuracy and reproducibility. In particular, we make recommendations on four aspects of the process: computational practice, published reporting, supporting documentation, and visualization. Also, we reflect on current developments in the practice of computing and visualization, such as integrated dynamic electronic documents, distributed computing systems, open source software, and their potential usefulness in making computational and empirical research in economics more easily reproducible.

13.
This paper presents results from research into open source projects from a software engineering perspective. The research methodology employed relies on public data retrieved from the CVS repository of the GNOME project and relevant discussion groups. This methodology is described, and results concerning the special characteristics of open source software development are given. These data are used for a first approach to estimating the total effort to be expended.

14.
The data center network (DCN), an important component of data centers, consists of a large number of hosted servers and switches connected by high-speed communication links. A DCN enables the centralized deployment of resources and on-demand access to the information and services of data centers by users. In recent years, the scale of the DCN has constantly increased with the widespread use of cloud-based services and the unprecedented amount of data delivered within and between data centers, whereas the traditional DCN architecture lacks the aggregate bandwidth, scalability, and cost effectiveness needed to cope with the increasing demands of tenants accessing the services of cloud data centers. Therefore, the design of a novel DCN architecture with the features of scalability, low cost, robustness, and energy conservation is required. This paper reviews recent research findings and technologies of DCN architectures to identify the issues in existing DCN architectures for cloud computing. We develop a taxonomy for the classification of current DCN architectures and qualitatively analyze the traditional and contemporary architectures. Moreover, the DCN architectures are compared on the basis of significant characteristics, such as bandwidth, fault tolerance, scalability, overhead, and deployment cost. Finally, we put forward open research issues in the deployment of scalable, low-cost, robust, and energy-efficient DCN architectures for data centers in computational clouds.

15.
Competitive enterprises have to react quickly and flexibly to an increasingly dynamic environment. Autonomous cooperating logistic processes appear to be an appropriate method for achieving the ability to adapt to these new requirements. In order to establish in which cases autonomously controlled processes are more advantageous than conventionally managed ones, it is essential to specify what exactly is meant by autonomous control, how autonomous control differs from conventional control, and how the achievement of logistic objectives in autonomously controlled systems can be estimated and compared with the achievement of objectives in conventionally controlled systems. This paper introduces a general definition of autonomous control, as well as a definition in the context of engineering science and its meaning in a logistics context. Based on this, a catalogue of criteria is developed to enable the identification of autonomous cooperating processes in logistic systems and their distinction from conventionally controlled processes. The catalogue, its criteria and the relevant properties are illustrated by means of an exemplary shop-floor scenario.

16.
Greenhouse gas inventories and emissions reduction programs require robust methods to quantify carbon sequestration in forests. We compare forest carbon estimates from Light Detection and Ranging (Lidar) data and QuickBird high-resolution satellite images, calibrated and validated by field measurements of individual trees. We conducted the tests at two sites in California: (1) 59 km² of secondary and old-growth coast redwood (Sequoia sempervirens) forest (Garcia-Mailliard area) and (2) 58 km² of old-growth Sierra Nevada forest (North Yuba area). Regression of aboveground live tree carbon density, calculated from field measurements, against Lidar height metrics and against QuickBird-derived tree crown diameter generated equations of carbon density as a function of the remote sensing parameters. Employing Monte Carlo methods, we quantified uncertainties of forest carbon estimates arising from uncertainties in field measurements, remote sensing accuracy, biomass regression equations, and spatial autocorrelation. Validation of QuickBird crown diameters against field measurements of the same trees showed significant correlation (r = 0.82, P < 0.05). Comparison of stand-level Lidar height metrics with field-derived Lorey's mean height showed significant correlation (Garcia-Mailliard r = 0.94, P < 0.0001; North Yuba r = 0.89, P < 0.0001). Field measurements of five aboveground carbon pools (live trees, dead trees, shrubs, coarse woody debris, and litter) yielded aboveground carbon densities (mean ± standard error without Monte Carlo) as high as 320 ± 35 Mg ha⁻¹ (old-growth coast redwood) and 510 ± 120 Mg ha⁻¹ (red fir [Abies magnifica] forest), as great as or greater than tropical rainforest. Lidar and QuickBird detected the aboveground carbon in live trees, 70–97% of the total. Large sample sizes in the Monte Carlo analyses of remote sensing data generated low estimates of uncertainty. Lidar showed lower uncertainty and higher accuracy than QuickBird, due to the high correlation of biomass with height and the undercounting of trees by the crown detection algorithm. Lidar achieved uncertainties of < 1%, providing estimates of aboveground live tree carbon density (mean ± 95% confidence interval with Monte Carlo) of 82 ± 0.7 Mg ha⁻¹ in Garcia-Mailliard and 140 ± 0.9 Mg ha⁻¹ in North Yuba. The method that we tested, combining field measurements, Lidar, and Monte Carlo, can produce robust wall-to-wall spatial data on forest carbon.
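The Monte Carlo step can be sketched generically: draw regression parameters from their assumed error distributions and propagate each draw to a landscape-mean carbon estimate. All numbers below are invented, not the paper's fitted coefficients:

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented height-to-carbon regression: carbon = a * height + b,
# each parameter with a (made-up) standard error.
a, a_se = 4.0, 0.3         # Mg/ha per metre of Lidar height
b, b_se = -10.0, 5.0       # Mg/ha intercept
heights = rng.uniform(20, 50, size=1000)   # Lidar mean height per plot (m)

draws = []
for _ in range(2000):
    ai = rng.normal(a, a_se)               # resample regression slope
    bi = rng.normal(b, b_se)               # resample intercept
    draws.append(np.mean(ai * heights + bi))   # landscape-mean carbon

mean = float(np.mean(draws))
ci = 1.96 * float(np.std(draws))
print(f"aboveground carbon: {mean:.0f} ± {ci:.0f} Mg/ha (95% CI)")
```

The paper's full analysis additionally propagates field-measurement error, sensor accuracy, and spatial autocorrelation; this sketch shows only the parameter-resampling core of the method.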

17.
This research had two aims: firstly, to examine the availability of emotional support in chat rooms, and secondly, to investigate openness and dishonesty in chat rooms. Three hundred and twenty respondents (160 women and 160 men) filled out the ‘Chat Room Survey’. It was found that people who spend more time in chat rooms were more likely to be open about themselves, to receive emotional support, and to give emotional support. Women were more likely than men to give emotional support. Men were more likely than women to lie, and were more likely to lie about their socio-economic status. In contrast, women were more likely than men to lie for safety reasons. This study challenges some past speculations about online relationships and argues that future research must give greater consideration to demographic details when examining interactions on the Internet.

18.

19.
Information and communication technology (ICT) pervades every aspect of our daily lives, supporting us in solving tasks and providing information. However, we face increasing complexity in ICT due to the interconnectedness and coupling of large-scale distributed systems. One particular challenge in this context is openness, i.e. systems and components are free to join and leave at any time, including those that are faulty or even malicious. In this article, we present a novel concept for mastering openness by detecting groups of similarly behaving systems in order to identify, and finally isolate, malicious elements. More precisely, we present a mechanism to cluster groups of systems at runtime and to estimate their contribution to the overall system utility. For evaluation and demonstration purposes, we use the Trusted Desktop Grid (TDG), where the system utility is the averaged speedup in job calculation for all benevolent participants. The TDG exhibits typical Organic Computing characteristics such as self-organisation, adaptive behaviour of heterogeneous entities, and openness. We show that our concept successfully identifies groups of systems with undesired behaviour, ranging from freeriding to colluding attacks.
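The clustering-by-behaviour idea can be sketched with a toy one-dimensional example: cluster participants by an invented cooperation-rate feature and flag the low-rate group. The article's actual runtime clustering and utility estimation are more elaborate; this only illustrates the principle:

```python
import numpy as np

rng = np.random.default_rng(0)
# Invented behaviour feature: share of accepted jobs actually computed.
benevolent = rng.normal(0.9, 0.03, size=40)
freeriders = rng.normal(0.1, 0.05, size=10)
rates = np.concatenate([benevolent, freeriders])

# 1-D k-means with k = 2, initialized at the extremes
centers = np.array([rates.min(), rates.max()])
for _ in range(20):
    labels = np.abs(rates[:, None] - centers[None, :]).argmin(axis=1)
    centers = np.array([rates[labels == k].mean() for k in range(2)])

suspicious = labels == centers.argmin()    # cluster with the lowest rate
print("flagged systems:", int(suspicious.sum()))
```

In an open system such a check would run repeatedly at runtime, since malicious or freeriding participants may join, leave, or change behaviour at any time.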

20.

Copyright © Beijing Qinyun Technology Development Co., Ltd. 京ICP备09084417号