Similar Articles
 20 similar articles retrieved (search time: 31 ms)
1.
Game Transfer Phenomena (GTP) (i.e. altered perceptions, spontaneous thoughts and behaviors with game content) occur on a continuum from mild to severe. This study examined the differences between mild, moderate and severe levels of GTP. A total of 2281 gamers participated in an online survey. The majority of gamers experienced a mild level of GTP. The factors significantly associated with the severe level of GTP were: (i) being a student, (ii) being aged 18 to 22 years, (iii) being a professional gamer, (iv) playing videogames every day in sessions of 6 h or more, (v) playing to escape from the real world, (vi) having a sleep disorder, mental disorder or reported dysfunctional gaming, and (vii) having experienced distress or dysfunction due to GTP. In addition, having used drugs and experiencing flashbacks as side-effects of drug use were significantly less likely to be reported by those with a mild level of GTP. In a regression analysis, predictors of severe GTP included positive appraisals of GTP, distress or dysfunction due to GTP, and a tendency to recall dreams. In general, the findings suggest that those with a severe level of GTP share characteristics with profiles of gamers with dysfunctional gaming (e.g., problematic and/or addictive gaming).

2.
The technology acceptance scale (TAS) by van der Laan, Heino, and De Waard (1997) measures the psychological construct of the same term as a sum of attitudes of an operator toward a specific complex sociotechnical system. The TAS has been claimed to comprise two subscales, usefulness and satisfaction. However, recent empirical work has found evidence for only one underlying factor. To provide further insight into the factor structure of the TAS, this study adopts a Bayesian exploratory factor analysis (BEFA) to analyse the data of a flight simulation study regarding single pilot operations. A series of Markov chain Monte Carlo (MCMC) models is used to assess the latent factor structure of the TAS for the two different crewing conditions and their corresponding workstation and cockpit setups of the copilot. A reliable step-by-step data analysis of the MCMC models provides evidence for a one-factor solution of the scale. The divergence from previous studies claiming two factors may be due to the different applications, as well as to different statistical paradigms and methodological issues in exploratory factor analysis.

3.
It has long been thought that an optical sensor, such as a light-waveguide-implemented total analysis system (TAS), is one of the functional components that will be needed to realize a “ubiquitous human healthcare system” in the near future. We have already proposed the fundamental structure for a light waveguide capable of illuminating a living cell or particle running along a microfluidic channel, as well as of detecting fluorescence even from the extremely weak power of such a minute particle. In order to develop novel functions to detect the internal structure of living cells quickly, an angular scanning method that sequentially changes the direction of illumination of the minute cell or particle may be crucial. In this paper, we investigate fluorescence detection from moving particles by switching the laser power delivery path of plural light waveguides as a preliminary experiment toward this novel method. To construct an experimental system able to incorporate a switching light source mechanism cost effectively, we utilized a conventional TAS chip with plural waveguide pairs arranged in parallel, and a forced vibration mechanism on an optical fiber tip driven by a piezoelectric actuator. With this system, we performed an experiment to detect extremely weak fluorescence using micro particles with a fluorescent substance attached and an optical TAS chip that incorporated a microfluidic channel and three pairs of laser-power-delivering light waveguide cores. We successfully obtained clear, quasi-triangular-shaped pulses in fluorescent signals from resin particles running across the intersection under three different conditions: (1) a particle with approximately the same velocity as that of the forced-vibrated optical fiber tip of approximately 700 mm/s, (2) a particle with velocity an order of magnitude smaller than that of the optical fiber tip, and (3) a particle with velocity approximately 1/20 that of the optical fiber tip.

4.
5.
The purpose of this study was to develop a scale for measuring teachers' perceptions towards ICTs in the teaching-learning process in the classroom. The sample of the study consisted of volunteering Turkish teachers (n = 200). This study developed a new scale for measuring teachers' perceptions towards ICTs in the teaching-learning process. In order to test the validity of the scale, exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) were carried out. As a result of the EFA, the scale consisted of three factors with 25 items: attitude, usage, and belief. There were also positive correlations amongst the three factors of the scale. The Cronbach's alpha reliability coefficient was found to be 0.92 and the Spearman-Brown split-half correlation 0.85. The reliability coefficients of the individual factors of the scale ranged between 0.72 and 0.88. Lastly, as a result of the CFA, the obtained values (Δχ2 (n = 200)/df = 4.85/3; GFI = 0.96; AGFI = 0.94; RMSEA = 0.026; CFI = 0.97; TLI = 0.98) confirmed the three-factor structure of the scale.
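As a rough sketch of the two reliability statistics this entry reports, the following computes Cronbach's alpha and a Spearman-Brown-corrected split-half coefficient. The respondent data is simulated for illustration only; it is not the study's data.

```python
# Hedged sketch: Cronbach's alpha and a Spearman-Brown split-half
# coefficient for a multi-item scale. The 100 x 6 data matrix below is
# simulated (a shared "true score" plus item noise), not the study's data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

def split_half(items: np.ndarray) -> float:
    """Odd/even item split, corrected with the Spearman-Brown prophecy formula."""
    a = items[:, 0::2].sum(axis=1)
    b = items[:, 1::2].sum(axis=1)
    r = np.corrcoef(a, b)[0, 1]
    return 2 * r / (1 + r)

rng = np.random.default_rng(0)
true_score = rng.normal(size=(100, 1))
data = true_score + 0.7 * rng.normal(size=(100, 6))  # 6 correlated items
print(round(cronbach_alpha(data), 2), round(split_half(data), 2))
```

With strongly correlated items, both coefficients come out high, in the same range as the 0.92 and 0.85 the abstract reports.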

6.
Addressing shortcomings in traditional information-center construction, this paper proposes an optimized virtualization implementation scheme for the information center based on the vSphere architecture. Compared with other virtualization schemes, this one is optimized mainly in two respects: reliability and storage-device I/O performance. System reliability is improved across three aspects (servers, network, and storage devices); the main factors affecting storage-device I/O performance are analyzed, and system I/O is optimized with respect to the use of multipath I/O, the write-back cache policy, disk I/O block size, and server performance. Finally, tests of the complete system verify the advantages of the scheme.

7.
This paper describes in detail a digital audio-effect adjustment algorithm and its software implementation on the TAS3103A. The system uses TI's 48-bit DSP TAS3103A, which provides independent channel equalization for 3 channels with 12 bands each.
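Multiband channel equalizers of the kind described here are typically built from cascaded peaking ("bell") biquad filters, one per band. As a hedged illustration (the TAS3103A's actual firmware coefficients are not shown in the abstract), the sketch below computes peaking-EQ biquad coefficients using the widely used Audio EQ Cookbook formulas and checks the gain at the center frequency:

```python
# Hedged sketch: one band of a peaking equalizer as a biquad filter,
# using the standard Audio EQ Cookbook coefficient formulas. This is a
# generic illustration, not the TAS3103A's internal implementation.
import cmath
import math

def peaking_eq(fs, f0, q, gain_db):
    """Return unnormalized (b, a) biquad coefficients for a peaking EQ band."""
    A = 10 ** (gain_db / 40)
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b = [1 + alpha * A, -2 * math.cos(w0), 1 - alpha * A]
    a = [1 + alpha / A, -2 * math.cos(w0), 1 - alpha / A]
    return b, a

def gain_at(b, a, fs, f):
    """Magnitude response in dB of the biquad at frequency f."""
    z = cmath.exp(-2j * math.pi * f / fs)
    num = b[0] + b[1] * z + b[2] * z * z
    den = a[0] + a[1] * z + a[2] * z * z
    return 20 * math.log10(abs(num / den))

b, a = peaking_eq(48000, 1000, 1.0, 6.0)     # +6 dB bell centered at 1 kHz
print(round(gain_at(b, a, 48000, 1000), 2))  # 6.0 dB at the center frequency
```

A 12-band-per-channel equalizer would simply cascade twelve such biquads, each with its own center frequency, Q, and gain.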

8.
A cross-sectional survey of female office workers (n=333) was undertaken to determine the level of neck pain and disability (Neck Disability Index, NDI) and to explore the relationship between individual and workplace risk factors and both the NDI score and the presence of pain. Workers reported nil (32%), mild (53%), moderate (14%) and severe (1%) neck pain. More risk factors were associated with the NDI score than with the presence of neck pain. In the multiple regression model, the presence of neck pain was associated with a history of neck trauma (OR: 4.8), using a graduated lens (OR: 4.6), and negative affectivity (OR: 2.7). Factors associated with a higher NDI score were using the computer mouse for more than 6 h per day, higher negative affectivity, older age and an uncomfortable workstation. These results suggest that measuring the level of neck pain and disability rather than just the presence of neck pain provides more specific directives for the prevention and management of this disorder.
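The odds ratios (ORs) reported above come from logistic regression, where a coefficient β corresponds to an odds ratio of exp(β). As a hedged illustration with invented proportions (not the study's data), an OR close to the reported 4.8 for neck trauma could arise as follows:

```python
# Hedged sketch: how an odds ratio relates to group proportions and to a
# logistic-regression coefficient. The 60% / 24% figures are invented for
# illustration; they are not taken from the survey.
import math

def odds(p: float) -> float:
    return p / (1 - p)

def odds_ratio_from_beta(beta: float) -> float:
    """A logistic coefficient beta maps to an odds ratio of exp(beta)."""
    return math.exp(beta)

# If, hypothetically, 60% of workers with prior neck trauma reported pain
# versus 24% of those without, the sample odds ratio would be:
or_trauma = odds(0.60) / odds(0.24)
print(round(or_trauma, 2))  # 4.75, close to the reported OR of 4.8
```

The same quantity is recovered from a fitted model as `exp(beta)` for the trauma indicator's coefficient.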

9.
Design of experiments (DOE) techniques are useful for improving the reliability (or quality) of a product. The main task of a DOE is to select the significant factors that affect product reliability (or quality); the significant factors can then be set at the levels which lead to reliability improvement. One of the basic assumptions of DOEs is that the (logged) observations at each run follow a normal distribution. In practice, normal and extreme value distributions are much alike and may both fit the data at hand well, yet their predictions can differ significantly. A well-known assertion states: “moderate departures from normality are of little concern in the fixed effects analysis of variance” [Montgomery, D. C. (1997). Design & analysis of experiments (4th ed.). New York: Wiley]. The main purpose of the present paper is to evaluate this assertion by investigating the impact of mis-specification between normal and extreme value distributions on the precision of selecting significant factors for a screening experiment. For each of these two distributions, the probabilities of correct and incorrect selection under correct specification and mis-specification are computed. The results indicate that for both normal and extreme value distributions, the selection precision is significantly influenced by mis-specification. An example is used to illustrate the proposed method. Finally, some numerical results are provided to evaluate the impacts of mis-specification on the selection precision for the screening experiment. The numerical results indicate that for both distributions, the smaller the main effect and the sample size, the greater the impact of mis-specification. Surprisingly, this seems to violate the assertion stated above.
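The core of the paper's point can be seen in a small numeric sketch: a normal and an extreme value (Gumbel) distribution matched in mean and variance look similar in the body but disagree sharply in the tails, which is where selection decisions about significant effects are made. The parameters below are illustrative, not taken from the paper:

```python
# Hedged sketch: match a max-type Gumbel distribution to a normal in mean
# and variance, then compare upper-tail probabilities. Gumbel(mu, b) has
# mean mu + gamma*b and variance (pi*b)^2 / 6.
import math

gamma = 0.5772156649015329   # Euler-Mascheroni constant
b = 1.0                      # Gumbel scale (illustrative)
mu = -gamma * b              # location chosen so the mean is 0
sd = math.pi * b / math.sqrt(6)   # matched standard deviation

def gumbel_sf(x):
    """P(X > x) for a max-type Gumbel(mu, b)."""
    return 1 - math.exp(-math.exp(-(x - mu) / b))

def normal_sf(x):
    """P(X > x) for N(0, sd^2)."""
    return 0.5 * math.erfc(x / (sd * math.sqrt(2)))

x = 3 * sd   # three matched standard deviations into the upper tail
print(gumbel_sf(x), normal_sf(x))   # the Gumbel tail is markedly heavier
```

At three standard deviations, the Gumbel tail probability is several times the normal one, so a selection rule calibrated under the wrong family misjudges how often large effects arise by chance.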

10.
The aim of this study was to quantify the precision of manual video digitization of three typical industrial tasks, as evaluated by the comparison of four cumulative kinetic parameters at the L4/L5 intervertebral joint: compression, joint shear, reaction shear and moment. Ten observers were recruited (five male and five female), with an undergraduate background in human anatomy. On each of three test days, each observer digitized five repeats of each of three typical industrial lifting tasks of 5 to 6 s in duration. A rigid link segment model that incorporated a single muscle equivalent model was used to calculate the cumulative loading based on the digitized coordinates. Inter-observer reliability was assessed using a mixed model ANOVA, and no significant differences were found to result from observer, gender, day or trial. Intraclass correlation coefficients (ICC) were calculated within each task to quantify intra-observer reliability. Overall, the ICCs were excellent (>0.75), with the exception of moderate values for reaction shear for Tasks 2 and 3. Compression and moment demonstrated the highest reliability of the four parameters studied, which is beneficial from an ergonomic standpoint, as compression is the most commonly used parameter for job assessments. This study demonstrated manual video digitization to be a reliable tool for the quantification of cumulative spinal loading, both within a given observer, and across days, trials and observers.
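As a hedged sketch of the reliability statistic used above, the following computes a one-way random-effects intraclass correlation, ICC(1,1), where values above 0.75 are conventionally read as "excellent". The ratings matrix is invented (rows are digitized trials, columns are repeated measurements); it is not the study's data, and the study's exact ICC form is not specified in the abstract:

```python
# Hedged sketch: one-way random-effects ICC(1,1) from the one-way ANOVA
# mean squares. The 4 x 3 ratings matrix is invented for illustration.
import numpy as np

def icc_1_1(ratings: np.ndarray) -> float:
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    msb = k * ((row_means - grand) ** 2).sum() / (n - 1)               # between targets
    msw = ((ratings - row_means[:, None]) ** 2).sum() / (n * (k - 1))  # within targets
    return (msb - msw) / (msb + (k - 1) * msw)

ratings = np.array([[9.0, 9.2, 9.1],
                    [7.8, 8.0, 7.9],
                    [6.1, 6.0, 6.2],
                    [8.5, 8.4, 8.6]])
print(round(icc_1_1(ratings), 3))   # close to 1: highly consistent measurements
```

Because repeated measurements of each trial vary far less than the trials differ from one another, the ICC here lands near 1, the "excellent" regime the study reports for compression and moment.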

11.
Land-use classification and monitoring of the Yellow River Delta based on remote sensing (RS)
Using a combination of remote sensing image-processing methods (tasseled cap transform, unsupervised classification, supervised classification, and post-classification processing), the land-use types of the Yellow River Delta were divided into nine classes, and saline-alkali land was further graded into four levels: lightly, moderately, and heavily saline-alkali land, plus bare ground. Accuracy assessment shows that this combined classification method considerably improves classification accuracy.

12.
Shin  K.G. 《Computer》1991,24(5):25-35
The design, implementation, and evaluation of a distributed real-time architecture called HARTS (hexagonal architecture for real-time systems) are discussed, emphasizing its support of time-constrained, fault-tolerant communications and I/O (input/output) requirements. HARTS consists of shared-memory multiprocessor nodes, interconnected by a wrapped hexagonal mesh. This architecture is intended to meet three main requirements of real-time computing: high performance, high reliability, and extensive I/O. The high-level and low-level architecture is described. The evaluation of HARTS, using modeling and simulation with actual parameters derived from its implementation, is reported. Fault-tolerant routing, clock synchronization and the I/O architecture are examined.

13.
Based on the factors that influence torpedo position error, a fault tree is constructed with failure of the torpedo position-error test as the top event, and qualitative and quantitative reliability analyses of the position error are performed. From the criticality importance computed in the importance analysis, the degree to which each factor affects the position-error system is obtained, identifying the weak links of the system and the order in which faults should be repaired. Controlling the reliability of these key components improves the reliability of the system, and thereby the reliability of the torpedo position-error test.
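As a hedged sketch of the importance measure mentioned above, the following computes criticality importance for basic events under a single top-level OR gate (any basic event failing causes the top event). The failure probabilities are invented; the paper's actual fault tree and gate structure are not given in the abstract:

```python
# Hedged sketch: Birnbaum and criticality importance for a fault tree
# whose top event is an OR of independent basic events. The probabilities
# q are invented for illustration.
from math import prod

def top_event_prob(q):
    """OR gate: the top event occurs unless every basic event survives."""
    return 1 - prod(1 - qi for qi in q)

def criticality_importance(q):
    Q = top_event_prob(q)
    # Birnbaum importance of event i: dQ/dq_i = product over the others.
    birnbaum = [prod(1 - qj for j, qj in enumerate(q) if j != i)
                for i in range(len(q))]
    # Criticality importance scales Birnbaum by q_i / Q.
    return [b * qi / Q for b, qi in zip(birnbaum, q)]

q = [0.05, 0.01, 0.20]               # hypothetical basic-event probabilities
ranking = criticality_importance(q)
print([round(v, 3) for v in ranking])  # the q = 0.20 event dominates
```

Ranking components by this measure yields exactly the kind of maintenance ordering the entry describes: the highest-criticality event is the weak link to address first.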

14.
Approximately 1% of children are born with a moderate to severe congenital heart defect, and half of them undergo one or more surgeries to fix it. SURGEM, a solid modeling environment, is used to improve surgical outcome by allowing the surgeon to design the geometry for several possible surgical options before the operation and to evaluate their relative merits using computational fluid simulation. We describe here the solid modeling and graphical user interface challenges that we have encountered while developing support for three surgeries: (1) repair of double-outlet right ventricle, which adds a graft wall within the cardiac chambers to split the solid model of the unique ventricle, (2) the Fontan procedure, which routes a graft tube to connect the inferior vena cava to the pulmonary arteries, and (3) stenosis repair, which adds a stent to expand a constricted artery. We describe several solutions that we have developed to address these challenges and to improve the performance, reliability, and usability of SURGEM for these tasks.

15.
This study focuses on one of the most critical issues of modeling under severe conditions of uncertainty: determining the relative importance (weight) of the explanatory variables. The ability to determine the relative importance of explanatory variables, and the reliability of such outcomes, are of utmost importance to decision makers who utilize such models as components of decision support or decision making. We compare the reliability of the traditional method of multiple linear regression versus fuzzy logic-based soft regression, and provide a case study (a cross-national model of background factors facilitating economic growth) to illustrate the performance of both methods. We conclude that soft regression is a more reliable and consistent tool for determining the relative importance of explanatory variables.

16.
Software quality is important for the success of any information systems (IS). In this research, we find the determinants of software quality. We used five attributes for software quality: system reliability, maintainability, ease of use, usefulness, and relevance. By surveying 112 IS project managers, we collected data about their perceptions on the software quality attributes and their determinants. We arrived at six factors through exploratory factor analysis. We determined the individual factors that impacted the software quality attributes; for example, reliability is associated with responsiveness of IS department; ease of use is influenced by the capabilities of users and attitude of management; and usefulness is impacted by capabilities of IS department and responsiveness of IS department. We show that organizational factors are more important than technical factors in impacting software quality in IS projects. We provide implications of our research to practice and to future research.

17.
The performance shaping factors (PSFs) of the standardized plant analysis of risk-human reliability analysis (SPAR-H) method are unclearly defined, which contributes to the uncertainty of human reliability analysis (HRA) in nuclear power plants (NPPs). This work proposes an expert-based modification approach for redefining the PSFs based on four criteria: less overlap, hierarchy, flexibility, and digitalization. For demonstration, the proposed approach is used to assign PSFs to three specific human failure events in NPPs. Three tests (Kendall's W Test, Jonckheere-Terpstra Test, and Paired Samples Test) are applied to analyze the assignments. Compared to the PSF assignment of SPAR-H, the results show that the redefined PSFs meet the four criteria and reduce the overestimation of human error probabilities (HEPs).
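For context on why PSF definitions matter numerically, the sketch below shows the standard SPAR-H quantification step as commonly described: a nominal HEP is scaled by the product of PSF multipliers, with an adjustment factor applied when three or more PSFs are negative so the result stays below 1. This is a generic illustration from the published method, not the modified approach of this paper, and the multiplier values are invented:

```python
# Hedged sketch of SPAR-H quantification: HEP = NHEP * composite PSF,
# with the adjustment factor NHEP*C / (NHEP*(C-1) + 1) when three or more
# PSFs are negative (multiplier > 1). Multiplier values are illustrative.
def spar_h_hep(nhep, multipliers):
    composite = 1.0
    for m in multipliers:
        composite *= m
    negative = sum(1 for m in multipliers if m > 1)
    if negative >= 3:  # adjustment keeps the probability bounded below 1
        return nhep * composite / (nhep * (composite - 1) + 1)
    return nhep * composite

# Hypothetical diagnosis task (nominal HEP 0.01) with three degraded PSFs:
hep = spar_h_hep(0.01, [10, 2, 5])
print(round(hep, 4))
```

Because the composite multiplier is driven directly by how each PSF level is defined and assigned, overlapping or ambiguous PSF definitions inflate the composite and hence the HEP, which is the overestimation the redefined PSFs aim to reduce.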

18.
《Ergonomics》2012,55(11):788-797

19.
With the arrival of the big-data era, data storage faces severe challenges. To address the high redundancy and inadequate load balancing of the traditional Hadoop Distributed File System (HDFS), a Cauchy-code-based dynamic distributed storage optimization strategy, CDDS, is proposed. For each data block in the system, a storage scheme is generated according to its access heat while guaranteeing data availability. Cold data and hot data are stored using Cauchy-code-based erasure coding with single-copy and multi-copy storage respectively, ensuring both data reliability and system I/O capability. Tests show that the storage space required by this strategy falls to 75% of the original, while system reliability and load-balancing capability are also enhanced.
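The storage saving comes from basic overhead arithmetic: default HDFS replication stores every block three times, while a (k, m) erasure code stores (k + m)/k times the data yet still tolerates the loss of any m blocks. The k and m below are illustrative, not the paper's configuration:

```python
# Hedged sketch of the storage arithmetic behind strategies like CDDS:
# bytes stored per byte of user data under replication vs. a (k, m)
# Cauchy Reed-Solomon erasure code. Parameters are illustrative.
def replication_overhead(replicas: int) -> float:
    """n-way replication stores n copies; tolerates n-1 lost copies."""
    return float(replicas)

def erasure_overhead(k: int, m: int) -> float:
    """k data blocks + m parity blocks; tolerates any m lost blocks."""
    return (k + m) / k

rep = replication_overhead(3)   # HDFS default: 3.0x
ec = erasure_overhead(6, 3)     # e.g. 1.5x with equal or better loss tolerance
print(ec / rep)                 # 0.5: half the raw storage for cold data
```

Mixing erasure-coded cold data with replicated hot data, as CDDS does, yields an overall footprint between these extremes, consistent with the reported reduction to 75% of the original storage.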

20.
Process energy analysis and optimization in selective laser sintering
Additive manufacturing (AM) processes are increasingly being used to manufacture complex precision parts for the automotive, aerospace and medical industries. One of the popular AM processes is the selective laser sintering (SLS) process which manufactures parts by sintering metallic, polymeric and ceramic powder under the effect of laser power. The laser energy expenditure of SLS process and its correlation to the geometry of the manufactured part and the SLS process parameters, however, have not received much attention from AM/SLS researchers. This paper presents a mathematical analysis of the laser energy required for manufacturing simple parts using the SLS process. The total energy expended is calculated as a function of the total area of sintering (TAS) using a convex hull based approach and is correlated to the part geometry, slice thickness and the build orientation. The TAS and laser energy are calculated for three sample parts and the results are provided in the paper. Finally, an optimization model is presented which computes the minimal TAS and energy required for manufacturing a part using the SLS process.
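The convex-hull idea above can be sketched concretely: approximate the total area of sintering (TAS) by summing, over slices, the area of the convex hull of each slice's cross-section points. The slice geometry below is invented, and the paper's actual model additionally ties energy to process parameters not shown here:

```python
# Hedged sketch: TAS as the sum of per-slice convex-hull areas, using
# Andrew's monotone chain for the hull and the shoelace formula for area.
# Slice point sets are invented for illustration.
def convex_hull(pts):
    """Andrew's monotone chain; returns hull vertices in order."""
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def half(points):
        h = []
        for p in points:
            while len(h) >= 2 and ((h[-1][0] - h[-2][0]) * (p[1] - h[-2][1]) -
                                   (h[-1][1] - h[-2][1]) * (p[0] - h[-2][0])) <= 0:
                h.pop()
            h.append(p)
        return h[:-1]
    return half(pts) + half(pts[::-1])   # lower hull + upper hull

def polygon_area(hull):
    """Shoelace formula over the ordered hull vertices."""
    return abs(sum(x0 * y1 - x1 * y0
                   for (x0, y0), (x1, y1) in zip(hull, hull[1:] + hull[:1]))) / 2

slices = [[(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)],  # interior point ignored
          [(0, 0), (1, 0), (0, 1)]]
tas = sum(polygon_area(convex_hull(s)) for s in slices)
print(tas)   # 4.0 + 0.5 = 4.5 area units
```

Multiplying each slice's hull area by an energy density per unit area (and accounting for slice thickness and orientation, as the paper does) then gives a total laser-energy estimate.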


Copyright©北京勤云科技发展有限公司  京ICP备09084417号