Similar Documents
Found 20 similar documents (search time: 0 ms)
1.
Exposure control research with polytomous item pools has determined that randomization procedures can be very effective for controlling test security in computerized adaptive testing (CAT). The current study investigated the performance of four procedures for controlling item exposure in a CAT under the partial credit model. In addition to a no-exposure-control baseline condition, the Kingsbury-Zara, modified-within-.10-logits, Sympson-Hetter, and conditional Sympson-Hetter procedures were implemented to control exposure rates. The Kingsbury-Zara and modified-within-.10-logits procedures were implemented with 3- and 6-item-candidate conditions. The results show that the Kingsbury-Zara and modified-within-.10-logits procedures with 6 item candidates performed as well as the conditional Sympson-Hetter in terms of exposure rates, overlap rates, and pool utilization. These two procedures are strongly recommended for use with partial credit CATs due to their simplicity and the strength of their results.
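As a hedged illustration of how such randomization procedures work, the sketch below implements Kingsbury-Zara-style ("randomesque") selection: rather than always administering the single most informative item, one of the k best candidates is drawn at random. Item information is approximated here by the distance between item difficulty and the ability estimate; the pool layout and function name are illustrative, not taken from the study.

```python
import random

def select_item(pool, theta, k=6, rng=random):
    """Randomesque (Kingsbury-Zara-style) selection sketch.

    pool: list of (item_id, difficulty) pairs.
    Information is approximated by closeness of difficulty to theta;
    drawing uniformly from the k best candidates spreads item exposure.
    """
    ranked = sorted(pool, key=lambda it: abs(it[1] - theta))
    candidates = ranked[:k]          # the k most informative items
    return rng.choice(candidates)    # uniform draw controls exposure
```

With k=1 the procedure reduces to pure maximum-information selection; larger k trades a little measurement precision for lower exposure rates.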

2.
3.
The partial credit model (PCM) is commonly employed to parameterize items and individuals using responses to a set of polytomous items. Because the PCM does not include a discrimination parameter, it may encounter substantial lack of fit to the data in certain situations. To determine the impact of model misfit on the estimation of person and item parameters using the PCM, a simulation study was conducted in which data were generated according to the generalized partial credit model, and the bias and efficiency of the resulting person and item parameter estimates were assessed. The results suggest that small amounts of unsystematic misfit do not lead to dramatic levels of bias or loss of efficiency of the estimators, but large levels of unsystematic misfit and moderate levels of systematic misfit result in substantial loss of efficiency and bias of the estimators.
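The misfit scenario in this study can be made concrete with the category-probability function the two models share. The sketch below (illustrative names, not the study's code) computes generalized partial credit model probabilities; fixing the discrimination a at 1.0 yields the PCM used for estimation, while generating data with other values of a produces the kind of misfit the study examines.

```python
import math

def gpcm_probs(theta, thresholds, a=1.0):
    """Category response probabilities for the generalized partial
    credit model; with a = 1.0 this reduces to the PCM."""
    # cumulative sums of a * (theta - threshold) over categories
    z = [0.0]
    for b in thresholds:
        z.append(z[-1] + a * (theta - b))
    expz = [math.exp(v) for v in z]
    total = sum(expz)
    return [v / total for v in expz]
```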

4.
Past research on Computer Adaptive Testing (CAT) has focused almost exclusively on the use of binary items and minimizing the number of items to be administered. To address this situation, extensive computer simulations were performed using partial credit items with two, three, four, and five response categories. Other variables manipulated included the number of available items, the number of respondents used to calibrate the items, and various manipulations of respondents' true locations. Three item selection strategies were used, and the theoretically optimal Maximum Information method was compared to random item selection and Bayesian Maximum Falsification approaches. The Rasch partial credit model proved to be quite robust to various imperfections, and systematic distortions occurred mainly in the absence of sufficient numbers of items located near the trait or performance levels of interest. The findings further indicate that having small numbers of items is more problematic in practice than having small numbers of respondents to calibrate those items. Most importantly, increasing the number of response categories consistently improved CAT's efficiency as well as the general quality of the results. In fact, increasing the number of response categories proved to have a greater positive impact than did the choice of item selection method, as the Maximum Information approach performed only slightly better than the Maximum Falsification approach. Accordingly, issues related to the efficiency of item selection methods are far less important than is commonly suggested in the literature. However, being based on computer simulations only, the preceding presumes that actual respondents behave according to the Rasch model. CAT research could thus benefit from empirical studies aimed at determining whether, and if so how, selection strategies impact performance.

5.
This paper establishes the RGTrust credit model and, on that basis, studies methods for improving the credit system under existing online-trading modes. RGTrust introduces an efficient penalty factor aimed at thoroughly resolving the repeated-game "prisoner's dilemma" in merchants' online transactions: it forces any rational merchant contemplating dishonest behavior to weigh carefully the losses it may incur.

6.
Positive affect (PA) and negative affect (NA) are important constructs in health and well-being research. Good longitudinal measurement is crucial to conducting meaningful research on relationships between affect, health, and well-being across the lifespan. One common affect measure, the PANAS, has been evaluated thoroughly with factor analysis, but not with Rasch-based latent trait models (RLTMs) such as the Partial Credit Model (PCM), and not longitudinally. Current longitudinal RLTMs can computationally handle only a few occasions of data. The present study compares four methods of anchoring PCMs across 56 occasions to longitudinally evaluate the psychometric properties of the PANAS plus additional items. Anchoring item parameters on mean parameter values across occasions produced more desirable results than using no anchor, using first-occasion parameters as anchors, or allowing anchor values to vary across occasions. Results indicated problems with NA items, including poor category utilization, gaps in the item distribution, and a lack of easy-to-endorse items. PA items had much more desirable psychometric qualities.

7.
This paper reports on the use of simulation when a randomization procedure is used to control item exposure in a computerized adaptive test for certification. We present a method to determine the optimum width of the interval from which items are selected, and we report on the impact of relaxing the interval width on measurement precision and item exposure. Results indicate that, if the item bank is well targeted, it may be possible to widen the randomization interval and thus reduce item exposure without seriously impacting the measurement error for test takers whose ability estimate is near the pass point.
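A minimal sketch of such a randomization interval, under the assumption that an item is drawn uniformly from all those whose difficulty lies within a chosen width (in logits) of the current ability estimate; the names and the fallback behavior for an empty interval are illustrative:

```python
import random

def randomized_select(pool, theta, width, rng=random):
    """Draw uniformly among items with |difficulty - theta| <= width;
    fall back to the single closest item if the interval is empty.
    Widening `width` lowers exposure at some cost in information."""
    window = [it for it in pool if abs(it[1] - theta) <= width]
    if not window:
        return min(pool, key=lambda it: abs(it[1] - theta))
    return rng.choice(window)
```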

8.
The current study investigates the performance of two Rasch measurement programs and their parameter estimation on the linear logistic test model (LLTM; Fischer, 1973). These two programs, LinLog (Whitely & Nieh, 1981) and FACETS (Linacre, 2002), are used to investigate within-item complexity factors in a spatial memory measure. LinLog uses conditional maximum likelihood to estimate person and item parameters and is an LLTM-specific program. FACETS is usually reserved for the many-facet Rasch model (MFRM; Linacre, 1989); however, in the case of specifically designed within-item solution processes, a multifaceted approach makes good sense: each dimension within the item can be treated as a separate facet, just as if there were multiple raters for each item. Simulations of 500 and 1000 persons expand the original data set (114 persons) to better examine each estimation technique. The LinLog and FACETS analyses show strikingly similar results in both the simulated and original data conditions, indicating that the FACETS program produces accurate LLTM parameter estimates.

9.
There has been some discussion among researchers as to the benefits of using one calibration process over another during equating. Although the literature is rife with the pros and cons of the different methods, hardly any research has been done on anchoring (i.e., fixing item parameters to their predetermined values on an established scale), a method commonly used by psychometricians in large-scale assessments. This simulation research compares the fixed form of calibration with the concurrent method (where calibration of the different forms on the same scale is accomplished by a single run of the calibration process, treating all non-included items on the forms as missing or not reached), using the dichotomous Rasch (Rasch, 1960) and Rasch partial credit (Masters, 1982) models and the WINSTEPS (Linacre, 2003) computer program. Contrary to some researchers' contention that a concurrent run with larger n-counts for the common items would provide greater accuracy in the estimation of item parameters, the results of this paper indicate that the greater accuracy of one method over the other is confounded by sample size, the number of common items, etc., and there is no real benefit in using one method over the other in the calibration and equating of parallel test forms.

10.
In this paper, we present a fuzzy multiple criteria decision making (FMCDM) model known as fuzzy balancing and ranking. In contrast to other MCDM models, our proposed model does not require weights for the decision-making criteria. The model obtains the alternative rankings through a four-stage process. First, we appraise the performance of alternatives against criteria via linguistic variables expressed as triangular fuzzy numbers. Second, an outranking matrix is derived, indicating the frequency with which one alternative is superior to the other alternatives on each criterion. Third, the outranking matrix is triangularised to obtain an implicit pre-ordering, or provisional order, of alternatives. Fourth, the provisional order of alternatives is subjected to various screening and balancing operations that require sequential application of a balancing principle to the so-called advantages-disadvantages table, which combines the criteria with the pair-wise comparisons of alternatives. To demonstrate the procedural implementation of the proposed model and its effectiveness, we apply it to a case study on the problem of supplier selection.
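The second stage can be sketched as follows, assuming the fuzzy appraisals have already been defuzzified to crisp scores (e.g., by the centroid of each triangular fuzzy number); the function name and data layout are illustrative, not the authors' code:

```python
def outranking_matrix(scores):
    """Outranking-matrix sketch.

    scores[i][j]: defuzzified performance of alternative i on criterion j.
    M[i][k]: number of criteria on which alternative i strictly
    outperforms alternative k.
    """
    n = len(scores)
    m = len(scores[0])
    M = [[0] * n for _ in range(n)]
    for i in range(n):
        for k in range(n):
            if i != k:
                M[i][k] = sum(1 for j in range(m) if scores[i][j] > scores[k][j])
    return M
```

The triangularisation and balancing stages then operate on this matrix to produce the provisional and final orders.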

11.
12.
This paper discusses methods of selecting optimally efficient production plans to address aggregate production planning (APP) problems that involve fluctuations in product demand. Traditional single-objective-function approaches to this type of problem may prove complex or impractical in application. Thus, this paper begins by proposing a new model incorporating multiple production criteria, and then develops simplified production strategies via the new model. Multiple attribute decision making (MADM) approaches are used to select the most efficient APP strategy. A multiple-product, fixed-workforce example presented in Thompson et al. (1990) is modified to demonstrate the methodology. Moreover, different MADM approaches to the evaluation of these simplified strategies are compared, and sensitivity analysis is carried out on variations of the subjective weights assigned to the criteria.

13.
A continuous-time model using optimal control techniques is presented which implies that a scientist's productivity will eventually decline with age. This implication is at variance with Cole's empirical findings [1] but is consistent with Diamond's empirical findings [2].

14.
Development of fault detection and diagnosis (FDD) for chiller systems is very important for improving equipment reliability and saving energy. FDD performance depends strongly on the accuracy of the chiller models. Since that accuracy in turn depends on indefinite model parameters, normally chosen from experiments or experience, an accurate chiller model is difficult to build, and optimization of the model parameters is therefore very useful. This paper presents a new FDD strategy for centrifugal chillers in building air-conditioning systems that combines nonlinear least squares support vector regression (LSSVR) based on the differential evolution (DE) algorithm with exponentially weighted moving average (EWMA) control charts. In this strategy, the nonlinear LSSVR, a reformulation of the SVR model with better generalization performance, is adopted to develop the reference feature-parameter models of a typical nonlinear chiller system. The DE algorithm, a real-coded optimization algorithm with powerful global searching capacity, is employed to enhance the accuracy of the LSSVR models. The EWMA control charts are introduced to improve fault detection capability and to reduce Type II errors in a t-statistics-based way. Six typical chiller faults from the real-time experimental data of the ASHRAE RP-1043 project are chosen to validate the proposed FDD methods. Comprehensive comparisons between the proposed method and two similar previous studies show that the proposed method achieves significant improvements in accuracy and reliability, especially at low severity levels. The proposed DE-LSSVR-EWMA strategy is robust for fault detection and diagnosis in centrifugal chiller systems.
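The EWMA-chart component of such a strategy can be sketched as follows, applied to residuals between measured and model-predicted feature values; the parameter names and defaults (lam, L, sigma) are illustrative, not the values tuned in the paper:

```python
def ewma_chart(residuals, lam=0.2, L=3.0, sigma=1.0):
    """EWMA control-chart sketch over model residuals.

    Returns the EWMA series and per-point alarm flags; a point alarms
    when the EWMA statistic leaves the +/- L-sigma control limits,
    indicating a deviation from fault-free behavior.
    """
    z = 0.0
    series, alarms = [], []
    for t, r in enumerate(residuals, start=1):
        z = lam * r + (1 - lam) * z
        # time-varying control limit for the EWMA statistic
        limit = L * sigma * (lam / (2 - lam) * (1 - (1 - lam) ** (2 * t))) ** 0.5
        series.append(z)
        alarms.append(abs(z) > limit)
    return series, alarms
```

Because the EWMA statistic accumulates evidence over time, small sustained shifts (low-severity faults) trigger alarms that a point-by-point threshold would miss, which is how the chart reduces Type II errors.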

15.
H. G. Small, D. Crane. Scientometrics, 1979, 1(5-6): 445-461
The technique of co-citation cluster analysis is applied to a special three-year (1972-1974) file of the Social Sciences Citation Index. An algorithm is devised for identifying clusters which belong to a discipline, based on the percentage of source documents which appear in a disciplinary journal set. Clusters in three disciplines (economics, sociology and psychology) are identified using this algorithm. Clusters in a specialty of natural science (particle physics), obtained from the 1973 Science Citation Index, are compared and contrasted with the three groups of social science clusters. Certain common structural characteristics of the social science and natural science groups suggest that knowledge is developing in parts of the social science disciplines in a manner similar to the natural sciences. Prepared for presentation at the joint meeting of the Society for Social Studies of Science and the Research Committee on the Sociology of Science of the International Sociological Association, Cornell University, Ithaca, New York, November 4-6, 1976.
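The raw input to co-citation clustering can be sketched as follows: each citing (source) document contributes one co-citation to every unordered pair of references it cites, and pairs with sufficiently high counts are then grouped into clusters. This is a generic sketch of the co-citation count, not the authors' clustering algorithm:

```python
from collections import Counter
from itertools import combinations

def cocitation_counts(citing_papers):
    """Count co-citations: for each citing paper's reference list,
    every unordered pair of its references is co-cited once."""
    counts = Counter()
    for refs in citing_papers:
        for pair in combinations(sorted(set(refs)), 2):
            counts[pair] += 1
    return counts
```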

16.
Automatic exposure controls (AECs) used with computed radiography (CR) equipment need to be set for a constant signal level in the resultant images. The response varies with the energy of the X-ray beam in a different way from conventional film-screen combinations. Dose to the imaging receptor has been employed in adjustment of the AECs for varying exposure conditions for CR systems installed in hospitals in the west of Scotland; however, other parameters could potentially be applied. In this study, three quantities have been investigated for use in setting the AEC function: the exposure indicator defined by the CR manufacturer, dose to the image receptor, and image noise. Experiences gained in setting up the systems are described and results of a patient dose survey are reported.

17.
Rate control for multiview video coding (MVC) has not yet been studied in depth. After analyzing the shortcomings of the rate-distortion models used in existing video rate control and the characteristics of multiview video coding, this paper proposes an MVC rate-control algorithm based on a quadratic rate-distortion (R-D) model. The core of the algorithm is to first divide all pictures into six types of coded frames according to the structural relationship between disparity prediction and motion prediction, and to improve the quadratic rate-distortion model accordingly; bit allocation and rate control are then performed across views, at the frame layer, and at the basic-unit layer based on already-coded information. Simulation results show that, compared with the current JVT MVC scheme using fixed quantization parameters, the proposed algorithm effectively controls the bit rate of multiview video coding while maintaining high coding efficiency.
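The quadratic R-D model commonly written as R(Q) = X1*MAD/Q + X2*MAD/Q^2 (with MAD the mean absolute difference of the prediction residual) can be sketched as follows; given a frame's bit budget, the quantization step is the positive root of a quadratic. Function and parameter names are illustrative:

```python
def bits_from_q(q, mad, x1, x2):
    """Quadratic rate-distortion model: R(Q) = X1*MAD/Q + X2*MAD/Q^2."""
    return x1 * mad / q + x2 * mad / (q * q)

def quantizer_for_target(target_bits, mad, x1, x2):
    """Invert the model for a frame's bit budget: solve
    -R*Q^2 + X1*MAD*Q + X2*MAD = 0 and take the positive root."""
    a, b, c = -target_bits, x1 * mad, x2 * mad
    disc = b * b - 4 * a * c
    return (-b - disc ** 0.5) / (2 * a)
```

In a rate-control loop, the model coefficients x1 and x2 are re-fitted from already-coded frames of each frame type, which is where the six-way frame classification enters.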

18.
Stability of human gait is the ability to maintain upright posture during walking against external perturbations. It is a complex process determined by a number of cross-related factors, including gait trajectory, joint impedance and neural control strategies. Here, we consider a control strategy that can achieve stable steady-state periodic gait while maintaining joint flexibility with the lowest possible joint impedance. To this end, we carried out a simulation study of a heel-toe footed biped model with hip, knee and ankle joints and a heavy head-arms-trunk element, working in the sagittal plane. For simplicity, the model assumes a periodic desired joint angle trajectory and joint torques generated by a set of feed-forward and proportional-derivative feedback controllers, whereby the joint impedance is parametrized by the feedback gains. We could show that a desired steady-state gait accompanied by the desired joint angle trajectory can be established as a stable limit cycle (LC) for the feedback controller with an appropriate set of large feedback gains. Moreover, as the feedback gains are decreased for lowering the joint stiffness, stability of the LC is lost only in a few dimensions, while leaving the remaining large number of dimensions quite stable: this means that the LC becomes saddle-type, with a low-dimensional unstable manifold and a high-dimensional stable manifold. Remarkably, the unstable manifold remains of low dimensionality even when the feedback gains are decreased far below the instability point. We then developed an intermittent neural feedback controller that is activated only for short periods of time at an optimal phase of each gait stride. 
We characterized the robustness of this design by showing that it can better stabilize the unstable LC with small feedback gains, leading to a flexible gait, and in particular we demonstrated that such an intermittent controller performs better if it drives the state point to the stable manifold rather than directly to the LC. The proposed intermittent control strategy might have a high affinity for the inverted-pendulum analogy of biped gait, providing a dynamic view of how the step-to-step transition from one pendular stance to the next can be achieved stably and robustly by a well-timed neural intervention that exploits the stable modes embedded in the unstable dynamics.
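The torque law assumed for each joint in this kind of model can be sketched as a feed-forward term plus PD feedback around the desired trajectory, with the intermittent strategy modeled by switching the feedback off outside its activation window in each stride; names and signatures are illustrative:

```python
def joint_torque(theta, dtheta, theta_d, dtheta_d, tau_ff, kp, kd, active=True):
    """Feed-forward plus PD feedback torque for one joint.

    kp and kd parametrize the joint impedance; active=False mimics the
    intermittent strategy, which disables feedback outside a short
    window of each gait stride and relies on feed-forward alone.
    """
    fb = kp * (theta_d - theta) + kd * (dtheta_d - dtheta) if active else 0.0
    return tau_ff + fb
```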

19.
H. Wang, G.-S. Poo. Communications, IET, 2007, 1(4): 684-692
Load balancing in the provisioning of virtual private network (VPN) service in the hose model is studied. Single-path routing and tree routing for the hose model tend to aggregate bandwidth reservations on a small number of links, leading to congestion problems in service provider networks. If link capacity is depleted as a result of improper routing, all future non-VPN traffic will be blocked. We propose a novel multi-objective multi-path (MOMP) routing linear program with a maximum-fraction-of-traffic-on-a-path (MFTP) constraint to solve the problem. The MOMP routing algorithm reduces the bandwidth reservation on the most loaded link by as much as 50%, effectively alleviating potential congestion in service provider networks. The MFTP constraint guarantees the availability of multiple paths for each VPN endpoint pair, and further reduction of the bandwidth reservation can be achieved depending on the MFTP value.

20.