Similar Documents
Found 20 similar documents.
1.
H. Colonius (1995) agreed with the fundamental tenets of the instance theory of automaticity. His article addressed the mathematical development of the theory, pointing out an error in one of two arguments that G. D. Logan (1988, 1992) used to justify the choice of the Weibull as the distribution of retrieval times and suggesting an alternative argument that places different emphasis on the power function speedup and the Weibull distribution. This article attempts to clarify the problematic argument, point out some practical limitations on H. Colonius's (1995) alternative argument, and suggest important future directions for the mathematical development of the theory. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

2.
It is shown that in order to derive the Weibull shape for the response time distribution in the instance theory of automaticity (G. D. Logan, 1988) by an asymptotic argument from the theory of extreme value statistics, it is necessary to determine the domain of attraction of the underlying parent distribution. An alternative, nonasymptotic characterization property equivalent to the power law of practice is presented here that gives a more feasible justification for the choice of the Weibull. This result leads to a different emphasis on the empirical conditions testing the theory. Some problems arising from the use of the asymptotic theory of extreme value statistics for the stochastic modeling of behavioral data are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
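The nonasymptotic route rests on a min-stability property of the Weibull; the following sketch (generic notation, not Colonius's own symbols) shows why it is equivalent to the power law of practice. For a Weibull with shape c and scale s,

```latex
\[
  \Pr(T > t) = \exp\!\bigl[-(t/s)^{c}\bigr], \qquad t \ge 0,
\]
\[
  \Pr\Bigl(\min_{i \le n} T_i > t\Bigr)
    = \exp\!\bigl[-n\,(t/s)^{c}\bigr]
    = \Pr\bigl(n^{-1/c}\,T_1 > t\bigr),
\]
\[
  \mathbb{E}\Bigl[\min_{i \le n} T_i\Bigr] = s\,n^{-1/c}\,\Gamma(1 + 1/c).
\]
```

Every quantile, and hence both the mean and the standard deviation, shrinks by the same factor n^(-1/c): a power law of practice with exponent 1/c, with no appeal to an asymptotic extreme-value limit.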

3.
This article presents a theory in which automatization is construed as the acquisition of a domain-specific knowledge base, formed of separate representations, instances, of each exposure to the task. Processing is considered automatic if it relies on retrieval of stored instances, which will occur only after practice in a consistent environment. Practice is important because it increases the amount retrieved and the speed of retrieval; consistency is important because it ensures that the retrieved instances will be useful. The theory accounts quantitatively for the power-function speed-up and predicts a power-function reduction in the standard deviation that is constrained to have the same exponent as the power function for the speed-up. The theory accounts for qualitative properties as well, explaining how some may disappear and others appear with practice. More generally, it provides an alternative to the modal view of automaticity. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
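A minimal simulation sketch of the race among stored instances; the Weibull shape and scale values are illustrative assumptions, not Logan's fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
c, s = 2.0, 1.0  # assumed Weibull shape and scale of a single retrieval time

print(" n    mean      sd")
for n in [1, 2, 4, 8, 16, 32]:        # instances stored after n practice trials
    # Each simulated trial's RT is the fastest of n independent retrievals.
    rts = s * rng.weibull(c, size=(200_000, n)).min(axis=1)
    print(f"{n:2d}  {rts.mean():.4f}  {rts.std():.4f}")
```

Both columns fall as n^(-1/c), i.e., power functions with the same exponent, which is the theory's joint prediction for means and standard deviations.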

4.
A reanalysis of the numerosity judgment data described in T. J. Palmeri (see record 1997-03378-004) showed that the mean latency exhibits clear deviations from the power function, as predicted by the component power laws (CMPL) theory of strategy shifting (T. C. Rickard, 1997). The variance of the latency systematically increases and then decreases with practice for large numerosities, a result that is also consistent with the CMPL theory. Neither of these results is predicted by existing versions of either the exemplar-based random walk or the instance theories. These findings suggest that numerosity judgment, like other skills, reflects one-at-a-time rather than concurrent execution of algorithmic and memory-retrieval strategies. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

5.
In a recent reanalysis of numerosity judgment data from T. J. Palmeri (see record 1997-03378-004), T. C. Rickard (see record 1999-00904-017) found that mean response times did not decrease as a pure power law of practice and standard deviations systematically increased and then decreased with practice in some conditions. Rickard argued that these results were consistent with the component power laws (CMPL) theory of strategy shifting (Rickard, 1997), but were inconsistent with instance theory (G. D. Logan, 1988) and the exemplar-based random walk (EBRW) model (R. M. Nosofsky & Palmeri, 1997). In this article, the author demonstrates how a slightly more complex power function fitted the numerosity data nearly as well as the CMPL function, and how race models, such as instance theory and the EBRW, can predict deviations from a pure power law of practice and can predict nonmonotonic changes in standard deviations with practice. Potential limitations of CMPL are also discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
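As an illustration of the kind of model comparison at stake, the sketch below fits a pure power function and one slightly more complex variant (an offset for prior practice, a common choice); the synthetic data and every parameter value are assumptions, not Palmeri's:

```python
import numpy as np
from scipy.optimize import curve_fit

def pure_power(n, a, b, c):
    return a + b * n ** (-c)

def offset_power(n, a, b, c, e):
    # adds e "prior practice" trials; one common more-complex power form
    return a + b * (n + e) ** (-c)

trials = np.arange(1.0, 201.0)
rng = np.random.default_rng(1)
rt = 500 + 1500 * (trials + 20) ** (-0.8) + rng.normal(0, 5, trials.size)

for fn, p0 in [(pure_power, (500, 1500, 0.8)),
               (offset_power, (500, 1500, 0.8, 1.0))]:
    params, _ = curve_fit(fn, trials, rt, p0=p0, maxfev=20_000)
    sse = np.sum((fn(trials, *params) - rt) ** 2)
    print(f"{fn.__name__}: SSE = {sse:.1f}")
```

Comparing residual error across nested functional forms is the core of the argument that apparent deviations from a pure power law need not rule out race models.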

6.
The shift with practice from the use of generic, multistep problem-solving strategies to fast and relatively effortless memory-based strategies was explored in 2 experiments using pseudoarithmetic tasks. A complete transition to the memory strategy occurred by about the 60th exposure to each problem. The power law of practice did not hold in the overall data for either the mean or the standard deviation of response latency, but it did hold within each strategy (algorithm or retrieval). Learning was highly specific to the practiced problems. These results constitute the 1st clear demonstration of a skill for which the power law does not apply overall. The results do not support the instance theory of automatization (G. D. Logan, 1988) but are consistent with an alternative component power laws (CMPL) theory, which assumes that because of intrinsic attentional limitations, only 1 strategy can be executed at a time. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

7.
The nonlocal generalization of Weibull theory previously developed for structures that are either notched or fail only after the formation of a large crack is extended to predict the probability of failure of unnotched structures that reach the maximum load before a large crack forms, as is typical of the test of modulus of rupture (flexural strength). The probability of material failure at a material point is assumed to be a power function (characterized by the Weibull modulus and scaling parameter) of the average stress in the neighborhood of that point, the size of which is the material characteristic length. This indirectly imposes a spatial correlation. The model describes the deterministic size effect, which is caused by stress redistribution due to strain softening in the boundary layer of cracking with the associated energy release. As a basic check of soundness, it is proposed that for quasibrittle structures much larger than the fracture process zone or the characteristic length of material, the probabilistic model of failure must asymptotically reduce to Weibull theory with the weakest link model. The present theory satisfies this condition, but the classical stochastic finite-element models do not, which renders the use of these models for calculating loads of very small failure probabilities dubious. Numerical applications and comparisons to test results are left for Part II.
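For reference, the failure probability the abstract describes can be written, in generic notation (not necessarily the authors' exact symbols), as:

```latex
\[
  P_f \;=\; 1 - \exp\!\left[ -\int_V
      \left\langle \frac{\bar{\sigma}(\mathbf{x})}{s_0} \right\rangle^{m}
      \frac{\mathrm{d}V(\mathbf{x})}{V_0} \right],
\]
```

where m is the Weibull modulus, s_0 the scaling parameter, V_0 a reference volume, ⟨·⟩ the positive part, and σ̄(x) the stress averaged over a neighborhood of the material characteristic length around x. Replacing σ̄ by the local stress recovers the classical Weibull weakest-link model, which is the large-size limit the soundness check requires.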

8.
Race models of visual search make specific predictions about the nature of the speedup in search performance as the number of stimulus features cuing the spatial location of a target increases. First, both the reaction time (RT) mean and standard deviation must decrease as a power function of the number of stimulus dimensions specifying a target location. Second, the rate parameter of these two power functions must be equivalent. Finally, the shape of the entire RT distribution must be determined by the same factors that determine the shape of the power functions. Two experiments are presented that provide strong support for these predictions. (PsycINFO Database Record (c) 2011 APA, all rights reserved)

9.
Weibull statistical fracture theory for the fracture of ceramics
The Weibull statistical fracture theory is widely applied to the fracture of ceramic materials. The foundations of the Weibull theory for brittle fracture are reviewed. This theory predicts that brittle fracture strength is a function of size, stress distribution, and stress state. Experimental multiaxial loading results for Al2O3 tubes are compared to the stress state predictions of the Weibull theory. For the most part, the Weibull theory yields reasonable predictions, although there may be some difficulties in dealing with shear stress effects on fracture. This paper is based on a presentation made at the symposium “Stochastic Aspects of Fracture” held at the 1986 annual AIME meeting in New Orleans, LA, on March 2-6, 1986, under the auspices of the ASM/MSD Flow and Fracture Committee.

10.
The degradation mechanism of SiC(SCS-6)/Super α2 composite due to the interfacial reaction was studied using single-fiber composite specimens fabricated by the sputtering method, heat treated at 1273 K for various times, and tensile tested at room temperature. The main results are summarized as follows. (1) The tensile strength was reduced as the interfacial reaction progressed, owing to the defects formed on the fiber surface; the formation of reaction layers on the matrix side was not the direct cause of the reduction. (2) From a fracture-mechanics analysis of the experimentally observed relation of the size and shape of the surface defects to the fiber strength, the fracture toughness of the fiber employed in the present work was estimated to be 2 to 4 MPa√m. (3) The change in the strength distribution of the reacted fiber with progressing reaction was simulated successfully by combining the Monte Carlo method with the Weibull distribution function for the strength of the unreacted fiber, the Gumbel distribution function for the maximum effective size of the surface defect of the reacted fiber, and fracture mechanics.
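A Monte Carlo sketch of the simulation scheme described in (3); every numerical parameter here is an illustrative assumption, not a value from the study:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
m, s0 = 5.0, 4000.0    # assumed Weibull shape/scale of unreacted fiber strength (MPa)
mu, beta = 2.0, 0.5    # assumed Gumbel location/scale of max defect depth (micrometres)
kic, Y = 3.0, 1.12     # assumed fracture toughness (MPa*sqrt(m)) and geometry factor

sigma_unreacted = s0 * rng.weibull(m, n)             # intrinsic fiber strength
a = np.maximum(rng.gumbel(mu, beta, n), 0.1) * 1e-6  # largest surface defect, metres
sigma_defect = kic / (Y * np.sqrt(np.pi * a))        # strength set by that defect
sigma_reacted = np.minimum(sigma_unreacted, sigma_defect)
print(np.percentile(sigma_reacted, [10, 50, 90]))    # strength distribution after reaction
```

Each simulated fiber fails at whichever is lower: its intrinsic Weibull strength or the fracture-mechanics strength implied by its largest (Gumbel-distributed) surface defect.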

11.
Effects of exemplar similarity on the development of automaticity were investigated with a task in which participants judged the numerosity of random patterns of between 6 and 11 dots. After several days of training, response times were the same at all levels of numerosity, signaling the development of automaticity. In Experiment 1, response times to new patterns were a function of their similarity to old patterns. In Experiment 2, responses to patterns with high within-category similarity became automatized more quickly than responses to patterns with low within-category similarity. In Experiment 3, responses to patterns with high between-category similarity became automatized more slowly than responses to patterns with low between-category similarity. A new theory, the exemplar-based random walk (EBRW) model, was used to explain the results. Combining elements of G. D. Logan's (1988) instance theory of automaticity and R. M. Nosofsky's (1986) generalized context model of categorization, the theory embeds a dynamic similarity-based memory retrieval mechanism within a competitive random walk decision process.

12.
The function obtained in prothetic psychophysical scaling depends in part on the measure selected to represent physical stimulus magnitude. Stimulus measurement is usually open to alternatives. Choice of stimulus scale can determine (a) the size of an exponent in a power function; (b) the size of the constant k in a logarithmic (Fechnerian) function; (c) whether the general shape of a psychophysical function will be a power, logarithmic, or some other type; (d) the size of the Weber ratio and degree of constancy it exhibits; and (e) interdimensional correlation between power function exponents and stimulus dynamic range. A variety of alternative stimulus measures from the literature are described and their consequences discussed. (1? p ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)
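The effect on the exponent in (a) follows directly from the algebra of power functions (a generic illustration, not the article's own notation):

```latex
\[
  S = k\,\varphi^{\,a}, \qquad \varphi = \psi^{\,r}
  \;\;\Longrightarrow\;\;
  S = k\,\psi^{\,ar},
\]
```

so re-expressing the stimulus in a power-transformed unit multiplies the fitted exponent by r. For example, because sound intensity is proportional to the square of sound pressure, a loudness exponent of about 0.3 re intensity becomes about 0.6 re pressure for the very same data.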

13.
Load distribution is the foundation of shape and gauge control in a tandem cold mill (TCM), and the mill's shape-control ability must be taken into account to achieve strip shape and gauge quality. First, an objective function for generalized shape-and-gauge-decoupling load distribution optimization was established; it considers the rolling-force characteristics of the first and last stands, the relative power of the stands, and the shape-control ability of the TCM. An immune genetic algorithm (IGA) was then used to solve this multi-objective load distribution optimization. Simulation and comparison against the practical load distribution strategy of one tandem cold mill showed that the IGA-based generalized shape-and-gauge-decoupling optimization improves shape control and gauge control simultaneously.
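A toy sketch of what such a multi-objective load-distribution cost might look like for a 5-stand mill; the decision variable (per-stand reduction), the sub-objectives, and the weights are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def load_distribution_cost(reductions, w=(1.0, 1.0, 0.5)):
    """Weighted cost over per-stand reduction fractions of a tandem cold mill."""
    # (1) rolling-force characteristic: keep the first and last stands lightly loaded
    edge_penalty = reductions[0] ** 2 + reductions[-1] ** 2
    # (2) relative power: balance the load across stands
    power_balance = np.var(reductions)
    # (3) shape-control proxy: penalize large stand-to-stand jumps
    shape_term = np.abs(np.diff(reductions)).sum()
    return w[0] * edge_penalty + w[1] * power_balance + w[2] * shape_term

print(load_distribution_cost(np.array([0.25, 0.30, 0.30, 0.25, 0.15])))
```

In the paper's setting, a genetic-style optimizer such as an IGA would search over candidate reduction schedules to minimize a cost of this general kind.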

14.
A new theory proposes that sensation grows as a polynomial function of physical intensity. The theory reproduces all of the published data without error. The degree of the polynomial is independent of all experimental manipulations that affect the power-function exponent, except the number of stimuli. The polynomial law always provides a superior fit to the data and should be used to determine the status of experimental methods and as a validity criterion for testing substantive theories. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

15.
Because causal relations are neither observable nor deducible, they must be induced from observable events. The 2 dominant approaches to the psychology of causal induction—the covariation approach and the causal power approach—are each crippled by fundamental problems. This article proposes an integration of these approaches that overcomes these problems. The proposal is that reasoners innately treat the relation between covariation (a function defined in terms of observable events) and causal power (an unobservable entity) as that between scientists' law or model and their theory explaining the model. This solution is formalized in the power PC theory, a causal power theory of the probabilistic contrast model (P. W. Cheng & L. R. Novick, 1990). The article reviews diverse old and new empirical tests discriminating this theory from previous models, none of which is justified by a theory. The results uniquely support the power PC theory. (PsycINFO Database Record (c) 2011 APA, all rights reserved)
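The theory's central quantity for a generative cause has a simple closed form: covariation (ΔP) rescaled by the room left for the candidate cause to show its effect. A minimal sketch:

```python
def generative_causal_power(p_e_given_c: float, p_e_given_not_c: float) -> float:
    """Power PC estimate of a generative cause's power: delta-P divided by
    the proportion of cases in which the effect does not already occur
    when the candidate cause is absent."""
    delta_p = p_e_given_c - p_e_given_not_c
    return delta_p / (1.0 - p_e_given_not_c)

# Effect on 75% of cause-present trials vs. 50% of cause-absent trials:
print(generative_causal_power(0.75, 0.50))  # 0.5
```

The rescaling is what lets the theory treat covariation as evidence about an unobservable causal power rather than as the power itself.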

16.
The relationship between the size of a familiar object and the distances at which it is imaged is examined in three experiments. The distance at which an imaged object overflows the visual field is linearly related to object size, a result consistent with the size–distance invariance hypothesis (Kosslyn, 1980). The distance at which an object is initially imaged, first-sight distance, is related to the object size by a power function with an exponent less than 1. In addition, time required to scan from the first-sight to the overflow distance increases as a function of the difference between the two distance estimates. The distance at which an imaged object becomes too small to be identified, vanishing point distance, is related to object size by a power function with an exponent less than 1. This result does not support predictions made from the size–distance invariance hypothesis or Kosslyn's model of visual imagery. Implications for a theory of visual imagery and memory are discussed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

17.
The growth of the subjective reward magnitude of medial forebrain bundle stimulation in the rat (Sprague-Dawley) as a function of train duration and pulse frequency was measured in 2 ways: (1) a titration method, which used differences in rate of reward on 2 levers to compensate for differences in the magnitude of the rewards; and (2) a direct method, in which the ratio of the reward magnitudes at the 2 levers was assumed to be given by the ratio of times spent on each lever. The results of the 2 methods agree. Reward magnitude grows as approximately a power function of train duration up to train durations of about 1 sec, then declines somewhat over the interval from 2–20 sec. The exponent of growth varies from 0.4 to 2.3. With stronger stimulation (higher pulse frequency), peak reward magnitude is bigger, but the saturating train duration is approximately the same. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

18.
Investigated predictions of the network-interference theory in the retrieval of multiplication facts in 2 experiments involving 27 undergraduates (aged 19–30 yrs). Ss were tested with 36 interference or practice multiplication problems. Results indicate that retrieval of a product via one problem increased the probability that this product would be retrieved in error to another problem later. The effect of retrieval priming on error patterns was problem-specific, occurring when there were relatively strong associations between a problem and the false answers that were primed. Correct retrieval times for specific problems were increased or decreased by manipulating the activation levels of strong false associates through correct retrievals via other problems. It is argued that single-digit multiplication problems access a network structure of candidate answers and that different problems can overlap in the network substructures they activate. A model of network interference is proposed in which activation is distributed among candidate answers as a multiplicative function of associative and node strengths. Predictions derived from the hypothesis that adults' production of simple number combinations is governed substantially by procedural operations were not confirmed. (40 ref) (PsycINFO Database Record (c) 2010 APA, all rights reserved)
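A minimal sketch of the retrieval rule the abstract describes: activation as the product of associative and node strengths, normalized over candidate answers. The multiplicative form is from the abstract; the strengths below are made-up numbers:

```python
def retrieval_probabilities(assoc, node):
    """Distribute activation over candidate answers multiplicatively and
    normalize to retrieval probabilities."""
    acts = [a * s for a, s in zip(assoc, node)]
    total = sum(acts)
    return [x / total for x in acts]

# Candidate answers to 6 x 7 with hypothetical strengths: 42 (correct), 48, 36
print(retrieval_probabilities(assoc=[0.8, 0.3, 0.2], node=[1.0, 0.9, 0.7]))
```

Priming a false associate raises its node strength, which on this rule both increases its error probability and slows correct retrieval, matching the reported pattern.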

19.
The deformation of rolled copper and aluminum sheets has been studied at a constant load in the elastic range. Anisotropy in their elastic aftereffect has been detected. The time-dependent part of the deformation is found to be best described by a power function of time with a fractional exponent. A mathematical model is proposed to describe the elastic aftereffect of a metal using the fractal concepts of deformation. A strictly exponential time dependence is shown to transform into an anomalous dependence when a continuous relaxation-period distribution transforms into a fractal relaxation-period distribution during elastic aftereffect in fcc metals. The exponent of the power dependence of the strain on the time determines the fractal dimension of relaxation during the elastic aftereffect.
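In generic symbols, the contrast drawn in the abstract is between a single-relaxation-time exponential and the observed fractional power law:

```latex
\[
  \epsilon(t) = \epsilon_\infty \bigl(1 - e^{-t/\tau}\bigr)
  \quad\text{vs.}\quad
  \epsilon(t) = \epsilon_0 + A\,t^{\alpha}, \qquad 0 < \alpha < 1,
\]
```

where, on the authors' account, the fractional exponent α fixes the fractal dimension of the relaxation-period distribution.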

20.
High fatigue strength is one of the important factors that facilitate the industrial usage of bulk metallic glasses (BMGs). Fatigue data were analyzed using the Weibull probability models for BMGs produced with different casting processes in order to study the reliability of fatigue strengths of cast glassy alloys. The fatigue data of tilt-cast and high-pressure-cast BMGs can be explained by a three-parameter Weibull cumulative distribution function (cdf) and a mixture model of two-parameter Weibull cdfs, respectively. We conclude that the cast defects, which reduce fatigue strength, should be eliminated in order to realize a high reliability of fatigue strengths.
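The two fitted forms named in the abstract are easy to write down; a sketch with generic parameter names (no values from the study):

```python
import numpy as np

def weibull3_cdf(x, shape, scale, loc):
    """Three-parameter Weibull cdf; loc acts as a fatigue-strength threshold."""
    z = np.clip((np.asarray(x, float) - loc) / scale, 0.0, None)
    return 1.0 - np.exp(-z ** shape)

def weibull_mixture_cdf(x, w, shape1, scale1, shape2, scale2):
    """Mixture of two two-parameter Weibull cdfs with mixing weight w."""
    x = np.asarray(x, float)
    c1 = 1.0 - np.exp(-(x / scale1) ** shape1)
    c2 = 1.0 - np.exp(-(x / scale2) ** shape2)
    return w * c1 + (1.0 - w) * c2
```

A positive threshold (loc > 0) suits a population with a true lower limit of fatigue strength, as reported for the tilt-cast alloys, while the mixture captures two coexisting failure-origin populations, consistent with the reading of the high-pressure-cast data.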
