1.
2.
Existing empirical studies on test-driven development (TDD) report different conclusions about its effects on quality and productivity. Very few of those studies are experiments conducted with software professionals in industry. We aim to analyse the effects of TDD on the external quality of the work done and the productivity of developers in an industrial setting. We conducted an experiment with 24 professionals from three different sites of a software organization. We chose a repeated-measures design and asked subjects to implement TDD and incremental test-last development (ITLD) in two simple tasks and a realistic application close to real-life complexity. To analyse our findings, we applied a repeated-measures general linear model procedure and a linear mixed-effects procedure. We did not observe a statistically significant difference between the quality of the work done by subjects under the two treatments. We observed that subjects are more productive when they apply TDD to a simple task than when they apply ITLD, but productivity drops significantly when TDD is applied to a complex brownfield task; task complexity thus significantly obscured the effect of TDD. Further evidence is necessary to conclude whether TDD is better or worse than ITLD in terms of external quality and productivity in an industrial setting. We found that experimental factors such as the selection of tasks can dominate the findings in TDD studies.
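Because each subject in a repeated-measures design is measured under both treatments, the simplest form of the comparison is a paired test on per-subject scores. The sketch below uses hypothetical productivity numbers and a hand-rolled paired t-statistic as a stand-in for the study's general linear model and mixed-effects procedures:

```python
import math

def paired_t(x, y):
    """Paired t-test statistic for repeated-measures data:
    each subject contributes one score under each treatment."""
    assert len(x) == len(y)
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)
    se = math.sqrt(var / n)
    return mean / se, n - 1  # t statistic, degrees of freedom

# hypothetical per-subject productivity scores (not the study's data)
tdd  = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7]
itld = [3.6, 3.9, 4.2, 4.0, 3.5, 4.1]
t, df = paired_t(tdd, itld)
```

A full treatment would also model site and task as factors, which calls for a mixed-effects package rather than this minimal statistic.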
3.
This work represents the first attempt to develop a sensory system, specifically designed for the characterization of wines, that combines three sensory modalities: an array of gas sensors, an array of electrochemical liquid sensors, and an optical system that measures color in CIELab coordinates. This new analytical tool, which has been called an "electronic panel," includes not only sensors but also the hardware (injection system and electronics) and software necessary for fusing information from the three modules. Each of the three sensory modalities (volatiles, liquids, and color) has been designed, tested, and optimized separately. The discrimination capabilities of the system have been evaluated on a database consisting of six red Spanish wines prepared from the same grape variety (tempranillo) but differing in geographic origin and aging stage. Sensor signals from each module have been combined and analyzed using pattern-recognition techniques. The results of this work show that the discrimination capabilities of the system are significantly improved when signals from each module are combined to form a multimodal feature vector.
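Feature-level fusion of this kind can be sketched as follows: per-sample feature vectors from the gas, liquid, and color modules are concatenated into one multimodal vector, with per-feature standardization so that no modality's scale dominates the pattern-recognition stage. All readings below are hypothetical placeholders, not data from the wine database:

```python
def zscore_columns(rows):
    """Standardize each feature column to zero mean, unit variance
    so that no sensing modality dominates the fused vector."""
    cols = list(zip(*rows))
    out_cols = []
    for col in cols:
        m = sum(col) / len(col)
        sd = (sum((v - m) ** 2 for v in col) / len(col)) ** 0.5 or 1.0
        out_cols.append([(v - m) / sd for v in col])
    return [list(r) for r in zip(*out_cols)]

def fuse(gas, liquid, color):
    """Feature-level fusion: one multimodal vector per wine sample."""
    return [g + l + c for g, l, c in zip(gas, liquid, color)]

# hypothetical readings for 3 wine samples
gas    = [[0.8, 1.2], [0.6, 1.0], [0.9, 1.4]]    # gas-sensor features
liquid = [[2.1, 0.3], [2.4, 0.2], [2.0, 0.5]]    # electrochemical features
color  = [[55.0, 12.0, 8.0], [60.0, 10.0, 6.0], [52.0, 14.0, 9.0]]  # CIELab L*, a*, b*
fused = zscore_columns(fuse(gas, liquid, color))
```

Each fused row can then be fed to any classifier; the paper's point is that this concatenated representation discriminates the wines better than any single module alone.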
4.
In no science or engineering discipline does it make sense to speak of isolated experiments. The results of a single experiment cannot be viewed as representative of the underlying reality. Experiment replication is the repetition of an experiment to double-check its results. Multiple replications of an experiment increase the confidence in its results. Software engineering has tried its hand at the identical (exact) replication of experiments in the way of the natural sciences (physics, chemistry, etc.). After numerous attempts over the years, apart from experiments replicated by the same researchers at the same site, no exact replications have yet been achieved. One key reason for this is the complexity of the software development setting, which prevents the many experimental conditions from being identically reproduced. This paper reports research into whether non-exact replications can be of any use. We propose a process aimed at researchers running non-exact replications. Researchers enacting this process will be able to identify new variables that are possibly having an effect on experiment results. The process consists of four phases: replication definition and planning, replication operation and analysis, replication interpretation, and analysis of the replication’s contribution. To test the effectiveness of the proposed process, we have conducted a multiple-case study, revealing the variables learned from two different replications of an experiment.
5.
We propose a dynamic reconfigurable wavelength-division-multiplexed (WDM) millimeter-waveband (mm-waveband) radio-over-fiber (RoF) access network and demonstrate, for the first time, dynamic channel allocation of mm-waveband optical RoF signals in a WDM access network using a supercontinuum light source, arrayed-waveguide gratings, and a reconfigurable optical cross-connect switch. The dynamic reconfigurable RoF network architecture is presented, and its features are described. Then, four 155-Mb/s RoF channels are effectively generated, transmitted through 25 km of fiber, switched, transmitted again through 2 km of fiber, and detected with error-free operation (bit error rate < 10⁻¹⁰). The proposed RoF architecture is highly scalable, both in terms of channel and access-point counts.
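The "error-free" criterion (bit error rate below 10⁻¹⁰) can be related to the receiver decision quality through the standard Gaussian-noise approximation BER = ½·erfc(Q/√2). This is a generic textbook relation, not a computation from the paper, and the Q threshold of about 6.4 is an assumption of this sketch:

```python
import math

def ber_from_q(q):
    """Gaussian-noise approximation relating receiver Q factor to
    bit error rate: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2))

# a receiver Q factor of roughly 6.4 or better meets the
# BER < 1e-10 "error-free" threshold under this approximation
ber = ber_from_q(6.4)
```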
6.
Associative and statistical theories of causal and predictive learning make opposite predictions for situations in which the most recent information contradicts the information provided by older trials (e.g., acquisition followed by extinction). Associative theories predict that people will rely on the most recent information to best adapt their behavior to the changing environment. Statistical theories predict that people will integrate what they have learned in the two phases. The results of this study showed one effect or the other as a function of response mode (trial by trial vs. global), type of question (contiguity, causality, or predictiveness), and postacquisition instructions. That is, participants are able to give either an integrative judgment or a judgment that relies on recent information, depending on test demands. The authors conclude that any model must allow for flexible use of information once it has been acquired.
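The contrast between the two theory families can be sketched numerically: a Rescorla-Wagner-style associative update weights recent trials most heavily, so extinction drives its final judgment toward zero, while a ΔP contingency computed over all trials integrates acquisition and extinction. The trial sequence and learning rate below are illustrative choices, not the study's materials:

```python
def rescorla_wagner(trials, alpha=0.3):
    """Trial-by-trial associative strength; later trials dominate,
    so extinction pulls the final value toward zero (recency)."""
    v = 0.0
    for cue, outcome in trials:
        if cue:
            v += alpha * (outcome - v)
    return v

def delta_p(trials):
    """Contingency integrated over ALL trials: P(O|C) - P(O|~C)."""
    cue = [o for c, o in trials if c]
    no_cue = [o for c, o in trials if not c]
    p_o_cue = sum(cue) / len(cue)
    p_o_nocue = sum(no_cue) / len(no_cue) if no_cue else 0.0
    return p_o_cue - p_o_nocue

# acquisition (cue -> outcome), then extinction (cue -> no outcome),
# plus no-cue/no-outcome filler trials
trials = [(1, 1)] * 10 + [(1, 0)] * 10 + [(0, 0)] * 10
```

On this sequence the associative value ends near zero while ΔP stays at 0.5, mirroring the recency-based versus integrative judgments the study elicits with different test demands.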
7.
This work reports the effects of thermal treatment on the mineralogy and texture of paper sludge, with a view to its use as a pozzolanic material. To this end, its chemical and mineralogical composition was studied, as well as its morphology, by XRD, SEM and EDX. The initial sludge was treated at 700, 750 and 800 °C for 2 and 5 h. It was observed that the initial kaolinite becomes metakaolinite, and that the pozzolanic activity of a paper sludge treated at 700 °C for 2 h is comparable to that of a commercial metakaolinite. Upon dehydroxylation, kaolinite is converted into amorphous metakaolinite. At the temperatures mentioned above, the calcite present in the initial sludge remains active. It is concluded that the pozzolanic activity of metakaolinite is strongly related to the crystallinity of the original kaolinite: well-ordered kaolinite is transformed into more reactive metakaolinite.
8.
9.
Harmonic/noise ratio and spectrographic analysis in vocal abuse pathology
To evaluate the use of dual-energy X-ray absorptiometry (DXA) in multiple myeloma (MM), we performed a prospective study of 34 patients with newly diagnosed MM. Most patients had advanced disease, and all but two had osteolytic bone lesions and/or pathological fractures. Bone mineral content (BMC) and bone mineral density (BMD) of the lumbar spine (L1-L4) and hip were measured using a Hologic QDR-1000 scanner. Collapsed vertebrae were not excluded from analysis. Data from 289 healthy Danish volunteers aged 21-79 yr were used for calculation of Z-scores. Lumbar spine BMC (Z-score -0.46 +/- 0.23, p = 0.05) and lumbar spine BMD (Z-score -0.56 +/- 0.23, p = 0.02) were significantly reduced in MM patients, whereas no reduction was seen in hip BMC or BMD. Collapsed vertebrae had markedly reduced BMD (Z-score -1.34 +/- 0.22, p < 0.001), as had non-fractured vertebrae in the same individuals (Z-score -1.42 +/- 0.25, p < 0.001). Lumbar spine BMD correlated with radiologically assessed bone morbidity (r = -0.37, p = 0.03) and more strongly with the incidence of vertebral fractures (r = -0.64, p < 0.001). Thus, osteopenia of the back is common in multiple myeloma and correlates with an increased incidence of fractures. DXA may identify subjects with increased risk of vertebral fractures for more intensive chemotherapeutic or anti-resorptive treatment.
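The Z-scores above express each measurement in standard-deviation units relative to an age-matched reference mean, here derived from the 289 healthy volunteers. A minimal sketch, with hypothetical lumbar-spine values rather than patient data:

```python
def z_score(measured_bmd, ref_mean, ref_sd):
    """Express a BMD measurement in SD units relative to an
    age-matched reference population (Z-score)."""
    return (measured_bmd - ref_mean) / ref_sd

# hypothetical lumbar-spine BMD values in g/cm^2 (not patient data);
# a negative Z-score indicates density below the age-matched mean
z = z_score(0.85, 1.02, 0.12)
```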
10.
The finite-difference time-domain (FDTD) method is extended to include magnetized ferrites. The treatment of the ferrite material is based on the equation of motion of the magnetization vector. Magnetic losses are also included in the equation of motion by means of Gilbert's approximation of the phenomenological Landau-Lifshitz damping term. The discretization scheme is based on central finite differences and linear interpolation, which allows the fully explicit nature of the FDTD method to be maintained. This extension of the FDTD method to magnetized ferrites is applied to the full-wave analysis of ferrite-loaded waveguides. The dispersion curves are calculated using a recently proposed 2D-FDTD formulation for dispersion analysis, adapted to the present problem. The results for both the phase and attenuation constants of various transversely and longitudinally magnetized ferrite-loaded waveguides are compared with the exact values and with those obtained by means of Schelkunoff's method.
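The magnetization equation of motion referenced above, with Gilbert's damping approximation, is conventionally written as follows (a standard statement of the Landau-Lifshitz-Gilbert form, not copied from the paper; γ is the gyromagnetic ratio, α the dimensionless damping constant, and M_s the saturation magnetization):

```latex
\frac{\partial \mathbf{M}}{\partial t}
  = -\gamma \left( \mathbf{M} \times \mathbf{H} \right)
  + \frac{\alpha}{M_s} \left( \mathbf{M} \times \frac{\partial \mathbf{M}}{\partial t} \right)
```

The implicit ∂M/∂t on the right-hand side is what Gilbert's form introduces; the FDTD update must solve for the new magnetization at each time step, which the scheme described above handles with central differences and linear interpolation.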