1.
Research on achievement goal orientation in sport has relied primarily on traditional statistical methodology to examine group mean differences. Unfortunately, examination of the measurement model is generally ignored prior to means testing. This study reports an application of structural equation modeling (SEM) in testing measurement invariance and latent mean structure of the Task and Ego Orientation in Sport Questionnaire (TEOSQ; Duda & Nicholls, 1989) using male and female college students. A confirmatory factor analysis for testing invariance revealed invariant measurement properties and factor structures across gender, indicating that task and ego orientation are similarly conceptualized by male and female students. Subsequent testing of latent mean structures, however, showed significant gender differences with respect to ego orientation, but no difference in task orientation. The SEM procedures used in the present study demonstrate additional construct validity and internal consistency reliability for the TEOSQ and, by confirming its factor structure, provide a sound psychometric basis for its continued use in substantive studies focusing on the comparison of achievement goal orientation across gender.
2.
As heterogeneous data from different sources are being increasingly linked, it becomes difficult for users to understand how the data are connected, to identify what means are suitable to analyze a given data set, or to find out how to proceed for a given analysis task. We target this challenge with a new model-driven design process that effectively codesigns aspects of data, view, analytics, and tasks. We achieve this by using the workflow of the analysis task as a trajectory through data, interactive views, and analytical processes. The benefits for the analysis session go well beyond the pure selection of appropriate data sets and range from providing orientation or even guidance along a preferred analysis path to a potential overall speedup, allowing data to be fetched ahead of time. We illustrate the design process for a biomedical use case that aims at determining a treatment plan for cancer patients from the visual analysis of a large, heterogeneous clinical data pool. As an example for how to apply the comprehensive design approach, we present Stack'n'flip, a sample implementation which tightly integrates visualizations of the actual data with a map of available data sets, views, and tasks, thus capturing and communicating the analytical workflow through the required data sets.
3.
The secondary diamine 1,3,5,7-tetrahydro[1,2c:4,5c']benzodipyrrole (3) and 1,2,4,5-tetrabromomethylbenzene (1) form a polymeric ionene with spirane structure through a repetitive alkylation reaction. The structure of the product could be proven by 13C NMR spectroscopy by comparison with suitable reference compounds. Solutions in aqueous methanol exhibit a typical polyelectrolyte effect. Variation of the counterions produces sufficient solubility in organic solvents. From the crystal structure of a similar model compound one can conclude that the synthesized polymer has a rod-like shape.
4.
Polyepichlorohydrin (PECH) rubbers were found to toughen epoxy resins based on the diglycidyl ether of bisphenol A (DGEBA) and cured with piperidine. The degree of toughening depends on the molecular weight of the PECH and on the curing temperature. Best toughening was achieved with PECH of the highest nominal molecular weight of 3400 (Hydrin 10 × 2). Hydrin 10 × 1 (nominal molecular weight 1700) did not toughen the epoxy resin unless bisphenol A was also added, whereas Hydrin 10 × 2 toughened it in the absence of bisphenol A. Curing resins containing bisphenol A and Hydrin 10 × 1 at 160°C resulted in a slightly more brittle resin than when cured at 120°C. The effect of PECH rubbers on the Tg, modulus, and hot/wet properties is similar to that of carboxy-terminated butadiene-acrylonitrile rubbers (CTBN). Dynamic mechanical thermal analysis (DMTA) and scanning electron micrographs (SEM) of fractured surfaces show that the PECH separates as a discrete phase during curing. © 1993 John Wiley & Sons, Inc.
5.
The magnetic properties of non-oriented magnetic steel are specified at a frequency of 50 or 60 Hz. The effect of increasing the frequency of the magnetizing field on the specific total loss, the field strength, and the specific apparent power is demonstrated. For non-oriented steel at 50 Hz, the total loss comprises mainly hysteresis loss, and the eddy-current loss of most grades amounts to less than 40%. Because the eddy-current loss increases approximately with the square of frequency, at 400 Hz the total loss is determined by the resistivity. For excitations up to a peak polarization of 1.6 T, increasing sheet thickness results in an increasing field strength at 400 Hz. While at 50 Hz low-alloyed grades have a lower field strength above 1.3 T, at 400 Hz the field strength of the high-alloyed grades is more favourable up to 1.4 T. With respect to the apparent power, the square of which is proportional to the copper losses, the higher-alloyed grades have an advantage up to 1.5 T.
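The frequency behaviour described above follows from a two-term loss separation: hysteresis loss grows roughly linearly with frequency, eddy-current loss roughly with its square, so the eddy-current share that is minor at 50 Hz dominates at 400 Hz. A minimal sketch of that scaling; the coefficients `k_h` and `k_e` are illustrative assumptions, not values from the paper:

```python
def core_loss(f_hz, b_peak_t, k_h=0.02, k_e=8e-5):
    """Two-term loss-separation sketch (coefficients are hypothetical):
    hysteresis loss scales ~f, eddy-current loss ~f^2, both ~B^2."""
    p_hyst = k_h * f_hz * b_peak_t ** 2
    p_eddy = k_e * f_hz ** 2 * b_peak_t ** 2
    return p_hyst, p_eddy

# At 50 Hz the eddy-current share is small; at 400 Hz it dominates.
for f in (50, 400):
    ph, pe = core_loss(f, 1.5)
    print(f"{f} Hz: hysteresis {ph:.2f} W/kg, eddy-current {pe:.2f} W/kg")
```

With these assumed coefficients the eddy-current fraction stays below 40% at 50 Hz, consistent with the abstract, and the 8-fold frequency step multiplies the eddy-current term by 64, which is why resistivity (and hence alloy content) governs the 400 Hz loss.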
6.
A numerical scheme is developed to simulate the non-isothermal steady-state behaviour of a MOS field effect transistor. To obtain a fast, stable numerical scheme, physical instabilities were eliminated by using a simplified device model. The numerical technique developed permits a computer solution of the majority carrier transport equation, the nonlinear heat conduction equation (in which the heat generation term is obtained from the solution of the transport equation), and a number of auxiliary differential equations. The simplified model of the MOS transistor adopted will not, of course, produce any information on the actual operation of the short-channel MOS transistors of practical interest today, but the numerical scheme can be extended to simulate short-channel models of great practical interest.
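The coupling the abstract describes (heat generation from the transport solution feeding a nonlinear heat conduction equation) can be sketched in a toy 1D analogue, assuming a fixed heating profile `q` standing in for the Joule heating and a hypothetical temperature-dependent conductivity law; this is an illustration of the iteration structure, not the paper's device model:

```python
import numpy as np

def solve_heat_1d(q, k0=150.0, alpha=0.3, t_bc=300.0, n_iter=5000):
    """Fixed-point (Jacobi) solve of d/dx(k(T) dT/dx) = -q on a unit rod
    with Dirichlet ends at t_bc. k(T) falls as T rises, making the heat
    equation nonlinear; q plays the role of the heat generation term
    obtained from the carrier transport equation."""
    n = len(q)
    dx = 1.0 / (n - 1)
    t = np.full(n, t_bc)
    for _ in range(n_iter):
        k = k0 / (1.0 + alpha * (t - t_bc) / t_bc)   # T-dependent conductivity
        k_face = 0.5 * (k[:-1] + k[1:])              # conductivity at cell faces
        t_new = t.copy()
        # interior update from the conservative finite-difference stencil
        t_new[1:-1] = (k_face[:-1] * t[:-2] + k_face[1:] * t[2:]
                       + q[1:-1] * dx ** 2) / (k_face[:-1] + k_face[1:])
        t = t_new
    return t

profile = solve_heat_1d(np.full(21, 1.0e5))  # uniform heating, 21 nodes
```

Because the conductivity is re-evaluated from the current temperature field on every sweep, the same loop performs both the linear relaxation and the outer nonlinear fixed-point iteration, which is the structural point of the scheme in the abstract.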
7.
qRT-PCR still remains the most widely used method for quantifying gene expression levels, although newer technologies such as next-generation sequencing are becoming increasingly popular. A critical, yet often underappreciated, problem when analysing qRT-PCR data is the selection of suitable reference genes. This problem is compounded in situations where up to 25% of all genes may change expression (e.g., due to leukocyte invasion), as is typically the case in ARDS. Here, we examined 11 widely used reference genes for their suitability in commonly used models of acute lung injury (ALI): ventilator-induced lung injury (VILI), in vivo and ex vivo, lipopolysaccharide plus mechanical ventilation (MV), and hydrochloric acid plus MV. The stability of reference gene expression was determined using the NormFinder, BestKeeper, and geNorm algorithms. We then proceeded with the geNorm results because this is the only algorithm that provides the number of reference genes required to achieve normalisation. We chose interleukin-6 (Il-6) and C-X-C motif ligand 1 (Cxcl1) as the genes of interest to analyse and demonstrate the impact of inappropriate normalisation. Reference gene stability differed between the ALI models, and even within the subgroup of VILI models no common reference gene index (RGI) could be determined. NormFinder, BestKeeper, and geNorm produced slightly different, but comparable, results. Inappropriate normalisation of Il-6 and Cxcl1 gene expression resulted in significant misinterpretation in all four ALI settings. In conclusion, choosing an inappropriate normalisation strategy can introduce different kinds of bias, such as gain or loss as well as under- or overestimation of effects, affecting the interpretation of gene expression data.
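geNorm ranks candidate reference genes by a stability measure M: for each gene, the average standard deviation across samples of its pairwise log2 expression ratios against every other candidate, where a lower M means more stable expression. A minimal numpy sketch of that measure on a hypothetical expression matrix (real geNorm additionally removes the worst gene iteratively and uses a pairwise-variation cutoff to choose how many references to keep):

```python
import numpy as np

def genorm_m(expr):
    """geNorm stability M per candidate reference gene.
    expr: (n_samples, n_genes) matrix of relative expression quantities.
    M_j = mean over the other genes k of std(log2(expr_j / expr_k));
    lower M indicates a more stably expressed reference candidate."""
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        ratios = log_expr[:, [j]] - log_expr   # log2 ratios vs every gene
        sds = ratios.std(axis=0, ddof=1)       # pairwise variation per pair
        m[j] = np.delete(sds, j).mean()        # exclude the zero self-ratio
    return m

# hypothetical data: three tightly co-varying candidates plus one noisy gene
rng = np.random.default_rng(0)
stable = 2.0 ** rng.normal(0.0, 0.05, (20, 3))
noisy = 2.0 ** rng.normal(0.0, 1.0, (20, 1))
m_values = genorm_m(np.hstack([stable, noisy]))
```

In the setting of the abstract, a gene whose M ranks well in one ALI model but poorly in another is exactly what prevents a common reference gene index across models.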
8.
Multivariate networks are made up of nodes and their relationships (links), but also of data about those nodes and links in the form of attributes. Most real-world networks are associated with several attributes, and many analysis tasks depend on analyzing both relationships and attributes. Visualization of multivariate networks, however, is challenging, especially when the topology of the network and the attributes need to be considered concurrently. In this state-of-the-art report, we analyze current practices and classify techniques along four axes: layouts, view operations, layout operations, and data operations. We also provide an analysis of tasks specific to multivariate networks and give recommendations for which technique to use in which scenario. Finally, we survey application areas and evaluation methodologies.
9.
Multivariate graphs are prolific across many fields, including transportation and neuroscience. A key task in graph analysis is the exploration of connectivity, for example, to analyze how signals flow through neurons or to explore how well different cities are connected by flights. While standard node-link diagrams are helpful in judging connectivity, they do not scale to large networks. Adjacency matrices also do not scale to large networks and are only suitable for judging connectivity of adjacent nodes. A key approach to realizing scalable graph visualization is queries: instead of displaying the whole network, only a relevant subset is shown. Query-based techniques for analyzing connectivity in graphs, however, can easily suffer from clutter if the query result is large. To remedy this, we introduce techniques that provide an overview of the connectivity and reveal details on demand. We make two main contributions: (1) two novel visualization techniques that work in concert for summarizing graph connectivity; and (2) Graffinity, an open-source implementation of these visualizations, supplemented by detail views to enable a complete analysis workflow. Graffinity was designed in close collaboration with neuroscientists and is optimized for connectomics data analysis, yet the technique is applicable across domains. We validate the connectivity overview and our open-source tool with illustrative examples using flight and connectomics data.
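The query-then-summarize idea can be sketched independently of Graffinity itself: rather than rendering every path between two queried node sets, aggregate how many source-target pairs are first connected at each hop depth. The flight network below is hypothetical, and this statistic is only an illustration of the overview-before-details principle, not Graffinity's actual path-matrix view:

```python
from collections import deque

def connectivity_summary(adj, sources, targets, max_hops=4):
    """BFS from each source up to max_hops and count, per hop depth,
    how many (source, target) pairs are first connected there.
    The aggregate replaces drawing every individual path."""
    targets = set(targets)
    summary = {}
    for s in sources:
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            if dist[u] == max_hops:
                continue
            for v in adj.get(u, ()):
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        for t in targets:
            if t in dist and t != s:
                summary[dist[t]] = summary.get(dist[t], 0) + 1
    return summary

# hypothetical directed flight network
flights = {"SLC": ["DEN", "SFO"], "DEN": ["JFK"], "SFO": ["JFK"], "JFK": []}
```

A query such as `connectivity_summary(flights, ["SLC"], ["DEN", "JFK"])` answers "how well connected are these airports?" with a few counts per hop depth, and the individual paths are only materialized on demand.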
10.
Female Navy recruits (N = 5,226) completed surveys assessing history of childhood sexual abuse (CSA), childhood strategies for coping with CSA, childhood parental support, and current psychological adjustment. Both CSA and parental support independently predicted later adjustment. In analyses examining whether CSA victims' functioning was associated with CSA severity (indexed by 5 variables), parental support (indexed by 3 variables), and coping (constructive, self-destructive, and avoidant), the negative coping variables were the strongest predictors. A structural equation model revealed that the effect of abuse severity on later functioning was partially mediated by coping strategies. However, contrary to predictions, the model revealed that childhood parental support had little direct or indirect impact on adult adjustment.
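The partial-mediation finding rests on decomposing a predictor's effect into a direct path and an indirect path through the mediator (indirect = a·b, the product of the predictor-to-mediator and mediator-to-outcome slopes). A minimal regression-based sketch of that decomposition on simulated data; the study itself fits a full structural equation model, and the variable names and effect sizes here are hypothetical:

```python
import numpy as np

def mediation_effects(x, m, y):
    """Estimate the indirect effect a*b and direct effect c' by OLS:
    a  = slope of mediator m on predictor x,
    b  = slope of outcome y on m, controlling for x,
    c' = slope of y on x, controlling for m."""
    xm = np.column_stack([np.ones_like(x), x])
    a = np.linalg.lstsq(xm, m, rcond=None)[0][1]
    xmy = np.column_stack([np.ones_like(x), x, m])
    _, c_prime, b = np.linalg.lstsq(xmy, y, rcond=None)[0]
    return a * b, c_prime

# simulated data: severity -> coping -> adjustment, plus a direct path
rng = np.random.default_rng(1)
severity = rng.normal(size=5000)
coping = 0.5 * severity + rng.normal(scale=0.5, size=5000)
adjustment = 0.4 * coping + 0.2 * severity + rng.normal(scale=0.5, size=5000)
indirect, direct = mediation_effects(severity, coping, adjustment)
```

Partial mediation is the case recovered here: both the indirect product a·b and the direct effect c' are nonzero, matching the abstract's result that coping strategies transmit only part of the severity effect.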
Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号