61.
Objective

Image post-processing can correct for cardiac and respiratory motion (MoCo) during cardiovascular magnetic resonance (CMR) stress perfusion. This study analyzed its influence on visual image evaluation.

Materials and methods

Sixty-two patients with (suspected) coronary artery disease underwent a standard CMR stress perfusion exam during free-breathing. Image post-processing was performed without (non-MoCo) and with MoCo (image intensity normalization; motion extraction with iterative non-rigid registration; motion warping with the combined displacement field). Images were evaluated regarding the perfusion pattern (perfusion deficit, dark rim artifact, uncertain signal loss, and normal perfusion), the general image quality (non-diagnostic, imperfect, good, and excellent), and the reader’s subjective confidence to assess the images (not confident, confident, very confident).

Results

Fifty-three (non-MoCo) and 52 (MoCo) myocardial segments were rated as ‘perfusion deficit’, 113 vs. 109 as ‘dark rim artifacts’, 9 vs. 7 as ‘uncertain signal loss’, and 817 vs. 824 as ‘normal’. Agreement between non-MoCo and MoCo was high with no diagnostic difference per-patient. The image quality of MoCo was rated more often as ‘good’ or ‘excellent’ (92 vs. 63%), and the diagnostic confidence more often as ‘very confident’ (71 vs. 45%) compared to non-MoCo.

Conclusions

The comparison of perfusion images acquired during free-breathing and post-processed with and without motion correction demonstrated that both methods led to a consistent evaluation of the perfusion pattern, while the image quality and the reader’s subjective confidence to assess the images were rated more favorably for MoCo.

62.
Automatic annotation is an essential technique for effectively handling and organizing Web objects (e.g., Web pages), which have experienced unprecedented growth over the last few years. Automatic annotation is usually formulated as a multi-label classification problem. Unfortunately, labeled data are often time-consuming and expensive to obtain, and Web data accommodate a much richer feature space. This calls for new semi-supervised approaches that are less demanding on labeled data to be effective in classification. In this paper, we propose a graph-based semi-supervised learning approach that leverages random walks and ℓ1 sparse reconstruction on a mixed object-label graph with both attribute and structure information for effective multi-label classification. The mixed graph contains an object-affinity subgraph, a label-correlation subgraph, and object-label edges with adaptive weight assignments indicating the assignment relationships. The object-affinity subgraph is constructed using ℓ1 sparse graph reconstruction with extracted structural meta-text, while the label-correlation subgraph captures pairwise correlations among labels via a linear combination of their co-occurrence similarity and kernel-based similarity. A random walk with adaptive weight assignment is then performed on the constructed mixed graph to infer probabilistic assignment relationships between labels and objects. Extensive experiments on real Yahoo! Web datasets demonstrate the effectiveness of our approach.
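The random-walk inference step the abstract describes can be sketched as follows. This is a minimal illustration using a random walk with restart on a single object-affinity graph and a hypothetical toy dataset; the paper's full method (adaptive weighting, label-correlation subgraph, ℓ1 reconstruction) is not reproduced here.

```python
def random_walk_label_inference(W, seeds, restart=0.15, iters=200):
    """Infer label scores for every object via a random walk with restart.

    W     : n x n object-affinity matrix (list of lists, nonnegative,
            every row must have at least one positive entry).
    seeds : dict {object_index: label_index} for the labeled objects.
    Returns an n x k list of per-object label scores.
    """
    n = len(W)
    k = max(seeds.values()) + 1
    # Row-normalize the affinities into transition probabilities.
    P = [[w / sum(row) for w in row] for row in W]
    # Restart distribution: unit mass on each seed object for its label.
    R = [[0.0] * k for _ in range(n)]
    for obj, lab in seeds.items():
        R[obj][lab] = 1.0
    F = [row[:] for row in R]
    for _ in range(iters):
        # F <- (1 - restart) * P^T F + restart * R
        new = [[restart * R[j][c] for c in range(k)] for j in range(n)]
        for i in range(n):
            for j in range(n):
                p = (1.0 - restart) * P[i][j]
                for c in range(k):
                    new[j][c] += p * F[i][c]
        F = new
    return F

# Hypothetical toy graph: two clusters of three objects each, with one
# labeled object per cluster (object 0 -> label 0, object 5 -> label 1).
W = [[0, 1, 1, 0, 0, 0],
     [1, 0, 1, 0, 0, 0],
     [1, 1, 0, 1, 0, 0],
     [0, 0, 1, 0, 1, 1],
     [0, 0, 0, 1, 0, 1],
     [0, 0, 0, 1, 1, 0]]
F = random_walk_label_inference(W, {0: 0, 5: 1})
pred = [max(range(2), key=lambda c: F[i][c]) for i in range(6)]
```

The unlabeled objects in each cluster inherit the label of their cluster's seed, because the walk's restart mass diffuses mostly within the densely connected neighborhood of each seed.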
63.
Prior research has documented that IT investment increases market returns. Economic theories predict such returns to be recognized in accounting profitability, yet this relationship remains ambiguous in prior literature. We reexamine the relationship between IT investment and firm profitability. Our approach is unique in that we examine complementarities between distinct IT components. We document that a firm’s investments in IT components exhibit different impacts on its profitability conditional on the level of investments in complementary components.
64.
One has a large workload that is “divisible”—its constituent work’s granularity can be adjusted arbitrarily—and one has access to p remote worker computers that can assist in computing the workload. How can one best utilize the workers? Complicating this question is the fact that each worker is subject to interruptions (of known likelihood) that kill all work in progress on it. One wishes to orchestrate sharing the workload with the workers in a way that maximizes the expected amount of work completed. Strategies are presented for achieving this goal, by balancing the desire to checkpoint often—thereby decreasing the amount of vulnerable work at any point—vs. the desire to avoid the context-switching required to checkpoint. Schedules must also temper the desire to replicate work, because such replication diminishes the effective remote workforce. The current study demonstrates the accessibility of strategies that provably maximize the expected amount of work when there is only one worker (the case p=1) and, at least in an asymptotic sense, when there are two workers (the case p=2); but the study strongly suggests the intractability of exact maximization for p≥2 computers, as work replication on multiple workers joins checkpointing as a vehicle for decreasing the impact of work-killing interruptions. We respond to that challenge by developing efficient heuristics that employ both checkpointing and work replication as mechanisms for decreasing the impact of work-killing interruptions. The quality of these heuristics, in expected amount of work completed, is assessed through exhaustive simulations that use both idealized models and actual trace data.
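The checkpointing trade-off for a single worker (p = 1) can be illustrated with a deliberately simplified model, which is an assumption for illustration rather than the paper's formulation: the interruption arrives at a Uniform(0, 1) time, the worker's unit time budget is split into n equal chunks, and each checkpoint costs a fixed overhead eps. A chunk's work is safe only once its checkpoint completes.

```python
def expected_work(n, eps):
    """Expected work completed when a unit time budget is cut into n
    chunks, each followed by a checkpoint of cost eps, and a single
    work-killing interruption arrives at a Uniform(0, 1) time."""
    s = 1.0 / n - eps                      # useful work per chunk
    if s <= 0:                             # overhead eats the whole chunk
        return 0.0
    # Chunk i (1-based) survives iff the interruption falls after its
    # checkpoint at time i*(s + eps); under Uniform(0, 1) that has
    # probability 1 - i*(s + eps).
    return s * sum(1.0 - i * (s + eps) for i in range(1, n + 1))

# Sweep n: too few chunks risks losing everything to one interruption,
# too many wastes the budget on checkpoints. In this toy model the
# optimum balances out near n ~ 1/sqrt(eps).
best_n = max(range(1, 101), key=lambda n: expected_work(n, 0.01))
```

With eps = 0.01 the sweep settles at best_n = 10, matching the 1/sqrt(eps) balance point; this captures the checkpoint-often vs. checkpoint-cost tension the abstract describes, though not its interruption model.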
65.
Already parasitized hosts are often of poorer quality than healthy hosts. It is therefore usually advantageous for parasitoid females to recognize and reject them. Parasitized hosts can be identified on the basis of various physical or chemical marks present on the surface or inside the hosts or their surroundings in the case of concealed hosts. Here we studied host discrimination behaviors of females of a certain population of Pachycrepoideus vindemmiae, a solitary ectoparasitoid, which are known to reject large-sized parasitized hosts after an abdominal examination of their surface. We first investigated females' recognition behaviors of host parasitism status when confronted with small-sized hosts (Drosophila melanogaster pupae), as host size may influence the use of different cues for host selection. We showed that, in such a situation, females also discriminate parasitized hosts after an external host exploration with the tip of their ovipositor sheath (third valvulae). We then described the sense organs present on the different parts of the ovipositor by means of scanning and transmission electron microscopy analysis. As the extremity of the third valvulae bears only one type of sensilla, which appear to be chemoreceptors, we considered these sensilla as highly likely to be involved in host discrimination in P. vindemmiae. To our knowledge, this is the first time that receptors located on the ovipositor sheath are described as implicated in host discrimination in parasitoid wasps. We discuss potential chemical markers that might be detected by these receptors.
66.
Relationships between large-scale environmental factors and the incidence of type E avian botulism outbreaks in Lake Michigan were examined from 1963 to 2008. Avian botulism outbreaks most frequently occurred in years with low mean annual water levels, and lake levels were significantly lower in outbreak years than in non-outbreak years. Mean surface water temperatures in northern Lake Michigan during the period when type E outbreaks tend to occur (July through September) were significantly higher in outbreak years than in non-outbreak years. Trends in fish populations did not strongly correlate with botulism outbreaks, although botulism outbreaks in the 1960s coincided with high alewife abundance, and recent botulism outbreaks coincided with rapidly increasing round goby abundance. Botulism outbreaks occurred cyclically, and the frequency of outbreaks did not increase over the period of record. Climate change scenarios for the Great Lakes predict lower water levels and warmer water temperatures. As a consequence, the frequency and magnitude of type E botulism outbreaks in the Great Lakes may increase.
67.
In this paper, we report on our experience with the application of validated models to assess performance, reliability, and adaptability of a complex mission critical system that is being developed to dynamically monitor and control the position of an oil-drilling platform. We present real-time modeling results that show that all tasks are schedulable. We performed stochastic analysis of the distribution of task execution time as a function of the number of system interfaces. We report on the variability of task execution times for the expected system configurations. In addition, we have executed a system library for an important task inside the performance model simulator. We report on the measured algorithm convergence as a function of the number of vessel thrusters. We have also studied the system architecture adaptability by comparing the documented system architecture and the implemented source code. We report on the adaptability findings and the recommendations we were able to provide to the system’s architect. Finally, we have developed models of hardware and software reliability. We report on hardware and software reliability results based on the evaluation of the system architecture.
68.
Dynamic composition and optimization of Web services
Process-based composition of Web services has recently gained significant momentum for the implementation of inter-organizational business collaborations. In this approach, individual Web services are choreographed into composite Web services whose integration logic is expressed as a composition schema. In this paper, we present a goal-directed composition framework to support on-demand business processes. Composition schemas are generated incrementally by a rule inference mechanism based on a set of domain-specific business rules enriched with contextual information. In situations where multiple composition schemas can achieve the same goal, the best schema is selected based on a combination of its estimated execution quality and schema quality. By coupling the dynamic schema creation and quality-driven selection strategy in one single framework, we ensure that the generated composite service complies with business rules when being adapted and optimized.
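The quality-driven selection step might look like the following sketch. The field names `exec_quality` and `schema_quality` are hypothetical normalized scores, and the weighted sum is a stand-in for whatever combination rule the framework actually uses.

```python
def select_schema(schemas, alpha=0.5):
    """Pick the composition schema with the best combined score.

    schemas : list of dicts carrying hypothetical 'exec_quality' and
              'schema_quality' scores in [0, 1].
    alpha   : weight on estimated execution quality; (1 - alpha) goes
              to schema quality.
    """
    def score(s):
        return alpha * s["exec_quality"] + (1 - alpha) * s["schema_quality"]
    return max(schemas, key=score)

# Two candidate schemas that achieve the same goal: one executes better,
# the other is structurally cleaner.
candidates = [
    {"name": "schema_a", "exec_quality": 0.9, "schema_quality": 0.6},
    {"name": "schema_b", "exec_quality": 0.7, "schema_quality": 0.9},
]
best = select_schema(candidates, alpha=0.5)
```

Shifting `alpha` toward 1 favors runtime behavior over schema structure, so the same candidate set can yield a different winner under a different weighting policy.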
69.
In many Grid infrastructures, different kinds of information services are in use, which utilize different, incompatible data structures and interfaces to encode and provide their data. Homogeneous monitoring of these infrastructures, with the monitoring data accessible everywhere independently of the middleware that provided it, is the basis for consistent status reporting on the Grids’ resources and services. Thus, interoperability or interoperation between the different information services in a heterogeneous Grid infrastructure is required. Monitoring data must contain the identity of the affected Virtual Organization (VO) so that it can be related to the resources and services the VO has allocated, to enable VO-specific information provision. This paper describes a distributed architecture for an interoperable information service, which combines data unification and categorization with policies for VO membership, VO resource management, and data transformations. This service builds the basis for an integrated and interoperating monitoring of Grids, which provide their data to more than one VO and utilize heterogeneous information services.
70.
Proteomics analysis of serum from patients with type 1 diabetes (T1D) may lead to novel biomarkers for prediction of disease and for patient monitoring. However, the serum proteome is highly sensitive to sample processing, and before proteomics biomarker research serum cohorts should preferably be examined for potential bias between sample groups. SELDI-TOF MS protein profiling was used for preliminary evaluation of a biological bank with 766 serum samples from 270 patients with T1D, collected at 18 different paediatric centers representing 15 countries in Europe and Japan over 2 years (2000–2002). Samples collected 1 (n = 270), 6 (n = 248), and 12 (n = 248) months after T1D diagnosis were grouped across centers and compared. The serum protein profiles varied with collection site and day of analysis; however, markers of sample processing were not systematically different between samples collected at different times after diagnosis. Three members of the apolipoprotein family increased with time in patient serum collected 1, 6, and 12 months after diagnosis (ANOVA, p<0.001). These results support the use of this serum cohort for further proteomic studies and illustrate the potential of high-throughput MALDI/SELDI-TOF MS protein profiling for evaluation of serum cohorts before proteomics biomarker research.