10,000 results found (search time: 0 ms)
991.
David E. Wilkins 《Artificial Intelligence》1982,18(1):1-51
PARADISE (PAttern Recognition Applied to DIrecting SEarch) uses a knowledge-based analysis and little searching to find the correct move in chess middle-game positions. PARADISE's search has no depth limit or any other artificial effort limit. This paper describes the methods used to constrain the search. The ideas of using different strategies to show that one move is best, and of using ranges to express the values of moves (first developed in Berliner's B* search), are extended and clarified. PARADISE combines these ideas with the use of plans, a threshold, and various measures of possibility. Examples are presented, including one in which PARADISE uses an indirect strategy to prove that one move is best without finding the winning line (a first for a chess program).
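The range-based proof idea the abstract mentions can be sketched briefly: a move is provably best once its pessimistic (lower) bound exceeds the optimistic (upper) bound of every alternative, so the winning line itself never has to be searched out. The sketch below is an illustration of that termination test only, not PARADISE's actual code; the move names and bounds are hypothetical.

```python
def provably_best(moves):
    """Return the move whose pessimistic (lower) bound exceeds the
    optimistic (upper) bound of every alternative, or None if no move
    is yet proved best.  Each move is a (name, lower, upper) tuple."""
    for name, lo, _ in moves:
        if all(lo > hi for other, _, hi in moves if other != name):
            return name
    return None

# Hypothetical bounds produced by a knowledge-based analysis:
moves = [("Nxg7", 3.0, 9.0), ("Qh6", 1.0, 2.5), ("Rf3", -0.5, 2.0)]
best = provably_best(moves)   # "Nxg7": its lower bound 3.0 beats both upper bounds
```

When no move's lower bound dominates, the function returns None and the analysis must tighten the ranges further.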
992.
Three experiments used the "list-before-the-last" free recall paradigm (Shiffrin, 1970) to investigate retrieval for context and the manner in which context changes. This paradigm manipulates target and intervening list lengths to measure the interference from each list, providing a measure of list isolation. Correct target list recall was only affected by the target list length when participants engaged in recall between the lists, whereas there were effects of both list lengths with other activities. This suggests that the act of recalling drives context change, thus isolating the target list from interference. Correspondingly, incorrect recall of intervening list items was affected only by the length of the intervening list when recall occurred between the lists, but was otherwise affected by both list lengths. Concurrent with these changes in context similarity, there were apparent changes in context retrieval, as indicated by the overall levels of target retrieval versus intervening recall. A multinomial model of sampling and recovery was implemented to assess the adequacy of this account and to quantify context similarity and context retrieval. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
993.
Friedman Richard; Sobel David; Myers Patricia; Caudill Margaret; Benson Herbert 《Canadian Metallurgical Quarterly》1995,14(6):509
The use of medical services is a function of several interacting psychological and social variables as well as a function of physical malfunction. The clinical significance of addressing patients' psychosocial issues has only occasionally been considered. However, the shift in health care economics toward health care maintenance is responsible for the increased interest in interventions in the domain of behavioral medicine and health psychology. Evidence is reviewed for 6 mechanistic pathways by which behavioral interventions can maximize clinical care and result in significant economic benefits. The rationale for further integration of behavioral and biomedical interventions is also reviewed. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
994.
Spencer-Rodgers Julie; Williams Melissa J.; Hamilton David L.; Peng Kaiping; Wang Lei 《Canadian Metallurgical Quarterly》2007,93(4):525
In 3 studies, the authors tested the hypothesis that Chinese participants would view social groups as more entitative than would Americans and, as a result, would be more likely to infer personality traits on the basis of group membership--that is, to stereotype. In Study 1, Chinese participants made stronger stereotypic trait inferences than Americans did on the basis of a target's membership in a fictitious group. Studies 2 and 3 showed that Chinese participants perceived diverse groups as more entitative and attributed more internally consistent dispositions to groups and their members. Guided by culturally based lay theories about the entitative nature of groups, Chinese participants may stereotype more readily than do Americans when group membership is available as a source of dispositional inference. (PsycINFO Database Record (c) 2010 APA, all rights reserved)
995.
Andrykowski Michael A.; Carpenter Janet S.; Studts Jamie L.; Cordova Matthew J.; Cunningham Lauren L. C.; Beacham Abbie; Sloan David; Kenady Daniel; McGrath Patrick 《Canadian Metallurgical Quarterly》2002,21(5):485
The impact of benign breast biopsy (BBB) on distress and perceptions of risk for breast cancer (BC) was examined. Interviews were conducted with 100 women shortly after notification of biopsy results and 4 and 8 months post-BBB. Compared with matched healthy comparison (HC) women without BBB, the BBB group evidenced greater BC-specific distress at baseline. BC-specific distress declined after BBB, remaining elevated relative to the HC group at the 8-month follow-up. Dispositional (optimism, informational coping style), demographic (education), clinical (family history of BC), and cognitive (BC risk perception) variables were associated with baseline levels of BC-specific distress or persistence of distress. Results support the monitoring process model (S. M. Miller, 1995) and the cognitive social health information processing model (S. M. Miller, Y. Shoda, & K. Hurley, 1996). (PsycINFO Database Record (c) 2010 APA, all rights reserved)
996.
Ian Foster; David R. Kohr, Jr.; Rakesh Krishnaiyer; Alok Choudhary 《Journal of Parallel and Distributed Computing》1997,45(2):284
Pure data-parallel languages such as High Performance Fortran version 1 (HPF) do not allow efficient expression of mixed task/data-parallel computations or the coupling of separately compiled data-parallel modules. In this paper, we show how these common parallel program structures can be represented, with only minor extensions to the HPF model, by using a coordination library based on the Message Passing Interface (MPI). This library allows data-parallel tasks to exchange distributed data structures using calls to simple communication functions. We present microbenchmark results that characterize the performance of this library and that quantify the impact of optimizations that allow reuse of communication schedules in common situations. In addition, results from two-dimensional FFT, convolution, and multiblock programs demonstrate that the HPF/MPI library can provide performance superior to that of pure HPF. We conclude that this synergistic combination of two parallel programming standards represents a useful approach to task parallelism in a data-parallel framework, increasing the range of problems addressable in HPF without requiring complex compiler technology.
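The schedule-reuse optimization the abstract quantifies can be illustrated with a small sketch: when the same pair of data-parallel modules repeatedly exchanges an array, the point-to-point message plan (who sends which index range to whom) depends only on the two distributions, so it can be computed once and cached. This is a plain-Python illustration of the idea under a simple block distribution, not the HPF/MPI library's actual interface.

```python
from functools import lru_cache

def block_ranges(n, p):
    """Index ranges owned by each of p tasks under a block distribution
    of n elements (larger blocks first when n is not divisible by p)."""
    q, r = divmod(n, p)
    ranges, start = [], 0
    for i in range(p):
        size = q + (1 if i < r else 0)
        ranges.append((start, start + size))
        start += size
    return ranges

@lru_cache(maxsize=None)
def schedule(n, src_tasks, dst_tasks):
    """Communication schedule for redistributing n block-distributed
    elements from src_tasks senders to dst_tasks receivers, as a list
    of (src, dst, lo, hi) messages.  Cached so repeated transfers
    between the same module pair reuse the precomputed plan."""
    msgs = []
    for s, (slo, shi) in enumerate(block_ranges(n, src_tasks)):
        for d, (dlo, dhi) in enumerate(block_ranges(n, dst_tasks)):
            lo, hi = max(slo, dlo), min(shi, dhi)
            if lo < hi:                      # overlapping ownership -> one message
                msgs.append((s, d, lo, hi))
    return msgs

sched = schedule(8, 2, 4)   # 8 elements: 2 source tasks -> 4 destination tasks
```

The second and later calls with the same arguments return the cached plan, which is the effect the paper's microbenchmarks measure.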
997.
Wankang Zhao William Kreahling David Whalley Christopher Healy Frank Mueller 《Real-Time Systems》2006,34(2):129-152
It is advantageous to perform compiler optimizations that attempt to lower the worst-case execution time (WCET) of an embedded application, since tasks with lower WCETs are easier to schedule and more likely to meet their deadlines. Compiler writers in recent years have used profile information to detect the frequently executed paths in a program, and there has been considerable effort to develop compiler optimizations that improve these paths in order to reduce the average-case execution time (ACET). In this paper, we describe an approach to reduce the WCET by adapting and applying optimizations designed for frequent paths to the worst-case (WC) paths in an application. Instead of profiling to find the frequent paths, our WCET path optimization uses feedback from a timing analyzer to detect the WC paths in a function. Since these path-based optimizations may increase code size, their subsequent effects on the WCET are measured to ensure that the worst-case path optimizations actually improve the WCET before committing to a code size increase. We evaluate these WC path optimizations and present results showing the decrease in WCET versus the increase in code size.
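The feedback loop the abstract describes, apply a transformation, re-run the timing analyzer, and keep the change only if the estimated WCET actually drops, can be sketched abstractly. The sketch below is a toy illustration of that commit-or-revert policy, not the authors' compiler; the transformations and the cycle-count "code" model are hypothetical.

```python
def optimize_wc_paths(code, candidates, wcet_of):
    """Greedy sketch of WCET-directed optimization: try each candidate
    transformation, re-run the timing analyzer (wcet_of) on the result,
    and commit the change only if the estimated WCET decreases;
    otherwise revert, so code size never grows without a WCET payoff."""
    best_wcet = wcet_of(code)
    for transform in candidates:
        trial = transform(code)
        if wcet_of(trial) < best_wcet:
            code, best_wcet = trial, wcet_of(trial)
    return code, best_wcet

# Toy model: "code" is the cycle count of the worst-case path, and the
# candidate transforms are stand-ins for path optimizations that may
# help (halve) or hurt (pad, e.g. code growth causing cache misses).
halve = lambda c: c // 2
pad   = lambda c: c + 10
code, wcet = optimize_wc_paths(100, [pad, halve], wcet_of=lambda c: c)
```

Here `pad` is tried and rejected because it raises the estimate, while `halve` is committed, mirroring the paper's rule of measuring before accepting a code size increase.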
A preliminary version of this paper entitled “Improving WCET by optimizing worst-case paths” appeared in the 2005 Real-Time and Embedded Technology and Applications Symposium.
Wankang Zhao received his PhD in Computer Science from Florida State University in 2005. He was an associate professor at Nanjing University of Posts and Telecommunications. He is currently working for Datamaxx Corporation.
William Kreahling received his PhD in Computer Science from Florida State University in 2005. He is currently an assistant professor in the Math and Computer Science department at Western Carolina University. His research interests include compilers, computer architecture, and parallel computing.
David Whalley received his PhD in CS from the University of Virginia in 1990. He is currently the E.P. Miles professor and chair of the Computer Science department at Florida State University. His research interests include low-level compiler optimizations, tools for supporting the development and maintenance of compilers, program performance evaluation tools, predicting execution time, computer architecture, and embedded systems. Some of the techniques that he developed for new compiler optimizations and diagnostic tools are currently being applied in industrial and academic compilers. His research is currently supported by the National Science Foundation. More information about his background and research can be found on his home page, http://www.cs.fsu.edu/~whalley. Dr. Whalley is a member of the IEEE Computer Society and the Association for Computing Machinery.
Chris Healy earned a PhD in computer science from Florida State University in 1999, and is currently an associate professor of computer science at Furman University. His research interests include static and parametric timing analysis, real-time and embedded systems, compilers, and computer architecture. He is committed to research experiences for undergraduate students, and his work has been supported by funding from the National Science Foundation. He is a member of ACM and the IEEE Computer Society.
Frank Mueller is an Associate Professor in Computer Science and a member of the Centers for Embedded Systems Research (CESR) and High Performance Simulations (CHiPS) at North Carolina State University. Previously, he held positions at Lawrence Livermore National Laboratory and Humboldt University Berlin, Germany. He received his Ph.D. from Florida State University in 1994. He has published papers in the areas of embedded and real-time systems, compilers, and parallel and distributed systems. He is a founding member of the ACM SIGBED board and the steering committee chair of the ACM SIGPLAN LCTES conference. He is a member of the ACM, ACM SIGPLAN, ACM SIGBED, and the IEEE Computer Society. He is a recipient of an NSF CAREER Award.
998.
999.
A computational grid ensures the on-demand delivery of computing resources in a security-aware, shared, scalable, and standards-based computing environment. A major concern is how to develop a general, encompassing framework that guarantees user satisfaction, measured as Quality of Service (QoS). To obtain a higher QoS, the effective QoS perceived by subscribers (users) must conform to the QoS specified in the Service Level Agreement (SLA) document, a legal contract between the Grid Services Provider (GSP) and its users. Sometimes the effective user QoS does not conform to the SLA specifications because of vagueness in the SLA's linguistic definitions. Existing approaches overcommit resources to meet QoS. In this paper, we propose a fuzzy logic framework for calibrating the user-QoS of grid resources that addresses the vagueness in the linguistic definitions of the SLA document without overcommitting grid resources.
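The "vagueness in linguistic definitions" that the abstract targets is exactly what fuzzy membership functions formalize: a measured QoS metric belongs to SLA terms such as "good" or "acceptable" to a degree, rather than crossing a crisp threshold. The sketch below illustrates that idea with triangular memberships; the linguistic terms and their breakpoints are hypothetical, not taken from the paper's framework.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 at a, peaks at 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms for response time (ms) in an SLA:
terms = {
    "good":       lambda x: tri(x, -1, 0, 100),
    "acceptable": lambda x: tri(x, 50, 150, 250),
    "poor":       lambda x: tri(x, 200, 400, 10**9),
}

def classify(ms):
    """Degree of membership of a measured response time in each SLA term."""
    return {name: round(mu(ms), 3) for name, mu in terms.items()}
```

A 120 ms response, for example, is partially "acceptable" and not at all "poor", giving the provider a graded signal for calibration instead of a binary SLA violation.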
1000.
L. Lawrence Ho; David J. Cavuto; Symeon Papavassiliou; Anthony G. Zawadzki 《Journal of Network and Systems Management》2001,9(2):139-159
Adaptive algorithms for real-time and proactive detection of network/service anomalies, i.e., soft performance degradations, in transaction-oriented wide area networks (WANs) have been developed. These algorithms (i) adaptively sample and aggregate raw transaction records to compute service-class-based traffic intensities, in which potential network anomalies are highlighted; (ii) construct dynamic, service-class-based performance thresholds for detecting network and service anomalies; and (iii) perform service-class-based, real-time network anomaly detection. These anomaly detection algorithms are implemented as a real-time software system called TRISTAN (TRansaction InSTantaneous Anomaly Notification), which is deployed in the AT&T Transaction Access Services (TAS) network. The TAS network is a commercially important, high-volume (millions of transactions per day), multiple-service-class (tens of classes), hybrid telecom and data WAN that services transaction traffic such as credit card transactions in the US and neighboring countries. TRISTAN is demonstrated to be capable of automatically and adaptively detecting network/service anomalies and correctly identifying the corresponding "guilty" service classes in TAS. TRISTAN can detect network/service faults that elude detection by traditional alarm-based network monitoring systems.