991.
Marion Woytasik, Johan Moulin, Emile Martincic, Anne-Lise Coutrot, Elisabeth Dufour-Gergam. Microsystem Technologies, 2008, 14(7): 951-956
Recent advances in microtechnology allow the realization of planar microcoils. These components are integrated in MEMS as magnetic sensors or actuators. In the latter case, it is necessary to maximize the effective magnetic field, which is proportional to the current passing through the copper track and depends on the distance to the generating microcoil. The aim of this work was to determine the optimal microcoil design configuration for magnetic field generation. The results were applied to magnetic actuation, taking technological constraints into account. In particular, we considered different realistic configurations involving a magnetically actuated device coupled to a microcoil. Calculations by a semi-analytical method implemented in Matlab were validated by experimental measurements. The copper planar microcoils are fabricated by UV micromoulding on different substrates: flexible polymer (Kapton®) and silicate on silicon. They consist of a continuous spiral track with a total surface area of about 1 mm².
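The on-axis field of such a spiral coil can be approximated by summing the fields of concentric circular loops, each contributing B = μ0·I·R² / (2(R² + z²)^(3/2)). The sketch below is a rough stand-in for the authors' semi-analytical Matlab computation, not their actual code; the turn count, radii, and drive current are hypothetical:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (T*m/A)

def on_axis_field(current, radii, z):
    """On-axis magnetic field (T) of a planar spiral coil, approximated
    as concentric circular loops of the given radii (m), evaluated at
    distance z (m) above the coil plane."""
    return sum(MU0 * current * r**2 / (2.0 * (r**2 + z**2) ** 1.5)
               for r in radii)

# Hypothetical 20-turn coil with radii from 50 um to 500 um
radii = [50e-6 + i * (450e-6 / 19) for i in range(20)]
b_near = on_axis_field(0.1, radii, 10e-6)    # 10 um actuator gap
b_far = on_axis_field(0.1, radii, 500e-6)    # 500 um actuator gap
```

Comparing `b_near` and `b_far` shows the strong decay with distance that makes the coil-to-device gap a dominant design parameter.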
992.
There is an increasing use of computer media for negotiations. However, the use of computer-mediated channels increases hostile expressions of emotion, termed flaming. Although researchers agree that flaming has important effects on negotiation, predictions concerning these effects are inconsistent, suggesting a need for further investigation. We address this need by extending current flaming and negotiation research in two ways. First, we identify two different types of flaming: that which is motivated by perceptions concerning the negotiating opponent (e.g., he/she is unfair) and that which is motivated by perceptions concerning the negotiating context (e.g., the communication channel is too slow). Second, we differentiate between the effects of flaming on the concession behaviors of the flame sender and the flame recipient, and the effects of these behaviors on negotiated agreement. In a laboratory study, we demonstrate that flames directed at the negotiation opponent slightly decrease the likelihood of reaching an agreement, and when an agreement is reached, it results in outcomes significantly favoring the flame recipient rather than the flame sender. In contrast, flames directed at the negotiation context significantly increase the likelihood of agreement, although outcomes still favor the flame recipient over the flame sender. These results suggest that flame senders are generally worse off than flame recipients, which provides an important basis for the strategic use of flaming in negotiations.
993.
About 20 years ago, Markus and Robey noted that most research on IT impacts had been guided by deterministic perspectives and had neglected to use an emergent perspective, which could account for contradictory findings. They further observed that most research in this area had been carried out using variance theories at the expense of process theories. Finally, they suggested that more emphasis on multilevel theory building would likely improve empirical reliability. In this paper, we reiterate the observations and suggestions made by Markus and Robey on the causal structure of IT impact theories and carry out an analysis of empirical research published in four major IS journals, Management Information Systems Quarterly (MISQ), Information Systems Research (ISR), the European Journal of Information Systems (EJIS), and Information and Organization (I&O), to assess compliance with those recommendations. Our final sample consisted of 161 theory-driven articles, accounting for approximately 21% of all the empirical articles published in these journals. Our results first reveal that 91% of the studies in MISQ, ISR, and EJIS focused on deterministic theories, while 63% of those in I&O adopted an emergent perspective. Furthermore, 91% of the articles in MISQ, ISR, and EJIS adopted a variance model; this compares with 71% from I&O that applied a process model. Lastly, mixed levels of analysis were found in 14% of all the surveyed articles. Implications of these findings for future research are discussed.
994.
Since DeLone and McLean (D&M) developed their model of IS success, there has been much research on the topic of success as well as extensions and tests of their model. Using a qualitative literature review, this research examines 180 papers found in the academic literature for the period 1992-2007 dealing with some aspect of IS success. Using the six dimensions of the D&M model (system quality, information quality, service quality, use, user satisfaction, and net benefits), 90 empirical studies were examined and the results summarized. Measures for the six success constructs are described and 15 pairwise associations between the success constructs are analyzed. This work builds on prior research related to IS success by summarizing the measures applied to the evaluation of IS success and by examining the relationships that comprise the D&M IS success model in both individual and organizational contexts.
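The count of 15 pairwise associations follows directly from choosing 2 of the 6 D&M dimensions (C(6, 2) = 15), which a few lines confirm:

```python
from itertools import combinations

# The six D&M success dimensions named in the review
dimensions = ["system quality", "information quality", "service quality",
              "use", "user satisfaction", "net benefits"]

# Every unordered pair of distinct dimensions
pairs = list(combinations(dimensions, 2))
```

`len(pairs)` is 15, matching the number of construct associations the review analyzes.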
995.
Raymond R. Panko. European Journal of Information Systems, 2008, 17(3): 182-197
In the 1990s, enrollments grew rapidly in information systems (IS) and computer science. Then, beginning in 2000 and 2001, enrollments declined precipitously. This paper looks at the enrollment bubble and the dotcom bubble that drove IT enrollments. Although the enrollment bubble occurred worldwide, this paper focuses primarily on U.S. data, which is widely available, and secondarily on Western European data. The paper notes that the dotcom bubble was an investment disaster but that U.S. IT employment fell surprisingly little and soon surpassed the bubble's peak IT employment. In addition, U.S. IT unemployment rose to almost the level of total unemployment in 2003, then fell to traditional low levels by 2005. Job prospects in the U.S. and most other countries are good for the short term, and the U.S. Bureau of Labor Statistics employment projections for 2006-2016 indicate that job prospects in the U.S. will continue to be good for most IT jobs. However, offshoring is a persistent concern for students in Western Europe and the United States. The data on offshoring are of poor quality, but several studies indicate that IT job losses from offshoring are small and may be counterbalanced by gains in IT inshoring jobs. At the same time, offshoring and productivity gains appear to be making low-level jobs such as programming and user support less attractive. This means that IS and computer science programs will have to focus on producing higher-level job skills among graduates. In addition, students may have to stop considering the undergraduate degree to be a terminal degree in IS and computer science.
996.
997.
The Category Partition Method (CPM) is a general approach to specification-based program testing, in which test frame reduction and refinement are two important issues. Test frame reduction is necessary because too many test frames may be produced, and test frame refinement matters because, during CPM testing, new information about test frame generation may be obtained and incorporated incrementally. Besides the information provided by testers or users, implementation-related knowledge offers an alternative source of information for reducing and refining CPM test frames. This paper explores that idea by proposing a call-patterns-semantics-based test frame updating method for Prolog programs, in which a call patterns analysis collects information about the way procedures are used in a program. The updated test frames are represented as constraints. The effect of our test frame updating is two-fold. On one hand, it removes "uncared-for" data from the original set of test frames; on the other hand, it refines the test frames that deserve more attention. The first effect makes the input domain on which a procedure must be tested a subset of the procedure's input domain, and the second gives testers a better chance of finding the faults most likely to appear in actual use of the program under consideration. Our test frame updating method preserves the effectiveness of CPM testing with respect to the detection of the faults of interest. Test case generation from the updated set of test frames is also discussed. To show the applicability of our method, an approximation call patterns semantics is proposed, and test frame updating on this semantics is illustrated with an example.
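As background, the basic CPM mechanics that test frame updating builds on (enumerating test frames as combinations of category choices, then pruning them with constraints) can be sketched as follows. The categories and the constraint are a hypothetical example for a list-sorting procedure, not taken from the paper:

```python
from itertools import product

# Hypothetical category partition: each category lists its choices;
# a test frame is one choice per category.
categories = {
    "length":   ["empty", "one", "many"],
    "order":    ["sorted", "reversed", "random"],
    "elements": ["distinct", "duplicates"],
}

def consistent(frame):
    # Constraint: an empty or single-element list has no meaningful
    # order or duplicate structure, so prune those combinations.
    if frame["length"] in ("empty", "one"):
        return frame["order"] == "sorted" and frame["elements"] == "distinct"
    return True

all_frames = [dict(zip(categories, combo))
              for combo in product(*categories.values())]
frames = [f for f in all_frames if consistent(f)]
```

Here 3 x 3 x 2 = 18 raw frames are reduced to 8 consistent ones; the paper's contribution is to drive this kind of pruning from call patterns analysis rather than hand-written constraints.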
998.
Statistical process control (SPC) is a conventional means of monitoring software processes and detecting related problems, where the causes of detected problems can be identified using causal analysis. Determining the actual causes of reported problems requires significant effort due to the large number of possible causes. This study presents an approach that detects problems and identifies their causes using multivariate SPC, allowing multiple measures of a software process to be monitored simultaneously. The measures identified as major contributors to out-of-control signals can be used to pinpoint causes, with partial least squares (PLS) and statistical hypothesis testing used to validate the identified causes. The main advantage of the proposed approach is that correlated indices can be monitored simultaneously, facilitating causal analysis of a software process.
Ching-Pao Chang is a PhD candidate in Computer Science & Information Engineering at the National Cheng-Kung University, Taiwan. He received his MA from the University of Southern California in 1998 in Computer Science. His current work deals with software process improvement and defect prevention using machine learning techniques. Chih-Ping Chu is Professor of Software Engineering in the Department of Computer Science & Information Engineering at the National Cheng-Kung University (NCKU) in Taiwan. He received his MA in Computer Science from the University of California, Riverside in 1987, and his Doctorate in Computer Science from Louisiana State University in 1991. He is especially interested in parallel computing and software engineering.
999.
The identification of part families and machine groups that form cells is a major step in the development of a cellular manufacturing system, and, consequently, a large number of concepts, theories and algorithms have been proposed. One common assumption of most cell formation algorithms is that the product mix remains stable over a period of time. In today's world, market demand is being shaped by consumers, resulting in a highly volatile market. This has given rise to a new class of products characterized by low volume and high variety. Incorporating product mix changes into an existing cellular manufacturing system raises many important issues. In this paper, a methodology to incorporate new parts and machines into an existing cellular manufacturing system is presented. The objective is to fit the new parts and machines into the existing system, thereby increasing machine utilization and reducing investment in new equipment.
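The paper's incorporation methodology is not reproduced here, but the underlying cell-formation step it extends can be illustrated with King's classic rank order clustering heuristic, which block-diagonalizes a machine-part incidence matrix by repeatedly sorting rows and columns by binary weight. The incidence matrix below is hypothetical:

```python
import numpy as np

def rank_order_clustering(incidence):
    """Sort rows, then columns, of a binary machine-part incidence
    matrix by their binary-weight values until the ordering is stable,
    exposing block-diagonal machine cells / part families."""
    m = incidence.copy()
    rows = np.arange(m.shape[0])
    cols = np.arange(m.shape[1])
    while True:
        r_keys = m @ (2 ** np.arange(m.shape[1])[::-1])
        r_order = np.argsort(-r_keys, kind="stable")
        c_keys = (2 ** np.arange(m.shape[0])[::-1]) @ m[r_order]
        c_order = np.argsort(-c_keys, kind="stable")
        if (np.array_equal(r_order, np.arange(m.shape[0]))
                and np.array_equal(c_order, np.arange(m.shape[1]))):
            return m, rows, cols
        m = m[r_order][:, c_order]
        rows, cols = rows[r_order], cols[c_order]

# Hypothetical 4-machine x 5-part incidence matrix (1 = part visits machine)
a = np.array([[1, 0, 1, 0, 0],
              [0, 1, 0, 1, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 1]])
blocked, machine_order, part_order = rank_order_clustering(a)
```

For this matrix the heuristic recovers two clean cells: machines {0, 2} with parts {0, 2}, and machines {1, 3} with parts {1, 3, 4}. The stability assumption on the product mix enters exactly here: a new part adds a column that can destroy the block structure, which is the problem the paper addresses.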
1000.
Xiaoyu Wang, Wen Wang, Yong Huang, Nhan Nguyen, Kalmanje Krishnakumar. Journal of Intelligent Manufacturing, 2008, 19(4): 383-396
Hard turning with cubic boron nitride (CBN) tools has proven more effective and efficient than traditional grinding operations in machining hardened steels. However, rapid tool wear remains one of the major hurdles to the wide industrial adoption of hard turning. Better prediction of CBN tool wear progression helps to optimize cutting conditions and/or tool geometry to reduce tool wear, which in turn helps make hard turning a viable technology. The objective of this study is to design a novel but simple neural network-based generalized optimal estimator for CBN tool wear prediction in hard turning. The proposed estimator is based on a fully forward connected neural network with cutting conditions and machining time as the inputs and tool flank wear as the output. An extended Kalman filter algorithm is used as the network training algorithm to speed up learning convergence, and the network neuron connections are optimized using a destructive optimization algorithm. Besides performance comparisons with CBN tool wear measurements in hard turning, the proposed tool wear estimator is also evaluated against a multilayer perceptron neural network modeling approach and/or an analytical modeling approach, and it proves faster, more accurate, and more robust. Although this neural network-based estimator is designed for CBN tool wear modeling in this study, it is expected to be applicable to other tool wear modeling applications.