Full-text access type
Paid full text | 96123 articles |
Free | 1229 articles |
Domestic free | 441 articles |
Subject classification
Electrical engineering | 956 articles |
General | 2332 articles |
Chemical industry | 13222 articles |
Metalworking | 5031 articles |
Machinery and instruments | 3315 articles |
Building science | 2627 articles |
Mining engineering | 583 articles |
Energy and power | 1340 articles |
Light industry | 5194 articles |
Hydraulic engineering | 1369 articles |
Petroleum and natural gas | 369 articles |
Radio engineering | 10566 articles |
General industrial technology | 18015 articles |
Metallurgical industry | 5711 articles |
Atomic energy technology | 420 articles |
Automation technology | 26743 articles |
Publication year
2022 | 108 articles |
2021 | 196 articles |
2020 | 117 articles |
2019 | 171 articles |
2018 | 14601 articles |
2017 | 13515 articles |
2016 | 10160 articles |
2015 | 734 articles |
2014 | 491 articles |
2013 | 779 articles |
2012 | 3557 articles |
2011 | 9927 articles |
2010 | 8681 articles |
2009 | 5945 articles |
2008 | 7219 articles |
2007 | 8212 articles |
2006 | 515 articles |
2005 | 1601 articles |
2004 | 1435 articles |
2003 | 1479 articles |
2002 | 822 articles |
2001 | 389 articles |
2000 | 457 articles |
1999 | 375 articles |
1998 | 1058 articles |
1997 | 653 articles |
1996 | 512 articles |
1995 | 357 articles |
1994 | 294 articles |
1993 | 302 articles |
1992 | 173 articles |
1991 | 163 articles |
1990 | 148 articles |
1989 | 150 articles |
1988 | 146 articles |
1987 | 110 articles |
1986 | 117 articles |
1985 | 158 articles |
1984 | 109 articles |
1983 | 98 articles |
1982 | 73 articles |
1981 | 98 articles |
1980 | 92 articles |
1979 | 94 articles |
1977 | 113 articles |
1976 | 161 articles |
1973 | 62 articles |
1968 | 66 articles |
1955 | 66 articles |
1954 | 69 articles |
Sort order: 10000 query results found, search time 0 ms
991.
This paper presents an automatic stock portfolio selection system. In the proposed approach, 53 financial indices are collected for each stock item and consolidated into six financial ratios, termed Grey relational grades (GRGs), using a Grey relational analysis model. The GRGs are processed using a modified form of the PBMF index method (designated the Huang index function) to determine the optimal number of clusters per GRG. The resulting cluster indices are then processed using rough set theory to identify the stocks within the lower approximate sets. Finally, the GRGs of each stock item in the lower approximate sets are consolidated into a single GRG indicating the stock item's ability to maximize the rate of return. The proposed stock selection mechanism is shown to yield a higher rate of return than several existing portfolio selection systems.
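The abstract does not spell out the Grey relational analysis formulas; as a minimal sketch, assuming the classic Deng formulation with distinguishing coefficient ρ = 0.5, consolidating indicator sequences into Grey relational grades could look like:

```python
def grey_relational_grades(reference, candidates, rho=0.5):
    """Grey relational grade (GRG) of each candidate sequence against a
    reference sequence, using Deng's classic formulation."""
    # Pointwise absolute deviations between reference and each candidate.
    deltas = [[abs(r - c) for r, c in zip(reference, cand)] for cand in candidates]
    d_min = min(min(row) for row in deltas)
    d_max = max(max(row) for row in deltas)
    grades = []
    for row in deltas:
        # Grey relational coefficient at each point, averaged into one grade.
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades
```

A higher grade indicates behaviour closer to the reference sequence; in the paper's setting, grades over the six ratios would then feed the clustering and rough-set stages.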
992.
J. Christopher Westland, Information Technology and Management, 2011, 12(4): 387-408
There is no agreement on how to formally incorporate affective data into statistical analysis and research conclusions. The information systems (IS) literature has, however, recently published several position papers that establish a framework and perspective for using affective technology in IS research. The frameworks have not been extensively tested and are likely to evolve over time as empirical studies are conducted and the validity of the methodologies is confirmed or disproved. A major goal of the current paper is to take the initial steps in translating the frameworks into usable methodologies, with application to improving our understanding of how to make effective empirical tests. This paper also investigates the adoption cycle of one of these technologies, electrodermal response (EDR) technology, whose incarnation in the polygraph for forensic applications went through a complete adoption cycle in the twentieth century. The use of EDR data in marketing research and surveys is nascent, but prior experience can help us forecast and encourage its adoption in new research contexts. This research investigates three key questions: (1) What technology adoption model is appropriate for electrodermal response technology in forensic science? (2) What is the accuracy of affective electrodermal response readings? (3) What information is useful after superimposing affective EDR readings on contemporaneous survey data collection? Affective data acquisition technologies appear to add the most information when survey subjects are inclined to lie and have strong emotional feelings. Such data streams are informative, non-invasive, and cost-effective. Informativeness is context-dependent, however, and relies on a complex set of still poorly understood human factors. Survey protocols and statistical analysis methods need to be developed to address these challenges.
993.
Research on group decision-making about emergency events based on network technology (Times cited: 1; self-citations: 1; cited by others: 0)
Kefan Xie, Gang Chen, Qian Wu, Yang Liu, Pan Wang, Information Technology and Management, 2011, 12(2): 137-147
To improve the efficiency of decision-making about emergency events, this paper proposes a novel concept, the Agile-Delphi Method, an integration of agile decision-making and the Delphi Method in which decision-makers instantly deliver, respond to, process, and utilize information via the Delphi process while conducting group decision-making about an emergency event. The paper details the mechanism of group decision-making about emergency events based on network technology and the Agile-Delphi Method. Finally, the paper conducts an empirical analysis using the "111 event" as an example, i.e., the liquid ammonia spill that occurred on November 1, 2006 at a phosphorus chemical company in China.
994.
Angsana A. Techatassanasoontorn, Shuguang Suo, Information Technology and Management, 2011, 12(4): 357-385
In the IT industry, de facto standards emerge from standards competition as firms offer incompatible technologies, and user choices determine the outcome of the competition. The standards literature suggests that strong network effects create a bias toward a standard with a large installed base, leading to a winner-take-all outcome. More recently, several researchers have revealed that the dynamics of standardization are much more complex than the explanation offered by the economic theory of networks. Markets do not always exhibit tipping behavior, so there is not always a single winner in de facto standardization, and the size of the overall installed base does not always exert a strong influence on adoption decisions. In contrast, network effects drawn from local social influence may be more salient to user adoption decisions. We ask: (1) Do we always observe a winner-take-all outcome in de facto standards competition? (2) What are the different technology adoption patterns observed in de facto standards competition? (3) What are the implications of network effects, switching costs, pricing, and functionality-enhancement strategies for the outcome of de facto standards competition in different user network structures? Drawing on the economic theory of networks, complex network theory, and previous work in the standards literature, we examine the influence of network effects, switching costs, price, and technology functionality on user adoption decisions using agent-based simulation. We incorporate underlying user network structures frequently observed in the real world as an important determinant of user adoption decisions. Our results suggest that the de facto standardization process does not always follow a three-phase S-shaped pattern. Winner-take-all is not a necessary outcome of standards competition. User network structures have a significant impact on the dynamics and outcomes of standards competition.
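The authors' simulation is not reproduced here; the following toy sketch (ring network, majority-of-neighbours adoption rule, all parameters hypothetical) illustrates the kind of agent-based dynamics in which local network effects can let incompatible standards coexist rather than tip to a single winner:

```python
import random

def simulate_standards(n=100, steps=50, seed=1):
    """Toy agent-based sketch: agents on a ring adopt standard A or B
    based on local network effects (majority among self and two neighbours).
    Returns the final installed-base share of standard B."""
    rng = random.Random(seed)
    choice = [rng.randint(0, 1) for _ in range(n)]  # 0 = standard A, 1 = standard B
    for _ in range(steps):
        nxt = choice[:]
        for i in range(n):
            # Local installed base: the agent itself plus its two ring neighbours.
            votes = choice[i - 1] + choice[i] + choice[(i + 1) % n]
            nxt[i] = 1 if votes >= 2 else 0
        choice = nxt
    return sum(choice) / n
```

Because each agent sees only its immediate neighbours, stable clusters of both standards can survive, echoing the paper's finding that winner-take-all is not a necessary outcome.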
995.
Modeling spatially distributed phenomena in terms of their controlling factors is a recurring problem in geoscience. Most efforts concentrate on predicting the value of a response variable in terms of controlling variables, either through a physical model or a regression model. However, many geospatial systems comprise complex, nonlinear, and spatially non-uniform relationships, making it difficult even to formulate a viable model. This paper focuses on the spatial partitioning of controlling variables that are attributed to a particular range of a response variable. The presented method thus surveys spatially distributed relationships between predictors and response. The method is based on the association analysis technique of identifying emerging patterns, which is extended so it can be applied more effectively to geospatial data sets. The outcome of the method is a list of spatial footprints, each characterized by a unique "controlling pattern": a list of specific values of predictors that locally correlate with a specified value of the response variable. Mapping the controlling footprints reveals the geographic regionalization of the relationship between predictors and response. The data mining underpinnings of the method are given, and its application to a real-world problem is demonstrated using an expository example focused on determining the variety of environmental associations of high vegetation density across the continental United States.
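The paper's geospatial extension is not given in the abstract; a minimal sketch of the underlying idea, emerging patterns as itemsets whose support grows by at least a given growth rate from a background class to a target class (function name and thresholds hypothetical), is:

```python
from itertools import combinations

def emerging_patterns(target, background, min_growth=2.0, max_len=2):
    """Find itemsets (up to max_len items) whose support in `target`
    exceeds their support in `background` by at least `min_growth`.
    Returns {pattern: growth rate}; inf marks jumping emerging patterns."""
    def support(itemset, rows):
        return sum(1 for r in rows if itemset <= r) / len(rows)

    items = sorted(set().union(*target))
    patterns = {}
    for k in range(1, max_len + 1):
        for combo in combinations(items, k):
            s = frozenset(combo)
            st, sb = support(s, target), support(s, background)
            if st > 0 and (sb == 0 or st / sb >= min_growth):
                patterns[tuple(combo)] = float('inf') if sb == 0 else st / sb
    return patterns
```

In the paper's setting, the "transactions" would be discretized predictor values at grid cells, split into a target class (e.g., high vegetation density) and a background class.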
996.
997.
Leo Freitas, John McDermott, International Journal on Software Tools for Technology Transfer (STTT), 2011, 13(5): 463-489
This paper reports on the Xenon project's use of formal methods. Xenon is a higher-assurance secure hypervisor based on re-engineering the Xen open-source hypervisor. The Xenon project used formal specifications both for assurance and as guides for security re-engineering. We formally modelled the fundamental definition of security, the hypercall interface behaviour, and the internal modular design. We used three formalisms for this work: CSP, Z, and Circus. Circus is a combination of standard Z and CSP, with its semantics given in Hoare and He's Unifying Theories of Programming, and is suited to both event-based and state-based modelling. Here, we report our experiences to date with using these formalisms for assurance.
998.
Kunal Sain, Abhishek Dasgupta, Utpal Garain, International Journal on Document Analysis and Recognition, 2011, 14(1): 75-85
This paper attempts a performance evaluation of mathematical expression recognition systems. The proposed method assumes that expressions (input as well as recognition output) are coded in MathML or TeX/LaTeX (the latter also being converted into MathML). Since any MathML representation follows a tree structure, performance evaluation is modeled as a tree-matching problem. The tree corresponding to the expression generated by the recognizer is compared with the ground-truthed one by comparing the corresponding Euler strings. The changes required to convert the recognizer's tree into the ground-truthed one are noted; the number of such changes is the distance between the trees, and this distance gives the performance measure for the system under test. The proposed algorithm also pinpoints the positions of the changes in the output MathML file. Testing of the proposed evaluation method considers a set of ground-truthed example expressions and their corresponding recognition results produced by an expression recognition system.
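The exact tree-distance algorithm is not given in the abstract; as a simplified, hypothetical sketch, serializing each MathML-like tree into its Euler string and taking the edit distance between the two strings captures the idea of counting the changes needed to turn the recognizer's tree into the ground truth:

```python
def euler_string(node):
    """Serialize an (label, children) tree into its Euler string:
    one entry token and one exit token per node, in traversal order."""
    label, children = node
    out = [('+', label)]
    for child in children:
        out.extend(euler_string(child))
    out.append(('-', label))
    return out

def edit_distance(a, b):
    """Classic Levenshtein distance over token sequences."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,       # delete
                           dp[i][j - 1] + 1,       # insert
                           dp[i - 1][j - 1] + cost)  # substitute
    return dp[m][n]
```

Note this string edit distance is only a proxy (a known lower bound) for the true tree edit distance; the paper's method additionally localizes the changes in the output MathML.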
999.
This paper proposes a multi-section vector quantization approach for on-line signature recognition. We have used a database of 330 users, which includes 25 skilled forgeries performed by 5 different impostors. This database is larger than those typically used in the literature; nevertheless, we also provide results on the SVC database. Our proposed system obtains results similar to those of the state-of-the-art online signature recognition algorithm, Dynamic Time Warping, with a computational requirement roughly 47 times lower. In addition, our system reduces database storage requirements thanks to vector compression, and it is more privacy-friendly because the original signature cannot be recovered from the codebooks. Experimental results reveal that the proposed multi-section vector quantization achieves a 98% identification rate, with a minimum Detection Cost Function value of 2.29% for random forgeries and 7.75% for skilled forgeries.
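For reference, the Dynamic Time Warping baseline the paper compares against can be sketched in its textbook form (1-D sequences with absolute-difference local cost; real signature systems operate on multidimensional pen trajectories):

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences:
    minimum cumulative local cost over all monotonic alignments."""
    inf = float('inf')
    m, n = len(a), len(b)
    dp = [[inf] * (n + 1) for _ in range(m + 1)]
    dp[0][0] = 0.0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible predecessor paths.
            dp[i][j] = cost + min(dp[i - 1][j], dp[i][j - 1], dp[i - 1][j - 1])
    return dp[m][n]
```

DTW's quadratic cost per comparison is what makes the paper's roughly 47-fold cheaper vector-quantization alternative attractive.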
1000.
The introduction of the Internet and information and communication technologies (ICT) in offices is a global phenomenon that has transformed white-collar workers' job demands. Although there are several studies of e-skills and mental workload for central countries, similar studies are lacking for the Latin American context. An online snowball-sampled survey (n = 352) was developed and validated by the authors (internal consistency = 0.7). We characterized ICT worker profiles from e-skills and three dimensions: attitudes toward technology, resource usage, and technology dependency. Mental strain was assessed with the raw task load index (RTLX) and correlated with the proposed profiles by means of paired t-tests and Mann-Whitney tests. The sample comprised 7.2% non-visual-display-terminal (VDT) users and 92.8% VDT, ICT-skilled users. Of the latter, 30.7% were ICT practitioners, 30.4% were ICT users, and 27.2% were e-business users. Non-VDT users' mental strain was statistically significantly smaller than that of VDT, ICT-skilled users. No statistical differences were found in RTLX results when comparing ICT-skilled user profiles. Non-VDT users can be distinguished from ICT-skilled users by their lower ICT dependency and lesser use of ICT resources; there were no differences in those dimensions among ICT-skilled profiles. Attitude toward these technologies was a distinguishing factor for ICT users relative to ICT practitioners and ICT business users. The application of this tool in peripheral and central countries would allow a complete ergonomic characterization of white-collar workers within the information society.
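The raw task load index used here is simply the unweighted mean of the six NASA-TLX subscale ratings; a minimal sketch (dictionary key names hypothetical) is:

```python
def raw_tlx(ratings):
    """Raw Task Load Index (RTLX): the unweighted mean of the six
    NASA-TLX subscale ratings, each typically on a 0-100 scale."""
    expected = ('mental', 'physical', 'temporal',
                'performance', 'effort', 'frustration')
    missing = [k for k in expected if k not in ratings]
    if missing:
        raise ValueError(f'missing subscales: {missing}')
    return sum(ratings[k] for k in expected) / len(expected)
```

Unlike the full NASA-TLX, RTLX skips the pairwise-comparison weighting step, which is what makes it practical in large online surveys like this one.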