101.
This paper presents a new document representation built from multiple vectorized features, including term frequency and term-connection frequency. Each document is represented by both an undirected and a directed graph, from which terms and vectorized graph connections are extracted using several feature extraction methods. This hybrid feature representation captures underlying semantics that are difficult to reflect with the term histograms in current use, and it facilitates the matching of complex graphs. At the application level, we develop a document retrieval system based on a self-organizing map (SOM) to speed up the retrieval process. Extensive experimental verification suggests that the proposed method is both computationally efficient and accurate for document retrieval.
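To make the two feature types concrete, here is a minimal Python sketch, under our own assumption (the abstract does not fix this choice) that a term connection is an adjacent pair in the token stream: node weights give term frequency, and directed-edge weights give term-connection frequency.

```python
from collections import Counter

def build_term_graph(tokens):
    """Return node weights (term frequency) and directed edge weights
    (term-connection frequency) for one tokenized document."""
    term_freq = Counter(tokens)                          # graph nodes
    connection_freq = Counter(zip(tokens, tokens[1:]))   # directed edges
    return term_freq, connection_freq

tokens = "data mining finds patterns in data".split()
tf, cf = build_term_graph(tokens)
print(tf["data"], cf[("in", "data")])  # 2 1
```

Dropping the pair ordering (e.g., sorting each pair) would give the undirected variant mentioned in the abstract.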
102.
The advent of the Internet has led to significant growth in the amount of information available, resulting in information overload: individuals have too much information on hand to make a decision. To address this problem, collaborative tagging systems form a categorization called a folksonomy to organize web resources. A folksonomy aggregates the results of personal free tagging of information and objects into a categorization structure that harnesses the collective intelligence of crowds, and it is better suited to organizing huge amounts of information on the Web than traditional taxonomies established by expert cataloguers. However, the attributes of collaborative tagging systems and their folksonomies make them impractical for organizing resources in personal environments.

This work designs a desktop collaborative tagging (DCT) system that enables collaborative workers to tag their documents, and proposes an application of the DCT system to patent analysis. The folksonomy in DCT is built by aggregating personal tagging results and is represented by a concept space. Concept spaces provide synonym control, tag recommendation, and relevance search. Additionally, to protect the privacy of authors and to reduce transmission cost, relations between tagged and untagged documents are constructed from extracted document features rather than from the full text.

Experimental results reveal that the adoption rate of recommended tags for new documents increases by 10% after users have tagged five or six documents. Furthermore, DCT recommends tags with higher adoption rates when given new documents whose topics are similar to previously tagged ones. Relevance search in DCT outperforms keyword search when frequently used tags are adopted as queries: the average precision, recall, and F-measure of DCT are 12.12%, 23.08%, and 26.92% higher than those of keyword search.

DCT allows a multi-faceted categorization of resources for collaborative workers and recommends tags to simplify categorization. It also provides relevance search, which is more effective than traditional keyword search for finding personal resources.
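The abstract does not spell out the recommendation mechanics, so the following is a hedged sketch of one plausible reading: recommend the tags of the most similar already-tagged document, with similarity computed on extracted term features rather than full text. The names (`recommend_tags`, `tagged_docs`) are illustrative, not from the paper.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend_tags(new_doc_terms, tagged_docs):
    """tagged_docs: list of (term Counter, tag set); return the tags of
    the most similar previously tagged document."""
    best = max(tagged_docs, key=lambda d: cosine(new_doc_terms, d[0]))
    return best[1]

docs = [(Counter("patent claim filing".split()), {"patent", "legal"}),
        (Counter("neural network training".split()), {"ml"})]
print(recommend_tags(Counter("patent filing date".split()), docs))  # {'patent', 'legal'}
```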
103.
Coastal water mapping from remote-sensing hyperspectral data suffers from poor retrieval performance when the targeted parameters have little effect on subsurface reflectance, especially due to the ill-posed nature of the inversion problem. For example, depth cannot accurately be retrieved for deep water, where the bottom influence is negligible. Similarly, for very shallow water it is difficult to estimate the water quality because the subsurface reflectance is affected more by the bottom than by optically active water components.

Most methods based on radiative transfer model inversion do not consider the distribution of targeted parameters within the inversion process, thereby implicitly assuming that any parameter value in the estimation range has the same probability. In order to improve the estimation accuracy for the above limiting cases, we propose to regularize the objective functions of two estimation methods (maximum likelihood or ML, and hyperspectral optimization process exemplar, or HOPE) by introducing local prior knowledge on the parameters of interest. To do so, loss functions are introduced into ML and HOPE objective functions in order to reduce the range of parameter estimation. These loss functions can be characterized either by using prior or expert knowledge, or by inferring this knowledge from the data (thus avoiding the use of additional information).

This approach was tested both on simulated and real hyperspectral remote-sensing data. We show that the regularized objective functions are more peaked than their non-regularized counterparts when the parameter of interest has little effect on subsurface reflectance. As a result, the estimation accuracy of regularized methods is higher in these limiting cases. In particular, when evaluated on real data, these methods were able to estimate depths up to 20 m, while corresponding non-regularized methods were accurate only up to 13 m on average for the same data.

This approach thus provides a solution to deal with such difficult estimation conditions. Furthermore, because no specific framework is needed, it can be extended to any estimation method that is based on iterative optimization.
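As a toy illustration of the regularization idea (not the paper's ML or HOPE formulation), the sketch below adds a quadratic loss around a locally plausible depth to a spectral-misfit objective. The forward model and all constants are invented for illustration.

```python
import numpy as np

def regularized_objective(theta, r_measured, forward_model, prior_mean, weight):
    """Spectral misfit plus a local prior penalty on the parameter theta."""
    misfit = np.sum((r_measured - forward_model(theta)) ** 2)  # data term
    penalty = weight * (theta - prior_mean) ** 2               # local prior
    return misfit + penalty

# Toy forward model: subsurface reflectance decays with depth (illustrative).
fwd = lambda depth: 0.3 * np.exp(-0.2 * depth) + 0.02
r_obs = fwd(20.0)                      # synthetic observation, true depth 20 m

depths = np.linspace(0, 25, 251)
costs = [regularized_objective(d, r_obs, fwd, prior_mean=19.0, weight=1e-7)
         for d in depths]
print(depths[int(np.argmin(costs))])   # ~19.9: near the true 20 m
```

At such depths the misfit alone is nearly flat, so the penalty term narrows the objective around plausible values, which is the "more peaked" behavior the abstract describes.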
104.
In support of a generalization of systems theory, this paper introduces a new approach to modeling complex distributed systems. It offers an analytic framework for describing the behavior of interactive cyberphysical systems (CPSs), which are networked stationary or mobile information systems responsible for the real-time governance of physical processes whose behaviors unfold in cyberspace. The framework is predicated on a cyberspace-time reference model comprising three spatial dimensions plus time. The spatial domains include geospatial, infospatial, and sociospatial references, the last describing relationships among sovereign enterprises (rational agents) that voluntarily choose to organize and interoperate for individual and mutual benefit through geospatial (physical) and infospatial (logical) transactions. Of particular relevance to CPSs are notions of timeliness and value, particularly as they relate to the real-time governance of physical processes and engagements with other cooperating CPSs. Our overarching interest, as with celestial mechanics, is in the formation and evolution of clusters of cyberspatial objects and the federated systems they form.
105.
Replica Placement Strategies in Data Grid
Replication is a technique used in Data Grid environments to reduce access latency and network bandwidth utilization; it also increases data availability, thereby enhancing system reliability. This research addresses the problem of replication in a Data Grid environment by investigating a set of highly decentralized dynamic replica placement algorithms. The replica placement algorithms are based on heuristics that consider both network latency and user requests when selecting the best candidate sites at which to place replicas. Because of the dynamic nature of the Grid, the sites currently holding replicas may not be the best sites from which to fetch them in subsequent periods. Therefore, a replica maintenance algorithm is proposed that relocates replicas to different sites if the performance metric degrades significantly. The study of our replica placement algorithms is carried out using a model of the EU Data Grid Testbed 1 sites and their associated network geometry [Bell et al., Comput. Appl., 17(4), 2003]. We validate our replica placement algorithms with total file transfer times, the number of local file accesses, and the number of remote file accesses.
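A hedged sketch of the kind of heuristic described above, with an invented linear scoring rule (the paper's exact heuristic is not reproduced here): sites with many user requests and low network latency score highest, and the replica is placed at the top-scoring site.

```python
def best_replica_site(sites, latency_weight=1.0, request_weight=1.0):
    """sites: dict name -> (requests, avg_latency_ms); return best site name.
    Score rewards demand and penalizes latency (illustrative weighting)."""
    def score(item):
        requests, latency = item[1]
        return request_weight * requests - latency_weight * latency
    return max(sites.items(), key=score)[0]

# Hypothetical testbed sites with (request count, average latency in ms).
sites = {"CERN": (120, 5.0), "RAL": (300, 40.0), "NIKHEF": (180, 12.0)}
print(best_replica_site(sites))  # 'RAL': heavy demand outweighs its latency
```

Re-running such a scoring pass periodically, and relocating replicas whose sites have degraded, corresponds to the maintenance step the abstract proposes.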
106.
The present study examines the effects that the temporal and spatial averaging due to the finite size and finite response time of pressure transducers has on pressure measurements in blast wave flow fields generated by milligram charges of silver azide. In such applications, the characteristic time and length scales of the physical process are of the same order of magnitude as the temporal and spatial characteristics of the transducer. The measured pressure values are then spatially and temporally averaged, and parameters important for the assessment of blast effects may not be properly represented in the measured trace. In this study, face-on and side-on pressure transducer setups are considered. In the experiments, face-on and side-on readings at the same distance from the charge, as well as time-resolved optical visualization of the whole flow field, are obtained simultaneously for the same explosive event. The procedure of extracting data from the experimental pressure traces is revisited and discussed in detail. In the numerical modeling part of the study, numerical blast flow fields are generated using an Euler flow solver, and a numerical pressure transducer model is developed to qualitatively simulate the averaging effects. The experimental and numerical data show that the results of pressure measurements in experiments with small charges must be used with great caution: the effective averaging of the pressure signal may lead to a significant underestimation of blast wave intensities. The side-on setup is especially prone to this effect, while the face-on setup provides results close to those obtained from optical records only if the pressure transducer is sufficiently remote from the charge.
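To see why such averaging underestimates peaks, here is a toy Python model — not the paper's Euler solver or transducer model: an idealized modified-Friedlander pressure pulse is smeared by a moving-average window standing in for the gauge's finite temporal response. All constants are illustrative.

```python
import numpy as np

t = np.linspace(0, 1.0, 2001)            # time, ms
t_a, p_peak, t_plus = 0.1, 100.0, 0.3    # arrival time, peak (kPa), positive-phase duration
tau = np.clip((t - t_a) / t_plus, 0, None)
p = np.where(t >= t_a, p_peak * (1 - tau) * np.exp(-tau), 0.0)  # ideal pulse

window = 101                              # ~0.05 ms boxcar: the gauge's response time
p_measured = np.convolve(p, np.ones(window) / window, mode="same")
print(round(p.max(), 1), round(p_measured.max(), 1))  # ~100 vs ~85: peak is clipped
```

Even this crude averaging cuts roughly 15% off the peak; when the pulse duration shrinks toward the gauge's response time, as with milligram charges, the underestimation grows accordingly.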
107.
A sparser but more efficient connection rule (called a bond-cutoff method) for a simplified alpha-carbon coarse-grained elastic network model is presented. One conventional connection rule for elastic network models is the distance-cutoff method, in which virtual springs connect an alpha-carbon with all neighboring alpha-carbons within a predefined cutoff distance. However, although the maximum interaction distance between alpha-carbons is reported to be 7 angstroms, this cutoff value can make the elastic network unstable for many protein structures, so previous studies often used a larger cutoff value (>11 angstroms) to establish a stable elastic network model. To overcome this problem, a connection rule for the backbone model is proposed that satisfies the minimum condition for stabilizing an elastic network. On top of the backbone connections, each type of chemical interaction is considered and added to the elastic network model: disulfide bonds, hydrogen bonds, and salt bridges. In addition, the van der Waals forces between alpha-carbons are modeled using the distance-cutoff method. With the proposed connection rule, one can build an elastic network model with a distance cutoff of less than 7 angstroms, which reveals protein flexibility more sharply. Moreover, the normal modes from the new elastic network model reflect conformational changes of a given protein better than those obtained with the distance-cutoff method. The method also saves computational cost when calculating the normal modes of a given protein structure, because it reduces the total number of connections. As validation, six example proteins are tested; computational times and the overlap values between the conformational change and the infinitesimal motion calculated by normal mode analysis are presented. Animations are also available at the UMass Morph Server (http://biomechanics.ecs.umass.edu/umms.html).
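For reference, a minimal sketch of the baseline distance-cutoff rule that the proposed bond-cutoff method improves on (the bond-cutoff construction itself, with backbone bonds plus chemical interactions, is not reproduced here):

```python
import numpy as np

def distance_cutoff_springs(ca_coords, cutoff=7.0):
    """ca_coords: (N, 3) array of alpha-carbon positions, in angstroms.
    Returns the index pairs (i, j), i < j, that receive a virtual spring."""
    dist = np.linalg.norm(ca_coords[:, None, :] - ca_coords[None, :, :], axis=-1)
    i, j = np.triu_indices(len(ca_coords), k=1)
    return [(int(a), int(b)) for a, b in zip(i, j) if dist[a, b] < cutoff]

# Four alpha-carbons roughly 3.8 angstroms apart along a chain, plus a turn.
coords = np.array([[0.0, 0, 0], [3.8, 0, 0], [7.6, 0, 0], [3.8, 3.8, 0]])
print(distance_cutoff_springs(coords))  # [(0, 1), (0, 3), (1, 2), (1, 3), (2, 3)]
```

The abstract's point is that with small cutoffs this spring set can leave the network under-constrained (unstable); the bond-cutoff method instead guarantees backbone connectivity first, then layers chemical interactions on top.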
108.
The d.c. and a.c. electrical properties were studied for various compositions of SiO/GeO2 co-evaporated thin films carrying aluminium electrodes, in the temperature range 193–413 K. A.c. measurements were made over the frequency range 2×10²–10⁶ Hz. The value of the d.c. activation energy was found to decrease with increasing GeO2 content in the SiO. In the region of high applied field (above 10⁶ V m⁻¹), the conduction mechanism is governed by Schottky emission at the blocking contact. The a.c. electrical conductivity, σ(ω), varies with frequency according to the relation σ(ω) ∝ ω^s, where the exponent s was found to be dependent on temperature and frequency. The a.c. conduction at low temperature was due to an electronic hopping process. The number of localized sites was estimated from the a.c. measurements for different compositions of SiO/GeO2 using the models proposed by Elliott and by Pollak, and the values are compared. The Elliott model satisfactorily accounts for the observed a.c. electrical results. A correlation was found between activation energy, optical band gap, conductivity and number of localized sites for the various compositions of SiO/GeO2 films. The relative dielectric constant, ε_r, and loss factor, tan δ, were found to increase with increasing GeO2 content in the films.
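The exponent s in σ(ω) ∝ ω^s is conventionally extracted as the slope of a log-log plot; here is a short sketch, with synthetic data standing in for measured conductivities over the 2×10²–10⁶ Hz range.

```python
import numpy as np

freq = np.logspace(np.log10(2e2), 6, 20)        # Hz, spanning the measured range
s_true, A = 0.8, 1e-12                          # illustrative exponent and prefactor
sigma = A * (2 * np.pi * freq) ** s_true        # synthetic sigma_ac(omega)

# Linear fit of log(sigma) vs log(omega): the slope is the exponent s.
slope, intercept = np.polyfit(np.log(2 * np.pi * freq), np.log(sigma), 1)
print(round(slope, 3))  # 0.8 — recovered exponent
```

In practice, the fit is done per temperature, which is how the temperature dependence of s reported above is obtained.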
109.
Multimedia systems design generally requires a collaborative effort from a group of designers with a variety of backgrounds and tasks, such as content experts, instructional designers, media specialists, and users. However, the design tools currently available on the market are mainly designed for a single user. Tools intended to support a collaborative design process should coordinate the independent activities of individual designers.

This research investigated support for work groups engaged in designing multimedia systems. Specifically, it discusses a new collaborative design environment, called the KMS (Knowledge Management System)-based design environment, in which multimedia designers can share their design knowledge freely. Using two experimental groups, the research investigated the impacts of the KMS-based design environment on collaborative design activities: knowledge creating, knowledge securing, knowledge distributing, and knowledge retrieving. The findings showed that the KMS-based design environment is a promising environment for collaborative multimedia systems design. More specifically, it supported creating, securing, and retrieving knowledge, but it did not support distributing knowledge. In addition, the research found that social interactions between group members played an important role in the success of collaborative multimedia systems design, that the KMS-based design environment did not support the socialization of group members, and that this inability was linked to its low performance in supporting the knowledge distributing activity. Finally, the research explored the desired features of a collaborative support tool for multimedia systems design.
110.
A new parallel hybrid decision fusion methodology is proposed. It is demonstrated that existing parallel multiple-expert decision combination approaches can be divided into two broad categories based on the decision emphasis they implement. The first category consists of methods implementing computationally intensive decision frameworks that incorporate a priori information about the target task domain and the reliability of the participating experts, while the second encompasses approaches that implement group consensus without assigning any importance to expert reliability and while ignoring other contextual information. The methodology proposed in this paper is a hybridisation of these two approaches and shows significant performance enhancements in terms of higher overall recognition rates along with lower substitution rates. Detailed analysis using two different databases supports this claim. Received January 19, 1999 / Revised March 20, 2000
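A hedged sketch contrasting the two categories and one possible hybridisation. The 0.5 confidence margin and the fallback rule are our illustrative assumptions, not the paper's method.

```python
from collections import Counter

def majority(votes):
    """Category two: group consensus, ignoring expert reliability."""
    return Counter(votes).most_common(1)[0][0]

def weighted(votes, reliabilities):
    """Category one: votes weighted by a priori expert reliability."""
    scores = Counter()
    for label, w in zip(votes, reliabilities):
        scores[label] += w
    return scores

def hybrid(votes, reliabilities, margin=0.5):
    """Illustrative hybrid: trust the weighted decision unless it is too
    close to call, in which case fall back on plain consensus."""
    scores = weighted(votes, reliabilities)
    (top, s1), *rest = scores.most_common(2)
    if rest and s1 - rest[0][1] < margin:
        return majority(votes)
    return top

votes, rel = ["A", "B", "B"], [0.95, 0.60, 0.55]
print(majority(votes), hybrid(votes, rel))  # B B (consensus breaks the near-tie)
```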