Full-text access type
Paid full text | 1,244 articles |
Free | 84 articles |
Free (domestic) | 2 articles |
Subject category
Electrical engineering | 9 articles |
General | 1 article |
Chemical industry | 273 articles |
Metalworking | 21 articles |
Machinery and instruments | 45 articles |
Building science | 62 articles |
Mining engineering | 3 articles |
Energy and power engineering | 96 articles |
Light industry | 120 articles |
Hydraulic engineering | 15 articles |
Petroleum and natural gas | 5 articles |
Radio engineering | 115 articles |
General industrial technology | 222 articles |
Metallurgy | 69 articles |
Nuclear technology | 15 articles |
Automation technology | 259 articles |
Publication year
2024 | 4 articles |
2023 | 13 articles |
2022 | 29 articles |
2021 | 39 articles |
2020 | 34 articles |
2019 | 36 articles |
2018 | 63 articles |
2017 | 36 articles |
2016 | 65 articles |
2015 | 38 articles |
2014 | 63 articles |
2013 | 121 articles |
2012 | 77 articles |
2011 | 101 articles |
2010 | 64 articles |
2009 | 134 articles |
2008 | 79 articles |
2007 | 56 articles |
2006 | 56 articles |
2005 | 29 articles |
2004 | 29 articles |
2003 | 22 articles |
2002 | 21 articles |
2001 | 12 articles |
2000 | 14 articles |
1999 | 7 articles |
1998 | 13 articles |
1997 | 9 articles |
1996 | 3 articles |
1995 | 6 articles |
1994 | 3 articles |
1993 | 3 articles |
1992 | 2 articles |
1991 | 4 articles |
1990 | 5 articles |
1989 | 2 articles |
1988 | 1 article |
1987 | 1 article |
1986 | 3 articles |
1985 | 1 article |
1984 | 3 articles |
1983 | 5 articles |
1982 | 5 articles |
1980 | 2 articles |
1979 | 7 articles |
1978 | 3 articles |
1977 | 1 article |
1975 | 1 article |
1974 | 3 articles |
1973 | 2 articles |
Sort order: 1,330 results found (search time: 15 ms)
21.
This paper presents a wavelet-based kernel Principal Component Analysis (PCA) method for palmprint recognition that integrates the Daubechies wavelet representation of palm images with kernel PCA. Kernel PCA is a technique for nonlinear dimensionality reduction of data with an underlying nonlinear spatial structure. The intensity values of the palmprint image are first normalized using the mean and standard deviation. The palmprint is then transformed into the wavelet domain to decompose the palm image, and the lowest-resolution subband coefficients are chosen to represent the palm. Kernel PCA is then applied to extract nonlinear features from these subband coefficients. Finally, similarity is measured with a weighted Euclidean distance-based nearest-neighbor classifier. Experimental results on the PolyU Palmprint Database demonstrate that the proposed approach is highly competitive with published palmprint recognition approaches.
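As a rough sketch of the front end of such a pipeline, the fragment below shows a one-level Haar LL subband (standing in for the Daubechies decomposition) and the weighted Euclidean nearest-neighbor matcher; the kernel PCA feature-extraction step is omitted for brevity, and all names are illustrative, not from the paper:

```python
def haar_ll(img):
    """One level of a 2-D Haar decomposition: the LL (lowest-resolution)
    subband is the average of each 2x2 block of the image."""
    h, w = len(img), len(img[0])
    return [[(img[r][c] + img[r][c + 1] + img[r + 1][c] + img[r + 1][c + 1]) / 4.0
             for c in range(0, w - 1, 2)]
            for r in range(0, h - 1, 2)]

def weighted_euclidean_nn(query, gallery, weights):
    """Nearest neighbor under a weighted Euclidean distance.
    gallery: list of (feature_vector, label) pairs."""
    best_label, best_dist = None, float("inf")
    for feats, label in gallery:
        d = sum(w * (q - f) ** 2 for w, q, f in zip(weights, query, feats))
        if d < best_dist:
            best_dist, best_label = d, label
    return best_label
```

Repeating `haar_ll` on its own output yields the lowest-resolution subband used as the palm representation.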
22.
23.
A wavelet domain association rules method is proposed for efficient texture characterization. Association rules are used to capture frequently occurring local intensity variations in textures, and the frequency of occurrence of these local patterns within a region serves as the texture feature. Since texture is fundamentally a multi-scale phenomenon, multi-resolution approaches such as wavelets are expected to perform well for texture analysis. This study therefore proposes a new algorithm that uses wavelet domain association rules for texture classification. Essentially, this work extends the earlier work of Rushing et al. [10], [11], who proposed generating intensity domain association rules for efficient texture characterization. Association rules were generated in both the wavelet domain and the intensity (gray-scale) domain for comparison. Rushing et al. [10], [11] demonstrated that intensity domain association rules yield considerably more accurate results than the methods compared in their work. Moreover, the experiments performed here show that wavelet domain association rules are more effective than intensity domain association rules for texture classification, with an overall success rate of about 97%.
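The "frequency of local patterns" idea can be sketched as follows, with the itemset mining of full association rules reduced to a pattern-frequency histogram over quantized coefficients; this is an illustrative simplification, not the paper's algorithm:

```python
from collections import Counter

def quantize(values, levels=4):
    """Map values into integer bins so local patterns become discrete items."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [min(int((v - lo) / span * levels), levels - 1) for v in values]

def pattern_frequencies(coeffs, levels=4):
    """Count 2x2 local patterns in a quantized coefficient matrix and
    return their relative frequencies (the texture feature vector)."""
    flat = [v for row in coeffs for v in row]
    q = quantize(flat, levels)
    h, w = len(coeffs), len(coeffs[0])
    grid = [q[r * w:(r + 1) * w] for r in range(h)]
    counts = Counter(
        (grid[r][c], grid[r][c + 1], grid[r + 1][c], grid[r + 1][c + 1])
        for r in range(h - 1) for c in range(w - 1))
    total = sum(counts.values())
    return {pat: n / total for pat, n in counts.items()}
```

Running this on wavelet subband coefficients versus raw gray levels gives the two feature variants being compared.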
24.
Approximate and exact hybrid algorithms for private nearest-neighbor queries with database protection (cited by 1: 0 self-citations, 1 by others)
Mobile devices with global positioning capabilities allow users to retrieve points of interest (POI) in their proximity. To protect user privacy, it is important not to disclose exact user coordinates to untrusted entities that provide location-based services. Currently, there are two main approaches to protecting the location privacy of users: (i) hiding locations inside cloaking regions (CRs) and (ii) encrypting location data using private information retrieval (PIR) protocols. Previous work focused on finding good trade-offs between the privacy and performance of user-protection techniques, but disregarded the important issue of protecting the POI dataset D. For instance, location cloaking requires large CRs, leading to excessive disclosure of POIs (O(|D|) in the worst case). PIR, on the other hand, reduces this bound to \(O(\sqrt{|D|})\), but at the expense of high processing and communication overhead. We propose hybrid, two-step approaches for private location-based queries that protect both the users and the database. In the first step, user locations are generalized to coarse-grained CRs that provide strong privacy. Next, a PIR protocol is applied with respect to the obtained query CR. To protect against excessive disclosure of POI locations, we devise two cryptographic protocols that privately evaluate whether a point is enclosed inside a rectangular region or a convex polygon. We also introduce algorithms to efficiently support PIR on dynamic POI subsets. We provide solutions for both approximate and exact NN queries. In the approximate case, our method discloses O(1) POIs, orders of magnitude fewer than CR- or PIR-based techniques. For the exact case, we obtain optimal disclosure of a single POI, although with slightly higher computational overhead. Experimental results show that the hybrid approaches are scalable in practice and outperform the pure-PIR approach in terms of computational and communication overhead.
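The geometric predicate at the heart of the enclosure protocols, whether a point lies inside a convex polygon, can be sketched in the clear; the paper evaluates it cryptographically, so this plaintext version only illustrates the test itself:

```python
def in_convex_polygon(pt, poly):
    """Return True if pt lies inside (or on the boundary of) the convex
    polygon `poly`, given as counter-clockwise (x, y) vertices. The point
    is inside iff it lies on the same side of every directed edge."""
    x, y = pt
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Cross product of the edge vector and the point vector: negative
        # means the point is to the right of a CCW edge, i.e. outside.
        if (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1) < 0:
            return False
    return True
```

A rectangular region is the special case of a four-vertex convex polygon.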
25.
A mathematical model of vertical electrical sounding using the resistivity method is studied. The model leads to an inverse problem of determining the unknown leading coefficient (conductivity) of an elliptic equation in a slab in R². The direct problem is posed as a mixed boundary value problem in axisymmetric cylindrical coordinates. The additional (measured) data are given on the upper boundary of the slab in the form of a tangential derivative. Because the inverse problem is ill-conditioned, a logarithmic transformation is applied to the unknown coefficient, and the inverse problem is studied as a minimization problem for a cost functional with respect to the reflection coefficient. The conjugate gradient method (CGM) is applied for the numerical solution of this problem. Computational experiments were performed with both noise-free and randomly perturbed data.
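The CGM iteration used for the minimization can be illustrated on a quadratic model problem; the paper's cost functional is far more involved, so this only shows the iteration structure (minimizing J(x) = ½xᵀAx − bᵀx, i.e. solving Ax = b):

```python
def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=100):
    """Minimize J(x) = 0.5 x^T A x - b^T x for a symmetric positive
    definite matrix A via the conjugate gradient iteration."""
    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]
    def dot(u, v):
        return sum(ui * vi for ui, vi in zip(u, v))

    x = list(x0)
    r = [bi - ai for bi, ai in zip(b, matvec(A, x))]   # residual b - Ax
    p = list(r)                                        # search direction
    rs = dot(r, r)
    for _ in range(max_iter):
        if rs < tol:
            break
        Ap = matvec(A, p)
        alpha = rs / dot(p, Ap)                        # exact line search step
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]  # conjugate direction
        rs = rs_new
    return x
```

For an n-dimensional SPD system the iteration converges in at most n steps in exact arithmetic.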
26.
Recognizing people by gait promises to be useful for identifying individuals at a distance, and improved techniques are under development. In this paper, an improved method for gait recognition is proposed. The binarized silhouette of a moving object is first represented by four 1-D signals, the basic image features called distance vectors. The distance vectors are the differences between the bounding box and the silhouette, extracted using four projections onto the silhouette. The Fourier transform is employed as a preprocessing step to achieve translation invariance for gait patterns accumulated from silhouette sequences extracted from subjects walking at different speeds and/or at different times. Eigenspace transformation is then applied to reduce the dimensionality of the input feature space, and support vector machine (SVM)-based pattern classification is performed in the lower-dimensional eigenspace for recognition. The input feature space is constructed in two alternative ways. In the first approach, the four projections (1-D signals) are classified independently, and a fusion step then produces the final decision. In the second approach, the four projections are concatenated into a single vector, and classification of that vector is performed in the lower-dimensional eigenspace. The experiments are carried out on the best-known public gait databases: the CMU, USF, SOTON, and NLPR human gait databases. To characterize the algorithm's performance thoroughly, the experiments are run and reported with increasing numbers of gait cycles per person available during training. Finally, the performance of the proposed algorithm is compared against published gait recognition approaches.
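A plausible reading of the four distance vectors, per-row distances from the left and right bounding-box edges to the silhouette and per-column distances from the top and bottom edges, can be sketched as below; the exact projection definitions are the paper's, so treat this as an assumption-laden illustration:

```python
def distance_vectors(sil):
    """Extract four 1-D distance signals from a binarized silhouette
    (a list of rows of 0/1): left/right distances per row and
    top/bottom distances per column, measured from the bounding box."""
    rows = [r for r, row in enumerate(sil) if any(row)]
    cols = [c for c in range(len(sil[0])) if any(row[c] for row in sil)]
    top, bottom, left, right = min(rows), max(rows), min(cols), max(cols)

    left_d, right_d, top_d, bottom_d = [], [], [], []
    for r in range(top, bottom + 1):
        on = [c for c in range(left, right + 1) if sil[r][c]]
        left_d.append(min(on) - left if on else right - left + 1)
        right_d.append(right - max(on) if on else right - left + 1)
    for c in range(left, right + 1):
        on = [r for r in range(top, bottom + 1) if sil[r][c]]
        top_d.append(min(on) - top if on else bottom - top + 1)
        bottom_d.append(bottom - max(on) if on else bottom - top + 1)
    return left_d, right_d, top_d, bottom_d
```

Each returned signal would then be Fourier-transformed and projected into the eigenspace before classification.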
27.
Mobility path information of cell phone users plays a crucial role in a wide range of cell phone applications, including context-based search and advertising, early warning systems, and city-wide sensing applications such as air pollution exposure estimation and traffic planning. However, there is a disconnect between the low-level location data logs available from cell phones and the high-level mobility path information required to support these applications. In this paper, we present formal definitions to capture cell phone users' mobility patterns and profiles, and provide a complete framework, Mobility Profiler, for discovering mobile cell phone user profiles starting from cell-based location data. We use real-world cell phone log data (covering over 350,000 hours) to demonstrate our framework and perform experiments on discovering frequent mobility patterns and profiles. Our analysis of cell phone users' mobility profiles exposes a significant long tail in a user's location-time distribution: on average, 15% of a cell phone user's time is spent in locations that each account for less than 1% of the total.
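The long-tail statistic quoted at the end is straightforward to compute from a location-time distribution; a minimal sketch, with the threshold and data structure assumed rather than taken from the paper:

```python
def long_tail_fraction(time_per_location, threshold=0.01):
    """Fraction of total time spent in locations that each account for
    less than `threshold` of the total time, i.e. the mass in the
    long tail of the location-time distribution."""
    total = sum(time_per_location.values())
    if total == 0:
        return 0.0
    tail = sum(t for t in time_per_location.values() if t / total < threshold)
    return tail / total
```

On the paper's data this quantity averaged about 0.15 per user.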
28.
Detecting and tracking ground targets is crucial for military intelligence and battlefield surveillance. Once targets have been detected, the system can proceed to track them, for instance using Ground Moving Target Indicator (GMTI) sensors that observe objects moving in the area of interest. However, when targets move close to each other in formation, such as a convoy, the problem of assigning measurements to targets must be addressed first, as it is an essential step in target tracking. With increasing computational power, it has become possible to use more complex association logic in tracking algorithms. Although finding its optimal solution is provably NP-hard, the multidimensional assignment problem has enjoyed renewed interest, largely due to Lagrangian relaxation approaches to its solution. Recently, randomized heuristic approaches have been reported to surpass the performance of Lagrangian relaxation algorithms, especially on dense problems. In this paper, motivated by the success of randomized heuristic methods, we investigate a different stochastic approach, the biologically inspired ant colony optimization, to solve the NP-hard multidimensional assignment problem for tracking multiple ground targets.
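The pheromone/heuristic mechanics of ant colony optimization can be shown on the 2-D special case of the assignment problem (the paper targets the harder multidimensional version); parameter names and values here are illustrative:

```python
import random

def aco_assignment(cost, ants=10, iters=20, rho=0.1, seed=0):
    """Ant colony optimization for 2-D assignment: match each row to a
    distinct column minimizing total cost. Ants build assignments guided
    by pheromone and the heuristic desirability 1/cost; pheromone
    evaporates at rate rho and is reinforced along the best assignment."""
    rng = random.Random(seed)
    n = len(cost)
    tau = [[1.0] * n for _ in range(n)]       # pheromone trails
    best, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            free = list(range(n))
            assign = []
            for row in range(n):
                weights = [tau[row][c] / (cost[row][c] + 1e-9) for c in free]
                col = rng.choices(free, weights=weights)[0]
                free.remove(col)
                assign.append(col)
            c_total = sum(cost[r][assign[r]] for r in range(n))
            if c_total < best_cost:
                best, best_cost = assign, c_total
        # Evaporate, then reinforce the best-so-far assignment.
        tau = [[(1 - rho) * t for t in row] for row in tau]
        for r, c in enumerate(best):
            tau[r][c] += 1.0 / best_cost
    return best, best_cost
```

In the tracking setting, `cost[r][c]` would be a measurement-to-track association cost, and the assignment would span S > 2 scans.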
29.
One of the important problems in a network is making efficient routing decisions. Many studies have addressed this decision, and several routing algorithms have been developed. In a network environment, every node has a routing table, and these routing tables are used to make routing decisions. Nowadays, intelligent agents are used for this purpose. Such agents have been inspired by social insects such as ants; one such agent type is the self-cloning ant. In this study, a self-cloning ant colony approach is used. Self-cloning ants are a new synthetic ant type: each ant assesses its situation and either multiplies by cloning or destroys itself, and by doing so it makes a routing decision and finds the optimal path. This study explains routing table updating using the self-cloning ant colony approach. The approach has been applied to a real network, and routing tables have been created and updated for every node.
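A minimal sketch of the clone-or-die idea, assuming a weighted adjacency-map graph and names invented for illustration: each ant clones itself onto every neighbor and self-destructs once a cheaper ant has already reached its node, and the surviving ants fill in the routing table:

```python
from collections import deque

def update_routing_table(graph, source):
    """Build source's routing table {destination: (next_hop, cost)} by
    flooding cloning ants. Each ant carries (node, path cost, first hop);
    it clones onto every neighbor, and destroys itself when its path can
    no longer improve the table entry for its node."""
    table = {}
    ants = deque((nbr, w, nbr) for nbr, w in graph[source].items())
    while ants:
        node, cost, first_hop = ants.popleft()
        if node in table and table[node][1] <= cost:
            continue                         # a cheaper ant got here first: die
        table[node] = (first_hop, cost)      # record the better route
        for nbr, w in graph[node].items():   # clone onto every neighbor
            if nbr != source:
                ants.append((nbr, cost + w, first_hop))
    return table
```

With positive link costs this converges to shortest-path next hops, much like a queue-based Bellman-Ford relaxation.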
30.
Anthony Etuk, Timothy J. Norman, Murat Şensoy, Mudhakar Srivatsa. Autonomous Agents and Multi-Agent Systems, 2017, 31(3): 531-560
The presence of numerous and disparate information sources available to support decision-making calls for efficient methods of harnessing their potential. Information sources may be unreliable, and misleading reports can affect decisions. Existing trust and reputation mechanisms typically rely on reports from as many sources as possible to mitigate the influence of misleading reports on decisions. In the real world, however, querying information sources is often costly in terms of energy, bandwidth, delay overheads, and other constraints. We present a model of source selection and fusion in resource-constrained environments, where there is uncertainty regarding the trustworthiness of sources. We exploit diversity among sources to stratify them into homogeneous subgroups, both to minimise redundant sampling and to mitigate the effect of certain biases. Through controlled experiments, we demonstrate that a diversity-based approach is robust to biases introduced by dependencies among source reports, performs significantly better than existing approaches when the sampling budget is limited, and performs equally well with an unlimited budget.
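The stratify-then-sample idea can be sketched as follows, with sources grouped by a rounded behavior signature and the budget spent one source per stratum before fusing by averaging; the signature, sampling policy, and fusion rule are all assumptions for illustration, not the paper's model:

```python
import random

def stratify(sources, signature, ndigits=1):
    """Group sources into strata of approximately homogeneous behavior
    by rounding their report signature."""
    strata = {}
    for s in sources:
        strata.setdefault(round(signature(s), ndigits), []).append(s)
    return list(strata.values())

def diversity_sample_and_fuse(sources, signature, report, budget, seed=0):
    """Spend the sampling budget across strata (one source per stratum)
    instead of redundantly querying similar sources, then fuse the
    collected reports by averaging."""
    rng = random.Random(seed)
    strata = stratify(sources, signature)
    picked = []
    while len(picked) < budget and strata:
        stratum = strata.pop(0)
        picked.append(rng.choice(stratum))   # one representative per stratum
    reports = [report(s) for s in picked]
    return sum(reports) / len(reports)
```

Sampling one source per stratum is what keeps correlated (dependent) sources from dominating the fused estimate.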