991.
The quick advance in image/video editing techniques has enabled people to synthesize realistic images and videos conveniently. Legal issues may arise when a tampered image cannot be distinguished from a real one by visual examination. In this paper, we focus on JPEG images and propose detecting tampered images by examining the double quantization effect hidden among the discrete cosine transform (DCT) coefficients. To our knowledge, our approach is the only one to date that can automatically locate the tampered region, and it has several additional advantages: fine-grained detection at the scale of 8×8 DCT blocks, insensitivity to different kinds of forgery methods (such as alpha matting and inpainting, in addition to simple image cut/paste), the ability to work without fully decompressing the JPEG images, and fast speed. Experimental results on JPEG images are promising.
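The double quantization effect the abstract refers to can be illustrated with a toy simulation (a sketch only; the Laplacian coefficient model and the quantization steps 7 and 5 are illustrative assumptions, not the paper's parameters). Quantizing DCT coefficients twice with different steps leaves periodic gaps in the coefficient histogram that single quantization does not:

```python
import numpy as np

def quantize(coeffs, step):
    # JPEG-style quantization followed by dequantization
    return np.round(coeffs / step) * step

rng = np.random.default_rng(0)
# AC DCT coefficients of natural images are roughly Laplacian-distributed
coeffs = rng.laplace(scale=20.0, size=100_000)

single = quantize(coeffs, 5)               # compressed once with step 5
double = quantize(quantize(coeffs, 7), 5)  # step 7 first, then re-saved with step 5

# Single quantization populates every multiple of 5 near the origin, but
# double quantization leaves periodic gaps: e.g. the value 10 never occurs,
# because no multiple of 7 rounds to 10 under step-5 requantization.
gap_is_empty = (double == 10).sum() == 0
```

A detector can scan per-block DCT histograms for exactly this kind of periodic emptiness to flag regions whose compression history differs from the rest of the image.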
992.
In overlay networks, the network characteristics before and after a vertical handoff can be drastically different. Consequently, in this paper, we propose an end-to-end scheme to support protocol and application adaptation in vertical handoffs. First, we propose a Vertical-handoff Aware TCP, called VA-TCP. VA-TCP can identify packet losses caused by vertical handoffs; if segment losses are due to a vertical handoff, VA-TCP retransmits only the missing segments without invoking the congestion control procedure. Moreover, VA-TCP dynamically estimates the bandwidth and round-trip time of the new network and adjusts its parameters accordingly. Second, during a vertical handoff, applications also need to adapt. We therefore design a programming interface that allows applications to be notified of, and adapt to, changing network environments. To support this interface, we use the signal mechanism for kernel-to-user notification; since signals cannot carry information, we implement a shared memory mechanism between applications and the kernel to facilitate parameter exchange. Finally, we provide a handoff-aware CPU scheduler so that processes interested in the vertical-handoff event are given preference over other processes, ensuring a prompt response to new network conditions. We have implemented a prototype system on the Linux kernel 2.6. The experimental results show that the proposed protocol and application adaptation mechanisms effectively improve the performance of TCP and applications during vertical handoffs. Copyright © 2008 John Wiley & Sons, Ltd.
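The bandwidth/RTT-driven parameter adjustment can be sketched as follows. This is an assumption-laden illustration, not VA-TCP's kernel code: the function name, the bandwidth-delay-product reseeding rule, and the RTO floor are all ours.

```python
def reset_after_handoff(est_bw_bps, est_rtt_s, mss_bytes=1460):
    """After a vertical handoff, re-seed TCP parameters from the new path's
    estimated bandwidth-delay product instead of invoking congestion control."""
    bdp_bytes = est_bw_bps / 8 * est_rtt_s          # bandwidth-delay product
    cwnd = max(1, int(bdp_bytes // mss_bytes))      # congestion window (segments)
    return {"cwnd": cwnd, "ssthresh": cwnd, "rto_s": max(1.0, 2 * est_rtt_s)}

# e.g. handing off onto a 10 Mb/s path with 100 ms round-trip time
params = reset_after_handoff(10_000_000, 0.1)
```

The point of reseeding from fresh estimates, rather than halving the old window, is that the pre-handoff window may be wildly wrong for the new link in either direction.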
993.
Geometric Mean for Subspace Selection
Subspace selection approaches are powerful tools in pattern classification and data visualization. One of the most important is the linear dimensionality reduction step in Fisher's linear discriminant analysis (FLDA), which has been successfully employed in many fields such as biometrics, bioinformatics, and multimedia information management. However, this step has a critical drawback: for a classification task with c classes, if the dimension of the projected subspace is strictly lower than c − 1, the projection tends to merge classes that are close together in the original feature space. If separate classes are sampled from Gaussian distributions, all with identical covariance matrices, then the linear dimensionality reduction step in FLDA maximizes the mean value of the Kullback-Leibler (KL) divergences between different classes. Based on this viewpoint, this paper studies the geometric mean for subspace selection. Three criteria are analyzed: 1) maximization of the geometric mean of the KL divergences, 2) maximization of the geometric mean of the normalized KL divergences, and 3) the combination of 1) and 2). Preliminary experimental results on synthetic data, the UCI Machine Learning Repository, and handwritten digits show that the third criterion is a promising discriminative subspace selection method that significantly reduces the class separation problem compared with the linear dimensionality reduction step in FLDA and several of its representative extensions.
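For equal-covariance Gaussian classes, the pairwise KL divergence after projection has a closed form, so criterion 1) can be sketched directly. This is a minimal illustration: the class means, covariance, and projection below are made-up toy values, not the paper's experiments.

```python
import numpy as np
from itertools import combinations

def projected_kl_divergences(means, cov, W):
    """KL(i||j) for equal-covariance Gaussians projected onto the columns of W:
    KL = 0.5 * d^T S^{-1} d, where d = W^T(mu_i - mu_j) and S = W^T cov W."""
    S_inv = np.linalg.inv(W.T @ cov @ W)
    kls = []
    for i, j in combinations(range(len(means)), 2):
        d = W.T @ (means[i] - means[j])
        kls.append(0.5 * d @ S_inv @ d)
    return np.array(kls)

def geometric_mean(values):
    # Criterion 1): score a candidate subspace by the geometric mean of its KLs
    return float(np.exp(np.mean(np.log(values))))

means = [np.array([0., 0., 0.]), np.array([2., 0., 0.]), np.array([0., 2., 0.])]
cov = np.eye(3)
W = np.eye(3)[:, :2]          # toy projection onto the first two axes
score = geometric_mean(projected_kl_divergences(means, cov, W))
```

Because the geometric mean is dragged down by any single small divergence, maximizing it penalizes a subspace that collapses even one pair of classes, which is exactly the failure mode of the arithmetic-mean objective described above.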
994.
Design of DL-based certificateless digital signatures
Public-key cryptosystems that do not require digital certificates are very attractive in wireless communications, owing to the limited communication bandwidth and computational resources of mobile devices. To eliminate public-key digital certificates, Shamir introduced the concept of the identity-based (ID-based) cryptosystem. Its main advantage is that instead of using a random integer as each user's public key, as in traditional public-key systems, the user's real identity, such as a name or email address, becomes the public key. However, all identity-based signature (IBS) schemes have an inherent key escrow problem: the private key generator (PKG) knows the private key of each user and can therefore sign any message on a user's behalf. This violates the "non-repudiation" requirement of digital signatures. To solve the key escrow problem while retaining the benefits of the IBS, certificateless digital signatures (CDS) were introduced. In this paper, we propose a generalized approach to constructing CDS schemes. In our scheme, the user's private key is known only to the user, which eliminates the key escrow problem at the PKG. The construction can be applied to any Discrete Logarithm (DL)-based signature scheme to convert it into a CDS scheme. The proposed CDS scheme is secure against adaptive chosen-message attack in the random oracle model, and it is efficient in both signature generation and verification.
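As a concrete example of the DL-based building block such constructions start from, a textbook Schnorr signature looks like this. This is a sketch of the underlying primitive only, not the proposed CDS scheme, and the toy 11-bit group is for illustration; real deployments use groups of 2048 bits or more, or elliptic curves.

```python
import hashlib
import secrets

# Toy discrete-log parameters: p = 2q + 1, g generates the order-q subgroup.
p, q, g = 2039, 1019, 4

def H(r: int, msg: bytes) -> int:
    # Hash the commitment and message into an exponent (random oracle stand-in)
    data = r.to_bytes(2, "big") + msg
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def keygen():
    x = secrets.randbelow(q - 1) + 1   # private key
    return x, pow(g, x, p)             # (sk, pk = g^x mod p)

def sign(x, msg):
    k = secrets.randbelow(q - 1) + 1   # per-signature nonce
    r = pow(g, k, p)                   # commitment
    e = H(r, msg)                      # challenge
    s = (k + x * e) % q                # response
    return e, s

def verify(y, msg, sig):
    e, s = sig
    r = (pow(g, s, p) * pow(y, -e, p)) % p   # recompute g^k = g^s * y^(-e)
    return H(r, msg) == e

sk, pk = keygen()
sig = sign(sk, b"hello")
```

A certificateless variant splits the signing key between a PKG-issued partial key and a user-chosen secret, so that neither party alone can produce the user's signatures; the Schnorr equations above are the kind of DL machinery such a split is layered onto.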
995.
The evolution of computer science and technology has brought new opportunities for multidisciplinary designers and engineers to collaborate in a concurrent and coordinated manner. The development of computational agents with unified data structures and software protocols contributes to a new way of working in collaborative design, which is increasingly becoming international practice. In this paper, based on an analysis of the dynamic nature of the collaborative design process, a new framework for collaborative design is described. The framework adopts an agent-based approach and places designers, managers, systems, and the supporting agents within a unified knowledge representation scheme for product design. To model the constantly evolving design process and the rationales resulting from design collaboration, a Collaborative Product Data Model (CPDM) and a constraint-based Collaborative Design Process Model (CDPM) are proposed to facilitate the management and coordination of the collaborative design process as well as design knowledge management. A prototype system of the proposed framework is implemented, and its feasibility is evaluated on a real design scenario: designing a dining table and chairs.
996.
997.
The Google Earth search function was used to study the impact of small-scale spatial ability, large-scale environmental cognition, and geographical knowledge on new technology usage. The participants were 153 junior high school students from central Taiwan. Geography grades served as indicators of prior knowledge; mental rotation and abstract reasoning skills as indicators of spatial ability; and sketch maps of school neighborhoods as indicators of environmental cognition (including landmark representation, intersection representation, and frame of reference). Finally, the authors distributed a landmark-searching worksheet and asked the participants to complete 16 familiar- and unfamiliar-landmark search tasks using Google Earth with the keyword search function disabled. The results showed that the strongest predictor of landmark-search performance was 'frame of reference' in environmental cognition, followed by 'mental rotation' in spatial ability, 'landmark representation' in environmental cognition, and geographical knowledge. Google Earth landmark searches require complex cognitive processing; we therefore conclude that GIS-supported image search activities give students good practice in active knowledge construction.
998.
In this paper, a novel clustering method in the kernel space is proposed. It integrates several existing algorithms into an iterative clustering scheme that can handle clusters with arbitrary shapes. In our approach, a reasonable initial core for each cluster is estimated. This allows us to adopt a cluster-growing technique, and the growing cores offer partial hints on cluster association; consequently, methods used for classification, such as support vector machines (SVMs), become applicable. To obtain initial clusters effectively, the incomplete Cholesky decomposition is adopted so that fuzzy c-means (FCM) can partition the data in a kernel-defined feature space. Then a one-class and a multiclass soft-margin SVM are adopted to detect the data within the main distributions (the cores) of the clusters and to repartition the data into new clusters iteratively. The structure of the data set is explored by pruning the data in the low-density regions of the clusters; data are then gradually added back to the main distributions to assure exact cluster boundaries. Unlike the ordinary SVM algorithm, whose performance relies heavily on kernel parameters given by the user, in our approach the parameters are estimated naturally from the data set. Experimental evaluations on two synthetic data sets and four University of California, Irvine real-data benchmarks indicate that the proposed algorithms outperform several popular clustering algorithms, such as FCM, support vector clustering (SVC), hierarchical clustering (HC), self-organizing maps (SOM), and non-Euclidean-norm fuzzy c-means (NEFCM). © 2009 Wiley Periodicals, Inc.
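The FCM step used for the initial partition can be sketched in its plain, non-kernel form. This is an illustrative re-implementation under our own assumptions, not the authors' code; the kernel-space variant replaces the Euclidean distances below with kernel-induced ones obtained via the incomplete Cholesky factorization.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means: alternate weighted-mean centers and the
    standard membership update u_ik ∝ d_ik^(-2/(m-1))."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))        # memberships, rows sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]  # fuzzily weighted means
        d2 = ((X[:, None, :] - centers[None]) ** 2).sum(-1) + 1e-12
        inv = d2 ** (-1.0 / (m - 1))
        U = inv / inv.sum(axis=1, keepdims=True)      # membership update
    return centers, U

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (50, 2)),           # two well-separated blobs
               rng.normal(5, 0.1, (50, 2))])
centers, U = fuzzy_cmeans(X, c=2)
labels = U.argmax(axis=1)
```

Soft memberships are what make the subsequent core-growing step natural: points with high membership in one cluster form its core, while low-density, ambiguous points are the ones pruned and later re-assigned by the SVMs.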
999.
Data hiding is a technique used to embed secret information into a cover medium. It has been widely used for protecting copyright and transmitting sensitive data over insecure channels. Conventional data hiding schemes focus only on reducing the distortion introduced when sensitive data is embedded into the cover image. However, the transmitted images may be compressed or suffer transmission errors, in which case the receiver cannot extract the correct information from the stego-image. In this paper, we propose a novel data hiding scheme with distortion tolerance. The proposed scheme not only prevents the quality of the processed image from being seriously degraded, but also achieves distortion tolerance. Experimental results show that the proposed scheme obtains good image quality and is superior to other schemes in terms of distortion tolerance.
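The basic embedding idea can be illustrated with plain least-significant-bit (LSB) substitution. Note this is the textbook baseline that the abstract's "conventional" schemes build on, not the paper's distortion-tolerant scheme: a single flipped bit in transit already corrupts the extracted payload, which is precisely the problem the paper targets.

```python
import numpy as np

def embed_lsb(cover, bits):
    """Write secret bits into the LSBs of the first len(bits) pixels."""
    stego = cover.copy().ravel()
    stego[:len(bits)] = (stego[:len(bits)] & 0xFE) | bits  # clear LSB, set bit
    return stego.reshape(cover.shape)

def extract_lsb(stego, n_bits):
    # Recover the payload by reading the LSBs back
    return stego.ravel()[:n_bits] & 1

cover = np.arange(64, dtype=np.uint8).reshape(8, 8)        # toy 8x8 "image"
secret = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
stego = embed_lsb(cover, secret)
```

Each pixel changes by at most 1 gray level, so the stego-image is visually indistinguishable from the cover; robustness to compression or bit errors requires the redundancy-based techniques the paper proposes.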
1000.
Autonomous Vehicle Parking Using Hybrid Artificial Intelligent Approach
This paper presents the design and implementation of a hybrid artificial-intelligence control scheme that enables a car-like vehicle to perform optimal parking. The parallel parking control scheme addresses three issues: trajectory planning, a decisional kernel, and trajectory-tracking control. The design combines several techniques: a genetic algorithm, a Petri net, and fuzzy logic control. The genetic algorithm determines feasible parking locations. The Petri net replaces the traditional decision flow chart and plans alternative parking routes, especially in the global space; the parking routine can be re-executed if the initially assigned route is blocked or the targeted parking space becomes occupied. The fuzzy logic controller drives the vehicle along the optimal parking route. The proposed scheme is tested in several scenarios to verify its applicability and demonstrate its distinguishing features.
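A fuzzy-logic tracking controller of the kind described can be sketched in miniature. This is a toy single-input rule base with made-up membership functions and output values, not the paper's controller, which would take several inputs (e.g. lateral offset and heading error) and use a tuned rule table.

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steer(heading_err_deg):
    """Map heading error (degrees) to a steering angle by weighting each
    rule's output with its firing strength (centroid defuzzification)."""
    rules = [                                          # (firing strength, output)
        (tri(heading_err_deg, -90, -45, 0), -30.0),    # error negative -> steer left
        (tri(heading_err_deg, -45, 0, 45), 0.0),       # error near zero -> straight
        (tri(heading_err_deg, 0, 45, 90), 30.0),       # error positive -> steer right
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

Because adjacent membership functions overlap, the output varies smoothly with the error, which is what lets a fuzzy controller track a planned parking trajectory without the chattering of a bang-bang rule.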
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号