181.
Detection of adulteration in meat products is one of the most pressing concerns for the consumer market because of their high value. A rapid and accurate method for identifying lard adulteration in meat products is therefore needed, one that consumers can trust and that can deliver a definitive diagnosis. Fourier transform infrared (FTIR) spectroscopy is used in this work to identify lard adulteration in beef, lamb, and chicken samples. A simplified extraction method was applied to obtain the lipids from pure and adulterated meat. Adulterated samples were prepared by mixing lard with chicken, lamb, and beef at different concentrations (10%–50% v/v). Principal component analysis (PCA) and partial least squares (PLS) regression were used to develop a calibration model over 800–3500 cm−1. Three-dimensional PCA, applied after dividing the spectrum into three regions, successfully classified lard adulteration in chicken, lamb, and beef samples. The FTIR peaks corresponding to lard were observed at 1159.6, 1743.4, 2853.1, and 2922.5 cm−1, and these differentiate the chicken, lamb, and beef samples. These wavenumbers give the highest coefficient of determination (R2 = 0.846) and the lowest root mean square errors of calibration (RMSEC) and prediction (RMSEP), corresponding to an accuracy of 84.6%. Even adulteration at levels as low as 10% fat can be reliably detected with this methodology.
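A minimal sketch of the kind of PCA/PLS calibration described above, using scikit-learn; this is not the authors' pipeline, and the `spectra` and `lard_pct` arrays as well as the component counts are placeholder assumptions:

```python
# Hypothetical sketch: PCA view and PLS calibration for FTIR spectra of
# pure vs. lard-adulterated fat extracts (placeholder data, not the paper's).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
spectra = rng.normal(size=(60, 1400))      # placeholder spectra, 800-3500 cm-1 region
lard_pct = rng.uniform(0, 50, size=60)     # placeholder 0-50 % v/v adulteration levels

# Three PCA scores give the "3-D PCA" view used to separate species/adulteration.
scores = PCA(n_components=3).fit_transform(spectra)

# PLS calibration: predict lard concentration from the spectrum.
X_tr, X_te, y_tr, y_te = train_test_split(spectra, lard_pct, random_state=0)
pls = PLSRegression(n_components=8).fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()
print("R2 =", r2_score(y_te, y_hat),
      "RMSEP =", np.sqrt(mean_squared_error(y_te, y_hat)))
```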
182.
Martensitic phase transformations in solution-treated and water-quenched binary Ti-Nb alloys containing 16–26 at% Nb were examined. An ordered, base-centred orthorhombic martensite was observed for alloys containing up to 23.4 at% Nb. The substructure of this martensite generally consisted of twins and stacking faults, and the presence of antiphase boundaries within the plates indicated that the martensite underwent ordering during quenching. Both the order-disorder and Ms temperatures were affected by the total interstitial content, with higher contents raising both temperatures. Increasing the niobium content above 23.4% resulted in retention of the β phase, which contained either athermal or diffuse ω depending upon the niobium and total interstitial concentrations. Finally, the microhardness of the Ti-Nb alloys examined was observed to decrease with increasing niobium content and decreasing total interstitial content.
183.
Recently, many applications have adopted peer-to-peer (P2P) systems to overcome problems with client/server systems such as poor scalability, high bandwidth requirements, and a single point of failure. In this paper, we propose an efficient scheme for range query processing over structured P2P systems that balances both the storage load and the access load. The paper proposes a rotating-token scheme that balances the storage load by placing joining nodes at appropriate locations in the identifier space so that they share the load of already overloaded nodes. Then, to support range queries, we use an order-preserving mapping function that maps keys to nodes in an order-preserving way, without hashing. This may result in an access load imbalance due to the non-uniform distribution of keys in the identifier space. We therefore propose an adaptive replication scheme that relieves overloaded nodes by shedding some of their load onto other nodes to balance the access load. We derive a formula for estimating the overhead of the proposed adaptive replication scheme. In this study, we carry out simulation experiments with synthetic data to measure the performance of the proposed schemes. Our simulation experiments show significant gains in both storage load balancing and access load balancing.
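A minimal sketch, under assumed parameters, of the order-preserving key-to-identifier mapping idea; the rotating-token placement and adaptive replication are not modelled, and `M`, the key domain, and the responsibility rule are illustrative assumptions:

```python
# Hypothetical sketch: map a bounded key domain onto an m-bit ring identifier
# space without hashing, so a range query [lo, hi] touches a contiguous run
# of nodes in identifier order.
import bisect

M = 32                       # assumed identifier space size 2^M
KEY_MIN, KEY_MAX = 0, 10**6  # assumed key domain

def key_to_id(key: int) -> int:
    """Map a key to an identifier, preserving key order."""
    frac = (key - KEY_MIN) / (KEY_MAX - KEY_MIN)
    return int(frac * ((1 << M) - 1))

def range_query_nodes(node_ids, lo, hi):
    """Sorted node identifiers responsible for keys in [lo, hi] (no wrap-around)."""
    ids = sorted(node_ids)
    start = bisect.bisect_left(ids, key_to_id(lo))
    stop = bisect.bisect_right(ids, key_to_id(hi))
    # include the successor node that owns the tail of the range
    return ids[start:min(stop + 1, len(ids))]

nodes = [key_to_id(k) for k in (0, 150_000, 400_000, 800_000)]
print(range_query_nodes(nodes, 120_000, 450_000))
```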
184.
The unguided visual exploration of volumetric data can be both a challenging and a time-consuming undertaking. Identifying a set of favorable vantage points at which to start exploratory expeditions can greatly reduce this effort and also ensure that no important structures are missed. Recent research efforts have focused on entropy-based viewpoint selection criteria that depend on scalar values describing the structures of interest. In contrast, we propose a viewpoint suggestion pipeline that is based on feature clustering in high-dimensional space. We use gradient/normal variation as a metric to identify interesting local events and then cluster these via k-means to detect important salient composite features. Next, we compute the maximum possible exposure of these composite features for different viewpoints and calculate a 2D entropy map, parameterized in longitude and latitude, that points out promising view orientations. Superimposed onto an interactive track-ball interface, the entropy map lets users navigate directly to potentially interesting viewpoints, where visibility-based transfer functions can be employed to generate volume renderings that minimize occlusions. To give full exploration freedom to the user, the entropy map is updated on the fly whenever a view has been selected, pointing to new and promising but so far unseen view directions. Alternatively, our system can use a set-cover optimization algorithm to provide a minimal set of views needed to observe all features. The views so generated can then be saved into a list for further inspection or into a gallery for a summary presentation.
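A minimal sketch of how a longitude/latitude viewpoint-entropy map could be assembled from per-feature visibility values; the `visible_area` array and the grid resolution are placeholder assumptions, not the paper's implementation:

```python
# Hypothetical sketch: build a 2-D (longitude x latitude) entropy map from
# per-feature visibility. visible_area[f, i, j] is the assumed exposed area
# of composite feature f from the view direction at grid cell (i, j).
import numpy as np

rng = np.random.default_rng(1)
n_features, n_lon, n_lat = 5, 72, 36
visible_area = rng.random((n_features, n_lon, n_lat))   # placeholder visibilities

# Normalise visibilities per view into a distribution over features, then
# take the Shannon entropy: high entropy = many features well exposed.
p = visible_area / visible_area.sum(axis=0, keepdims=True)
entropy_map = -(p * np.log2(p + 1e-12)).sum(axis=0)

best = np.unravel_index(np.argmax(entropy_map), entropy_map.shape)
print("suggested (lon_idx, lat_idx):", best)
```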
185.
This paper presents a new algorithm for de-noising global positioning system (GPS) and inertial navigation system (INS) data and for estimating the INS error, using a wavelet multi-resolution analysis (WMRA)-based genetic algorithm (GA) with a well-designed structure that is appropriate for practical, real-time implementations because of its very short training time and high accuracy. Different techniques have been implemented to de-noise and estimate the INS and GPS errors. Wavelet de-noising is one of th...
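A minimal sketch of wavelet multi-resolution de-noising with PyWavelets; the paper's GA-tuned WMRA is replaced here by a fixed universal soft threshold, and the signal is synthetic, for illustration only:

```python
# Hypothetical sketch: soft-threshold wavelet de-noising of a noisy
# INS-like signal (placeholder data, illustrative threshold).
import numpy as np
import pywt

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 2048)
signal = np.sin(0.5 * t) + 0.1 * t                     # placeholder drift-like signal
noisy = signal + 0.2 * rng.standard_normal(t.size)

coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from finest level
thr = sigma * np.sqrt(2 * np.log(noisy.size))           # universal threshold (assumed)
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: noisy.size]

print("residual RMS:", np.sqrt(np.mean((denoised - signal) ** 2)))
```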
186.
Wideband mesh- or star-oriented networks have recently become a subject of greater interest. Providing wideband multimedia access for a variety of applications has led to the inception of mesh networks. Classic access techniques such as FDMA and TDMA have been the norm for such networks. The maximum transmitter power for CDMA is much lower than for its TDMA and FDMA counterparts, which is an important asset for mobile operation. In this paper we introduce a code division multiple access/time division duplex (CDMA/TDD) technique for such networks. The CDMA approach is an almost plug-and-play technology for wireless access, making it amenable to implementation by the mesh network service station (SS). Further, it inherently allows mesh network service stations to use a combination of turbo coding and dynamic parallel orthogonal transmission to improve network efficiency. We briefly outline the new transmitter and receiver structures and then evaluate the efficiency, delay, and delay jitter. By analysis we show the advantages over classic counterparts with respect to the total achievable network efficiency, especially for a larger number of hops.
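A minimal sketch of direct-sequence spreading with orthogonal Walsh codes, the basic mechanism underlying such a CDMA scheme; the code length, symbols, and station labels are illustrative assumptions rather than the paper's transmitter/receiver design:

```python
# Hypothetical sketch: two service stations spread BPSK symbols with different
# rows of a Hadamard (Walsh) matrix; the receiver despreads by correlation.
import numpy as np

def hadamard(n: int) -> np.ndarray:
    """Sylvester construction of an n x n Hadamard matrix, n a power of 2."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

H = hadamard(8)
code_a, code_b = H[1], H[2]                  # orthogonal spreading codes
bits_a = np.array([1, -1, 1])                # BPSK symbols for station A (assumed)
bits_b = np.array([-1, -1, 1])               # BPSK symbols for station B (assumed)

# Spread each symbol over 8 chips, sum on the channel, despread per symbol.
channel = np.kron(bits_a, code_a) + np.kron(bits_b, code_b)
chips = channel.reshape(-1, 8)
recovered_a = np.sign(chips @ code_a)
recovered_b = np.sign(chips @ code_b)
print(recovered_a, recovered_b)
```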
187.
188.
In this paper, we develop an interactive analysis and visualization tool for probabilistic segmentation results in medical imaging. We provide a systematic approach to analyzing, interacting with, and highlighting regions of segmentation uncertainty. We introduce a set of visual analysis widgets that integrate different approaches to analyzing multivariate probabilistic field data with direct volume rendering. We demonstrate the user's ability to identify suspicious regions (e.g. tumors) and to correct misclassification results using a novel uncertainty-based segmentation editing technique. We evaluate our system and demonstrate its usefulness in the context of static and time-varying medical imaging datasets.
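A minimal sketch of one plausible ingredient of such a tool: deriving a per-voxel uncertainty volume from probabilistic segmentation output; the `probs` array and the threshold are placeholder assumptions:

```python
# Hypothetical sketch: per-voxel entropy from class-membership probabilities,
# usable to highlight uncertain regions or drive an uncertainty-based editor.
import numpy as np

rng = np.random.default_rng(3)
n_classes, shape = 4, (64, 64, 32)
raw = rng.random((n_classes,) + shape)                 # placeholder class scores
probs = raw / raw.sum(axis=0, keepdims=True)           # per-voxel class probabilities

entropy = -(probs * np.log(probs + 1e-12)).sum(axis=0)
entropy /= np.log(n_classes)                           # normalise to [0, 1]

# Voxels above a chosen threshold become candidates for interactive inspection.
suspicious = entropy > 0.8
print("uncertain voxels:", int(suspicious.sum()))
```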
189.
This survey aims to provide multimedia researchers with a state-of-the-art overview of fusion strategies, which are used to combine multiple modalities in order to accomplish various multimedia analysis tasks. The existing literature on multimodal fusion research is presented through several classifications based on the fusion methodology and the level of fusion (feature, decision, and hybrid). The fusion methods are described from the perspective of the basic concept, advantages, weaknesses, and their usage in various analysis tasks as reported in the literature. Moreover, several distinctive issues that influence a multimodal fusion process, such as the use of correlation and independence, confidence level, contextual information, synchronization between different modalities, and optimal modality selection, are also highlighted. Finally, we present the open issues for further research in the area of multimodal fusion.
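A minimal sketch contrasting feature-level and decision-level fusion on synthetic data; the modalities, classifier, and confidence weights are illustrative assumptions, not any specific method from the surveyed literature:

```python
# Hypothetical sketch: early (feature-level) vs. late (decision-level) fusion.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
audio = rng.normal(size=(200, 12))           # placeholder audio features
video = rng.normal(size=(200, 20))           # placeholder visual features
y = rng.integers(0, 2, size=200)

# Feature-level fusion: concatenate modality features before one classifier.
early = LogisticRegression(max_iter=1000).fit(np.hstack([audio, video]), y)

# Decision-level fusion: combine per-modality scores with confidence weights.
clf_a = LogisticRegression(max_iter=1000).fit(audio, y)
clf_v = LogisticRegression(max_iter=1000).fit(video, y)
w_a, w_v = 0.4, 0.6                          # assumed modality confidence weights
fused_scores = w_a * clf_a.predict_proba(audio)[:, 1] + \
               w_v * clf_v.predict_proba(video)[:, 1]
fused_label = (fused_scores > 0.5).astype(int)
print("late-fusion agreement with labels:", (fused_label == y).mean())
```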
190.
Context: How can the quality of software systems be predicted before deployment? In attempting to answer this question, prediction models are advocated in several studies. The performance of such models drops dramatically, with very low accuracy, when they are used in new software development environments or in new circumstances. Objective: The main objective of this work is to circumvent the model generalizability problem. We propose a new approach that replaces the traditional way of building prediction models, which uses historical data and machine learning techniques. Method: In this paper, the existing models are decision trees built to predict module fault-proneness within the NASA Critical Mission Software. A genetic algorithm is developed to combine and adapt expertise extracted from existing models in order to derive a "composite" model that performs accurately in a given context of software development. Experimental evaluation of the approach is carried out in three different software development circumstances. Results: The results show that the derived prediction models work more accurately, not only for a particular state of a software organization but also for evolving and modified ones. Conclusion: Our approach suits the nature of software data and at the same time is superior to model selection and data combination approaches. We conclude that learning from existing software models (i.e., software expertise) has two immediate advantages: it circumvents the model generalizability problem and alleviates the lack of data in software engineering.
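A minimal sketch of the general idea, on synthetic data and with an illustrative fitness function: a small genetic algorithm searches for weights that combine the votes of pre-existing decision trees so the ensemble fits a new context; this is not the paper's algorithm:

```python
# Hypothetical sketch: evolve combination weights over existing decision-tree
# models so their weighted vote fits data from a new development context.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(5)
X_old, y_old = make_classification(n_samples=400, n_features=10, random_state=0)
X_new, y_new = make_classification(n_samples=200, n_features=10, random_state=1)

# "Existing" models trained in an earlier context.
trees = [DecisionTreeClassifier(max_depth=d, random_state=d).fit(X_old, y_old)
         for d in (2, 4, 6, 8)]
votes = np.array([t.predict(X_new) for t in trees])      # (n_models, n_samples)

def fitness(w):
    """Accuracy of the weighted majority vote on the new context (illustrative)."""
    pred = (w @ votes / w.sum() > 0.5).astype(int)
    return (pred == y_new).mean()

# Simple GA: keep the fitter half, cross over by averaging, mutate slightly.
pop = rng.random((30, len(trees)))
for _ in range(50):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-15:]]
    children = (parents[rng.integers(0, 15, 15)] + parents[rng.integers(0, 15, 15)]) / 2
    children += 0.05 * rng.standard_normal(children.shape)
    pop = np.vstack([parents, np.clip(children, 0, 1)])

best = pop[np.argmax([fitness(w) for w in pop])]
print("best weights:", np.round(best, 2), "accuracy:", fitness(best))
```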