91.
The multiple determination of chemical properties is a classical problem in analytical chemistry. The core difficulty is finding the subset of variables that best represents the compounds. These variables are obtained from a spectrophotometer, which measures hundreds of correlated variables related to physicochemical properties that can be used to estimate the component of interest. The problem is therefore to select a subset of informative and uncorrelated variables that minimizes the prediction error. Classical algorithms select a separate subset of variables for each compound considered. In this work we propose the use of SPEA-II (Strength Pareto Evolutionary Algorithm II) and show that the variable selection algorithm can select a single subset that serves multiple determinations using multiple linear regression. The case study uses wheat data obtained by NIR (near-infrared) spectroscopy, where the objective is to determine a variable subset informative about protein content (%), test weight (kg/hl), WKT (wheat kernel texture) (%), and farinograph water absorption (%). Results of traditional multivariate calibration techniques such as the SPA (successive projections algorithm), PLS (partial least squares), and a mono-objective genetic algorithm are presented for comparison. For NIR spectral analysis of protein concentration in wheat, SPEA-II reduced the 775 spectral variables to just 10, and the prediction error decreased from 0.2 with the classical methods to 0.09 with the proposed approach. The model using variables selected by SPEA-II had better prediction performance than the classical algorithms and full-spectrum partial least squares.
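The multi-property selection step can be pictured as a multi-objective fitness evaluation: each candidate wavelength subset is scored by the test-set MLR error of every property at once, and SPEA-II keeps the non-dominated subsets. Below is a minimal sketch of such a fitness function; the array names, the least-squares solver, and the random data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def mlr_errors(X, Y, subset, train, test):
    """Score one candidate wavelength subset: fit one multiple linear
    regression per property on the training split and return the RMSE
    of each property on the test split (the multi-objective fitness)."""
    Xtr = np.c_[np.ones(len(train)), X[np.ix_(train, subset)]]
    Xte = np.c_[np.ones(len(test)),  X[np.ix_(test,  subset)]]
    errors = []
    for k in range(Y.shape[1]):                      # one objective per property
        beta, *_ = np.linalg.lstsq(Xtr, Y[train, k], rcond=None)
        pred = Xte @ beta
        errors.append(np.sqrt(np.mean((pred - Y[test, k]) ** 2)))
    return np.array(errors)                          # e.g. [protein, test weight, WKT, absorption]

# Illustrative data: 100 NIR spectra with 775 wavelengths and 4 properties.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 775))
Y = rng.normal(size=(100, 4))
subset = rng.choice(775, size=10, replace=False)     # a 10-variable candidate
print(mlr_errors(X, Y, subset, train=np.arange(70), test=np.arange(70, 100)))
```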
92.
This paper studies connectivity aspects that arise in image operators that process connected components. The focus is on morphological image analysis (i.e., on increasing image operators) and, in particular, on a robustness property satisfied by certain morphological filters, termed the strong property. The behavior of alternated compositions of openings and closings is investigated under certain assumptions, particularly connectedness and a connected-component-preserving condition. It is shown that these conditions cannot in general guarantee the strong property of certain connected alternated filters, because of issues related to the locality of the filters. As discussed in the paper, a series of misunderstandings concerning this topic has appeared in the literature, and it is important to clarify them. The root cause of those problems is examined and a solution is indicated. The class of connected openings and closings used to build connected alternated filters should therefore be defined so as to avoid such situations, since the strong property of alternated filters should be a distinctive characteristic of this class.
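An alternated filter is simply the composition of an opening and a closing (or the reverse) with the same structuring element. The sketch below composes the two with scipy.ndimage as an illustration of the kind of operator the paper analyzes; the connected, component-preserving variants studied in the paper are more specialized, so this plain grey-scale version is only an assumed illustration of the composition itself.

```python
import numpy as np
from scipy import ndimage

def alternated_filter(image, size=3):
    """Opening followed by closing with the same structuring element:
    the basic alternated-filter composition discussed in the paper
    (plain grey-scale morphology here, not its connected variant)."""
    opened = ndimage.grey_opening(image, size=(size, size))
    return ndimage.grey_closing(opened, size=(size, size))

# Tiny illustration: bright and dark specks on a flat patch.
rng = np.random.default_rng(1)
img = np.full((64, 64), 100.0)
img[rng.integers(0, 64, 50), rng.integers(0, 64, 50)] = 255.0  # bright specks
img[rng.integers(0, 64, 50), rng.integers(0, 64, 50)] = 0.0    # dark specks
filtered = alternated_filter(img)
print(img.std(), filtered.std())  # the alternated filter suppresses both kinds of specks
```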
93.
The Amazon rainforest is one of the world's greatest natural wonders and is of great importance to the world's environmental balance. Around 60% of the Amazon rainforest lies in Brazilian territory. The two biggest states of the Amazon region are Amazonas (the upper Amazon) and Pará (the lower Amazon), which together account for around 73% of the Brazilian Legal Amazon and are the only states in Brazil's north region served by international airports. The purpose of this paper is to model and forecast sustainable international tourism demand for the states of Amazonas and Pará, and for the aggregate of the two states. Sustainable tourism here means a distinctive type of tourism with relatively low environmental and cultural impacts. Economic progress brought about by illegal wood extraction and commercial agriculture has destroyed large areas of the Amazon rainforest; the sustainable tourism industry has the potential to contribute to the region's economic development without destroying the rainforest. The paper presents unit root tests for monthly and annual data, estimates alternative time series models and conditional volatility models of the shocks to international tourist arrivals, and provides forecasts for 2006 and 2007.
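A typical workflow for this kind of study is a unit root test on the arrivals series, an ARMA-type mean model, and a GARCH model for the conditional volatility of the shocks. The sketch below strings those steps together with statsmodels and the arch package on synthetic monthly data; the series, model orders, and settings are illustrative assumptions rather than the specification actually estimated in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

# Illustrative monthly "tourist arrivals" series (log scale), not real data.
rng = np.random.default_rng(42)
idx = pd.date_range("1990-01", periods=192, freq="MS")
y = pd.Series(10 + 0.002 * np.arange(192) + rng.normal(0, 0.05, 192), index=idx)

# 1. Unit root (ADF) test on the level of the series.
adf_stat, p_value, *_ = adfuller(y)
print(f"ADF statistic {adf_stat:.2f}, p-value {p_value:.3f}")

# 2. ARMA-type mean model for (log) arrivals.
mean_fit = ARIMA(y, order=(1, 0, 1)).fit()
resid = mean_fit.resid

# 3. GARCH(1, 1) conditional volatility model for the shocks.
vol_fit = arch_model(resid, vol="GARCH", p=1, q=1, mean="Zero").fit(disp="off")
print(vol_fit.summary())

# 4. Out-of-sample forecasts of the mean (e.g., 24 months ahead).
print(mean_fit.forecast(steps=24).head())
```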
94.

The permanent availability and relative obscurity of blockchains make them fertile ground for malicious use. However, the use of blockchains by malware has not yet been characterized. This paper analyses the current state of the art in this area. One of the lessons learned is that covert communications for malware have received little attention. To foster further defence-oriented research, a novel mechanism (dubbed Smart-Zephyrus) is built leveraging smart contracts written in Solidity. Our results show that it is possible to hide 4 Kb of secret data in 41 s. Although expensive (around USD 1.82 per bit), the stealth provided might be worth the price for attackers.

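The storage side of such a covert channel comes down to packing the secret into the 32-byte words that EVM contract storage (and Solidity `bytes32` variables) operate on. The fragment below shows only that packing and unpacking step in plain Python; the chunk size and padding convention are assumptions for illustration, not the encoding used by Smart-Zephyrus.

```python
# Minimal sketch: split a secret into 32-byte words suitable for EVM
# storage slots (e.g. Solidity bytes32 values) and reassemble it.
# The length-prefix/padding convention here is an assumption.

WORD = 32  # EVM storage slots and Solidity bytes32 hold 32 bytes

def to_words(secret: bytes) -> list[bytes]:
    """Length-prefix the secret, then cut it into zero-padded 32-byte words."""
    framed = len(secret).to_bytes(4, "big") + secret
    padded = framed + b"\x00" * (-len(framed) % WORD)
    return [padded[i:i + WORD] for i in range(0, len(padded), WORD)]

def from_words(words: list[bytes]) -> bytes:
    """Invert to_words: read the length prefix and strip the padding."""
    blob = b"".join(words)
    length = int.from_bytes(blob[:4], "big")
    return blob[4:4 + length]

secret = b"exfiltrated key material"
words = to_words(secret)
print(len(words), "storage words:", [w.hex() for w in words])
assert from_words(words) == secret
```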
95.
The development of biocompatible nanomaterials for smart drug delivery and bioimaging has attracted great interest in biomedical fields in recent years. Here, the interaction between the recently reported nitrogenated graphene (C2N) and a prototypical protein (the villin headpiece HP35) is studied using atomistic molecular dynamics simulations. The simulations reveal that HP35 can form a stable binding with the C2N monolayer. Although the C2N–HP35 attractive interactions are consistently preserved, the binding strength between C2N and the protein is mild and does not significantly distort the protein's structural integrity. This intrinsically biofriendly property of native C2N is distinct from that of several widely studied nanomaterials, such as graphene, carbon nanotubes, and MoS2, which can induce severe protein denaturation. Interestingly, once the protein is adsorbed onto the C2N surface, its transverse migration is highly restricted at the binding sites. This restriction is orchestrated by C2N's periodic porous structure with negatively charged "holes," where basic residues such as lysine can form stable interactions, thus functioning as "anchor points" that confine the protein's displacement. It is suggested that the mild, immobilized protein attraction and biofriendly nature of C2N make it a prospective candidate for bio- and medical-related applications.
96.
Electroencephalography (EEG) is widely used in a variety of research and clinical applications, including the localization of active brain sources. Brain source localization provides useful information for understanding the brain's behavior and for cognitive analysis. Various source localization algorithms have been developed to determine the exact locations of the brain sources that generate the measured electromagnetic activity. These algorithms are based on digital filtering, 3D imaging, array signal processing, and Bayesian approaches, and are categorized as low-resolution or high-resolution methods according to the spatial resolution they provide. In this study, EEG data were collected from healthy subjects presented with a visual stimulus. The finite difference method (FDM) was used for head modelling to solve the forward problem. Low-resolution brain electromagnetic tomography (LORETA) and standardized LORETA (sLORETA) were used as inverse modelling methods to localize the active brain regions during the stimulus. The results are presented as MRI images, and tables report the estimated current intensity levels for the inverse methods used; a higher current value or intensity level indicates stronger electromagnetic activity for a particular source at a given time instant. The results demonstrate that the standardized method based on the second-order Laplacian (sLORETA), combined with FDM head modelling, outperforms the other methods in source estimation, as it yields a higher current level, and hence current density (J), for a given area.
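The LORETA family builds on the regularized minimum-norm inverse solution J = L^T (L L^T + lambda*I)^{-1} V for a lead-field matrix L and scalp potentials V; LORETA adds a discrete spatial Laplacian weighting and sLORETA standardizes the estimate by its variance. The sketch below shows only the generic minimum-norm step on random matrices; the dimensions and the regularization value are illustrative assumptions, not the study's settings.

```python
import numpy as np

def minimum_norm_inverse(L, V, lam=1e-2):
    """Regularized minimum-norm source estimate J = L^T (L L^T + lam*I)^-1 V.
    This is the generic step that LORETA/sLORETA refine with spatial
    weighting and standardization, respectively."""
    n_sensors = L.shape[0]
    gram = L @ L.T + lam * np.eye(n_sensors)
    return L.T @ np.linalg.solve(gram, V)

# Illustrative sizes: 64 electrodes, 5000 candidate sources, one time sample.
rng = np.random.default_rng(7)
L = rng.normal(size=(64, 5000))  # lead field from the forward (head) model, e.g. FDM
V = rng.normal(size=64)          # measured scalp potentials at one instant
J = minimum_norm_inverse(L, V)
print("strongest source index:", int(np.argmax(np.abs(J))))
```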
97.
98.
99.
The paper discusses the theoretical and empirical evidence on the subject and concludes that freight mode choice is best understood as the outcome of interactions between shippers and carriers, and that mode choice depends to a large extent on the shipment size that results from those interactions. These conclusions are supported by economic experiments designed to test the hypothesis of cooperative behavior. Two sets of experiments were conducted (one in which the shipper plays the lead role in selecting the shipment size, and another in which the shipment-size decision is left to the carriers), and their results were compared with those obtained numerically under the assumption of perfect cooperation. The comparison indicated that the experiments converged to the perfect-cooperation case, in line with the game-theoretic conclusion that under typical market conditions the shipper and carrier would cooperate. These results also imply that it does not really matter who "makes" the decision about the shipment size and mode used in a given time period, since over time the shipper, that is, the customer, ends up selecting the bids most consistent with its own interest. In other words, the results do not support the assumption that freight mode choice is made solely by the carriers.
100.
Applications of wireless communication networks are emerging continuously. To offer a good level of security in these applications, new wireless communication standards propose solutions based on cryptographic algorithms used in specialized modes of operation. This work presents a custom hardware architecture for the AES-CCM protocol (AES-CCMP), which is the basis of the security architecture of the IEEE 802.11i standard. AES-CCMP is built on the AES-CCM algorithm, which runs the Advanced Encryption Standard (AES) in CTR mode with CBC-MAC (CCM mode), plus specialized data-formatting modules, providing several security services through iterative and complex operations. Results of implementing the proposed architecture on FPGA devices are presented and discussed. A comparison against similar works shows significant improvements in both throughput and efficiency.
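CCM mode combines CTR-mode encryption with a CBC-MAC authentication tag over the nonce, associated data, and payload, which is the service AES-CCMP provides per 802.11i frame. As a software reference point (not the paper's hardware architecture), the snippet below exercises AES-CCM through Python's cryptography package; the key size, nonce length, and header bytes are illustrative assumptions.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESCCM

# Software reference for AES-CCM (CTR encryption + CBC-MAC tag), the mode
# at the core of AES-CCMP in IEEE 802.11i. Parameters below are illustrative.
key = AESCCM.generate_key(bit_length=128)    # 128-bit AES key, as in CCMP
aesccm = AESCCM(key, tag_length=8)           # CCMP uses an 8-byte MIC
nonce = os.urandom(13)                       # CCMP builds a 13-byte nonce per frame
header = b"\x88\x41"                         # stand-in for authenticated frame header bytes
payload = b"plaintext MPDU body"

ciphertext = aesccm.encrypt(nonce, payload, header)    # payload encrypted, header only authenticated
recovered = aesccm.decrypt(nonce, ciphertext, header)  # raises InvalidTag if anything was altered
assert recovered == payload
print(len(ciphertext) - len(payload), "bytes of authentication tag appended")
```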