Paid full text: 1672 articles
Free full text: 143 articles
Free (domestic): 15 articles
Electrical engineering: 44 articles
General/comprehensive: 8 articles
Chemical industry: 430 articles
Metalworking: 82 articles
Machinery and instruments: 62 articles
Building science: 87 articles
Mining engineering: 4 articles
Energy and power: 129 articles
Light industry: 132 articles
Water resources engineering: 26 articles
Petroleum and natural gas: 18 articles
Radio and electronics: 172 articles
General industrial technology: 254 articles
Metallurgy: 79 articles
Atomic energy technology: 16 articles
Automation technology: 287 articles
2024: 7 articles
2023: 36 articles
2022: 55 articles
2021: 99 articles
2020: 80 articles
2019: 105 articles
2018: 146 articles
2017: 131 articles
2016: 150 articles
2015: 88 articles
2014: 116 articles
2013: 185 articles
2012: 125 articles
2011: 96 articles
2010: 79 articles
2009: 64 articles
2008: 43 articles
2007: 38 articles
2006: 18 articles
2005: 9 articles
2004: 11 articles
2003: 13 articles
2002: 13 articles
2001: 12 articles
2000: 8 articles
1999: 4 articles
1998: 15 articles
1997: 3 articles
1996: 6 articles
1995: 12 articles
1994: 5 articles
1993: 5 articles
1992: 6 articles
1991: 5 articles
1990: 4 articles
1989: 6 articles
1988: 3 articles
1987: 2 articles
1986: 5 articles
1985: 3 articles
1984: 1 article
1983: 3 articles
1982: 5 articles
1979: 3 articles
1975: 1 article
1973: 2 articles
1972: 1 article
1970: 2 articles
1969: 1 article
A total of 1830 results were found (search time: 12 ms).
21.
This paper presents a method for reconstructing unreliable spectral components of speech signals using the statistical distributions of the clean components. Our goal is to model the temporal patterns in the speech signal and to exploit correlations between speech features in the time and frequency domains simultaneously. In this approach, a hidden Markov model (HMM) is first trained on clean speech data to model the temporal patterns that appear in the sequences of spectral components. Using this model, and according to the probability of a noisy spectral component occurring in each state, a probability distribution for each noisy component is estimated. Then, by applying maximum a posteriori (MAP) estimation to these distributions, the final estimates of the unreliable spectral components are obtained. The proposed method is compared to a common missing-feature method based on probabilistic clustering of the feature vectors, and to a state-of-the-art method based on sparse reconstruction. The experimental results show a significant improvement in recognition accuracy on a noise-polluted Persian corpus.
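As a rough illustration of the reconstruction step, the sketch below assumes per-state diagonal-Gaussian clean-speech statistics and a known reliability mask: the state posterior is computed from the reliable bins only, and each unreliable bin is replaced by a posterior-weighted combination of the clean state means. All names (reconstruct_frame, state_means, log_prior, and so on) are illustrative, and the point estimate here is a simplification of the paper's MAP step.

```python
import numpy as np

def log_gauss_diag(x, mean, var):
    """Log-density of a diagonal Gaussian (summed over dimensions)."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def reconstruct_frame(y, reliable, log_prior, state_means, state_vars):
    """Estimate the unreliable spectral bins of one frame.

    y           : observed (noisy) spectral vector, shape (D,)
    reliable    : boolean mask, True where a bin is treated as clean
    log_prior   : log state probabilities for this frame (e.g. from the HMM forward pass), shape (S,)
    state_means : per-state clean-speech means, shape (S, D)
    state_vars  : per-state clean-speech variances, shape (S, D)
    """
    S = state_means.shape[0]
    # State posterior given only the reliable components of this frame.
    log_post = np.array([
        log_prior[s] + log_gauss_diag(y[reliable],
                                      state_means[s, reliable],
                                      state_vars[s, reliable])
        for s in range(S)
    ])
    post = np.exp(log_post - log_post.max())
    post /= post.sum()

    x_hat = y.copy()
    # Replace unreliable bins with the posterior-weighted clean state means
    # (a simple point estimate standing in for the paper's MAP estimation).
    x_hat[~reliable] = post @ state_means[:, ~reliable]
    return x_hat
```

In the full method, the log_prior for each frame would come from the HMM forward-backward recursion over the whole utterance, which is how the temporal correlations enter.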
22.
With the increasing advancement of smart industries, cybersecurity has become a vital factor in the success of industrial transformation. The Industrial Internet of Things (IIoT), or Industry 4.0, has revolutionized the concepts of manufacturing and production altogether. In Industry 4.0, powerful Intrusion Detection Systems (IDS) play a significant role in ensuring network security. Although various intrusion detection techniques have been developed, protecting the intricate data of such networks remains challenging, because conventional Machine Learning (ML) approaches are inadequate for the demands of dynamic IIoT networks; Deep Learning (DL), in contrast, can be employed to identify anonymous intrusions. Therefore, the current study proposes a Hunger Games Search Optimization with Deep Learning-Driven Intrusion Detection (HGSODL-ID) model for the IIoT environment. The presented HGSODL-ID model exploits a linear normalization approach to transform the input data into a useful format. The HGSO algorithm is employed for Feature Selection (HGSO-FS) to reduce the curse of dimensionality. Moreover, Sparrow Search Optimization (SSO) is utilized with a Graph Convolutional Network (GCN) to classify and identify intrusions in the network, and the SSO technique is also exploited to fine-tune the hyper-parameters of the GCN model. The proposed HGSODL-ID model was experimentally validated on a benchmark dataset, and the results confirmed its superiority over recent approaches.
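The abstract does not give enough detail to reproduce HGSO-FS or the SSO-tuned GCN, so the sketch below only shows the generic shape of the pipeline under stated assumptions: min-max scaling as the linear normalization, a random-search wrapper in place of HGSO for selecting a binary feature mask, and a RandomForest as a placeholder classifier scored by cross-validated accuracy. None of these stand-ins are the paper's components.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestClassifier

def select_features(X, y, n_iter=50, seed=0):
    """Wrapper-style feature selection: random binary masks scored by
    cross-validated accuracy (a stand-in for the paper's HGSO-FS step)."""
    rng = np.random.default_rng(seed)
    X = MinMaxScaler().fit_transform(X)          # linear normalization of the inputs
    best_mask, best_score = np.ones(X.shape[1], dtype=bool), -np.inf
    for _ in range(n_iter):
        mask = rng.random(X.shape[1]) < 0.5      # candidate feature subset
        if not mask.any():
            continue
        clf = RandomForestClassifier(n_estimators=50, random_state=seed)
        score = cross_val_score(clf, X[:, mask], y, cv=3).mean()
        if score > best_score:
            best_mask, best_score = mask, score
    return best_mask, best_score
```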
23.
The Journal of Supercomputing - Internet of Things (IoT) is an emerging paradigm that consists of numerous connected and interrelated devices with embedded sensors, exchanging data with each other...
24.
Journal of Intelligent Manufacturing - In recent years, driven by the Industry 4.0 wave, academic research has focused on the science, engineering, and enabling technologies for intelligent and cyber...
25.
The Journal of Supercomputing - Fast execution of functions is an inevitable challenge in the serverless computing landscape. Inefficient dispatching, fluctuations in invocation rates, burstiness...
26.
In this paper, the effects of dilute charged impurity doping on the electronic heat capacity (EHC) and magnetic susceptibility (MS) of a two-dimensional ferromagnetic gapped graphene-like material, MoS2, are investigated within the Green's function approach, using the Kane-Mele Hamiltonian and the self-consistent Born approximation (SCBA) at the Dirac points. Our findings show that there is a critical impurity concentration (IC) and scattering strength (ISS) for each valley in the EHC and MS curves. We also find that the spin band gap decreases with impurity only for the \((K, \downarrow)\) and \((K^{\prime}, \downarrow)\) valleys, owing to the inversion symmetry between the valleys. On the other hand, a magnetic phase transition from ferromagnetic to antiferromagnetic and paramagnetic has been observed. The increase in the carrier scattering rate in the presence of impurities is the main reason for these behaviors.
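For context, a commonly used low-energy Kane-Mele-type two-band Hamiltonian near the \(K\)/\(K^{\prime}\) Dirac points of gapped graphene-like monolayers such as MoS2 takes the form below; the exchange term \(M\) is included here only to model the ferromagnetic case, and the abstract does not state which parameter values the authors actually use.

```latex
H_{\tau,s} = a t\,(\tau k_x \hat{\sigma}_x + k_y \hat{\sigma}_y)
           + \frac{\Delta}{2}\,\hat{\sigma}_z
           - \lambda\,\tau s\,\frac{\hat{\sigma}_z - 1}{2}
           - s\,M
```

Here \(\tau = \pm 1\) labels the \(K\) and \(K^{\prime}\) valleys, \(s = \pm 1\) the spin, \(a\) the lattice constant, \(t\) the hopping integral, \(\Delta\) the band gap, \(2\lambda\) the spin-orbit splitting of the valence band, and \(M\) the exchange field; impurity scattering then enters through an SCBA self-energy correction to the corresponding Green's function.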
27.
28.
Mixing sand or soil with small pieces of tire is common practice in civil engineering applications. Although the properties of the soil are changed, the practice is environmentally friendly and sometimes economical. Nevertheless, the mechanical behavior of such mixtures is still not fully understood, and more numerical investigations are required. This paper presents a novel approach for modeling sand-tire mixtures based on the discrete element method. The sand grains are represented by rigid agglomerates, whereas the tire grains are represented by deformable agglomerates, so the approach accounts for both grain shape and deformability. The micromechanical parameters of the contact law are calibrated against experimental results from the literature. The effects of tire content and confining pressure on the stress-strain response are investigated in detail by performing numerical triaxial compression tests. The main results indicate that both the strength and the stiffness of the samples decrease with increasing tire content. A tire content of 40% is identified as the boundary between rubber-like and sand-like behavior.
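As a rough illustration of the kind of contact law whose micromechanical parameters are calibrated in such DEM studies, the sketch below evaluates a linear-spring normal force with a Coulomb cap on the tangential force; the stiffness and friction values are placeholders, not the calibrated parameters from the paper.

```python
import numpy as np

def contact_force(overlap, tangential_disp, k_n=1e7, k_t=5e6, mu=0.5):
    """Generic linear elastic-frictional DEM contact law (illustrative values).

    overlap         : normal overlap between two grains [m], > 0 when in contact
    tangential_disp : accumulated relative tangential displacement [m]
    k_n, k_t        : normal / tangential spring stiffnesses [N/m] (placeholders)
    mu              : inter-particle friction coefficient (placeholder)
    """
    if overlap <= 0.0:
        return 0.0, 0.0                          # grains are not in contact
    f_n = k_n * overlap                          # repulsive normal force
    f_t = np.clip(k_t * tangential_disp,         # trial shear force...
                  -mu * f_n, mu * f_n)           # ...limited by Coulomb sliding
    return f_n, f_t
```

In practice, each contact type (sand-sand, sand-tire, tire-tire) would typically be assigned its own calibrated parameter set.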
29.
Classification of human electroencephalogram (EEG) signals can be achieved via artificial intelligence (AI) techniques. In particular, EEG signals associated with epileptic seizures can be detected to distinguish between epileptic and non-epileptic regions. From this perspective, an automated AI technique combined with a digital signal processing method can be used to improve the analysis of these signals. This paper proposes two classifiers, long short-term memory (LSTM) and support vector machine (SVM), for the classification of seizure and non-seizure EEG signals. The classifiers are applied to a public dataset from the University of Bonn, which consists of two classes: seizure and non-seizure. In addition, a fast Walsh-Hadamard transform (FWHT) is implemented to analyze the EEG signals in the recurrence space of the brain, yielding the Hadamard coefficients of the EEG signals; the FWHT also helps to separate seizure EEG recordings from non-seizure EEG recordings efficiently. A k-fold cross-validation technique is applied to validate the performance of the proposed classifiers. The LSTM classifier provides the best performance, with a testing accuracy of 99.00%; its training and testing loss values are 0.0029 and 0.0602, respectively, and its weighted average precision, recall, and F1-score are all 99.00%. The SVM classifier reaches an accuracy of 91%, a sensitivity of 93.52%, and a specificity of 91.3%. The computational time consumed for training the LSTM and the SVM is 2000 s and 2500 s, respectively. The results show that the LSTM classifier outperforms the SVM in the classification of EEG signals, and both proposed classifiers achieve high classification accuracy compared with previously published classifiers.
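A minimal sketch of the transform-plus-classifier step, assuming each EEG segment length is a power of two: the in-place butterfly below is the standard fast Walsh-Hadamard transform, and the Hadamard coefficients are fed to an SVM scored with k-fold cross-validation. The function names and SVM hyper-parameters are illustrative, not the paper's settings.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def fwht(signal):
    """Fast Walsh-Hadamard transform; len(signal) must be a power of two."""
    x = np.asarray(signal, dtype=float).copy()
    n, h = len(x), 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b   # butterfly: sum and difference
        h *= 2
    return x

def evaluate_svm(segments, labels, k=5):
    """Score an SVM on Hadamard-coefficient features with k-fold cross-validation.

    segments : array of shape (n_segments, segment_length), EEG windows
    labels   : 0 for non-seizure, 1 for seizure
    """
    features = np.array([fwht(s) for s in segments])
    clf = SVC(kernel="rbf", C=1.0)               # placeholder hyper-parameters
    return cross_val_score(clf, features, labels, cv=k).mean()
```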
30.
The elliptic curve cryptosystem (ECC) has recently received significant attention from researchers due to its high performance, low computational cost, and small key size. In this paper, an efficient ECC-based key management and derivation scheme is proposed to solve dynamic access problems in a user hierarchy. Compared with previous ECC-based works, the proposed method does not require constructing interpolation polynomials; therefore, the computational complexity of key generation and key derivation is significantly reduced. At the same time, the time complexity of adding or deleting security classes, modifying their relationships, and changing secret keys is decreased.
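The abstract does not describe the actual construction, so the sketch below is only a generic illustration of ECC-based key derivation in a hierarchy using the Python `cryptography` package: a 256-bit class key is derived from an ECDH shared secret and a class identifier via HKDF. The class names, identifiers, and info label are hypothetical, and this is not the paper's scheme.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def derive_class_key(own_private_key, subordinate_public_key, class_id: bytes) -> bytes:
    """Derive a 256-bit key for a subordinate security class from an ECDH
    shared secret (generic illustration, not the paper's construction)."""
    shared_secret = own_private_key.exchange(ec.ECDH(), subordinate_public_key)
    return HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"class-key:" + class_id,   # bind the derived key to the class
    ).derive(shared_secret)

# Hypothetical usage with freshly generated keys for a parent and a child class:
parent_key = ec.generate_private_key(ec.SECP256R1())
child_key = ec.generate_private_key(ec.SECP256R1())
key_for_child = derive_class_key(parent_key, child_key.public_key(), b"SC2")
```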