41.
Cell formation is a traditional problem in cellular manufacturing systems that concerns the allocation of parts, operators, and machines to cells. This paper presents a new mathematical programming model for cell formation that simultaneously incorporates operators' personality and decision-making styles, their skill in working with machines, and job security. The model involves five objectives: (1) minimising the cost of adding machines to and removing machines from the cells at the beginning of each period, (2) minimising the total cost of material handling, (3) maximising job security, (4) minimising the inconsistency of operators' decision styles within cells, and (5) minimising the cost of providing suitable skills. Owing to the NP-hard nature of the proposed model, NSGA-II, a powerful meta-heuristic, is used to solve large-sized problems, and response surface methodology (RSM) is used to tune its parameters. Finally, MOPSO and two scalarization methods are employed to validate the results obtained. To the best of our knowledge, this is the first study to present a multi-objective mathematical model for the cell formation problem that considers operators' personality and skill, the addition and removal of machines, and job security.
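The multi-objective ranking at the heart of NSGA-II rests on Pareto dominance. As a minimal illustrative sketch (the objective values below are hypothetical, not taken from the paper's model), the non-dominated front of a small population can be computed as:

```python
def nondominated_front(points):
    """Return indices of the Pareto-optimal (non-dominated) points,
    assuming all objectives are minimised -- the core dominance test
    NSGA-II applies when ranking a population."""
    def dominates(a, b):
        # a dominates b: no worse in every objective, strictly better in one
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [i for i, p in enumerate(points)
            if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

# Hypothetical two-objective costs (e.g. material handling, machine reconfiguration)
pop = [(3, 5), (4, 4), (5, 3), (4, 6), (6, 6)]
print(nondominated_front(pop))  # [0, 1, 2]
```

The last two points are dominated by members of the front and would receive a worse rank.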
42.
Electroencephalography (EEG) is widely used in a variety of research and clinical applications, including the localization of active brain sources. Brain source localization provides useful information for understanding the brain's behavior and for cognitive analysis. Various source localization algorithms have been developed to determine the exact locations of the active brain sources that generate electromagnetic activity in the brain. These algorithms are based on digital filtering, 3D imaging, array signal processing, and Bayesian approaches, and are categorized by spatial resolution as either low-resolution or high-resolution methods. In this study, EEG data were collected by presenting a visual stimulus to healthy subjects. The finite difference method (FDM) was used for head modelling to solve the forward problem, while low-resolution brain electromagnetic tomography (LORETA) and standardized LORETA (sLORETA) were used as inverse methods to localize the regions of the brain active during the stimulus. The results are produced as MRI images, with tables reporting the estimated current intensity levels for each inverse method; a higher current value or intensity level indicates stronger electromagnetic activity for a particular source at a given time instant. The results demonstrate that the standardized method based on the second-order Laplacian (sLORETA), in conjunction with FDM head modelling, outperforms the other methods in source estimation, as it yields a higher current level, and thus current density (J), for a given area.
43.
Process capability indices such as Cp are used extensively in manufacturing industries to assess processes and inform purchasing decisions. In practice, the parameter needed to calculate Cp is rarely known and is frequently replaced with an estimate from an in-control reference sample. This article explores the optimal sample size required to achieve a desired estimation error, measured by the absolute percentage error of different Cp estimates. Moreover, some practical tools are provided to help practitioners determine the sample size in different situations.
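As a hedged illustration of the estimation problem behind the sample-size question (the specification limits and measurements below are invented for the example, not taken from the article), the point estimate of Cp replaces the unknown process standard deviation with the sample standard deviation:

```python
import statistics

def cp_estimate(sample, lsl, usl):
    """Point estimate of the process capability index
    Cp = (USL - LSL) / (6 * sigma), with sigma replaced by the
    sample standard deviation of an in-control reference sample."""
    s = statistics.stdev(sample)
    return (usl - lsl) / (6 * s)

# Hypothetical in-control reference sample centred at 10.0
data = [10.1, 9.9, 10.0, 10.2, 9.8, 10.05, 9.95, 10.1, 9.9, 10.0]
print(round(cp_estimate(data, lsl=9.4, usl=10.6), 3))  # 1.697
```

Because the sample standard deviation fluctuates with n, so does this estimate, which is exactly why the required sample size depends on the acceptable percentage error.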
44.
Bone autografts are often used for the reconstruction of bone defects; however, owing to the limitations of autografts, researchers have searched for bone substitutes. Dentin is of particular interest for this purpose because of its high similarity to bone. This in vitro study assessed the surface characteristics and biological properties of dentin samples prepared with different treatments: regular (RD), demineralized (DemD), and deproteinized (DepD) dentin. X-ray diffraction and Fourier transform infrared spectroscopy were used for surface characterization. Samples were immersed in simulated body fluid, and their bioactivity was evaluated under a scanning electron microscope. The methyl thiazol tetrazolium assay, scanning electron microscopy, and quantitative real-time polymerase chain reaction were performed to assess, respectively, the viability/proliferation, adhesion/morphology, and osteoblast differentiation of human dental pulp stem cells cultured on dentin powders. Of the three dentin samples, DepD showed the highest, and RD the lowest, rate of formation and deposition of hydroxyapatite crystals. Although the difference in superficial apatite among the samples was not significant, the functional groups on the surface were more distinct on DepD. At four weeks, hydroxyapatite deposits appeared as needle-shaped accumulations on the DemD sample, while numerous hexagonal hydroxyapatite masses covered the surface of DepD. The methyl thiazol tetrazolium, scanning electron microscopy, and quantitative real-time polymerase chain reaction analyses over the 10-day cell culture on dentin powders showed the highest cell adhesion and viability, and the most rapid differentiation, in DepD.
Based on the parameters evaluated in this in vitro study, DepD showed a high rate of formation/deposition of hydroxyapatite crystals and of adhesion/viability/osteogenic differentiation of human dental pulp stem cells, which may support its osteoinductive/osteoconductive potential for bone regeneration.
45.
With the rapid growth in COVID-19 cases, the healthcare systems of several developed countries have reached the point of collapse. An important and critical step in fighting COVID-19 is the effective screening of infected patients, so that positive patients can be treated and isolated. A diagnosis scheme based on chest radiology images may have several benefits over the traditional approach. The success of artificial intelligence (AI) techniques in automated diagnosis in the healthcare sector, together with the rapid increase in COVID-19 cases, has created a demand for AI-based automated diagnosis and recognition systems. This study develops an Intelligent Firefly Algorithm Deep Transfer Learning Based COVID-19 Monitoring System (IFFA-DTLMS). The proposed IFFA-DTLMS model aims to identify and categorize occurrences of COVID-19 on chest radiographs. To this end, it first applies a densely connected network (DenseNet121) to generate a collection of feature vectors. The firefly algorithm (FFA) is then applied for hyperparameter optimization of the DenseNet121 model, and an autoencoder-long short-term memory (AE-LSTM) model is exploited for the classification and identification of COVID-19. To ensure the enhanced performance of the IFFA-DTLMS model, wide-ranging experiments were performed and the results reviewed from distinct aspects. The experimental results demonstrate the improvement of the IFFA-DTLMS model over recent approaches.
46.
One of the most pressing concerns in the consumer market is the detection of adulteration in meat products, given their high value. A rapid and accurate method for identifying lard adulteration in meat products is therefore highly necessary, both to earn consumers' trust and to enable a definitive diagnosis. In this work, Fourier transform infrared spectroscopy (FTIR) is used to identify lard adulteration in cow, lamb, and chicken samples. A simplified extraction method was applied to obtain the lipids from pure and adulterated meat. Adulterated samples were prepared by mixing lard with chicken, lamb, and beef at different concentrations (10%–50% v/v). Principal component analysis (PCA) and partial least squares (PLS) were used to develop a calibration model over 800–3500 cm−1. Three-dimensional PCA, with the spectrum divided into three regions, successfully classified lard adulteration in the chicken, lamb, and beef samples. The FTIR peaks corresponding to lard were observed at 1159.6, 1743.4, 2853.1, and 2922.5 cm−1, differentiating the chicken, lamb, and beef samples. These wavenumbers offer the highest coefficient of determination (R2 = 0.846) and the lowest root mean square errors of calibration (RMSEC) and prediction (RMSEP), with an accuracy of 84.6%. Even lard adulteration as low as 10% can be reliably detected using this methodology.
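PCA reduces each spectrum to a handful of scores along the directions of greatest variance, which is what makes class separation of pure and adulterated samples visible. A minimal SVD-based sketch, using random toy "spectra" rather than real FTIR data, might look like:

```python
import numpy as np

def pca_scores(X, n_components=3):
    """Project spectra onto the first principal components via SVD.

    X: (n_samples, n_wavenumbers) matrix of absorbance spectra.
    Returns the (n_samples, n_components) score matrix."""
    Xc = X - X.mean(axis=0)          # mean-centre each wavenumber channel
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T  # scores in PC space

# Hypothetical toy data: 4 samples x 5 wavenumber channels
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 5))
print(pca_scores(X).shape)  # (4, 3)
```

In the paper's setting, plotting these three scores per sample is what the "three-dimensional PCA" classification amounts to.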
47.
Classification of human electroencephalogram (EEG) signals can be achieved via artificial intelligence (AI) techniques. In particular, EEG signals associated with epileptic seizures can be detected to distinguish between epileptic and non-epileptic regions. From this perspective, an automated AI technique combined with digital signal processing can be used to improve these signals. This paper proposes two classifiers, long short-term memory (LSTM) and support vector machine (SVM), for the classification of seizure and non-seizure EEG signals. The classifiers are applied to a public dataset from the University of Bonn, which consists of two classes: seizure and non-seizure. In addition, a fast Walsh-Hadamard transform (FWHT) is implemented to analyze the EEG signals within the recurrence space of the brain, yielding the Hadamard coefficients of the EEG signals; the FWHT also helps to efficiently discriminate seizure EEG recordings from non-seizure recordings. A k-fold cross-validation technique is applied to validate the performance of the proposed classifiers. The LSTM classifier provides the best performance, with a testing accuracy of 99.00%; its training and testing loss rates are 0.0029 and 0.0602, respectively, and its weighted average precision, recall, and F1-score are all 99.00%. The SVM classifier reached an accuracy, sensitivity, and specificity of 91%, 93.52%, and 91.3%, respectively. The computational time consumed for training the LSTM and SVM is 2000 and 2500 s, respectively. The results show that the LSTM classifier outperforms the SVM in the classification of EEG signals, and the proposed classifiers provide high classification accuracy compared to previously published classifiers.
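The FWHT itself is a simple butterfly recursion that needs only additions and subtractions on signals whose length is a power of two. A minimal sketch (not the paper's implementation) is:

```python
def fwht(a):
    """Fast Walsh-Hadamard transform of a sequence whose length is a
    power of two; returns the (unnormalized) Hadamard coefficients."""
    a = list(a)
    h = 1
    while h < len(a):
        # Butterfly stage: combine elements h apart with sum/difference
        for i in range(0, len(a), h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a

print(fwht([1, 0, 1, 0]))  # [2, 2, 0, 0]
```

Applying the transform twice recovers the input scaled by its length, which is the usual sanity check for an unnormalized FWHT.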
48.
Magnetic resonance imaging (MRI) brain tumor segmentation is a crucial task for clinical treatment. However, it is challenging owing to variations in the type, size, and location of tumors; in addition, anatomical variation between individuals, intensity non-uniformity, and noise adversely affect segmentation. To address these challenges, this paper presents an automatic region-based brain tumor segmentation approach that combines a fuzzy shape prior term with deep learning. We define a new energy function in which an Adaptively Regularized Kernel-Based Fuzzy C-Means (ARKFCM) clustering algorithm is utilized to infer the shape of the tumor, which is then embedded into the level set method. In this way, shortcomings of traditional level set methods, such as contour leakage and shrinkage, are eliminated. Moreover, a fully automated method is achieved by using U-Net to obtain the initial contour, reducing sensitivity to initial contour selection. The proposed method is validated on the BraTS 2017 benchmark dataset for brain tumor segmentation. Average values of Dice, Jaccard, sensitivity, and specificity are 0.93 ± 0.03, 0.86 ± 0.06, 0.95 ± 0.04, and 0.99 ± 0.003, respectively. Experimental results indicate that the proposed method outperforms other state-of-the-art methods in brain tumor segmentation.
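Standard fuzzy c-means, the algorithm that ARKFCM extends with adaptive regularization, alternates between updating cluster centres and soft memberships. A bare-bones sketch on toy 1-D "intensities" (the data and parameters are illustrative, not taken from the paper) is:

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=50, seed=0):
    """Minimal fuzzy c-means: returns cluster centres and the fuzzy
    membership matrix U (c x n, columns sum to 1). A plain stand-in
    for the ARKFCM step, without the adaptive regularization."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                      # memberships sum to 1 per point
    for _ in range(iters):
        Um = U ** m
        centres = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # distances of every point to every centre (small eps avoids /0)
        d = np.linalg.norm(X[None, :, :] - centres[:, None, :], axis=2) + 1e-9
        U = d ** (-2 / (m - 1))
        U /= U.sum(axis=0)
    return centres, U

# Two well-separated toy intensity clusters
X = np.array([[0.0], [0.1], [0.9], [1.0]])
centres, U = fcm(X)
print(sorted(round(float(v), 2) for v in centres.ravel()))
```

The soft memberships in U, rather than hard labels, are what make a fuzzy shape prior possible in the level set formulation.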
49.
50.
In this paper, adaptive robust control of uncertain systems with multiple time delays in the states and input is considered. The parameter uncertainties are assumed to be time-varying and norm-bounded, with unknown bounds but known functional properties. To overcome the effect of the input delay on closed-loop stability, a new Lyapunov–Krasovskii functional is introduced. It is shown that the proposed adaptive robust controller guarantees globally uniform exponential convergence of all system solutions to a ball, with any prescribed convergence rate; moreover, if there is no disturbance in the system, asymptotic stability of the closed-loop system is established. The proposed design condition is formulated as a linear matrix inequality (LMI), which can be easily solved with the LMI Toolbox in Matlab. Finally, an illustrative example shows the effectiveness of the results developed in this paper.
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号