Article Search
  Subscription full text   503 articles
  Free   16 articles
  Free (domestic)   1 article
Electrical engineering   8 articles
General   5 articles
Chemical industry   121 articles
Metalworking   7 articles
Machinery and instrumentation   16 articles
Building science   5 articles
Energy and power   37 articles
Light industry   32 articles
Radio and electronics   61 articles
General industrial technology   107 articles
Metallurgical industry   16 articles
Atomic energy technology   2 articles
Automation technology   103 articles
  2024   4 articles
  2023   15 articles
  2022   26 articles
  2021   33 articles
  2020   31 articles
  2019   35 articles
  2018   35 articles
  2017   24 articles
  2016   36 articles
  2015   13 articles
  2014   15 articles
  2013   60 articles
  2012   28 articles
  2011   31 articles
  2010   34 articles
  2009   13 articles
  2008   17 articles
  2007   16 articles
  2006   16 articles
  2005   10 articles
  2004   8 articles
  2003   4 articles
  2002   3 articles
  2001   2 articles
  2000   1 article
  1999   1 article
  1997   1 article
  1996   4 articles
  1994   1 article
  1993   1 article
  1980   1 article
  1976   1 article
Sort order: 520 results found in total (search time: 0 ms)
1.
2.
In this work, a Wiener-type nonlinear black box model was developed for capturing the dynamics of open-loop stable MIMO nonlinear systems with deterministic inputs. The linear dynamic component of the model was parameterized using orthogonal Laguerre filters, while the nonlinear state output map was constructed using either quadratic polynomial functions or artificial neural networks. The properties of the resulting model, such as open-loop stability and steady-state behavior, are discussed in detail. The identified Wiener-Laguerre model was further used to formulate a nonlinear model predictive control (NMPC) scheme. The efficacy of the proposed modeling and control scheme was demonstrated using two benchmark control problems: (a) a simulation study involving control of a continuously operated fermenter at its optimum (singular) operating point and (b) experimental verification involving control of pH at the critical point of a neutralization process. It was observed that the proposed Wiener-Laguerre model is able to capture both the dynamic and steady-state characteristics of the continuous fermenter as well as the neutralization process reasonably accurately over wide operating ranges. The proposed NMPC scheme achieved a smooth transition from a suboptimal operating point to the optimum (singular) operating point of the fermenter without causing large variations in the manipulated inputs. The proposed NMPC scheme was also found to be robust in the face of moderate perturbations in the unmeasured disturbances. In the case of experimental verification using the neutralization process, the proposed control scheme was found to achieve a much faster transition to a set point close to the critical point than a conventional gain-scheduled PID controller.
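A minimal sketch of the Wiener-Laguerre structure described above, for a single input channel: a bank of orthogonal discrete Laguerre filters provides the linear dynamics, and a static quadratic polynomial forms the nonlinear output map. The filter pole, the number of filters, and the polynomial coefficients below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def laguerre_ss(n_filters, a):
    """State-space realization of a discrete Laguerre filter bank.
    a is the (assumed) filter pole, 0 <= a < 1."""
    beta = 1.0 - a**2
    A = np.zeros((n_filters, n_filters))
    B = np.zeros(n_filters)
    for i in range(n_filters):
        A[i, i] = a
        B[i] = np.sqrt(beta) * (-a)**i
        for j in range(i):
            A[i, j] = (-a)**(i - j - 1) * beta
    return A, B

def simulate_wiener_laguerre(u, A, B, c0, c, Q):
    """Propagate the Laguerre filter states and pass them through a
    static quadratic polynomial map to obtain the model output."""
    x = np.zeros(A.shape[0])
    y = []
    for uk in u:
        y.append(c0 + c @ x + x @ Q @ x)   # static nonlinear output map
        x = A @ x + B * uk                 # linear Laguerre dynamics
    return np.array(y)

# Illustrative use with assumed pole and coefficients (not from the paper)
A, B = laguerre_ss(n_filters=4, a=0.6)
rng = np.random.default_rng(0)
c0, c, Q = 0.1, rng.normal(size=4), 0.05 * np.eye(4)
y = simulate_wiener_laguerre(np.ones(50), A, B, c0, c, Q)
```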
3.
Data mining has proven to be a reliable technique for analyzing road accidents and producing useful results. Most road accident data analyses use data mining techniques, focusing on identifying the factors that affect the severity of an accident. However, any damage resulting from road accidents is always unacceptable in terms of health, property damage and other economic factors. Road accidents are sometimes found to occur more frequently at certain specific locations, and analyzing these locations can help identify the road accident features that make accidents happen frequently there. Association rule mining is a popular data mining technique that identifies correlations among the various attributes of road accidents. In this paper, we first applied the k-means algorithm to group the accident locations into three categories: high-frequency, moderate-frequency and low-frequency accident locations. The k-means algorithm takes the accident frequency count as the parameter for clustering the locations. We then used association rule mining to characterize these locations. The rules revealed different factors associated with road accidents at locations with different accident frequencies. The association rules for high-frequency accident locations disclosed that intersections on highways are dangerous for every type of accident; high-frequency accident locations mostly involved two-wheeler accidents in hilly regions. In moderate-frequency accident locations, colonies near local roads and intersections on highways were found to be dangerous for pedestrian-hit accidents. Low-frequency accident locations are scattered throughout the district, and most of the accidents at these locations were not critical. Although the data set was limited to selected attributes, our approach extracted useful hidden information from the data, which can be utilized to take preventive measures at these locations.
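The two-stage analysis described above can be sketched as follows, assuming a scikit-learn / mlxtend workflow; the file name and the attribute columns (location_id, road_type, vehicle_type, severity) are hypothetical placeholders, not the columns of the actual data set.

```python
import pandas as pd
from sklearn.cluster import KMeans
from mlxtend.frequent_patterns import apriori, association_rules

# 'accidents' is an assumed DataFrame with one row per accident and
# categorical attributes such as location_id, road_type, vehicle_type, ...
accidents = pd.read_csv("accidents.csv")          # hypothetical file

# Stage 1: cluster locations by accident frequency (high / moderate / low)
freq = accidents.groupby("location_id").size().rename("count").reset_index()
freq["cluster"] = KMeans(n_clusters=3, n_init=10, random_state=0) \
    .fit_predict(freq[["count"]])
accidents = accidents.merge(freq[["location_id", "cluster"]], on="location_id")

# Stage 2: mine association rules separately within each frequency cluster
for label, group in accidents.groupby("cluster"):
    onehot = pd.get_dummies(group[["road_type", "vehicle_type", "severity"]]).astype(bool)
    frequent = apriori(onehot, min_support=0.1, use_colnames=True)
    rules = association_rules(frequent, metric="confidence", min_threshold=0.6)
    print(label, rules[["antecedents", "consequents", "confidence"]].head())
```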
4.
Journal of Signal Processing Systems - Segmentation of thigh tissues (muscle, fat, inter-muscular adipose tissue (IMAT), bone, and bone marrow) from magnetic resonance imaging (MRI) scans is useful...
5.
Electrohydrodynamic (EHD) processes are promising techniques for manufacturing nanoscopic products with different shapes (such as thin films, nanofibers, 2D/3D nanostructures, and nanoparticles) and materials at a low cost using simple equipment. A key challenge in their adoption by nonexperts is the enormous time and resources required to identify the optimum design/process parameters for the underlying material and EHD system. Machine learning (ML) has made exciting advancements in predictive modeling of different processes, provided it is trained on high-quality datasets of appropriate volume. This article extends the suitability of such ML-enabled approaches to a new technological domain: EHD spraying and drop-on-demand printing. Different ML models, such as ridge regression, random forest regression, support vector regression, gradient boosting regression, and multilayer perceptrons, are trained, and their performance is examined using evaluation metrics such as RMSE and the R² score. Tree-based algorithms such as gradient boosting regression are found to be the most suitable technique for modeling EHD processes. The trained ML models show substantially higher accuracy (average error < 5%) in replicating these nonlinear processes than previously reported scaling laws (average error ≈ 42%) and are well suited for predictive modeling/analysis of the underlying EHD system and process.
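A minimal sketch of the model-comparison loop described above, using scikit-learn implementations of the five regressors and the RMSE / R² metrics; the synthetic features stand in for EHD process parameters (e.g., voltage, flow rate, nozzle diameter) and are not data from the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.linear_model import Ridge
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor

# X: stand-ins for process parameters, y: stand-in for the printed feature size
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))
y = 2.0 * X[:, 0] ** 2 + np.sin(3 * X[:, 1]) + 0.5 * X[:, 2] + 0.05 * rng.normal(size=200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "ridge": Ridge(alpha=1.0),
    "random_forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "svr": SVR(C=10.0),
    "gradient_boosting": GradientBoostingRegressor(random_state=0),
    "mlp": MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name:18s} RMSE={rmse:.3f}  R2={r2_score(y_te, pred):.3f}")
```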
6.
The present paper deals with a fractional version of the dynamical system introduced by C. Liu, L. Liu and T. Liu [C. Liu, L. Liu, T. Liu, A novel three-dimensional autonomous chaos system, Chaos Solitons Fractals 39 (4) (2009) 1950–1958]. Numerical investigations of the dynamics of this system have been carried out. Properties of the system have been analyzed by means of Lyapunov exponents. Furthermore, the minimum effective dimensions for chaos to exist have been identified for commensurate and incommensurate orders. It is noteworthy that the results obtained are consistent with the analytical conditions given in the literature.
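For context, a commensurate-order fractional system D^q x = f(x) can be simulated with the Grünwald-Letnikov scheme sketched below; the Liu-type right-hand side and its parameters are illustrative assumptions and may differ from the system studied in the paper.

```python
import numpy as np

def gl_fractional_solve(f, x0, q, h, n_steps):
    """Grunwald-Letnikov scheme for the commensurate fractional system
    D^q x = f(x): x_k = f(x_{k-1}) h^q - sum_{j=1..k} c_j x_{k-j}."""
    dim = len(x0)
    # binomial coefficients: c_0 = 1, c_j = (1 - (1+q)/j) c_{j-1}
    c = np.empty(n_steps + 1)
    c[0] = 1.0
    for j in range(1, n_steps + 1):
        c[j] = (1.0 - (1.0 + q) / j) * c[j - 1]
    x = np.zeros((n_steps + 1, dim))
    x[0] = x0
    for k in range(1, n_steps + 1):
        memory = c[1:k + 1][:, None] * x[k - 1::-1]   # c_j * x_{k-j}
        # explicit evaluation of f at the previous step
        x[k] = f(x[k - 1]) * h**q - memory.sum(axis=0)
    return x

# Liu-type right-hand side with illustrative parameters (assumed, not
# necessarily the values used in the cited paper)
a, b, c_, k_, h_ = 10.0, 40.0, 2.5, 1.0, 4.0
liu = lambda s: np.array([a * (s[1] - s[0]),
                          b * s[0] - k_ * s[0] * s[2],
                          -c_ * s[2] + h_ * s[0] ** 2])

traj = gl_fractional_solve(liu, x0=np.array([0.2, 0.0, 0.5]), q=0.95,
                           h=0.005, n_steps=5000)
```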
7.
This work provides a framework for nominal and robust stability analysis for a class of discrete-time nonlinear recursive observers (DNRO). Given that the system has a linear output mapping, is locally observable, and has Jacobian matrices satisfying certain conditions, the nominal and robust stability of the DNRO is defined through the properties of the estimation error dynamics and is analyzed using Lyapunov theory. Moreover, a simultaneous state and parameter estimation scheme is shown to be Input-to-State Stable (ISS) and to adaptively reduce plant-model mismatch online. Three design strategies for the DNRO that satisfy the stability results are given as examples, including the widely used extended Kalman filter, the extended Luenberger observer, and the fixed-gain observer.
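Of the three observer designs mentioned, the extended Kalman filter is the most familiar; below is a minimal discrete-time EKF sketch for a plant with a linear output map y = C x. The process model, its Jacobian, and the noise covariances are supplied as placeholders, not as the observer tuning used in the paper.

```python
import numpy as np

def ekf_step(x_hat, P, u, y, f, F_jac, C, Q, R):
    """One predict/update cycle of a discrete-time extended Kalman filter
    for x_{k+1} = f(x_k, u_k) + w_k, y_k = C x_k + v_k."""
    # prediction using the nonlinear model and its Jacobian
    x_pred = f(x_hat, u)
    F = F_jac(x_hat, u)
    P_pred = F @ P @ F.T + Q
    # correction with the linear output map
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (y - C @ x_pred)
    P_new = (np.eye(len(x_hat)) - K @ C) @ P_pred
    return x_new, P_new

# Illustrative 2-state plant (assumed): a discretized damped pendulum-like model
dt = 0.05
f = lambda x, u: np.array([x[0] + dt * x[1], x[1] - dt * np.sin(x[0]) + dt * u])
F_jac = lambda x, u: np.array([[1.0, dt], [-dt * np.cos(x[0]), 1.0]])
C = np.array([[1.0, 0.0]])
Q, R = 1e-4 * np.eye(2), np.array([[1e-2]])
x_hat, P = np.zeros(2), np.eye(2)
x_hat, P = ekf_step(x_hat, P, u=0.0, y=np.array([0.3]), f=f, F_jac=F_jac, C=C, Q=Q, R=R)
```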
8.
A key issue that needs to be addressed while performing fault diagnosis using black box models is robustness against abrupt changes in unknown inputs. A fundamental difficulty with the robust FDI design approaches available in the literature is that they require some a priori knowledge of the model for unmeasured disturbances or modeling uncertainty. In this work, we propose a novel approach for modeling abrupt changes in unmeasured disturbances when the innovation form of a state space model (i.e., a black box observer) is used for fault diagnosis. A disturbance coupling matrix is developed using the singular value decomposition of the extended observability matrix and is further used to formulate a robust fault diagnosis scheme based on the generalized likelihood ratio (GLR) test. The proposed modeling approach does not require any a priori knowledge of how these faults affect the system dynamics. To isolate sensor and actuator biases from step jumps in unmeasured disturbances, a statistically rigorous method is developed for distinguishing between faults modeled using different numbers of parameters. Simulation studies on a heavy oil fractionator example show that the proposed FDI methodology based on identified models can effectively distinguish between sensor biases, actuator biases and other soft faults caused by changes in unmeasured disturbance variables. The fault tolerant control scheme, which makes use of the proposed robust FDI methodology, gives significantly better control performance than conventional controllers when soft faults occur. The experimental evaluation of the proposed FDI methodology on a laboratory scale stirred tank temperature control set-up corroborates these conclusions.
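As context for the fault detection step, a simplified scalar generalized likelihood ratio test for a step change (e.g., a sensor bias) in an innovation sequence can be sketched as follows; the paper's multivariate formulation built around the disturbance coupling matrix is not reproduced here, and the injected fault below is synthetic.

```python
import numpy as np

def glr_step_change(e, sigma2):
    """Scalar GLR statistic for a step change of unknown magnitude in the
    mean of an innovation sequence e (variance sigma2), maximized over the
    unknown change time."""
    best_stat, best_t = 0.0, None
    for t in range(len(e)):
        seg = e[t:]
        # log-likelihood ratio with the bias magnitude replaced by its MLE
        stat = seg.sum() ** 2 / (len(seg) * sigma2)
        if stat > best_stat:
            best_stat, best_t = stat, t
    return best_stat, best_t

rng = np.random.default_rng(1)
innov = rng.normal(0.0, 0.5, 300)
innov[180:] += 1.2                      # injected sensor-bias-like step
stat, t_hat = glr_step_change(innov, sigma2=0.25)
print(f"GLR statistic {stat:.1f}, estimated fault onset {t_hat}")
```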
9.
Historical-data-based fault diagnosis methods exploit two key strengths of multivariate statistical approaches, viz.: (i) data compression ability, and (ii) discriminatory ability. It has been shown that correspondence analysis (CA) is superior to principal component analysis (PCA) on both these counts (Detroja, Gudi, Patwardhan, & Roy, 2006a), and hence is better suited for the task of fault detection and isolation (FDI). In this paper, we propose a CA-based methodology for fault diagnosis that can facilitate significant data reduction as well as better discrimination. The proposed methodology is based on the principle of distributional equivalence (PDE). The PDE is a property unique to the CA algorithm and can be very useful in analyzing large datasets. When applied to historical data sets for FDI, the principle can significantly reduce the size of the data matrix without significantly affecting the discriminatory ability of the CA algorithm, thereby reducing the computational load during statistical model building. The data reduction ability of the proposed methodology is demonstrated using a simulation case study involving the benchmark quadruple-tank laboratory process. When applied to experimental data obtained from the quadruple-tank process, the proposed methodology also demonstrated the data reduction capabilities of the principle of distributional equivalence. The above aspect has also been validated for large-scale data sets using the benchmark Tennessee Eastman process simulation case study.
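A basic correspondence analysis can be sketched as below: form the correspondence matrix, compute standardized residuals with respect to the independence model, and take their SVD. The nonnegative data matrix here is synthetic and assumes any preprocessing required for process data has already been applied.

```python
import numpy as np

def correspondence_analysis(X, n_comp=2):
    """Basic correspondence analysis of a nonnegative data matrix X.
    Returns row and column principal coordinates and the singular values."""
    P = X / X.sum()
    r, c = P.sum(axis=1), P.sum(axis=0)
    # standardized residuals between the observed profiles and independence
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U[:, :n_comp] * s[:n_comp]) / np.sqrt(r)[:, None]
    col_coords = (Vt.T[:, :n_comp] * s[:n_comp]) / np.sqrt(c)[:, None]
    return row_coords, col_coords, s

# Illustrative nonnegative data matrix (assumed already preprocessed)
X = np.abs(np.random.default_rng(0).normal(2.0, 1.0, size=(100, 6)))
rows, cols, singular_values = correspondence_analysis(X)
```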
10.
Ubiquitous computing refers to building a global computing environment where seamless and invisible access to computing resources is provided to the user. Pervasive computing deals with acquiring context knowledge from the environment and providing dynamic, proactive and context-aware services to the user. A Ubiquitous computing environment is created by sharing knowledge and information between Pervasive computing environments. In this paper, we propose a framework that uses the potential of the Semantic Web to weave Pervasive computing environments into a Ubiquitous computing environment. We discuss how the collaboration of these Pervasive environments can create an effective Ubiquitous computing environment, referred to herein as the Integrated Global Pervasive Computing Framework (IGPF). We test the effectiveness of the Ubiquitous environment through a small medical emergency scenario handled by a prototype system that we have implemented on top of this framework.