993.
In this study, we propose an effective method to estimate the reliability of finite element models reduced by the automated multi-level substructuring (AMLS) method. The proposed error estimation method can accurately predict relative eigenvalue errors in reduced finite element models. A new, enhanced transformation matrix for the AMLS method is derived from the original transformation matrix by properly considering the contribution of residual substructural modes. The enhanced transformation matrix is an important prerequisite to develop the error estimation method. Adopting the basic concept of the error estimation method recently developed for the Craig–Bampton method, an error estimation method is developed for the AMLS method. Through various numerical examples, we demonstrate the accuracy of the proposed error estimation method and explore its computational efficiency. Copyright © 2015 John Wiley & Sons, Ltd.
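The quantity this abstract targets, the relative eigenvalue error of a reduced model, can be illustrated with a toy sketch. The reduction below is plain Guyan condensation on a hypothetical spring-mass chain, not AMLS; it only shows the error measure that such estimators aim to predict without solving the full problem.

```python
import numpy as np

# Toy system: K x = lambda M x for a fixed-fixed spring-mass chain
n = 8
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # stiffness
M = np.eye(n)                                            # unit masses

lam_full = np.linalg.eigvalsh(K)  # reference eigenvalues (M = I)

# Statically condense out the last 4 DOFs; T maps reduced -> full DOFs
keep, drop = np.arange(4), np.arange(4, n)
T = np.vstack([np.eye(4),
               -np.linalg.solve(K[np.ix_(drop, drop)], K[np.ix_(drop, keep)])])
Kr, Mr = T.T @ K @ T, T.T @ M @ T
lam_red = np.sort(np.linalg.eigvals(np.linalg.solve(Mr, Kr)).real)

# Relative eigenvalue errors of the retained modes; a Rayleigh-Ritz
# reduction can only overestimate eigenvalues, so these are non-negative
rel_err = (lam_red - lam_full[:4]) / lam_full[:4]
print(rel_err)
```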
994.
A framework to validate and generate curved nodal high-order meshes on Computer-Aided Design (CAD) surfaces is presented. The proposed framework is of major interest for generating meshes suitable for thin-shell and 3D finite element analysis with unstructured high-order methods. First, we define a distortion (quality) measure for high-order meshes on parameterized surfaces that we prove to be independent of the surface parameterization. Second, we derive a smoothing and untangling procedure based on the minimization of a regularization of the proposed distortion measure. The minimization is performed in terms of the parametric coordinates of the nodes, so that the nodes slide on the surfaces. Moreover, the proposed algorithm repairs invalid curved meshes (untangling), deals with arbitrary polynomial degrees (high-order), and handles low-quality CAD parameterizations (independence of parameterization). Third, we use the optimization procedure to generate curved nodal high-order surface meshes through an a posteriori approach: given a linear mesh, we increase the polynomial degree of the elements, curve them to match the geometry, and optimize the locations of the nodes to ensure mesh validity. Finally, we present several examples to demonstrate the features of the optimization procedure and to illustrate the surface mesh generation process. Copyright © 2015 John Wiley & Sons, Ltd.
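The key idea of optimizing nodes in parametric coordinates, so they slide along the geometry and never leave it, can be sketched in a toy 1D analogue. A 3-node element lies on the curve gamma(t) = (t, t^2) with fixed end nodes; the mid-node's parameter u is chosen to balance arc length on either side, a made-up stand-in for the paper's parameterization-independent distortion measure.

```python
import numpy as np

def gamma(t):
    # Parameterized geometry the node must stay on
    return np.array([t, t * t])

def arclen(a, b, steps=400):
    ts = np.linspace(a, b, steps)
    pts = np.stack([gamma(t) for t in ts])
    return float(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))

def distortion(u):
    # Deviation from an equal arc-length split at the mid-node
    return (arclen(0.0, u) - arclen(u, 1.0)) ** 2

# Coarse 1D search over the mid-node's parametric coordinate
grid = np.linspace(0.05, 0.95, 181)
u_opt = float(grid[np.argmin([distortion(u) for u in grid])])
print(u_opt)  # > 0.5: the mid-node slides toward the steeper end of the curve
```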
995.
Under abnormal conditions, timely and effective decisions on system recovery and protective measures are of great significance for safety-critical systems. Knowledge of the roles that network nodes play in the spreading process is crucial for developing efficient maintenance decisions; singling out 'pivotal spreaders' for preferential control may be a way to maximize the chances of hindering fault pervasion in time. Inspired by the inhomogeneous topological nature of complex fault propagation networks, this study explores the spreading capabilities of nodes in terms of both structural connectivity and causal influence strength, so as to support decisions on preferential recovery actions under specific fault scenarios. Specifically, dynamic betweenness centrality and nonsymmetrical entropy are incorporated to adaptively measure the system-wide fault diffusion risk of a set of controllable fault events. To model the dynamics and uncertainties involved in the complex fault spreading process, we introduce the dynamic uncertain causality graph model, on which time-varying structure decomposition and causality reduction are applied to improve reasoning efficiency. Verification experiments consisting of simulated calculation cases and generator faults of a nuclear power plant empirically show the effectiveness and applicability of this method in large-scale engineering practice. Copyright © 2014 John Wiley & Sons, Ltd.
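The structural-connectivity half of the spreader score can be illustrated by ranking nodes of a small hypothetical fault-propagation network by ordinary (static) betweenness centrality; the paper itself uses a dynamic betweenness plus a nonsymmetrical-entropy term on a causality graph, which this toy does not attempt.

```python
from collections import deque
from itertools import combinations

# Hypothetical 5-node fault network (undirected adjacency list)
graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}

def shortest_paths(src, dst):
    """All shortest paths from src to dst, by breadth-first search."""
    best, paths = None, []
    q = deque([[src]])
    while q:
        path = q.popleft()
        if best is not None and len(path) > best:
            break  # all remaining queued paths are longer
        node = path[-1]
        if node == dst:
            best = len(path)
            paths.append(path)
            continue
        for nxt in graph[node]:
            if nxt not in path:
                q.append(path + [nxt])
    return paths

def betweenness(v):
    """Fraction of shortest paths between other node pairs passing through v."""
    score = 0.0
    for s, t in combinations(graph, 2):
        if v in (s, t):
            continue
        sp = shortest_paths(s, t)
        score += sum(v in p for p in sp) / len(sp)
    return score

ranking = sorted(graph, key=betweenness, reverse=True)
print(ranking[0])  # node 3 sits on the most shortest paths
```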
996.
The lifetime and reliability of Blu-Ray Recordable media were estimated with an acceleration test varying the temperature and the relative humidity. One brand of media showed a relatively long lifetime of over 80 years, but another brand could not be measured because the media broke during the acceleration procedure; the lifetime of the media thus depends strongly on the brand. The effect of the initial recording performance, in terms of the random symbol error rate and the jitter, on the reliability was analyzed. The random symbol error rate showed a strong correlation with the reliability of the Blu-Ray Recordable media. Copyright © 2014 John Wiley & Sons, Ltd.
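Temperature/humidity acceleration tests of this kind are commonly extrapolated to use conditions with an Eyring-style acceleration factor. The sketch below uses that generic model with invented constants (Ea, B, and the stress/use conditions are not from the abstract).

```python
import math

def acceleration_factor(T_stress_C, RH_stress, T_use_C=25.0, RH_use=0.50,
                        Ea_eV=0.8, B=0.02):
    """Generic Arrhenius-in-T, exponential-in-RH acceleration factor.
    All parameter values here are illustrative, not the paper's."""
    k = 8.617e-5  # Boltzmann constant, eV/K
    T1, T0 = T_stress_C + 273.15, T_use_C + 273.15
    arrhenius = math.exp(Ea_eV / k * (1.0 / T0 - 1.0 / T1))
    humidity = math.exp(B * (RH_stress * 100 - RH_use * 100))
    return arrhenius * humidity

# A disc surviving 1000 h at 80 C / 85% RH maps to this use-condition life:
af = acceleration_factor(80.0, 0.85)
years = 1000.0 * af / (24 * 365)
print(af, years)
```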
997.
The exponentially weighted moving average (EWMA) model has been successfully used in acceptance sampling plans, because it carries quality information from both the current lot and the preceding lots. A multiple dependent state (MDS) sampling plan likewise considers the quality information of the preceding lots. In this study, we present two new sampling plans for linear profiles: one based on the EWMA model with a yield index using the single sampling plan, and the other based on the EWMA model with a yield index using MDS sampling plans. The plan parameters are determined by a nonlinear optimization approach. When the smoothing parameter equals one, the first proposed plan reduces to the traditional single sampling plan. We compare the proposed plans with the traditional single sampling plan; the results indicate that the MDS sampling plan based on the EWMA model with a yield index, with a smaller smoothing parameter, requires fewer samples than both the traditional single sampling plan and the EWMA-based single sampling plan. One real example is used to illustrate the proposed plan. Copyright © 2015 John Wiley & Sons, Ltd.
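The EWMA recursion underlying both proposed plans, and its collapse to single sampling when the smoothing parameter is one, can be sketched directly (the yield-index values below are invented; acceptance constants are omitted):

```python
def ewma(values, lam):
    """Z_t = lam * X_t + (1 - lam) * Z_{t-1}, seeded with the first lot."""
    z = values[0]
    out = [z]
    for x in values[1:]:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

lots = [1.05, 1.10, 0.98, 1.20]   # hypothetical yield-index values per lot
print(ewma(lots, 0.2))
# With lam = 1 the statistic ignores history, so the plan collapses to the
# traditional single sampling plan on the current lot alone:
assert ewma(lots, 1.0) == lots
```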
998.
The reference range is a statistic used in health-related fields to represent the range of the most likely values for a variable of interest; based on this range, individuals are classified as healthy or unhealthy. In biostatistics, the reference range is calculated as the (1 − α)% prediction interval, where this interval is based on the population variance estimated from the data. Such estimation of the population variance is imprecise, because test results usually carry errors arising from the test procedure or gauge used. In this paper, the total variability in the data is decomposed into two components: patient-to-patient variability and variability due to the measurement system. The two components are estimated through a gauge repeatability and reproducibility study, and the reference range is then calculated taking into account only the patient-to-patient variability. The revised reference range procedure is illustrated through a case study of vitamin B12 test results. A closed-form formula is given for the probability of a given test result falling within the revised reference range. Copyright © 2015 John Wiley & Sons, Ltd.
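The variance decomposition described above can be sketched with invented numbers (the mean and variances below are illustrative, not the paper's vitamin B12 data):

```python
import math

z = 1.96                 # two-sided 95% interval
mean_b12 = 500.0         # hypothetical vitamin B12 mean, pmol/L
var_total = 150.0 ** 2   # variance of the observed test results
var_gauge = 60.0 ** 2    # measurement-system variance from the gauge R&R study
var_patient = var_total - var_gauge  # patient-to-patient component

# Naive range uses total variance; revised range uses only patient variance
naive = (mean_b12 - z * math.sqrt(var_total),
         mean_b12 + z * math.sqrt(var_total))
revised = (mean_b12 - z * math.sqrt(var_patient),
           mean_b12 + z * math.sqrt(var_patient))
print(naive, revised)
# The revised range is narrower: measurement noise no longer inflates it.
```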
999.
This article analyzes the simultaneous control of several correlated Poisson variables using the Variable Dimension Linear Combination of Poisson Variables (VDLCP) control chart, a variable-dimension version of the LCP chart. This control chart uses as its test statistic a linear combination of correlated Poisson variables in an adaptive way; that is, it monitors either p1 or p variables (p1 < p) depending on the last value of the statistic. To analyze the performance of this chart, we developed software that finds the best parameters, optimizing the out-of-control average run length (ARL) for a shift the practitioner wishes to detect as quickly as possible, subject to a fixed in-control ARL. Markov chains and genetic algorithms were used in developing this software. The results show improved performance compared to the LCP chart. Copyright © 2015 John Wiley & Sons, Ltd.
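The adaptive dimension-switching rule can be sketched as follows; all weights, limits, and Poisson rates are invented for illustration (the paper optimizes such parameters for ARL via Markov chains and genetic algorithms):

```python
import numpy as np

c_small = np.array([1.0, 0.5])            # weights for the reduced set, p1 = 2
c_full  = np.array([1.0, 0.5, 0.8, 0.3])  # weights for all p = 4 variables
warn, ucl = 8.0, 14.0                     # warning and upper control limits

def vdlcp_step(x, use_full):
    """One monitoring step: returns (statistic, out_of_control, use_full_next)."""
    stat = float(c_full @ x) if use_full else float(c_small @ x[:2])
    # After a warning-zone value the next sample monitors all p variables;
    # otherwise it drops back to the cheaper p1-variable statistic.
    return stat, stat > ucl, stat > warn

# Simulated in-control run
rng = np.random.default_rng(1)
use_full = False
for _ in range(20):
    x = rng.poisson(lam=[2.0, 1.5, 1.0, 0.5])
    stat, signal, use_full = vdlcp_step(x, use_full)
    if signal:
        print("out-of-control signal")
        break
```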
1000.
Among a set of tools that form the core of statistical process control, statistical control charts are most commonly used for controlling, monitoring, and improving processes. The conventional control charts are based on the assumption that the distribution of the quality characteristic to be monitored follows the normal distribution. However, in real applications, many process distributions may follow a positively skewed distribution such as the lognormal distribution. In this study, we discuss the construction of several control charts for monitoring the mean of the lognormal distribution. A real example is used to demonstrate how these charts can be applied in practice. Copyright © 2015 John Wiley & Sons, Ltd.
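The simplest such construction charts the log-transformed data with Shewhart-type 3-sigma limits; the sketch below uses invented in-control parameters and subgroup means (the study compares several more refined charts).

```python
import math

mu, sigma, n = 2.0, 0.25, 5              # in-control log-scale parameters
half_width = 3.0 * sigma / math.sqrt(n)  # 3-sigma limits for subgroup means
lcl, ucl = mu - half_width, mu + half_width

def signals(log_subgroup_means):
    """Flag subgroups whose log-scale mean falls outside the limits."""
    return [not (lcl <= m <= ucl) for m in log_subgroup_means]

print(signals([2.01, 1.95, 2.40]))  # [False, False, True]
```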

Copyright©北京勤云科技发展有限公司  京ICP备09084417号