41.

Wave absorbers are fundamental building blocks for the manipulation of light, and almost all optical systems exploit absorbers to realize certain functions. A highly tunable wide-band THz absorber is presented herein. A dual-bias scheme applied to a single graphene layer provides greater freedom to control the absorption response, while a conventional periodic array of graphene ribbons and a continuous graphene sheet are also exploited. In addition, a circuit-model representation of all the constituent parts of the proposed absorber is developed, together with an evolved design methodology. According to the simulation results, wide-band absorption from 3.5 to 6 THz is achieved.

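The circuit-model description lends itself to a quick numeric check: for a grounded (reflection-only) absorber, absorptance follows from the equivalent input impedance alone. A minimal sketch with a hypothetical series-RLC stand-in (the paper's actual dual-bias graphene circuit model is not reproduced here; the component values are illustrative):

```python
# Illustrative sketch, NOT the paper's circuit model: a grounded absorber
# described by an equivalent input impedance Z_in reflects but does not
# transmit, so absorptance A = 1 - |Gamma|^2 at the air interface.
import cmath

Z0 = 377.0  # free-space wave impedance, ohms

def input_impedance(f_thz, R=377.0, L=1e-13, C=1.25e-14):
    """Hypothetical series-RLC stand-in for the absorber's equivalent circuit."""
    w = 2 * cmath.pi * f_thz * 1e12
    return R + 1j * w * L + 1 / (1j * w * C)

def absorptance(f_thz):
    z = input_impedance(f_thz)
    gamma = (z - Z0) / (z + Z0)   # reflection coefficient
    return 1 - abs(gamma) ** 2    # no transmission through the ground plane

for f in (3.5, 4.5, 6.0):
    print(f"{f} THz: A = {absorptance(f):.3f}")
```

With R matched to Z0 the sketch absorbs nearly perfectly at the RLC resonance; tuning the bias in the actual device plays the role of changing these lumped values.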
42.
Acute lung injury (ALI) afflicts approximately 200,000 patients annually and has a 40% mortality rate. The COVID-19 pandemic has massively increased the rate of ALI incidence. The pathogenesis of ALI involves tissue damage from invading microbes and, in severe cases, the overexpression of inflammatory cytokines such as tumor necrosis factor-α (TNF-α) and interleukin-1β (IL-1β). This study aimed to develop a therapy to normalize the excess production of inflammatory cytokines and promote tissue repair in lipopolysaccharide (LPS)-induced ALI. Based on our previous studies, we tested insulin-like growth factor I (IGF-I) and BTP-2 therapies. IGF-I was selected because we and others have shown that elevated inflammatory cytokines suppress the expression of growth hormone receptors in the liver, leading to a decrease in circulating IGF-I. IGF-I is a growth factor that increases vascular protection, enhances tissue repair, and decreases pro-inflammatory cytokines. It is also required to produce anti-inflammatory 1,25-dihydroxyvitamin D. BTP-2, an inhibitor of cytosolic calcium, was used to suppress the LPS-induced increase in cytosolic calcium, which otherwise leads to an increase in pro-inflammatory cytokines. We showed that LPS increased the expression of primary inflammatory mediators such as toll-like receptor-4 (TLR-4), IL-1β, interleukin-17 (IL-17), TNF-α, and interferon-γ (IFN-γ), all of which were normalized in the lungs by the IGF-I + BTP-2 dual therapy, along with improved vascular gene expression markers. The histologic lung injury score, markedly elevated by LPS, was reduced to normal by the combination therapy. In conclusion, the LPS-induced increases in inflammatory cytokines, vascular injury, and lung injury were all improved by IGF-I + BTP-2 combination therapy.
43.
Electrospinning with a collector consisting of two pieces of electrically conductive substrate separated by a gap has been used to prepare uniaxially aligned PAN nanofibers. A 15 wt % solution of PAN in DMF was initially used for electrospinning. The effects of gap width and applied voltage on the degree of alignment were investigated with an image-processing technique based on the Fourier power spectrum method. The electrospinning conditions giving the best alignment of nanofibers for 10–15 wt % solution concentrations were determined experimentally. Multifilament-yarn-like bundles of uniaxially aligned nanofibers were prepared by a new, simple method. After-treatment of these bundles was carried out in boiling water under tension, and the crystallinity and mechanical behavior of treated and untreated bundles were compared. © 2006 Wiley Periodicals, Inc. J Appl Polym Sci 101: 4350–4357, 2006
44.
Modeling the mixing mechanism that governs pollutant transport in rivers is an important subject in environmental studies, and the longitudinal dispersion coefficient (D_L) is a key parameter in the river-mixing problem. In this study, two soft-computing techniques were developed to model and predict D_L in natural streams: multivariate adaptive regression splines (MARS), a relatively new approach for studying hydrologic phenomena, and a multi-layer perceptron (MLP) neural network, a common type of neural network model. A related dataset was collected from the literature and used to develop both models. The performance of the MARS model was compared with that of the MLP and with empirical formulas proposed to calculate D_L. Error indices computed for the MARS model, a coefficient of determination R² = 0.98 and a root-mean-square error RMSE = 0.89 in the testing stage, show that it is well suited to modeling D_L, and the comparison shows that MARS is more accurate than the ANN and the empirical formulas. Inspection of the structure of the developed MARS model and of the most accurate empirical formulas indicates that flow velocity, flow depth (H), and shear velocity are the parameters most influential on D_L.
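The R² and RMSE indices used to rank the models are standard; a minimal pure-Python sketch with made-up observed/predicted D_L values (not the paper's dataset):

```python
import math

def rmse(obs, pred):
    """Root-mean-square error between observed and predicted values."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def r_squared(obs, pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

# Hypothetical D_L values (m^2/s), for illustration only
observed  = [12.1, 30.4, 55.0, 80.2, 140.7]
predicted = [13.0, 28.9, 57.1, 78.5, 143.0]
print(f"R^2  = {r_squared(observed, predicted):.3f}")
print(f"RMSE = {rmse(observed, predicted):.3f}")
```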
45.
Software development processes have been evolving from rigid, pre-specified, and sequential to incremental and iterative. This evolution has been dictated by the need to accommodate evolving user requirements and to reduce the delay between a design decision and feedback from users. Formal verification techniques, however, have largely ignored this evolution: even where they have made enormous improvements and found significant use in practice, as in the case of model checking, they have remained confined to the niche of safety-critical systems. Model checking verifies whether a system's model \(\mathcal{M}\) satisfies a set of requirements formalized as a set of logic properties \(\Phi\). Current model-checking approaches, however, implicitly rely on the assumption that both the complete model \(\mathcal{M}\) and the whole set of properties \(\Phi\) are fully specified when verification takes place. Very often, however, \(\mathcal{M}\) is subject to change because its development is iterative and its definition evolves through stages of incompleteness in which alternative design decisions are explored, typically to evaluate quality trade-offs. Evolving system specifications of this kind call for novel verification approaches that tolerate incompleteness and support incremental analysis of alternative designs for certain functionalities. This is exactly the focus of this paper, which develops an incremental model-checking approach for evolving Statecharts. Statecharts have been chosen both because they are increasingly used in practice and because they natively support model refinement.
46.
This paper addresses the problem of defining a simple end-effector design for a robotic arm that is able to grasp a given set of planar objects. The OCOG (Objects COmmon Grasp search) algorithm proposed in this paper searches for a common grasp over the set of objects, mapping all possible grasps for each object that satisfy force-closure and quality criteria while taking into account the external wrenches (forces and torques) applied to the object. The mapped grasps are represented by feature vectors in a high-dimensional space; each feature vector describes a gripper design. A database of all possible grasps is generated for each object in the feature-vector space. A search algorithm then intersects the grasps over all objects to find a common grasp suitable for all of them. The search algorithm uses the kd-tree index structure to represent the database of feature-vector sets; the kd-tree enables an efficient, low-cost nearest-neighbor search for vectors common to the sets. Each common feature vector found is the grasp configuration for a group of objects, which implies the future end-effector design. A final step classifies the grasps found to subsets of the objects according to the common vectors. Simulations and experiments on four objects validate the feasibility of the proposed algorithm, which should be useful for standardizing end-effector design and reducing its engineering time.
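The intersection step can be sketched compactly. The paper relies on a kd-tree index for the nearest-neighbor queries; the brute-force stand-in below, with hypothetical two-dimensional grasp feature vectors, shows only the logic of keeping a grasp when every other object has a sufficiently close one:

```python
# Sketch of the common-grasp intersection (hypothetical data; the paper uses a
# kd-tree index, replaced here by brute-force search to keep the example short).
def nearest(vec, candidates):
    """Return (distance, candidate) of the closest feature vector."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min((dist(vec, c), c) for c in candidates)

def common_grasps(grasp_sets, tol=0.1):
    """Feature vectors from the first object whose nearest neighbor in every
    other object's grasp set lies within tol."""
    first, *rest = grasp_sets
    return [vec for vec in first
            if all(nearest(vec, other)[0] <= tol for other in rest)]

# Two hypothetical objects; 2-D grasp features (e.g. jaw opening, approach angle)
obj_a = [(1.0, 0.5), (2.0, 1.0), (3.0, 0.2)]
obj_b = [(1.02, 0.48), (5.0, 2.0)]
print(common_grasps([obj_a, obj_b]))  # → [(1.0, 0.5)]
```

Replacing the linear scan with a kd-tree query, as the paper does, drops each lookup from O(n) toward O(log n) for low-dimensional feature spaces.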
47.
In cloud computing, services play a key role: they are well-defined, autonomous components. The demand for fuzzy inference as a service is increasing in the domain of complex and critical systems, where the cost of detecting and fixing software defects grows as development proceeds. Using formal methods, which provide a clear, concise, mathematical interpretation of the system, is therefore crucial for the design of such fuzzy systems. To this end, we introduce the Fuzzy Inference Cloud Service (FICS), which provides fuzzy inference to consumers, and propose a novel discipline for its formal modeling. We also introduce four formal verification tests that allow strict analysis of certain behavioral disciplines of the FICS: (1) internal consistency, which analyzes the service in a strict and delicate manner; (2) deadlock freeness; (3) divergence freeness; and (4) goal reachability. The four tests are discussed, and the FICS is verified to ensure that it passes all of them.
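Of the four tests, deadlock freeness is the easiest to illustrate on an explicit transition system: a state with no outgoing transition can never make progress. The toy service model below is hypothetical and is not the actual FICS specification:

```python
def deadlocked_states(states, transitions):
    """Return the states with no outgoing transition (deadlocks).
    transitions: set of (source, label, target) triples."""
    has_out = {src for (src, _lbl, _tgt) in transitions}
    return sorted(s for s in states if s not in has_out)

# Toy request/response service (hypothetical, not the FICS process model)
states = {"idle", "infer", "reply", "stuck"}
transitions = {
    ("idle", "request", "infer"),
    ("infer", "result", "reply"),
    ("reply", "ack", "idle"),
    ("infer", "error", "stuck"),   # 'stuck' has no way out
}
print(deadlocked_states(states, transitions))  # → ['stuck']
```

A full model checker would additionally restrict the search to states reachable from the initial state; the sketch checks every declared state.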
48.
The temperature dependence of the diffusion coefficient of ethanol-soluble substances from ground cloves (particle size 250 μm) during extraction was estimated by fitting batch extraction data at several temperatures (27.8, 40, 50, and 60°C) to a previously developed mass transfer model based on spherical particle geometry. Nonlinear regression analysis was used to develop an equation that describes the diffusivity as a function of temperature. The temperature dependence of D_A was of the Arrhenius type.
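An Arrhenius-type dependence, D = D0·exp(-Ea/(R·T)), is conventionally estimated by regressing ln D on 1/T; a sketch with hypothetical diffusivities at the paper's extraction temperatures (the values and the resulting activation energy are illustrative, not the paper's results):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius_fit(temps_c, diffusivities):
    """Least-squares fit of ln D = ln D0 - Ea/(R*T); returns (D0, Ea)."""
    x = [1.0 / (t + 273.15) for t in temps_c]      # 1/T in 1/K
    y = [math.log(d) for d in diffusivities]
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return math.exp(intercept), -slope * R         # D0, Ea in J/mol

# Hypothetical diffusivities (m^2/s) at the paper's extraction temperatures
temps = [27.8, 40.0, 50.0, 60.0]
d_vals = [1.1e-11, 2.0e-11, 3.1e-11, 4.6e-11]
d0, ea = arrhenius_fit(temps, d_vals)
print(f"Ea ≈ {ea / 1000:.1f} kJ/mol")
```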
49.
In this paper, a novel algorithm for image encryption based on a hash function is proposed. A 512-bit external secret key is used as the input to the Salsa20 hash function, which is first modified to generate a key stream better suited to image encryption. The final encryption key stream is then produced by correlating the key stream with the plaintext, yielding both key sensitivity and plaintext sensitivity. The scheme achieves high sensitivity, high complexity, and high security with only two rounds of diffusion. In the first round, the original image is partitioned horizontally into an array of 1,024 sections of size 8 × 8; in the second round, the same operation is applied vertically to the transpose of the resulting array. The main idea of the algorithm is to use averages of the image data for encryption: to encrypt each section, the average of the other sections is employed, so the algorithm uses different averages for different input images (even with the same hash-based sequence). This significantly increases the resistance of the cryptosystem against known/chosen-plaintext and differential attacks. It is demonstrated that the 2D correlation coefficient (CC), peak signal-to-noise ratio (PSNR), encryption quality (EQ), entropy, mean absolute error (MAE), and decryption quality satisfy the security and performance requirements (CC < 0.002177, PSNR < 8.4642, EQ > 204.8, entropy > 7.9974, and MAE > 79.35). Number of pixel change rate (NPCR) analysis reveals that when only one pixel of the plain image is modified, almost all cipher pixels change (NPCR > 99.6125 %), and the unified average changing intensity is high (UACI > 33.458 %). Moreover, the proposed algorithm is very sensitive to small changes (e.g., modification of a single bit) in the external secret key (NPCR > 99.65 %, UACI > 33.55 %). The algorithm thus yields better security performance than the compared algorithms.
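The NPCR and UACI figures quoted above follow from their standard definitions, which compare two cipher images produced from plaintexts differing in a single pixel. A minimal sketch with toy data (real evaluations use full 8-bit cipher images, not four pixels):

```python
def npcr_uaci(c1, c2):
    """NPCR and UACI (in percent) between two equal-sized 8-bit cipher
    images, each given as a flat list of pixel values 0..255."""
    n = len(c1)
    # NPCR: fraction of pixel positions whose values differ
    npcr = 100.0 * sum(1 for a, b in zip(c1, c2) if a != b) / n
    # UACI: mean absolute difference, normalized by the 255 intensity range
    uaci = 100.0 * sum(abs(a - b) for a, b in zip(c1, c2)) / (255 * n)
    return npcr, uaci

# Toy 4-pixel example, for illustration only
c1 = [10, 200, 37, 255]
c2 = [10, 13, 200, 0]
print(npcr_uaci(c1, c2))
```

For a strong cipher the ideal values are NPCR ≈ 99.61 % and UACI ≈ 33.46 %, which is why the paper's thresholds sit just above those figures.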
50.