141.
Colloidal liquid metal alloys of gallium, with melting points below room temperature, are potential candidates for creating electrically conductive and flexible composites. However, incorporating liquid metal micro- and nanodroplets into soft polymeric matrices has required harsh auxiliary mechanical pressing to rupture the droplets and establish continuous pathways for high electrical conductivity, and such a destructive strategy compromises the integrity of the composites. Here, this problem is solved by incorporating a small loading of nonfunctionalized graphene flakes into the composites. The flakes introduce cavities that are filled with liquid metal after only relatively mild press-rolling (<0.1 MPa), forming electrically conductive continuous pathways within the polymeric matrix while maintaining the integrity and flexibility of the composites. Characterization shows that even very low graphene loadings (≈0.6 wt%) achieve high electrical conductivity. The electrical conductance remains nearly constant, changing by less than 0.5%, even under a relatively high applied pressure of >30 kPa. The composites are used to form flexible, electrically conductive tracks with a self-healing property in electronic circuits. The demonstrated use of co-fillers together with liquid metal droplets can be applied to establishing electrically conductive printable-composite tracks for future large-area flexible electronics.
142.
This paper presents an efficient metamodel building technique for solving collaborative optimization (CO) based on high fidelity models. The proposed method is based on a metamodeling concept designed to simultaneously utilize computationally efficient (low fidelity) and expensive (high fidelity) models in an optimization process. A distinctive feature of the method is the use of the interaction between low and high fidelity models to construct high quality metamodels at both the discipline level and the system level of the CO. The low fidelity model is tuned so that it approaches the accuracy of the high fidelity model while remaining computationally inexpensive, and the tuned low fidelity models are then used in the discipline level optimization. At the system level, a model management strategy together with a metamodeling technique is used to handle the computational cost of the equality constraints in CO. To determine the fidelity of the metamodels, the predictive estimation of model fidelity method is applied. The developed method is demonstrated on a 2D airfoil design problem involving tightly coupled high fidelity structural and aerodynamic models. The results show that the proposed method significantly reduces computational cost and improves the convergence rate when solving multidisciplinary optimization problems based on high fidelity models.
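The abstract does not spell out how the low fidelity model is tuned; a common way to realize this kind of multi-fidelity scheme is an additive correction surrogate, where a cheap model is calibrated against a handful of expensive samples. Below is a minimal sketch of that idea; `f_lo` and `f_hi` are hypothetical stand-ins for the low and high fidelity analyses, and the polynomial correction is an assumption, not the paper's method.

```python
# Minimal sketch of a correction-based multi-fidelity surrogate.
# f_lo / f_hi are hypothetical stand-ins for the cheap and expensive
# analyses; the paper's actual tuning scheme is not given in the abstract.
import numpy as np

def f_hi(x):                       # expensive "truth" model (placeholder)
    return np.sin(8 * x) + x ** 2

def f_lo(x):                       # cheap, biased approximation (placeholder)
    return np.sin(8 * x + 0.3) + 0.8 * x ** 2 - 0.1

# A few expensive samples are affordable; cheap evaluations are plentiful.
x_hi = np.linspace(0.0, 1.0, 6)
delta = f_hi(x_hi) - f_lo(x_hi)    # discrepancy at the expensive samples

# Fit a low-order polynomial correction to the discrepancy.
coef = np.polyfit(x_hi, delta, deg=2)

def f_tuned(x):
    """Tuned low fidelity model: cheap to evaluate, closer to f_hi."""
    return f_lo(x) + np.polyval(coef, x)

x = np.linspace(0.0, 1.0, 201)
print("max |f_hi - f_lo|   :", np.max(np.abs(f_hi(x) - f_lo(x))))
print("max |f_hi - f_tuned|:", np.max(np.abs(f_hi(x) - f_tuned(x))))
```

The tuned model can then stand in for the expensive analysis inside the discipline-level optimization loop, with occasional high fidelity evaluations to refresh the correction.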
143.
In this paper, an exergetic performance analysis of a continuous bioreactor synthesizing ethanol and acetate from syngas via the strictly anaerobic autotrophic bacterium Clostridium ljungdahlii was carried out for the first time. The fermentation process was evaluated using both conventional exergy and eco-exergy principles to measure the productivity and renewability of the process at various liquid media flow rates. The microorganisms successfully upgraded the syngas into valuable ethanol and acetate through the Wood–Ljungdahl pathway. The exergy efficiency was found to be in the range of 6.5–77.5% and 6.8–77.5% using the conventional exergy and eco-exergy concepts, respectively. The subtle differences observed in the exergetic parameters between the two concepts were ascribed to the slow growth rate of the microorganisms. Nevertheless, the eco-exergy concept is strongly recommended for commercial bioreactors containing living organisms because it includes the information carried by the microorganisms in the exergetic calculation. A desired liquid media flow rate of 0.55 mL/min was identified according to a newly defined thermodynamic indicator, the exergetic productivity index; the maximum exergetic productivity index of the fermentation process was 8.0 under both approaches when the inflow liquid rate was adjusted to this optimal value. The results of this study reveal that process yield alone cannot be a reliable performance metric for decisions on the productivity of various biofuel production pathways. Finally, the proposed exergetic framework could help engineers and researchers link biochemical and physical knowledge more robustly and quantify productivity and renewability.
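The abstract gives no formulas; as a rough, hedged illustration of how the two bookkeeping concepts diverge, the sketch below computes a conventional efficiency as product exergy over input exergy, plus an eco-exergy variant that additionally weights living biomass by a Jørgensen-style β factor. All stream values, the β value, and the helper names are invented for illustration.

```python
# Hedged illustration of conventional exergy vs. eco-exergy bookkeeping.
# Formulas and numbers are illustrative, not taken from the paper.

def exergy_efficiency(ex_products_kj, ex_inputs_kj):
    """Conventional exergy efficiency: useful exergy out / exergy in."""
    return ex_products_kj / ex_inputs_kj

def eco_exergy(biomass_kj, beta):
    """Eco-exergy of living matter: chemical exergy scaled by a
    Jorgensen-style beta factor accounting for genetic information."""
    return beta * biomass_kj

# Hypothetical stream exergies for one operating point (kJ per hour):
ex_syngas = 1000.0            # exergy entering with the syngas
ex_ethanol, ex_acetate = 520.0, 180.0
ex_cells = 2.0                # chemical exergy bound in cell mass
beta_bacteria = 8.5           # illustrative weighting factor for bacteria

eta_conv = exergy_efficiency(ex_ethanol + ex_acetate, ex_syngas)
eta_eco = exergy_efficiency(ex_ethanol + ex_acetate
                            + eco_exergy(ex_cells, beta_bacteria), ex_syngas)
print(f"conventional: {eta_conv:.3f}, eco-exergy: {eta_eco:.3f}")
```

Because the biomass term is small for slow-growing bacteria, the two efficiencies stay close, which is consistent with the subtle differences the authors report.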
144.
PbS thin films were grown on glass substrates by chemical bath deposition (CBD) from aqueous solutions of lead nitrate, thiourea and sodium hydroxide at three different temperatures (22, 36 and 50 °C). The microstructure and morphology evolution of the films were investigated using X-ray diffraction, scanning electron microscopy and atomic force microscopy, and the optical properties were studied using UV–Vis–IR spectroscopy. The results indicate that temperature plays an important role in controlling the morphology and optical properties of the nanostructured PbS thin films by changing the deposition mechanism. The active deposition mechanism changed from a cluster mechanism to an ion-by-ion mechanism as the deposition temperature increased from 22 to 50 °C, and consequently film properties such as morphology, optical absorption and preferred orientation changed completely.
145.
Previous studies of the two-sided assembly line balancing problem assumed equal relationships between every pair of tasks assignable to one side of the line. In practice, however, this relationship may depend on factors such as the distance between the places where the tasks are performed and the tools they require. The more closely related the tasks assigned to each station are, the more efficient the assembly line will be. In this paper, we suggest an index for calculating the value of the relationship between each pair of tasks, and define a performance criterion called 'assembly line tasks consistency' for calculating the average relationship between the tasks assigned to the stations of a solution. We propose a simulated annealing algorithm for solving the two-sided assembly line balancing problem under three performance criteria: number of stations, number of mated-stations, and assembly line tasks consistency. The simulated annealing algorithm is also modified to solve the problem without considering the relationships between tasks; this modification finds five new best solutions for the number-of-stations criterion and ten new best solutions for the number-of-mated-stations criterion on benchmark instances.
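The abstract names the metaheuristic but not its operators; the following is a minimal, generic simulated annealing sketch for a task-to-station assignment, with a hypothetical `cost()` standing in for the paper's blended criteria. It illustrates only the accept/reject loop, not the authors' encoding or precedence handling.

```python
# Generic simulated annealing loop for assigning tasks to stations.
# cost() and the neighborhood move are placeholders; the paper's actual
# encoding, precedence constraints and operators are not in the abstract.
import math
import random

def cost(assign):
    """Toy objective: fewer stations is better; ties broken by how
    evenly tasks spread (a stand-in for the consistency criterion)."""
    loads = [assign.count(s) for s in set(assign)]
    return len(loads) + 0.01 * (max(loads) - min(loads))

def neighbor(assign, n_stations):
    """Move one randomly chosen task to a randomly chosen station."""
    new = assign[:]
    new[random.randrange(len(new))] = random.randrange(n_stations)
    return new

def anneal(n_tasks=20, n_stations=6, t0=1.0, t_end=1e-3, alpha=0.95):
    random.seed(0)
    cur = [random.randrange(n_stations) for _ in range(n_tasks)]
    best, t = cur, t0
    while t > t_end:
        for _ in range(40):                 # moves per temperature step
            cand = neighbor(cur, n_stations)
            d = cost(cand) - cost(cur)
            if d <= 0 or random.random() < math.exp(-d / t):
                cur = cand                  # accept improving / lucky moves
                if cost(cur) < cost(best):
                    best = cur
        t *= alpha                          # geometric cooling schedule
    return best, cost(best)

print(anneal())
```

The accept-worse-moves-with-probability exp(-d/t) step is what lets the search escape local optima early on, while the cooling schedule gradually turns it into greedy descent.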
146.
Metamodel-based collaborative optimization framework
This paper focuses on metamodel-based collaborative optimization (CO). The objective is to improve the computational efficiency of CO so that it can handle multidisciplinary design optimization problems utilising high fidelity models. To this end, metamodel building techniques are proposed at two levels: at the discipline level, metamodels are based on multi-fidelity modelling (the interaction of low and high fidelity models), while at the system level a combination of a global metamodel based on the moving least squares method and a trust region strategy is introduced. The proposed method is demonstrated on a continuous fiber-reinforced composite beam test problem. Results show that the methods introduced in this paper provide an effective way of improving the computational efficiency of CO based on high fidelity simulation models.
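Moving least squares fits a local weighted regression around each query point rather than one global fit. A minimal one-dimensional sketch follows, assuming a Gaussian weight and a linear basis; the paper's exact basis, weight function and trust-region coupling are not stated in the abstract.

```python
# Minimal moving least squares (MLS) metamodel in one dimension:
# at each query point, solve a weighted linear least squares problem
# whose weights decay with distance from the query.
import numpy as np

def mls_predict(x_query, x_data, y_data, h=0.15):
    """Gaussian-weighted local linear fit evaluated at x_query."""
    w = np.exp(-((x_data - x_query) / h) ** 2)       # locality weights
    A = np.stack([np.ones_like(x_data), x_data], 1)  # linear basis [1, x]
    AtW = A.T * w                                    # A^T W
    c = np.linalg.solve(AtW @ A, AtW @ y_data)       # local coefficients
    return c[0] + c[1] * x_query

rng = np.random.default_rng(1)
x_s = rng.uniform(0, 1, 30)                          # sampled design points
y_s = np.sin(2 * np.pi * x_s) + 0.05 * rng.normal(size=30)

for xq in (0.1, 0.5, 0.9):
    print(xq, round(mls_predict(xq, x_s, y_s), 3),
          round(np.sin(2 * np.pi * xq), 3))
```

Because the fit is recomputed per query, MLS adapts to local behavior of the response, which is useful when a trust region strategy keeps refining the model around the current system-level iterate.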
147.
The directed differentiation of human pluripotent stem cells (hPSCs) into defined populations has advanced regenerative medicine, especially for Parkinson's disease, where clinical trials are underway. Despite this, tumorigenic risks remain from incompletely patterned and/or quiescent proliferative cells within grafts. To address this, donor stem cells carrying the suicide gene thymidine kinase (activated by the prodrug ganciclovir, GCV) are employed to enable the programmed ablation of proliferative cells within neural grafts. However, matching the short half-life of GCV to the short S-phase of neural progenitors is a key challenge. To overcome it, a smart hydrogel delivery matrix is fabricated to prolong GCV presentation. Following matrix embedment, GCV retains its functionality, demonstrated by the ablation of hPSCs and proliferating neural progenitors in vitro. Prolonged GCV release is measured by mass spectrometry following injection of the GCV-functionalized hydrogel into mouse brains. Compared with suboptimal daily systemic GCV injections, intracerebral delivery of the functionalized hydrogel as a "one-off treatment" reduces proliferative cells in both hPSC-derived teratomas and neural grafts, without affecting the graft's functional unit (i.e., neurons). It is demonstrated that a functionalized biomaterial can enhance prodrug delivery and address safety concerns associated with the use of hPSCs for brain repair.
148.
Is there a need for fuzzy logic?
“Is there a need for fuzzy logic?” is an issue which is associated with a long history of spirited discussions and debate. There are many misconceptions about fuzzy logic. Fuzzy logic is not fuzzy. Basically, fuzzy logic is a precise logic of imprecision and approximate reasoning. More specifically, fuzzy logic may be viewed as an attempt at formalization/mechanization of two remarkable human capabilities. First, the capability to converse, reason and make rational decisions in an environment of imprecision, uncertainty, incompleteness of information, conflicting information, partiality of truth and partiality of possibility - in short, in an environment of imperfect information. And second, the capability to perform a wide variety of physical and mental tasks without any measurements and any computations [L.A. Zadeh, From computing with numbers to computing with words - from manipulation of measurements to manipulation of perceptions, IEEE Transactions on Circuits and Systems 45 (1999) 105-119; L.A. Zadeh, A new direction in AI - toward a computational theory of perceptions, AI Magazine 22 (1) (2001) 73-84]. In fact, one of the principal contributions of fuzzy logic - a contribution which is widely unrecognized - is its high power of precisiation.

Fuzzy logic is much more than a logical system. It has many facets. The principal facets are: logical, fuzzy-set-theoretic, epistemic and relational. Most of the practical applications of fuzzy logic are associated with its relational facet.

In this paper, fuzzy logic is viewed in a nonstandard perspective. In this perspective, the cornerstones of fuzzy logic - and its principal distinguishing features - are: graduation, granulation, precisiation and the concept of a generalized constraint.

A concept which has a position of centrality in the nontraditional view of fuzzy logic is that of precisiation. Informally, precisiation is an operation which transforms an object, p, into an object, p*, which in some specified sense is defined more precisely than p. The object of precisiation and the result of precisiation are referred to as the precisiend and the precisiand, respectively. In fuzzy logic, a differentiation is made between two meanings of precision - precision of value, v-precision, and precision of meaning, m-precision. Furthermore, in the case of m-precisiation a differentiation is made between mh-precisiation, which is human-oriented (nonmathematical), and mm-precisiation, which is machine-oriented (mathematical). A dictionary definition is a form of mh-precisiation, with the definiendum and the definiens playing the roles of precisiend and precisiand, respectively. Cointension is a qualitative measure of the proximity of meanings of the precisiend and precisiand. A precisiand is cointensive if its meaning is close to the meaning of the precisiend.

A concept which plays a key role in the nontraditional view of fuzzy logic is that of a generalized constraint. If X is a variable, then a generalized constraint on X, GC(X), is expressed as X isr R, where R is the constraining relation and r is an indexical variable which defines the modality of the constraint, that is, its semantics. The primary constraints are: possibilistic (r = blank), probabilistic (r = p) and veristic (r = v). The standard constraints are: bivalent possibilistic, probabilistic and bivalent veristic. In large measure, science is based on standard constraints. Generalized constraints may be combined, qualified, projected, propagated and counterpropagated.
The set of all generalized constraints, together with the rules which govern the generation of generalized constraints, is referred to as the generalized constraint language, GCL. The standard constraint language, SCL, is a subset of GCL. In fuzzy logic, propositions, predicates and other semantic entities are precisiated through translation into GCL; equivalently, a semantic entity, p, may be precisiated by representing its meaning as a generalized constraint.

By construction, fuzzy logic has a much higher level of generality than bivalent logic. It is the generality of fuzzy logic that underlies much of what fuzzy logic has to offer. Among the important contributions of fuzzy logic are the following:
1. FL-generalization. Any bivalent-logic-based theory, T, may be FL-generalized, and hence upgraded, through the addition to T of concepts and techniques drawn from fuzzy logic. Examples: fuzzy control, fuzzy linear programming, fuzzy probability theory and fuzzy topology.
2. Linguistic variables and fuzzy if-then rules. The formalism of linguistic variables and fuzzy if-then rules is, in effect, a powerful modeling language which is widely used in applications of fuzzy logic. Basically, the formalism serves as a means of summarization and information compression through the use of granulation.
3. Cointensive precisiation. Fuzzy logic has a high power of cointensive precisiation. This power is needed for a formulation of cointensive definitions of scientific concepts and cointensive formalization of human-centric fields such as economics, linguistics, law, conflict resolution, psychology and medicine.
4. NL-Computation (computing with words). Fuzzy logic serves as a basis for NL-Computation, that is, computation with information described in natural language. NL-Computation is of direct relevance to mechanization of natural language understanding and computation with imprecise probabilities. More generally, NL-Computation is needed for dealing with second-order uncertainty, that is, uncertainty about uncertainty, or uncertainty² for short.
In summary, the progression from bivalent logic to fuzzy logic is a significant positive step in the evolution of science. In large measure, the real world is a fuzzy world. To deal with fuzzy reality, what is needed is fuzzy logic. In coming years, fuzzy logic is likely to grow in visibility, importance and acceptance.
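As a concrete, hedged illustration of graduation and a possibilistic generalized constraint (invented here, not code from Zadeh's paper), the sketch below encodes the linguistic value "tall" as a piecewise-linear membership function and evaluates the constraint "Height is tall" for a few heights. The breakpoints are arbitrary choices.

```python
# Hedged illustration of graduation: a piecewise-linear membership
# function for the linguistic value "tall", read as the possibilistic
# constraint "Height is tall".  Breakpoints are invented.

def s_ramp(x, a, b):
    """Graduated membership: 0 below a, 1 above b, linear in between."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

def tall(height_cm):
    # Possibility that a person of this height counts as "tall".
    return s_ramp(height_cm, 165.0, 185.0)

for h in (160, 170, 175, 185, 200):
    print(f"height {h} cm -> membership in 'tall': {tall(h):.2f}")
```

The graded membership value is exactly what distinguishes this from a bivalent predicate: 175 cm is "tall" to degree 0.5 rather than simply true or false.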
149.
The ability of technology to transmit multimedia depends heavily on compression techniques. In particular, lossy compression has been used in image compression (JPEG), audio compression (MP3) and video compression (MPEG) to allow the transmission of audio and video over broadband network connections. Recently, the sense of touch, or haptics, has become more important with its addition to computer games and in cruder applications such as vibrations in a cell phone. As haptic technology improves, the ability to transmit compressed force sensations becomes more critical. Most lossy audio and visual compression techniques rely on the inability of humans to pick up detailed information in certain scenarios; similarly, limitations in the sensitivity of human touch could be exploited to create haptic models with much less detail, and thus requiring smaller bandwidth. The focus of this paper is on the force thresholds of the human haptic system that can be used in a psychophysically motivated lossy haptic (force) compression technique. Most research in this field has measured the just noticeable difference (JND) of the human haptic system with a human user in static interaction with a stationary rigid object. In this paper, our focus is on cases where the human user or the object is in relative motion; an example of such an application is the haptic rendering of the user's hand in contact with a highly viscous material or interacting with a highly deformable object. Thus, an approach is presented to measure the force threshold based on the velocity of the user's hand motion. Two experiments are conducted to detect the absolute force threshold (AFT) of the human haptic system using methodologies from the field of psychophysics; the AFTs are detected for three different ranges of velocity of the user's hand motion. This study implies that when a user's hand is in motion, fewer haptic details are required to be stored, calculated or transmitted. Finally, the implications of this study for a more complete future study are discussed.
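The abstract motivates threshold-based lossy force compression. A standard realization of that idea (not necessarily the authors') is perceptual deadband coding, which transmits a force sample only when it deviates from the last transmitted sample by more than a threshold. The sketch below makes the deadband grow with hand speed, reflecting the paper's velocity-dependent AFTs; all constants and the `aft()` model are invented.

```python
# Perceptual deadband force compression (a standard threshold-based
# scheme, used here only to illustrate the abstract's idea; the paper's
# own coder and thresholds are not given).  A sample is transmitted only
# when it differs from the last transmitted value by more than a
# velocity-dependent absolute force threshold.
import math

def aft(speed_m_s):
    """Hypothetical absolute force threshold (N) growing with hand speed."""
    return 0.05 + 0.10 * speed_m_s

def compress(forces, speeds):
    sent = []                 # (index, force) pairs actually transmitted
    last = None
    for i, (f, v) in enumerate(zip(forces, speeds)):
        if last is None or abs(f - last) > aft(v):
            sent.append((i, f))
            last = f          # receiver holds this value until the next update
    return sent

# Toy signal: slowly varying force while the hand speeds up.
forces = [0.5 + 0.3 * math.sin(0.1 * i) for i in range(100)]
speeds = [0.01 * i for i in range(100)]
sent = compress(forces, speeds)
print(f"transmitted {len(sent)} of {len(forces)} samples "
      f"({100 * len(sent) / len(forces):.0f}%)")
```

Because the threshold widens as the hand moves faster, proportionally fewer updates are sent at high speed, which is precisely the bandwidth saving the study's velocity-dependent AFTs would justify.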
150.
In a flexible-link manipulator, the effect of some parameters, such as the payload, friction amplitude and damping coefficients, generally cannot be measured exactly. One possibility is to treat these as uncertain parameters. In this paper, constant as well as L₂-bounded deviations of parameters from their nominal values are considered as uncertainties. These uncertainties make it difficult for a linear controller to achieve the desired closed-loop performance. To remedy this problem, a nonlinear dynamical model of a flexible-link manipulator with a constant input vector field (g in ẋ = f(x) + g(x)u) is obtained. Based on recent results in nonlinear robust regulation with an H∞ constraint, a nonlinear controller is designed for the flexible-link manipulator. The contribution of this paper is in demonstrating that the nonlinear controller has a larger domain of attraction than the linearized controller. In fact, for the single-link flexible manipulator considered in this paper, the linear H∞ controller results in instability for step changes in the desired output greater than 3.6 rad, whereas the nonlinear H∞ controller handles desired step changes of 2π rad. Simulation results demonstrating the advantages and superiority of the nonlinear H∞ controller over the linear H∞ controller are presented.
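The abstract's key structural point is the control-affine form ẋ = f(x) + g·u with constant g. As a hedged toy illustration only (a rigid pendulum, not the paper's flexible-link model or its H∞ design), the sketch below compares a controller designed on the linearization against one that cancels the nonlinearity, for a large step command where the linearization-based design degrades. Gains and the plant are invented.

```python
# Toy illustration of the control-affine form  xdot = f(x) + g*u  with
# constant g, using a rigid pendulum (NOT the paper's flexible-link
# model or H-infinity synthesis).  A gain designed on the small-angle
# linearization is compared with a feedback-linearizing law.
import math

def f(x):                             # drift: damped pendulum dynamics
    th, om = x
    return [om, -9.81 * math.sin(th) - 0.2 * om]

g = [0.0, 1.0]                        # constant input vector field

def step(x, u, dt):                   # explicit Euler integration
    dx = f(x)
    return [x[0] + dt * (dx[0] + g[0] * u),
            x[1] + dt * (dx[1] + g[1] * u)]

def simulate(ctrl, ref, t_end=10.0, dt=1e-3):
    x = [0.0, 0.0]
    for _ in range(int(t_end / dt)):
        x = step(x, ctrl(x, ref), dt)
    return x[0]                       # final angle (rad)

k1, k2 = 25.0, 10.0                   # state-feedback gains

def linear_ctrl(x, ref):
    # Designed on the small-angle model th'' = -9.81*th + u,
    # so the feedforward 9.81*ref holds the *linearized* plant at ref.
    return -k1 * (x[0] - ref) - k2 * x[1] + 9.81 * ref

def nonlinear_ctrl(x, ref):
    # Cancels the gravity nonlinearity exactly, valid at any angle.
    return 9.81 * math.sin(x[0]) - k1 * (x[0] - ref) - k2 * x[1]

ref = 2.5                             # large step command (rad)
print("linear    final angle:", round(simulate(linear_ctrl, ref), 3))
print("nonlinear final angle:", round(simulate(nonlinear_ctrl, ref), 3))
```

On this toy plant the linearization-based law settles far from the commanded angle while the nonlinearity-cancelling law tracks it exactly, a simple analogue of the abstract's observation that a controller built on the nonlinear model tolerates much larger step commands.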