Search results: 365 entries in total; entries 151–160 are shown below.
151.
I discuss the design of the method of entropic inference as a general framework for reasoning under conditions of uncertainty. The main contribution of this discussion is to emphasize the pragmatic elements in the derivation. More specifically: (1) Probability theory is designed as the uniquely natural tool for representing states of incomplete information. (2) An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. (3) The method of updating from a prior to a posterior probability distribution is designed through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting framework includes as special cases both MaxEnt and Bayes’ rule, and therefore unifies entropic and Bayesian methods into a single general inference scheme. I find that similar pragmatic elements are an integral part of Putnam’s internal realism, of Floridi’s informational structural realism, and of van Fraassen’s empiricist structuralism. I conclude with the conjecture that their valuable insights can be incorporated into a single coherent doctrine: an informational pragmatic realism.
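The update rule at the heart of this framework can be made concrete. The sketch below (function names and the dice example are mine, not the paper's) minimizes the logarithmic relative entropy subject to a single moment constraint; the minimizer is an exponential tilting of the prior, found by bisection on the Lagrange multiplier, illustrated on Jaynes' loaded-die problem.

```python
import math

def maxent_update(prior, f, target, lo=-50.0, hi=50.0, tol=1e-12):
    """Minimize the relative entropy S[p|q] = sum_i p_i log(p_i/q_i)
    subject to E_p[f] = target.  The minimizer is exponential in f:
    p_i ∝ q_i exp(lam * f_i); the multiplier lam is found by bisection
    on the moment condition, which is monotone increasing in lam."""
    def moment(lam):
        w = [q * math.exp(lam * fi) for q, fi in zip(prior, f)]
        z = sum(w)
        return sum(wi * fi for wi, fi in zip(w, f)) / z

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if moment(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [q * math.exp(lam * fi) for q, fi in zip(prior, f)]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes' die: uniform prior over the six faces, mean constrained to 4.5.
prior = [1 / 6] * 6
faces = [1, 2, 3, 4, 5, 6]
post = maxent_update(prior, faces, 4.5)
mean = sum(p * x for p, x in zip(post, faces))
```

The constraint pulls probability toward the higher faces, as expected from the exponential form of the solution.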
152.
A broad variety of problems, such as targeted marketing and the spread of viruses and malware, have been modeled as maximizing the reach of diffusion through a network. In cyber-security applications, however, a key consideration largely ignored in this literature is stealth. In particular, an attacker who has a specific target in mind succeeds only if the target is reached before the malicious payload is detected and corresponding countermeasures deployed. The dual side of this problem is the deployment of a limited number of monitoring units, such as cyber-forensics specialists, to limit the success of such targeted and stealthy diffusion processes. We investigate the problem of optimal monitoring of targeted stealthy diffusion processes. While natural variants of this problem are NP-hard, we show that if stealthy diffusion starts from randomly selected nodes, the defender’s objective is submodular and can be approximately optimized. In addition, we present approximation algorithms for the setting where the choice of the starting point is adversarial. We further extend our results to settings where the diffusion starts at multiple seed nodes simultaneously, and where there is an inherent delay in detecting the infection. Our experimental results show that the proposed algorithms are highly effective and scalable.
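The submodularity result is what makes the random-start case tractable: the standard greedy algorithm then yields a (1 − 1/e)-approximation. A minimal sketch using a coverage-style stand-in for the detection objective (the toy instance and all names are mine, not the paper's):

```python
def greedy_monitor_placement(cover, budget):
    """Greedy (1 - 1/e)-approximation for monotone submodular coverage.
    cover[v] is the set of diffusion sources whose infection monitor v
    would detect; the objective |union of chosen sets| is submodular."""
    chosen, covered = [], set()
    for _ in range(budget):
        best, best_gain = None, 0
        for v, s in cover.items():
            if v in chosen:
                continue
            gain = len(s - covered)  # marginal gain of adding monitor v
            if gain > best_gain:
                best, best_gain = v, gain
        if best is None:  # no monitor adds anything
            break
        chosen.append(best)
        covered |= cover[best]
    return chosen, covered

# Toy instance: a hub monitor ("d") detects many sources.
cover = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {5},
    "d": {1, 2, 3, 4},
}
picked, detected = greedy_monitor_placement(cover, budget=2)
```

Greedy first picks "d" (gain 4), then "c" (the only monitor with positive marginal gain), covering all five sources.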
153.
We present a review of the state of the art of segmentation and partitioning techniques for boundary meshes. Recently, these have become a part of many mesh and object manipulation algorithms in computer graphics, geometric modelling and computer aided design. We formulate the segmentation problem as an optimization problem and identify two primary types of mesh segmentation, namely part segmentation and surface-patch segmentation. We classify previous segmentation solutions according to the different segmentation goals, the optimization criteria and features used, and the various algorithmic techniques employed. We also present some generic algorithms for the major segmentation techniques.
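Region growing over the face-adjacency graph is one of the generic techniques such surveys cover. A toy sketch of surface-patch segmentation driven by a dihedral-angle criterion (the threshold, names and the four-face mesh are illustrative assumptions, not taken from the survey):

```python
import math

def segment_faces(normals, adjacency, angle_deg=30.0):
    """Toy surface-patch segmentation: flood-fill faces into patches,
    crossing an edge only when the angle between the two face normals
    is below angle_deg (a common feature/optimization criterion)."""
    cos_thresh = math.cos(math.radians(angle_deg))

    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    label, next_label = {}, 0
    for seed in normals:
        if seed in label:
            continue
        label[seed] = next_label
        stack = [seed]
        while stack:
            f = stack.pop()
            for g in adjacency.get(f, ()):
                if g not in label and dot(normals[f], normals[g]) >= cos_thresh:
                    label[g] = next_label
                    stack.append(g)
        next_label += 1
    return label

# Two flat regions meeting at a 90-degree crease:
# faces 0 and 1 point up, faces 2 and 3 point sideways.
normals = {0: (0, 0, 1), 1: (0, 0, 1), 2: (1, 0, 0), 3: (1, 0, 0)}
adjacency = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
labels = segment_faces(normals, adjacency)
```

The crease between faces 1 and 2 exceeds the threshold, so the mesh splits into exactly two patches.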
154.
Recently, the advance reservation functionality has gained importance in grids due to the increasing popularity of applications that require interactive tasks, co-allocation of multiple resources, and performance guarantees. However, simultaneously scheduling both advance reservations and batch tasks affects performance: advance reservations significantly deteriorate the flow time of batch tasks and the overall resource utilization, especially in hierarchical scheduling structures. This is a consequence of unknown batch task processing times and the inability to alter the allocations of advance reservations. To address these issues we present a common model for scheduling both computational batch tasks and tasks with advance reservation requests. We propose simple on-line scheduling policies and generic advice that reduce the negative impact of advance reservations on schedule quality. We also propose novel data structures and algorithms for efficient scheduling of advance reservations. A comprehensive experimental analysis shows the influence of advance reservations on resource utilization, mean flow time, and mean tardiness: the criteria significant for administrators, users submitting batch tasks, and users requesting advance reservations, respectively. All experiments were performed with a well-known real workload using the GSSIM simulator.
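A minimal version of the kind of data structure the abstract alludes to is an availability profile: admit a reservation [start, end) only if the peak load in that window stays below capacity. This is a generic sketch under my own simplifying assumptions (unit-demand reservations, one resource type), not the paper's algorithm:

```python
from bisect import insort

class ReservationProfile:
    """Availability profile for advance reservations on a resource with
    `capacity` identical slots; each reservation occupies one slot over
    a half-open interval [start, end)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.events = []  # sorted (time, +1/-1) load-change events

    def _peak_load(self, start, end):
        usage, i = 0, 0
        # load carried into the window: all events at or before `start`
        while i < len(self.events) and self.events[i][0] <= start:
            usage += self.events[i][1]
            i += 1
        peak = usage
        # peak over events strictly inside [start, end)
        while i < len(self.events) and self.events[i][0] < end:
            usage += self.events[i][1]
            peak = max(peak, usage)
            i += 1
        return peak

    def try_reserve(self, start, end):
        """Admit the reservation only if capacity is never exceeded."""
        if self._peak_load(start, end) >= self.capacity:
            return False
        insort(self.events, (start, 1))
        insort(self.events, (end, -1))
        return True

prof = ReservationProfile(capacity=1)
ok1 = prof.try_reserve(0, 10)   # empty profile: fits
ok2 = prof.try_reserve(5, 15)   # overlaps the first reservation: rejected
ok3 = prof.try_reserve(10, 20)  # back-to-back with the first: fits
```

Because intervals are half-open, a reservation ending at time 10 and one starting at time 10 coexist on a single slot.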
155.
Protocols for secure computation enable mutually distrustful parties to jointly compute on their private inputs without revealing anything but the result. Over recent years, secure computation has become practical and considerable effort has been made to make it more and more efficient. A highly important tool in the design of two-party protocols is Yao’s garbled circuit construction (Yao 1986), and multiple optimizations on this primitive have led to performance improvements of orders of magnitude over the last years. However, many of these improvements come at the price of making very strong assumptions on the underlying cryptographic primitives being used (e.g., that AES is secure for related keys, that it is circular-secure, and even that it behaves like a random permutation when keyed with a public fixed key). The justification behind making these strong assumptions has been that otherwise it is not possible to achieve fast garbling and thus fast secure computation. In this paper, we take a step back and examine whether it is really the case that such strong assumptions are needed. We provide new methods for garbling that are secure solely under the assumption that the primitive used (e.g., AES) is a pseudorandom function. Our results show that in many cases, the penalty incurred is not significant, and so a more conservative approach to the assumptions being used can be adopted.
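The point that a plain pseudorandom-function assumption suffices for garbling can be illustrated on a single AND gate. The sketch below is a textbook-style construction, not the paper's optimized scheme: it uses HMAC-SHA256 as the PRF stand-in and a zero-padding check so the evaluator recognizes the one row it can decrypt.

```python
import hashlib
import hmac
import os
import random

def prf(key, msg):
    """PRF instantiated with HMAC-SHA256; in practice AES would be used,
    and the abstract's point is that plain PRF security is enough."""
    return hmac.new(key, msg, hashlib.sha256).digest()

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def garble_and_gate():
    # two random 16-byte labels per wire: index 0 encodes bit 0, index 1 bit 1
    labels = {w: [os.urandom(16), os.urandom(16)] for w in ("a", "b", "out")}
    rows = []
    for bit_a in (0, 1):
        for bit_b in (0, 1):
            out_label = labels["out"][bit_a & bit_b]
            pad = prf(labels["a"][bit_a], labels["b"][bit_b])
            # encrypt label || 16 zero bytes; the zero tail lets the
            # evaluator recognize the single row its keys open
            rows.append(xor(pad, out_label + bytes(16)))
    random.shuffle(rows)  # hide which row corresponds to which inputs
    return labels, rows

def evaluate(rows, key_a, key_b):
    pad = prf(key_a, key_b)
    for row in rows:
        plain = xor(pad, row)
        if plain[16:] == bytes(16):
            return plain[:16]
    raise ValueError("no row decrypted")

labels, rows = garble_and_gate()
# the evaluator holds exactly one label per input wire, e.g. a=1, b=1
result = evaluate(rows, labels["a"][1], labels["b"][1])
```

Holding only one label per wire, the evaluator learns the output label for a AND b and nothing about the other rows (up to a negligible false-positive probability in the zero-tail check).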
156.
We describe a momentum imaging setup for direct time-resolved studies of ionization-induced molecular dynamics. This system uses a tabletop ultrafast extreme-ultraviolet (EUV) light source based on high harmonic upconversion of a femtosecond laser. The high photon energy (around 42 eV) allows access to inner-valence states of a variety of small molecules via single photon excitation, while the sub-10-fs pulse duration makes it possible to follow the resulting dynamics in real time. To obtain a complete picture of molecular dynamics following EUV-induced photofragmentation, we apply the versatile cold target recoil ion momentum spectroscopy reaction microscope technique, which makes use of coincident three-dimensional momentum imaging of fragments resulting from photoexcitation. This system is capable of pump-probe spectroscopy by using a combination of EUV and IR laser pulses with either beam as a pump or probe pulse. We report several experiments performed using this system.
157.
Most spectrum allocation algorithms in elastic optical networks apply a greedy approach: a new connection is allocated as long as there are enough spectrum slots to accommodate it. Recently, a different approach was proposed. Named Deadlock Avoidance (DA), it only establishes a new connection if the portion of spectrum left after allocating it is zero (full-link utilization) or is big enough to accommodate future requests. Otherwise, the connection request is blocked as a way to avoid fragmentation. The performance of DA had previously been evaluated only in a single-link scenario, where it is not affected by the slot continuity constraint. In this paper, we evaluate for the first time the blocking performance and fragmentation level of DA in a fully dynamic network scenario with different bitrates and numbers of slots for a single link, a 4-node bus and a mesh topology. The performance was evaluated by simulation, and a lower bound was also derived using a continuous Markov chain model. Results are obtained for DA and three greedy algorithms: First Fit, Exact Fit and First-Last Fit. Results show that DA significantly decreases fragmentation, and thus it exhibits a much lower blocking due to fragmentation than the greedy algorithms. However, this decrease is offset by a new type of blocking due to the selective acceptance of connections. As a result, the extra computational complexity of DA is not compensated by a gain in performance.
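The two acceptance rules can be contrasted on a single link. In this sketch (the `min_future` parameter and all names are my assumptions, not the paper's exact formulation), First Fit takes the first free block that is large enough, while DA additionally requires the leftover of the chosen block to be either zero or still usable by future requests:

```python
def free_blocks(link):
    """Yield (start, length) of each maximal free run in the slot array,
    where True marks an occupied slot."""
    start = None
    for i, used in enumerate(link + [True]):  # sentinel closes a final run
        if not used and start is None:
            start = i
        elif used and start is not None:
            yield start, i - start
            start = None

def first_fit(link, demand):
    for start, length in free_blocks(link):
        if length >= demand:
            return start
    return None

def deadlock_avoidance(link, demand, min_future=2):
    """DA rule: accept only if some free block leaves either no leftover
    (full-block utilization) or a leftover of at least min_future slots,
    an assumed smallest future request size."""
    for start, length in free_blocks(link):
        leftover = length - demand
        if leftover == 0 or leftover >= min_future:
            return start
    return None  # blocked rather than creating a stranded fragment

# 6-slot link with slot 3 occupied: free blocks are [0,3) and [4,6).
link = [False, False, False, True, False, False]
ff = first_fit(link, 2)           # grabs slots 0-1, stranding slot 2
da = deadlock_avoidance(link, 2)  # prefers slots 4-5 (leftover zero)
```

First Fit fragments the 3-slot block, while DA fills the 2-slot block exactly, which is the fragmentation-avoidance behavior the abstract describes.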
158.
159.
Intermediate pyrolysis reactors are preferred for processes focused on the production of high-quality biochar. The main types are rotary drums, augers, and moving beds agitated with either grates or paddles. These reactors are usually operated in continuous mode, and are designed to provide a pure, homogeneous biochar product by ensuring near plug flow of the reacting particles. There is a need for laboratory reactors that can provide enough biochar for testing in applications such as soil amendment, fillers for concrete or polymers, coke substitution, or pollutant capture. The pyrolysis shaker reactor (PSR) is a new laboratory reactor that is inexpensive, provides good mixing and temperature control, is easy to operate and allows for rapid turnaround between runs. It provides a homogeneous biochar product. Its use was demonstrated with digestate from the anaerobic digestion of food waste. The rapid and thorough testing program made possible with the PSR indicated that this digestate should be pyrolyzed at 250°C to maximize the release of minerals from the biochar to water, and at 400°C to minimize the release of minerals. Its biochar would require post-treatment to be applied as a substitute for activated carbon.
160.
The last couple of decades have seen a large amount of activity in the area of surrogate marker and surrogate endpoint validation, both from a clinical and a statistical perspective. Prentice made a pivotal contribution in the context of a single trial. Subsequently, the framework he proposed has been discussed, criticized, and extended. An important class of extensions considers several trials rather than a single trial. Recently, a lot of work has been done in this so-called hierarchical or meta-analytic framework. In this paper, we review both the single-trial and the hierarchical framework. A number of applications, scattered throughout the literature, are brought together. We outline the statistical issues involved in trying to validate surrogate endpoints. Clearly, statistical evidence should only be seen as one component in a decision-making process that also involves a number of clinical and biological considerations.
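In the meta-analytic framework, trial-level surrogacy is commonly summarized by the squared correlation between per-trial treatment effects on the surrogate and on the true endpoint. A minimal numeric sketch with invented, purely illustrative effect estimates:

```python
def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Per-trial estimated treatment effects: alpha_i on the surrogate
# endpoint, beta_i on the true endpoint (illustrative numbers only).
alpha = [0.9, 1.4, 0.3, 1.1, 0.7, 1.8]
beta = [0.8, 1.5, 0.4, 1.0, 0.6, 1.9]

# Trial-level R^2: values near 1 suggest the effect on the surrogate
# predicts the effect on the true endpoint across trials.
r2_trial = pearson(alpha, beta) ** 2
```

In practice this is estimated with a hierarchical model that also accounts for estimation error in the per-trial effects; the raw squared correlation above is only the simplest summary.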