183 search results found (search time: 15 ms)
1.
This paper proposes an optimized content-aware authentication scheme for JPEG-2000 streams over lossy networks, where a received packet is consumed only when it is both decodable and authenticated. In a JPEG-2000 codestream, some packets are more important than others in terms of coding dependency and image quality. This naturally motivates allocating more redundant authentication information to the more important packets in order to maximize their probability of authentication and thereby minimize the distortion at the receiver. Towards this goal, and with awareness of the corresponding image content, we formulate an optimization framework that computes an authentication graph maximizing the expected media quality at the receiver, given a specific authentication overhead and knowledge of the network loss rate. System analysis and experimental results demonstrate that the proposed scheme achieves our design goal: the rate-distortion (R-D) curve of the authenticated image is very close to the R-D curve when no authentication is required.
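The core idea above — spend more authentication redundancy on packets whose loss hurts image quality most — can be illustrated with a minimal greedy sketch. This is not the paper's optimization framework; the function name, the single-loss-rate model, and the "hash links per packet" abstraction are all simplifying assumptions for illustration.

```python
# Hypothetical sketch (not the paper's algorithm): greedily assign extra
# authentication hash links to the packets whose failure to authenticate
# would add the most distortion, under a total overhead budget.

def allocate_redundancy(distortions, loss_rate, budget):
    """distortions[i]: distortion added if packet i cannot be verified.
    loss_rate: independent loss probability of each hash link.
    Returns the number of hash links allocated to each packet."""
    n = len(distortions)
    links = [1] * n          # every packet gets at least one link
    budget -= n
    while budget > 0:
        # Expected distortion reduction from one more link on packet i:
        # P(all current links lost) shrinks by a factor of loss_rate.
        gains = [d * (loss_rate ** links[i]) * (1 - loss_rate)
                 for i, d in enumerate(distortions)]
        i = max(range(n), key=gains.__getitem__)
        links[i] += 1
        budget -= 1
    return links
```

With one high-importance packet and two minor ones, the important packet ends up with the most redundancy, mirroring the unequal-protection intuition of the abstract.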
2.
An important aspect of blood-material interactions is the activation, adhesion, and subsequent aggregation of blood platelets on the artificial surface, all of which are directly affected by local fluid dynamics. The objective of this work was to directly correlate changing local fluid dynamic conditions produced by various vessel geometries, including stenosis, aneurysm, and separate contraction and expansion geometries, with quantitative in vitro measurements of regional platelet deposition. We directly measured platelet deposition as a function of axial position along four Lexan flow chambers with axisymmetric models of these geometries using ¹¹¹In-labeled platelets. Platelet deposition was maximum in observed areas of flow recirculation and reattachment and minimum in locations of high shear and separation. For the stenosis geometry, two distinct regions of increased platelet deposition were apparent, one proximal to and one distal to the stenosis throat. An approximately linear increase in platelet densities was produced in the aneurysm region, increasing in the direction of flow. Through a comparison of platelet deposition with local fluid streamline orientation, we have shown that platelet deposition is increased in certain areas due to the enhanced convective transport of platelets and blood cells to the vessel wall along locally curved streamlines with velocity components perpendicular to the vessel wall.
3.
A novel GaAs logic family, pseudodynamic latched logic (PDLL), is presented in this paper. It is composed of a dynamic circuit where the logic is performed and a static latch whose function is to permanently refresh the data stored on a dynamic node. Because of this hybrid structure, PDLL takes advantage of both static and dynamic families and thus permits the implementation of very complex structures with a good speed-area-power tradeoff. Moreover, the inclusion of the latch makes this logic family highly efficient for pipelined systems working even at high temperature without loss of data due to leakage currents. Barrel shifters, programmable logic arrays (PLAs), and carry-lookahead adders (CLAs) were verified by simulations, demonstrating the family's feasibility for the development of high-performance very large scale integration (VLSI) systems.
4.
Two different techniques that allow the implementation of embedded ROMs using a conventional GaAs MESFET technology are presented. The first approach is based on a novel circuit structure named low leakage current FET circuit (L2FC), which significantly reduces subthreshold currents. The second approach is based on pseudo current mode logic (PCML), which is by far the best choice in terms of noise margin levels. This characteristic is found to be the key factor when implementing GaAs ROMs, because the noise margin degrades as the number of word lines is increased. A 5-Kb ROM and a 2-Kb ROM were designed, giving delays on the order of 2 ns and less than 1 ns, respectively. The results demonstrate the effectiveness of these techniques and their significance toward improving the noise margin.
5.
The solution rheology of different generations of hyperbranched polyesters in N‐methyl‐2‐pyrrolidinone (NMP) solvent was examined in this study. The solutions exhibited Newtonian behavior over a wide range of polyester concentrations. Also, the relative viscosities of poly(amidoamine) (PAMAM) dendrimers in ethylenediamine were compared with those of the hyperbranched polyesters in NMP. Both types of dendritic polymers have relative viscosities that are exponential functions of their molar fraction in solution. The slopes of these relative-viscosity curves show a linear relationship with respect to the generation number. PAMAM dendrimers have the greater slopes at each generation, reflecting their relatively larger intrinsic viscosity values.
6.
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
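To make the iterative-reconstruction setting concrete, here is a minimal MLEM (maximum-likelihood expectation-maximization) sketch for a tiny system — the standard MLE baseline the abstract refers to, not the FMAPE algorithm itself. The system matrix, iteration count, and stopping rule are illustrative assumptions.

```python
import numpy as np

# Minimal MLEM sketch (the MLE baseline, not FMAPE): each iteration
# rescales the image estimate by the back-projected ratio of measured
# to forward-projected counts.

def mlem(A, y, n_iter=100):
    """A: system matrix (detectors x pixels); y: measured counts."""
    x = np.ones(A.shape[1])           # uniform initial image, the choice
                                      # the abstract argues is correct
                                      # absent a priori knowledge
    sens = A.sum(axis=0)              # per-pixel sensitivity (column sums)
    for _ in range(n_iter):
        proj = A @ x                  # forward projection
        ratio = y / np.maximum(proj, 1e-12)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x
```

On consistent noiseless data the iteration recovers the underlying source distribution, which is a useful sanity check before moving to Poisson-noisy data and a prior.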
7.
Cohesion of the fiber network is a key element in numerous manufacturing processes for textile structures and composite parts, because it significantly affects both processability and the resulting part. However, cohesion currently remains an intuitive concept. This paper addresses that concept, first proposing an interpretation of yarn cohesion. Based on this definition, the in-plane shear test is proposed to characterize and measure cohesion. Despite several difficulties, it proves to be an effective way to analyze the cohesion of yarns extracted from 7 different batches and to establish the link between cohesion and its role in the manufacturing processes of Herakles, which supported this study. In addition, the phenomena responsible for yarn cohesion are examined and the influence of yarn constitution is analyzed.
8.
9.
Strategic alignment and value maximization for IT project portfolios
Managing project portfolios has been a challenge to many IT organizations due to the size and complexity of their initiatives, which are often cross-functional, fast changing, and transformational in nature. A governance process for project solicitation, evaluation, and monitoring is thus essential to ensure the resulting portfolio creates tangible value, balances across priorities, and supports business objectives. An optimization model to streamline the decision processes for IT portfolios and programs is proposed. We consider project characteristics such as the extent of strategic alignment, expected benefit, development cost, and cross-project synergy to maximize the portfolio value. We also consider team proficiency and resource availability to determine a project portfolio that can be implemented within the overall development time. The multi-objective model identifies the optimal mix among project types, and the solution procedure efficiently produces recommendations that are superior to those found with current empirical techniques. We also describe an evolutionary algorithm to find approximate solutions to the optimization model. Possible extensions on how the optimization procedure can go beyond projects to also streamline decisions such as the renewal or replacement of in-flight applications are discussed.
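The selection problem described above can be illustrated with a brute-force toy version: pick the subset of projects maximizing benefit plus pairwise synergy under a cost budget. This is a sketch of the idea only — the real model is multi-objective and solved with an evolutionary algorithm; all names and the synergy representation here are illustrative assumptions.

```python
from itertools import combinations

# Toy portfolio selection (illustrative, not the paper's model):
# exhaustively search subsets for the best total value under budget.

def best_portfolio(benefit, cost, synergy, budget):
    """benefit[i], cost[i]: per-project figures;
    synergy[(i, j)] with i < j: extra value when both are selected."""
    n = len(benefit)
    best, best_val = (), float("-inf")
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            if sum(cost[i] for i in subset) > budget:
                continue                      # violates the cost budget
            val = sum(benefit[i] for i in subset)
            val += sum(synergy.get((i, j), 0)
                       for i in subset for j in subset if i < j)
            if val > best_val:
                best, best_val = subset, val
    return best, best_val
```

Exhaustive search is only viable for small portfolios (it is exponential in the number of projects), which is precisely why the abstract resorts to an evolutionary algorithm for approximate solutions at scale.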
10.
Technological advances in the collection, storage, and analysis of data have increased the ease with which businesses can make profitable use of information about individuals. Some of this information is private, and individuals are simultaneously becoming more aware of the value of the information and how the loss of control over this information impacts their personal privacy. As a partial solution to these concerns, this paper presents a mechanism that serves two purposes. The first enables the use of private, numerical data in the answering of queries while simultaneously providing a security feature that protects the data owners from a loss of privacy that could result from an unauthorized access. The second develops a compensation model for the use of the data that allows individuals to dynamically redefine their security requirements. The compensation model is built on the information-security mechanism to create the foundation of a market for private information. This paper illustrates how compensation models like the one presented here could be used in a self-regulating market for private information. Additionally, the compensation component of an intermediated market for private information is developed and extensively analyzed. Finally, this paper provides insights and draws several important conclusions on markets for private information.
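The pairing of a privacy-protecting query answer with owner compensation can be sketched in miniature: answer an aggregate query with additive noise and pay each included data owner their self-declared price. This is a hypothetical illustration of the general idea, not the paper's mechanism; the budgeted-inclusion rule, Gaussian noise, and all names are assumptions made for the example.

```python
import random

# Hypothetical compensated-query sketch (not the paper's mechanism):
# owners with the lowest asking prices are included until the budget
# is exhausted, then a noisy mean of their values is released.

def answer_query(records, prices, budget, noise_scale=1.0):
    """records[i]: owner i's private numeric value; prices[i]: the
    compensation owner i demands. Returns (noisy answer, total paid,
    indices of owners included and compensated)."""
    included, paid = [], 0.0
    for i in sorted(range(len(records)), key=lambda i: prices[i]):
        if paid + prices[i] > budget:
            break                          # budget exhausted
        included.append(i)
        paid += prices[i]
    if not included:
        raise ValueError("budget too small to include any owner")
    mean = sum(records[i] for i in included) / len(included)
    return mean + random.gauss(0, noise_scale), paid, included
```

The noise term stands in for the security feature that keeps any individual value from being read off the released answer; a real mechanism would calibrate it to a formal privacy guarantee rather than a fixed scale.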