A total of 374 query results found (search time: 15 ms).
1.
In this paper, we re-examine the results of prior work on methods for computing ad hoc joins. We develop a detailed cost model for predicting join algorithm performance, and we use the model to develop cost formulas for the major ad hoc join methods found in the relational database literature. We show that various pieces of “common wisdom” about join algorithm performance fail to hold up when analyzed carefully, and we use our detailed cost model to derive optimal buffer allocation schemes for each of the join methods examined here. We show that optimizing their buffer allocations can lead to large performance improvements, e.g., as much as a 400% improvement in some cases. We also validate our cost model's predictions by measuring an actual implementation of each join algorithm considered. The results of this work should be directly useful to implementors of relational query optimizers and query processing systems. Edited by M. Adiba. Received May 1993 / Accepted April 1996
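The buffer sensitivity the abstract describes can be illustrated with the standard textbook I/O cost formula for a block nested-loops join. This is a simplified stand-in for the paper's detailed cost model, and all names below are ours:

```python
def bnl_join_ios(pages_r: int, pages_s: int, buffers: int) -> int:
    """I/O count for a block nested-loops join with R as the outer relation.

    With B buffer pages, B-2 pages hold a chunk of R, one page reads S,
    and one page collects output, so S is rescanned once per chunk of R.
    """
    chunk = buffers - 2
    chunks = -(-pages_r // chunk)          # ceiling division
    return pages_r + chunks * pages_s

# Enlarging the buffer pool cuts the number of passes over S:
cost_small = bnl_join_ios(1000, 500, 12)   # 100 chunks -> 1000 + 50000 I/Os
cost_large = bnl_join_ios(1000, 500, 102)  # 10 chunks  -> 1000 + 5000 I/Os
```

Even this crude model shows why buffer allocation matters: the same join drops from 51,000 to 6,000 I/Os as the buffer pool grows.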
2.
An improved method for the activation of polyethylene glycol with commercially available succinimidyl carbonate is described. The activated polyethylene glycol was coupled to proteins in high yield.
3.
Photo Sequencing     
A group of people taking pictures of a dynamic event with their mobile phones is a popular sight. The set of still images obtained this way is rich in dynamic content but lacks accurate temporal information. We propose a method for photo-sequencing—temporally ordering a set of still images taken asynchronously by a set of uncalibrated cameras. Photo-sequencing is an essential tool in analyzing (or visualizing) a dynamic scene captured by still images. The first step of the method detects sets of corresponding static and dynamic feature points across images. The static features are used to determine the epipolar geometry between pairs of images, and each dynamic feature votes for the temporal order of the images in which it appears. The partial orders provided by the dynamic features are not necessarily consistent, and we use rank aggregation to combine them into a globally consistent temporal order of images. We demonstrate successful photo-sequencing on several challenging collections of images taken using a number of mobile phones.
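The rank-aggregation step can be sketched with a simple Borda count over the per-feature votes. This is an illustrative stand-in, not necessarily the aggregation scheme the paper uses, and the function name is ours:

```python
def aggregate_order(n_images: int, partial_orders: list[list[int]]) -> list[int]:
    """Combine possibly conflicting per-feature orderings into one global order.

    Each partial order is a list of image indices in the temporal order
    one dynamic feature voted for; an image earlier in a vote earns more
    Borda points, and images are ranked by total points.
    """
    score = [0] * n_images
    for order in partial_orders:
        for rank, img in enumerate(order):
            score[img] += len(order) - 1 - rank
    return sorted(range(n_images), key=lambda i: -score[i])

# Two of three features agree that image 1 precedes image 2:
votes = [[0, 1, 2], [0, 2, 1], [0, 1, 2]]
order = aggregate_order(3, votes)  # -> [0, 1, 2]
```

The majority vote wins even though one feature's partial order conflicts with the others.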
4.
5.
Tracking in a Dense Crowd Using Multiple Cameras
Tracking people in a dense crowd is a challenging problem for a single camera tracker due to occlusions and extensive motion that make human segmentation difficult. In this paper we suggest a method for simultaneously tracking all the people in a densely crowded scene using a set of cameras with overlapping fields of view. To overcome occlusions, the cameras are placed at a high elevation and only people’s heads are tracked. Head detection is still difficult since each foreground region may consist of multiple subjects. By combining data from several views, height information is extracted and used for head segmentation. The head tops, which are regarded as 2D patches at various heights, are detected by applying intensity correlation to aligned frames from the different cameras. The detected head tops are then tracked using common assumptions on motion direction and velocity. The method was tested on sequences in indoor and outdoor environments under challenging illumination conditions. It was successful in tracking up to 21 people walking in a small area (2.5 people per m²), in spite of severe and persistent occlusions.
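The intensity-correlation cue for head-top detection can be sketched as the mean pairwise normalized correlation of the same patch seen in several aligned views. This is a minimal illustration under our own assumptions, not the paper's implementation:

```python
import numpy as np

def patch_correlation(patches) -> float:
    """Mean pairwise normalized cross-correlation of one image patch
    observed in several views aligned to a candidate height plane.

    A high score means the views agree, supporting a head top at that
    height; mismatched views (wrong height) score low.
    """
    flat = [(p - p.mean()) / (p.std() + 1e-9)
            for p in (np.asarray(q, dtype=float) for q in patches)]
    total, pairs = 0.0, 0
    for i in range(len(flat)):
        for j in range(i + 1, len(flat)):
            total += float((flat[i].ravel() * flat[j].ravel()).mean())
            pairs += 1
    return total / pairs

# Identical patches from two views correlate near 1.0:
p = np.arange(9.0).reshape(3, 3)
score = patch_correlation([p, p])
```

In a full pipeline this score would be evaluated over a range of candidate heights per foreground region, keeping the height that maximizes agreement.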
6.
We present a new method for recovering the 3D shape of a featureless smooth surface from three or more calibrated images illuminated by different light sources (three of them independent). This method is unique in its ability to handle images taken from unconstrained perspective viewpoints and unconstrained illumination directions. The correspondence between such images is hard to compute and no other known method can handle this problem locally from a small number of images. Our method combines geometric and photometric information in order to recover dense correspondence between the images and accurately computes the 3D shape. Only a single pass starting at one point and local computation are used. This is in contrast to methods that use the occluding contours recovered from many images to initialize and constrain an optimization process. The output of our method can be used to initialize such processes. In the special case of a fixed viewpoint, the proposed method becomes a new perspective photometric stereo algorithm. Nevertheless, thanks to the multiview setup, self-occlusions and regions close to the occluding boundaries are handled better, and the method is more robust to noise than photometric stereo. Experimental results are presented for simulated and real images.
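For the fixed-viewpoint special case, classical Lambertian photometric stereo solves a small linear system per pixel. The sketch below is the standard orthographic formulation, not the paper's perspective method, and the names are ours:

```python
import numpy as np

def photometric_normals(intensities, lights):
    """Classical Lambertian photometric stereo with a fixed viewpoint.

    Solves I = L @ g per pixel, where L stacks three independent light
    directions and g is the albedo-scaled normal; returns unit normals
    and per-pixel albedo.
    """
    L = np.asarray(lights, dtype=float)       # 3 x 3 light directions
    I = np.asarray(intensities, dtype=float)  # 3 x npix intensities
    g = np.linalg.solve(L, I)                 # albedo-scaled normals
    rho = np.linalg.norm(g, axis=0)           # albedo per pixel
    return g / np.maximum(rho, 1e-9), rho

# One pixel lit by three orthogonal lights, surface normal along z:
normals, albedo = photometric_normals([[0.0], [0.0], [1.0]], np.eye(3))
```

Three independent lights make `L` invertible, which is exactly the independence condition the abstract states.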
7.
In this paper we describe a new model for optimization problems whose objective functions are not explicitly known, driven by the user’s preferences. The model is able to learn these unknown objective functions and thereby also learns the user’s preferences. It combines neural networks with fuzzy membership functions and interactive evolutionary algorithms in the learning process. Fuzzy membership functions for basic human values and their priorities were constructed using Schwartz’s model of basic human values (achievement, benevolence, conformity, hedonism, power, security, self-direction, stimulation, tradition, and universalism). The quality of the model was tested on the “most attractive font face” problem and evaluated by the following criteria: the speed of computing optimal parameters, the precision of the achieved results, the Wilcoxon signed-rank test, and the similarity of letter images. The results show the developed model to be well suited to modeling user preferences.
8.
The CALPHAD method can be applied as a tool for both alloy development and process guideline determination. In this study, two Mg alloys were designed, their process parameters derived and, using the CALPHAD method, the final results simulated. These results were later confirmed experimentally. It was found that γ-Mg17Al12 precipitates along the grain boundaries (GB), Mg2Sn forms both along the GB and as fine precipitates in the α-Mg matrix, and the addition of Ce misch metal (MM) leads to the formation of elongated Al–rare-earth (RE) precipitates along the GB. The microstructural stability at 200 °C is high, showing no decrease in microhardness over 32 days. It is shown that the CALPHAD method considerably reduces the effort of alloy design and that the reliability of the results is high.
9.
10.
The weak form of the Efficient Market Hypothesis (EMH) states that the current market price fully reflects the information in past prices and rules out prediction based on price data alone. No recent test of stock-return time series rejects this weak-form hypothesis. This research offers another test of the weak form of the EMH that leads to different conclusions for some time series. The stochastic complexity of a time series is a measure of the number of bits needed to represent and reproduce the information in the time series. In an efficient market, compression of the time series is not possible, because there are no patterns and the stochastic complexity is high. In this research, Rissanen's context-tree algorithm is used to identify recurring patterns in the data and to use them for compression. The weak form of the EMH is tested for 13 international stock indices and for all the stocks that comprise the Tel-Aviv 25 index (TA25), using sliding windows of 50, 75, and 100 consecutive daily returns. Statistically significant compression is detected in ten of the international stock index series. In aggregate, 60% to 84% of the TA25 stocks tested demonstrate compressibility beyond randomness. This indicates potential market inefficiency.
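The compression test can be sketched with a general-purpose compressor standing in for Rissanen's context-tree algorithm. Here zlib, the sign discretization, and the non-overlapping window handling are all simplifying assumptions of ours, not the paper's procedure:

```python
import zlib

def compressibility(returns, window: int = 50) -> list[float]:
    """Compressed-size ratio of sign-discretized returns per window.

    Each daily return is mapped to an up/down symbol; under the weak-form
    EMH the symbol sequence should look random and resist compression, so
    ratios consistently well below 1 hint at exploitable structure.
    """
    symbols = bytes(1 if r > 0 else 0 for r in returns)
    ratios = []
    for start in range(0, len(symbols) - window + 1, window):
        block = symbols[start:start + window]
        ratios.append(len(zlib.compress(block, 9)) / len(block))
    return ratios

# A perfectly patterned series (all up-days) compresses far below ratio 1:
ratios = compressibility([0.5] * 100, window=50)
```

A full test would compare each window's compressed size against the distribution obtained from shuffled surrogates to decide statistical significance, as the abstract's "compressibility beyond randomness" suggests.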
Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号