Full-text access type (articles)
Paid full text | 223 |
Free | 11 |
Subject classification (articles)
Electrical engineering | 4 |
Chemical industry | 19 |
Metalworking | 1 |
Machinery and instrumentation | 4 |
Building science | 1 |
Energy and power | 12 |
Radio electronics | 32 |
General industrial technology | 27 |
Metallurgical industry | 45 |
Atomic energy technology | 1 |
Automation technology | 79 |
Publication year (articles)
2023 | 2 |
2022 | 3 |
2021 | 1 |
2020 | 4 |
2019 | 2 |
2018 | 9 |
2017 | 8 |
2016 | 4 |
2015 | 6 |
2014 | 8 |
2013 | 15 |
2012 | 8 |
2011 | 13 |
2010 | 12 |
2009 | 14 |
2008 | 18 |
2007 | 15 |
2006 | 6 |
2005 | 5 |
2004 | 8 |
2003 | 5 |
2002 | 6 |
2001 | 7 |
2000 | 2 |
1999 | 5 |
1998 | 4 |
1997 | 7 |
1996 | 2 |
1995 | 4 |
1994 | 2 |
1993 | 3 |
1992 | 1 |
1991 | 1 |
1990 | 2 |
1989 | 2 |
1988 | 1 |
1987 | 1 |
1986 | 1 |
1985 | 4 |
1984 | 1 |
1983 | 2 |
1981 | 1 |
1980 | 2 |
1978 | 1 |
1976 | 2 |
1975 | 1 |
1916 | 1 |
1913 | 2 |
Sort order: 234 results in total; search took 15 ms
1.
Supermedia-enhanced Internet-based telerobotics   Total citations: 4 (self-citations: 0; citations by others: 4)
Elhajj, I.; Ning Xi; Wai Keung Fung; Yun-Hui Liu; Hasegawa, Y.; Fukuda, T. 《Proceedings of the IEEE》 2003, 91(3): 396-421
This paper introduces new planning and control methods for supermedia-enhanced real-time telerobotic operations via the Internet. Supermedia is the collection of video, audio, haptic information, temperature, and other sensory feedback. When the communication medium, such as the Internet, introduces random communication time delays, several challenges and difficulties arise. Most importantly, random communication delay causes instability, loss of transparency, and desynchronization in real-time closed-loop telerobotic systems. Due to the complexity and diversity of such systems, the first challenge is to develop a general and efficient modeling and analysis tool. This paper proposes Petri net modeling to capture the concurrency and complexity of Internet-based teleoperation. Combined with the event-based planning and control method, it also provides an efficient analysis and design tool for studying the stability, transparency, and synchronization of such systems. In addition, the concepts of event transparency and event synchronization are introduced and analyzed. This modeling and control method has been applied to the design of several supermedia-enhanced Internet-based telerobotic systems, including the bilateral control of mobile robots and mobile manipulators. These systems have been experimentally implemented on a three-site test bed consisting of robotic laboratories in the USA, Hong Kong, and Japan. The experimental results verify the theoretical development and further demonstrate the stability, event transparency, and event synchronization of the systems.
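The abstract's central idea, that teleoperation events can be modeled as a Petri net whose transitions fire only when all input tokens are present, can be illustrated with a minimal sketch. The net, its place names, and the dispatch/complete transitions below are our own invented toy example, not the paper's actual model:

```python
# Minimal Petri net sketch (hypothetical places and transitions): a command
# is dispatched only when both the operator input token and the previous
# feedback acknowledgment token are present, one simple way to express
# event synchronization under random network delay.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            return False
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1
        return True

net = PetriNet({"operator_input": 1, "feedback_ack": 1, "robot_busy": 0})
net.add_transition("dispatch", ["operator_input", "feedback_ack"], ["robot_busy"])
net.add_transition("complete", ["robot_busy"], ["feedback_ack"])

assert net.fire("dispatch")          # both tokens present: command goes out
assert not net.enabled("dispatch")   # blocked until feedback returns
assert net.fire("complete")          # feedback event re-enables dispatch
```

The firing rule makes the event ordering explicit regardless of how long each network round trip takes, which is the property the paper exploits for analysis.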
2.
3.
The effect of DC flux on core loss is examined for the practical range of power and frequency. The relevant core loss equations are derived and applied to an optimization algorithm to determine the minimum core loss at a given ratio s (DC flux density to AC peak flux density). It is found that the curves of hysteresis loss density versus the ratio s exhibit a peak at a critical ratio; below or above this critical ratio, the loss density decreases drastically. The curves of eddy-current loss density versus the ratio s, on the other hand, exhibit a minimum at a critical ratio; below or above this critical ratio, the loss density increases gradually.
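The optimization step described above, scanning the ratio s for the minimum total loss, can be sketched numerically. The two loss functions below are invented placeholders shaped only to match the qualitative behavior the abstract reports (hysteresis loss peaking, eddy-current loss dipping, at a critical ratio); they are not the paper's derived equations:

```python
import math

# Hypothetical loss shapes (NOT the paper's equations), chosen only to
# reproduce the reported qualitative behavior as functions of the ratio s.

def hysteresis_loss(s, k_h=1.0):
    # assumed: peaks near a critical ratio, falls off drastically on both sides
    return k_h * math.exp(-((s - 0.5) ** 2) / 0.05)

def eddy_loss(s, k_e=1.0):
    # assumed: minimum near a critical ratio, rises gradually on both sides
    return k_e * (0.2 + (s - 0.5) ** 2)

def total_loss(s):
    return hysteresis_loss(s) + eddy_loss(s)

# grid search for the ratio minimizing total core loss
s_grid = [i / 1000.0 for i in range(1001)]
s_opt = min(s_grid, key=total_loss)
```

With the paper's actual equations substituted for the placeholders, the same grid search (or any 1-D minimizer) yields the minimum-loss operating ratio.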
4.
Recently, there has been considerable interest in modelling real data with a heavy-tailed distribution. A popular candidate is the so-called generalized autoregressive conditional heteroscedastic (GARCH) model. Unfortunately, the tails of GARCH models are not thick enough in some applications. In this paper, we propose a mixture generalized autoregressive conditional heteroscedastic (MGARCH) model. The stationarity conditions and the tail behaviour of the MGARCH model are studied. It is shown that MGARCH models have thicker tails than the associated GARCH models and are therefore more capable of capturing heavy-tailed features in real data. Several real examples illustrate the results.
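A mixture GARCH process of the kind described can be simulated in a few lines: at each step a mixture component is drawn, and the return is generated from that component's conditional variance. The two-component parameterization below is illustrative only (not the paper's specification), with each component satisfying the usual alpha + beta < 1 condition:

```python
import random

# Illustrative two-component mixture GARCH(1,1) simulator. Parameters are
# made up for the sketch; both variance recursions are driven by the
# realized return, and the active component is drawn with probability p.

def simulate_mgarch(n, p=0.9, omega=(0.1, 0.5), alpha=(0.1, 0.3),
                    beta=(0.8, 0.6), seed=0):
    rng = random.Random(seed)
    h = [1.0, 1.0]            # conditional variances of the two components
    returns = []
    for _ in range(n):
        k = 0 if rng.random() < p else 1          # pick mixture component
        eps = rng.gauss(0.0, 1.0) * h[k] ** 0.5   # return from component k
        returns.append(eps)
        for j in range(2):    # update both GARCH(1,1) recursions
            h[j] = omega[j] + alpha[j] * eps ** 2 + beta[j] * h[j]
    return returns

r = simulate_mgarch(5000)
```

Occasional draws from the high-volatility component fatten the tails of the simulated returns relative to a single GARCH component, which is the effect the paper establishes formally.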
5.
In this paper, we propose a new continuous self-collision detection (CSCD) method for a deformable surface that interacts with a simple solid model. The method builds on the radial-view-based culling method and is suitable for deformable surfaces that have a large contact region with the solid model. The deformable surface may contain small round holes. At the pre-processing stage, the holes of the deformable surface are filled with ghost triangles to make the mesh watertight, and an observer primitive (a point or a line segment) is computed that lies inside the solid model. At the runtime stage, the orientations of triangles with respect to the observer primitive are evaluated, and the collision status of the deformable surface is determined. We evaluated our method on several animations, including virtual garments. Experimental results show that our method improves the process of CSCD.
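The runtime orientation test at the heart of radial-view-based culling can be sketched for the simplest observer primitive, a point: a triangle whose normal points away from the interior observer is radially front-facing. The function and vector helpers below are our own simplification, not the paper's implementation:

```python
# Orientation of a triangle with respect to an interior observer point:
# sign of dot(triangle normal, centroid - observer). Simplified sketch.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def faces_away(tri, observer):
    """True if the triangle's normal points away from the observer point,
    i.e. the triangle is front-facing as seen radially from inside."""
    a, b, c = tri
    n = cross(sub(b, a), sub(c, a))
    centroid = tuple((a[i] + b[i] + c[i]) / 3.0 for i in range(3))
    return dot(n, sub(centroid, observer)) > 0.0
```

Triangles that pass this test with consistent orientation can be culled from the self-collision candidate set; only the remainder needs the expensive continuous test.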
6.
We study the online preemptive scheduling of intervals and jobs (with restarts). Each interval or job has an arrival time, a deadline, a length, and a weight. The objective is to maximize the total weight of completed intervals or jobs. While the deterministic case for intervals was settled a long time ago, the randomized case remains open. In this paper we first give a 2-competitive randomized algorithm for the case of equal-length intervals. The algorithm is barely random in the sense that it randomly chooses between two deterministic algorithms at the beginning and then sticks with that choice thereafter. We then extend the algorithm to cover several other cases of interval scheduling, including monotone instances, C-benevolent instances, and D-benevolent instances, with the same competitive ratio. These algorithms are surprisingly simple but achieve the best competitive ratio among all previous (fully or barely) randomized algorithms. Next we extend the idea to give a 3-competitive algorithm for equal-length jobs. Finally, we prove a lower bound of 2 on the competitive ratio of all barely random algorithms that choose between two deterministic algorithms for scheduling equal-length intervals (and hence jobs). A preliminary version of this paper appeared in Fung et al. (The 6th International Workshop on Approximation and Online Algorithms, vol. 5426, pp. 53–66, 2008).
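The "barely random" pattern the abstract describes, one coin flip at the start and a fully deterministic run afterwards, is a simple structural idea worth seeing in code. The two placeholder rules below (max-weight vs. earliest-deadline) are ours for illustration, not the paper's actual pair of algorithms:

```python
import random

# Barely random scheduler: flip one coin up front, then commit to a single
# deterministic algorithm for the rest of the instance. The two deterministic
# rules here are illustrative placeholders only.

def make_barely_random(alg_a, alg_b, seed=None):
    rng = random.Random(seed)
    chosen = alg_a if rng.random() < 0.5 else alg_b
    return chosen   # every future decision uses this one deterministic rule

def by_weight(pending):
    return max(pending, key=lambda iv: iv["weight"])

def by_deadline(pending):
    return min(pending, key=lambda iv: iv["deadline"])

scheduler = make_barely_random(by_weight, by_deadline, seed=1)
pending = [{"weight": 3, "deadline": 5}, {"weight": 1, "deadline": 2}]
choice = scheduler(pending)
```

Because the only randomness is the initial choice, an adversary argument over such algorithms reduces to a two-case analysis, which is what makes the matching lower bound of 2 in the paper tractable.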
7.
8.
Analogy-based estimation (ABE) generates an effort estimate for a new software project through adaptation of similar past projects (a.k.a. analogies). The majority of ABE methods use uniform weighting in the adaptation procedure. In this research we investigated non-uniform weighting through kernel density estimation. After extensive experimentation with 19 datasets, 3 evaluation criteria, 5 kernels, 5 bandwidth values, and a total of 2090 ABE variants, we found that: (1) non-uniform weighting through kernel methods cannot outperform uniform-weighting ABE, and (2) kernel type and bandwidth parameters do not produce a definite effect on estimation performance. In summary, simple ABE approaches are able to perform better than much more complex approaches. Hence, provided that similar experimental settings are adopted, we discourage the use of kernel methods as a weighting strategy in ABE.
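The two adaptation schemes being compared, uniform weighting versus kernel weighting of the nearest analogies, amount to a one-line difference in the estimate. The distances, efforts, and Gaussian kernel below are an illustrative setup of our own, not data or settings from the study:

```python
import math

# Effort estimate from k nearest analogies: uniform mean vs. Gaussian-kernel
# weighting by distance. Numbers are made up for illustration.

def estimate(distances, efforts, bandwidth=None):
    """Weighted mean of analogy efforts; uniform weights if bandwidth is None."""
    if bandwidth is None:
        w = [1.0] * len(efforts)
    else:
        w = [math.exp(-(d / bandwidth) ** 2 / 2.0) for d in distances]
    return sum(wi * e for wi, e in zip(w, efforts)) / sum(w)

dists = [0.1, 0.4, 0.9]          # distances of the 3 nearest past projects
efforts = [100.0, 140.0, 300.0]  # their recorded efforts
uniform_est = estimate(dists, efforts)                 # plain mean
kernel_est = estimate(dists, efforts, bandwidth=0.5)   # closer analogies dominate
```

Kernel weighting pulls the estimate toward the closest analogies; the study's finding is that, across its 2090 variants, this extra machinery did not beat the plain mean.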
9.
We present a randomized EREW PRAM algorithm to find a minimum spanning forest in a weighted undirected graph. On an n-vertex graph the algorithm runs in o((log n)^(1+ε)) expected time for any ε > 0 and performs linear expected work. This is the first linear-work, polylog-time algorithm on the EREW PRAM for this problem. This also gives parallel algorithms that perform expected linear work on two general-purpose models of parallel computation: the QSM and the BSP.
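The parallel algorithm itself is beyond a short sketch, but its classical building block, a Borůvka contraction phase in which every component selects its cheapest incident edge, can be shown sequentially. This is the standard textbook procedure, not the paper's randomized EREW PRAM construction:

```python
# Sequential Boruvka minimum spanning forest: repeat phases in which each
# component picks its cheapest incident edge, then merge along those edges.

def boruvka_msf(n, edges):
    """edges: list of (weight, u, v). Returns total weight of a min spanning forest."""
    parent = list(range(n))

    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total = 0.0
    merged = True
    while merged:
        merged = False
        cheapest = {}                 # component root -> cheapest incident edge
        for w, u, v in edges:
            ru, rv = find(u), find(v)
            if ru == rv:
                continue
            for r in (ru, rv):
                if r not in cheapest or w < cheapest[r][0]:
                    cheapest[r] = (w, ru, rv)
        for w, ru, rv in cheapest.values():
            if find(ru) != find(rv):  # guard against double-adding the same edge
                parent[find(ru)] = find(rv)
                total += w
                merged = True
    return total
```

Each phase at least halves the number of components, giving O(log n) phases; the randomized PRAM result combines such contraction phases with random sampling and filtering to bring the expected work down to linear.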
10.
Lean Yu; Huanhuan Chen; Shouyang Wang; Kin Keung Lai 《IEEE Transactions on Evolutionary Computation》 2009, 13(1): 87-102
In this paper, an evolving least squares support vector machine (LSSVM) learning paradigm with a mixed kernel is proposed to explore stock market trends. In the proposed paradigm, a genetic algorithm (GA), one of the most popular evolutionary algorithms (EAs), is first used to select input features for LSSVM learning (evolution of input features). A second GA is then used to optimize the LSSVM parameters (evolution of algorithmic parameters). Finally, the evolving LSSVM learning paradigm, with the best feature subset, optimal parameters, and a mixed kernel, is used to predict stock market movement direction from historical data series. For illustration and evaluation, three important stock indices, the S&P 500 Index, the Dow Jones Industrial Average (DJIA) Index, and the New York Stock Exchange (NYSE) Index, are used as testing targets. The experimental results reveal that the proposed evolving LSSVM can produce forecasting models that are easier to interpret, using a small number of predictive features, and are more efficient than other parameter optimization methods. Furthermore, the produced forecasting model significantly outperforms the other forecasting models considered in this paper in terms of hit ratio. These findings imply that the proposed evolving LSSVM learning paradigm is a promising approach to exploring stock market tendencies.
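The first evolution stage, a GA searching over feature-subset bitmasks, can be sketched with a toy fitness function. In the paper the fitness would be LSSVM cross-validation performance; here it is a made-up stand-in (agreement with a hypothetical "informative" mask) so the selection/crossover/mutation loop stays self-contained:

```python
import random

# Toy GA over feature-subset bitmasks. INFORMATIVE and fitness() are
# invented stand-ins for LSSVM cross-validation scoring.

INFORMATIVE = [1, 0, 1, 1, 0, 0]   # hypothetical ground-truth feature mask

def fitness(mask):
    return sum(1 for m, t in zip(mask, INFORMATIVE) if m == t)

def evolve(n_feat=6, pop_size=20, generations=30, seed=42):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_feat)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_feat)        # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:                # point mutation
                i = rng.randrange(n_feat)
                child[i] = 1 - child[i]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

The paper's second GA stage works the same way over real-valued chromosomes encoding the LSSVM regularization and kernel parameters instead of bitmasks.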