Full-text access type
Paid full text | 44 articles |
Free | 0 articles |
Subject category
Chemical industry | 1 article |
Machinery and instruments | 1 article |
Radio | 5 articles |
General industrial technology | 20 articles |
Metallurgical industry | 6 articles |
Automation technology | 11 articles |
Publication year
2022 | 1 article |
2018 | 1 article |
2015 | 2 articles |
2014 | 1 article |
2013 | 1 article |
2012 | 2 articles |
2011 | 2 articles |
2010 | 2 articles |
2007 | 1 article |
2006 | 5 articles |
2005 | 3 articles |
2004 | 3 articles |
2003 | 2 articles |
2002 | 2 articles |
1998 | 2 articles |
1997 | 2 articles |
1996 | 3 articles |
1993 | 1 article |
1991 | 1 article |
1989 | 1 article |
1985 | 1 article |
1977 | 1 article |
1972 | 1 article |
1971 | 1 article |
1968 | 2 articles |
Sorted by: 44 results found in total, search time 15 ms
1.
This paper is the first attempt to design efficient approximation algorithms for the single-machine weighted flow-time minimization problem when jobs have different release dates and weights equal to their processing times, under the assumption that one job is fixed (i.e., the machine is unavailable during a fixed interval corresponding to the fixed job). Our work is motivated by an interesting algorithmic application to the generation of valid inequalities in a branch-and-cut method. Our analysis shows that the trivial FIFO sequence can lead to an arbitrarily large worst-case performance bound. Hence, we modify this sequence so that a 2-approximation solution is obtained for every instance, and we prove that this bound is tight. We then propose a fully polynomial-time approximation algorithm with an efficient running time for the considered problem. In particular, the complexity of our algorithm is strongly polynomial.
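As a rough illustration of the setting described in this abstract, and not the paper's algorithm, the following Python sketch computes the total weighted flow time of a FIFO schedule on one machine with a fixed unavailability interval; the Job fields, the helper name, and the example data are assumptions for illustration only.

```python
# Minimal sketch (not the paper's method): weighted flow time of a FIFO
# schedule on one machine that is unavailable during [fixed_start, fixed_end).
from dataclasses import dataclass

@dataclass
class Job:
    release: float      # release date r_j
    processing: float   # processing time p_j (here also the weight, w_j = p_j)

def fifo_weighted_flow_time(jobs, fixed_start, fixed_end):
    """Schedule jobs in release-date (FIFO) order, skipping the fixed interval,
    and return the total weighted flow time sum of w_j * (C_j - r_j)."""
    total = 0.0
    t = 0.0
    for job in sorted(jobs, key=lambda j: j.release):
        start = max(t, job.release)
        # a job cannot straddle the unavailable interval: postpone it if needed
        if start < fixed_end and start + job.processing > fixed_start:
            start = fixed_end
        completion = start + job.processing
        total += job.processing * (completion - job.release)  # w_j = p_j
        t = completion
    return total

# Example: two jobs and a machine blocked on [3, 5)
print(fifo_weighted_flow_time([Job(0, 2), Job(1, 4)], 3, 5))  # 36.0
```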
2.
H. Kellerer 《Computing》1991,46(3):183-191
The well-known, NP-complete problem of scheduling a set of n independent jobs nonpreemptively on m identical parallel processors to minimize the maximum finish time is considered. Let ω0 be the finish time of an optimal schedule and ω the finish time of a schedule found by the Longest Processing Time (LPT) heuristic. We improve the Graham bound for the LPT heuristic (ω/ω0 ≤ 4/3 − 1/(3m)), which is tight in general, by considering only jobs with similar processing times.
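As an illustration of the LPT heuristic referred to by the bound above (not of the paper's refined analysis), the sketch below assigns jobs in non-increasing order of processing time to the currently least-loaded of m identical machines; the function name and the worked example are assumptions.

```python
# Minimal sketch of the Longest Processing Time (LPT) heuristic: each job, in
# non-increasing size order, goes to the currently least-loaded machine.
import heapq

def lpt_makespan(processing_times, m):
    """Return the makespan of the LPT schedule on m identical machines."""
    loads = [0.0] * m              # min-heap of machine finish times
    heapq.heapify(loads)
    for p in sorted(processing_times, reverse=True):
        load = heapq.heappop(loads)  # least-loaded machine
        heapq.heappush(loads, load + p)
    return max(loads)

# Example: a classic worst-case instance for m = 2 where LPT attains the
# Graham bound: LPT makespan 7 versus optimum 6, i.e. 7/6 = 4/3 - 1/(3*2).
print(lpt_makespan([3, 3, 2, 2, 2], 2))  # 7.0
```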
3.
4.
Sir, Response to "Are all photon radiations similar in large absorbers? A comparison of electron spectra" by A. M. Kellerer and H. Roos. When the ICRP adopted a quality factor, and subsequently a radiation weighting factor, that gives equal weight to different photon radiations, it did not necessarily imply that equal …
5.
Bartholomäus Kellerer Manfred Reitenspiess 《International Journal on Software Tools for Technology Transfer (STTT)》2005,7(4):376-387
Telecommunications technologies are undergoing a major paradigm shift. Standards-based, off-the-shelf components and the Internet are gaining wide acceptance. The success of this move is strongly dependent upon the quality and availability of these technologies. Practical quality assurance in this environment can take advantage of the tools and methods developed when carrier-grade systems for the telecommunications market were being deployed. Besides standard test methods, availability-related methods for redundant hardware and software components are applied. Statistics are available that prove the success of this approach. The statistical data are derived from the deployment of the commercial product RTP4 Continuous Services, a standards-based high-availability middleware. Additional momentum has been gained in the Service Availability Forum (www.saforum.org), where the interface standards are validated and certified in independent test processes.
6.
EP Ivanov GV Tolochko LP Shuvaeva S Becker E Nekolla AM Kellerer 《Canadian Metallurgical Quarterly》1996,35(2):75-80
Childhood leukemia (ICD 204-208 [1]) incidence rates in the different regions of Belarus are reported for a period before and after the Chernobyl accident (1982-1994). There are, at this point, no recognizable trends towards higher rates.
7.
Conventional X rays, i.e. X rays generated at tube voltages between roughly 150 and 300 kV, are used in many radio-diagnostic procedures and also in radiobiological experiments. They release less energetic and, therefore, more densely ionising electrons than the high-energy gamma rays from 60Co or from the A bombs. Accordingly, they are considered to be somewhat more effective, especially at low doses. Various radiobiological studies, especially studies on chromosome aberrations, have confirmed this assumption, but epidemiological investigations, e.g. the comparison of the excess relative risk for mammary cancer in X-ray exposed patients and in gamma-ray exposed A bomb survivors, have not demonstrated a similar difference. In view of the missing epidemiological evidence, and largely for reasons of practicality in radiation protection, the ICRP has recommended a radiation weighting factor of unity equally for all photon radiations. However, in the discussion preceding the 2005 Recommendations of the ICRP, the issue remains controversial. In a recent paper, Harder et al. argue, with reference to an assessment by the German Radiation Protection Commission (SSK), that the use of the same weighting factor for different photon energies can be justified more directly. For high-energy incident photons, they present the degraded photon spectra at different depths in a phantom, and they conclude that much of the difference between high-energy gamma rays and conventional X rays disappears in a large phantom. The present assessment, which is more direct, compares the spectra of electrons released (through pair production, the Compton effect and the photoelectric effect) in a small and in a very large receptor for incident photons of 150 keV, 1 MeV and 6 MeV. For the 1 MeV and 6 MeV photons, there is a substantial shift towards smaller electron energies in the large receptor, but the electron spectra remain much harder than those from the 150 keV incident photons. Furthermore, it is seen, in agreement with earlier conclusions by Straume, that for the broad gamma-ray spectrum from the A bombs there is no shift at all to lower energies within the body, but rather some degree of hardening of the radiation. The assumption that distinct differences between high-energy gamma rays and conventional X rays are restricted to small samples must thus be rejected. The attribution of the same effective quality factor or radiation weighting factor to all photon energies therefore remains an issue that rests on considerations beyond dosimetry.
8.
The length changes caused by oxidation in air of Ti-6Al-4V were investigated at temperatures between 800° and 1040°C. In 3.1 mm-thick specimens, a 60 min exposure at 950°C results in a net expansion of 0.7 pct. If oxidation and the corresponding expansion are restricted to one surface of a sheet-metal specimen, a bimetallic strip effect is obtained and the specimens deform into the shape of an arc. Several mechanisms can contribute to deformation during oxidation. The increase of the “c” lattice parameter with increasing oxygen content accounts for most of the observed volume expansion. Because oxygen stabilizes α, the surface layers contain a higher than equilibrium α concentration. The higher thermal expansion of α and its larger volume per unit cell cause additional deformation. These mechanisms can apparently introduce surface stresses of up to several kg per sq mm, which result in extensive creep deformation.
9.
Hans Kellerer 《IIE Transactions》1998,30(11):991-999
In this paper we present algorithms for the problem of scheduling n independent jobs on m identical machines. As a generalization of the classical multiprocessor scheduling problem, each machine is available only from a machine-dependent release time. Two objective functions are considered. To minimize the makespan, we develop a dual approximation algorithm with a worst-case bound of 5/4. For the problem of maximizing the minimum completion time, we develop an algorithm such that the minimum completion time in the schedule it produces is at least 2/3 times the minimum completion time in the optimum schedule. The paper closes with some numerical results.
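The sketch below only illustrates the problem setting of this abstract, machines with machine-dependent release times, using a simple greedy LPT-style assignment; it is not the paper's 5/4 dual approximation algorithm, and all names and data are illustrative assumptions.

```python
# Minimal sketch of scheduling on m identical machines where machine i only
# becomes available at its own release time a_i: greedy assignment of jobs in
# non-increasing size order to the machine with the earliest finish time.
import heapq

def greedy_makespan(processing_times, machine_release_times):
    """Return the makespan of the greedy LPT-style schedule."""
    # heap entries: (current finish time, machine index), initialized to a_i
    loads = [(a, i) for i, a in enumerate(machine_release_times)]
    heapq.heapify(loads)
    for p in sorted(processing_times, reverse=True):
        finish, i = heapq.heappop(loads)     # machine finishing earliest
        heapq.heappush(loads, (finish + p, i))
    return max(f for f, _ in loads)

# Example: three machines released at times 0, 2 and 4
print(greedy_makespan([5, 4, 3, 3, 2], [0, 2, 4]))  # 8
```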
10.
Thorsten Biermann Luca Scalia Changsoon Choi Holger Karl Wolfgang Kellerer 《Pervasive and Mobile Computing》2012,8(5):662-681
Coordinated Multi-Point (CoMP) transmission and reception is a promising solution for managing interference and increasing performance in future wireless cellular systems. Due to its strict requirements in terms of capacity, latency, and synchronization among cooperating Base Stations (BSs), its successful deployment depends on the capability of the mobile backhaul network infrastructure. We deal with the feasibility of CoMP transmission/reception, in particular of Joint Transmission (JT). For this, we first evaluate which cluster sizes are reasonable from the wireless point of view to achieve the desired performance gains. Thereafter, we analyze how different backhaul topologies (e.g., mesh and tree structures) and backhaul network technologies (e.g., layer-2 switching and single-copy multicast capabilities) can support these desired clusters. We study, for different traffic scenarios and backhaul connectivity levels, which of the desired BS clusters are actually feasible according to the backhaul characteristics. We found that a significant mismatch exists between the desired and feasible clusters. Neglecting this mismatch causes overheads in real JT implementations, which complicates or even prevents their deployment. Based on our findings, we propose a clustering system architecture that not only includes wireless information, as done in the state of the art, but also combines wireless and backhaul network feasibility information in a smart way. This avoids unnecessary signaling and User Equipment (UE) data exchange among BSs that are not eligible to take part in the cooperative cluster. Evaluations show that our scheme reduces the signaling and UE data exchange overhead by up to 85% compared to conventional clustering approaches, which do not take the backhaul network’s status into account.
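As a purely illustrative sketch of the idea described in this abstract (not the paper's system), the snippet below filters a wireless-derived desired BS cluster with a simple backhaul feasibility check before any signaling takes place; all names, values, and the capacity test are assumptions.

```python
# Illustrative sketch only: intersect the desired cooperation cluster (from
# wireless information) with the set of base stations whose backhaul can
# sustain the capacity that Joint Transmission would require.

def feasible_cluster(desired_bs_cluster, backhaul_capacity, required_capacity):
    """Keep only the base stations whose backhaul link meets the requirement."""
    return {bs for bs in desired_bs_cluster
            if backhaul_capacity.get(bs, 0.0) >= required_capacity}

# Example: BS 3's backhaul is too weak, so it is dropped before any UE data
# or signaling is exchanged with it.
desired = {1, 2, 3}
capacity = {1: 10.0, 2: 8.0, 3: 2.0}   # assumed per-link backhaul capacity
print(feasible_cluster(desired, capacity, required_capacity=5.0))  # {1, 2}
```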