71.
The SQuad data structure represents the connectivity of a triangle mesh by its "S table" of about 2 rpt (integer references per triangle). Yet it allows for a simple implementation of expected constant-time, random-access operators for traversing the mesh, including in-order traversal of the triangles incident upon a vertex. SQuad is more compact than the Corner Table (CT), which stores 6 rpt, and than the recently proposed SOT, which stores 3 rpt. However, in-core access is generally faster in CT than in SQuad, and SQuad requires rebuilding the S table if the connectivity is altered. The storage reduction and memory coherence opportunities it offers may help to reduce the frequency of page faults and cache misses when accessing elements of a mesh that does not fit in memory. We provide the details of a simple algorithm that builds the S table and of an optimized implementation of the SQuad operators.
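For orientation, the 6-rpt Corner Table that the abstract uses as its baseline can be sketched as follows. The class and operator names (`t`, `n`, `p`, `v`, `o`, `swing`) follow common Corner Table conventions; this is an illustrative sketch of the reference structure, not of the SQuad S table itself.

```python
# Minimal Corner Table sketch: two arrays of 3 integers per triangle
# (V: vertex per corner, O: opposite corner per corner), i.e. 6 rpt.
class CornerTable:
    def __init__(self, V, O):
        self.V = V  # V[c]: vertex id at corner c
        self.O = O  # O[c]: corner opposite c across an edge (-1 on boundary)

    def t(self, c): return c // 3                        # triangle of corner c
    def n(self, c): return 3 * self.t(c) + (c + 1) % 3   # next corner in triangle
    def p(self, c): return 3 * self.t(c) + (c + 2) % 3   # previous corner
    def v(self, c): return self.V[c]                     # vertex at corner c
    def o(self, c): return self.O[c]                     # opposite corner

    def swing(self, c):
        """Next corner around the vertex v(c) (interior vertices only)."""
        return self.n(self.o(self.n(c)))

# Two triangles (0,1,2) and (2,1,3) sharing the edge (1,2).
ct = CornerTable(V=[0, 1, 2, 2, 1, 3], O=[5, -1, -1, -1, -1, 0])
```

Traversal of the triangles incident upon a vertex is then repeated `swing` calls until the starting corner reappears, which is the in-order traversal SQuad also supports at a third of the storage.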
72.
Shape complexity     
The complexity of 3D shapes that are represented in digital form and processed in CAD/CAM/CAE, entertainment, biomedical, and other applications has increased considerably. Much research has focused on coping with, or on reducing, shape complexity. But what exactly is shape complexity? We discuss several complexity measures and the corresponding complexity-reduction techniques. Algebraic complexity measures the degree of the polynomials needed to represent the shape exactly in its implicit or parametric form. Topological complexity measures the number of handles and components, or the presence of non-manifold singularities, non-regularized components, holes, or self-intersections. Morphological complexity measures smoothness and feature size. Combinatorial complexity measures the vertex count of polygonal meshes. Representational complexity measures the footprint and ease of use of a data structure, or the storage size of a compressed model. The latter three vary as a function of accuracy.
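Two of these measures are directly computable from a triangle mesh: combinatorial complexity is the vertex count, and topological complexity (the handle count) follows from the Euler characteristic V - E + F = 2 - 2g of a closed, connected, orientable surface. A minimal sketch:

```python
# Genus (handle count) of a closed, connected, orientable triangle mesh
# via the Euler characteristic chi = V - E + F = 2 - 2g.
def genus(triangles):
    verts = {v for t in triangles for v in t}
    edges = {frozenset(e) for t in triangles
             for e in ((t[0], t[1]), (t[1], t[2]), (t[2], t[0]))}
    chi = len(verts) - len(edges) + len(triangles)
    return (2 - chi) // 2

# A tetrahedron is a topological sphere: V=4, E=6, F=4, chi=2, genus 0.
tet = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
```

The other measures (algebraic degree, feature size, compressed footprint) depend on the chosen representation and accuracy, as the abstract notes.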
73.
Since the publication of the original Marching Cubes algorithm, numerous variations have been proposed for guaranteeing water-tight constructions of triangulated approximations of isosurfaces. Most approaches divide the 3D space into cubes that each occupy the space between eight neighboring samples of a regular lattice. The portion of the isosurface inside a cube may be computed independently of what happens in the other cubes, provided that the constructions for each pair of neighboring cubes agree along their common face. The portion of the isosurface associated with a cube may consist of one or more connected components, which we call sheets. The topology and combinatorial complexity of the isosurface is influenced by three types of decisions made during its construction: (1) how to connect the four intersection points on each ambiguous face, (2) how to form interpolating sheets for cubes with more than one loop, and (3) how to triangulate each sheet. To determine topological properties, it is only relevant whether the samples are inside or outside the object, and not their precise value, if there is one. Previously reported techniques make these decisions based on local—per cube—criteria, often using precomputed look-up tables or simple construction rules. Instead, we propose global strategies for optimizing several topological and combinatorial measures of the isosurfaces: triangle count, genus, and number of shells. We describe efficient implementations of these optimizations and the auxiliary data structures developed to support them.
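Decision (1) hinges on ambiguous faces: a cube face is ambiguous when its four corners alternate inside/outside along its diagonals. A sketch of the per-cube classification, under an assumed corner numbering (corner i at coordinates ((i>>0)&1, (i>>1)&1, (i>>2)&1)); the face list and case-index packing are conventions for illustration, not the paper's tables:

```python
# 6 cube faces, each listed as 4 corner indices in cyclic order.
FACES = [
    (0, 1, 3, 2), (4, 5, 7, 6),  # z = 0, z = 1
    (0, 1, 5, 4), (2, 3, 7, 6),  # y = 0, y = 1
    (0, 2, 6, 4), (1, 3, 7, 5),  # x = 0, x = 1
]

def case_index(inside):
    """Pack 8 inside/outside booleans into the classic 8-bit case index."""
    return sum(1 << i for i in range(8) if inside[i])

def ambiguous_faces(inside):
    """Faces whose corners alternate in/out around the face (diagonal pairs
    agree with each other but disagree across diagonals)."""
    return [f for f in FACES
            if inside[f[0]] == inside[f[2]] != inside[f[1]] == inside[f[3]]]
```

Note that, as the abstract says, only the in/out booleans matter for topology; the scalar values enter only when positioning the intersection points.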
74.
Baked carbon anodes with varying apparent densities and baking temperatures were tested in Na3AlF6–Al2O3(sat) melts at 1010 °C. The double-layer capacitance (C_dl) was used as an indicator of the wetted surface area. For unpolarized anodes, C_dl increased with increasing time of immersion and reached a constant level after 1.5–2 h. The values decreased with increasing polarization potential in the range 1–1.5 V positive to aluminium. The C_dl of polished samples increased markedly during electrolysis, particularly at low current densities. No clear correlation was found between C_dl and apparent density. Semi-logarithmic plots of potential versus current could be divided into three segments. The lower two were linear, with ranges and slopes of 0.01–0.1 A cm⁻², 0.20–0.44 V per decade and 0.1–0.5 A cm⁻², 0.18–0.24 V per decade, respectively. At higher current densities the curves bent upwards. The current density corresponding to an overpotential of 0.5 V increased slightly with increasing apparent density, whereas the ohmic voltage drop at constant current density decreased. The current densities were corrected for differences in wetted surface area on the basis of the C_dl data. Changing the baking temperature from 970 to 1100 °C had no appreciable effect on the overpotential, whereas samples baked at 1250 °C showed a somewhat lower overpotential.
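The "V per decade" slopes above are Tafel slopes: within each linear segment the potential E is linear in log10(i), so the slope falls out of a least-squares fit of E against log10(i). A sketch with made-up numbers (the currents and potentials below are illustrative, not the paper's data):

```python
import math

def tafel_slope(currents, potentials):
    """Least-squares slope of E vs log10(i), in V per decade."""
    x = [math.log10(i) for i in currents]
    n = len(x)
    mx, my = sum(x) / n, sum(potentials) / n
    return (sum((xi - mx) * (yi - my) for xi, yi in zip(x, potentials))
            / sum((xi - mx) ** 2 for xi in x))

# Synthetic segment with a 0.25 V/decade slope over 0.01-0.1 A cm^-2.
i_vals = [0.01, 0.02, 0.05, 0.1]
e_vals = [1.10 + 0.25 * (math.log10(i) + 2) for i in i_vals]
```

Fitting each current-density range separately, as the paper does, yields the per-segment slopes.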
75.
This study determined the vertical temperature gradient in two large industrial buildings with room heights close to 10 m. One building was an assembly hall with an air heating system; the other was a warehouse with radiant heating (primary) and air heating (secondary). The objective was to determine the differences in vertical temperature gradient between the halls during winter. The findings were used in dynamic whole-year simulations to estimate the resulting differences in heating and ventilation energy. The results showed vertical temperature gradients of about 0.2 K/m in both halls, which for air heating is smaller by a factor of 5 than guidebook values. This difference is likely because the buildings were ventilated and well insulated. The temperature gradients remained reasonably constant at all measured outdoor temperatures. Energy simulations with the measured gradient of 0.2 K/m and an outdoor airflow rate of 1.0 L/(s·m²) during occupied hours, sufficient for both ventilation and air heating at the simulated good insulation level, resulted in 15% to 41% higher primary energy for air heating. If the outdoor airflow rate can be lowered to 0.5 L/(s·m²) during occupied hours, which was enough to remove occupant-generated pollutants, the analyzed cases showed 23% lower primary energy for radiant heating.
76.

The bulk of Internet interactions is highly redundant and also security sensitive. To reduce communication bandwidth and provide a desired level of security, a data stream is first compressed to squeeze out redundant bits and then encrypted using authenticated encryption. This generic solution is very flexible and works well for any pair of (compression, encryption) algorithms. Its downside, however, is that the two algorithms are designed independently. One would expect that designing a single algorithm that both compresses and encrypts (called compcrypt) should produce benefits in terms of efficiency and security. This work investigates how to design a compcrypt algorithm using ANS entropy coding. First, we examine basic properties of ANS and show that a plain ANS with a hidden encoding table can be broken by statistical attacks. Next, we study ANS behavior when its states are chosen at random. Our compcrypt algorithm is built using ANS with randomized state jumps and sponge-based MonkeyDuplex encryption. Its security and efficiency are discussed. The design provides 128-bit security for both confidentiality and integrity/authentication. Our implementation experiments show that the compcrypt algorithm processes symbols at rates of up to 269 MB/s (with a slight loss of compression rate) and 178 MB/s (with no loss of compression rate).

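To make the ANS state machine concrete, here is a toy, non-streaming range-ANS (rANS) coder: the plain coder whose hidden-table variant the paper's statistical attacks break. This is a sketch only; the actual compcrypt design adds randomized state jumps and sponge-based encryption, neither of which is shown here, and the unbounded-integer state stands in for the renormalized state of a real implementation.

```python
def build_tables(freqs):             # freqs: symbol -> integer frequency
    M = sum(freqs.values())          # total frequency, the rANS "base"
    cdf, acc = {}, 0
    for s in sorted(freqs):
        cdf[s] = acc
        acc += freqs[s]
    return M, cdf

def encode(symbols, freqs):
    M, cdf = build_tables(freqs)
    x = 1                            # state as an unbounded int: no renorm
    for s in symbols:                # push symbols in forward order...
        x = (x // freqs[s]) * M + cdf[s] + (x % freqs[s])
    return x

def decode(x, n, freqs):
    M, cdf = build_tables(freqs)
    out = []
    for _ in range(n):               # ...pop them back in reverse (LIFO)
        slot = x % M
        s = max(t for t in cdf if cdf[t] <= slot)   # symbol owning this slot
        x = freqs[s] * (x // M) + slot - cdf[s]     # exact inverse of encode
        out.append(s)
    return out[::-1]
```

Each decode step exactly inverts the last encode step, which is why the coder is lossless and why leaking the state-update statistics leaks the table.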
77.
Enhancing the effectiveness of colorectal cancer treatment is highly desirable. Radiation-based anticancer therapy, such as proton therapy (PT), can be used to shrink tumors before subsequent surgical intervention, so improving the effectiveness of this treatment is crucial. The addition of noble metal nanoparticles (NPs), acting as radiosensitizers, increases the therapeutic effect of PT. In this paper, the effect of novel gold–platinum nanocauliflowers (AuPt NCs) on PT efficiency is determined. For this purpose, crystalline, fancy-shaped, 66 nm bimetallic AuPt NCs were synthesized using a green-chemistry method. The obtained AuPt NCs were then characterized physicochemically by transmission electron microscopy (TEM), selected-area electron diffraction (SAED), energy-dispersive X-ray spectroscopy (EDS), and UV-Vis spectroscopy. The fully characterized AuPt NCs were added to cultures of colon cancer cell lines (HCT116, SW480, and SW620) and a normal colon cell line (FHC) and subsequently subjected to proton irradiation with a total dose of 15 Gy. The 3-(4,5-dimethylthiazol-2-yl)-5-(3-carboxymethoxyphenyl)-2-(4-sulfophenyl)-2H-tetrazolium (MTS) assay, performed after 18 h incubation of the irradiated cell cultures with AuPt NCs, showed a significant reduction in cancer cell viability compared to normal cells. The radio-enhancing features of AuPt NCs thus indicate their potential for improving the effectiveness of anticancer proton therapy.
78.
79.
Event services based on publish–subscribe architectures are well-established components of distributed computing applications. Recently, an event service has been proposed as part of the common component architecture (CCA) for high-performance computing (HPC) applications. In this paper we describe our implementation, experimental evaluation, and initial experience with a high-performance CCA event service that exploits efficient communications mechanisms commonly used on HPC platforms. We describe the CCA event service model and briefly discuss the possible implementation strategies of the model. We then present the design and implementation of the event service using the aggregate remote memory copy interface as an underlying communication layer for this mechanism. Two alternative implementations are presented and evaluated on a Cray XD-1 platform. The performance results demonstrate that event delivery latencies are low and that the event service is able to achieve high-throughput levels. Finally, we describe the use of the event service in an application for high-speed processing of data from a mass spectrometer and conclude by discussing some possible extensions to the event service for other HPC applications. Published in 2009 by John Wiley & Sons, Ltd.
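The publish–subscribe pattern underlying the service can be sketched in a few lines. The class, topic names, and callback API below are illustrative assumptions, not the CCA event service interface or the ARMCI communication layer the paper actually uses:

```python
from collections import defaultdict

class EventService:
    """Toy in-process publish-subscribe channel registry."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        # Deliver the event to every subscriber of this topic, in order.
        for cb in self._subscribers[topic]:
            cb(event)

# Hypothetical usage echoing the paper's mass-spectrometer application.
received = []
svc = EventService()
svc.subscribe("spectrometer/scan", received.append)
svc.publish("spectrometer/scan", {"mz": [101.07, 120.08]})
```

The HPC-specific work in the paper lies in replacing this in-process delivery loop with remote-memory-copy transfers between nodes while keeping the same subscribe/publish semantics.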
80.
We present a new, simple, yet efficient algorithm for triangulating multiply-connected polygons. The algorithm requires sorting only the local concave minima (sags). The order in which triangles are created mimics a flooding of the polygon's interior. At each stage, the algorithm analyses the positions and neighborhoods of only two vertices, and possibly checks for active sags, to determine which of five possible actions to take. The actions are based on a local decomposition of the polygon into monotonic regions, or gorges: raise the water level in the current gorge, spill into an adjacent gorge, jump to the other bank of a filled gorge, divide a gorge into two, or fill a gorge to its top. The implementation is extremely simple and numerically robust for a large class of polygons. It has been tested on millions of cases as a preprocessing step of a walkthrough and inspection program for complex mechanical and architectural scenes. Extensive experimental results indicate that the observed complexity, as a function of the number of vertices, remains low in all cases.
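For comparison with the flooding approach, the classic O(n²) ear-clipping baseline for a simple (single-contour, counter-clockwise) polygon is sketched below; it is a much simpler relative, not the paper's algorithm, and it does not handle the multiply-connected inputs the paper targets:

```python
def cross(o, a, b):
    """Z-component of (a - o) x (b - o); > 0 means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def in_triangle(p, a, b, c):
    """Point-in-CCW-triangle test (boundary counts as inside)."""
    return (cross(a, b, p) >= 0 and cross(b, c, p) >= 0
            and cross(c, a, p) >= 0)

def triangulate(poly):
    """Ear clipping: repeatedly cut off a convex corner containing no
    other remaining vertex. Returns n-2 index triples for an n-gon."""
    idx = list(range(len(poly)))
    tris = []
    while len(idx) > 3:
        for k in range(len(idx)):
            i, j, l = idx[k - 1], idx[k], idx[(k + 1) % len(idx)]
            a, b, c = poly[i], poly[j], poly[l]
            if cross(a, b, c) <= 0:          # reflex corner: not an ear
                continue
            if any(in_triangle(poly[m], a, b, c)
                   for m in idx if m not in (i, j, l)):
                continue                      # another vertex inside: not an ear
            tris.append((i, j, l))
            idx.pop(k)
            break
        else:
            break                             # degenerate input: give up
    tris.append(tuple(idx))
    return tris
```

Ear clipping re-scans the boundary after every cut, whereas the flooding algorithm's five local actions advance monotonically through each gorge, which is where its observed near-linear behavior comes from.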
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号