Search results: 1,518 matches (results 41-50 shown; query time 28 ms)
41.
We consider Constraint Satisfaction Problems in which constraints can be initially incomplete: it is unknown whether certain tuples satisfy the constraint or not. We assume that we can determine the satisfaction of such an unknown tuple, i.e., find out whether the tuple is in the constraint, but doing so incurs a known cost, which may vary between tuples. We also assume that we know the probability of an unknown tuple satisfying a constraint. We define algorithms for this problem based on backtracking search. Specifically, we consider a simple iterative algorithm that imposes a cost limit on the unknowns that may be determined, and a more complex algorithm that delays determining an unknown in order to better estimate whether doing so is worthwhile. We show experimentally that the more sophisticated algorithm can greatly reduce the average cost.
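As a hedged illustration of the cost-limited variant described above, here is a minimal Python sketch: constraints are stored as known/unknown tuple tables, an unknown tuple is only determined (its cost paid) while the running total stays under a limit, and the limit can be raised between iterations. All names (`solve`, `reveal`, the tuple encoding) are hypothetical, not from the paper.

```python
# A toy binary CSP in which some constraint tuples are "unknown": finding out
# whether such a tuple is allowed costs money. 'known' maps a tuple key to
# True/False; 'unknown' maps it to (cost, probability). Keys absent from both
# dicts are treated as allowed; keys are assumed stored for both orderings.

def solve(domains, known, unknown, reveal, cost_limit):
    """Backtracking search that determines an unknown tuple only while the
    total determination cost stays within cost_limit; the simple iterative
    variant reruns this with a raised limit when no solution is found."""
    spent = 0
    assignment = {}

    def consistent(var, val):
        nonlocal spent
        for other, oval in assignment.items():
            key = (var, val, other, oval)
            if key in known:
                if not known[key]:
                    return False
            elif key in unknown:
                cost, _prob = unknown[key]
                if spent + cost > cost_limit:
                    return False          # unusable under this cost limit
                del unknown[key]
                spent += cost             # pay to determine the tuple...
                known[key] = reveal(key)  # ...its status is now known for good
                if not known[key]:
                    return False
        return True

    def backtrack(remaining):
        if not remaining:
            return dict(assignment)
        var = remaining[0]
        for val in domains[var]:
            if consistent(var, val):
                assignment[var] = val
                result = backtrack(remaining[1:])
                if result is not None:
                    return result
                del assignment[var]
        return None

    return backtrack(list(domains)), spent
```

Note that determination costs are sunk: even when a branch later fails, `spent` is not refunded, which is what makes limiting and delaying determinations worthwhile in the first place.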
42.
Over the last few decades, simulation software based on the Finite Element Method (FEM) has contributed significantly to the design of feasible forming processes. Coupling FEM with mathematical optimization algorithms offers a promising opportunity to design optimal metal forming processes rather than merely feasible ones. In this paper, Sequential Approximate Optimization (SAO) for optimizing forging processes is discussed. The algorithm incorporates time-consuming nonlinear FEM simulations. Three variants of the SAO algorithm, which differ in their sequential improvement strategies, have been investigated and compared with other optimization algorithms on two forging processes. The other algorithms considered are two iterative algorithms (BFGS and SCPIP) and a Metamodel Assisted Evolutionary Strategy (MAES). It is essential for sequential approximate optimization algorithms to implement an improvement strategy that uses as much of the information obtained during previous iterations as possible. With such a sequential improvement strategy, SAO provides a very efficient algorithm for optimizing forging processes that rely on time-consuming FEM simulations.
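The SAO loop itself is simple to sketch. Below is a minimal Python version using a radial-basis-function metamodel from SciPy: the `expensive_simulation` placeholder stands in for an FEM forging run, and refitting the metamodel to every evaluation gathered so far reflects the improvement strategy the paper finds essential. Function names and parameters are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def expensive_simulation(x):      # placeholder for a nonlinear FEM forging run
    return float((x[0] - 0.3) ** 2 + (x[1] + 0.2) ** 2 + 0.1 * np.sin(5 * x[0]))

def sao(bounds, n_init=5, n_iter=10, seed=0):
    """Sequential Approximate Optimization: refit a cheap metamodel to every
    expensive evaluation gathered so far, then evaluate at its minimizer."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds)[:, 0], np.array(bounds)[:, 1]
    X = rng.uniform(lo, hi, size=(n_init, len(bounds)))    # initial DOE
    y = np.array([expensive_simulation(x) for x in X])
    for _ in range(n_iter):
        surrogate = RBFInterpolator(X, y)                  # fit the metamodel
        res = minimize(lambda x: surrogate(x[None])[0],    # cheap sub-problem
                       x0=X[np.argmin(y)], bounds=bounds)
        x_new = np.clip(res.x + rng.normal(0, 1e-6, res.x.size), lo, hi)
        X = np.vstack([X, x_new])           # tiny jitter keeps points distinct
        y = np.append(y, expensive_simulation(x_new))      # one new "FEM" call
    best = np.argmin(y)
    return X[best], y[best]

x_opt, f_opt = sao(bounds=[(-1.0, 1.0), (-1.0, 1.0)])
print(x_opt, f_opt)
```

Each iteration costs exactly one expensive evaluation, which is the point: the sub-problem on the surrogate is essentially free compared to the FEM call it replaces.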
43.
Increasingly, new regulations are governing organizations and their information systems. Individuals responsible for ensuring legal compliance and accountability currently lack sufficient guidance and support to manage their legal obligations within relevant information systems. While software controls provide assurances that business processes adhere to specific requirements, such as those derived from government regulations, there is little support to manage these requirements and their relationships to various policies and regulations. We propose a requirements management framework that enables executives, business managers, software developers and auditors to distribute legal obligations across business units and/or personnel with different roles and technical capabilities. This framework improves accountability by integrating traceability throughout the policy and requirements lifecycle. We illustrate the framework within the context of a concrete healthcare scenario in which obligations incurred from the Health Insurance Portability and Accountability Act (HIPAA) are delegated and refined into software requirements. Additionally, we show how auditing mechanisms can be integrated into the framework and how auditors can certify that specific chains of delegation and refinement decisions comply with government regulations.
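To make the traceability idea concrete, here is a toy Python data model, not the authors' framework: each delegated or refined obligation keeps a link to its source, so an auditor can walk any software requirement back to the regulation that produced it. All field names and the sample HIPAA text are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One traceable artifact: a regulation, a delegated obligation, or a
    refined software requirement, each linked back to its source."""
    text: str
    kind: str = "requirement"
    owner: str = ""                      # responsible role or business unit
    source: "Node" = None                # what this was delegated/refined from
    children: list = field(default_factory=list)

    def delegate(self, text, owner, kind="obligation"):
        child = Node(text, kind=kind, owner=owner, source=self)
        self.children.append(child)
        return child

def chain(node):
    """The delegation/refinement chain an auditor would certify."""
    links = []
    while node is not None:
        links.append(f"{node.kind}: {node.text} [{node.owner}]")
        node = node.source
    return links

reg = Node("HIPAA Security Rule: access control", kind="regulation", owner="legal")
ob = reg.delegate("Restrict PHI access to authorized staff", owner="CISO")
req = ob.delegate("Enforce role-based access in the EHR module",
                  owner="dev team", kind="requirement")
print("\n".join(chain(req)))    # requirement, then obligation, then regulation
```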
44.
Evenly Spaced Streamlines for Surfaces: An Image-Based Approach
We introduce a novel, automatic streamline seeding algorithm for vector fields defined on surfaces in 3D space. The algorithm generates evenly spaced streamlines quickly, simply, and efficiently for any general surface-based vector field. It is general because it handles large, complex, unstructured, adaptive-resolution grids with holes and discontinuities, does not require a parametrization, and can generate both sparse and dense representations of the flow. It is efficient because streamlines are integrated only for visible portions of the surface. It is simple because the image-based approach removes the need to perform streamline tracing on a triangular mesh, a process that is complicated at best. And it is fast because it makes effective, balanced use of both the CPU and the GPU. The key to the algorithm's speed, simplicity, and efficiency is its image-based seeding strategy. We demonstrate our algorithm on complex, real-world simulation data sets from computational fluid dynamics and compare it with object-space streamline visualizations.
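For readers unfamiliar with evenly spaced streamline placement, the following 2D Python toy shows the classic occupancy-grid idea (in the spirit of Jobard and Lefer) that image-based seeding builds on: spawn seed candidates one separation distance away from existing lines and stop integrating a line once it gets too close to another. This is only a sketch of the underlying concept, not the paper's surface-based, GPU-assisted pipeline.

```python
import numpy as np

def field(p):                                    # toy circulating vector field
    return np.array([-p[1], p[0]]) + 1e-12

def cell(p, sep):
    return tuple(np.floor(p / sep).astype(int))

def trace(seed, occ, sep, h=0.01, nmax=2000):
    """Integrate one streamline until it runs into an occupied cell."""
    line, p = [seed.copy()], seed.copy()
    for _ in range(nmax):
        v = field(p)
        p = p + h * v / np.linalg.norm(v)        # Euler step (RK4 in practice)
        if occ.get(cell(p, sep)):
            break                                # too close to another line
        line.append(p.copy())
    for q in line:                               # mark this line's cells
        occ[cell(q, sep)] = 1
    return line

def evenly_spaced(start, sep=0.05, max_lines=50):
    occ, lines, queue = {}, [], [np.asarray(start, float)]
    while queue and len(lines) < max_lines:
        seed = queue.pop(0)
        if occ.get(cell(seed, sep)):
            continue                             # candidate too close, skip
        line = trace(seed, occ, sep)
        if len(line) > 10:
            lines.append(line)
            for p in line[::20]:                 # spawn candidates one sep away
                t = field(p)
                n = np.array([-t[1], t[0]]) / np.linalg.norm(t)
                queue += [p + sep * n, p - sep * n]
    return lines

print(len(evenly_spaced([0.5, 0.0])), "streamlines placed")
```

The paper's insight is to run this kind of seeding in image space after projection, so only visible surface regions pay for integration.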
45.
We present a real-time relighting and shadowing method for dynamic scenes with varying lighting, view, and BRDFs. Our approach is based on a compact representation of reflectance data that allows the BRDF to be changed at run time, and on a data-driven method for accurately synthesizing self-shadows on articulated and deformable geometries. Unlike previous self-shadowing approaches, we do not rely on local blocking heuristics. We do not fit a model to the BRDF-weighted visibility, but only to the visibility that changes during animation. In this manner, our model is more compact than previous techniques and requires less computation both during fitting and at run time. Our reflectance product operators can re-integrate arbitrary low-frequency view-dependent BRDF effects on the fly and are compatible with all previous dynamic visibility generation techniques as well as our own data-driven visibility model. We apply our reflectance product operators to three different visibility generation models, and our data-driven model can achieve frame rates well over 300 Hz.
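The precomputed-radiance-transfer structure this kind of method builds on can be sketched in a few lines: per-vertex radiance is a dot product of spherical-harmonic light coefficients with a transfer vector, and keeping the BRDF as a separate product matrix is what allows it to be swapped at run time. The Python below is a minimal illustration with random coefficients; the shapes and names are assumptions, not the paper's actual operators.

```python
import numpy as np

N_SH = 16                                    # 4 spherical-harmonic bands

rng = np.random.default_rng(1)
light = rng.standard_normal(N_SH)            # SH projection of the environment
visibility = rng.standard_normal(N_SH)       # per-vertex dynamic visibility
brdf_matrix = rng.standard_normal((N_SH, N_SH))  # SH product matrix for a BRDF

def shade(light, brdf_matrix, visibility):
    """Radiance = light . (BRDF x visibility). Because the BRDF lives in its
    own matrix, it can be swapped per frame without refitting the dynamic
    visibility model."""
    transfer = brdf_matrix @ visibility      # the reflectance 'product' step
    return float(light @ transfer)

print(shade(light, brdf_matrix, visibility))
```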
46.
The development of evolutionary algorithms for optimization has always been a stimulating and growing research area, with increasing demand for applying them to complex industrial optimization problems. A novel immunity-based hybrid evolutionary algorithm, Hybrid Artificial Immune Systems (HAIS), for solving both unconstrained and constrained multi-objective optimization problems is developed in this research. The algorithm adopts the clonal selection and immune suppression theories, with a sorting scheme featuring uniform crossover, multi-point mutation, non-dominance, and crowding-distance sorting to attain the Pareto-optimal front efficiently. The proposed algorithm was verified on nine benchmark functions for its global optimal search ability and compared with four optimization algorithms to assess its diversity and spread. A sensitivity analysis was also carried out to investigate the selection of the algorithm's key parameters. The developed immunity-based hybrid evolutionary algorithm provides a useful means of solving optimization problems and has been successfully applied to the global repositioning of containers, a constrained multi-objective optimization problem. HAIS will assist shipping lines with timely decision making and planning of container repositioning operations in the global container transportation business in an optimized and cost-effective manner.
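A compressed sketch of such an immune-inspired multi-objective loop appears below: clone the non-dominated antibodies, hypermutate the clones, suppress near-duplicates, and keep the Pareto front. The toy objectives, mutation width, and suppression radius are invented for illustration and are not the paper's HAIS settings.

```python
import numpy as np

def objectives(x):                      # toy bi-objective problem (minimize)
    return np.array([x[0] ** 2 + x[1] ** 2, (x[0] - 1) ** 2 + x[1] ** 2])

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)

def nondominated(pop, scores):
    keep = [i for i, s in enumerate(scores)
            if not any(dominates(t, s) for j, t in enumerate(scores) if j != i)]
    return [pop[i] for i in keep]

def hais(n=40, gens=60, clones=3, sigma=0.3, seed=0):
    rng = np.random.default_rng(seed)
    pop = list(rng.uniform(-2, 2, size=(n, 2)))
    for _ in range(gens):
        scores = [objectives(x) for x in pop]
        front = nondominated(pop, scores)       # clonal selection: best survive
        new = list(front)
        for x in front:                         # clone + hypermutate
            for _ in range(clones):
                new.append(x + rng.normal(0, sigma, size=2))
        pop = []                                # immune suppression: drop
        for x in new:                           # antibodies too close together
            if all(np.linalg.norm(x - y) > 0.05 for y in pop):
                pop.append(x)
        pop = pop[:n]
    scores = [objectives(x) for x in pop]
    return nondominated(pop, scores)

print(len(hais()), "Pareto-optimal antibodies found")
```

The suppression step is what distinguishes immune algorithms from a plain evolutionary strategy: it enforces diversity directly instead of relying only on crowding-distance sorting.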
47.
While hexahedral mesh elements are preferred by a variety of simulation techniques, constructing quality all-hex meshes of general shapes remains a challenge. An attractive hex-meshing approach, often referred to as sub-mapping, uses a low-distortion mapping between the input model and a PolyCube (a solid formed from a union of cubes) to transfer a regular hex grid from the PolyCube to the input model. Unfortunately, the construction of suitable PolyCubes and corresponding volumetric maps for arbitrary shapes remains an open problem. Our work introduces a new method for computing low-distortion volumetric PolyCube deformations of general shapes and for subsequent all-hex remeshing. For a given input model, our method simultaneously generates an appropriate PolyCube structure and a mapping between the input model and the PolyCube. From these we automatically generate good-quality all-hex meshes of complex natural and man-made shapes.
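The common first step of PolyCube construction is easy to show in code: label each surface normal with the nearest of the six axis directions. The paper's contribution is to choose labels while deforming the volume, which the toy below does not attempt; it is a sketch of the labeling step only, with made-up normals instead of a real mesh.

```python
import numpy as np

# Label each unit face normal with the closest of the six axis directions
# (+/-X, +/-Y, +/-Z), the starting point of most PolyCube methods.
AXES = np.array([[1, 0, 0], [-1, 0, 0],
                 [0, 1, 0], [0, -1, 0],
                 [0, 0, 1], [0, 0, -1]], dtype=float)

def polycube_labels(normals):
    """Index of the axis direction with maximal dot product per normal."""
    return np.argmax(normals @ AXES.T, axis=1)

normals = np.array([[0.9, 0.1, 0.42], [-0.2, -0.97, 0.1]])
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
print(polycube_labels(normals))   # [0 3] -> a +X face and a -Y face
```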
48.
Existing topology-based vector field analysis techniques rely on the ability to extract individual trajectories such as fixed points, periodic orbits, and separatrices, which are sensitive to noise and to errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretation. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structure of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCGs, while fast, are overly conservative and usually result in MCGs that are too coarse to be useful in applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically yields finer MCGs than existing techniques. Furthermore, the choice of tau provides a natural trade-off between the fineness of the MCGs and the computational cost. We provide efficient implementations of Morse decomposition based on tau-maps, including forward and backward mapping techniques and an adaptive approach to constructing better approximations of the images of the triangles in the simulation meshes. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional trade-offs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces, including engine simulation data sets.
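A toy tau-map Morse decomposition can be written directly from this description: flow sample points in each grid cell forward for time tau, connect each cell to the cells its image lands in, and take strongly connected components; nontrivial components approximate the Morse sets, and the condensation of the graph is the MCG. The Python below (using networkx) is a plain 2D illustration, not the authors' adaptive triangle-image implementation.

```python
import numpy as np
import networkx as nx

def v(p):                                     # toy field with a limit cycle
    x, y = p
    r2 = x * x + y * y
    return np.array([y + x * (1 - r2), -x + y * (1 - r2)])

def flow(p, tau, h=0.01):
    for _ in range(int(tau / h)):
        p = p + h * v(p)                      # Euler (RK4 in practice)
    return p

def morse_decomposition(n=20, lo=-2.0, hi=2.0, tau=1.0):
    w = (hi - lo) / n
    cell = lambda q: (int((q[0] - lo) // w), int((q[1] - lo) // w))
    G = nx.DiGraph()
    for i in range(n):
        for j in range(n):
            for dx in (0.25, 0.75):           # sample points inside cell (i, j)
                for dy in (0.25, 0.75):
                    p = np.array([lo + (i + dx) * w, lo + (j + dy) * w])
                    q = flow(p, tau)          # tau-map: where the cell flows to
                    if lo <= q[0] < hi and lo <= q[1] < hi:
                        G.add_edge((i, j), cell(q))
    sccs = list(nx.strongly_connected_components(G))
    morse_sets = [c for c in sccs
                  if len(c) > 1 or G.has_edge(next(iter(c)), next(iter(c)))]
    return morse_sets, nx.condensation(G)     # condensation = the MCG

sets, mcg = morse_decomposition()
print(len(sets), "Morse sets;", mcg.number_of_nodes(), "MCG nodes")
```

Raising tau lets cells map past their immediate neighbors, which is exactly the fineness-versus-cost trade-off the abstract describes.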
49.
Heart fatty acid binding protein (Fabp3) is a cytosolic protein expressed primarily in the heart, and to a lesser extent in skeletal muscle, brain, and kidney. During myocardial injury, the Fabp3 level in serum rises rapidly, making it an ideal early marker for myocardial infarction. In this study, an MS-based selected reaction monitoring method (LC-SRM) was developed for quantifying Fabp3 in rat serum. Fabp3 was first enriched with an immobilized antibody, and the protein was digested directly on the beads. A marker peptide of Fabp3 was quantified by LC-SRM against a stable isotope-labeled peptide standard. For six quality-control samples with Fabp3 ranging from 0.256 to 25 ng, the average recovery of the procedure was about 73%, and the precision (%CV) between replicates was less than 7%. Fabp3 concentrations in rat serum peaked 1 h after isoproterenol treatment and returned to baseline levels 24 h after the dose. Elevated Fabp3 levels were also detected in rats administered a PPAR α/δ agonist, which has been shown to cause skeletal muscle necrosis. Fabp3 can thus serve as a biomarker for both cardiac and skeletal muscle necrosis. Cross-validation of the LC-SRM method against an existing ELISA method is described.
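The quantification arithmetic behind an SRM assay with a stable-isotope-labeled (SIL) internal standard is straightforward to sketch: normalize the analyte peak area by the co-eluting SIL peptide's area, read the ratio against a calibration line, and compute recovery and CV from spiked QC replicates. All numbers in the Python below are invented for illustration (tuned to land near the ~73% recovery quoted above); they are not the study's data.

```python
import numpy as np

def calibrate(conc, ratio):
    """Least-squares line through (spiked concentration, area ratio)."""
    slope, intercept = np.polyfit(conc, ratio, 1)
    return slope, intercept

def quantify(area_analyte, area_sil, slope, intercept):
    """Concentration from the analyte/SIL peak-area ratio."""
    return (area_analyte / area_sil - intercept) / slope

cal_conc  = np.array([0.256, 1.0, 5.0, 12.5, 25.0])     # ng Fabp3 spiked
cal_ratio = np.array([0.021, 0.080, 0.41, 1.02, 2.05])  # analyte/SIL areas
slope, intercept = calibrate(cal_conc, cal_ratio)

qc_nominal = 5.0                       # ng spiked into each QC replicate
qc_measured = np.array([quantify(a, s, slope, intercept)
                        for a, s in [(0.30, 1.0), (0.31, 1.03), (0.29, 1.0)]])
print(f"recovery {qc_measured.mean() / qc_nominal:.0%},"
      f" CV {qc_measured.std(ddof=1) / qc_measured.mean():.1%}")
```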
50.
This paper addresses the problem of aligning multiple sequences of noncoding RNA (ncRNA) genes. We approach this problem with the biologically motivated paradigm that scoring of ncRNA alignments should be based primarily on secondary structure rather than nucleotide conservation. We introduce a novel graph-theoretic model (NLG) for analyzing algorithms based on this approach, prove that the RNA multiple alignment problem is NP-complete in this model, and present a polynomial-time algorithm that approximates the optimal structure of size S within a factor of O(log² S).