1,476 results found (search time: 15 ms)
41.
Increasingly, new regulations are governing organizations and their information systems. Individuals responsible for ensuring legal compliance and accountability currently lack sufficient guidance and support to manage their legal obligations within relevant information systems. While software controls provide assurances that business processes adhere to specific requirements, such as those derived from government regulations, there is little support to manage these requirements and their relationships to various policies and regulations. We propose a requirements management framework that enables executives, business managers, software developers and auditors to distribute legal obligations across business units and/or personnel with different roles and technical capabilities. This framework improves accountability by integrating traceability throughout the policy and requirements lifecycle. We illustrate the framework within the context of a concrete healthcare scenario in which obligations incurred from the Health Insurance Portability and Accountability Act (HIPAA) are delegated and refined into software requirements. Additionally, we show how auditing mechanisms can be integrated into the framework and how auditors can certify that specific chains of delegation and refinement decisions comply with government regulations.
42.
Evenly Spaced Streamlines for Surfaces: An Image-Based Approach   (cited 1 time: 0 self-citations, 1 by others)
We introduce a novel, automatic streamline seeding algorithm for vector fields defined on surfaces in 3D space. The algorithm generates evenly spaced streamlines quickly, simply and efficiently for any general surface-based vector field. It is general because it handles large, complex, unstructured, adaptive resolution grids with holes and discontinuities, does not require a parametrization, and can generate both sparse and dense representations of the flow. It is efficient because streamlines are only integrated for visible portions of the surface. It is simple because the image-based approach removes the need to perform streamline tracing on a triangular mesh, a process which is complicated at best. And it is fast because it makes effective, balanced use of both the CPU and the GPU. The key to the algorithm's speed, simplicity and efficiency is its image-based seeding strategy. We demonstrate our algorithm on complex, real-world simulation data sets from computational fluid dynamics and compare it with object-space streamline visualizations.
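The evenly spaced seeding idea the abstract refers to can be sketched in 2D (a minimal, pure-Python take on the classic Jobard–Lejemble strategy, not the authors' image-based surface method; the `field` callback, domain size, and spacing thresholds are all illustrative):

```python
import math

def evenly_spaced_streamlines(field, w, h, d_sep=0.5, d_test=0.25,
                              step=0.05, max_steps=400):
    """Seed streamlines so no two lie closer than d_sep.
    field(x, y) returns the (vx, vy) vector at a point in [0,w] x [0,h]."""
    placed = []  # all accepted streamline points, used for distance tests

    def too_close(x, y, d):
        return any((x - px) ** 2 + (y - py) ** 2 < d * d for px, py in placed)

    def trace(x0, y0, sign):
        # Euler integration of the normalized field; stop at the boundary,
        # at a critical point, or when we come within d_test of another line.
        pts, x, y = [], x0, y0
        for _ in range(max_steps):
            vx, vy = field(x, y)
            n = math.hypot(vx, vy)
            if n < 1e-9:
                break
            x += sign * step * vx / n
            y += sign * step * vy / n
            if not (0.0 <= x <= w and 0.0 <= y <= h) or too_close(x, y, d_test):
                break
            pts.append((x, y))
        return pts

    lines = []
    # Candidate seeds on a d_sep grid; keep only seeds far from existing lines.
    for i in range(int(w / d_sep) + 1):
        for j in range(int(h / d_sep) + 1):
            sx, sy = i * d_sep, j * d_sep
            if too_close(sx, sy, d_sep):
                continue
            line = trace(sx, sy, -1)[::-1] + [(sx, sy)] + trace(sx, sy, +1)
            lines.append(line)
            placed.extend(line)
    return lines
```

The two thresholds play the same roles as in the paper's setting: `d_sep` controls seeding density, while the smaller `d_test` lets traced lines approach each other before termination, which is what produces the even spacing.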
43.
We present a real-time relighting and shadowing method for dynamic scenes with varying lighting, view and BRDFs. Our approach is based on a compact representation of reflectance data that allows for changing the BRDF at run-time, and a data-driven method for accurately synthesizing self-shadows on articulated and deformable geometries. Unlike previous self-shadowing approaches, we do not rely on local blocking heuristics. We do not fit a model to the BRDF-weighted visibility, but rather only to the visibility that changes during animation. In this manner, our model is more compact than previous techniques and requires less computation both during fitting and at run-time. Our reflectance product operators can re-integrate arbitrary low-frequency view-dependent BRDF effects on-the-fly and are compatible with all previous dynamic visibility generation techniques as well as our own data-driven visibility model. We apply our reflectance product operators to three different visibility generation models, and our data-driven model can achieve frame rates well over 300 Hz.
44.
The development of evolutionary algorithms for optimization has always been a stimulating and growing research area, with an increasing demand for using them to solve complex industrial optimization problems. A novel immunity-based hybrid evolutionary algorithm known as Hybrid Artificial Immune Systems (HAIS) for solving both unconstrained and constrained multi-objective optimization problems is developed in this research. The algorithm adopts the clonal selection and immune suppression theories, with a sorting scheme featuring uniform crossover, multi-point mutation, non-dominance and crowding distance sorting to attain the Pareto optimal front in an efficient manner. The proposed algorithm was verified with nine benchmark functions for its global optimal search ability and compared with four optimization algorithms to assess its diversity and spread. Sensitivity analysis was also carried out to investigate the selection of the algorithm's key parameters. The developed immunity-based hybrid evolutionary algorithm provides a useful means for solving optimization problems and has been successfully applied to the problem of global repositioning of containers, which is a constrained multi-objective optimization problem. The developed HAIS will assist shipping liners in timely decision making and planning of container repositioning operations in the global container transportation business in an optimized and cost-effective manner.
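The non-dominance and crowding-distance sorting the abstract mentions can be sketched as follows (a minimal, NSGA-II-style implementation of just those two operators, assuming minimization; this is not the HAIS algorithm itself):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_fronts(points):
    """Partition objective vectors into Pareto fronts (front 0 = non-dominated set)."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

def crowding_distance(points, front):
    """Crowding distance within one front; large values mark sparse regions,
    and boundary solutions are kept by assigning them infinite distance."""
    dist = {i: 0.0 for i in front}
    n_obj = len(points[front[0]])
    for m in range(n_obj):
        ordered = sorted(front, key=lambda i: points[i][m])
        lo, hi = points[ordered[0]][m], points[ordered[-1]][m]
        dist[ordered[0]] = dist[ordered[-1]] = float("inf")
        if hi == lo:
            continue
        for k in range(1, len(ordered) - 1):
            dist[ordered[k]] += (points[ordered[k + 1]][m]
                                 - points[ordered[k - 1]][m]) / (hi - lo)
    return dist
```

In an algorithm like HAIS these two measures together rank a population: front index first, then crowding distance within a front to preserve spread along the Pareto front.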
45.
While hexahedral mesh elements are preferred by a variety of simulation techniques, constructing quality all-hex meshes of general shapes remains a challenge. An attractive hex-meshing approach, often referred to as sub-mapping, uses a low distortion mapping between the input model and a PolyCube (a solid formed from a union of cubes), to transfer a regular hex grid from the PolyCube to the input model. Unfortunately, the construction of suitable PolyCubes and corresponding volumetric maps for arbitrary shapes remains an open problem. Our work introduces a new method for computing low-distortion volumetric PolyCube deformations of general shapes and for subsequent all-hex remeshing. For a given input model, our method simultaneously generates an appropriate PolyCube structure and mapping between the input model and the PolyCube. From these we automatically generate good quality all-hex meshes of complex natural and man-made shapes.
46.
Existing topology-based vector field analysis techniques rely on the ability to extract individual trajectories such as fixed points, periodic orbits, and separatrices, which are sensitive to noise and to errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretation. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structure of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCGs, while fast, are overly conservative and usually result in MCGs that are too coarse to be useful in applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically provides finer MCGs than existing techniques. Furthermore, the choice of tau provides a natural trade-off between the fineness of the MCGs and the computational cost. We provide efficient implementations of Morse decomposition based on tau-maps, including the use of forward and backward mapping techniques and an adaptive approach to constructing better approximations of the images of the triangles in the meshes used for simulation. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional trade-offs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces, including engine simulation data sets.
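The core construction behind a Morse decomposition — take the directed graph induced by the (tau-)map, find its strongly connected components, treat the nontrivial ones as Morse sets, and connect them into an MCG — can be sketched on a toy graph (a hypothetical illustration, not the authors' implementation; for brevity the MCG edges here are direct condensation edges rather than full reachability):

```python
def strongly_connected_components(graph):
    """Tarjan's SCC algorithm; graph maps node -> list of successor nodes."""
    index, low, on_stack = {}, {}, set()
    stack, sccs, counter = [], [], [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]; counter[0] += 1
        stack.append(v); on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:  # v is the root of an SCC
            comp = []
            while True:
                w = stack.pop(); on_stack.discard(w); comp.append(w)
                if w == v:
                    break
            sccs.append(frozenset(comp))

    for v in graph:
        if v not in index:
            strongconnect(v)
    return sccs

def morse_connection_graph(graph):
    """Morse sets = SCCs containing a cycle (size > 1, or a self-loop);
    MCG edges = direct edges between distinct Morse sets in the condensation."""
    sccs = strongly_connected_components(graph)
    comp_of = {v: c for c in sccs for v in c}
    morse = [c for c in sccs
             if len(c) > 1 or any(v in graph.get(v, ()) for v in c)]
    edges = {(comp_of[u], comp_of[w])
             for u in graph for w in graph[u]
             if comp_of[u] != comp_of[w]
             and comp_of[u] in morse and comp_of[w] in morse}
    return morse, edges
```

In the paper's setting the graph nodes are mesh triangles and an edge u→w means the tau-map carries part of triangle u into triangle w; a larger tau refines the graph and hence the resulting MCG.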
47.
Heart fatty acid binding protein (Fabp3) is a cytosolic protein expressed primarily in heart, and to a lesser extent in skeletal muscle, brain, and kidney. During myocardial injury, the Fabp3 level in serum is elevated rapidly, making it an ideal early marker for myocardial infarction. In this study, an MS-based selected reaction monitoring method (LC-SRM) was developed for quantifying Fabp3 in rat serum. Fabp3 was first enriched through an immobilized antibody, and the protein was digested directly on beads. A marker peptide of Fabp3 was quantified using LC-SRM with a stable isotope-labeled peptide standard. For six quality control samples with Fabp3 ranging from 0.256 to 25 ng, the average recovery following the procedure was about 73%, and the precision (%CV) between replicates was less than 7%. The Fabp3 concentrations in rat serum peaked 1 h after isoproterenol treatment and returned to baseline levels 24 h after the dose. Elevated Fabp3 levels were also detected in rats administered a PPAR α/δ agonist, which has been shown to cause skeletal muscle necrosis. Fabp3 can thus be used as a biomarker for both cardiac and skeletal muscle necrosis. The cross-validation of the LC-SRM method with an existing ELISA method is described.
48.
This paper addresses the problem of aligning multiple sequences of noncoding RNA (ncRNA) genes. We approach this problem with the biologically motivated paradigm that scoring of ncRNA alignments should be based primarily on secondary structure rather than nucleotide conservation. We introduce a novel graph-theoretic model (NLG) for analyzing algorithms based on this approach, prove that the RNA multiple alignment problem is NP-complete in this model, and present a polynomial-time algorithm that approximates the optimal structure of size S within a factor of O(log² S).
49.
Polychronization: computation with spikes   (cited 10 times: 0 self-citations, 10 by others)
We present a minimal spiking network that can polychronize, that is, exhibit reproducible time-locked but not synchronous firing patterns with millisecond precision, as in synfire braids. The network consists of cortical spiking neurons with axonal conduction delays and spike-timing-dependent plasticity (STDP); a ready-to-use MATLAB code is included. It exhibits sleeplike oscillations, gamma (40 Hz) rhythms, conversion of firing rates to spike timings, and other interesting regimes. Due to the interplay between the delays and STDP, the spiking neurons spontaneously self-organize into groups and generate patterns of stereotypical polychronous activity. To our surprise, the number of coexisting polychronous groups far exceeds the number of neurons in the network, resulting in an unprecedented memory capacity of the system. We speculate on the significance of polychrony to the theory of neuronal group selection (TNGS, neural Darwinism), cognitive neural computations, binding and gamma rhythm, mechanisms of attention, and consciousness as "attention to memories."
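The delay/STDP interplay the abstract describes can be illustrated with a minimal pair-based STDP update (a hypothetical Python sketch, not the paper's MATLAB network; the time constants and learning rates are illustrative):

```python
import math

def stdp_dw(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Pair-based STDP weight change for dt = t_post - t_arrival (ms).
    Pre-before-post (dt > 0) potentiates; post-before-pre depresses."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

def apply_stdp(weights, delays, pre_spikes, post_spikes, w_max=10.0):
    """Update synaptic weights given spike times (ms). The axonal delay shifts
    the time at which a presynaptic spike arrives at the synapse; it is this
    shift, combined with STDP, that selects delay-matched (polychronous) paths.
    weights/delays map (pre, post) pairs; spike dicts map neuron -> spike times."""
    for syn, w in weights.items():
        arrivals = [t + delays[syn] for t in pre_spikes.get(syn[0], [])]
        for t_arr in arrivals:
            for t_post in post_spikes.get(syn[1], []):
                w += stdp_dw(t_post - t_arr)
        weights[syn] = min(max(w, 0.0), w_max)  # clip to [0, w_max]
    return weights
```

With a fixed delay, only presynaptic spikes whose *arrival* precedes the postsynaptic spike are strengthened, so repeated time-locked (not necessarily synchronous) firing patterns reinforce themselves.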
50.
Knowledge-base V&V primarily addresses the question: “Does my knowledge-base contain the right answer and can I arrive at it?” One of the main goals of our work is to properly encapsulate the knowledge representation and allow the expert to work with manageable-sized chunks of the knowledge-base. This work develops a new methodology for the verification and validation of Bayesian knowledge-bases that assists in constructing and testing such knowledge-bases. Assistance takes the form of ensuring that the knowledge is syntactically correct, correcting “imperfect” knowledge, and also identifying when the current knowledge-base is insufficient as well as suggesting ways to resolve this insufficiency. The basis of our approach is the use of probabilistic network models of knowledge. This provides a framework for formally defining and working on the problems of uncertainty in the knowledge-base.

In this paper, we examine the project which is concerned with assisting a human expert to build knowledge-based systems under uncertainty. We focus on how verification and validation are currently achieved in .
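The "syntactically correct" check the abstract mentions can be illustrated with a minimal conditional-probability-table validator (a hypothetical sketch; the `network` dictionary layout is an assumption for illustration, not the paper's knowledge representation):

```python
def check_cpts(network):
    """One syntactic V&V pass over a probabilistic knowledge-base:
    each node's conditional distribution must match its declared parent
    arity, contain no negative probabilities, and sum to 1.
    network maps node -> {'parents': [...], 'cpt': {parent_values: [p, ...]}}."""
    problems = []
    for node, spec in network.items():
        for parents_val, dist in spec['cpt'].items():
            if len(parents_val) != len(spec['parents']):
                problems.append((node, parents_val, 'wrong parent arity'))
            if abs(sum(dist) - 1.0) > 1e-9:
                problems.append((node, parents_val, 'distribution does not sum to 1'))
            if any(p < 0 for p in dist):
                problems.append((node, parents_val, 'negative probability'))
    return problems
```

Checks like these cover only the syntactic layer; the semantic side of V&V (whether the knowledge-base yields the right answers) requires testing against cases, as the abstract describes.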


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号