81.
A novel frequency-based definition of dynamic compliance is introduced within the framework of H∞-norm-based structural dynamics in the presence of load uncertainties. The system is assumed to depend on a vector of design parameters with respect to which an optimal design is pursued. A three-step worst-case-scenario procedure is then developed that finds the minimum-compliance structure accounting for the entire norm-bounded load set. Once the problem is initialized, the current worst load is found and used as input to the minimization of the structural compliance, and the procedure is repeated until convergence. Numerical examples are finally presented that deal with viscoelastic beams discretized via a truly-mixed finite-element scheme.
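The alternating worst-case iteration described above can be sketched as follows. This is a toy illustration under stated assumptions (a diagonal 2-DOF stiffness model and a small finite candidate design set, all hypothetical), not the paper's truly-mixed finite-element formulation: for a fixed design, the worst unit-norm load maximizes the compliance f·K⁻¹f, i.e. it is the dominant eigenvector of K⁻¹; the design is then updated to minimize compliance under that load.

```python
import numpy as np

def worst_load(K):
    """Worst unit-norm load maximizes compliance f @ inv(K) @ f,
    i.e. the eigenvector of inv(K) with the largest eigenvalue."""
    w, V = np.linalg.eigh(np.linalg.inv(K))
    return V[:, np.argmax(w)]

def stiffness(x):
    # Toy 2-DOF stiffness matrix parametrized by the design vector x
    return np.diag(x) + 0.1 * np.ones((2, 2))

def minimize_compliance(f, designs):
    # Pick, from a finite candidate set, the design with the smallest
    # compliance under the load f (stand-in for a continuous optimizer)
    return min(designs, key=lambda x: f @ np.linalg.solve(stiffness(x), f))

designs = [np.array([a, b]) for a in (1.0, 2.0) for b in (1.0, 2.0)]
x = designs[0]
for _ in range(10):                          # repeat until convergence
    f = worst_load(stiffness(x))             # step 1: current worst load
    x_new = minimize_compliance(f, designs)  # step 2: min-compliance design
    if np.array_equal(x_new, x):             # step 3: convergence check
        break
    x = x_new
print(x)
```

As expected, the loop settles on the stiffest candidate, since a stiffer structure has lower compliance under every norm-bounded load.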
82.
This paper proposes a generic approach for designing vulnerability testing tools for web services, which includes the definition of the testing procedure and the tool components. Based on the proposed approach, we present the design of three innovative testing tools that implement three complementary techniques (improved penetration testing, attack signatures and interface monitoring, and runtime anomaly detection) for detecting injection vulnerabilities, thus offering extensive support for different scenarios. A case study has been designed to demonstrate the tools for the particular case of SQL Injection vulnerabilities. The experimental evaluation demonstrates that the tools can be used effectively in different scenarios and that they outperform well-known commercial tools, achieving higher detection coverage and lower false-positive rates.
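The attack-signature technique mentioned above can be sketched in miniature: send injection payloads through the service interface and match the responses against known database error signatures. Everything here (payload list, signature patterns, the `fake_service` transport) is a hypothetical stand-in, not the paper's actual tool.

```python
import re

# Hypothetical error signatures that commonly leak from injectable endpoints
SQL_ERROR_SIGNATURES = [
    r"you have an error in your sql syntax",
    r"unclosed quotation mark",
    r"ora-\d{5}",
]

INJECTION_PAYLOADS = ["'", "' OR '1'='1", "'; --"]

def looks_vulnerable(response_body):
    body = response_body.lower()
    return any(re.search(sig, body) for sig in SQL_ERROR_SIGNATURES)

def probe(send_request):
    """Run each payload through a caller-supplied transport; return hits."""
    return [p for p in INJECTION_PAYLOADS if looks_vulnerable(send_request(p))]

# Stand-in for a web-service call; a real tool would issue SOAP/REST requests
def fake_service(payload):
    if "'" in payload:
        return "You have an error in your SQL syntax near '%s'" % payload
    return "ok"

print(probe(fake_service))
```

A real tool layers interface monitoring on top of this, observing the SQL commands actually reaching the database rather than relying on leaked error text alone.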
83.
Marco Tiloca Christian Gehrmann Ludwig Seitz 《International Journal of Information Security》2017,16(2):173-193
DTLS is a transport layer security protocol designed to provide secure communication over unreliable datagram protocols. Before starting to communicate, a DTLS client and server perform a specific handshake in order to establish a secure session and agree on a common security context. However, the DTLS handshake is affected by two relevant issues. First, the DTLS server is vulnerable to a specific Denial of Service (DoS) attack aimed at forcing the establishment of several half-open sessions. This may exhaust memory and network resources on the server, thus making it less responsive or even unavailable to legitimate clients. Second, although it is one of the most efficient key provisioning approaches adopted in DTLS, the pre-shared key provisioning mode does not scale well with the number of clients, may result in scalability issues on the server side, and complicates key re-provisioning in dynamic scenarios. This paper presents a single, efficient security architecture which addresses both issues by substantially limiting the impact of DoS and reducing the number of keys stored on the server side to one unit only. Our approach does not break the existing standard and does not require any additional message exchange between DTLS client and server. Our experimental results show that our approach requires less time to complete a handshake execution and consistently reduces the time a DTLS server is exposed to a DoS instance. We also show that it considerably improves a DTLS server in terms of service availability and robustness against DoS attacks.
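The standard DTLS countermeasure against the half-open-session DoS is the stateless cookie exchange (HelloVerifyRequest): the server derives a cookie from the client's address with a keyed MAC and allocates no state until the client echoes a valid cookie. A minimal sketch of that mechanism, with an assumed address-string encoding:

```python
import hashlib
import hmac
import os

SERVER_SECRET = os.urandom(32)  # per-server secret, rotated periodically

def make_cookie(client_addr):
    """Stateless DoS cookie: an HMAC over the client address, as in the
    DTLS cookie exchange; the server stores nothing per client."""
    return hmac.new(SERVER_SECRET, client_addr.encode(), hashlib.sha256).digest()

def verify_cookie(client_addr, cookie):
    # Constant-time comparison avoids leaking the MAC via timing
    return hmac.compare_digest(make_cookie(client_addr), cookie)

c = make_cookie("192.0.2.7:5684")
print(verify_cookie("192.0.2.7:5684", c))    # genuine client: allocate state
print(verify_cookie("203.0.113.9:5684", c))  # spoofed source address: reject
```

An attacker spoofing source addresses never receives the cookie, so the server commits memory only to clients that can complete a round trip.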
84.
Rodrigo Queiroz Leonardo Passos Marco Tulio Valente Claus Hunsen Sven Apel Krzysztof Czarnecki 《Software and Systems Modeling》2017,16(1):77-96
Feature annotations (e.g., code fragments guarded by #ifdef C-preprocessor directives) control code extensions related to features. Feature annotations have long been said to be undesirable. When maintaining features that control many annotations, there is a high risk of ripple effects. Also, excessive use of feature annotations leads to code clutter, hinders program comprehension, and hampers maintenance. To prevent such problems, developers should monitor the use of feature annotations, for example, by setting acceptable thresholds. Interestingly, little is known about how to extract thresholds in practice, and which values are representative for feature-related metrics. To address this issue, we analyze the statistical distribution of three feature-related metrics collected from a corpus of 20 well-known and long-lived C-preprocessor-based systems from different domains. We consider three metrics: scattering degree of feature constants, tangling degree of feature expressions, and nesting depth of preprocessor annotations. Our findings show that feature scattering is highly skewed; in 14 systems (70%), the scattering distributions match a power law, making averages and standard deviations unreliable limits. Regarding tangling and nesting, the values tend to follow a uniform distribution; although outliers exist, they have little impact on the mean, suggesting that central statistical measures are reliable thresholds for tangling and nesting. Following our findings, we then propose thresholds from our benchmark data as a basis for further investigations.
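The point about skewed distributions can be made concrete: on heavy-tailed data, a mean-plus-standard-deviation threshold is dragged upward by a few extreme values, while a high percentile stays interpretable. The scattering counts below are invented for illustration, not taken from the paper's corpus.

```python
import statistics

# Toy scattering-degree counts with a heavy tail (power-law-like shape)
scattering = [1] * 60 + [2] * 20 + [3] * 10 + [5] * 5 + [8] * 3 + [40, 90]

mean = statistics.mean(scattering)
std = statistics.pstdev(scattering)
# Mean + std is inflated by the two extreme outliers...
print(round(mean + std, 1))

# ...whereas a high percentile yields a stable, representative threshold
def percentile(data, q):
    s = sorted(data)
    k = max(0, min(len(s) - 1, round(q * (len(s) - 1))))
    return s[k]

print(percentile(scattering, 0.90))
```

Here the mean-based limit exceeds the value of 98% of the data points, while the 90th percentile lands where most systems' annotations actually sit.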
85.
Agnès Front Dominique Rieu Marco Santorum Fatemeh Movahedian 《Software and Systems Modeling》2017,16(3):691-714
A business process can be characterized by multiple perspectives (intentional, organizational, operational, functional, interactional, informational, etc.). Business process modeling must allow different stakeholders to analyze and represent process models according to these different perspectives. This representation is traditionally built using classical data acquisition methods together with a process representation language such as BPMN or UML. These techniques and specialized languages can easily become complex and time-consuming. In this paper, we propose ISEA, a participative end-user modeling approach that allows the stakeholders in a business process to collaborate in a simple way to communicate and improve the business process elicitation in an accurate and understandable manner. Our approach covers the organizational perspective of business processes, exploits the information compiled during the elicitation of the organizational perspective, and lightly touches on an interactional perspective, allowing users to create customized interface sketches to test user interface navigability and coherence within the processes. Thus, ISEA can be seen as a participative end-user modeling approach for business process elicitation and improvement.
86.
Gabriele Kern-Isberner Marco Wilhelm Christoph Beierle 《Annals of Mathematics and Artificial Intelligence》2017,79(1-3):163-179
An often-used methodology for reasoning with probabilistic conditional knowledge bases is provided by the principle of maximum entropy (the so-called MaxEnt principle), which realises the idea of assuming the least amount of information and thus of being as unbiased as possible. In this paper we exploit the fact that MaxEnt distributions can be computed by solving nonlinear equation systems that reflect the conditional logical structure of these distributions. We apply the theory of Gröbner bases, well known from computational algebra, to the polynomial system associated with a MaxEnt distribution, in order to obtain results for reasoning with maximum entropy. We develop a three-phase compilation scheme that extracts from a knowledge base consisting of probabilistic conditionals the information which is crucial for MaxEnt reasoning and transforms it into a Gröbner basis. Based on this transformation, a necessary condition for knowledge bases to be consistent is derived. Furthermore, approaches to answering MaxEnt queries are presented by demonstrating how the MaxEnt probability of a single conditional can be inferred from a given knowledge base. Finally, we discuss computational methods to establish general MaxEnt inference rules.
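The core computational move, turning a polynomial system into a lexicographic Gröbner basis so that one variable can be read off from a univariate polynomial, can be illustrated with SymPy. The two-equation system below is a made-up stand-in, not the paper's actual encoding of conditional structure.

```python
from sympy import groebner, solve, symbols

x, y = symbols('x y')

# Toy polynomial system standing in for the nonlinear equations of a
# MaxEnt distribution (hypothetical, chosen only for illustration)
system = [x * y - 2, x + y - 3]

# A lex Groebner basis triangularizes the system: the last element is
# univariate in y, so the system can be solved by back-substitution
G = groebner(system, x, y, order='lex')
print(list(G))
print(solve(list(G)[-1], y))
```

The same elimination idea underlies the compilation scheme: consistency of the knowledge base shows up as solvability of the triangularized system.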
87.
In this article we discuss artificial neural network-based fault detection and isolation (FDI) applications for robotic manipulators. The artificial neural networks (ANNs) are used for both residual generation and residual analysis. A multilayer perceptron (MLP) is employed to reproduce the dynamics of the robotic manipulator. Its outputs are compared with actual position and velocity measurements, generating the so-called residual vector. The residuals, when properly analyzed, provide an indication of the status of the robot (normal or faulty operation). Three ANN architectures are employed in the residual analysis. The first is a radial basis function network (RBFN) which uses the residuals of position and velocity to perform fault identification. The second is again an RBFN, except that it uses only the velocity residuals. The third is an MLP which also performs fault identification utilizing only the velocity residuals. The MLP is trained with the classical back-propagation algorithm and the RBFN is trained with a Kohonen self-organizing map (KSOM). We validate the concepts discussed in a thorough simulation study of a Puma 560 and with experimental results with a 3-joint planar manipulator. © 2001 John Wiley & Sons, Inc.
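The residual-generation step described above reduces to a comparison between the model's predicted motion and the measured one. A minimal sketch, assuming made-up joint-velocity values and a simple fixed threshold in place of the article's trained RBFN/MLP classifiers:

```python
import numpy as np

def residuals(predicted, measured):
    """Residual vector: difference between the dynamics model's prediction
    and the actual position/velocity measurements."""
    return measured - predicted

def fault_indicated(res_velocity, threshold=0.05):
    # Stand-in for the trained residual classifiers in the article:
    # flag a fault when any velocity residual exceeds the threshold
    return bool(np.any(np.abs(res_velocity) > threshold))

pred = np.array([0.10, 0.20, 0.30])       # predicted joint velocities
meas_ok = np.array([0.11, 0.19, 0.31])    # healthy: small model mismatch
meas_bad = np.array([0.11, 0.19, 0.45])   # faulty third joint

print(fault_indicated(residuals(pred, meas_ok)))
print(fault_indicated(residuals(pred, meas_bad)))
```

The trained networks replace the fixed threshold precisely because a single cutoff cannot separate model mismatch from genuine faults across the whole workspace.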
88.
This paper deals with four solvers for combinatorial problems: the commercial state-of-the-art solver ILOG oplstudio, and the research answer set programming (ASP) systems dlv, smodels and cmodels. The first goal of this research is to evaluate the relative performance of such systems when used in a purely declarative way, using a reproducible and extensible experimental methodology. In particular, we consider a third-party problem library, i.e., the CSPLib, and uniform rules for modelling and instance selection. The second goal is to analyze the marginal effects of popular reformulation techniques on the various solving technologies. In particular, we consider structural symmetry breaking, the adoption of global constraints, and the addition of auxiliary predicates. Finally, we evaluate, on a subset of the problems, the impact of numbers and arithmetic constraints on the different solving technologies. Results show that there is no single solver winning on all problems, and that reformulation is almost always beneficial: symmetry breaking may be a good choice, but its complexity has to be chosen carefully, taking into account the particular solver used. Global constraints often, but not always, help opl, and the addition of auxiliary predicates is usually worthwhile, especially when dealing with ASP solvers. Moreover, interesting synergies among the various modelling techniques exist.
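The effect of symmetry breaking can be shown on a toy model (hypothetical, unrelated to the CSPLib instances): for three mutually distinct variables over three values, every solution is a permutation of every other, and adding a lexicographic ordering constraint keeps exactly one representative per symmetry class.

```python
from itertools import product

def solutions(symmetry_breaking=False):
    """Enumerate assignments of 3 variables over {0,1,2} that are
    all-different, optionally with a lex-ordering symmetry breaker."""
    sols = []
    for assign in product(range(3), repeat=3):
        if len(set(assign)) != 3:                # all-different constraint
            continue
        if symmetry_breaking and list(assign) != sorted(assign):
            continue                             # discard symmetric twins
        sols.append(assign)
    return sols

print(len(solutions()))                          # full symmetric solution set
print(len(solutions(symmetry_breaking=True)))    # one representative remains
```

The paper's caveat applies even here: the ordering constraint shrinks the search space, but evaluating it has its own cost, and whether that trade pays off depends on the solver.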
89.
Dash RK Somersalo E Cabrera ME Calvetti D 《Computer methods and programs in biomedicine》2007,85(3):247-256
The reconstruction of an unknown input function from noisy measurements in a biological system is an ill-posed inverse problem. Any computational algorithm for its solution must use some kind of regularization technique to neutralize the disastrous effects of amplified noise components on the computed solution. In this paper, following a hierarchical Bayesian statistical inversion approach, we seek estimates for the input function and regularization parameter (hyperparameter) that maximize the posterior probability density function. We solve the maximization problem simultaneously for all unknowns, hyperparameter included, by a suitably chosen quasi-Newton method. The optimization approach is compared to the sampling-based Bayesian approach. We demonstrate the efficiency and robustness of the deconvolution algorithm by applying it to reconstructing the time courses of mitochondrial oxygen consumption during muscle state transitions (e.g., from resting state to contraction and recovery), from the simulated noisy output of oxygen concentration dynamics on the muscle surface. The model of oxygen transport and metabolism in skeletal muscle assumes an in vitro cylindrical structure of the muscle in which the oxygen from the surrounding oxygenated solution diffuses into the muscle and is then consumed by the muscle mitochondria. The algorithm can be applied to other deconvolution problems by suitably replacing the forward model of the system.
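For a fixed hyperparameter, the MAP estimate in a linear-Gaussian setting is a Tikhonov-regularized least-squares solution; the hierarchical approach then also optimizes over that hyperparameter. A minimal sketch of the fixed-hyperparameter inner step, with an invented smoothing kernel standing in for the oxygen-transport forward model:

```python
import numpy as np

def tikhonov_deconvolve(A, y, alpha):
    """MAP estimate for a fixed hyperparameter alpha: the minimizer of
    ||A u - y||^2 + alpha * ||u||^2, via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

rng = np.random.default_rng(0)
n = 40
# Toy causal smoothing kernel (stand-in for the oxygen diffusion model)
A = np.tril(np.exp(-0.3 * np.abs(np.subtract.outer(np.arange(n), np.arange(n)))))
u_true = (np.arange(n) > 15).astype(float)       # step input: rest -> contraction
y = A @ u_true + 0.01 * rng.standard_normal(n)   # noisy surface measurements

u_hat = tikhonov_deconvolve(A, y, alpha=1e-3)
print(np.linalg.norm(A @ u_hat - y))             # regularized fit residual
```

The hierarchical Bayesian formulation avoids hand-tuning `alpha` by treating it as an unknown and maximizing the joint posterior over `u` and `alpha` together.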
90.
The multimod application framework: a rapid application development tool for computer aided medicine
Viceconti M Zannoni C Testi D Petrone M Perticoni S Quadrani P Taddei F Imboden S Clapworthy G 《Computer methods and programs in biomedicine》2007,85(2):138-151
This paper describes a new application framework (OpenMAF) for rapid development of multimodal applications in computer-aided medicine. MAF applications are multimodal in data, in representation, and in interaction. The framework supports almost any type of biomedical data, including DICOM datasets, motion-capture recordings, and data from computer simulations (e.g., finite element modeling). The interactive visualization approach (multimodal display) helps the user interpret complex datasets by providing multiple representations of the same data. In addition, the framework allows multimodal interaction by supporting the simultaneous use of different input-output devices such as 3D trackers, stereoscopic displays, haptics hardware, and speech recognition/synthesis systems. The framework has been designed to run smoothly even on limited-power computers, but it can take advantage of all hardware capabilities. It is based on a collection of portable libraries and can be compiled on any platform that supports OpenGL, including Windows, MacOS X and any flavor of Unix/Linux.