Full-text access type
Paid full text | 2002 articles |
Free | 130 articles |
Free (domestic) | 8 articles |
Subject category
Electrical engineering | 18 articles |
General | 1 article |
Chemical industry | 455 articles |
Metalworking | 42 articles |
Machinery and instruments | 41 articles |
Building science | 147 articles |
Mining engineering | 3 articles |
Energy and power | 95 articles |
Light industry | 134 articles |
Water conservancy engineering | 16 articles |
Petroleum and natural gas | 8 articles |
Radio electronics | 159 articles |
General industrial technology | 377 articles |
Metallurgical industry | 202 articles |
Nuclear technology | 12 articles |
Automation technology | 430 articles |
Publication year
2024 | 3 articles |
2023 | 24 articles |
2022 | 64 articles |
2021 | 85 articles |
2020 | 53 articles |
2019 | 61 articles |
2018 | 64 articles |
2017 | 56 articles |
2016 | 74 articles |
2015 | 71 articles |
2014 | 75 articles |
2013 | 145 articles |
2012 | 103 articles |
2011 | 150 articles |
2010 | 92 articles |
2009 | 88 articles |
2008 | 93 articles |
2007 | 97 articles |
2006 | 96 articles |
2005 | 82 articles |
2004 | 48 articles |
2003 | 65 articles |
2002 | 52 articles |
2001 | 23 articles |
2000 | 23 articles |
1999 | 29 articles |
1998 | 43 articles |
1997 | 21 articles |
1996 | 23 articles |
1995 | 22 articles |
1994 | 31 articles |
1993 | 16 articles |
1992 | 17 articles |
1991 | 13 articles |
1990 | 9 articles |
1989 | 15 articles |
1988 | 7 articles |
1987 | 9 articles |
1986 | 9 articles |
1985 | 19 articles |
1984 | 8 articles |
1983 | 10 articles |
1982 | 6 articles |
1981 | 7 articles |
1980 | 11 articles |
1979 | 4 articles |
1978 | 4 articles |
1976 | 3 articles |
1973 | 3 articles |
1969 | 5 articles |
Sort order: 2,140 results found; search took 15 ms
51.
Raimund Kirner Jens Knoop Adrian Prantl Markus Schordan Albrecht Kadlec 《Software and Systems Modeling》2011,10(3):411-437
Worst-case execution time (WCET) analysis is concerned with computing an as-precise-as-possible bound for the maximum time the execution of a program can take. This information is indispensable for developing safety-critical real-time systems, e.g., in the avionics and automotive fields. Starting with the initial works of Chen, Mok, Puschner, Shaw, and others in the mid and late 1980s, WCET analysis turned into a well-established and vibrant field of research and development in academia and industry. The increasing number and diversity of hardware and software platforms and the ongoing rapid technological advancement became drivers for the development of a wide array of distinct methods and tools for WCET analysis. The precision, generality, and efficiency of these methods and tools depend much on the expressiveness and usability of the annotation languages that are used to describe feasible and infeasible program paths. In this article we survey the annotation languages which we consider formative for the field. By investigating and comparing their individual strengths and limitations with respect to a set of pivotal criteria, we provide a coherent overview of the state of the art. Identifying open issues, we encourage further research. In this way, our approach is orthogonal and complementary to a recent approach of Wilhelm et al., who provide a thorough survey of WCET analysis methods and tools that have been developed and used in academia and industry.
52.
The standard continuous time state space model with stochastic disturbances contains the mathematical abstraction of continuous time white noise. To work with well defined, discrete time observations, it is necessary to sample the model with care. The basic issues are well known, and have been discussed in the literature. However, the consequences have not quite penetrated the practice of estimation and identification. One example is that the standard model of an observation, being a snapshot of the current state plus noise independent of the state, cannot be reconciled with this picture. Another is that estimation and identification of time continuous models require a more careful treatment of the sampling formulas. We discuss and illustrate these issues in the current contribution. An application of particular practical importance is the estimation of models based on irregularly sampled observations.
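To illustrate why careful sampling matters, the following sketch (not taken from the paper; the function name and the restriction to a scalar system are assumptions for this illustration) computes the exact discretization of a scalar linear stochastic system and can be contrasted with the naive Euler approximation phi ≈ 1 + a*h:

```python
import math

def sample_scalar_system(a, sigma2, h):
    """Exact sampling of the scalar model dx = a*x dt + dw, Var(dw) = sigma2*dt.

    Returns (phi, q) for the discrete model x[k+1] = phi*x[k] + w[k],
    where Var(w[k]) = q, over a sampling interval of length h.
    """
    phi = math.exp(a * h)  # exact state transition, not the Euler 1 + a*h
    if abs(a) > 1e-12:
        # integrated noise variance: sigma2 * int_0^h exp(2*a*s) ds
        q = sigma2 * (math.exp(2 * a * h) - 1.0) / (2.0 * a)
    else:
        q = sigma2 * h  # limit as a -> 0
    return phi, q
```

For a = -1, h = 0.1 the exact transition is exp(-0.1) ≈ 0.9048, whereas forward Euler gives 0.9; the gap grows with h, which is one reason naive discretization distorts estimated continuous-time parameters.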
53.
Peter Lincoln Greg Welch Andrew Nashel Andrei State Adrian Ilie Henry Fuchs 《Virtual Reality》2011,15(2-3):225-238
Applications such as telepresence and training involve the display of real or synthetic humans to multiple viewers. When attempting to render the humans with conventional displays, non-verbal cues such as head pose, gaze direction, body posture, and facial expression are difficult to convey correctly to all viewers. In addition, a framed image of a human conveys only a limited physical sense of presence, primarily through the display’s location. While progress continues on articulated robots that mimic humans, the focus has been on the motion and behavior of the robots rather than on their appearance. We introduce a new approach for robotic avatars of real people: the use of cameras and projectors to capture and map both the dynamic motion and the appearance of a real person onto a humanoid animatronic model. We call these devices animatronic Shader Lamps Avatars (SLA). We present a proof-of-concept prototype comprising a camera, a tracking system, a digital projector, and a life-sized styrofoam head mounted on a pan-tilt unit. The system captures imagery of a moving, talking user and maps the appearance and motion onto the animatronic SLA, delivering a dynamic, real-time representation of the user to multiple viewers.
54.
In their recogniser forms, the Earley and RIGLR algorithms for testing whether a string can be derived from a grammar are worst-case cubic on general context-free grammars (CFGs). Earley gave an outline of a method for turning his recognisers into parsers, but it turns out that this method is incorrect. Tomita’s GLR parser returns a shared packed parse forest (SPPF) representation of all derivations of a given string from a given CFG but is worst-case unbounded polynomial order. The parser version of the RIGLR algorithm constructs Tomita-style SPPFs and thus is also worst-case unbounded polynomial order. We have given a modified worst-case cubic GLR algorithm that, for any string and any CFG, returns a binarised SPPF representation of all possible derivations of that string. In this paper we apply similar techniques to develop worst-case cubic Earley and RIGLR parsing algorithms.
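For readers unfamiliar with the recogniser form discussed above, a minimal textbook Earley recogniser can be sketched in a few lines of Python. This is the classic predict/scan/complete formulation, not the authors' cubic parser; the function name and the dict-based grammar encoding are assumptions of the sketch (and epsilon productions are not handled):

```python
def earley_recognise(grammar, start, tokens):
    """Earley recogniser: True iff `tokens` derives from `start`.

    grammar: dict mapping each nonterminal to a list of right-hand
    sides, each a tuple of symbols; symbols not in the dict are terminals.
    An Earley item is (lhs, rhs, dot, origin).
    """
    chart = [set() for _ in range(len(tokens) + 1)]
    for rhs in grammar[start]:
        chart[0].add((start, rhs, 0, 0))
    for i in range(len(tokens) + 1):
        changed = True
        while changed:  # close chart[i] under predict and complete
            changed = False
            for (lhs, rhs, dot, origin) in list(chart[i]):
                if dot < len(rhs) and rhs[dot] in grammar:   # predict
                    for prod in grammar[rhs[dot]]:
                        item = (rhs[dot], prod, 0, i)
                        if item not in chart[i]:
                            chart[i].add(item); changed = True
                elif dot == len(rhs):                        # complete
                    for (plhs, prhs, pdot, porig) in list(chart[origin]):
                        if pdot < len(prhs) and prhs[pdot] == lhs:
                            item = (plhs, prhs, pdot + 1, porig)
                            if item not in chart[i]:
                                chart[i].add(item); changed = True
        if i < len(tokens):                                  # scan
            for (lhs, rhs, dot, origin) in chart[i]:
                if dot < len(rhs) and rhs[dot] not in grammar and rhs[dot] == tokens[i]:
                    chart[i + 1].add((lhs, rhs, dot + 1, origin))
    return any(lhs == start and dot == len(rhs) and origin == 0
               for (lhs, rhs, dot, origin) in chart[len(tokens)])
```

The ambiguous grammar S → S + S | a, for example, accepts "a+a+a" with several derivations; the recogniser answers yes/no only, which is exactly the gap the SPPF-building parser versions discussed above fill.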
55.
56.
57.
Adrian Fernandez Emilio Insfran Silvia Abrahão 《Information and Software Technology》2011,53(8):789-817
Context
In recent years, many usability evaluation methods (UEMs) have been employed to evaluate Web applications. However, many of these applications still do not meet most customers’ usability expectations and many companies have folded as a result of not considering Web usability issues. No studies currently exist with regard to either the use of usability evaluation methods for the Web or the benefits they bring.
Objective
The objective of this paper is to summarize the current knowledge that is available as regards the usability evaluation methods (UEMs) that have been employed to evaluate Web applications over the last 14 years.
Method
A systematic mapping study was performed to assess the UEMs that have been used by researchers to evaluate Web applications and their relation to the Web development process. Systematic mapping studies are useful for categorizing and summarizing the existing information concerning a research question in an unbiased manner.
Results
The results show that around 39% of the papers reviewed reported the use of evaluation methods that had been specifically crafted for the Web. The results also show that the type of method most widely used was that of User Testing. The results identify several research gaps, such as the fact that around 90% of the studies applied evaluations during the implementation phase of the Web application development, which is the most costly phase in which to perform changes. A list of the UEMs that were found is also provided in order to guide novice usability practitioners.
Conclusions
From an initial set of 2703 papers, a total of 206 research papers were selected for the mapping study. The results obtained allowed us to reach conclusions concerning the state-of-the-art of UEMs for evaluating Web applications. This allowed us to identify several research gaps, which subsequently provided us with a framework in which new research activities can be more appropriately positioned, and from which useful information for novice usability practitioners can be extracted.
58.
Chvátal-Gomory cuts are among the most well-known classes of cutting planes for general integer linear programs (ILPs). In case the constraint multipliers are either 0 or 1/2, such cuts are known as {0, 1/2}-cuts. It has been proven by Caprara and Fischetti (Math. Program. 74:221–235, 1996) that separation of {0, 1/2}-cuts is NP-hard.
In this paper, we study ways to separate {0, 1/2}-cuts effectively in practice. We propose a range of preprocessing rules to reduce the size of the separation problem. The core of the preprocessing builds on a Gaussian elimination-like procedure. To separate the most violated {0, 1/2}-cut, we formulate the (reduced) problem as an integer linear program. Some simple heuristic separation routines complete the algorithmic framework.
Computational experiments on benchmark instances show that the combination of preprocessing with exact and/or heuristic separation is an effective way to generate strong generic cutting planes for integer linear programs and to reduce the overall computation times of state-of-the-art ILP solvers.
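As a concrete illustration of the cut family discussed above (this sketch is not the paper's separation algorithm; it merely applies given {0, 1/2} multipliers, and the function name is an assumption), the rounded cut for a system a_i x <= b_i with nonnegative integer x is obtained by scaling rows by 0 or 1/2, summing, and rounding both sides down:

```python
from math import floor

def zero_half_cut(rows, rhs, multipliers):
    """Build a Chvátal-Gomory cut from multipliers u_i in {0, 1/2}.

    rows: list of coefficient lists a_i; rhs: list of b_i, for a_i x <= b_i
    with x >= 0 integer. Returns (coeffs, bound) of the cut
    sum_j floor(sum_i u_i a_ij) x_j <= floor(sum_i u_i b_i).
    """
    n = len(rows[0])
    coeffs = [floor(sum(u * row[j] for u, row in zip(multipliers, rows)))
              for j in range(n)]
    bound = floor(sum(u * b for u, b in zip(multipliers, rhs)))
    return coeffs, bound
```

For the odd-cycle system x1+x2 <= 1, x2+x3 <= 1, x1+x3 <= 1 with all multipliers 1/2, the sum is x1+x2+x3 <= 1.5, and rounding yields the valid cut x1+x2+x3 <= 1, which cuts off the fractional point (1/2, 1/2, 1/2).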
59.
Guy Redding Marlon Dumas Arthur H. M. ter Hofstede Adrian Iordachescu 《Service Oriented Computing and Applications》2010,4(3):191-201
Mainstream business process modelling techniques often promote a design paradigm wherein the activities that may be performed within a case, together with their usual execution order, form the backbone on top of which other aspects are anchored. This Fordist paradigm, while effective in standardised and production-oriented domains, breaks when confronted with processes in which case-by-case variations and exceptions are the norm. We contend that the effective design of flexible processes calls for a substantially different modelling paradigm. Motivated by requirements from the human services domain, we explore the hypothesis that a framework consisting of a small set of coordination concepts, combined with established object-oriented modelling principles, provides a suitable foundation for designing highly flexible processes. Several human service delivery processes have been designed using this framework, and the resulting models have been used to realise a system to support these processes in a pilot environment.
60.
This paper constructs multirate linear multistep time discretizations based on Adams-Bashforth methods. These methods are aimed at solving conservation laws and allow different timesteps to be used in different parts of the spatial domain. The proposed family of discretizations is second order accurate in time and has conservation and linear and nonlinear stability properties under local CFL conditions. Multirate timestepping avoids the necessity to take small global timesteps, restricted by the largest value of the Courant number on the grid, and therefore results in more efficient computations. Numerical results obtained for the advection and Burgers’ equations confirm the theoretical findings.
This work was supported by the National Science Foundation through award NSF CCF-0515170.
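For reference, the base scheme underlying the multirate construction is the two-step Adams-Bashforth (AB2) method. The following single-rate sketch (an illustration only, not the paper's multirate scheme; the function name and the RK2 bootstrap choice are assumptions) shows the AB2 update y_{n+1} = y_n + h(3/2 f_n - 1/2 f_{n-1}) and its second-order convergence:

```python
def ab2(f, y0, t0, t1, n):
    """Integrate y' = f(t, y) from t0 to t1 in n steps with two-step
    Adams-Bashforth; the first step uses a midpoint RK2 bootstrap so
    that overall second-order accuracy is preserved."""
    h = (t1 - t0) / n
    t, y = t0, y0
    f_prev = f(t, y)
    # bootstrap: one second-order midpoint step
    y = y + h * f(t + h / 2.0, y + (h / 2.0) * f_prev)
    t += h
    for _ in range(n - 1):
        f_curr = f(t, y)
        y = y + h * (1.5 * f_curr - 0.5 * f_prev)  # AB2 update
        f_prev = f_curr
        t += h
    return y
```

Halving the global timestep should cut the error by roughly a factor of four; the multirate schemes in the paper obtain comparable accuracy while taking the small steps only where the local Courant number demands them.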