Full-text access type
Paid full text | 284 articles |
Free | 8 articles |
Subject classification
Chemical industry | 52 articles |
Metal processing | 9 articles |
Machinery & instruments | 9 articles |
Building science | 12 articles |
Energy & power | 4 articles |
Light industry | 9 articles |
Radio & electronics | 22 articles |
General industrial technology | 94 articles |
Metallurgical industry | 26 articles |
Atomic energy technology | 1 article |
Automation technology | 54 articles |
Publication year
2023 | 2 articles |
2022 | 3 articles |
2021 | 5 articles |
2020 | 3 articles |
2019 | 6 articles |
2018 | 4 articles |
2017 | 11 articles |
2016 | 11 articles |
2015 | 5 articles |
2014 | 7 articles |
2013 | 12 articles |
2012 | 10 articles |
2011 | 17 articles |
2010 | 10 articles |
2009 | 12 articles |
2008 | 16 articles |
2007 | 12 articles |
2006 | 11 articles |
2005 | 7 articles |
2004 | 7 articles |
2003 | 8 articles |
2002 | 6 articles |
2001 | 11 articles |
2000 | 5 articles |
1999 | 5 articles |
1998 | 10 articles |
1996 | 5 articles |
1995 | 2 articles |
1994 | 6 articles |
1993 | 3 articles |
1992 | 2 articles |
1991 | 2 articles |
1990 | 2 articles |
1989 | 3 articles |
1988 | 2 articles |
1986 | 3 articles |
1984 | 2 articles |
1982 | 3 articles |
1981 | 2 articles |
1980 | 6 articles |
1978 | 2 articles |
1977 | 2 articles |
1976 | 3 articles |
1975 | 2 articles |
1973 | 2 articles |
1966 | 2 articles |
1965 | 7 articles |
1964 | 2 articles |
1963 | 2 articles |
1962 | 2 articles |
292 query results in total (search time: 15 ms)
1.
Locke Davenport Huyer, Serena Mandla, Yufeng Wang, Scott B. Campbell, Bess Yee, Christian Euler, Benjamin F. Lai, A. Dawn Bannerman, Dawn S. Y. Lin, Miles Montgomery, Kayla Nemr, Timothy Bender, Slava Epelman, Radhakrishnan Mahadevan, Milica Radisic 《Advanced functional materials》2021,31(6):2003341
Itaconate (ITA) is an emerging powerhouse of innate immunity whose therapeutic potential is limited by the difficulty of administering it in soluble form. A library of polyester materials is developed that incorporates ITA into the polymer backbone, yielding materials with inherent immunoregulatory behavior. Harnessing ITA release through hydrolytic degradation of the polyester backbone, the ITA polymers exert mechanism-specific immunoregulatory effects on macrophage polarization in vitro. In a functional assay, the polymer-released ITA inhibits bacterial growth on acetate. Translated to an in vivo model of biomaterial-associated inflammation, intraperitoneal injection of the ITA polymers produces rapid resolution of inflammation compared with a silicone control polymer, demonstrating the value of sustained biomimetic presentation of ITA.
2.
3.
Single-assignment and functional languages have value semantics that do not permit side-effects. This lack of side-effects makes automatic detection of parallelism and optimization for data locality in programs much easier. However, the same property poses a challenge in implementing these languages efficiently. This paper describes an optimizing compiler system that solves the key problem of aggregate copy elimination. The methods developed rely exclusively on compile-time algorithms, including interprocedural analysis, that are applied to an intermediate data flow representation. By dividing the problem into update-in-place and build-in-place analysis, a small set of relatively simple techniques—edge substitution, graph pattern matching, substructure sharing and substructure targeting—was found to be very powerful. If combined properly and implemented carefully, the algorithms eliminate unnecessary copy operations to a very high degree. No run-time overhead is imposed on the compiled programs.
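The update-in-place idea behind copy elimination can be illustrated with a small Python sketch (an illustration of the general technique, not code from the paper): a pure aggregate update must copy, but when analysis proves the old version is dead, the compiler may reuse its storage.

```python
def update_copy(arr, i, v):
    """Pure value-semantics update: returns a new array, O(n) copy per update."""
    new = list(arr)  # full aggregate copy preserves the old version
    new[i] = v
    return new

def update_in_place(arr, i, v, last_use=True):
    """Optimized update: mutate in place when this is provably the last
    use of `arr` (what an update-in-place analysis would establish)."""
    if last_use:
        arr[i] = v   # destructive O(1) update, no copy
        return arr
    return update_copy(arr, i, v)

a = [0, 0, 0]
b = update_copy(a, 1, 7)      # a is unchanged: value semantics preserved
c = update_in_place(b, 2, 9)  # b is dead afterwards, so mutation is safe
```

The observable results are identical either way; only the copying cost differs, which is why the optimization imposes no run-time overhead.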
4.
Hyun-Ho Choi, Sang-Yoon Lee, Il-Yoon Choi, Hyo-Nam Cho, Sankaran Mahadevan 《Reliability Engineering & System Safety》2006,91(6):674-688
Until now, failure cause assessments in forensic reports have usually been carried out by a deterministic approach. However, such an investigation may lead to unreasonable conclusions far from the real collapse scenario, because the deterministic approach does not systematically account for the uncertainties involved in structural failures. A reliability-based failure cause assessment (reliability-based forensic engineering) methodology is developed that incorporates the uncertainties involved in structures and their failures, and it is applied to a collapsed bridge to identify the most critical failure scenario and the cause that triggered the collapse. Moreover, to save evaluation time and cost, an automated event tree analysis (ETA) algorithm is proposed that automatically calculates the failure probabilities of the failure events and the occurrence probabilities of the failure scenarios. For the reliability analysis, uncertainties are estimated more reasonably by a Bayesian approach based on the laboratory test data in the forensic report. To demonstrate its applicability, the proposed approach is applied to the Hang-ju Grand Bridge, which collapsed during construction, and compared with the deterministic approach.
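The core ETA computation can be sketched in a few lines of Python (a generic illustration, not the authors' algorithm; the event names and probabilities are made up): each scenario is one path through the event tree, and its occurrence probability is the product of the branch probabilities along that path.

```python
from itertools import product

def event_tree_scenarios(events):
    """events: list of (name, p_fail) pairs, assumed independent.
    Returns every scenario as (tuple of failed-flags, occurrence probability)."""
    scenarios = []
    for flags in product([True, False], repeat=len(events)):
        p = 1.0
        for (_, p_fail), failed in zip(events, flags):
            p *= p_fail if failed else 1.0 - p_fail  # take the branch probability
        scenarios.append((flags, p))
    return scenarios

# Hypothetical branch events with illustrative failure probabilities
events = [("pier shear failure", 0.2), ("deck collapse", 0.05)]
scenarios = event_tree_scenarios(events)
```

Enumerating scenarios this way, their probabilities sum to 1, and ranking them by probability identifies the most critical failure scenario.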
5.
6.
Extensive numerical experiments on scattering from a thin, perfectly conducting square plate have been carried out to assess the performance of the exact analytical expressions for the electromagnetic field of a rectangular patch with uniform and linear current distributions, in connection with the method of moments. Two solution schemes, employing pulse and roof-top basis functions to approximate the surface current on the plate, have been used. Convergence rates and results for the two schemes are compared with each other, as well as with an efficient solution by A.W. Glisson and D.R. Wilton (1980). The overall performance indicated by the numerical experiments suggests that the exact analytical expressions would be useful in problems where accurate computation of the field radiated by such current sources is required.
7.
Recent Advances in Hierarchical Reinforcement Learning (total citations: 16; self-citations: 0; citations by others: 16)
Reinforcement learning is bedeviled by the curse of dimensionality: the number of parameters to be learned grows exponentially with the size of any compact encoding of a state. Recent attempts to combat the curse of dimensionality have turned to principled ways of exploiting temporal abstraction, where decisions are not required at each step, but rather invoke the execution of temporally-extended activities which follow their own policies until termination. This leads naturally to hierarchical control architectures and associated learning algorithms. We review several approaches to temporal abstraction and hierarchical organization that machine learning researchers have recently developed. Common to these approaches is a reliance on the theory of semi-Markov decision processes, which we emphasize in our review. We then discuss extensions of these ideas to concurrent activities, multiagent coordination, and hierarchical memory for addressing partial observability. Concluding remarks address open challenges facing the further development of reinforcement learning in a hierarchical setting.
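The SMDP foundation mentioned above can be made concrete with a minimal sketch of the SMDP Q-learning backup (a textbook-style illustration, not from this survey; the state and option names are invented): after a temporally-extended option runs for k primitive steps, the discount is applied as gamma to the power k.

```python
def smdp_q_update(Q, s, o, reward, k, s_next, options, alpha=0.1, gamma=0.9):
    """One SMDP Q-learning backup: option o ran k primitive steps from
    state s, earned `reward` (discounted within the option), and
    terminated in s_next. Q maps (state, option) pairs to values."""
    best_next = max(Q.get((s_next, o2), 0.0) for o2 in options)
    target = reward + gamma ** k * best_next      # k-step discounting
    Q[(s, o)] = Q.get((s, o), 0.0) + alpha * (target - Q.get((s, o), 0.0))
    return Q[(s, o)]

Q = {}
options = ("go-to-door", "go-to-goal")  # hypothetical temporally-extended options
v = smdp_q_update(Q, "room1", "go-to-door", reward=1.0, k=3,
                  s_next="hall", options=options)
```

With k = 1 this reduces to ordinary Q-learning, which is why the semi-Markov theory subsumes the flat case.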
8.
Spontaneous regression of AK-5, a histiocytic tumor, is mediated by CD3-, CD8+ NK cells through ADCC. The onset of AK-5 regression is associated with the induction of a humoral immune response and the augmentation of effector function. The mechanism of tumor cell death involves both necrosis and apoptosis. Interleukin-12, a 75-kDa heterodimeric cytokine, has multiple effects on T and NK cells. We have investigated the role of IL-12 in the NK cell-mediated AK-5 tumor regression process. Subcutaneous transplantation of AK-5 tumor induced the expression of IL-12 (p35 and p40) message by Day 6-8 in the splenocytes of syngeneic rats. Similarly, analysis of serum samples from tumor-bearing animals showed the presence of circulating IL-12 around the same time. Interaction of immune cells with antibody-tagged AK-5 cells in vitro also triggered the expression of IL-12 message and protein by 3 hr. The circulating IL-12 in the sera of tumor-rejecting animals, as well as rIL-12, stimulated NK cell proliferation, the expression of CD16 and CD25, and the activation of NK cell function. These observations suggest that the ability of the AK-5 tumor to induce endogenous production of IL-12 may be responsible for keeping the NK cells constantly in an activated state, thus providing an efficient mechanism for the complete regression of the tumor.
9.
Peida Xu, Xiaoyan Su, Sankaran Mahadevan, Chenzhao Li, Yong Deng 《Applied Intelligence》2014,41(3):681-693
As an important tool for knowledge representation and decision-making under uncertainty, Dempster-Shafer evidence theory (D-S theory) has been used in many fields. The application of D-S theory is critically dependent on the availability of the basic probability assignment (BPA), and the determination of BPA is still an open issue. A non-parametric method to obtain BPA is proposed in this paper. The method can handle multi-attribute datasets in classification problems. Each attribute value of a dataset sample is treated as a stochastic quantity, and its non-parametric probability density function (PDF) is calculated from the training data; this PDF serves as the probability model for the corresponding attribute. The BPA function is then constructed from the relationship between the test sample and the probability models. Missing attribute values in datasets are treated as ignorance in the framework of the evidence theory. The method does not assume any particular distribution, so it can be flexibly used in many engineering applications. The obtained BPA can avoid high conflict between pieces of evidence, which is desirable in data fusion. Several benchmark classification problems are used to demonstrate the proposed method and to compare it against existing methods. The classifier constructed with the proposed method compares well to state-of-the-art algorithms.
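Once BPAs are obtained, fusing them uses Dempster's rule of combination, the standard combination operator of D-S theory. A minimal Python sketch (a generic illustration of the rule, not the paper's BPA-construction method; the focal elements "a" and "b" are made up):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination: m1 and m2 map frozenset focal
    elements to masses. Conflicting (disjoint) mass is discarded and
    the remainder is renormalized."""
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + a * b
        else:
            conflict += a * b  # disjoint focal elements: conflicting mass
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
m2 = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.5}
m = dempster_combine(m1, m2)
```

Here the conflicting mass is 0.3, so the surviving masses are renormalized by 0.7; high conflict makes this renormalization unstable, which is why BPAs that avoid conflict are desirable in data fusion.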
10.
Saraju P. Mohanty, Mahadevan Gomathisankaran, Elias Kougianos 《Computers & Electrical Engineering》2014
The design space for nanoscale CMOS circuits is vast, with multiple dimensions corresponding to process variability, leakage, power, thermal, reliability, security, and yield considerations. These design issues in the form of either objectives or constraints can be handled at various levels of digital design abstraction, such as architectural, logic and transistor. At the architectural level (a.k.a. Register-Transfer Level, RTL), there is a balanced degree of freedom for fast design exploration by exploring various values of design parameters. Correct design decisions at an early phase of the design cycle ensure that design errors are not propagated to lower levels of circuit abstraction, where it is costly to correct them. Moreover, design optimization at higher levels of abstraction provides a convenient way to deal with design complexity, facilitates design verification, and increases design reuse through intellectual property (IP) cores.