A total of 1,104 search results were found; results 11–20 are listed below.
11.
Proliferation and apoptosis of neoplastic cells are prognostic biomarkers in plasma cell neoplasms (PCNs). The prognostic capacity of the proliferation-to-apoptosis ratio (Ratio-PA) in the era of immunomodulatory treatments is re-evaluated in 316 patients with monoclonal gammopathy of undetermined significance (MGUS), 57 with smoldering multiple myeloma (SMM), and 266 with multiple myeloma (MM). Ratio-PA values of 0.77 ± 0.12, 1.94 ± 0.52, and 11.2 ± 0.7 (p < 0.0001) were observed in MGUS, SMM, and MM patients, respectively. Ten-year overall survival (10y-OS) rates for patients with low/high Ratio-PA were 93.5%/77.3% (p < 0.0001) for MGUS, 82.5%/64.7% (p < 0.05) for SMM, and 62.3%/47.0% (p < 0.05) for MM. For patients at low, intermediate, and high risk, 10y-OS rates for low/high Ratio-PA were 95.5%/72.9% (p < 0.0001), 74.2%/50.4% (p < 0.0001), and 35.3%/20.0% (p = 0.836), respectively. Ratio-PA was an independent prognostic factor for OS (HR = 2.119, p < 0.0001, Harrell C-statistic = 0.7440 ± 0.0194) when co-analyzed with sex, age, and standard risk. In patients with a high Ratio-PA, only first-line therapy with VRd/VTd, but not PAD/VCD, coupled with ASCT was associated with high 10y-OS (82.7%). Tumor-cell Ratio-PA estimated at diagnosis offers a prognostic biomarker that complements standard risk stratification and helps guide the clinical management of pre-malignant and symptomatic PCNs. Every effort should be made to provide first-line therapies including VTd or VRd together with ASCT to patients with a high Ratio-PA, who are at higher risk of progression and death.
12.
Infectious diseases caused by intestinal protozoa such as Entamoeba histolytica (E. histolytica) and Giardia lamblia (G. lamblia) are a worldwide public health issue, affecting more than 70 million people every year. These parasites colonize the intestine and primarily cause diarrhea, but the infections can lead to more serious complications. The treatment of choice, metronidazole, has come into question because of its adverse effects and emerging resistance. Therefore, new compounds against these parasites are needed. In this work, a structure-based virtual screening of FDA-approved drugs was performed to identify compounds with antiprotozoal activity. The glycolytic enzyme triosephosphate isomerase, present in both E. histolytica and G. lamblia, was used as the drug target. The compounds with the best average docking score on both structures were selected for in vitro evaluation. Three compounds, chlorhexidine, tolcapone, and imatinib, were capable of inhibiting the growth of G. lamblia trophozoites (0.05–4.935 μg/mL), while folic acid showed activity against E. histolytica (0.186 μg/mL) and G. lamblia (5.342 μg/mL).
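The selection step described in this abstract, ranking FDA-approved drugs by their average docking score against the two triosephosphate isomerase structures, can be sketched as follows. This is a minimal illustration only: the compound list and score values are hypothetical placeholders, not data from the study, and real scores would come from a docking engine.

```python
# Minimal sketch of the consensus-ranking step described above: candidate drugs
# are ranked by their average docking score over the two triosephosphate
# isomerase structures (E. histolytica and G. lamblia targets).  The scores
# below are hypothetical placeholders, not results from the paper.

# compound name -> (score vs. E. histolytica TIM, score vs. G. lamblia TIM), kcal/mol
docking_scores = {
    "chlorhexidine": (-9.1, -8.7),
    "tolcapone":     (-8.4, -8.9),
    "imatinib":      (-8.8, -8.2),
    "folic_acid":    (-7.9, -8.1),
    "aspirin":       (-5.2, -5.5),
}

def average_score(scores):
    """Average docking score over both target structures (more negative = better)."""
    return sum(scores) / len(scores)

# Sort ascending: the most negative averages correspond to the strongest
# predicted binders, which would be forwarded to in vitro evaluation.
ranking = sorted(docking_scores.items(), key=lambda item: average_score(item[1]))

for name, scores in ranking[:4]:
    print(f"{name:15s} average score = {average_score(scores):6.2f} kcal/mol")
```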
13.
In this paper, an asymptotic expansion is constructed to solve second-order differential equation systems with highly oscillatory forcing terms involving multiple frequencies. The expansion is derived in inverse powers of the oscillatory parameter, and its truncation yields a very effective method for discretizing the differential equation system in question. Numerical experiments illustrate the effectiveness of the asymptotic method in contrast to the standard Runge–Kutta method.
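The abstract does not spell out the form of the expansion; the LaTeX sketch below only shows the general shape such multi-frequency asymptotic expansions typically take, in our own notation (the symbols ω, a_m, f, and p_{m,r} are illustrative assumptions, not the authors' definitions).

```latex
% A second-order system driven by forcing that oscillates at multiples of a
% large frequency \omega,
%   y''(t) = f(t, y(t)) + \sum_{m} a_m(t)\, e^{\mathrm{i} m \omega t},
% is expanded in inverse powers of \omega, with one slowly varying envelope
% per oscillator mode:
\[
  y(t) \sim \sum_{r=0}^{\infty} \frac{1}{\omega^{r}}
  \Bigl( p_{0,r}(t) + \sum_{m \neq 0} p_{m,r}(t)\, e^{\mathrm{i} m \omega t} \Bigr).
\]
% Substituting this ansatz and matching powers of \omega turns the oscillatory
% problem into non-oscillatory equations for the envelopes p_{m,r}, so a
% truncated expansion can be integrated with step sizes independent of \omega.
```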
14.
In this paper we propose and evaluate a set of new strategies for the solution of three-dimensional separable elliptic problems on CPU–GPU platforms. The numerical solution of the system of linear equations arising when discretizing these operators often represents the most time-consuming part of larger simulation codes tackling a variety of physical situations. Incompressible fluid flows, electromagnetic problems, heat transfer, and solid mechanics simulations are just a few examples of application areas that require efficient solution strategies for this class of problems. GPU computing has emerged as an attractive alternative to conventional CPUs for many scientific applications. High speedups over CPU implementations have been reported, and this trend is expected to continue with improved programming support and tighter CPU–GPU integration. These speedups by no means imply that CPU performance is no longer critical: the conventional CPU-control/GPU-compute pattern used in many applications wastes much of the CPU's computational power. Our proposed parallel implementation of the classical cyclic reduction algorithm, used to tackle the large linear systems arising from the discretized form of the elliptic problem at hand, schedules computation on both the GPU and the CPUs in a cooperative way. The experimental results demonstrate the effectiveness of this approach.
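To make the algorithm being parallelized concrete, here is a minimal, serial NumPy sketch of classical cyclic reduction for a tridiagonal system. It only illustrates the recurrence whose per-level independence the paper exploits; it is not the authors' CPU–GPU implementation, and the function name, the size restriction n = 2^k − 1, and the test problem are our own assumptions.

```python
import numpy as np

def cyclic_reduction_tridiag(a, b, c, d):
    """Solve T x = d for a tridiagonal T with sub-diagonal a, main diagonal b
    and super-diagonal c, using serial cyclic reduction.  Assumes the number
    of unknowns is n = 2**k - 1; a[0] and c[-1] are ignored (treated as 0)."""
    a, b, c, d = (np.array(v, dtype=float) for v in (a, b, c, d))
    n = len(b)
    k = int(round(np.log2(n + 1)))
    if 2 ** k - 1 != n:
        raise ValueError("this sketch assumes n = 2**k - 1 unknowns")
    a[0] = 0.0   # couplings to ghost unknowns outside the domain
    c[-1] = 0.0

    # Forward phase: at each level, every surviving equation eliminates its two
    # nearest remaining neighbours, doubling the coupling distance.  All updates
    # within one level are independent -- the parallelism exploited on GPUs.
    for level in range(k - 1):
        half, step = 2 ** level, 2 ** (level + 1)
        for i in range(step - 1, n, step):
            alpha = -a[i] / b[i - half]
            beta = -c[i] / b[i + half]
            b[i] += alpha * c[i - half] + beta * a[i + half]
            d[i] += alpha * d[i - half] + beta * d[i + half]
            a[i] = alpha * a[i - half]
            c[i] = beta * c[i + half]

    # Backward phase: solve the single remaining equation, then recover the
    # eliminated unknowns level by level (again, independent within a level).
    x = np.zeros(n)
    x[n // 2] = d[n // 2] / b[n // 2]
    for level in range(k - 2, -1, -1):
        half, step = 2 ** level, 2 ** (level + 1)
        for i in range(half - 1, n, step):
            left = x[i - half] if i - half >= 0 else 0.0
            right = x[i + half] if i + half < n else 0.0
            x[i] = (d[i] - a[i] * left - c[i] * right) / b[i]
    return x

# Quick self-check on a small 1-D Poisson-like system (n = 2**5 - 1 = 31).
n = 31
a = -np.ones(n); c = -np.ones(n); b = 2.0 * np.ones(n)
d = np.random.default_rng(0).normal(size=n)
x = cyclic_reduction_tridiag(a, b, c, d)
T = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(T @ x, d))  # expected: True
```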
15.
Automated formal verification of security protocols has mostly focused on analyzing high-level abstract models which, however, differ significantly from real protocol implementations written in programming languages. Recently, some researchers have started investigating techniques that bring automated formal proofs closer to real implementations. This paper surveys these attempts, focusing on approaches that target the application code implementing the protocol logic, rather than the libraries implementing cryptography. In these approaches, the libraries are assumed to correctly implement given abstract models, and the aim is to derive formal proofs that, under this assumption, give assurance about the application code that implements the protocol logic. The two main approaches, model extraction and code generation, are presented, along with the main techniques adopted for each.
16.
Over the past decade, process mining has emerged as a new analytical discipline able to answer a variety of questions based on event data. Event logs have a very particular structure: events have timestamps, refer to activities and resources, and need to be correlated to form process instances. Process mining results tend to be very different from classical data mining results; for example, process discovery may yield end-to-end process models capturing different perspectives rather than decision trees or frequent patterns. A process-mining tool like ProM provides hundreds of different process mining techniques, ranging from discovery and conformance checking to filtering and prediction. Typically, a combination of techniques is needed and, for every step, there are different techniques that may be very sensitive to parameter settings. Moreover, event logs may be huge and may need to be decomposed and distributed for analysis. These aspects make it very cumbersome to analyze event logs manually; process mining should be repeatable and automated. Therefore, we propose a framework to support the analysis of process mining workflows. Existing scientific workflow systems and data mining tools are not tailored towards process mining and the artifacts used for analysis (process models and event logs). This paper structures the basic building blocks needed for process mining and describes various analysis scenarios. Based on these requirements we implemented RapidProM, a tool supporting scientific workflows for process mining. Examples illustrating the different scenarios are provided to show the feasibility of the approach.
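As a small illustration of the event-log structure the abstract refers to (timestamps, activities, resources, and correlation into process instances), the following pandas sketch groups a toy log into traces. The column names and data are invented for illustration and are not tied to ProM or RapidProM; real logs would typically be imported from XES or CSV files.

```python
import pandas as pd

# Toy event log: each row is one event with a case id, activity, resource and
# timestamp.  Column names and values are illustrative assumptions.
events = pd.DataFrame(
    {
        "case_id":   ["c1", "c2", "c1", "c2", "c1"],
        "activity":  ["register", "register", "check", "decide", "decide"],
        "resource":  ["ann", "bob", "ann", "sue", "sue"],
        "timestamp": pd.to_datetime(
            ["2024-01-01 09:00", "2024-01-01 09:05", "2024-01-01 10:00",
             "2024-01-01 11:00", "2024-01-01 12:00"]
        ),
    }
)

# Correlate events into process instances: order by time within each case and
# concatenate the activities into a trace (the basic object of process discovery).
traces = (
    events.sort_values("timestamp")
          .groupby("case_id")["activity"]
          .apply(lambda acts: tuple(acts))
)
print(traces)
# Counting identical traces yields the "variants" a discovery algorithm starts from.
print(traces.value_counts())
```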
17.
18.
Real-world applications of multivariate data analysis often stumble upon the barrier of interpretability. Simple data analysis methods are usually easy to interpret, but they risk providing poor data models. More involved methods may instead yield faithful data models, but limited interpretability. This is the case for linear and nonlinear methods for multivariate data visualization through dimensionality reduction. Even though the latter have provided some of the most exciting visualization developments, their practicality is hindered by the difficulty of explaining them in an intuitive manner. The interpretability, and therefore the practical applicability, of data visualization through nonlinear dimensionality reduction (NLDR) methods would improve if, first, we could accurately calculate the distortion introduced by these methods in the visual representation and, second, we could faithfully reintroduce this distortion into that representation. In this paper, we describe a technique for reintroducing the distortion into the visualization space of NLDR models. It is based on the concept of density-equalizing maps, or cartograms, recently developed for the representation of geographic information. We illustrate it using Generative Topographic Mapping (GTM), a nonlinear manifold learning method that can provide both multivariate data visualization and a measure of the local distortion that the model generates. Although illustrated here with GTM, the technique could easily be extended to other NLDR visualization methods, provided a local distortion measure can be calculated. It could also serve as a guiding tool for interactive data visualization.
19.
In this paper the system ACOPlan for planning with non-uniform action costs is introduced and analyzed. ACOPlan is a planner based on the ant colony optimization framework, in which a colony of planning ants searches for near-optimal solution plans with respect to an overall plan cost metric. This approach is motivated by the strong similarity between the process used by artificial ants to build solutions and the methods used by state-based planners to search for solution plans. Planning ants perform a stochastic, heuristic-based search and interact through a pheromone model. The proposed heuristic and pheromone models are presented and compared through systematic experiments on benchmark planning domains. Experiments are also provided to compare the quality of ACOPlan solution plans with that of state-of-the-art satisficing planners. The analysis of the results confirms the good performance of the Action–Action pheromone model and points out the promising performance of the novel Fuzzy–Level–Action pheromone model. The analysis also suggests general principles for designing effective pheromone models for planning and further extensions of ACOPlan to other optimization models.
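The abstract does not give ACOPlan's code; the sketch below only illustrates the generic ant-colony ingredients it mentions: a pheromone table keyed by pairs of actions (in the spirit of an "Action–Action" model), probabilistic action selection combining pheromone and heuristic information, and evaporation/deposit updates that reward cheaper plans. All names and parameter values are illustrative assumptions, not ACOPlan's actual design.

```python
import random

ALPHA, BETA = 1.0, 2.0   # pheromone vs. heuristic influence (illustrative)
RHO, Q = 0.1, 1.0        # evaporation rate and deposit scale (illustrative)

def choose_action(prev_action, applicable, pheromone, heuristic):
    """Pick the next action with probability proportional to tau^alpha * eta^beta."""
    weights = [
        pheromone.get((prev_action, act), 1.0) ** ALPHA * heuristic(act) ** BETA
        for act in applicable
    ]
    return random.choices(applicable, weights=weights, k=1)[0]

def update_pheromone(pheromone, plans_with_costs):
    """Evaporate all trails, then deposit on the action pairs of each plan,
    with cheaper plans depositing more."""
    for key in pheromone:
        pheromone[key] *= (1.0 - RHO)
    for plan, cost in plans_with_costs:
        deposit = Q / (1.0 + cost)
        for prev_act, act in zip(["<start>"] + plan[:-1], plan):
            pheromone[(prev_act, act)] = pheromone.get((prev_act, act), 1.0) + deposit

# Toy usage: one ant builds a three-step plan over dummy actions with a uniform heuristic.
pheromone = {}
plan, prev = [], "<start>"
for _ in range(3):
    act = choose_action(prev, ["a", "b", "c"], pheromone, heuristic=lambda a: 1.0)
    plan.append(act)
    prev = act
update_pheromone(pheromone, [(plan, 5.0)])
print(plan, pheromone)
```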
20.
The aim of this paper is to show how the hybridization of a multi-objective evolutionary algorithm (MOEA) and a local search method based on rough set theory is a viable alternative for obtaining a robust algorithm able to solve difficult constrained multi-objective optimization problems at a moderate computational cost. This paper extends a previously published MOEA [Hernández-Díaz AG, Santana-Quintero LV, Coello Coello C, Caballero R, Molina J. A new proposal for multi-objective optimization using differential evolution and rough set theory. In: 2006 genetic and evolutionary computation conference (GECCO’2006). Seattle, Washington, USA: ACM Press; July 2006], which was limited to unconstrained multi-objective optimization problems. Here, the main idea is to use this sort of hybrid approach to approximate the Pareto front of a constrained multi-objective optimization problem while performing a relatively low number of fitness function evaluations. Since in real-world problems the cost of evaluating the objective functions is the most significant, our underlying assumption is that, by aiming to minimize the number of such evaluations, our MOEA can be considered efficient. As in its previous version, our hybrid approach operates in two stages: in the first, a multi-objective version of differential evolution is used to generate an initial approximation of the Pareto front; in the second, rough set theory is used to improve the spread and quality of this initial approximation. To assess the performance of the proposed approach, we adopt, on the one hand, a set of standard bi-objective constrained test problems and, on the other hand, a large real-world problem with eight objective functions and 160 decision variables. The first set of problems is solved performing 10,000 fitness function evaluations, which is competitive with the number of evaluations previously reported in the specialized literature for such problems. The real-world problem is solved performing 250,000 fitness function evaluations, mainly because of its high dimensionality. Our results are compared with those generated by NSGA-II, which is a MOEA representative of the state of the art in the area.
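As a hedged illustration of the first-stage machinery mentioned in the abstract, the sketch below shows a standard DE/rand/1/bin variation operator together with a Pareto-dominance test for minimization. The parameter values, bound handling, and toy bi-objective function are our own assumptions, not the settings used in the paper, and the rough-set local search of the second stage is not shown.

```python
import numpy as np

rng = np.random.default_rng(42)

def de_rand_1_bin(population, i, F=0.5, CR=0.9, lower=0.0, upper=1.0):
    """Create a trial vector for individual i via DE/rand/1 mutation + binomial crossover."""
    n_pop, dim = population.shape
    r1, r2, r3 = rng.choice([j for j in range(n_pop) if j != i], size=3, replace=False)
    mutant = population[r1] + F * (population[r2] - population[r3])
    cross = rng.random(dim) < CR
    cross[rng.integers(dim)] = True            # guarantee at least one mutated gene
    trial = np.where(cross, mutant, population[i])
    return np.clip(trial, lower, upper)        # simple bound repair (an assumption)

def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b (minimization)."""
    return np.all(f_a <= f_b) and np.any(f_a < f_b)

# Toy bi-objective problem for illustration only.
def objectives(x):
    return np.array([x[0], 1.0 - np.sqrt(x[0]) + np.sum(x[1:])])

pop = rng.random((20, 5))
trial = de_rand_1_bin(pop, i=0)
if dominates(objectives(trial), objectives(pop[0])):
    pop[0] = trial                             # greedy replacement when the trial dominates
```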