Paid full text: 1,567 articles
Free access: 28 articles
By subject:
Electrical Engineering: 15
Chemical Industry: 289
Metalworking: 25
Machinery & Instrumentation: 31
Building Science: 80
Mining Engineering: 17
Energy & Power: 36
Light Industry: 134
Water Resources Engineering: 7
Petroleum & Natural Gas: 9
Weapons Industry: 1
Radio & Electronics: 110
General Industrial Technology: 192
Metallurgical Industry: 392
Atomic Energy Technology: 11
Automation Technology: 246
By year:
2022: 9
2021: 9
2020: 9
2019: 11
2018: 17
2017: 15
2016: 19
2015: 26
2014: 30
2013: 90
2012: 48
2011: 73
2010: 50
2009: 74
2008: 59
2007: 63
2006: 77
2005: 66
2004: 64
2003: 49
2002: 54
2001: 45
2000: 33
1999: 33
1998: 43
1997: 43
1996: 42
1995: 26
1994: 30
1993: 28
1992: 31
1991: 12
1990: 29
1989: 14
1988: 10
1987: 15
1986: 18
1985: 24
1984: 25
1983: 22
1982: 15
1981: 13
1980: 12
1979: 14
1978: 12
1977: 13
1976: 21
1975: 10
1974: 9
1973: 10
A total of 1,595 results.
21.
As recognized precursor lesions to colorectal cancer, colorectal adenomatous polyps have been studied to enhance knowledge of colorectal cancer etiology. Although most of the known risk factors for colorectal cancer are also associated with the occurrence of colorectal adenomas, cigarette smoking has shown a strong, consistent relationship with colorectal adenomas but is generally not associated with colorectal cancer. The explanation for this paradox is unknown. With data collected in 1986-1988 during a large case-control study based on colonoscopy results in New York City, New York, the authors investigated the possibility that the paradox may arise because subjects with colorectal adenomas were included in the control groups of cancer case-control studies. The authors found a statistically significant association between heavy cigarette smoking (≥40 pack-years) and risk of adenoma (odds ratio (OR) = 1.61, 95% confidence interval (CI) 1.06-2.44). They saw no increased colorectal cancer risk from heavy cigarette smoking (OR = 1.02, 95% CI 0.52-1.99) when using a "manufactured" control group to simulate a typical unscreened, population-based control group. When the authors compared these colorectal cancer cases with an adenoma-free control group examined by colonoscopy in a polytomous model with several case groups (newly diagnosed adenomas, carcinoma in situ, intramucosal carcinoma, and colorectal cancer), they found that the risk for 20-39 pack-years of smoking was elevated, although not statistically significant, and was similar for all four case groups. The risk for the highest smoking category (≥40 pack-years) was more strongly elevated in all four case groups, although it was statistically significant only for the newly diagnosed adenoma and carcinoma in situ cases (adenomas, OR = 1.59, 95% CI 1.05-2.42; carcinoma in situ, OR = 2.05, 95% CI 1.01-4.15; intramucosal carcinoma, OR = 1.30, 95% CI 0.61-2.77; and colorectal cancer, OR = 1.30, 95% CI 0.64-2.65). While the study is weakened by the lack of statistical significance for colorectal cancer risk, these data offer some support for the hypothesis that the association between cigarette smoking and risk of colorectal cancer may have been masked by the inclusion of subjects with adenomas in the control group. They also suggest that the major effect of smoking on the colorectal adenoma-carcinoma sequence occurs in the earlier stages of adenoma formation and the development of carcinoma in situ.
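For readers who want to check figures like these, the odds ratios and confidence intervals above come from standard 2×2-table arithmetic: OR = ad/bc, with a 95% CI derived from the standard error of log(OR). A minimal sketch, using made-up cell counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for heavy smokers vs. never-smokers (not the study's data)
print(odds_ratio_ci(a=50, b=100, c=40, d=130))   # -> OR about 1.63 with its CI
```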
22.
Abstract— The current status of AC powder electroluminescent (ACPEL) displays is reviewed, with particular emphasis on color and lifetime. The printing of displays in forward and reverse architectures is discussed, along with the fabrication of ACPEL displays with interdigitated electrodes; different types of ACPEL phosphors and materials for back electrodes, transparent conducting electrodes, binders, and dielectrics are also considered. Furthermore, shape-conformable and highly flexible ACPEL displays are surveyed.
23.
This paper presents a parameter sensitivity study of the Nelder-Mead Simplex Method for unconstrained optimization. The Nelder-Mead Simplex Method is easy to implement in practice because it does not require gradient computation; however, it is very sensitive to the choice of initial points. Fan and Zahara conducted a sensitivity study using a select set of test cases and suggested the best parameter values based on the highest percentage of successful minimizations. Begambre and Laier used a strategy that controls the Particle Swarm Optimization parameters via the Nelder-Mead Simplex Method to identify structural damage. The main purpose of this paper is to extend their parameter sensitivity studies to better understand the parameters' behavior. The comprehensive sensitivity study was conducted on seven test functions, the B2, Beale, Booth, Wood, Rastrigin, Rosenbrock, and Sphere functions, searching for common patterns and relationships each parameter has in producing the optimum solution. The results show important relations among the Nelder-Mead Simplex parameters, reflection, expansion, contraction, and simplex size, and how they impact the optimum solutions. This study matters because a better understanding of the parameters' behavior can motivate current and future research using the Nelder-Mead Simplex Method to create intelligent algorithms that are more effective and efficient and that save computational time.
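For reference, the parameters the study varies map onto the coefficients of the classic algorithm. Below is a minimal Nelder-Mead sketch (a textbook variant using inside contraction only, not the exact code the study tested), where alpha, gamma, rho, and sigma are the reflection, expansion, contraction, and shrink coefficients, and step sets the initial simplex size:

```python
import numpy as np

def nelder_mead(f, x0, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5,
                step=0.1, max_iter=2000, tol=1e-8):
    """Minimal Nelder-Mead with the four tunable coefficients:
    alpha (reflection), gamma (expansion), rho (contraction),
    sigma (shrink). `step` sets the initial simplex size."""
    n = len(x0)
    simplex = [np.asarray(x0, float)]
    for i in range(n):                       # build initial simplex around x0
        v = simplex[0].copy()
        v[i] += step
        simplex.append(v)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)
        xr = centroid + alpha * (centroid - worst)       # reflection
        if f(xr) < f(best):
            xe = centroid + gamma * (xr - centroid)      # expansion
            simplex[-1] = xe if f(xe) < f(xr) else xr
        elif f(xr) < f(simplex[-2]):
            simplex[-1] = xr
        else:
            xc = centroid + rho * (worst - centroid)     # inside contraction
            if f(xc) < f(worst):
                simplex[-1] = xc
            else:                                        # shrink toward best
                simplex = [best] + [best + sigma * (v - best)
                                    for v in simplex[1:]]
    return min(simplex, key=f)

rosenbrock = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
print(nelder_mead(rosenbrock, [-1.2, 1.0]))   # should approach [1, 1]
```

Sweeping alpha, gamma, rho, sigma, and step over a grid and recording the success rate per test function reproduces the kind of sensitivity study described above.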
24.
This paper presents results on a new hybrid optimization method that combines the best features of four traditional optimization methods with an intelligent adjustment algorithm to speed convergence on unconstrained and constrained optimization problems. It is believed that this is the first time such a broad array of methods has been employed to facilitate synergistic enhancement of convergence. Particle swarm optimization is based on swarm intelligence inspired by the social behavior and movement dynamics of bird flocking, fish schooling, and swarming theory. This method has been applied to structural damage identification, neural network training, and reactive power optimization. It is also believed that this is the first time an intelligent parameter adjustment algorithm has been applied to maximize the effectiveness of individual component algorithms within a hybrid method. A comprehensive sensitivity analysis of the traditional optimization methods within the hybrid group is used to demonstrate how the relationships among the design variables in a given problem can be used to adjust algorithm parameters. The new method is benchmarked on 11 classical test functions, and the results show that it outperforms eight of the most recently published search methodologies.
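The abstract names particle swarm optimization as one of the component methods; a minimal inertia-weight PSO sketch (generic form with illustrative parameter values w, c1, and c2, not the paper's hybrid or its adjustment algorithm) looks like this:

```python
import numpy as np

def pso(f, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer: w is the inertia weight; c1 and c2
    weight the pull toward each particle's own best and the swarm's best."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, (n_particles, len(lo)))   # random initial positions
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([f(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                    # keep particles in bounds
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

sphere = lambda p: float(np.dot(p, p))
print(pso(sphere, ([-5, -5], [5, 5])))   # converges near the origin
```

On the sphere function the swarm collapses onto the origin within a few hundred iterations; w, c1, and c2 are the kind of component-algorithm parameters the paper's adjustment algorithm is described as tuning.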
25.
Given a language L and a non-deterministic finite automaton M, we consider whether we can determine efficiently (in the size of M) if M accepts at least one word in L, or infinitely many words. Given that M accepts at least one word in L, we consider how long a shortest word can be. The languages L that we examine include the palindromes, the non-palindromes, the k-powers, the non-k-powers, the powers, the non-powers (also called primitive words), the words matching a general pattern, the bordered words, and the unbordered words.
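For the simplest instance of these questions (L = Σ*, i.e., plain nonemptiness and a shortest accepted word), breadth-first search over the NFA's states already suffices; the sketch below (the encoding and the toy automaton are illustrative) returns a shortest accepted word, or None if the automaton accepts nothing. Testing against a fixed language L such as the palindromes or k-powers requires more machinery, which is what the paper analyzes.

```python
from collections import deque

def shortest_accepted_word(delta, start, accepting):
    """Shortest word accepted by an NFA, via BFS over states.
    `delta` maps (state, letter) -> set of successor states.
    BFS guarantees the first accepting state reached carries a
    shortest witness; returns None if L(M) is empty."""
    queue = deque([(start, "")])
    seen = {start}
    while queue:
        state, word = queue.popleft()
        if state in accepting:
            return word
        for (q, a), targets in delta.items():
            if q != state:
                continue
            for t in targets:
                if t not in seen:
                    seen.add(t)
                    queue.append((t, word + a))
    return None

# Toy NFA over {a, b} accepting words ending in "ab"
delta = {(0, "a"): {0, 1}, (0, "b"): {0}, (1, "b"): {2}}
print(shortest_accepted_word(delta, 0, {2}))   # -> "ab"
```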
26.
Security countermeasures help ensure the confidentiality, availability, and integrity of information systems by preventing or mitigating asset losses from cybersecurity attacks. Due to uncertainty, the financial impact of threats attacking assets is often difficult to measure quantitatively, and thus it is difficult to prescribe which countermeasures to employ. In this research, we describe a decision support system for calculating the uncertain risk faced by an organization under cyber attack as a function of uncertain threat rates, countermeasure costs, and impacts on its assets. The system uses a genetic algorithm to search for the best combination of countermeasures, allowing the user to determine the preferred tradeoff between the cost of the portfolio and the resulting risk. Data collected from manufacturing firms provide an example of results under realistic input conditions.
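As a rough illustration of the search component, the sketch below runs a genetic algorithm over a bit-vector of countermeasures with entirely hypothetical costs and risk reductions, and a simplified deterministic fitness (the paper's system models uncertain threat rates, which this toy omits):

```python
import random

# Hypothetical data: per-countermeasure cost, the fraction of expected
# loss each removes, and a baseline expected annual loss.
COSTS     = [30, 55, 20, 80, 45]
REDUCES   = [0.20, 0.35, 0.10, 0.45, 0.25]
BASE_LOSS = 500.0

def residual_risk(mask):
    risk = BASE_LOSS
    for on, r in zip(mask, REDUCES):
        if on:
            risk *= (1 - r)        # assume mitigations compose multiplicatively
    return risk

def fitness(mask, tradeoff=1.0):
    """Lower is better: portfolio cost plus weighted residual risk.
    `tradeoff` expresses the user's preferred cost/risk balance."""
    cost = sum(c for on, c in zip(mask, COSTS) if on)
    return cost + tradeoff * residual_risk(mask)

def ga(pop=40, gens=60, pmut=0.1, seed=1):
    rng = random.Random(seed)
    n = len(COSTS)
    population = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)
        parents = population[: pop // 2]           # truncation selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)              # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if rng.random() < pmut else g for g in child]
            children.append(child)
        population = parents + children
    best = min(population, key=fitness)
    return best, fitness(best)

print(ga())   # best countermeasure portfolio and its cost-plus-risk score
```

Re-running with different `tradeoff` values traces out the cost/risk frontier the decision maker chooses from.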
27.
In many clinical scenarios, medical data visualization and interaction are important for physicians exploring inner anatomical structures and extracting meaningful diagnostic information. Real-time high-quality volume rendering, artifact-free clipping, and rapid scalar-value classification are important techniques in this process. Unfortunately, in practice, it is still difficult to achieve an optimal balance among them. In this paper, we present strategies to address this issue, based on the calculation of segment-based post color attenuation and on dynamic ray–plane intersection (RPI), respectively. When implemented within our visualization system, the new classification algorithm delivers real-time performance while avoiding the "color over-accumulation" artifacts suffered by the commonly used acceleration algorithms that employ pre-integrated classification. Our new strategy achieves an optimized balance between image quality and classification speed. Next, the RPI algorithm is used with an opacity adjustment technique to effectively remove the "striping" artifacts on the clipping plane caused by nonuniform integration length. Furthermore, we present techniques for multiple transfer function (TF) based anatomical feature enhancement and a "keyhole" based endoscopic view of inner structures. Finally, the algorithms are evaluated subjectively by radiologists and compared quantitatively using image power spectrum analysis.
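The classification and over-accumulation issues above arise in the standard front-to-back compositing loop that discretizes the volume rendering integral; a minimal single-ray sketch (the generic loop, not the paper's segment-based attenuation scheme) is:

```python
import numpy as np

def composite_ray(colors, alphas):
    """Front-to-back compositing along one ray: accumulate color weighted
    by the remaining transparency until opacity saturates.
    colors: (n, 3) RGB samples along the ray; alphas: (n,) opacities."""
    out_rgb = np.zeros(3)
    out_a = 0.0
    for rgb, a in zip(colors, alphas):
        out_rgb += (1.0 - out_a) * a * np.asarray(rgb, float)
        out_a += (1.0 - out_a) * a
        if out_a > 0.99:           # early ray termination
            break
    return out_rgb, out_a

# Three samples along a toy ray: red, then green, then blue
print(composite_ray([(1, 0, 0), (0, 1, 0), (0, 0, 1)], [0.4, 0.4, 0.4]))
```

How the per-sample colors and alphas are classified from scalar values, and how the integration length varies near a clipping plane, is exactly where the paper's attenuation and RPI strategies intervene.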
28.
This paper recasts the Friesz et al. (1993) measure-theoretic model of dynamic network user equilibrium as a controlled variational inequality problem involving Riemann integrals. This restatement is made to render the model and its foundations accessible to a wider audience by removing the need for a background in functional analysis. Our exposition depends on previously unavailable necessary conditions for optimal control problems with state-dependent time lags. These necessary conditions, derived in an Appendix, are employed to show that a particular variational inequality control problem has solutions that are dynamic network user equilibria. Our analysis also shows that the use of proper flow propagation constraints obviates the need to explicitly employ the arc exit time functions that have heretofore complicated numerical implementations of the Friesz et al. (1993) model. We close by describing the computational implications of numerically determining dynamic user equilibria from formulations based on state-dependent time lags.
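For readers unfamiliar with the variational-inequality framing, the standard statement of dynamic user equilibrium in this line of work (notation assumed here, not copied from the paper) is: find path-flow trajectories $h^\ast \in \Lambda$ such that

$$\sum_{p \in P} \int_{t_0}^{t_f} \Psi_p\!\left(t, h^\ast\right)\,\bigl[\,h_p(t) - h_p^\ast(t)\,\bigr]\,dt \;\ge\; 0 \qquad \forall\, h \in \Lambda,$$

where $P$ is the set of paths, $\Lambda$ the set of feasible departure-rate (path-flow) vectors satisfying flow conservation and propagation constraints, and $\Psi_p$ the effective unit path delay operator. A feasible $h^\ast$ solves this inequality exactly when, at each departure time, flow is assigned only to paths of minimal effective delay, which is the Wardrop-style equilibrium condition the paper's control formulation captures.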
29.
The genomics, proteomics, clinical, and drug discovery laboratories have a growing need to maintain valuable samples at ultra-low (−80°C) temperatures in a validated, secure environment. Automated sample processing systems have until now required manual (off-line) storage of samples at −80°C, reducing system reliability and speed. Both of these needs are addressed by the Sample Process Management System being introduced by BIOPHILE Inc. Conventional sample management processes, such as storage, retrieval, and cataloging, are increasingly strained by growing sample populations: sample types, access requirements, and storage requirements all vary, while security and inventory procedures are implemented manually. The evolving technologies present in the laboratory cannot interface with conventional manual storage techniques. Addressing these limitations, the primary benefits of BIOPHILE's solutions are:
• Fully validated sample management process that coordinates the life-cycles of samples and their related data.
• Robotic technology to securely store and retrieve samples, improving their accessibility and stability. Thermal shock is reduced, improving sample longevity and quality. The robotic technology allows integration with larger automation systems.
• A process program to develop a Sample Management Strategy. This strategy is developed by analyzing long-term research goals, current baseline processes, and identification of current sample life cycles. A full validation documentation package can be generated, providing a high level of quality assurance.
• Improved sample visibility and quality assurance: automated sample population cataloging, plus controlled sample management access and security.
30.
Linear models of synaptic plasticity provide a useful starting-point for examining the dynamics of neuronal development and learning, but their inherent problems are well known. Models of synaptic plasticity that embrace the demands of biological realism are therefore typically nonlinear. Viewed from a more abstract perspective, nonlinear models of synaptic plasticity are a subset of nonlinear dynamical systems. As such, they may exhibit bifurcations under the variation of control parameters, including noise and errors in synaptic updates. One source of noise or error is the cross-talk that occurs during otherwise Hebbian plasticity. Under cross-talk, stimulation of a set of synapses can induce or modify plasticity in adjacent, unstimulated synapses. Here, we analyze two nonlinear models of developmental synaptic plasticity and a model of independent component analysis in the presence of a simple model of cross-talk. We show that cross-talk does indeed induce bifurcations in these models, entirely destroying their ability to acquire either developmental or learning-related patterns of fixed points. Importantly, the critical level of cross-talk required to induce bifurcations in these models is very sensitive to the statistics of the afferents' activities and the number of afferents synapsing on a postsynaptic cell. In particular, the critical level can be made arbitrarily small. Because bifurcations are inevitable in nonlinear models, our results likely apply to many nonlinear models of synaptic plasticity, although the precise details vary by model. Hence, many nonlinear models of synaptic plasticity are potentially fatally compromised by the toxic influence of cross-talk and other sources of noise and errors more generally. We conclude by arguing that biologically realistic models of synaptic plasticity must be robust against noise-induced bifurcations and that biological systems may have evolved strategies to circumvent their possible dangers.
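One way to see the kind of effect described here is to route a Hebbian update through an error matrix that leaks a fraction of each synapse's update onto its neighbors. The sketch below applies uniform cross-talk to the Hebbian term of Oja's rule (a toy model chosen for illustration, not the paper's exact formulation); as the cross-talk level e grows, the learned weight vector drifts away from the principal component that the error-free rule finds.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
C = A @ A.T / n                           # input covariance (positive semidefinite)

def crosstalk_matrix(e):
    """Leak a fraction e of each synapse's Hebbian update uniformly
    onto the other synapses (a toy error model)."""
    E = np.full((n, n), e / (n - 1))
    np.fill_diagonal(E, 1.0 - e)
    return E

def oja_with_crosstalk(e, eta=0.005, steps=40000):
    w = rng.standard_normal(n) * 0.1
    E = crosstalk_matrix(e)
    for _ in range(steps):
        x = rng.multivariate_normal(np.zeros(n), C)
        y = w @ x
        w += eta * (y * (E @ x) - y * y * w)   # Oja's rule, Hebbian term corrupted
    return w / np.linalg.norm(w)

for e in (0.0, 0.2, 0.4):
    print(e, np.round(oja_with_crosstalk(e), 2))
```

With e = 0 the averaged dynamics converge to the principal eigenvector of C; with cross-talk they track the principal eigenvector of EC instead, so sweeping e traces out how far the acquired pattern of fixed points deviates, in the spirit of the bifurcation analysis above.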