1,901 query results found.
41.
42.
Acetaminophen (paracetamol) is available in a wide range of oral formulations designed to meet the needs of the population across the age spectrum, but for people with impaired swallowing (dysphagia), both solid and liquid medications can be difficult to swallow without modification. The effect of a commercial polysaccharide thickener, designed to be added to fluids to promote safe swallowing by dysphagic patients, on rheology and acetaminophen dissolution was tested using crushed immediate-release tablets in water, effervescent tablets in water, an elixir and a suspension. The thickener, composed of xanthan gum and maltodextrin, had a considerable impact on dissolution: acetaminophen release from the modified medications reached only 12–50% in 30 min, failing the pharmacopoeial specification for immediate-release preparations. Flow curves reflect the high zero-shear viscosity and apparent yield stress of the thickened products. The weak-gel nature, in combination with high G′ values relative to G″ (viscoelasticity) and high apparent yield stress, impedes drug release. The restriction on drug release from these formulations is not influenced by the theoretical state of the drug (dissolved or dispersed), and the approach typically used in clinical practice (mixing crushed tablets into pre-prepared thickened fluid) cannot be improved by altering the order of incorporation or the mixing method.
43.

Context

Information system development (ISD) has been plagued by high failure rates. This is partly because its activities combine technical and social processes involving stakeholders with conflicting interests.

Objective

Existing software risk management theories and frameworks offer limited suggestions for actions that can be taken to reduce the chance of failure of ISD projects. Our objective is to examine the connections among some of the more important user-related risks in order to shed light on how specific strategies enhance the chance of project success.

Method

We surveyed a sample of information systems project managers to test a multivariate model explaining the impact of pursuing a partnership with users on the conflicts that arise between users and developers, on role ambiguity, and on subsequent project performance.

Results

The proposed model was supported, suggesting that user–developer conflict and role ambiguity increase performance-estimation difficulty, which in turn negatively affects project performance.

Conclusion

Pursuit of project partnering yields a number of significant relationships in the model, indicating that an organization can implement practices that reduce the risks associated with role ambiguity and conflict in system development projects.
44.
45.
On modern architectures, the performance of 32-bit operations is often at least twice as fast as the performance of 64-bit operations. By using a combination of 32-bit and 64-bit floating point arithmetic, the performance of many dense and sparse linear algebra algorithms can be significantly enhanced while maintaining the 64-bit accuracy of the resulting solution. The approach presented here applies not only to conventional processors but also to other technologies such as Field Programmable Gate Arrays (FPGAs), Graphics Processing Units (GPUs), and the STI Cell BE processor. Results on modern processor architectures and the STI Cell BE are presented.

Program summary

Program title: ITER-REF
Catalogue identifier: AECO_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECO_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 7211
No. of bytes in distributed program, including test data, etc.: 41 862
Distribution format: tar.gz
Programming language: FORTRAN 77
Computer: desktop, server
Operating system: Unix/Linux
RAM: 512 Mbytes
Classification: 4.8
External routines: BLAS (optional)
Nature of problem: On modern architectures, the performance of 32-bit operations is often at least twice as fast as the performance of 64-bit operations. By using a combination of 32-bit and 64-bit floating point arithmetic, the performance of many dense and sparse linear algebra algorithms can be significantly enhanced while maintaining the 64-bit accuracy of the resulting solution.
Solution method: Mixed-precision algorithms stem from the observation that, in many cases, a single-precision solution of a problem can be refined to the point where double-precision accuracy is achieved. A common approach to the solution of linear systems, either dense or sparse, is to perform the LU factorization of the coefficient matrix using Gaussian elimination. First, the coefficient matrix A is factored into the product of a lower triangular matrix L and an upper triangular matrix U. Partial row pivoting is generally used to improve numerical stability, resulting in a factorization PA = LU, where P is a permutation matrix. The solution for the system is achieved by first solving Ly = Pb (forward substitution) and then solving Ux = y (backward substitution). Due to round-off errors, the computed solution, x, carries a numerical error magnified by the condition number of the coefficient matrix A. To improve the computed solution, an iterative process can be applied that produces a correction at each iteration; this yields the method commonly known as the iterative refinement algorithm. Provided that the system is not too ill-conditioned, the algorithm produces a solution correct to the working precision.
Running time: seconds/minutes
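The solution method above can be sketched in a few lines. The following is an illustrative mixed-precision iterative-refinement sketch in Python/NumPy, not the distributed FORTRAN 77 code; for brevity it re-solves against the single-precision matrix instead of reusing an explicit LU factorization.

```python
import numpy as np

def mixed_precision_solve(A, b, tol=1e-12, max_iter=50):
    """Mixed-precision iterative refinement: solve in float32,
    then refine the residual in float64 until double accuracy."""
    A32 = A.astype(np.float32)
    # Initial single-precision solve (production code would LU-factor A32
    # once and reuse the factors for every correction solve).
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(max_iter):
        r = b - A @ x                      # residual in double precision
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        # Correction solve in single precision, accumulated in double.
        c = np.linalg.solve(A32, r.astype(np.float32)).astype(np.float64)
        x += c
    return x
```

A production implementation factors the single-precision matrix once (PA = LU) and reuses the factors for every correction solve; that reuse of cheap 32-bit factors is where the speed advantage comes from.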
46.
Motorcycle protective clothing can be uncomfortably hot during summer, and this experiment was designed to evaluate the physiological significance of that burden. Twelve males participated in four 90-min trials (cycling at 30 W) across three environments (25, 30 and 35 °C; all 40% relative humidity). Clothing was modified between full and minimal injury protection. Both ensembles were tested at 25 °C, with only the more protective ensemble investigated at 30 and 35 °C. At 35 °C, auditory canal temperature rose at 0.02 °C min⁻¹ (SD 0.005), deviating from all other trials (p < 0.05). The thresholds for moderate (>38.5 °C) and profound hyperthermia (>40.0 °C) were predicted to occur within 105 min (SD 20.6) and 180 min (SD 33.0), respectively. Profound hyperthermia might eventuate in ~10 h at 30 °C, but should not occur at 25 °C. These outcomes demonstrate a need to enhance the heat-dissipation capabilities of motorcycle clothing designed for summer use in hot climates, without compromising impact protection.

Practitioner’s Summary:

Motorcycle protective clothing can be uncomfortably hot during summer. This experiment was designed to evaluate the physiological significance of this burden across climatic states. In the heat, moderate (>38.5 °C) and profound hyperthermia (>40.0 °C) were predicted to occur within 105 and 180 min, respectively.
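The threshold predictions above amount to extrapolating a measured heating rate to a target core temperature. A minimal sketch of that arithmetic (the 37.0 °C baseline below is an assumed illustrative value, not a figure from the study; the study's 105-min prediction reflects the measured, non-linear response):

```python
def minutes_to_threshold(baseline_c, rate_c_per_min, threshold_c):
    """Linear extrapolation: minutes until core temperature reaches a
    threshold, given a starting temperature and a constant heating rate."""
    return (threshold_c - baseline_c) / rate_c_per_min

# e.g. from an assumed 37.0 °C baseline at the reported 0.02 °C/min rise,
# the moderate-hyperthermia threshold (38.5 °C) is reached in 75 min
```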

47.
In early or preparatory design stages, an architect or designer sketches out rough ideas, not only about the object or structure being considered, but also about its relation to its spatial context. This is an iterative process, where the sketches are the primary means not only for testing and refining ideas, but also for communicating among a design team and to clients. Hence, sketching is the preferred medium for artists and designers during the early stages of design, albeit with a major drawback: sketches are 2D, and effects such as view perturbations or object movement are not supported, thereby inhibiting the design process. We present an interactive system that allows for the creation of a 3D abstraction of a designed space, built primarily by sketching in 2D within the context of an anchoring design or photograph. The system is progressive in the sense that the interpretations are refined as the user continues sketching. As a key technical enabler, we reformulate the sketch interpretation process as a selection optimization from a set of context-generated canvas planes in order to retrieve a regular arrangement of planes. We demonstrate our system (available at http:/geometry.cs.ucl.ac.uk/projects/2016/smartcanvas/ ) with a wide range of sketches and design studies.
48.
We define the notion of a controlled hybrid language that allows information sharing and interaction between a controlled natural language (specified by a context-free grammar) and a controlled visual language (specified by a symbol-relation grammar). We present the controlled hybrid language INAUT, used to represent nautical charts of the French Naval Hydrographic and Oceanographic Service (SHOM) and their companion texts (Instructions nautiques).
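To make "specified by a context-free grammar" concrete, here is a toy controlled-language recognizer. The grammar below is invented purely for illustration (a nautical-flavoured phrase language); it is not INAUT's actual grammar.

```python
# Toy context-free grammar: non-terminals map to lists of productions,
# each production being a sequence of terminals and/or non-terminals.
GRAMMAR = {
    "SENTENCE":  [["OBJECT", "lies", "DIRECTION", "of", "OBJECT"]],
    "OBJECT":    [["the", "NOUN"]],
    "NOUN":      [["buoy"], ["lighthouse"], ["wreck"]],
    "DIRECTION": [["north"], ["south"], ["east"], ["west"]],
}

def parse(symbols, tokens):
    """True if the sequence of grammar symbols derives exactly `tokens`
    (naive recursive-descent recognition, fine for a tiny grammar)."""
    if not symbols:
        return not tokens
    head, rest = symbols[0], list(symbols[1:])
    if head in GRAMMAR:  # non-terminal: try each production in turn
        return any(parse(prod + rest, tokens) for prod in GRAMMAR[head])
    # terminal: must match the next token exactly
    return bool(tokens) and tokens[0] == head and parse(rest, tokens[1:])

print(parse(["SENTENCE"], "the buoy lies north of the wreck".split()))  # True
```

A controlled language restricts authors to exactly the sentences such a grammar derives, which is what makes the text side of a hybrid language machine-interpretable.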
49.
50.
Statistical tests are often performed to discover which experimental variables are reacting to specific treatments. Time-series statistical models usually require the researcher to make assumptions about the distribution of measured responses which may not hold. Randomization tests can instead be applied to the data to generate null distributions non-parametrically. However, large numbers of randomizations are required to obtain the precise p-values needed to control false discovery rates. When testing tens of thousands of variables (genes, chemical compounds, or otherwise), significant q-value cutoffs can be extremely small (on the order of 10⁻⁵ to 10⁻⁸). This requires high-precision p-values, which in turn require large numbers of randomizations. The NVIDIA Compute Unified Device Architecture (CUDA) platform for general-purpose programming on the graphics processing unit (GPGPU) was used to implement an application that performs high-precision randomization tests via Monte Carlo sampling, for quickly screening custom test statistics in experiments with large numbers of variables, such as microarrays, next-generation sequencing read counts, chromatographic signals, or other abundance measurements. The software has been shown to achieve a speedup of more than 12-fold on a Graphics Processing Unit (GPU) compared to a powerful Central Processing Unit (CPU). The main limitation is concurrent random access of shared memory on the GPU. The software is available from the authors.
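The randomization-test core is straightforward to sketch on the CPU; the paper's contribution is running many such shuffles in parallel on the GPU. A minimal NumPy version for a two-sample difference-in-means statistic (the function name and the add-one p-value correction are illustrative choices, not the authors' code):

```python
import numpy as np

def permutation_pvalue(x, y, n_perm=10_000, rng=None):
    """Two-sample permutation test of the difference in means.
    Builds the null distribution non-parametrically by shuffling
    group labels, as in a randomization test."""
    rng = np.random.default_rng(rng)
    observed = x.mean() - y.mean()
    pooled = np.concatenate([x, y])
    n = len(x)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                      # random relabelling
        diff = pooled[:n].mean() - pooled[n:].mean()
        if abs(diff) >= abs(observed):           # two-sided test
            count += 1
    # Add-one correction keeps the estimate strictly positive.
    return (count + 1) / (n_perm + 1)
```

The add-one correction matters in exactly the regime the abstract describes: very small q-value cutoffs demand p-values that are never reported as an impossible zero, which in turn demands huge `n_perm`, and that is where GPU throughput pays off.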