1.
Determining the thermo-mechanical structure of the crust in seismically active regions from available geophysical and geological data is of great importance. The most important feature of intraplate earthquakes in the Indian region is that seismicity occurs throughout the entire crust. In the Latur region of India, an earthquake occurred in the upper crust. In such situations, quantifying the uncertainties in seismogenic depths becomes very important. The stochastic heat conduction equation has been solved for different sets of boundary conditions, an exponentially decreasing radiogenic heat generation, and randomness in thermal conductivity. Closed-form analytical expressions for the mean and variance of the temperature-depth distribution have been used, and an automatic formulation has been developed in Matlab for computing and plotting the thermal structure. The Matlab toolbox presented displays the controlling thermal parameters on screen directly and plots the subsurface thermal structure along with its error bounds. The software can be used to quantify the thermal structure of any given region and is applied here to the Latur earthquake region of India.
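The mean geotherm for an exponentially decreasing heat source has a standard closed form, and a first-order propagation of conductivity uncertainty gives a rough error bound. The sketch below is a minimal illustration in Python, not the Matlab toolbox itself; the parameter values (surface heat flow, conductivity, heat-production scale, conductivity spread) are illustrative assumptions, not those of the Latur study.

```python
import math

def geotherm(z, T0=10.0, qs=0.065, k=2.5, A0=2.5e-6, D=10_000.0, sigma_k=0.25):
    """Mean steady-state temperature (deg C) at depth z (m) for an
    exponentially decreasing heat-production model, plus a first-order
    estimate of the standard deviation due to random conductivity.
    All parameter values here are illustrative, not the study's."""
    qm = qs - A0 * D                       # reduced (mantle) heat flow, W/m^2
    dT = qm * z / k + A0 * D**2 / k * (1.0 - math.exp(-z / D))
    sigma_T = dT * sigma_k / k             # first-order: (T - T0) scales as 1/k
    return T0 + dT, sigma_T

for z in (0, 10_000, 20_000, 40_000):
    T, s = geotherm(z)
    print(f"z = {z/1000:5.1f} km   T = {T:7.1f} +/- {s:5.1f} deg C")
```

The first-order variance term only captures the conductivity contribution; the paper's closed-form expressions also handle boundary-condition randomness.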
2.
In this paper, we introduce geometry-dependent lighting, which allows lighting parameters to be defined independently, and possibly discrepantly, over an object or scene based on the local geometry. We present and discuss light collages, a lighting design system with geometry-dependent lights for effective feature-enhanced visualization. Our algorithm segments the objects into local surface patches and places lights that are locally consistent but globally discrepant to enhance the perception of shape. We use spherical harmonics for efficiently storing and computing light placement and assignment. We also outline a method for finding the minimal number of light sources sufficient to illuminate an object well under our globally discrepant lighting approach.
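The core idea of locally consistent but globally discrepant lighting can be sketched in a toy form: give each surface patch its own light direction, derived from that patch's normal, and shade with a Lambertian term. This is a hypothetical illustration of per-patch light assignment only, not the paper's segmentation or spherical-harmonics machinery; the `offset` parameter and patch representation are assumptions.

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def lambert(normal, light):
    """Lambertian diffuse term max(0, N . L) for unit vectors."""
    return max(0.0, sum(a * b for a, b in zip(normal, light)))

def geometry_dependent_shading(patch_normals, offset=0.3):
    """Assign each patch its own light: the patch normal rotated by
    `offset` radians about the y-axis. Lighting is thus consistent
    within a patch but discrepant across patches."""
    shades = []
    for n in patch_normals:
        nx, ny, nz = normalize(n)
        c, s = math.cos(offset), math.sin(offset)
        light = normalize((c * nx + s * nz, ny, -s * nx + c * nz))
        shades.append(lambert((nx, ny, nz), light))
    return shades

print(geometry_dependent_shading([(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]))
```

A normal parallel to the rotation axis is unchanged by the tilt (shade 1.0), while normals in the xz-plane are shaded by cos(offset), so shape variation maps directly to intensity variation.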
3.
We consider the problem of self-healing in peer-to-peer networks that are under repeated attack by an omniscient adversary. We assume that, over a sequence of rounds, an adversary either inserts a node with arbitrary connections or deletes an arbitrary node from the network. The network responds to each such change by quick “repairs,” which consist of adding or deleting a small number of edges. These repairs essentially preserve closeness of nodes after adversarial deletions, without increasing node degrees by too much, in the following sense. At any point in the algorithm, nodes v and w whose distance would have been ℓ in the graph formed by considering only the adversarial insertions (not the adversarial deletions) will be at distance at most ℓ log n in the actual graph, where n is the total number of vertices seen so far. Similarly, at any point, a node v whose degree would have been d in the graph with adversarial insertions only will have degree at most 3d in the actual graph. Our distributed data structure, which we call the Forgiving Graph, has low latency and bandwidth requirements. The Forgiving Graph improves on the Forgiving Tree distributed data structure of Hayes et al. (2008) in the following ways: 1) it ensures low stretch over all pairs of nodes, while the Forgiving Tree only ensures low diameter increase; 2) it handles both node insertions and deletions, while the Forgiving Tree only handles deletions; 3) it requires only a very simple and minimal initialization phase, while the Forgiving Tree initially requires construction of a spanning tree of the network.
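The stretch guarantee compares distances in the insertions-only graph against the actual (repaired) graph, and can be checked on toy graphs with plain BFS. This is a hypothetical verification sketch of the guarantee itself, not the Forgiving Graph repair algorithm; the graph contents and the `max_stretch` helper are illustrative.

```python
from collections import deque

def bfs_dist(adj, src):
    """Hop distances from src in an adjacency-dict graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj.get(u, ()):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def max_stretch(adj_ideal, adj_actual):
    """Largest ratio dist_actual / dist_ideal over node pairs that are
    reachable in the insertions-only ('ideal') graph."""
    worst = 0.0
    for s in adj_ideal:
        di = bfs_dist(adj_ideal, s)
        da = bfs_dist(adj_actual, s)
        for t, l in di.items():
            if t != s and t in da:
                worst = max(worst, da[t] / l)
    return worst

# Toy example: ideal graph is the single edge a-b; in the actual graph
# the pair has been rerouted through a helper node c after a repair.
ideal = {"a": ["b"], "b": ["a"]}
actual = {"a": ["c"], "b": ["c"], "c": ["a", "b"]}
print(max_stretch(ideal, actual))  # → 2.0
```

The paper's guarantee bounds this quantity by log n; the toy repair above has stretch 2 on a two-node ideal graph.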
4.
Solid waste management is increasingly becoming a challenging task for municipal authorities due to increasing waste quantities, changing waste composition, decreasing land availability for waste disposal sites, and growing awareness of the environmental risks associated with waste management facilities. The present study focuses on the optimal selection of treatment and disposal facilities, their capacity planning, and waste allocation under the uncertainty associated with long-term planning for solid waste management. The fuzzy parametric programming model is based on a multi-objective, multi-period system for integrated solid waste management planning. The model dynamically locates the facilities and allocates the waste, treating both the waste quantity and the capacity of each waste management facility as fuzzy, and thus addresses uncertainty in waste quantities and in operating capacities simultaneously. It was observed that uncertainty in waste quantity is likely to affect the planning of waste treatment/disposal facilities more than uncertainty in the capacities of the waste management facilities. The relationship between an increase in waste quantity and the increase in the total cost/risk involved in waste management is found to be nonlinear; a marginal change in waste quantity could therefore increase the total cost/risk substantially. The information obtained from the analysis of the modeling results can be used to understand how changing the priorities and objectives of planning decisions affects facility selections and waste diversions.
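Fuzzy parametric programming typically resolves fuzzy quantities through alpha-cuts: each satisfaction level alpha turns a fuzzy number into an interval, and the plan is re-evaluated across levels. The snippet below is a minimal sketch of that mechanism for a triangular fuzzy waste quantity and facility capacity; the numbers, the single-facility setting, and the `tri_cut` helper are illustrative assumptions, not the paper's multi-objective, multi-period model.

```python
def tri_cut(lo, mode, hi, alpha):
    """Alpha-cut interval [left, right] of a triangular fuzzy number
    (lo, mode, hi); alpha=1 collapses to the modal value."""
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

unit_cost = 40.0  # $/tonne, illustrative
for alpha in (0.0, 0.5, 1.0):
    # fuzzy yearly waste quantity and fuzzy landfill capacity (tonnes)
    _, w_hi = tri_cut(90_000, 100_000, 120_000, alpha)
    _, c_hi = tri_cut(95_000, 110_000, 115_000, alpha)
    feasible = w_hi <= c_hi  # worst-case waste vs best-case capacity
    print(f"alpha={alpha:3.1f}  waste<={w_hi:9,.0f}  "
          f"cost<=${unit_cost * w_hi:11,.0f}  feasible={feasible}")
```

Sweeping alpha exposes the nonlinearity the abstract describes: the worst-case cost bound shrinks as alpha rises, and feasibility can flip at intermediate satisfaction levels.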
5.
Tomorrow's naval electrical systems will differ greatly from today's. Power electronics strongly influences the evolution of shipboard systems, including propulsion, power distribution, backup power, sonar, and radar. Emerging materials, devices, and system concepts (such as wide-bandgap semiconductor materials, silicon carbide power semiconductor devices, power electronics building blocks (PEBB), and integrated power systems) are making, and will continue to make, future naval systems as different from today's as internal-combustion ships were from steamships. These enabling technologies and their associated concepts, however, are not yet widely known and can be difficult to understand. This paper introduces these new concepts and technologies, points out their potential impact, and presents new design approaches to advance the development of naval electrical systems.
6.
7.
The addition of a small amount of high-molecular-weight polymer to a solvent can decrease friction losses by approximately 80%. This phenomenon, known as drag reduction (DR), is used extensively in oil recovery during hydraulic fracturing and in many other applications to reduce pumping costs. However, because of their long chain length, these polymers adsorb on the reservoir surface, diminishing the effectiveness of fracking. In the current study, a thermo-responsive polymer, poly(N-isopropylacrylamide) (PNIPAM), is investigated as a drag-reducing agent (DRA); it collapses reversibly above 33 °C, its lower critical solution temperature (LCST), which prevents adsorption beyond this temperature. Free-radical polymerization was used to synthesize the PNIPAM, and a Taylor–Couette (TC) setup with a rotating inner cylinder was used to measure the DR. The effects of concentration, Reynolds number (Re), and temperature on DR were studied, and a maximum of 50% DR was observed at a concentration of 400 ppm. PNIPAM showed a significant decrease in DR beyond the LCST, validating its thermo-responsive nature, which could be beneficial for DR in oil recovery or in providing a control modality for DR technologies. [Graphical abstract: DR versus temperature for a PNIPAM solution (500 ppm) at Re = 100,000, demonstrating responsive behavior with temperature.] © 2016 Wiley Periodicals, Inc. J. Appl. Polym. Sci. 2016, 133, 44191.
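In a Taylor–Couette cell, percent drag reduction is conventionally computed from the inner-cylinder torque (or wall shear stress) measured at the same Re for pure solvent and for the polymer solution. The abstract does not state its formula, so the torque-ratio definition below is an assumption based on that convention.

```python
def drag_reduction(torque_solvent, torque_polymer):
    """Percent drag reduction at fixed Re from inner-cylinder torques:
    DR% = 100 * (T_solvent - T_polymer) / T_solvent.
    Assumed convention; the study's exact measurand is not given."""
    if torque_solvent <= 0:
        raise ValueError("solvent torque must be positive")
    return 100.0 * (torque_solvent - torque_polymer) / torque_solvent

# A polymer solution needing half the solvent torque gives 50% DR,
# matching the maximum the study reports at 400 ppm.
print(drag_reduction(2.0, 1.0))  # → 50.0
```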
8.
The development of enabling mass spectrometry platforms for the quantification of diverse lipid species in human urine is of paramount importance for understanding metabolic homeostasis in normal and pathophysiological conditions. Urine is a non-invasive biofluid that can capture distinct differences in an individual's physiological status. However, there is currently a lack of quantitative workflows for high-throughput lipidomic analysis. This study describes the development of an MS/MSALL shotgun lipidomic workflow and a micro liquid chromatography–high-resolution tandem mass spectrometry (LC–MS/MS) workflow for urine structural and mediator lipid analysis, respectively. The workflow was deployed to characterize biofluid sample handling and collection, extraction efficiency, and natural human variation over time. Using 0.5 mL of urine for structural lipidomic analysis resulted in reproducible quantification of more than 600 lipid molecular species from over 20 lipid classes. Analysis of 1 mL of urine routinely quantified in excess of 55 mediator lipid metabolites comprising octadecanoids, eicosanoids, and docosanoids generated by lipoxygenase, cyclooxygenase, and cytochrome P450 activities. In summary, the high-throughput functional lipidomics workflow described in this study demonstrates a robustness and reproducibility suitable for population health and precision medicine applications.
9.
A general theory of the transmission disequilibrium test (TDT) for two linked flanking marker loci used in interval mapping of a disease gene with an arbitrary mode of inheritance, based on the genotypic relative risk model, is presented from first principles. The expectations of all cells of the contingency table formed by the four marker haplotypes (transmitted vs. not transmitted) are derived. Although algebraic details of the six possible linkage tests are given, only the test involving doubly heterozygous parents is considered in detail. Based on a test of symmetry of a square contingency table, chi-square tests are proposed for the null hypothesis of no linkage between the markers and the disease gene. The power of the tests is discussed in terms of the corresponding non-centrality parameters for each of the four modes of inheritance: additive, recessive, dominant, and multiplicative. Sample sizes required for 80% power at the 0.05 significance level have also been computed in each case. Results are presented both for the case in which the marker pair is at the disease susceptibility locus and for the case in which it is not. In addition to the marker gene frequencies, recombination probabilities, and various association parameters, the results are found to depend on a composite parameter involving the genotypic relative risk of the homozygous disease genotype and the disease gene frequency, rather than on these constituents individually. Power generally increases as the recombination probability decreases, though the magnitude of the effect differs across modes of inheritance. The additive and multiplicative modes of inheritance generally give almost identical sample sizes. Sample sizes are higher when the marker haplotype is not at the disease susceptibility locus than when it is, indicating a loss of power in the former case. They are nevertheless lower than the sample sizes required in the single-marker case, demonstrating the superiority of adopting two marker loci for the transmission disequilibrium test. The use of linkage information between the markers appears to improve matters when this strategy is adopted for disease gene identification. Computations of the sample sizes required for 80% power at the significance level of 5×10⁻⁸, used in TDT for fine mapping and genome-wide association studies, indicate that the required sample sizes can be several times larger than those at the traditional 0.05 level.
10.