61.
James A. Kimber Sergei G. Kazarian František Štěpánek 《Chemical engineering science》2012,69(1):394-403
This work presents a novel use of the Discrete Element Method (DEM) combined with inter-particle mass transfer to simulate polymer swelling and dissolution. Each particle can absorb water and swell, pushing on its neighbours and causing an overall expansion. Once the disentanglement threshold is reached, the polymer dissolves and the particle shrinks. This paper applies DEM to simulate the radial swelling and dissolution of cylindrical tablets. The method was validated against an exact numerical solution of the same system to assess the accuracy of the DEM simulations for different DEM particle sizes. Parametric studies were performed to assess the impact of physical parameters – namely the concentration-dependent diffusion coefficient of water through the polymer, the dissolution rate constant of the polymer, and the disentanglement threshold of the polymer – on the radial expansion of the tablet. It was found that different settings of the concentration-dependent water diffusion coefficient function could produce similar radial expansion curves but with different internal concentration profiles. Increasing the dissolution rate constant or decreasing the disentanglement threshold of the polymer reduced the maximum radius of the tablet. Lastly, ATR-FTIR spectroscopic imaging was used to obtain chemical images of a pure hydroxypropyl methylcellulose (HPMC) tablet swelling and dissolving. The model was optimised to match both the HPMC tablet radius and the concentration profiles over time.
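The particle-level coupling of absorption, swelling, and threshold-triggered dissolution can be illustrated with a deliberately simplified 1-D sketch. All parameter names, values, and the exponential diffusivity law below are our own assumptions, not the paper's calibrated 3-D radial model:

```python
import numpy as np

# Toy 1-D analogue: a chain of particles absorbs water by inter-particle
# diffusion, swells with water content, and shrinks once the local water
# concentration passes a disentanglement threshold.
n_steps, n_particles = 2000, 50
dt = 1e-3
c = np.zeros(n_particles)          # water mass fraction per particle
c[0] = 1.0                         # left end in contact with water
r0 = 1.0                           # dry particle radius (arbitrary units)
k_diss = 0.05                      # dissolution rate constant (assumed)
c_dis = 0.8                        # disentanglement threshold (assumed)
radius = np.full(n_particles, r0)

def diff_coeff(c):
    """Concentration-dependent water diffusivity (assumed exponential form)."""
    return 0.1 * np.exp(2.0 * c)

for _ in range(n_steps):
    # inter-particle mass transfer (explicit finite differences)
    flux = diff_coeff(0.5 * (c[:-1] + c[1:])) * (c[:-1] - c[1:])
    c[1:] += dt * flux
    c[:-1] -= dt * flux
    c[0] = 1.0                     # water reservoir boundary condition
    swollen = r0 * (1.0 + c)       # swelling: radius grows with water content
    dissolving = c > c_dis
    radius[~dissolving] = swollen[~dissolving]
    radius[dissolving] = np.maximum(radius[dissolving] - k_diss * dt, 0.0)

print("toy tablet extent:", radius.sum())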
62.
Compatibilization is the modification of the interface in immiscible polymer blends in order to refine and stabilize their phase structure. The presence of a compatibilizer at the interface markedly affects the deformation behaviour of the dispersed droplets during and after cessation of flow. In this work, the morphology development in blends of polystyrene and linear low-density polyethylene compatibilized with styrene-butadiene-styrene triblock copolymer during and after uniaxial elongation was investigated by means of electron microscopy and small-angle X-ray scattering. The incorporation of only 1 wt.-% of the compatibilizer led to a pronounced increase of the stationary elongational viscosity of the blend. It was found that at moderate capillary number (Ca ≈ Ca_CR) the compatibilizer stabilises the droplets against break-up during the flow; when Ca ≫ Ca_CR, no differences in the deformation of uncompatibilized and compatibilized droplets were observed. After cessation of the flow, the presence of the compatibilizer prevented droplet break-up and both supported and accelerated the shape recovery of the elongated particles.
63.
Jianming Zhang Shugao Ma Mehrnoosh Sameki Stan Sclaroff Margrit Betke Zhe Lin Xiaohui Shen Brian Price Radomír Měch 《International Journal of Computer Vision》2017,124(2):169-186
We study the problem of salient object subitizing, i.e., predicting the existence and the number of salient objects in an image using holistic cues. This task is inspired by the ability of people to quickly and accurately identify the number of items within the subitizing range (1–4). To this end, we present a salient object subitizing image dataset of about 14,000 everyday images annotated using an online crowdsourcing marketplace. We show that an end-to-end trained convolutional neural network (CNN) model achieves prediction accuracy comparable to human performance in identifying images with zero or one salient object. For images with multiple salient objects, our model also performs significantly better than chance without requiring any localization process. Moreover, we propose a method to improve the training of the CNN subitizing model by leveraging synthetic images. In experiments, we demonstrate the accuracy and generalizability of our CNN subitizing model and its applications in salient object detection and image retrieval.
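As a rough illustration of the classification setup, the sketch below maps an image to one of five subitizing categories {0, 1, 2, 3, 4+}. The tiny architecture is invented for illustration; the paper fine-tunes a standard pre-trained CNN rather than using this exact network:

```python
import torch
import torch.nn as nn

class SubitizingCNN(nn.Module):
    """Illustrative subitizing classifier; category 4 means '4 or more'."""
    def __init__(self, n_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(128, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = SubitizingCNN()
logits = model(torch.randn(8, 3, 224, 224))   # batch of 8 images
pred_counts = logits.argmax(dim=1)            # predicted count category per image
```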
64.
Monadic second order (MSO) logic has proved to be a useful tool in many areas of application, ranging from decidability and complexity to picture processing, correctness of programs, and parallel processes. Characterizing the structural borderline between decidability and undecidability is a classical research problem here. This problem is related to questions in computational complexity, especially to the model checking problem, for which many tools developed in the area of decidability have proved useful. For more than two decades it was conjectured in [D. Seese, The structure of the models of decidable monadic theories of graphs, Ann. Pure Appl. Logic 53 (1991) 169–195] that decidability of monadic theories of countable structures implies that the theory can be reduced via interpretability to a theory of trees.
65.
Jochem Verrelst Michael E. Schaepman Zbyněk Malenovský Jan G.P.W. Clevers 《Remote sensing of environment》2010,114(3):647-100
An important bio-indicator of actual plant health status, the foliar content of chlorophyll a and b (Cab), can be estimated using imaging spectroscopy. For forest canopies, however, the relationship between the spectral response and leaf chemistry is confounded by factors such as background (e.g. understory), canopy structure, and the presence of non-photosynthetic vegetation (NPV, e.g. woody elements), particularly the appreciable amounts of standing and fallen dead wood found in older forests. We present a sensitivity analysis for the estimation of chlorophyll content in woody coniferous canopies using radiative transfer modeling, and use the modeled top-of-canopy reflectance data to analyze the contribution of woody elements, leaf area index (LAI), and crown cover (CC) to the retrieval of foliar Cab content. The radiative transfer model comprises two linked submodels: one at leaf level (PROSPECT) and one at canopy level (FLIGHT). It generated bidirectional reflectance data according to the band settings of the Compact High Resolution Imaging Spectrometer (CHRIS), from which chlorophyll indices were calculated. Most of the chlorophyll indices outperformed single wavelengths in predicting Cab content at canopy level, with best results obtained by the Maccioni index ((R780 − R710) / (R780 − R680)). We demonstrate the performance of this index with respect to structural information on three distinct coniferous forest types (young, early mature, and old-growth stands). The modeling results suggest that the spectral variation due to variation in canopy chlorophyll content is best captured for stands with medium-dense canopies. However, the up-scaled Cab signal weakens with increasing crown NPV scattering elements, especially when crown cover exceeds 30%; LAI exerts the smallest perturbations. We conclude that the spectral influence of woody elements is an important variable that should be considered in radiative transfer approaches when retrieving foliar pigment estimates in heterogeneous stands, particularly if the stands are partly defoliated or long-lived.
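The Maccioni index reported as the best performer is straightforward to compute from the three reflectance bands given in the abstract; a minimal sketch, with invented reflectance values for illustration:

```python
import numpy as np

def maccioni_index(r680, r710, r780):
    """Maccioni chlorophyll index (R780 - R710) / (R780 - R680),
    computed per pixel from top-of-canopy reflectance bands."""
    return (r780 - r710) / (r780 - r680)

# Made-up reflectance values for three pixels:
r680 = np.array([0.03, 0.05, 0.04])
r710 = np.array([0.08, 0.12, 0.10])
r780 = np.array([0.45, 0.40, 0.42])
print(maccioni_index(r680, r710, r780))
```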
66.
Solutions of numerically ill-posed least squares problems Ax ≈ b for A ∈ R^{m×n} by Tikhonov regularization are considered. For D ∈ R^{p×n}, the Tikhonov regularized least squares functional is given by J(x) = ||Ax − b||²_W + (1/σ²)||D(x − x₀)||², where W is a weighting matrix and x₀ is given. Given a priori estimates on the covariance structure of errors in the measurement data b, the weighting matrix may be taken as W = C_b⁻¹, the inverse covariance matrix of the mean-zero normally distributed measurement errors in b. If in addition x₀ is an estimate of the mean value of x, and σ is a suitable statistically-chosen value, J evaluated at its minimizer approximately follows a χ² distribution with m + p − n degrees of freedom. Using the generalized singular value decomposition of the matrix pair [W^{1/2}A; D], σ can then be found such that the resulting J follows this χ² distribution. But the use of an algorithm which explicitly relies on the direct solution obtained via the generalized singular value decomposition is not practical for large-scale problems. Instead, an approach using the Golub-Kahan iterative bidiagonalization of the regularized problem is presented. The original algorithm is extended for cases in which x₀ is not available, but a set of measurement data instead provides an estimate of the mean value of b. The sensitivity of the Newton algorithm to the number of steps used in the Golub-Kahan iterative bidiagonalization, and the relation between the size of the projected subproblem and σ, are discussed. Experiments contrast the efficiency and robustness of the method with other standard methods for finding the regularization parameter on a set of test problems and on the restoration of a relatively large real seismic signal. An application to image deblurring also validates the approach for large-scale problems. It is concluded that the presented approach is robust for both small- and large-scale discretely ill-posed least squares problems.
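The χ² principle itself can be demonstrated on a small dense problem, ignoring the paper's large-scale Golub-Kahan machinery. The sketch below (synthetic data, identity D, and a bracketing root-finder standing in for the paper's Newton iteration) picks σ so that the minimized functional matches the χ² mean m + p − n:

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)

# Small synthetic ill-posed problem (sizes and noise level are arbitrary).
m, n = 100, 50
A = rng.standard_normal((m, n)) @ np.diag(1.0 / np.arange(1, n + 1))
x_true = rng.standard_normal(n)
noise_std = 0.1
b = A @ x_true + noise_std * rng.standard_normal(m)

W = np.eye(m) / noise_std**2      # inverse covariance of the noise
D = np.eye(n)                     # identity regularization operator, p = n
x0 = np.zeros(n)                  # prior estimate of the mean of x
p = D.shape[0]

def J(sigma):
    """Tikhonov functional evaluated at its minimizer x(sigma)."""
    lhs = A.T @ W @ A + (D.T @ D) / sigma**2
    rhs = A.T @ W @ b + (D.T @ D @ x0) / sigma**2
    x = np.linalg.solve(lhs, rhs)
    r = A @ x - b
    d = D @ (x - x0)
    return r @ W @ r + (d @ d) / sigma**2

# Choose sigma so that J(x(sigma)) matches the chi^2 mean, m + p - n.
target = m + p - n
sigma_opt = brentq(lambda s: J(s) - target, 1e-4, 1e4)
print("sigma =", sigma_opt, " J =", J(sigma_opt))
```

A bracketing method suffices here because J(σ) at the minimizer decreases monotonically in σ, from a large value when regularization dominates down to roughly m − n for the unregularized least squares fit.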
67.
This paper deals with an application of constraint programming to production scheduling with earliness and tardiness penalties, reflecting the scheduling part of the Just-In-Time inventory strategy. Two scheduling problems are studied: an industrial case study of lacquer production scheduling, and the job-shop scheduling problem with earliness/tardiness costs. The paper presents two algorithms that help the constraint programming solver find solutions to these complex problems. The first algorithm, called cost directed initialization, performs a greedy initialization of the search tree. The second, called the time reversing transformation and designed for lacquer production scheduling, reformulates the problem to be more easily searchable when the default search or the cost directed initialization is used. Experiments on case study instances and randomly generated problem instances show that our algorithms outperform generic approaches and, on average, give better results than other non-trivial algorithms.
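For intuition about the objective, the following single-machine toy (invented jobs and weights; the paper's models are far richer and solved with a CP solver) evaluates earliness/tardiness cost and applies a simple greedy ordering in the spirit of a cost-directed initialization:

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    duration: int
    due: int
    w_early: int   # penalty per time unit of earliness
    w_tardy: int   # penalty per time unit of tardiness

def schedule_cost(jobs):
    """Earliness/tardiness cost of running jobs back-to-back in list order."""
    t, cost = 0, 0
    for j in jobs:
        t += j.duration
        cost += j.w_early * max(0, j.due - t) + j.w_tardy * max(0, t - j.due)
    return cost

jobs = [Job("A", 3, 5, 1, 4), Job("B", 2, 4, 1, 6), Job("C", 4, 12, 2, 3)]

# Greedy starting point: order by due date, then let a solver improve it.
greedy = sorted(jobs, key=lambda j: j.due)
print([j.name for j in greedy], "cost =", schedule_cost(greedy))
```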
68.
Jaroslav Keznikl Tomáš Bureš František Plášil Petr Hnětynka 《Software and Systems Modeling》2014,13(2):843-872
In current software systems, connectors play an important role by encapsulating the communication and coordination logic. Since connectors share common patterns (elements) that depend on characteristics of the connections, these elements can be predefined and reused. A method of connector implementation based on composing predefined elements naturally comprises two steps: resolution of the connector architecture, and creation of the actual connector code based on that architecture. However, manual resolution of a connector architecture is very difficult due to the number of factors to be considered; the challenge is thus to come up with an automated method able to address all the important factors. In this paper, we present a method for automated resolution of connector architectures based on constraint solving techniques. We exploit a propositional logic with relational calculus to define a connector theory (a constraint specification reflecting both the predefined parts and the important resolution factors) and employ a constraint solver to find a suitable connector architecture as a model of the theory. As a proof of concept, we show how the theory can be captured in the Alloy language and resolved via the Alloy Analyzer.
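A drastically simplified analogue of the resolution step: enumerate combinations of predefined connector elements and keep the first one satisfying the constraints. The element names and rules below are invented for illustration; the paper encodes the real connector theory in Alloy and resolves it with the Alloy Analyzer:

```python
from itertools import product

elements = {
    "marshalling": ["java-serialization", "protobuf"],
    "transport":   ["local-call", "tcp", "shared-memory"],
}

def satisfies(arch, remote, typed):
    # remote connections cannot use in-process transports
    if remote and arch["transport"] in ("local-call", "shared-memory"):
        return False
    # statically typed interfaces here require protobuf marshalling
    if typed and arch["marshalling"] != "protobuf":
        return False
    return True

def resolve(remote, typed):
    """Brute-force search over element combinations for a valid architecture."""
    keys = list(elements)
    for combo in product(*elements.values()):
        arch = dict(zip(keys, combo))
        if satisfies(arch, remote, typed):
            return arch
    return None

print(resolve(remote=True, typed=True))
# -> {'marshalling': 'protobuf', 'transport': 'tcp'}
```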
69.
To allow efficient and user-friendly development of a component-based application, component systems have to provide a rather complex development infrastructure, including a tool for component composition, a component repository, and a run-time infrastructure. In this paper, we present and evaluate the benefits of using meta-modeling during the process of defining a component system and during the creation of its development and run-time infrastructures. Most of the presented arguments are based on broad practical experience with designing the component systems SOFA and SOFA 2; the former was designed in a classical ad hoc 'manual' way, whereas the latter was designed with the help of meta-modeling. Copyright © 2010 John Wiley & Sons, Ltd.
70.
Energy Optimization of Algebraic Multigrid Bases
We propose a fast iterative method to optimize coarse basis functions in algebraic multigrid by minimizing the sum of their energies, subject to the condition that linear combinations of the basis functions equal given zero-energy modes, and subject to restrictions on the supports of the coarse basis functions. For a particular selection of the supports, the first iteration gives exactly the same basis functions as our earlier method using smoothed aggregation. The convergence rate of the minimization algorithm is bounded independently of the mesh size under usual assumptions on finite elements. The construction is presented for scalar problems as well as for linear elasticity. Computational results on difficult industrial problems demonstrate that the use of energy-minimal basis functions improves algebraic multigrid performance and yields a more robust multigrid algorithm than smoothed aggregation.
Received: March 9, 1998; revised January 25, 1999
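A tiny 1-D sketch of the constrained minimization (problem size, supports, and the generic optimizer are our choices, not the paper's dedicated fast iteration): two overlapping coarse basis functions must sum to the zero-energy constant mode while the sum of their energies is minimized:

```python
import numpy as np
from scipy.optimize import minimize

# Two coarse basis functions phi1 (support: nodes 0..5) and phi2 (nodes 3..8)
# must satisfy phi1 + phi2 = 1 (the zero-energy constant mode) while the sum
# of their energies phi^T A phi is minimized over the overlap nodes 3..5.
n = 9
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
A[0, 0] = A[-1, -1] = 1.0        # Neumann-like ends: constants have zero energy

overlap = np.arange(3, 6)        # the only free region; outside it the
                                 # partition of unity forces values to 0 or 1

def basis(t):
    phi1 = np.zeros(n)
    phi2 = np.zeros(n)
    phi1[:3] = 1.0
    phi1[overlap] = t            # free values of phi1 on the overlap
    phi2[overlap] = 1.0 - t      # enforces phi1 + phi2 = 1
    phi2[6:] = 1.0
    return phi1, phi2

def total_energy(t):
    phi1, phi2 = basis(t)
    return phi1 @ A @ phi1 + phi2 @ A @ phi2

res = minimize(total_energy, x0=np.full(3, 0.5))
phi1, phi2 = basis(res.x)
print("phi1 =", phi1.round(3))           # smooth decay from 1 to 0
print("minimal total energy =", round(total_energy(res.x), 4))
```

The minimizer smooths the transition of each basis function across the overlap, which is the 1-D analogue of how energy minimization improves on piecewise-constant aggregation bases.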