61.
There are few fully automated methods for liver segmentation in magnetic resonance images (MRI), despite the benefits of this type of acquisition in comparison with other radiology techniques such as computed tomography (CT). Motivated by these medical requirements, we present a new method for liver segmentation in MRI based on the watershed transform and stochastic partitions. The classical watershed over-segmentation is reduced using a marker-controlled algorithm. To improve the accuracy of the selected contours, the gradient of the original image is enhanced by applying a new variant of the stochastic watershed. Finally, a classification step is applied to obtain the final liver mask. The optimal parameters of the method are tuned on a training dataset and then applied to the remaining studies (17 datasets). The results obtained (a Jaccard coefficient of 0.91 ± 0.02), compared with those of other methods, demonstrate that the new variant of the stochastic watershed is a robust tool for automatic segmentation of the liver in MRI.
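As a rough illustration of the marker-controlled watershed idea underlying this kind of pipeline, the following Python sketch (assuming NumPy and scikit-image) seeds the watershed with intensity-based markers and scores the result with the Jaccard coefficient used in the abstract. The stochastic-watershed gradient enhancement and the final classifier of the paper are not reproduced, and the quantile thresholds are illustrative assumptions.

```python
# Minimal marker-controlled watershed on a single MRI slice (illustrative only;
# not the authors' exact pipeline). Thresholds are assumptions.
import numpy as np
from skimage import filters, segmentation

def marker_controlled_watershed(slice_2d, low_q=0.2, high_q=0.8):
    """Segment one slice using markers derived from intensity quantiles."""
    gradient = filters.sobel(slice_2d)                # gradient image to flood
    lo, hi = np.quantile(slice_2d, [low_q, high_q])
    markers = np.zeros(slice_2d.shape, dtype=np.int32)
    markers[slice_2d < lo] = 1                        # background seed
    markers[slice_2d > hi] = 2                        # candidate-organ seed
    labels = segmentation.watershed(gradient, markers)
    return labels == 2                                # binary mask of label 2

def jaccard(pred, truth):
    """Jaccard coefficient, the accuracy measure reported in the abstract."""
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return inter / union if union else 1.0
```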
62.
Regularisation is a well-known technique for working with ill-posed and ill-conditioned problems, and it has been explored in a variety of areas, including Bayesian inference, functional analysis, optimisation, numerical analysis and connectionist systems. In this paper we present the equivalence between the Bayesian approach to regularisation theory and Tikhonov regularisation within the function approximation framework, when radial basis function networks are employed. This equivalence can be used to avoid expensive calculations when regularisation techniques are employed.
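For reference, the equivalence discussed above can be summarised in the standard Tikhonov/RBF form below; this is the textbook formulation, not necessarily the exact notation used in the paper.

```latex
% Tikhonov functional over data (x_i, y_i), with stabiliser P and parameter \lambda:
E[f] \;=\; \sum_{i=1}^{N} \bigl( y_i - f(x_i) \bigr)^{2} \;+\; \lambda \,\lVert P f \rVert^{2}.
% Its minimiser is a radial basis function network centred on the data points,
f(x) \;=\; \sum_{i=1}^{N} w_i \, G(x, x_i),
\qquad
(\mathbf{G} + \lambda \mathbf{I})\,\mathbf{w} \;=\; \mathbf{y},
\qquad
G_{ij} = G(x_i, x_j),
% where G is the Green's function (kernel) associated with the regularisation
% operator, and \lambda plays the role of the noise-to-prior variance ratio in
% the equivalent Bayesian (MAP) reading of the same problem.
```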
63.
The amount of information contained in databases available on the Web has grown explosively in recent years. This information, known as the Deep Web, is heterogeneous and dynamically generated by querying back-end (relational) databases through Web Query Interfaces (WQIs), a special type of HTML form. Accessing the information of the Deep Web is a great challenge because this information is usually not indexed by general-purpose search engines. It is therefore necessary to create efficient mechanisms to access, extract and integrate the information contained in the Deep Web. Since WQIs are the only means of access to the Deep Web, their automatic identification plays an important role: it allows traditional search engines to increase their coverage and to reach interesting information that is not available on the indexable Web. The accurate identification of Deep Web data sources is a key issue in the information retrieval process. In this paper we propose a new strategy for the automatic discovery of WQIs. The proposal makes an adequate selection of HTML elements extracted from HTML forms, which are used in a set of heuristic rules that help to identify WQIs. The strategy uses machine learning algorithms to classify searchable (WQI) and non-searchable (non-WQI) HTML forms, together with a prototype selection algorithm that removes irrelevant or redundant data from the training set. The internal content of Web Query Interfaces was analysed with the objective of identifying only those HTML elements that appear frequently and provide relevant information for WQI identification. For testing, we use three groups of datasets: two available in the UIUC repository and a new dataset that we created using a generic crawler, supported by human experts, that includes advanced and simple query interfaces. The experimental results show that the proposed strategy outperforms previously reported approaches.
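The following Python sketch (assuming BeautifulSoup and scikit-learn) illustrates the general kind of form-level features and supervised classification described above. The paper's actual heuristic rules, feature set and prototype-selection step are not reproduced; every name and threshold here is an assumption for illustration.

```python
# Illustrative WQI vs. non-WQI form classifier; not the paper's method.
from bs4 import BeautifulSoup
from sklearn.tree import DecisionTreeClassifier

def form_features(form_html):
    """Extract a small numeric feature vector from one <form> fragment."""
    form = BeautifulSoup(form_html, "html.parser")
    inputs = form.find_all("input")
    return [
        sum(1 for i in inputs if i.get("type", "text") == "text"),  # text boxes
        sum(1 for i in inputs if i.get("type") == "checkbox"),      # checkboxes
        len(form.find_all("select")),                               # drop-down lists
        len(form.find_all("textarea")),                             # free-text areas
        int("search" in form_html.lower()),                         # keyword hint
        int(any(i.get("type") == "password" for i in inputs)),      # likely a login form
    ]

def train_wqi_classifier(forms_html, labels):
    """labels: 1 = searchable (WQI), 0 = non-searchable form."""
    X = [form_features(f) for f in forms_html]
    return DecisionTreeClassifier(max_depth=5).fit(X, labels)
```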
64.
In this work, a methodology based on the meshless finite point method is presented for the analysis of nonlinear material problems with proportional loading, founded on deformation theory. In the finite point method the approximation around each point is obtained using weighted least-squares techniques, and the discrete system of equations is constructed by means of a point collocation procedure. The absence of any mesh or integration procedure is an important aspect that makes the finite point method a truly meshless technique. Hencky's total deformation theory and an elastic approach are used to determine the stress–strain fields. This approach introduces the concept of effective material properties, which are treated as spatial field variables and as functions of the equilibrium stress state and the material properties. The present results are in good agreement with those obtained by the nonlinear finite element method and with previous work in this meshless context. Nevertheless, the present methodology is based on a strong formulation, keeping the meshless characteristics of the FPM.
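For context, the weighted least-squares approximation at the heart of the finite point method takes the standard form shown below; the specific weight function, basis and collocation details used in the paper are not reproduced here.

```latex
% Generic weighted least-squares approximation for the finite point method.
% Around a star point x, with polynomial basis p and weights w_i = w(x - x_i)
% over the n points of its local cloud:
u_h(x) \;=\; \mathbf{p}(x)^{\mathsf T}\, \mathbf{A}^{-1}(x)\, \mathbf{B}(x)\, \mathbf{u},
\qquad
\mathbf{A}(x) = \sum_{i=1}^{n} w_i\, \mathbf{p}(x_i)\, \mathbf{p}(x_i)^{\mathsf T},
\qquad
\mathbf{B}(x) = \bigl[\, w_1 \mathbf{p}(x_1) \;\; \cdots \;\; w_n \mathbf{p}(x_n) \,\bigr].
% The discrete system follows by collocating the governing equations at each point.
```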
65.
The design of distribution networks that simultaneously consider location and inventory decisions seeks to balance costs and product availability. The most commonly observed measure of product availability in practical settings is the fill-rate service level. However, the optimal design of a distribution network that uses the fill rate to control shortages of fast-moving consumer goods (FMCG) is considered intractable and has only been addressed by heuristic methods. This paper addresses the optimal design of a distribution network for FMCG able to provide a high fill-rate service level under a continuous-review (r, Q) policy. Using the exact expression for the provided fill rate, we formulate a joint location–inventory model with fill-rate service-level constraints as a convex mixed-integer nonlinear problem, for which a novel decomposition-based outer approximation algorithm is proposed. Numerical experiments show that our solution approach provides good-quality solutions that are on average 0.15%, and at worst 2.2%, from the optimal solution.
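As a point of reference for the service measure used above, the exact fill rate of a continuous-review (r, Q) policy is commonly written as below (textbook form; the paper's full joint location–inventory formulation, which embeds service constraints of this type in a convex MINLP, is not reproduced).

```latex
% Exact fill rate under a continuous-review (r, Q) policy, with D_L the
% lead-time demand; the service constraint is imposed at each open facility.
\beta(r, Q) \;=\; 1 \;-\; \frac{\mathbb{E}\bigl[(D_L - r)^{+}\bigr] \;-\; \mathbb{E}\bigl[(D_L - r - Q)^{+}\bigr]}{Q},
\qquad
\beta(r_j, Q_j) \;\ge\; \beta_{\min} \quad \text{for every open distribution centre } j.
```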
66.
The aim of this work is to develop active packaging films containing natural antioxidants and to evaluate their capacity to enhance the oxidative stability of beef during refrigeration. The antioxidant activity of a natural extract obtained from brewery residual waste was evaluated and compared with that of a commercial rosemary extract and two synthetic antioxidants (BHT and propyl gallate). Different concentrations of each antioxidant were also added directly to beef samples, resulting in a reduction in lipid oxidation of up to 70–80% relative to the control. Active antioxidant films coated with the PVPP-WS extract reduced lipid oxidation by up to 80%, relative to the control, during cold storage. The use of active packaging films containing natural extracts could improve the oxidative stability of meat products and should therefore be of great interest to the food industry.
67.
The aim of this paper was to characterize chitosan samples obtained from shrimp shells for the later development of antimicrobial active systems. These systems include 100 % chitosan-based films obtained by casting, polyamide films with 5 and 10 % chitosan obtained by extrusion, and polyethylene/polyethylene terephthalate films with a coating of 0.6 % chitosan. For this purpose, several analytical techniques were used, including IR, 1H NMR, GPC, and microscopic techniques (scanning electron microscopy and transmission electron microscopy). Among the studied samples, C1 showed the lowest DA and MW and consequently presented the most suitable properties for the development of an active packaging. Additionally, mechanical tests were performed. The effectiveness of the developed systems was evaluated by means of microbiological assays. The tested films showed antimicrobial capacity against coliform enterobacteria, mesophilic aerobic microorganisms, and yeasts and moulds.
68.
Food manufacturers are required to obtain scientific and technical evidence that a control measure, or combination of control measures, is capable of reducing a significant hazard to an acceptable level that does not pose a public health risk under normal conditions of distribution and storage. A validation study provides evidence that a control measure is capable of controlling the identified hazard under a worst-case scenario for the process and product parameters tested. It also defines the critical parameters that must be controlled, monitored, and verified during processing. This review document is intended as guidance for the food industry to support appropriate validation studies, and aims to limit the methodological discrepancies in validation studies that can occur among food safety professionals, consultants, and third-party laboratories. The document describes product and process factors that are essential when designing a validation study, and gives selection criteria for identifying an appropriate target pathogen or surrogate organism for a given food product and process validation. Guidance is provided on approaches for evaluating the available microbiological data on the target pathogen or surrogate organism in the product type of interest, which can serve as part of the weight of evidence supporting a validation study. The document is intended to help food manufacturers, processors, and food safety professionals better understand, plan, and perform validation studies by offering an overview of the choices and key technical elements of a validation plan, the necessary preparations (including assembling the validation team and establishing prerequisite programs), and the elements of a validation report.
69.
A case of hypertensive intracerebellar hematoma that was surgically treated and cured is reported. The 41-year-old male had two cerebrovascular attacks with headache and vomiting, followed by left hemiparesis. Drowsiness and dysarthria appeared the next day. The patient was admitted to a hospital, where right facial palsy, loss of the right gag reflex and paralytic hemiplegia on the left side were noted. On the 7th day, the patient's consciousness became clear, but the other neurological findings did not change. On the 14th day, bradycardia and central hyperventilation appeared and he became drowsy again. The patient was transferred to the authors' clinic. On admission, he showed typical cerebellar signs such as nystagmus, ataxia, and slurred speech, with a pyramidal sign on the left side and cranial nerve paralysis on the right side, and also showed changes in vital signs consistent with a medullary syndrome in the late stage of the course. The vertebral angiogram revealed a space-occupying process in the right cerebellar hemisphere. The old blood (30 g) was removed by suboccipital craniectomy. The hematoma cavity communicated with the IVth ventricle through a small perforation in the medial wall of the hematoma. Spontaneous intracerebellar hematoma, including that of hypertensive origin, is not rare in autopsy reports, but surgically treated cases have only rarely been reported. The main reason for the small number of survivors is probably the fulminant course of the condition.