Similar Documents
20 similar documents found.
1.
Li Shaohua, Wang Jianxin, Chen Jianer. Journal of Software, 2009, 20(9): 2307-2319
In parameterized computation and complexity theory, a parameterized problem is fixed-parameter tractable if and only if it is kernelizable. Kernelization is the most widely applied and most effective technique in the design of parameterized algorithms, and it is a focal research topic in parameterized complexity theory. Through worked examples, this survey analyzes and compares the basic ideas, application characteristics, and methods of the four principal kernelization techniques; it summarizes the results achieved with kernelization in several important problem families, such as cover, packing, and cut problems; and it discusses directions for further research, proposing possible kernel-optimization methods and ideas for new kernelization techniques and for several hot open problems.
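As a concrete illustration of what a kernelization looks like (a textbook example, the classic Buss kernel for k-Vertex Cover, not an algorithm taken from this survey), the following sketch reduces an instance to an equivalent one with at most k² edges:

```python
# Buss kernelization for k-Vertex Cover: any vertex of degree > k must be
# in every cover of size <= k, and after exhausting that rule a yes-instance
# can have at most k^2 edges.
def buss_kernel(edges, k):
    edges = {frozenset(e) for e in edges}
    forced = set()                       # vertices forced into any small cover
    changed = True
    while changed and k >= 0:
        changed = False
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:                    # rule: a vertex of degree > k must be chosen
                forced.add(v)
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:      # kernel size bound: > k^2 edges => no
        return None                      # provably a no-instance
    return edges, k, forced              # equivalent instance with <= k^2 edges
```

The reduced instance can then be solved by brute force in time depending only on k, which is exactly the fixed-parameter tractability the abstract refers to.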

2.
Scenario techniques are a teeming field in energy and environmental research and decision making. This Thematic Issue (TI) highlights quantitative (computational) methods that improve the development and use of scenarios for dealing with the dual challenge of complexity and (deep) uncertainty. The TI gathers 13 articles that describe methodological innovations or extensions and refinements of existing methods, as well as applications that demonstrate the potential of these methodological developments. The TI proposes two methodological foci for dealing with the challenges of (deep) uncertainty and complexity: diversity and vulnerability approaches help tackle uncertainty, while multiple-objective and multiple-scale approaches help address complexity; combinations of these foci can also be applied. This overview article presents the contributions gathered in the TI and shows how they individually and collectively bring new capacity to scenario techniques for dealing with complexity and (deep) uncertainty.

3.
Reliable study results are necessary for the assessment of discoveries, including those from proteomics. They are also crucial to increase the likelihood of making a successful choice of biomarker candidates for verification and subsequent validation studies, a current bottleneck for the transition to in vitro diagnostics (IVD). In this respect, a major need for improvement in proteomics appears to be accuracy of measurement, including both trueness and precision. Standardization and total quality management systems (TQMS) help to provide accurate measurements and reliable results. Reference materials are an essential part of standardization and TQMS in IVD, and are crucial for metrologically correct measurements and for the overall quality assurance process. In this article we give an overview of how reference materials are defined and prepared, and what role they play in standardization and TQMS to support the generation of reliable results. We discuss how proteomics can support the establishment of reference materials and biomarker tests for IVD applications, how current reference materials used in IVD may be beneficially applied in proteomics, and we provide considerations on the establishment of reference materials specific to proteomics. For clarity, we focus solely on reference materials related to serum and plasma.

4.
5.
6.
This paper demonstrates how the data-association technique known as the probabilistic multi-hypothesis tracker (PMHT) can be applied to the feature-based simultaneous localization and map building (SLAM) problem. The main advantage of PMHT over other conventional data-association techniques is that it has low computational complexity while still providing good performance. Low complexity is a particularly desirable feature for the SLAM problem, where the estimators used may already be costly to implement. The paper also proposes an estimation approach based on generalized expectation-maximization iterations for the PMHT SLAM problem, which achieves low computational complexity at the expense of local convergence. The performance of the PMHT SLAM algorithm is compared with other approaches, and its output is demonstrated on a benchmark data set recorded in Victoria Park, Sydney, Australia.
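The core of the expectation-maximization view is a soft data-association step. The toy sketch below is my own illustration, not the paper's PMHT SLAM estimator: one EM iteration that softly assigns 2-D measurements to landmarks under an assumed isotropic Gaussian noise model.

```python
import numpy as np

def em_iteration(Z, L, sigma=1.0):
    """One EM-style soft data-association update.
    Z: (J, 2) array of measurements; L: (M, 2) array of landmark estimates."""
    # E-step: posterior weight of landmark m for measurement j,
    # proportional to a Gaussian likelihood of the residual.
    d2 = ((Z[:, None, :] - L[None, :, :]) ** 2).sum(axis=-1)   # (J, M)
    w = np.exp(-0.5 * d2 / sigma ** 2)
    w /= w.sum(axis=1, keepdims=True)                          # normalize over landmarks
    # M-step: each landmark moves to the weighted mean of all measurements.
    return (w[:, :, None] * Z[:, None, :]).sum(axis=0) / w.sum(axis=0)[:, None]
```

Because no hard assignment is ever made, each iteration costs O(J·M), which is the low-complexity property the abstract highlights.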

7.
Diabetic nephropathy (DN) is the main cause of mortality for diabetic patients. The objective of this work was to develop a proteomic approach to detect proteins or peptides in urine for identifying individuals in the early stage of DN. We obtained urine samples from 106 diabetic patients and 50 healthy subjects. Early-stage DN was defined as a urine albumin-to-creatinine ratio between 30 and 299 mg/g. Mass spectra were generated using surface-enhanced laser desorption/ionization time-of-flight mass spectrometry, and peaks were detected with Ciphergen SELDI software version 3.1. Over 1000 proteins or peptides were obtained using the ProteinChip. About 200 of them, with m/z values ranging from 1008.5 to 79 942.3 Da, were significantly differentiated between diabetic patients and control subjects. A mathematical analysis revealed a cluster of 8 up-regulated and 16 down-regulated proteins in the diabetic patients, with m/z values from 2197.3 to 79 613 Da. Four top-ranked proteins, with m/z values of 4139.0, 4453.5, 5281.1, and 5898.5 Da, were selected as potential fingerprints for detection of early-stage DN, with a sensitivity of 75% and a specificity of 80%. ProteinChip technology may be a novel non-invasive method for detecting early-stage DN.

8.
Diabetic nephropathy (DN) is a serious kidney complication of diabetes and the leading cause of end-stage renal disease. The earliest clinical evidence of DN is microalbuminuria, the appearance of small but abnormal amounts of albumin in the urine. However, screening methods for DN, such as biomarker assays, have yet to be developed for type 2 DN. In the present study, in an attempt to identify biomarkers for the initial diagnosis of type 2 DN, the protein profiles of human sera collected from 30 microalbuminuric type 2 diabetic patients were compared with those from 30 normoalbuminuric type 2 diabetic patients via 2-DE. A total of 18 spots showed different protein levels in the microalbuminuric patients: twelve had levels approximately 50% lower, and the other six had levels approximately 100-300% higher, than those of the normoalbuminuric patients. These spots were identified with ESI-Q-TOF (ESI-quadrupole-TOF) MS. Among the identified proteins, vitamin D-binding protein (DBP) and pigment epithelium-derived factor (PEDF) were verified by Western blotting. The results of this study indicate that DBP may be employed as a diagnostic and monitoring biomarker of type 2 DN, contingent on further study.

9.
An attempt to reduce the computational complexity of advancing front triangulation is described. The method is first decomposed into subtasks, and the computational complexity is investigated separately for each. It is shown that a major subtask, the geometric compatibility (mesh correctness) checks, can be carried out with a linear growth rate. The applied techniques include modified advancing front management and a localization device in the form of a regular grid (stored as a hypermatrix). The other subtask (access to the mesh control function) could not be made of linear computational complexity for all modes of mesh control (ad hoc and adaptive). While the ad hoc gradation control yields an algorithm with ideal overall computational complexity, the adaptive gradation control still gives a suboptimal complexity (of order O(N log N)).
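The grid-based localization idea can be sketched independently of the meshing details. Below is a hypothetical 2-D version using a hash map of buckets rather than the hypermatrix storage described in the paper; `cell_size` would be tied to the local element size so each compatibility check scans only a constant number of candidates.

```python
from collections import defaultdict

class UniformGrid:
    """Regular-grid localization for the advancing front: points are
    bucketed by cell, so intersection/compatibility checks visit a
    constant-size neighborhood instead of the whole mesh."""
    def __init__(self, cell_size):
        self.cell = cell_size
        self.buckets = defaultdict(list)

    def _key(self, p):
        return (int(p[0] // self.cell), int(p[1] // self.cell))

    def insert(self, p, item):
        self.buckets[self._key(p)].append((p, item))

    def nearby(self, p):
        # Candidates from the 3x3 block of cells around p.
        cx, cy = self._key(p)
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                yield from self.buckets[(cx + dx, cy + dy)]
```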

10.
From a practical point of view it is often desirable to limit the complexity of a topology optimization design so that casting/milling-type manufacturing techniques can be applied. In the context of gradient-driven topology optimization, this work studies how castable designs can be obtained by use of a Heaviside design parameterization in a specified casting direction. This reduces the number of design variables considerably, and the approach is simple to implement.
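A sketch of one plausible form of such a parameterization, under my own assumptions (the paper's exact projection may differ): design variables are accumulated along the specified casting direction and passed through a smoothed Heaviside, so the physical density never decreases with depth and enclosed voids cannot form.

```python
import numpy as np

def cast_densities(mu, beta=8.0):
    """mu: (n_columns, n_depth) design variables in [0, 1], with the second
    axis ordered along the assumed casting direction. Returns physical
    densities that are non-decreasing with depth, hence castable."""
    s = np.cumsum(mu, axis=1)            # accumulate material toward depth
    return 1.0 - np.exp(-beta * s)       # smoothed Heaviside projection
```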

11.
In this paper we present techniques to significantly improve the space complexity of several ordered tree comparison algorithms without sacrificing the corresponding time complexity. We present new algorithms for computing the constrained ordered tree edit distance and the alignment of (ordered) trees. The techniques can also be applied to other related problems.

12.
Hyperglycemia is a major factor in the pathogenesis of the microvascular complications of diabetes, including diabetic nephropathy (DN). Most studies to date have focused on the glomerular abnormalities found in DN. However, nephromegaly in the early stages of diabetes, and the correlation of tubulointerstitial rather than glomerular pathology with declining renal function in DN, suggest involvement of the tubulointerstitium. The etiology of the tubulointerstitial pathology in DN, however, is not fully understood. In this study, to probe the DN pathways, we constructed an initial 2-DE reference map for cultured human proximal tubule (HK-2) cells in the presence of 5 mM and 25 mM glucose, corresponding to blood glucose concentrations under normal and hyperglycemic conditions, respectively. HK-2 cellular proteins differentially expressed at the high glucose concentration were identified via ESI-Q-TOF MS/MS and confirmed by Western blotting: enolase 1 (up-regulated) and lactate dehydrogenase (down-regulated). The regulation of these proteins will help in understanding the DN mechanism through the glycolytic metabolic pathways of high-glucose-stimulated HK-2 cells.

13.
Most graphics cards in standard personal computers are now equipped with several pixel pipelines running shader programs. Taking advantage of this technology by moving parallel computations from the CPU to the GPU increases overall computational power even in non-graphical applications, by freeing the main processor from a heavy workload. A generic library is presented to show how anyone can benefit from modern hardware by combining various techniques with little hardware-specific programming skill. Its shader implementation is applied to retinal and cortical simulation. The purpose of this sample application is not to provide a correct approximation of real center-surround ganglion or middle temporal cells, but to illustrate how easily intertwined spatiotemporal filters can be applied to raw input pictures in real time. Requirements and interconnection complexity depend strongly on the vision framework adopted, so various hypotheses that may benefit from such a library are introduced.
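As a rough CPU-side illustration of the kind of filter such a shader library evaluates per pixel (my own sketch, not the library's API), a center-surround response can be approximated by a difference of Gaussians:

```python
import numpy as np
from scipy import ndimage

def center_surround(frame, sigma_center=1.0, sigma_surround=3.0):
    """Difference-of-Gaussians approximation of a center-surround
    (ganglion-cell-like) receptive field, applied to one frame."""
    img = frame.astype(np.float64)
    return (ndimage.gaussian_filter(img, sigma_center)
            - ndimage.gaussian_filter(img, sigma_surround))
```

On a GPU this becomes a fragment shader sampling a texture, which is what makes real-time application to video streams straightforward.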

14.
Chu Spaces and Channel Theory are well-established areas of investigation in the general context of category theory when applied to semantically-based information flow. In this Part I of a two-part work, we review a range of related concepts and examples showing how these methods can be applied to logic and computer science, including Formal Concept Analysis, distributed systems and ontology development. We also discuss spatial coarse-graining in relationship to information, and in this direction we establish some basic simplicial and categorical techniques which will supplement the other methods of this Part I when they are applied to characterise visual object identification and the inference of mereological (i.e. part-whole) complexity in Part II.

15.
The spatial properties of gaps have an important influence upon the regeneration dynamics and species composition of forests. However, such properties can be difficult to quantify over large spatial areas using field measurements. This research considers how we conceptualize and define forest canopy gaps from a remote sensing point of view and highlights the inadequacies of passive optical remotely sensed data for delineating gaps. The study employs the analytical functions of a geographical information system to extract gap spatial characteristics from imagery acquired by an active remote sensing device, an airborne light detection and ranging (LiDAR) instrument. These techniques were applied to an area of semi-natural broadleaved deciduous forest in order to map gap size, shape complexity, vegetation height diversity and gap connectivity. A vegetation cover map derived from airborne multispectral scanner imagery was used in combination with the LiDAR data to characterize the dominant vegetation types within gaps. Although the quantification of these gap characteristics alone is insufficient to provide conclusive evidence on specific processes, the paper demonstrates how such information can be indicative of the general status of a forest and can provide new perspectives and possibilities for further ecological research and forest monitoring activities.
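A minimal sketch of how gap delineation from a LiDAR-derived canopy height model might proceed (hypothetical names and thresholds; the study's GIS workflow is considerably richer): cells below a height threshold are grouped into connected components, each a candidate gap whose area can then be measured.

```python
import numpy as np
from scipy import ndimage

def delineate_gaps(chm, height_thresh=2.0, min_cells=4):
    """chm: 2-D canopy height model in metres. Returns a label image and
    (label, area_in_cells) pairs for gaps above a minimum size."""
    labels, n = ndimage.label(chm < height_thresh)   # connected low-canopy regions
    areas = np.bincount(labels.ravel())[1:]          # cell count per gap label
    gaps = [(i + 1, int(a)) for i, a in enumerate(areas) if a >= min_cells]
    return labels, gaps
```

Shape complexity metrics such as a perimeter-to-area ratio could then be computed per labeled gap.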

16.
Although lean production (LP) is usually associated with complexity reduction, it has been increasingly applied in highly complex socio-technical systems (CSS) (e.g. healthcare), in which the complexity level cannot be reduced below a certain (high) threshold. This creates a paradoxical situation that is not well understood in theory and may underlie the frustrating results of many lean implementations. This article presents a systematic literature review of how LP has dealt with complexity, both in theory and in practice, from a complexity science perspective. The review was based on 94 papers, which were analyzed according to seven criteria: how the concept of complexity is being used in lean research; the complexity level of the studied systems; the compatibility between the methodological approach and the nature of complexity; how complexity is managed by LP; barriers to LP in CSS; side-effects of LP in CSS; and whether complexity is always detrimental to LP. A research agenda is also proposed.

17.
Visualization is one of the most effective methods for analyzing how high-dimensional data are distributed. Dimensionality reduction techniques, such as PCA, can be used to map high-dimensional data to a two- or three-dimensional space. In this paper, we propose an algorithm called HyperMap that can be effectively applied to visualization. Our algorithm can be seen as a generalization of FastMap: it preserves FastMap's linear computational complexity and overcomes several of its main shortcomings, especially in visualization. Since there are more than two pivot objects on each axis of the target space, more distance information is preserved in each dimension, and in visualization the number of pivot objects can go beyond the previous limit of six (2 pivot objects × 3 dimensions). The HyperMap algorithm also gives more flexibility to the target space, so that the data distribution can be observed from various viewpoints. Its effectiveness is confirmed by empirical evaluations on both real and synthetic datasets.
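For reference, the classic FastMap projection step that HyperMap generalizes computes each object's coordinate on a pivot axis from pairwise distances via the cosine law. The sketch below is plain two-pivot FastMap, not the multi-pivot HyperMap variant:

```python
import numpy as np

def fastmap_axis(D, a, b):
    """Project every object onto the axis through pivots a and b.
    D: (n, n) symmetric distance matrix; a, b: pivot indices."""
    d_ab = D[a, b]
    # Cosine-law coordinate: x_i = (d(a,i)^2 + d(a,b)^2 - d(b,i)^2) / (2 d(a,b))
    return (D[a, :] ** 2 + d_ab ** 2 - D[b, :] ** 2) / (2.0 * d_ab)
```

Each axis touches every object a constant number of times, which is the source of the linear per-dimension cost that HyperMap preserves.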

18.
Owing to recent advances in proteomics analytical methods and bioinformatics capabilities, there is a growing trend toward using these tools in the development of drugs to treat human disease, including target and drug evaluation, understanding mechanisms of drug action, and biomarker discovery. The genetic sequences of many major organisms are now available, which has helped greatly in characterizing proteomes in model animal systems and humans. Through proteomics, global profiles of different disease states can be characterized (e.g. changes in protein types and relative levels, as well as changes in PTMs such as glycosylation or phosphorylation). Although intracellular proteomics can provide a broad overview of the physiology of cells and tissues, it has been difficult to quantify the low-abundance proteins that can be important for understanding diseased states and treatment progression. For this reason, there is increasing interest in coupling comparative proteomics methods with subcellular fractionation and enrichment techniques for membranes, the nucleus, the phosphoproteome, and the glycoproteome, as well as low-abundance serum proteins. In this review, we provide examples of where different proteomics-coupled enrichment techniques have aided target and biomarker discovery, understanding of drug-targeting mechanisms, and mAb discovery. Taken together, these improvements will help provide a better understanding of the pathophysiology of various diseases, including cancer, autoimmunity, inflammation, cardiovascular disease, and neurological conditions, and aid in the design and development of better medicines for treating these afflictions.

19.
Serum, and the plasma from which it is derived, represent a substantial challenge for proteomics due to their complexity. A landmark plasma proteome study was initiated a decade ago by the Human Proteome Organization (HUPO) with the objective of examining the capabilities of existing technologies. Given the advances in proteomics and the continued interest in the plasma proteome, it is timely to reassess the depth and breadth of analysis of plasma that can be achieved with current methodology and instrumentation. A collaborative project to define the plasma proteome and its variation, with a plan to build a plasma proteome database, would also be timely.

20.
Constraint propagation is one of the techniques central to the success of constraint programming. To reduce search, fast algorithms associated with each constraint prune the domains of variables. With global (or non-binary) constraints, the cost of such propagation may be much greater than the quadratic cost for binary constraints. We therefore study the computational complexity of reasoning with global constraints. We first characterise a number of important questions related to constraint propagation. We show that such questions are intractable in general, and identify dependencies between the tractability and intractability of the different questions. We then demonstrate how the tools of computational complexity can be used in the design and analysis of specific global constraints. In particular, we illustrate how computational complexity can be used to determine when a lesser level of local consistency should be enforced, when constraints can be safely generalized, when decomposing constraints will reduce the amount of pruning, and when combining constraints is tractable.
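To make the consistency-level trade-off concrete (a toy example of my own, not from the paper): even a weak level of propagation for a global constraint such as alldifferent is cheap to enforce, whereas full generalized arc consistency requires a matching-based filter (Régin's algorithm). A naive singleton-pruning propagator:

```python
def propagate_alldifferent(domains):
    """Weak propagation for alldifferent: whenever a variable's domain
    becomes a singleton, prune that value from every other domain.
    domains: list of sets. Returns None on a domain wipeout."""
    changed = True
    while changed:
        changed = False
        for i, d in enumerate(domains):
            if len(d) == 1:
                (v,) = d
                for j, e in enumerate(domains):
                    if j != i and v in e:
                        e.discard(v)
                        if not e:
                            return None   # inconsistent: some domain emptied
                        changed = True
    return domains
```

This enforces far less pruning than GAC but runs in low polynomial time, illustrating why one might deliberately choose a lesser level of local consistency for an expensive global constraint.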
