Similar Documents
1.
Editorial     
From Qian Xuesen [1], to Guan Zhaozhi [2], and finally to Han Jingqing [3], active disturbance rejection control (ADRC) arose from their unwavering conviction that theory must be connected to practice, and that control theory of any practical significance must not simply be a branch of mathematics predicated on accurate mathematical models of physical processes. It is their vision and wisdom that have guided us through difficult times. In particular, Han’s provocative question, “Is this a theory of control or a theory of model?” [3], awakened generations of scholars and compelled them to reexamine the very premise on which modern control theory has been built: what is the object of our study? Is it the control of a physical process, or is it the control of a mathematical model?...

2.
We propose a novel approach, namely local reduction of networks, to extract the global core (GC, for short) from a complex network. The algorithm builds on the small-community phenomenon of networks. The global cores found by our local reduction in classical graphs and benchmarks convince us that the global core of a network is intuitively the supporting graph of the network, "similar to" the original graph; that the global core is small and essential to the global properties of the network; and that the global core, together with the small communities, gives rise to a clear picture of the structure of the network, that is, the galaxy structure of networks. We implement the local reduction to extract the global cores for a series of real networks and run a number of experiments to analyze the roles of the global cores in these networks. For each of the real networks, our experiments show that the found global core is small; that the global core is similar to the original network in the sense that it follows a power-law degree distribution with an exponent close to that of the original network; that the global core is sensitive to errors under both the cascading-failure and physical-attack models, in the sense that a small number of random errors in the global core may cause a major failure of the whole network; and that the global core is a good approximate solution to the r-radius center problem, leading to a galaxy structure of the network.
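
One of the experiments above compares the power-law exponent of the core with that of the original network. The Python sketch below illustrates just that comparison; the k-core it extracts is only a crude stand-in for the paper's local-reduction global core, and networkx plus a synthetic Barabasi-Albert graph are assumed.

```python
# Rough illustration: compare the power-law degree exponents of a network
# and a candidate core.  The k-core here is only a stand-in for the
# paper's local-reduction global core, which is not reproduced.
import numpy as np
import networkx as nx

def degree_exponent(g: nx.Graph) -> float:
    """Least-squares slope of the log-log degree histogram (crude estimate)."""
    degrees = [d for _, d in g.degree() if d > 0]
    values, counts = np.unique(degrees, return_counts=True)
    if len(values) < 2:
        return float("nan")
    slope, _ = np.polyfit(np.log(values), np.log(counts), 1)
    return -slope  # exponent gamma in p(k) ~ k^(-gamma)

g = nx.barabasi_albert_graph(2000, 3)   # stand-in for a real network
core = nx.k_core(g)                     # maximal k-core as a crude "core"
print(f"network exponent ~ {degree_exponent(g):.2f}")
print(f"core exponent    ~ {degree_exponent(core):.2f}")
```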

3.
As an important branch of knowledge discovery, data classification assigns objects to predefined classes. Because evolutionary computation requires no prior assumptions, it shows great vitality in dealing with the imprecise, incomplete, and uncertain information on which traditional statistical classification methods are helpless. This paper presents a classification algorithm based on the cloud model and the genetic algorithm. Experiments show that the algorithm efficiently classifies data sets with continuous attributes.
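
The cloud-model component is not described in the abstract, so the Python sketch below illustrates only the genetic-algorithm half on a toy problem: evolving a single threshold that classifies one-dimensional continuous data. All parameters are illustrative assumptions, not the paper's.

```python
# Illustrative-only GA sketch: evolve a scalar threshold separating two
# classes of 1-D continuous data (the paper's cloud model is omitted).
import random

data = [(random.gauss(0.0, 1.0), 0) for _ in range(200)] + \
       [(random.gauss(3.0, 1.0), 1) for _ in range(200)]

def accuracy(threshold: float) -> float:
    return sum((x > threshold) == bool(y) for x, y in data) / len(data)

population = [random.uniform(-5, 8) for _ in range(30)]
for generation in range(50):
    population.sort(key=accuracy, reverse=True)
    parents = population[:10]                       # truncation selection
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)
        child = (a + b) / 2 + random.gauss(0, 0.3)  # crossover + mutation
        children.append(child)
    population = parents + children

best = max(population, key=accuracy)
print(f"best threshold {best:.2f}, accuracy {accuracy(best):.3f}")
```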

4.
This paper presents a case study in which a new programming technique is applied to an established branch of software development. The purpose of the study was to test whether aspect-oriented programming (AOP) can be used in operating-system development. Instead of a real-world operating system, the educational OS Nachos was used, because Nachos is written in Java, which makes it easy to introduce aspect-oriented techniques. In this paper a new file system for the Nachos OS is developed and then analyzed by profiling and metrics. The results show that it is possible to use AOP in OS development, and that it is also beneficial to do so.
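
The study itself used AspectJ on the Java-based Nachos sources, which the abstract does not show; as a language-neutral illustration of the aspect idea (weaving a cross-cutting concern around core code), here is a minimal Python sketch that uses a decorator as a stand-in for an aspect.

```python
# Minimal illustration of the aspect idea: a cross-cutting "tracing"
# concern is woven around file-system operations without editing them.
# (The actual study used AspectJ on the Java-based Nachos OS.)
import functools
import time

def tracing_aspect(func):
    """Advice that runs before and after the join point `func`."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        print(f"enter {func.__name__}")
        try:
            return func(*args, **kwargs)
        finally:
            print(f"exit  {func.__name__} after "
                  f"{(time.perf_counter() - start) * 1e6:.0f} us")
    return wrapper

@tracing_aspect
def create_file(name: str) -> dict:
    return {"name": name, "blocks": []}

create_file("readme.txt")
```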

5.
The processing breakdown involved in the garden-path phenomenon is a special phenomenon in which the processor has to backtrack on its early understanding and build another path in order to process the input successfully. From the semantic viewpoint of a Natural Language Expert System, this paper puts forward a particular schema by which the processing of such breakdowns can be shown clearly, and raises a special matching pattern to demonstrate the activity of the noun-verb category. Based on an analysis of the interactive representation of a word that has both noun and verb definitions, it is verified that a Natural Language Expert System can resolve the misreading of garden-path sentences only when its semantic database is activated.
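
As a toy illustration of the backtracking just described, the Python sketch below processes a classic garden-path word sequence: it first tries the preferred adjective/noun reading of each ambiguous word, reaches a dead end, and backtracks to the noun-verb reading. The tiny lexicon and grammar are invented for illustration and are not the paper's schema.

```python
# Toy garden-path demo: choose a part of speech for each ambiguous word,
# backtracking when no sentence reading S -> NP VERB NP succeeds.
LEXICON = {
    "the": ["DET"], "old": ["ADJ", "NOUN"],
    "man": ["NOUN", "VERB"], "boats": ["NOUN"],
}

def parse_np(tags, i):
    """NP -> DET ADJ? NOUN ; returns index after the NP or None."""
    if i < len(tags) and tags[i] == "DET":
        j = i + 1
        if j < len(tags) and tags[j] == "ADJ":
            j += 1
        if j < len(tags) and tags[j] == "NOUN":
            return j + 1
    return None

def parse_s(tags):
    """S -> NP VERB NP"""
    i = parse_np(tags, 0)
    if i is not None and i < len(tags) and tags[i] == "VERB":
        return parse_np(tags, i + 1) == len(tags)
    return False

def readings(words, tags=()):
    if len(tags) == len(words):
        yield list(tags)
        return
    for tag in LEXICON[words[len(tags)]]:   # preferred reading first
        yield from readings(words, tags + (tag,))

words = "the old man the boats".split()
for tags in readings(words):                # early failures = garden path
    print(tags, "->", "parse OK" if parse_s(tags) else "dead end, backtrack")
```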

6.
As the web grows, the massive increase in information is placing severe burdens on information retrieval and sharing. Automated search engines and directories with small editorial staffs are unable to keep up with the increasing submission of web sites. To address the problem, this paper presents Infomarker, an Internet information service system based on the Open Directory and Zero-Keyword Inquiry. The Open Directory sets up a net community in which the growing number of netizens can each organize a small portion of the web and present it to the others. By means of Zero-Keyword Inquiry, a user can get the information he is interested in without entering any keyword of the kind search engines usually require. In Infomarker, a user can record the web addresses he likes and can put forward an information request based on his web records. The information-matching engine checks the information in the Open Directory to find what fits the user's needs and adds it to the user's web-address records. The key to the matching process is layered keyword mapping. Infomarker provides people with a whole new approach to getting information and shows wide prospects.

7.
The UNIX Localization and Chinese Information Processing System
To facilitate the wider use of computers all over the world, it is necessary to provide national language support in computer systems. This paper introduces some aspects of the design and implementation of the UNIX-based Chinese Information Processing System (CIPS). Due to the special nature of the Oriental languages, and in order to share resources and exchange information between different countries, it is necessary to create a standard multilingual information-exchange code. The unified Chinese/Japanese/Korean character code, the Han Character Collection (HCC), was proposed to ISO/IEC JTC1/SC2/WG2 by the China Computer and Information Processing Standardization Technical Committee. Based on this character set and the corresponding coding system, it is possible to create a truly internationalized UNIX system.

8.
The growth of small errors in weather prediction is exponential on average. As an error becomes larger, its growth slows down and then stops, with the magnitude of the error saturating at about the average distance between two randomly chosen states. This paper studies error growth in a low-dimensional atmospheric model before, during, and after the initial exponential divergence. We test cubic, quartic, and logarithmic hypotheses with the ensemble prediction method, and compare them with the quadratic hypothesis suggested by Lorenz in 1969. The study shows that small error growth is best modeled by the quadratic hypothesis; after the error exceeds about half of the average value of the variables, the logarithmic approximation becomes superior. It is also shown that the duration of the exponential growth in the model data is a function of the size of the small initial error and the largest Lyapunov exponent, and that the size of the error at the least upper bound (supremum) of this duration equals 1, invariant to these variables. Predictability, as the time interval over which the model error grows, is, for a small initial error, the sum of the least upper bound of the interval of exponential growth and the predictability for an initial error of size 1.
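
The quadratic hypothesis mentioned above is commonly written as the logistic law dE/dt = aE(1 - E/E_s). The Python sketch below integrates its closed-form solution and fits the parameters to synthetic error-growth data; all numbers are invented for illustration.

```python
# Illustrative fit of the quadratic (logistic) error-growth law
#   dE/dt = a * E * (1 - E / E_sat)
# to synthetic error-growth data; all numbers are made up.
import numpy as np
from scipy.optimize import curve_fit

def logistic_error(t, a, e_sat, e0):
    """Closed-form solution of the quadratic growth hypothesis."""
    return e_sat / (1 + (e_sat / e0 - 1) * np.exp(-a * t))

t = np.linspace(0, 20, 200)
truth = logistic_error(t, a=0.9, e_sat=2.0, e0=1e-3)
noisy = truth * (1 + 0.05 * np.random.default_rng(0).standard_normal(t.size))

(a, e_sat, e0), _ = curve_fit(logistic_error, t, noisy, p0=(0.5, 1.0, 1e-2))
print(f"fitted a={a:.2f}, E_sat={e_sat:.2f}, E0={e0:.1e}")
```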

9.
Motivated by proxy signatures and blind signatures for secure communications, a batch signature is proposed to create a novel quantum cryptosystem. It is based on a three-dimensional two-particle-entangled quantum system, which is used to distribute the quantum keys and to create strings of quantum trits (qutrits) for the messages. All messages to be signed are encrypted with the private key of the message owner during communication. Unlike the classical blind signature, the present scheme simultaneously provides authenticity verification of signatures and an arbitrator's efficient batch proxy signature. Analysis of security and efficiency shows that the scheme achieves a large number of quantum blind signatures for large quantities of messages with high efficiency, thanks to the arbitrator's secure batch proxy blind signature.

10.
This paper investigates the problem of global attitude regulation control for a rigid spacecraft under input saturation. Based on finite-time control techniques and the switching control method, a novel globally bounded finite-time attitude regulation controller is proposed. Under the proposed controller, it is shown that the spacecraft attitude reaches the desired attitude in finite time. In addition, the bound of the proposed attitude controller can be adjusted to any small level to accommodate the actuation bound in practical implementations.
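
The abstract does not give the controller, so the Python sketch below is only a loose illustration of bounded finite-time-style regulation: a single-axis rigid body under a saturated homogeneous state feedback. Gains, exponents, and the torque bound are invented, and this is not the paper's switching controller.

```python
# Loose illustration (not the paper's controller): single-axis attitude
# regulation with a saturated finite-time-style feedback
#   u = -sat(k1*sig(th)^a + k2*sig(w)^(2a/(1+a))),  sig(x)^p = |x|^p sign(x)
import math

def sig(x: float, p: float) -> float:
    return math.copysign(abs(x) ** p, x)

def sat(u: float, u_max: float) -> float:
    return max(-u_max, min(u_max, u))

J, u_max, a = 1.0, 0.2, 0.6          # inertia, torque bound, exponent
k1, k2 = 1.0, 1.5
theta, omega, dt = 1.0, 0.0, 1e-3    # initial attitude error (rad)

for step in range(int(30 / dt)):
    u = -sat(k1 * sig(theta, a) + k2 * sig(omega, 2 * a / (1 + a)), u_max)
    omega += (u / J) * dt             # rigid-body dynamics, Euler step
    theta += omega * dt
    if step % 5000 == 0:
        print(f"t={step * dt:5.1f}s  theta={theta:+.4f}  omega={omega:+.4f}")
```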

11.
It is a well-known fact that test power consumption may exceed that of functional operation. Leakage power dissipation, caused by leakage current in Complementary Metal-Oxide-Semiconductor (CMOS) circuits during test, has become a significant part of the total power dissipation. Hence, it is important to reduce leakage power to prolong battery life in portable systems that employ periodic self-test, to increase test reliability, and to reduce test cost. This paper analyzes leakage current and presents a leakage current simulator based on the transistor stacking effect. Using it, we propose techniques based on the don't-care bits (denoted by Xs) in test vectors to optimize leakage current in integrated circuit (IC) test with a genetic algorithm. The techniques identify a set of don't-care inputs in given test vectors and reassign specified logic values to the X inputs by the genetic algorithm to obtain the minimum leakage vector (MLV). Experimental results indicate that the techniques effectively optimize the leakage current of combinational and sequential circuits during test while maintaining high fault coverage.
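
The paper's stacking-effect simulator cannot be reconstructed from the abstract; the Python sketch below only illustrates the X-filling idea with a toy genetic algorithm that assigns don't-care bits to minimize a made-up leakage cost rewarding stacked 0s. The circuit, cost model, and GA settings are all invented.

```python
# Toy GA for "X-filling" (illustrative only): assign the don't-care bits
# of a test vector so a made-up stacking-effect leakage cost is low.
# A real flow would call a transistor-level leakage simulator instead.
import random

TEST_VECTOR = list("1X0XX10X1XXX")                       # X = don't care
GATES = [(0, 1, 2), (3, 4, 5), (6, 7, 8), (9, 10, 11)]   # toy gate inputs

def leakage(bits):
    total = 0.0
    for gate in GATES:
        zeros = sum(bits[i] == '0' for i in gate)
        total += 1.0 / (2 ** zeros)      # stacked 0s cut subthreshold leakage
    return total

def fill(assignment):
    it = iter(assignment)
    return [b if b != 'X' else next(it) for b in TEST_VECTOR]

n_x = TEST_VECTOR.count('X')
pop = [[random.choice('01') for _ in range(n_x)] for _ in range(20)]
for _ in range(60):
    pop.sort(key=lambda a: leakage(fill(a)))             # lower is better
    elite = pop[:6]
    pop = elite + [[(g if random.random() > 0.1 else random.choice('01'))
                    for g in random.choice(elite)] for _ in range(14)]

best = min(pop, key=lambda a: leakage(fill(a)))
print("MLV:", "".join(fill(best)), " leakage:", round(leakage(fill(best)), 3))
```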

12.
Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of (some of) the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. The problem arose originally from the geneticists' need to filter their input data of erroneous information, and it is well motivated from both a biological and a sociological viewpoint. This paper shows that consistency checking is NP-complete, even when focusing on a single gene with three alleles. Several other results on the computational complexity of related problems from genetics are also offered. In particular, it is shown that checking the consistency of pedigrees over two alleles, and of pedigrees without loops, can be done in polynomial time.
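
As a concrete instance of the polynomial-time case mentioned at the end (two alleles, no loops), the Python sketch below checks a single parents-child trio for Mendelian consistency; the genotype encoding is an illustrative choice.

```python
# Mendelian consistency for a two-allele gene: a child's genotype must
# combine one allele from the mother with one from the father.
from itertools import product

def consistent(child: tuple, mother: tuple, father: tuple) -> bool:
    """True iff `child` can be formed from one maternal and one paternal allele."""
    return any(sorted((m, f)) == sorted(child)
               for m, f in product(mother, father))

print(consistent(("a", "A"), ("A", "A"), ("a", "a")))  # True: Aa from AA x aa
print(consistent(("a", "a"), ("A", "A"), ("a", "a")))  # False: aa impossible
```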

13.
In today's process-centered business organizations, it is imperative that enterprise information systems be converted from task-centered to process-centered systems. However, traditional software development methodology is function-oriented: each function manages its own data, which results in redundancy because data belonging to one object are stored by several functions. Proposed in this paper is a process-driven software development methodology in which the business process is the major concern and workflow functionalities are identified and specified throughout the entire development life cycle. In the proposed methodology, the development process, modeling tools, and deliverables are specified explicitly. The methodology can serve as a guideline for practitioners involved in enterprise software development, of which workflow is an essential part.

14.
This paper presents a tutorial-style review of recent results on the disturbance observer (DOB) from the viewpoint of robust stabilization and recovery of the nominal performance. The analysis considers the case where the bandwidth of the Q-filter is large, and it is explained in a pedagogical manner that, even in the presence of plant uncertainties and disturbances, the behavior of the real uncertain plant can be made almost identical to that of the disturbance-free nominal system, both in the transient and in the steady state. The conventional DOB is interpreted from a new perspective, and its restrictions and extensions are discussed.
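
The Q-filter construction is not spelled out in the abstract; the Python sketch below simulates a textbook-style reduced-order disturbance observer on a first-order plant, where the observer gain g plays the role of the Q-filter bandwidth. The plant numbers and disturbance are invented.

```python
# Textbook-style reduced-order DOB on  x' = a x + b (u + d):
#   d_hat = z + (g/b) x,   z' = -g d_hat - (g/b)(a x + b u)
# which gives  d_hat' = g (d - d_hat): larger g tracks d faster.
import math

a, b, g = -1.0, 2.0, 20.0          # plant and observer gain (invented)
x, z, dt = 0.0, 0.0, 1e-4

for k in range(int(2.0 / dt)):
    t = k * dt
    d = 1.0 + 0.5 * math.sin(5 * t)         # unknown disturbance
    d_hat = z + (g / b) * x
    u = -d_hat                              # cancel the estimated disturbance
    z += (-g * d_hat - (g / b) * (a * x + b * u)) * dt
    x += (a * x + b * (u + d)) * dt
    if k % 5000 == 0:
        print(f"t={t:4.1f}  d={d:+.3f}  d_hat={d_hat:+.3f}  x={x:+.4f}")
```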

15.
Personal computers and related technologies have been widely used by artists and musicians to create and record their own music and electroacoustic compositions. "Laptoppers" are famous for using their laptops for their dance/electronic beats and music. A genre that has not relied on PCs for production is rock/heavy metal, since bands of these genres usually book recording-studio time where professionals handle the production using expensive equipment. This study shows that, with the software and hardware currently available, it is possible for rock/metal artists to use their PCs to record and produce their own CD successfully and at an extremely competitive cost. The efforts of a rock band that does just this are followed from the beginning, and the results of its CD production and song successes are presented. The article also serves as a "how-to" guide that bands on a low budget can follow to make good-quality demo CDs and enter the music business.

16.
Method of Direct Texture Synthesis on Arbitrary Surfaces
A direct texture synthesis method on arbitrary surfaces is proposed in this paper. The idea is to recursively map triangles on the surface to texture space until the surface is completely mapped. First, the surface is simplified and a tangential vector field is created over the simplified mesh. Then the mapping process searches for the optimal texture coordinates in the texture sample for each triangle, and the textures of neighboring triangles are blended on the mesh. All synthesized texture triangles are compressed into an atlas. Finally, the simplified mesh is subdivided to approximate the initial surface. The algorithm has several advantages over former methods: it synthesizes texture on a surface without local parameterization, it does not need to partition the surface into patches, and it does not require a particular texture sample. The results demonstrate that the new algorithm is applicable to a wide variety of texture samples and any triangulated surface.

17.
NAREGI is a five-year Japanese national grid project (2003-2007) whose chief aim is to develop a set of grid middleware to serve as a basis for future e-Science. NAREGI also aims to lead the way in the standardization of grid middleware based on the OGSA architecture. Its super-scheduler follows the proposed OGSA-EMS architecture, and it is the first working implementation of the documented component relationships in the OGSA-EMS architecture document v1.0. The design and implementation experience confirms that the documented OGSA-EMS architecture is quite feasible, but that it will require a significant amount of refinement and speed improvement before its detailed specifications can be finalized. The super-scheduler also supports co-allocation across multiple sites to enable automated execution of grid-based MPI jobs that run across machines. Such resource allocation requires sophisticated interactions between the OGSA-EMS components that are not covered in the current OGSA-EMS architecture, some of which are non-trivial. Overall, job scheduling with OGSA-EMS has proven not only to work, but also to keep job allocation and execution times within reasonable bounds.

18.
Life science research aims to continuously improve the quality and standard of human life. One of the major challenges in this area is to maintain food safety and security. A number of image processing techniques have been used to investigate the quality of food products. In this paper, we propose a new algorithm to effectively segment connected grains so that each of them can be inspected in a later processing stage. One family of existing segmentation methods is based on the idea of watersheding, and it has shown promising results in practice. However, due to over-segmentation, this technique performs poorly in various settings, such as inhomogeneous backgrounds and connected targets. To solve this problem, we present a combination of two classical techniques. In the first step, a mean shift filter is used to eliminate the inhomogeneous background, with entropy used as the convergence criterion. Secondly, a color gradient algorithm is used to detect the most significant edges, and a marker-controlled watershed transform segments the cluttered objects out of the previous processing stages. The proposed framework compromises among execution time, usability, efficiency, and segmentation outcome in analyzing ring die pellets. The experimental results demonstrate that the proposed approach is effective and robust.
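
The paper's parameter choices are not given in the abstract; the OpenCV sketch below (Python) wires together the described stages — mean-shift filtering, Otsu binarization as a simple stand-in for the color-gradient step, marker extraction, and marker-controlled watershed — with illustrative values, and "grains.png" is a placeholder path.

```python
# Illustrative marker-controlled watershed pipeline for touching grains
# (parameters and the input path are placeholders, not the paper's values).
import cv2
import numpy as np

img = cv2.imread("grains.png")                         # placeholder path
smooth = cv2.pyrMeanShiftFiltering(img, sp=15, sr=30)  # homogenize background
gray = cv2.cvtColor(smooth, cv2.COLOR_BGR2GRAY)
_, binary = cv2.threshold(gray, 0, 255,
                          cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Sure-foreground markers from the distance transform of the binary mask.
dist = cv2.distanceTransform(binary, cv2.DIST_L2, 5)
_, sure_fg = cv2.threshold(dist, 0.5 * dist.max(), 255, 0)
n_markers, markers = cv2.connectedComponents(sure_fg.astype(np.uint8))

# Marker-controlled watershed splits the connected grains.
markers = cv2.watershed(smooth, markers + 1)           # boundaries become -1
print(f"segmented roughly {n_markers - 1} grains")
```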

19.
In this paper, we propose a new hard problem, called bilateral inhomogeneous small integer solution (Bi-ISIS), which can be seen as an extension of the small integer solution (SIS) problem on lattices. The main idea is that, instead of choosing a rectangular matrix, we choose a square matrix with small rank to generate the Bi-ISIS problem without affecting the hardness of the underlying SIS problem. Based on this new problem, we present two hardness assumptions: the computational and decisional Bi-ISIS problems. As a direct application, we construct a new lattice-based key exchange (KE) protocol, which is analogous to the classic Diffie-Hellman KE protocol. We prove the security of this protocol and show that it offers security based on the worst-case hardness of lattice problems, relatively efficient implementation, and great simplicity.
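
The abstract gives only the outline of the protocol; the numpy sketch below mimics the Diffie-Hellman-like flow it describes: a public low-rank square matrix A, a small secret vector on each side, and agreement on the value x^T A y. Dimensions, the modulus, and the sampling are invented, and the toy omits everything a real KE needs (reconciliation, hashing, security parameters).

```python
# Toy sketch of a Bi-ISIS-style key exchange flow (illustrative only):
# public low-rank square A; Alice holds small x, Bob holds small y;
# both sides compute the same value x^T A y (mod q).
import numpy as np

rng = np.random.default_rng(1)
q, n, r = 257, 16, 4                      # modulus, dimension, rank (invented)

# Low-rank public matrix A = B C mod q with B: n x r, C: r x n.
A = (rng.integers(0, q, (n, r)) @ rng.integers(0, q, (r, n))) % q

x = rng.integers(-2, 3, n)                # Alice's small secret
y = rng.integers(-2, 3, n)                # Bob's small secret

p_alice = (x @ A) % q                     # Alice -> Bob : x^T A
p_bob = (A @ y) % q                       # Bob -> Alice : A y

k_alice = int(x @ p_bob % q)              # x^T (A y)
k_bob = int(p_alice @ y % q)              # (x^T A) y
assert k_alice == k_bob
print("shared value:", k_alice)
```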
