Similar Literature
20 similar documents found (search time: 15 ms)
1.
This paper describes the basic concepts of the Image Interchange Format (IIF) for the first International Image Processing and Interchange Standard (IPI), which is under development by ISO/IEC JTC1/SC24 (International Organization for Standardization / International Electrotechnical Commission, Joint Technical Committee 1, Subcommittee 24), the “information processing”/“computer graphics” committee. Starting with a discussion of existing image formats and current image interchange practices, this study outlines the need for a new approach to a general image interchange format. A requirements list and corresponding design goals for the IIF are presented. Finally, the relation to the other parts of the IPI standard is described. The authors work on and contribute to the relevant committees within ISO/IEC and DIN (Deutsches Institut für Normung, the German Institute for Standardization).

2.
In the late 1980s, traditional standards development organisations (SDOs) were moving toward creating anticipatory standards as a way of coping with the fast growth of new technology in the computing industry. The development of anticipatory standards (standards developed ahead of the technology) was seen as a possible way for the formal standards bodies to keep abreast of these rapid changes. By creating standards ahead of the technology, the standards would act as “change agents” and guide the market. Anticipatory standards were seen as one way of addressing the problem of arriving at suboptimal de facto standards: if the industry can be guided before the technology develops, this will encourage the use of optimal products. This paper considers the diffusion pattern of the ISO/IEC Information Resource Dictionary System (IRDS) Framework standard, which fits into the category of an anticipatory standard. Comparisons are made between the diffusion patterns of the ISO/IEC IRDS standard and the ISO/IEC Open Systems Interconnection (OSI) Reference Model, as both were anticipatory in nature, both are framework/reference standards, both originated at approximately the same time, and both were developed in traditional standards development organisations.

3.
The first half is a tutorial on orderings, lattices, Boolean algebras, operators on Boolean algebras, Tarski's fixed point theorem, and relation algebras.

In the second half, elements of a complete relation algebra are used as “meanings” for program statements. The use of relation algebras for this purpose was pioneered by de Bakker and de Roever in [10–12]. For a class of programming languages with program schemes, single μ-recursion, while-statements, if-then-else, sequential composition, and nondeterministic choice, a definition of “correct interpretation” is given which properly reflects the intuitive (or operational) meanings of the program constructs. A correct interpretation includes, for each program statement, an element serving as “input/output relation” and a domain element specifying that statement's “domain of nontermination”. The derivative of Hitchcock and Park [17] is defined, and a relation-algebraic version of de Bakker's extension [8, 9] of the Hitchcock–Park theorem is proved. The predicate transformers wps(-) and wlps(-) are defined and shown to obey all the standard laws in [15]. The “law of the excluded miracle” is shown to hold for an entire language if it holds for that language's basic statements (assignment statements and so on). Determinism is defined and characterized for all the program constructs. A relation-algebraic version of the invariance theorem for while-statements is given. An alternative definition of interpretation, called “demonic”, is obtained by using “demonic union” in place of ordinary union and “demonic composition” in place of ordinary relational composition. Such interpretations are shown to arise naturally from a special class of correct interpretations, and to obey the laws of wps(-).
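
For readers unfamiliar with the demonic operators mentioned in this abstract, one common set-theoretic formulation (a sketch of the usual definitions, not necessarily the exact relation-algebraic form used in the paper) is:

    R \mathrel{;;} S \;=\; \{\, (a,c) \mid \exists b\, ((a,b) \in R \land (b,c) \in S) \;\land\; \forall b\, ((a,b) \in R \rightarrow b \in \operatorname{dom} S) \,\}

    R \sqcup S \;=\; \{\, (a,b) \in R \cup S \mid a \in \operatorname{dom} R \cap \operatorname{dom} S \,\}

Demonic composition discards any start state from which some execution path could escape into nontermination, and demonic union is defined only on start states for which both alternatives guarantee termination; this is the intuition behind why interpretations built from these operators obey the wps(-) laws cited above.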


4.
We discuss the usage of the (non-OO) Z specification language to represent some fundamental concepts of the ISO Reference Model of Open Distributed Processing (RM-ODP) and the ISO General Relationship Model (GRM). After discussing some of the difficulties involved, we offer suggestions on how Z can be used successfully for specifying and modeling open object-based distributed systems. Although fundamental specification concepts of RM-ODP and GRM are less well-known than “traditional” mathematics, we propose an RM-ODP toolkit somewhat analogous in its usage to the well-known mathematical toolkit in Z.

5.
The project was started in December 1984 to define the data records necessary to handle standard parts on CAD systems and workstations.

The data records are based on the article characteristics documented in the German standard DIN 4000. The aim is to add information on the logical structure of the standard parts taken from the tables in the “Product Standards”.

During the first year of work it was noted that users of the data can achieve a more efficient implementation if corresponding standardized software is available to produce the selected graphic and model representations in the CAD systems. This is to be realized in close cooperation with the German committee NAM 96.4 (Standardization Committee for Manufacturing of Machines) and the corresponding ISO/TC 184/SC 4 (Industrial Automation, External Representation of Product Definition Data), as well as with vendors and users of CAD systems.

The software will be realized in FORTRAN; the relevant FORTRAN extension has been documented in the prestandard DIN V 66304.

The results of the project offered by DIN shall be:

1. standardized data records and software for the most commonly used standard parts
2. tested and up-to-date files and subroutine libraries
3. DIN certificates in a neutral form, i.e., independent of the features of specific CAD systems.

6.
7.
We stabilize the unstable “shock-like” equilibrium profiles of the viscous Burgers equation using control at the boundaries. These equilibria are not stabilizable (even locally) using the standard “radiation feedback boundary conditions.” Using a nonlinear spatially-scaled transformation (that employs three ingredients, of which one is the Hopf–Cole nonlinear integral transformation) and linear backstepping, we design an explicit nonlinear full-state control law that achieves exponential stability, with a region of attraction for which we give an estimate. The region of attraction is not the entire state space since the Burgers PDE is known not to be globally controllable.
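
For reference, the viscous Burgers equation and the classical Hopf–Cole substitution (one ingredient of the transformation mentioned above) take the standard forms below; the notation is generic and not taken from the paper, whose transformation adds further spatial scaling on top of it:

    u_t + u\,u_x = \epsilon\,u_{xx}, \qquad u = -2\epsilon\,\frac{\psi_x}{\psi} \;\Longrightarrow\; \psi_t = \epsilon\,\psi_{xx},

so the nonlinear PDE in u is mapped to the linear heat equation in ψ, which is one reason a linear backstepping step can then be applied.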

8.
The definition of sustainability which is generally adopted is: “meeting the needs of the present generation without compromising the ability of future generations to meet their own needs” (World Commission on Environment and Development, 1987, the Brundtland Report). The EU MDG7 report describes environmental sustainability as: “…meeting current human needs without undermining the capacity of the environment to provide for those needs over the long term…” (UN, 2005a). Over the past decade public concern about sustainable development has profoundly transformed attitudes and, to a lesser extent, practices in manufacturing industries. A sustainable approach to design and engineering involves evaluating where a product or system has the greatest environmental impact and then prioritising strategies which reduce that impact. There is hardly any industry sector in which the management of environmental sustainability is not of significant relevance. The degradation of pristine ecosystems, global warming, and unprecedented energy usage have become key issues for all of Earth's ‘tenants’. It is essential that all facets of design and manufacturing take action on environmental sustainability concerns through appropriate strategies, endeavour to implement standards such as ISO 14001, and accommodate related legislation as a foundation for sustainable manufacturing. This paper discusses the sustainability challenges of the industrial world, the sustainable management issues they face, and the strategies they might employ, while maintaining corporate responsibility and gaining competitive advantage.

9.
10.
This comment letter points out that the essence of the recently proposed “extreme learning machine (ELM)” was presented earlier by Broomhead and Lowe and by Pao, and has been discussed by other authors. Hence, it is not necessary to introduce a new name, “ELM.”

11.
A method of die-life prediction is suggested for cup-shaped forgings. The authors theorize that forging specialists can predict die life by comparing a “target” forging process with other standard processes whose actual die lives are known. The comparison is made by calculating risk (that which shortens die-life span). Risk is estimated using a risk tree network based on information compiled from a survey of forging experts. The risk rate of an “end node” is estimated by a computer-aided forging process planning system; comparing the dimensions of the target and standard forging processes affects the risk rate. Once the risk is determined, it is used to predict die-life span using fuzzy inference. The fuzzy inference rules are estimated from data gathered in interviews with experts.
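
As a rough illustration of how fuzzy inference can map an estimated risk rate to a predicted die life, here is a minimal, zero-order Sugeno-style sketch in Python; the membership functions, rule consequents, and numbers are hypothetical and are not taken from the paper, whose rules come from expert interviews:

# Minimal Sugeno-style fuzzy inference sketch: risk rate -> predicted die life.
# Membership functions and rule consequents are hypothetical illustrations.

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def predict_die_life(risk):
    """risk in [0, 1]; returns a predicted die life in number of forged parts."""
    # Rule base: IF risk is LOW/MEDIUM/HIGH THEN die life is a crisp consequent.
    rules = [
        (tri(risk, -0.01, 0.0, 0.5), 50_000),   # low risk    -> long die life
        (tri(risk, 0.0, 0.5, 1.0),   20_000),   # medium risk -> moderate die life
        (tri(risk, 0.5, 1.0, 1.01),   5_000),   # high risk   -> short die life
    ]
    num = sum(w * y for w, y in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0

if __name__ == "__main__":
    for r in (0.1, 0.5, 0.9):
        print(f"risk={r:.1f} -> predicted die life ~ {predict_die_life(r):,.0f} parts")

The weighted average of rule consequents stands in for the defuzzification step; a Mamdani-style system with fuzzy consequents, as expert-derived rule bases often use, would differ only in that final step.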

12.
Constrained multibody system dynamics: an automated approach
The governing equations for constrained multibody systems are formulated in a manner suitable for their automated, numerical development and solution. Specifically, the “closed loop” problem of multibody chain systems is addressed.

The governing equations are developed by modifying dynamical equations obtained from Lagrange's form of d'Alembert's principle. This modification, which is based upon a solution of the constraint equations obtained through a “zero eigenvalues theorem,” is, in effect, a contraction of the dynamical equations.

It is observed that, for a system with n generalized coordinates and m constraint equations, the coefficients in the constraint equations may be viewed as “constraint vectors” in n-dimensional space. In this setting the system itself is free to move in the n − m directions which are “orthogonal” to the constraint vectors.
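
In generic notation (not taken from the paper), linear velocity constraints and the admissible motion directions can be written as:

    B(q, t)\,\dot{q} + b(q, t) = 0, \qquad B \in \mathbb{R}^{m \times n},
    \qquad
    \dot{q} = \dot{q}_p + \sum_{k=1}^{n-m} \gamma_k\, c_k \quad \text{with} \quad B\,c_k = 0,

so the m rows of B play the role of the “constraint vectors”, and the c_k span the (n − m)-dimensional space of directions “orthogonal” to them.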


13.
Dongmei, Ramiro, Luigi. Computer Communications, 2006, 29(18): 3766–3779
This paper discusses issues of personalization of presence services in the context of Internet telephony. Such services take into consideration the willingness and ability of a user to communicate in a network, as well as possibly other factors such as time, address, etc. Via a three-layer service architecture for communications in the Session Initiation Protocol (SIP) standard, presence system basic services and personalized services (personal policies) are clearly separated and discussed. To enrich presence-related services, presence information is illustratively extended from the well-known “online” and “offline” indicators to a much broader meaning that includes “location”, “lineStatus”, “role”, “availability”, etc. Based on this, the Call Processing Language (CPL) is extended in order to describe presence-related personalized services for both call processing systems and presence systems, using information such as a person's presence status, time, address, language, or any combination of these. A web-based system is designed and implemented to simulate these advanced services. In the implementation, personal policies are programmed by end users via a graphical user interface (GUI) and are automatically translated into extended CPL. The simulation system clearly displays when, where, and what CPL policies should be used for the provision of personalized presence services and call processing services. Policy conflicts are also addressed by setting policy priorities in the system.

14.
This paper proposes a novel control approach that incorporates a hybrid game strategy in Markov-game-based fuzzy control. Specifically, we aim at designing a “safe and universally consistent” controller that exhibits an ability to maintain performance against large disturbance and environment variations. The proposed hybrid control is a convex combination (based on experiential information) of a “variation of cautious fictitious play” approach and the “minimax” control approach, implemented on a fuzzy Markov game platform. We show analytical convergence of Markov-game-based control in the presence of bounded external disturbances, and extend the analysis to show convergence of the proposed Markov-game-based hybrid control approach. Controller simulation and comparison against baseline Markov game fuzzy control and fuzzy Q-learning control on a highly nonlinear two-link robot bring out the superiority of the approach in handling severe environment and disturbance variations over different desired trajectories. This paper illustrates the possibility of obtaining “universal consistency,” i.e., reasonable performance against severe environment and disturbance variations, by hybridizing “cautious fictitious play” with “minimax” approaches in Markov-game-based control.
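
In generic Markov-game notation (not the paper's exact formulation), the minimax component solves a matrix game at each state s, and the hybrid controller mixes the two component policies convexly:

    V(s) = \max_{\pi(\cdot \mid s)} \min_{o} \sum_{a} \pi(a \mid s)\, Q(s, a, o),
    \qquad
    \pi_{\text{hybrid}} = \lambda\, \pi_{\text{minimax}} + (1 - \lambda)\, \pi_{\text{fictitious}}, \quad \lambda \in [0, 1],

where o ranges over the opponent's (here, the disturbance's) actions and the mixing weight λ is adapted from experiential information, per the abstract.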

15.
An interactive program has been developed which simulates several representative industrial processes. Specifically, the program generates product quality characteristic values which are concurrently monitored by standard control charting methods. The program requires the user to specify initial process parameter values and subsequent process adjustments; the latter are necessary in the event the process is deemed to be “out of control”. The effectiveness of these decisions is measured by economic criteria. The use of the software promotes a “hands-on” approach, which will better prepare students to achieve quality improvements in an industrial environment through systematic and scientific evaluation.
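
As a small illustration of the kind of monitoring involved, here is a minimal Shewhart X-bar chart check in Python; it is a generic sketch of one “standard control charting method”, with hypothetical target, sigma, and subgroup data rather than anything from the actual software:

# Minimal Shewhart X-bar control chart sketch with hypothetical data.
# Flags a subgroup as "out of control" when its mean falls outside 3-sigma limits.
import random
import statistics

def xbar_limits(center, sigma, n):
    """3-sigma control limits for means of subgroups of size n."""
    half_width = 3 * sigma / n ** 0.5
    return center - half_width, center + half_width

def monitor(subgroups, center, sigma):
    lcl, ucl = xbar_limits(center, sigma, len(subgroups[0]))
    for i, sg in enumerate(subgroups, start=1):
        mean = statistics.mean(sg)
        state = "out of control" if not (lcl <= mean <= ucl) else "in control"
        print(f"subgroup {i}: mean={mean:.2f}  limits=[{lcl:.2f}, {ucl:.2f}]  {state}")

if __name__ == "__main__":
    random.seed(0)
    target, sigma, n = 10.0, 0.5, 5
    # Simulated process: a mean shift is introduced after the 10th subgroup.
    data = [[random.gauss(target + (0.8 if k >= 10 else 0.0), sigma) for _ in range(n)]
            for k in range(15)]
    monitor(data, target, sigma)

In the simulator described above, a flagged subgroup is the point at which the user must decide on a process adjustment, and the economics of that decision are then scored.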

16.
A simple, moderately accurate, atmospheric correction algorithm for SeaWiFS
We present a simple modification to the standard coastal zone color scanner (CZCS) atmospheric correction algorithm for application to the Sea-viewing Wide Field-of-view Sensor (SeaWiFS). The modification reduces the error in the water-leaving reflectance obtained with the standard algorithm by a factor of 2–6 when the aerosol behaves as predicted by the LOWTRAN-6 models. For many aerosol models likely to approximate aerosol properties over the oceans, the error in the retrieved water-leaving reflectance is predicted to be < ±0.002 at 443 nm for an aerosol load approximately 2 to 3 times that normally occurring in a maritime atmosphere. These errors in atmospheric correction lead to an error in the pigment concentration (C), retrieved using the blue-green ratio algorithm, of < 50% for more than 75% of the aerosol models tested, whenever the algorithm retrieves a “reasonable” pigment concentration and when 0.1 ≤ C ≤ 1 mg/m³. This accuracy may be sufficient for some applications, for example, at-sea processing to guide ships to desirable sampling locations. An important feature of this algorithm is that, unlike more sophisticated and computationally intensive algorithms, aerosol models are not required to effect the actual atmospheric correction. Investigators who already have the CZCS algorithm implemented on an image processing system should be able to process SeaWiFS imagery by making very simple modifications to the code.
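
For context, blue-green ratio pigment algorithms of the kind referred to above generally take a power-law form in the ratio of water-leaving radiances (or reflectances); the coefficients A and B below are empirical placeholders, not values from this paper:

    C = A \left[ \frac{L_w(\lambda_{\text{blue}})}{L_w(\lambda_{\text{green}})} \right]^{-B}, \qquad A, B > 0,

so a given relative error in the retrieved blue-to-green ratio is amplified by roughly the factor B in the retrieved pigment concentration, which is why the < ±0.002 reflectance accuracy at 443 nm matters.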

17.
This work utilizes the concept of a “composite set” (C-set) and the related C-calculus to study some standard problems of pattern analysis and general signal processing. After some basic definitions and notations for composite sets are briefly stipulated, it is shown how a family of C-sets can be associated with a digitized picture. Each element in the family conveys partial information about the picture itself, yet it is possible to combine the various contributions from each C-set in such a way as to completely retrieve the image. Conditions that guarantee such “convergence” are investigated theoretically; the cases of nonconvergence are also shown to be of some interest.

C-calculus is applied concretely to the extraction of significant regions, contours, etc., from a digitized picture. An application to texture discrimination and analysis is also outlined.


18.
Donnell-type stability equations for buckling of stringer-stiffened cylindrical panels under combined axial compression and hydrostatic pressure are solved by the displacement approach of [6]. The solution is employed for a parametric study over a wide range of panel and stringer geometries to evaluate the combined influence of panel configuration and boundary conditions along the straight edges on the buckling behavior of the panel relative to a complete “counter” cylinder (i.e., a cylinder with identical skin and stiffener parameters).

The parametric studies reveal a “sensitivity” to the “weak in shear” (Nx = Nxφ = 0) SS1 type of boundary conditions along the straight edges, for which the panel buckling loads are always smaller than those predicted for a complete “counter” cylinder. In the case of “classical” SS3 boundary conditions, there always exist values of panel width 2φ0 for which ρ = 1, i.e., the panel buckling load equals that of the complete “counter” cylinder. For the SS2 and SS4 boundary condition types, the manner in which the panel critical load approaches that of the complete cylinder appears to depend on the panel configuration.

Utilization of panels for the experimental determination of a complete cylinder's buckling load is found to be satisfactory for very lightly and very heavily stiffened panels, as well as for short panels, (L/R) = 0.2 and 0.5. Panels of moderate length and stiffening have to be excluded, since they lead to nonconservative buckling load predictions.


19.
A new topomer-based method for 3D searching of conventional structural databases is described, according to which 3D molecular structures are compared as sets of fragments or topomers, in single rule-generated conformations oriented by superposition of their fragmentation bonds. A topomer is characterized by its CoMFA-like steric shape and now also by its pharmacophoric features, in some novel ways that are detailed and discussed.

To illustrate the behavior of topomer similarity searching, a new dbtop program was used to generate a topomer distance matrix for a diverse set of 26 PDE4 inhibitors and 15 serotonin receptor modulators. With the best of three parameter settings tried, within the 210 shortest topomer distances (of 1460), 94.7% involved pairs of compounds having the same biological activity, and the nearest neighbor to every compound also shared its activity. The standard similarity metric, Tanimoto coefficients of “2D fingerprints”, could achieve a similar selectivity performance only for the 108 shortest distances, and three Tanimoto nearest neighbors had a different biological activity. Topomer similarity also allowed “lead-hopping” among 22 of the 26 PDE4 inhibitors, notably between rolipram and cipamfylline, while “2D fingerprint” Tanimoto coefficients recognized similarity only within generally recognized structural classes.

In 370 searches of authentic high-throughput screening (HTS) data sets, the typical topomer similarity search rate was about 200 structures per second.
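
For reference, the Tanimoto coefficient of two binary 2D fingerprints, the baseline similarity metric mentioned above, is conventionally defined as:

    T(A, B) = \frac{c}{a + b - c},

where a and b are the numbers of bits set in fingerprints A and B, respectively, and c is the number of bits set in both; T ranges from 0 (no bits in common) to 1 (identical fingerprints).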


20.
In many applications, the use of Bayesian probability theory is problematical: the information needed to carry out the calculations feasibly is unavailable. There are different methodologies for dealing with this problem, e.g., maximum entropy and Dempster-Shafer theory. If one can make independence assumptions, many of the problems disappear, and in fact this is often the method of choice even when it is obviously incorrect. The notion of independence is a 0–1 concept, which implies that human guesses about its validity will not lead to robust systems. In this paper, we propose a fuzzy formulation of this concept. It should lend itself to probabilistic updating formulas by allowing heuristic estimation of the “degree of independence.” We show how this can be applied to compute a new notion of conditional probability (we call this “extended conditional probability”). Given information, one typically has the choice of full conditioning (standard dependence) or ignoring the information (standard independence). We list some desiderata for extending this to allow a degree of conditioning. We then show how our formulation of degree of independence leads to a formula fulfilling these desiderata. After describing this formula, we show how it compares with other possible formulations of parameterized independence. In particular, we compare it to a linear interpolant, a higher power of a linear interpolant, and a notion originally presented by Hummel and Manevitz [Tenth Int. Joint Conf. on Artificial Intelligence, 1987]. Interestingly, it turns out that a transformation of the Hummel-Manevitz method and our “fuzzy” method are close approximations of each other. Two examples illustrate how fuzzy independence and extended conditional probability might be applied. The first shows how linguistic probabilities result from treating fuzzy independence as a linguistic variable. The second is an industrial example of troubleshooting on the shop floor.
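
The simplest of the parameterized alternatives compared above, the linear interpolant between ignoring the evidence and fully conditioning on it, can be written as follows (generic notation; the authors' own fuzzy formula differs):

    P_\lambda(A \mid B) = (1 - \lambda)\,P(A) + \lambda\,P(A \mid B), \qquad \lambda \in [0, 1],

so λ = 0 recovers standard independence, λ = 1 recovers full conditioning, and intermediate values express a heuristic “degree of independence”.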
