Interest in the development of novel thalidomide derivatives as immunomodulatory and anti‐angiogenic agents has revived over the last two decades. Herein we report the design and synthesis of three chemotypes of barbituric acids derived from the thalidomide structure: phthalimido‐, tetrafluorophthalimido‐, and tetrafluorobenzamidobarbituric acids. The latter were obtained by a new tandem reaction involving ring opening and decarboxylation of the fluorine‐activated phthalamic acid intermediates. Thirty compounds of the three chemotypes were evaluated for their anti‐angiogenic properties in an ex vivo assay by measuring the decrease in microvessel outgrowth in rat aortic ring explants. Tetrafluorination of the phthalimide moiety in tetrafluorophthalimidobarbituric acids was essential, as all of the nonfluorinated counterparts lost anti‐angiogenic activity. Opening of the five‐membered ring and the accompanying increase in conformational freedom, in the case of the corresponding tetrafluorobenzamidobarbituric acids, was well tolerated. Their activity was retained, although their molecular structures differ in torsional flexibility and possible hydrogen‐bond networking, as revealed by comparative X‐ray crystallographic analyses.
Investigating the effect of concentration forcing on CO2 methanation is relevant not only for power‐to‐gas plants but also for the study of dynamic phenomena of this reaction. In this study, a Ni/Al2O3 catalyst is investigated under concentration forcing at industrially relevant conditions. The dynamic experiments allow an evaluation in terms of the reaction rate and enable the study of the reaction mechanism. The experiments show that no methane is formed in the CO2‐rich part of the cycle, whereas fast hydrogenation of carbonaceous species to methane takes place upon switching to H2.
Many types of cells release phospholipid membrane vesicles thought to play key roles in cell-cell communication, antigen presentation, and the spread of infectious agents. Extracellular vesicles (EVs) carry various proteins, messenger RNAs (mRNAs), and microRNAs (miRNAs), like a “message in a bottle” to cells in remote locations. The encapsulated molecules are protected from multiple types of degradative enzymes in body fluids, making EVs ideal for delivering drugs. This review presents an overview of the potential roles of EVs as natural drugs and novel drug-delivery systems.
Establishing adequate technical and physical boundary conditions for a sustained nuclear fusion reaction is a challenging task. Phased feedback control and monitoring of heating, fuelling and magnetic shaping are mandatory, especially for fusion devices aiming at high-performance plasmas. Technical and physical interrelations require close collaboration of many components in sequential as well as parallel processing flows. Moreover, the handling of asynchronous, off-normal events has become a key element of modern plasma performance optimisation and machine protection recipes. The multitude of plasma states and events, the variety of plant system operation states and the diversity in diagnostic data sampling rates can hardly be mastered with a rigid control scheme. Rather, an adaptive system topology in combination with sophisticated synchronisation and process scheduling mechanisms is suited to such an environment. Moreover, the system is subject to real-time control constraints: response times must be deterministic and adequately short. Therefore, the experimental tokamak device ASDEX Upgrade employs a discharge control system (DCS) whose core has been designed to meet these requirements. In this paper we compare scheduling schemes for the parallelised realisation of a control workflow and show the advantage of a data-driven workflow over a managed workflow. The data-driven workflow as used in DCS is based on signals connecting process outputs and inputs. These are implemented as real-time streams of data samples. Consequently, real-time signal management forms the foundation of DCS. The paper explains the principal features, such as tagged samples, signal groups, algorithmic blocks and processes, as well as the scheduling schemes which allow DCS control applications to be defined as self-contained modular building blocks glued together by a software framework. By virtue of this sound foundation, DCS is a mature but still evolving system for reliable, distributed control of an entire tokamak device, coordinating and monitoring 20 diagnostic systems, 14 magnetic power supplies, 5 heating systems with a total power of more than 25 MW, 8 gas fuelling channels, a pellet injector and a killer gas gun.
Promising to cope with increasing demand variety and uncertainty, flexibility in general and process flexibility in particular are becoming ever more desired corporate capabilities. In recent years, the business process management and the production/operations management communities have proposed numerous approaches that investigate how to value and determine an appropriate level of process flexibility. Most of these approaches are very restrictive regarding their application domain, neglect characteristics of the involved processes and outputs other than demand and capacity, and do not conduct a thorough economic analysis of process flexibility. Against this backdrop, the authors propose an optimization model that determines an appropriate level of process flexibility in line with the principles of value-based business process management. The model includes demand uncertainty, variability, criticality, and similarity as process characteristics. The paper also reports on the insights gained from applying the optimization model to the coverage switching processes of an insurance broker pool company.
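A hedged toy sketch can show why such a model attaches economic value to process flexibility under demand uncertainty. This is not the authors' value-based optimization model; the function name, the equiprobable scenarios, and the dedicated-versus-fully-flexible dichotomy are all simplifying assumptions.

```python
def expected_served(demand_scenarios, capacities, flexible):
    """Expected total demand served over equiprobable demand scenarios.

    Dedicated: each capacity serves only its own process/product.
    Fully flexible: pooled capacity can serve any demand.
    """
    total = 0.0
    for scenario in demand_scenarios:
        if flexible:
            # Pooled capacity absorbs demand imbalances across products.
            total += min(sum(scenario), sum(capacities))
        else:
            # Dedicated capacity strands surplus when demands are unbalanced.
            total += sum(min(d, c) for d, c in zip(scenario, capacities))
    return total / len(demand_scenarios)
```

With scenarios (80, 120), (120, 80), (100, 100) and dedicated capacities of 100 each, the flexible configuration serves 200 units in every scenario, while the dedicated one averages only about 186.7; the gap is a crude proxy for the value that a thorough economic analysis would monetize.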
Acceptance testing is a time-consuming task for complex software systems that have to fulfill a large number of requirements. To reduce this effort, we have developed a largely automated method for deriving test plans from requirements expressed in natural language. It consists of three stages: annotation, clustering, and test plan specification. The general idea is to exploit redundancies and implicit relationships in requirements specifications. Multi-viewpoint techniques based on RM-ODP (Reference Model for Open Distributed Processing) are employed for specifying the requirements. We then use linguistic analysis techniques, requirements clustering algorithms, and pattern-based requirements collection to reduce the total effort of testing against the requirements specification. In particular, we use linguistic analysis to extract and annotate the actor, process, and object of a requirements statement. During clustering, a similarity function is computed as a measure of the overlap between requirements. In the test plan specification stage, our approach provides capabilities for semi-automatically deriving test plans and acceptance criteria from the clustered informal textual requirements. Two patterns are applied to compute a suitable order of test activities. The generated test plans consist of a sequence of test steps and assertions that are executed or checked in the given order. We also present the supporting prototype tool TORC, which is available as open source. To evaluate the approach, we conducted a case study in the field of acceptance testing of a national electronic identification system. In summary, we report lessons learned about how linguistic analysis and clustering techniques can help testers understand the relations between requirements and improve test planning.
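The similarity-and-clustering step described in this abstract can be sketched as follows. The abstract does not give the actual similarity function or clustering algorithm, so the field weights, the Jaccard overlap, and the greedy clustering below are illustrative assumptions; only the idea of measuring overlap over annotated actor, process, and object fields comes from the text.

```python
def annotate(requirement):
    """Assumed annotation step: in the method this is done by linguistic
    analysis; here requirements come pre-split into (actor, process, object)."""
    actor, process, obj = requirement
    return {"actor": set(actor.lower().split()),
            "process": set(process.lower().split()),
            "object": set(obj.lower().split())}

def similarity(req_a, req_b, weights=(0.3, 0.4, 0.3)):
    """Overlap of two requirements as a weighted Jaccard index over the
    annotated fields (the weights are illustrative, not from the paper)."""
    a, b = annotate(req_a), annotate(req_b)
    score = 0.0
    for w, field in zip(weights, ("actor", "process", "object")):
        union = a[field] | b[field]
        if union:
            score += w * len(a[field] & b[field]) / len(union)
    return score

def cluster(requirements, threshold=0.5):
    """Greedy single-pass clustering: a requirement joins the first cluster
    whose representative it sufficiently overlaps, else starts a new cluster."""
    clusters = []
    for req in requirements:
        for c in clusters:
            if similarity(req, c[0]) >= threshold:
                c.append(req)
                break
        else:
            clusters.append([req])
    return clusters
```

Two log-in requirements that differ only in their object ("the system" vs. "the portal") then land in one cluster, while an unrelated export requirement starts its own, so overlapping requirements can be tested together.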
Although “User-Centred”, “Participatory”, and other similar design approaches have proved very valuable for mainstream design, their principles are more difficult to apply successfully when the user group contains, or is composed of, older and/or disabled users. In the field of design for older and disabled people, the “Universal Design”, “Inclusive Design” and “Design for All” movements have encouraged designers to extend their design briefs to include older and disabled people. The downside of these approaches is that they can tend to encourage designers to follow a traditional design path to produce a prototype design, and only then investigate how to modify their interfaces and systems to cope with older and/or disabled users. This can lead to an inefficient design process and sometimes an inappropriate design, which may be “accessible” to people with disabilities but in practice unusable. This paper reviews the concept that the authors have called “User-Sensitive Inclusive Design”, which suggests a different approach to designing for marginalised groups of people. Rather than relying on standards and guidelines, designers need to develop a real empathy with their user groups. A number of ways to achieve this are recommended, including the use of ethnography and techniques derived from professional theatre, both for requirements gathering and for improving designers’ empathy for marginalised groups of users, such as older and disabled people.
The rigorous application of design science in information and communications technology (ICT) research is growing rapidly and producing exciting results. The five papers published in this special issue reflect some of the most recent ideas and research projects in ICT design science research (DSR). This introduction begins with concise summaries of the published papers. We then reflect on three key design science issues, using the published papers to illustrate our views. The three issues are: (1) the nature of the artifacts/problems studied in DSR in ICT disciplines; (2) the research approaches that are used; and (3) the nature of the research contributions that are made. We explain why we believe that these issues are interdependent and why thinking about these three issues as a whole can support an improved understanding of the goals and processes of design science research.
The standardization of processes and the identification of shared business services in a service-oriented architecture (SOA) are currently widely discussed. In practice especially, however, there is still a lack of appropriate instruments to support these tasks. In this paper an approach for a process map is introduced which allows for a systematic presentation, as complete as possible, of the processes in an enterprise (division). After the processes have been consistently refined by means of aggregation/disaggregation and generalization/specialization relations, it is possible to identify primarily functional similarities of the detailed sub-processes. The application of the process map at a financial service provider (FSP) highlights how these similarities can be taken as a basis to standardize processes and to identify shared services.
Numerous visual notations are used in technical and business domains. Notations have to be cognitively effective to ease the planning, documentation, and communication of a domain’s concepts. Semantic transparency (ST) is one of the elementary principles that influence a notation’s cognitive effectiveness. However, the principle has been criticized for not being well defined, and challenges arise in the evaluation and application of ST. Accordingly, this research’s objectives were to establish how the ST principle is defined, operationalized, and evaluated in existing notations, as well as how it is applied in the design of new notations in ICT and related areas. To meet these objectives, a systematic literature review was conducted, with 94 studies passing the selection criteria. The results reject one of the three aspects that define semantic transparency, namely that “ST is achieved with the use of icons.” In addition, the taxonomies of related concepts and research methods, the evaluation metrics, and other findings from this study can help in conducting verifiable ST-related experiments and applications, thereby improving the visual vocabularies of notations and the effectiveness of the resulting diagrams.