Ten years of maintenance, nine published revisions of the standards for the Testing and Test Control Notation version 3 (TTCN-3), more than 500 change requests since 2006, and 10 years of activity on the official TTCN-3 mailing list add up to a rich history, not unlike that of many successful Open Source Software (OSS) projects. In this article, we contemplate TTCN-3 in the context of software evolution and examine its history quantitatively. We mined the changes in the textual content of the standards, the data in change requests from the past 5 years, and the mailing list archives from the past 10 years. In addition, to characterize the use of TTCN-3, we investigated the meta-data of the contributions at the TTCN-3 User Conference and the use of language constructs in a large-scale TTCN-3 test suite. Based on these data sets, we first analyze the amount, density, and location of changes within the different parts of the standard. Then, we analyze the activity and focus of the user community and the maintenance team in both the change request management system and the official TTCN-3 mailing list. Finally, we analyze the distribution of contributions at the TTCN-3 User Conference across different topics over the past 8 years, as well as the construct-use anomalies observed during the development of a large-scale test suite. Our findings indicate that TTCN-3 is becoming increasingly stable: the overall change density and intensity, as well as the number of change requests, are decreasing, despite the monotonic increase in the size of the standards.
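The line-based change metrics the abstract describes (amount and density of changes between revisions of a standard's text) can be illustrated with a minimal sketch. This is not the paper's actual tooling; the revision texts and the `change_density` helper are hypothetical, and real methodologies would additionally track the location of changes:

```python
import difflib

def change_density(old_text, new_text):
    """Fraction of lines in the new revision that were added, removed,
    or modified relative to the old revision (a simple line-diff metric)."""
    new_lines = new_text.splitlines()
    diff = difflib.ndiff(old_text.splitlines(), new_lines)
    # ndiff marks removed lines with '- ' and added lines with '+ ';
    # a modified line counts once as a removal and once as an addition
    changed = sum(1 for line in diff if line.startswith(("+ ", "- ")))
    return changed / max(len(new_lines), 1)

# Toy example: one line modified, one line appended between "revisions"
rev_1 = "module clause\nimport clause\ntype definitions"
rev_2 = "module clause\nimport clauses\ntype definitions\ngroup definitions"
density = change_density(rev_1, rev_2)
```

Computed per revision pair and per part of the standard, a decreasing trend in such a density is the kind of evidence the abstract cites for increasing stability.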
Information Systems and e-Business Management - The digital transformation, with its ongoing trend towards electronic business, confronts companies with ever-growing amounts of data which...
Image post-processing can correct for cardiac and respiratory motion (MoCo) in cardiovascular magnetic resonance (CMR) stress perfusion imaging. This study analyzed its influence on visual image evaluation.
Materials and methods
Sixty-two patients with (suspected) coronary artery disease underwent a standard CMR stress perfusion exam during free breathing. Image post-processing was performed without (non-MoCo) and with motion correction (MoCo: image intensity normalization, motion extraction with iterative non-rigid registration, and motion warping with the combined displacement field). Images were evaluated with regard to the perfusion pattern (perfusion deficit, dark rim artifact, uncertain signal loss, or normal perfusion), the general image quality (non-diagnostic, imperfect, good, or excellent), and the reader’s subjective confidence in assessing the images (not confident, confident, or very confident).
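The MoCo pipeline outlined above (intensity normalization, motion estimation, warping with the estimated displacement) can be sketched with a toy example. For brevity, the iterative non-rigid registration of the study is replaced here by a simple rigid translation estimate via phase correlation; all function names are illustrative, not the authors' implementation:

```python
import numpy as np
from scipy.ndimage import shift

def normalize(img):
    """Intensity normalization: zero mean, unit variance."""
    return (img - img.mean()) / (img.std() + 1e-8)

def estimate_translation(ref, mov):
    """Integer displacement of `mov` relative to `ref` via phase correlation."""
    spec = np.conj(np.fft.fft2(ref)) * np.fft.fft2(mov)
    spec /= np.abs(spec) + 1e-12          # keep only the phase difference
    corr = np.real(np.fft.ifft2(spec))    # peak location encodes the shift
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape))
    # displacements beyond half the image wrap around to negative values
    size = np.array(ref.shape)
    peak[peak > size // 2] -= size[peak > size // 2]
    return peak

def motion_correct(frames):
    """Warp every frame of a perfusion series back onto the first frame."""
    ref = normalize(frames[0])
    corrected = [frames[0]]
    for f in frames[1:]:
        d = estimate_translation(ref, normalize(f))
        corrected.append(shift(f, -d, order=1))  # warp back by estimated motion
    return np.stack(corrected)
```

A real free-breathing perfusion series requires the non-rigid (per-pixel displacement field) registration the abstract names, since cardiac and respiratory motion deform the myocardium rather than merely translating it.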
Results
Fifty-three (non-MoCo) and 52 (MoCo) myocardial segments were rated as ‘perfusion deficit’, 113 vs. 109 as ‘dark rim artifact’, 9 vs. 7 as ‘uncertain signal loss’, and 817 vs. 824 as ‘normal’. Agreement between non-MoCo and MoCo was high, with no diagnostic difference per patient. The image quality of MoCo was rated more often as ‘good’ or ‘excellent’ (92 vs. 63%), and the diagnostic confidence more often as ‘very confident’ (71 vs. 45%), compared to non-MoCo.
Conclusions
The comparison of perfusion images acquired during free breathing and post-processed with and without motion correction demonstrated that both methods led to a consistent evaluation of the perfusion pattern, while the image quality and the reader’s subjective confidence in assessing the images were rated more favorably for MoCo.
In photorealistic image synthesis, the radiative transfer equation is often not solved by simulating every wavelength of light but by computing tristimulus transport, for instance using sRGB primaries as a basis. This choice is convenient because input texture data is usually stored in RGB colour spaces. However, this approach has problems that are often overlooked or ignored. By comparing against spectral reference renderings, we show how rendering in tristimulus colour spaces introduces colour shifts in indirect light, violates energy conservation, and causes unexpected behaviour in participating media. Furthermore, we introduce a fast method to compute spectra from almost any given XYZ input colour. It creates spectra that match the input colour precisely and whose energy, as in natural reflectance spectra, is smoothly distributed over wide wavelength bands. This method is useful both for upsampling RGB input data when spectral transport is used and as an intermediate step for corrected tristimulus-based transport. Finally, we show how energy conservation can be enforced in RGB by mapping colours to valid reflectances.
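The energy-conservation point above can be made concrete with a minimal sketch. A physically valid reflectance must lie in [0, 1] per channel; an RGB albedo with a channel above 1 injects energy at every bounce. The uniform rescale below is an illustrative stand-in, not the mapping proposed in the paper:

```python
def to_valid_reflectance(rgb, max_albedo=1.0):
    """Scale a colour uniformly so no channel exceeds max_albedo,
    preserving its hue while enforcing reflectance <= 1."""
    m = max(rgb)
    if m <= max_albedo:
        return tuple(rgb)
    s = max_albedo / m
    return tuple(c * s for c in rgb)

def bounced_energy(reflectance, n_bounces):
    """Worst-case channel energy carried after repeated bounces
    off a surface with the given RGB reflectance."""
    e = 1.0
    for _ in range(n_bounces):
        e *= max(reflectance)
    return e
```

With an unmapped albedo such as (1.2, 1.1, 0.9), `bounced_energy` grows geometrically with the bounce count; after mapping, it stays bounded by 1, which is the behaviour energy conservation demands.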
Facilitating compliance management, that is, assisting a company’s management in conforming to laws, regulations, standards, contracts, and policies, is an important but non-trivial task. The service-oriented architecture (SOA) has evolved traditional, manual business practices into modern, service-based IT practices that ease part of the problem: the systematic definition and execution of business processes. This, in turn, facilitates the online monitoring of system behaviors and the enforcement of allowed behaviors, all ingredients that can be used to assist compliance management on the fly during process execution. In this paper, instead of focusing on monitoring and runtime enforcement of rules or constraints, we strive for an alternative approach to compliance management in SOAs that aims at assessing and improving compliance. We propose two ingredients: (i) a model and tool to design compliant service-based processes and to instrument them in order to generate evidence of how they are executed, and (ii) a reporting and analysis suite to create awareness of a company’s compliance state and to enable understanding why and where compliance violations have occurred. Together, these ingredients result in an approach that is close to how the real stakeholders (compliance experts and auditors) actually assess the state of compliance in practice, and that is less intrusive than enforcing compliance.
Currently, process modeling is mostly done manually. As a result, both the initial design of process models and the changes that are frequently necessary to react to new market developments or new regulations are time-consuming tasks. In this paper, we introduce SEMPA, an approach for the partly automatic planning of process models. Using ontologies to semantically describe actions, as envisioned in Semantic Business Process Management, a process model for a specified problem setting can be created automatically. In comparison to existing planning algorithms, our approach creates process models including control structures and is able to cope with complex and numerical input and output parameters of actions. A prototypical implementation, together with an example taken from the financial services domain, illustrates the practical benefit of our approach.
Although IP Multimedia Subsystem (IMS) based Next Generation Networks (NGNs) are already emerging as the common session control platform for converging fixed, mobile, and cable networks, harmonized solutions for the management of these converged platforms have yet to be developed. This document describes a hands-on approach to NGN management. Starting with IMS-specific management systems, subsequent research also had to take into account the management of NGN Service Delivery Platforms (SDPs). This work shows that the hybrid nature of an NGN, where services can be delivered at the IMS layer via SIP signaling mechanisms as well as at the SDP via Web Services, requires a harmonized management approach. Taking into account Service Oriented Architecture (SOA) principles and policy-based, model-driven architectures, this work shows that unifying service composition and service management already at the workflow creation level brings significant benefits in terms of automation and harmonization. Following the SOA paradigm, the approach presented here does not differentiate between business process management (BPM) and management process management. Focusing on the TeleManagement Forum’s enhanced Telecom Operations Map service fulfillment and service assurance operations, this document describes a New Generation Operations Systems and Software (NGOSS) based implementation of a unified Operation Support System (OSS) for NGNs that overcomes many problems of former stovepipe management solutions in terms of automation, flexibility, and manageability.