20 similar documents found; search took 0 ms
1.
2.
In this paper, we define a number of tools that we think belong to the core of any toolkit for requirements engineers. The tools are conceptual and hence, they need precise definitions that lay down as exactly as possible what their meaning and possible use is. We argue that this definition can best be achieved by a formal specification of the tool. This means that for each semi-formal requirements engineering tool we should provide a formal specification that precisely specifies its meaning. We argue that this mutually enhances the formal and semi-formal technique: it makes formal techniques more usable and, as we will argue, at the same time simplifies the diagram-based notations. At the same time, we believe that the tools of the requirements engineer should, where possible, resemble the familiar semi-formal specification techniques used in practice today. In order to achieve this, we should search existing requirements specification techniques to look for a common kernel of familiar semi-formal techniques and try to provide a formalisation for these. In this paper we illustrate this approach by a formal analysis of the Shlaer-Mellor method for object-oriented requirements specification. The formal specification language used in this analysis is LCM, a language based on dynamic logic, but similar results would have been achieved by means of another language. We analyse the techniques used in the information model, state model, process model and communication model of the Shlaer-Mellor method, identify ambiguities and redundancies, indicate how these can be eliminated and propose a formalisation of the result. We conclude with a listing of the tools extracted from the Shlaer-Mellor method that we can add to a toolkit that in addition contains LCM as formal specification technique.
3.
Craig Stroupe. Computers and Composition, 2007, 24(4): 421-442
Beginning with a problematic assignment in a “New Media Writing” class, this article demonstrates, first, the significant, perhaps irreconcilable differences in the writing/reading environments of print as opposed to New Media: the interiority, on one hand, of the individual text implied in the “shape” of narratives and other elaborated verbal performances and, on the other hand, the mythic exteriority of networked, information space and the market logic of its “attention economy.” These differences pose a challenge not only to the traditional practices of academic, literary, and professional discourse communities, but to what this article terms “writing culture”—that is, popular cultural practices and assumptions conditioned by the procedures and experience of textual elaboration. Examining student hypertexts, key critical works on New Media, web sites, and literary theory and history, this article suggests a solution to this challenge, arguing that the future development of online writing genres ultimately cannot depend on imposing written shapes on network space. Instead, a close analysis of a hoax from the auction site, eBay, suggests how parody can constitute a lens through which the Web's own generic conventions filter the critical/creative consciousness that has long epitomized writing culture.
4.
5.
Towards an understanding of the causes and effects of software requirements change: two case studies
Changes to software requirements not only pose a risk to the successful delivery of software applications but also provide opportunity for improved usability and value. Increased understanding of the causes and consequences of change can support requirements management and also make progress towards the goal of change anticipation. This paper presents the results of two case studies that address objectives arising from that ultimate goal. The first case study evaluated the potential of a change source taxonomy containing the elements 'market', 'organisation', 'vision', 'specification', and 'solution' to provide a meaningful basis for change classification and measurement. The second case study investigated whether the requirements attributes of novelty, complexity, and dependency correlated with requirements volatility. While insufficiency of data in the first case study precluded an investigation of changes arising due to the change source of 'market', for the remainder of the change sources, results indicate a significant difference in cost, value to the customer and management considerations. Findings show that higher cost and value changes arose more often from 'organisation' and 'vision' sources; these changes also generally involved the co-operation of more stakeholder groups and were considered to be less controllable than changes arising from the 'specification' or 'solution' sources. Results from the second case study indicate that only 'requirements dependency' is consistently correlated with volatility and that changes coming from each change source affect different groups of requirements. We conclude that the taxonomy can provide a meaningful means of change classification, but that a single requirement attribute is insufficient for change prediction. A theoretical causal account of requirements change is drawn from the implications of the combined results of the two case studies.
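The taxonomy-based measurement described in this abstract can be sketched as a simple grouping computation. The change records, their sources, and the cost values below are invented for illustration; only the five-element taxonomy comes from the paper.

```python
from statistics import mean

# Hypothetical change records tagged with the paper's five change sources.
changes = [
    {"source": "organisation", "cost": 40}, {"source": "organisation", "cost": 55},
    {"source": "vision", "cost": 60}, {"source": "vision", "cost": 45},
    {"source": "specification", "cost": 10}, {"source": "specification", "cost": 15},
    {"source": "solution", "cost": 12}, {"source": "solution", "cost": 8},
]

def mean_cost_by_source(records):
    """Group change records by taxonomy source and average their cost."""
    by_source = {}
    for r in records:
        by_source.setdefault(r["source"], []).append(r["cost"])
    return {src: mean(costs) for src, costs in by_source.items()}

costs = mean_cost_by_source(changes)
# For this made-up data, 'organisation' and 'vision' changes average a higher
# cost than 'specification' and 'solution' ones, mirroring the reported finding.
```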
6.
7.
Landmark Graphics supplies software and services to the upstream oil and gas industry. Our software portfolio, which ranges from exploration and drilling to data management and decision analysis, includes more than 60 products consisting of over 50 million lines of source code. For many years, Landmark has been collecting project metrics we wished to harvest to gain insight into key business questions in three areas: optimal release cycle duration (scope/time trade-off), optimal project staffing levels, effects of uncertainty. We set out to develop a relatively simple project dynamics model to use in conjunction with market sensitivity and economic analysis to help optimize profitability. Some of our ideas and results are similar to those of Preston Smith and Donald Reinertsen, who examined the impact of time-to-market sensitivity. However, our approach is a more detailed model tuned to software development issues.
8.
How to integrate legal requirements into a requirements engineering methodology for the development of security and privacy patterns (total citations: 1; self-citations: 0; citations by others: 1)
Luca Compagna, Paul El Khoury, Alžběta Krausová, Fabio Massacci, Nicola Zannone. Artificial Intelligence and Law, 2009, 17(1): 1-30
Laws set requirements that force organizations to assess the security and privacy of their IT systems and oblige them to implement minimal precautionary security measures. Several IT solutions (e.g., Privacy Enhancing Technologies, Access Control Infrastructure, etc.) have been proposed to address security and privacy issues. However, why and when such solutions have to be adopted often remains unanswered, because the answer comes only from a broader perspective that accounts for legal and organizational issues. Security engineers and legal experts should analyze the business goals of a company and its organizational structure, derive from there the points where security and privacy problems may arise, and determine which solutions best fit such (legal) problems. The paper investigates the methodological support for capturing the security and privacy requirements of a concrete health care provider.
9.
10.
Programming by example is a powerful way of bestowing on nonprogrammers the ability to communicate tasks to a computer. When creating procedures from examples it is necessary to be able to infer the existence of variables, conditional branches, and loops. This article explores the role of empirical or “similarity-based” learning in this process. For a concrete example of a procedure induction system, we use an existing scheme called METAMOUSE which allows graphical procedures to be specified from examples of their execution. A procedure is induced from the first example, and can be generalized in accordance with examples encountered later on. We describe how the system can be enhanced with Mitchell's candidate elimination algorithm, one of the simplest empirical learning techniques, to improve its ability to recognize constraints in a comprehensive and flexible manner. Procedure induction is, no doubt, a very complex task. This work revealed the usefulness and effectiveness of empirical learning in procedure induction, although it cannot completely substitute for specific preprogrammed domain knowledge in situations where this is readily available. However, in domains such as graphical editing, where knowledge is incomplete and/or incorrect, the best approach may prove to be a combination of similarity- and explanation-based learning. © 1994 John Wiley & Sons, Inc.
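The core of Mitchell's candidate elimination algorithm mentioned in this abstract can be sketched in a few lines. This is a simplified version showing only the specific-boundary (S-set) update over positive examples; the full algorithm also maintains a general boundary G and processes negative examples. The attribute names and example values are invented, not taken from METAMOUSE.

```python
# Hypotheses are tuples of attribute values, with '?' meaning "any value".
# Positive examples force minimal generalization of the specific boundary.

def covers(hypothesis, example):
    """True if the hypothesis matches the example at every attribute."""
    return all(h == '?' or h == e for h, e in zip(hypothesis, example))

def generalize(hypothesis, example):
    """Minimally generalize a hypothesis so that it covers the example."""
    return tuple(h if h == e else '?' for h, e in zip(hypothesis, example))

# Hypothetical graphical-editing constraints: (shape, touch_point, direction)
positives = [("line", "endpoint", "horizontal"),
             ("line", "endpoint", "vertical")]

s = positives[0]                 # initialize S with the first positive example
for ex in positives[1:]:
    if not covers(s, ex):
        s = generalize(s, ex)

# After both positives, s == ("line", "endpoint", "?"): the conflicting
# direction constraint has been dropped, the others retained.
```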
11.
There seems to be no clear consensus in the existing literature about the role of deontic logic in legal knowledge representation — in large part, we argue, because of an apparent misunderstanding of what deontic logic is, and a misplaced preoccupation with the surface formulation of legislative texts. Our aim in this paper is to indicate, first, which aspects of legal reasoning are addressed by deontic logic, and then to sketch out the beginnings of a methodology for its use in the analysis and representation of law. The essential point for which we argue is that deontic logic — in some form or other — needs to be taken seriously whenever it is necessary to make explicit, and then reason about, the distinction between what ought to be the case and what is the case, or as we also say, between the ideal and the actual. We take the library regulations at Imperial College as the main illustration, and small examples from genuinely legal domains to introduce specific points. In conclusion, we touch on the role of deontic logic in the development of the theory of normative positions. Deontic logic and the theory of normative positions are of relevance to legal knowledge representation, but also to the analysis and representation of normative systems generally. The emphasis of the paper is on legal knowledge representation, but we seek to place the discussion within the context of a broader range of issues concerning the role of deontic logic in Computer Science.
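The distinction this abstract stresses — between what ought to be the case and what is the case — is exactly what standard deontic logic makes expressible. In textbook notation (a sketch of standard definitions, not formulas taken from the paper), with O the obligation operator and P permission:

```latex
\mathrm{Viol}(p) \;\equiv\; O\,p \wedge \neg p
\qquad\qquad
P\,p \;\equiv\; \neg O\,\neg p
```

The first formula says a norm p is violated in states where p is obligatory yet fails to hold — the gap between the ideal and the actual that the authors argue must be made explicit and reasoned about.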
12.
13.
Context
In the long run, features of a software product line (SPL) evolve with respect to changes in stakeholder requirements and system contexts. Neither domain engineering nor requirements engineering handles such co-evolution of requirements and contexts explicitly, making it especially hard to reason about the impact of co-changes in complex scenarios.
Objective
In this paper, we propose a problem-oriented and value-based analysis method for variability evolution analysis. The method takes into account both kinds of changes (requirements and contexts) during the life of an evolving software product line.
Method
The proposed method extends the core requirements engineering ontology with notions to represent variability-intensive problem decomposition and evolution. On the basis of problem orientation, the analysis method identifies candidate changes, detects influenced features, and evaluates their contributions to the value of the SPL.
Results and Conclusion
The process of applying the analysis method is illustrated using a concrete case study of an evolving enterprise software system, which has confirmed that tracing back to requirements and contextual changes is an effective way to understand the evolution of variability in the software product line.
14.
Using data from two surveys of people knowledgeable about requirements for, and the success of the development of, large commercial applications (CAs) in hundreds of large organizations from around the world, this paper reports a high positive correlation between an organization's requirements definition and management (RDM) maturity and that organization's successful performance on CA development projects. Among the organizations that completed the survey, an organization assessed at a high RDM maturity is significantly more successful in its CA development projects than one assessed at a low RDM maturity, when success in CA development projects is measured as (1) delivering CAs on-time, on-budget, and on-function, (2) meeting the business objectives of these projects, and (3) the perceived success of these projects. This paper presents a comprehensive framework for RDM, describes a quality RDM process, and describes RDM maturity and how to measure it. It describes the two surveys, the first of which ended up being a pilot for the second, which was designed taking into account what was learned from the first survey. The paper concludes with advice to practitioners on the application of the RDM maturity framework in any organization that wishes to improve its RDM and its performance in the development of large CAs.
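The correlation analysis this abstract reports can be illustrated with the standard Pearson coefficient. The per-organization maturity and success scores below are made up for the sketch; only the positive-correlation claim is from the paper.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-organization scores: RDM maturity vs. project success,
# both on a 1-5 scale.
maturity = [1, 2, 2, 3, 4, 4, 5]
success  = [2, 3, 2, 4, 4, 5, 5]
r = pearson(maturity, success)   # strongly positive for this made-up data
```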
15.
Jahna Otterbacher. Knowledge and Information Systems, 2013, 35(3): 645-664
Online review forums provide consumers with essential information about goods and services by facilitating word-of-mouth communication. Although preferences correlate with demographic characteristics, reviewer gender is often not provided on user profiles. We consider the case of the Internet Movie Database (IMDb), where users exchange views on movies. Like many forums, IMDb employs collaborative filtering such that by default, reviews are ranked by perceived utility. IMDb also provides a unique gender filter that displays an equal number of reviews authored by men and women. Using logistic classification, we compare reviews with respect to writing style, content and metadata features. We find salient differences in stylistic features and content between reviews written by men and women, as predicted by sociolinguistic theory. However, utility is the best predictor of gender, with women's reviews perceived as being much less useful than those written by men. While we cannot observe who votes at IMDb, we do find that highly rated female-authored reviews exhibit “male” characteristics. Our results have implications for which contributions are likely to be seen, and to what extent participants get a balanced view as to “what others think” about an item.
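Logistic classification over stylistic features, as used in this study, can be sketched as follows. The two features and all the weights here are hypothetical placeholders, not the features or coefficients fitted in the paper; a real model would be trained on labelled reviews.

```python
from math import exp

def features(text):
    """Extract two toy stylistic features from a review text."""
    words = text.lower().split()
    n = max(len(words), 1)
    return {
        "pronoun_rate": sum(w in {"i", "we", "you"} for w in words) / n,
        "exclamations": text.count("!"),
    }

# Hypothetical coefficients; a fitted model would learn these from data.
WEIGHTS = {"pronoun_rate": 3.0, "exclamations": 0.8}
BIAS = -1.0

def prob_female(text):
    """Logistic link: map the weighted feature sum to a probability."""
    f = features(text)
    z = BIAS + sum(WEIGHTS[k] * f[k] for k in WEIGHTS)
    return 1 / (1 + exp(-z))
```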
16.
Pena F. B., Crabi D., Izidoro S. C., Rodrigues O., Bernardes G. Pattern Analysis & Applications, 2022, 25(1): 241-251
The grading of gemstones is currently a manual procedure performed by gemologists. A popular approach uses reference stones, where those are visually...
17.
This paper considers the problem of qualitative search in abstract and bibliographic databases. It analyzes two classes of search instruments: those that use retrieval requests based on Boolean combinations of terms, and those that use free sentences in a natural language. It is noted that the first class of systems gives a clearer insight into the result but demands a highly qualified user; it is very difficult to achieve high search completeness with such systems. The second class of systems is simpler to work with, permits the processing of more verbose queries, and is oriented to an unqualified user. However, the output in such systems requires a longer browsing routine to find relevant records. The experience of using an untraditional search engine with the automatic creation of terminological combinations from a query text is described. The terminological combinations from the query text that are contained in the documents found during a search are shown to the user as an intermediate result. There is a convenient mechanism for browsing through the terminological combinations that interest a user and through the found records themselves, as well as a mechanism for searching by subject heading indices reached through a query text. Illustrative examples of using the system to search the VINITI Medicine abstract database are given.
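One simple reading of the "terminological combinations" idea is to extract adjacent word pairs from a free-text query and show the user which of them occur in each retrieved record. The query and record below are invented for illustration; the paper's actual extraction procedure is not specified here.

```python
def bigrams(text):
    """Adjacent word pairs of a text, as a set."""
    words = text.lower().split()
    return {(a, b) for a, b in zip(words, words[1:])}

def matching_combinations(query, record):
    """Query word pairs that also appear in the record, to show as an
    intermediate result for browsing."""
    return bigrams(query) & bigrams(record)

query = "treatment of chronic renal failure"
record = "outcomes in chronic renal failure patients after treatment"
hits = matching_combinations(query, record)
# hits contains ('chronic', 'renal') and ('renal', 'failure'); the single
# shared word 'treatment' does not form a shared pair and is excluded.
```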
18.
19.
Software systems are becoming more and more critical in every domain of human society. These systems are used not only by corporations and governments, but also by individuals and across networks of organizations. The wide use of software systems has resulted in their containing large amounts of critical information and processes, which certainly need to remain secure. As a consequence, it is important to ensure that systems are secure by considering security requirements at the early phases of the software development life cycle. In this paper, we propose to consider security requirements as functional requirements and to apply a model-oriented security requirements engineering framework as a systematic solution for eliciting security requirements for e-governance software systems. As a result, a high level of security can be achieved through broader coverage of assets and threats and by identifying more traces of vulnerabilities in the early stages of requirements engineering. This in turn will help to elicit effective security requirements as countermeasures alongside business requirements.
20.
Modern Software Engineering (SE) is characterized by the use of several models that establish and show the different states a software product goes through, from its initial conception to its end, passing through its development, setup and maintenance, among others. Each phase produces a set of deliverables following different documentation standards, but in many cases, natural language text is a key aspect in the elaboration of such documents. This work surveys the state of the art in the application of text mining techniques to architectural software design, starting from the role of text documents during development phases, specifically the kind of text documents that can be subsequently exploited to assist architects in the complex task of designing software. Intelligent text analysis techniques utilized in software engineering tasks across the software life-cycle are detailed in order to analyze works focused on automatically bridging the gap between requirements and software architectures.
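A common text-mining baseline for bridging requirements and architecture documents, of the kind such surveys cover, is TF-IDF cosine similarity. The sketch below ranks invented architecture snippets against an invented requirement sentence; it illustrates the general technique, not any specific method from the survey.

```python
from collections import Counter
from math import log, sqrt

def tfidf_vectors(docs):
    """Represent each document as a {word: tf * idf} mapping."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(w for toks in tokenized for w in set(toks))
    n = len(docs)
    return [{w: c * log(n / df[w]) for w, c in Counter(toks).items()}
            for toks in tokenized]

def cosine(u, v):
    """Cosine similarity of two sparse vectors stored as dicts."""
    dot = sum(u[w] * v[w] for w in u if w in v)
    norm = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

docs = [
    "the payment service must encrypt card data",        # requirement text
    "payment component design encrypt card data at rest",  # architecture doc 1
    "logging component buffers trace events",              # architecture doc 2
]
req, *arch = tfidf_vectors(docs)
scores = [cosine(req, a) for a in arch]
# The payment/encryption document outranks the unrelated logging document.
```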