Similar documents
20 similar documents found (search time: 0 ms)
1.
Computers in Industry, 1987, 8(4):370-376
The Technology Assessment and Management Conference of the Gottlieb Duttweiler Institute was held on November 7 and 8, 1985, in Rüschlikon / Zürich, Switzerland. Patricia A. MacConaill (Commission of the European Communities, Information Technologies and Telecommunications Task Force) chaired this conference.

5.
The authors review and categorize research on applications of artificial intelligence (AI) and expert systems (ES) in new product development (NPD) activities. A brief overview of the NPD process and AI is presented, followed by a literature survey of AI and ES applications in NPD, which revealed twenty-four articles (twenty-two applications) in the 1990–1997 period. The applications are categorized into five areas: expert decision support systems for NPD project evaluation, knowledge-based systems (KBS) for product and process design, KBS for QFD, AI support for conceptual design, and AI support for group decision making in concurrent engineering. A brief review of each application is provided. The articles are also grouped by NPD stage and by seven NPD core elements (competencies and abilities). Areas for further research are pointed out.

6.
Nowadays, time is critical in most systems engineering projects. The ability to deliver systems quickly determines the success of the system supplier; for customers, the shorter the delivery time, the better their chances of gaining a business advantage in their ever-changing business environments. As a consequence, an increasing number of projects are subject to tight deadlines in all project phases, including requirements elicitation; a project with plenty of time for developing a requirements specification is hard to find. This paper reflects on experiences from one such project. Based on these experiences, time aspects of requirements engineering are discussed: what could be done better in requirements engineering given more time, and what can easily be missed in requirements engineering under a tight deadline.

12.
The paper analyses restructuring processes occurring with the introduction of information technologies into firms in Austria and assesses how far the evidence supports the thesis of a fundamental change in rationalization patterns, as postulated by continental industrial sociologists who claim the emergence of a novel type of systemic rationalization. Based on a research perspective that emphasizes several levels of social mediation of technological change, the broad conclusion is as follows: there are clear indications of a novel systemic approach to rationalization, but the associated forms of work organization show substantial variation. The analysis of the influence of national-level institutions, industry- and firm-specific conditions, and their role in micro-political processes of system and work design points towards an underutilization of work-humanization potentials and suggests an increase in skill supply as one possible intervention strategy.

13.
Jon David, Network Security, 1999(8):11-14
In the first part of this article we defined a vulnerability and discussed how to determine the importance of vulnerabilities. We looked at many types of vulnerabilities. Most people think of a vulnerability as something that gets exploited by ‘hacking in’ to organizations and systems by strangers on the outside. We looked at several important vulnerabilities not usually given much consideration because of that preconception: legal vulnerabilities, personnel/human resources vulnerabilities, premises vulnerabilities, phone switch vulnerabilities, disaster recovery/contingency planning vulnerabilities, insurance vulnerabilities, policy vulnerabilities, password vulnerabilities, default value vulnerabilities and improper access control vulnerabilities. It was shown that it was neither possible nor effective to remove all vulnerabilities, but that proper analyses were necessary to determine which to treat.

14.
As is common across the public sector, the UK police service is under pressure to do more with less, to target resources more efficiently and to take steps to identify threats proactively, for example under risk-assessment schemes such as ‘Clare’s Law’ and ‘Sarah’s Law’. Algorithmic tools promise to improve a police force’s decision-making and prediction abilities by making better use of data (including intelligence), both from inside and outside the force. This article uses Durham Constabulary’s Harm Assessment Risk Tool (HART) as a case study. HART is one of the first algorithmic models to be deployed by a UK police force in an operational capacity. Our article comments upon the potential benefits of such tools, explains the concept and method of HART and considers the results of the first validation of the model’s use and accuracy. The article then critiques the use of algorithmic tools within policing from a societal and legal perspective, focusing in particular upon substantive common law grounds for judicial review. It considers a concept of ‘experimental’ proportionality to permit the use of unproven algorithms in the public sector in a controlled and time-limited way, and, as part of a combination of approaches to combat algorithmic opacity, proposes ‘ALGO-CARE’, a guidance framework covering some of the key legal and practical concerns that should be considered in relation to the use of algorithmic risk assessment tools by the police. The article concludes that if the use of algorithmic tools in a policing context is to result in a ‘better’ outcome, that is to say, a more efficient use of police resources in a landscape of more consistent, evidence-based decision-making, then an ‘experimental’ proportionality approach should be developed so that new solutions from ‘big data’ can be found for criminal justice problems traditionally arising from clouded, non-augmented decision-making.
Finally, the article notes that there is a subset of decisions whose impact upon society and upon the welfare of individuals is too great for them to be influenced by an emerging technology; to such an extent, in fact, that they should be removed from the influence of algorithmic decision-making altogether.

15.
In virtually every application of optimum linear-quadratic regulator (LQR) theory there exists a hidden region of ‘unreachable poles’ (in the left half-plane) which cannot be realized as optimum closed-loop poles. These regions of unreachable closed-loop poles are not visible using the solution procedures ordinarily employed in LQR applications and their lurking presence has (apparently) been overlooked by many professors, textbook writers and industrial users of LQR control theory for the past 25 years. The existence of these regions of unreachable poles represents a serious defect in the LQR method because those regions may (and often do!) contain closed-loop pole patterns which are considered highly desirable by classical control engineering standards, i.e. by ITAE and other classical standards of ‘ideal’ transient response. We first show how one can identify the regions of unreachable poles in an LQR problem. Then, it is shown how one can modify conventional LQR theory to overcome this defect and make all unreachable poles (in the left half-plane) become reachable. By this means, an explicit formula is derived for the LQR state-weighting matrix Q which will automatically produce ITAE or any other arbitrarily prescribed closed-loop pole patterns in the left half-plane.
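The phenomenon this abstract describes can already be seen in the scalar case. The following sketch (our own illustration, not code from the paper; the plant parameters are arbitrary) solves the scalar continuous-time Riccati equation in closed form: the standard LQR closed-loop pole comes out as -sqrt(a^2 + b^2*q/r), so for any weighting q >= 0 no pole in the open interval (-|a|, 0) is ever produced — a concrete instance of an ‘unreachable’ region.

```python
import math

def lqr_scalar_pole(a, b, q, r):
    """Closed-loop pole of the scalar LQR problem
    x' = a*x + b*u with cost J = integral of (q*x^2 + r*u^2) dt.
    Takes the stabilizing root p of the scalar Riccati equation
    2*a*p - (b**2/r)*p**2 + q = 0, forms the gain k = b*p/r,
    and returns the closed-loop pole a - b*k."""
    p = r * (a + math.sqrt(a * a + (b * b) * q / r)) / (b * b)
    k = b * p / r
    return a - b * k  # algebraically equals -sqrt(a^2 + b^2*q/r)

# Example: with a = 1, b = 1, r = 1 the pole is -sqrt(1 + q);
# no q >= 0 can place it inside (-1, 0), so that strip is unreachable.
pole = lqr_scalar_pole(a=1.0, b=1.0, q=3.0, r=1.0)
print(pole)  # -2.0
```

The paper's contribution, per the abstract, is a modified Q that removes this restriction in the general matrix case; the sketch above only exhibits the defect, not the fix.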

17.
Most school science concentrates on helping students gain knowledge and an understanding of explicit science, which may subsequently be tested in examinations. It presents a picture of science as a secure body of knowledge, gained by scientists working according to the standard procedures of science. In this paper I stress a different model of science: the looseness of the knowledge held, the idiosyncratic methods by which it is obtained, and the personal way in which it is used to solve problems. I also stress the importance of tacit knowledge and the affective driving force, which describe the personal knowledge that scientists both hold and utilize; I analyse the nature of authentic science in terms of the type of knowledge that scientists hold and the way in which scientists work. The arguments for and against such authentic science in schools are considered, together with the factors limiting its practicality. Influenced by the writing of Polanyi, Hodgkin, and Claxton, and by experience of, and research into, students doing problem-solving projects in schools, I argue that it is both desirable and possible to incorporate some such authentic science into the school science curriculum. In spite of many unsympathetic pressures acting on current schooling, I believe that there is a vital need to reaffirm the importance of the tacit and the affective in school science.

18.
This paper aims to provide a basis for renewed talk about use in computing. Four current discourse arenas are described. The different intentions manifest in each arena are linked to failures of translation, as different terminologies cross disciplinary and national boundaries non-reflexively. Analysis of transnational use-discourse dynamics shows much miscommunication. Conflicts such as that between the Scandinavian System Development School and the usability approach have less current salience. Renewing our talk about use is essential to a participatory politics of information technology and will lead to a clearer perception of the implications of letting new systems become primary media of social interaction.

19.
Ergonomics, 2012, 55(2):203-209
The most marked ‘morning’ and ‘evening’ types in a psychology class were identified by means of a questionnaire and asked to record their oral temperatures and food intakes throughout the day during a 4-week and a 4-day period, respectively. The morning group had its mean circadian temperature maximum 5 h earlier than the evening group, and had its cumulative food-intake distribution curve 1¾ h ahead of the evening group. After shifting the food distributions by 1¾ h along the time base to obtain a least-squares fit, significant differences between the distributions remained. It is suggested that morning types have a more autonomous 24-hour periodicity than evening types. It is concluded that the questionnaires have the power to discriminate extreme morning and evening types of individuals in terms of oral temperature and food intake. Food intake seems to be a sensitive enough measure to be included in studies of inter-individual differences in circadian rhythms.
