Similar Documents
20 similar documents found.
1.
The teaching of literature through CAI raises problems of both a linguistic and an instructional nature; student involvement and creativity in studying literature, and especially poetry, are difficult to build into a computer-based lesson. We have confronted these difficulties in the lesson Poetry I, which introduces undergraduates to basic concepts of poetic verse in a design using screen display, speech synthesis, and verse processing to maximize interactivity and student involvement. The lesson contains instructional modules which include the student's composition of a limerick. Computational processing of the limerick's text enables the program to offer guidance as the student composes and revises the verse and gains first-hand experience with metrical language. The significant problems inherent in processing verse are addressed by adapting Digital Equipment Corporation's DECTalk speech synthesizer as an engine converting natural-language verse text into accessible strings of phonemic symbols. Although the verse processor cannot scan verse, it can interpret DECTalk's symbols reliably enough to successfully stimulate students' thinking about their verse compositions. Preliminary responses to the program have been favorable and demonstrate its effectiveness in involving students more deeply in learning about poetic verse. This suggests that the techniques of verse processing prototyped in Poetry I might usefully be extended to other types of verse and levels of study.

W. Webster Newbold is an Assistant Professor of English at Ball State University, Muncie, Indiana, USA, who has recently served as English Department Computer Coordinator. His research interests include computer-assisted instruction and computer-based composition. Herbert F. W. Stahlke is a Professor of English, specializing in linguistics, and currently serves as Associate Director for Academic Computing, Ball State University Computing Services. His research interests include computer-assisted instruction, natural language processing, and lexical databases.
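The abstract does not give implementation details, but a minimal sketch of the kind of stress-pattern guidance it describes might look as follows. It assumes phonemic output in which primary stress is marked with an apostrophe, and the 3-3-2-2-3 beat pattern is a simplified limerick template; both are illustrative assumptions, not details taken from Poetry I.

```python
# A minimal sketch (not the authors' code) of checking a limerick's
# stress pattern from phonemic transcriptions. We assume each verse
# line has been converted to a phonemic string in which primary
# stress is marked with an apostrophe.

LIMERICK_STRESSES = [3, 3, 2, 2, 3]  # assumed stressed beats per line

def stressed_syllables(phonemic_line: str) -> int:
    """Count primary-stress markers in a phonemic transcription."""
    return phonemic_line.count("'")

def check_limerick(phonemic_lines: list[str]) -> list[str]:
    """Return feedback comparing each line against the limerick pattern."""
    feedback = []
    for i, (line, expected) in enumerate(zip(phonemic_lines, LIMERICK_STRESSES), 1):
        found = stressed_syllables(line)
        if found == expected:
            feedback.append(f"line {i}: OK ({found} beats)")
        else:
            feedback.append(f"line {i}: expected {expected} beats, found {found}")
    return feedback
```

Feedback of this kind does not scan verse in the scholarly sense; it merely gives the student a concrete prompt for revising a draft line, which matches the guidance role described in the abstract.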

2.
The Century of Prose Corpus is a historical corpus of British English of the period 1680–1780. It has been designed to provide a resource for students of the language of that era. The COPC is diachronic and may be considered a unit in what will eventually become a series of corpora providing access to the whole of the English language, from the oldest specimens to the present. This article describes and explains the various features of the COPC.

Louis T. Milic is Professor of English Emeritus at Cleveland State University. He retired from teaching in 1991 and has been Secretary-Treasurer of the Dictionary Society of North America since 1990. He has also occupied himself with research and publication: "Zeugmas in Gibbon's History," Prose Studies (1991); "Fielding's Linguistic Sub-Stratum," Orbis Litterarum (1990).

3.
When students use computers as learning tools, the whole process of learning, and, indeed, the learners themselves, are transformed. This article illustrates some techniques that foster transformative learning in computer-assisted first-year literature classes: first, a lesson plan on "A Valediction: Forbidding Mourning" that uses Microsoft Word functions, including format painter, tables, and annotation, to explore meaning in context; second, a plan for learners to use subconference options in the Daedalus Interactive Writing Environment to analyze Oedipus Rex; finally, a demonstration of how students engage in a meta-reflection process as they explore "Barn Burning" with Freelance Graphics.

Marguerite Jamieson is an English instructor at Anne Arundel Community College in Arnold, Maryland, and a doctoral student at George Mason University. Her research interests include forming bridges between adult learning theory and contemporary literary theory, especially drawing on transformational learning theory and the work of Mikhail Bakhtin and Lev Vygotsky. Rebecca Kajs holds a doctorate in English from Texas Woman's University with a concentration in rhetoric. For ten years, she taught the use of heuristic tools for reading analysis at the University of Texas at Arlington. She is currently an associate professor of English and Philosophy at Anne Arundel Community College. Anne Agee holds a doctorate in rhetoric from The Catholic University of America. A professor of English and formerly director of the Humanities Computer Center at Anne Arundel Community College, she is currently the college's Coordinator of Instructional Technology. Dr. Agee and Professor Jamieson have collaborated in a study of the learning environment in a computer classroom, the results of which were published in the Fall 1995 issue of Teaching/Learning Conversations. Dr. Agee has also published "Using [Daedalus] InterChange as a Teachers' Journal" in the Fall 1995 issue of Wings.

4.
Literature instructors are using hypertext to enhance their teaching in a broad variety of ways that includes putting course materials on the WWW; creating online tutorials; using annotated hypertexts in addition to or in lieu of print texts; having students write hypertexts; examining the medium of hypertext as a literary and cultural theme; and studying hypertext fiction in the context of traditional literature classes. The article describes examples of each of these uses of hypertext in teaching literature and provides sources of further examples of, and information on, using hypertext as a teaching tool in literature classes.

Seth R. Katz is Assistant Professor of English at Bradley University in Peoria, IL. His research interests include computer applications in teaching literature and writing, and the grammatical analysis of poetic language. His recent publications include "Graduate Programs and Job Training" in Profession 95. I presented a version of this article as part of a session on "Hypertexts for Teaching Imaginative Literature" at the MLA Convention in Chicago, December 29, 1995.

5.
This paper suggests ways in which the pattern-matching capability of the computer can be used to further our understanding of stylized ballad language. The study is based upon a computer-aided analysis of the entire 595,000-word corpus of Francis James Child's The English and Scottish Popular Ballads (1882–1892), a collection of 305 textual traditions, most of which are represented by a variety of texts. The paper focuses on the "Mary Hamilton" tradition as a means of discussing the function of phatic language in the ballad genre and the significance of textual variation.

Cathy Lynn Preston is a Research Associate, Computer Research in the Humanities, at the University of Colorado, Boulder. She is interested in folklore, particularly oral narrative; popular literature of the 18th and 19th centuries, particularly broadsides and chapbooks; the works of John Gay, Jonathan Swift, and Thomas Hardy; and Middle English romance and lyric. Her major publications are A KWIC Concordance to Jonathan Swift's A Tale of a Tub, The Battle of the Books, and A Discourse Concerning the Mechanical Operation of the Spirit, A Fragment (New York: Garland Publishing, 1984), co-authored with Harold D. Kelling, and A KWIC Concordance to Thomas Hardy's Tess of the d'Urbervilles (New York: Garland Publishing, 1989).
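As an illustration of the pattern matching such a study relies on, here is a minimal keyword-in-context (KWIC) concordance sketch in Python. It is not the study's actual tooling, and the corpus handling is deliberately simplified.

```python
# Minimal KWIC (keyword-in-context) concordance sketch: list each
# occurrence of a keyword with a window of surrounding words, so that
# formulaic, phatic phrasings recur visibly across variant texts.

def kwic(corpus: str, keyword: str, width: int = 4) -> list[str]:
    """Show each occurrence of `keyword` with `width` words of context."""
    words = corpus.split()
    lines = []
    for i, w in enumerate(words):
        if w.lower().strip(".,;:!?") == keyword.lower():
            left = " ".join(words[max(0, i - width):i])
            right = " ".join(words[i + 1:i + 1 + width])
            lines.append(f"{left:>40} | {w} | {right}")
    return lines

# e.g. kwic(ballad_text, "word") would align formulaic lines such as
# "Word's gane to the kitchen" across "Mary Hamilton" variants.
```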

6.
In "And Then There Were None," Ward Elliot and Robert Valenza report on the work of the Shakespeare Clinic (Claremont McKenna Colleges, 1987–1995). Working from popular theories that William Shakespeare is not the true author of the plays and poems ascribed to him, Elliot and Valenza cast a broad net to find another writer whose distinctive linguistic features match those of the Shakespeare canon. A regime of 51 tests was designed to compare Shakespeare's drama with 79 non-Shakespearean (or at least noncanonical) plays. Success rates at or near 100% are reported for the Elliot-Valenza tests in distinguishing Shakespeare from non-Shakespeare. A smaller battery of tests was designed to distinguish Shakespeare's poems from nondramatic texts by other poets, with similar success rates reported. But many of the Elliot-Valenza tests are deeply flawed in both their design and their execution.

Donald Foster is the Jean Webster Professor of Dramatic Literature in the Department of English at Vassar College.

7.
Exploiting user feedback to compensate for the unreliability of user models
Natural language is a powerful medium for interacting with users, and sophisticated computer systems using natural language are becoming more prevalent. Just as human speakers show an essential, inbuilt responsiveness to their hearers, computer systems must tailor their utterances to users. Recognizing this, researchers devised user models and strategies for exploiting them in order to enable systems to produce the best answer for a particular user. Because these efforts were largely devoted to investigating how a user model could be exploited to produce better responses, systems employing them typically assumed that a detailed and correct model of the user was available a priori, and that the information needed to generate appropriate responses was included in that model. However, in practice, the completeness and accuracy of a user model cannot be guaranteed. Thus, unless systems can compensate for incorrect or incomplete user models, the impracticality of building user models will prevent much of the work on tailoring from being successfully applied in real systems. In this paper, we argue that one way for a system to compensate for an unreliable user model is to be able to react to feedback from users about the suitability of the texts it produces. We also discuss how such a capability can actually alleviate some of the burden now placed on user modeling. Finally, we present a text generation system that employs whatever information is available in its user model in an attempt to produce satisfactory texts, but is also capable of responding to the user's follow-up questions about the texts it produces.

Dr. Johanna D. Moore holds interdisciplinary appointments as an Assistant Professor of Computer Science and as a Research Scientist at the Learning Research and Development Center at the University of Pittsburgh. Her research interests include natural language generation, discourse, expert system explanation, human-computer interaction, user modeling, intelligent tutoring systems, and knowledge representation. She received her MS and PhD in Computer Science, and her BS in Mathematics and Computer Science, from the University of California at Los Angeles. She is a member of the Cognitive Science Society, ACL, AAAI, ACM, IEEE, and Phi Beta Kappa. Readers can reach Dr. Moore at the Department of Computer Science, University of Pittsburgh, Pittsburgh, PA 15260. Dr. Cecile Paris is the project leader of the Explainable Expert Systems project at USC's Information Sciences Institute. She received her PhD and MS in Computer Science from Columbia University (New York) and her bachelor's degree from the University of California at Berkeley. Her research interests include natural language generation and user modeling, discourse, expert system explanation, human-computer interaction, intelligent tutoring systems, machine learning, and knowledge acquisition. At Columbia University, she developed a natural language generation system capable of producing multi-sentential texts tailored to the user's level of expertise about the domain. At ISI, she has been involved in designing a flexible explanation facility that supports dialogue for an expert system shell. Dr. Paris is a member of the Association for Computational Linguistics (ACL), the American Association for Artificial Intelligence (AAAI), the Cognitive Science Society, ACM, IEEE, and Phi Kappa Phi. Readers can reach Dr. Paris at USC/ISI, 4676 Admiralty Way, Marina Del Rey, California, 90292.
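A toy Python sketch of the control loop the paper argues for: generate from a possibly incomplete user model, then revise the model and regenerate when the user's follow-up signals dissatisfaction. The expertise attribute and trigger phrases are illustrative assumptions, not features of Moore and Paris's system.

```python
# Sketch of feedback-driven regeneration: the system does its best
# with whatever the user model contains, and treats follow-up
# questions as evidence for correcting that model.

def generate(topic: str, model: dict) -> str:
    """Produce an explanation tailored to the assumed expertise level."""
    level = model.get("expertise", "novice")   # default when model is incomplete
    return f"[{level}-level explanation of {topic}]"

def dialogue(topic: str, model: dict, followups: list[str]) -> list[str]:
    """Regenerate in response to follow-up questions about the text."""
    texts = [generate(topic, model)]
    for q in followups:
        if "simpler" in q:
            model["expertise"] = "novice"      # feedback overrides the prior model
        elif "more detail" in q:
            model["expertise"] = "expert"
        texts.append(generate(topic, model))
    return texts

# dialogue("circuits", {}, ["can you give more detail?"])
# -> ['[novice-level explanation of circuits]',
#     '[expert-level explanation of circuits]']
```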

8.
Harnad's proposed robotic upgrade of Turing's Test (TT), from a test of linguistic capacity alone to a Total Turing Test (TTT) of linguistic and sensorimotor capacity, conflicts with his claim that no behavioral test provides even probable warrant for attributions of thought, because there is no evidence of consciousness besides private experience. The intuitive, scientific, and philosophical considerations Harnad offers in favor of his proposed upgrade are unconvincing. I agree with Harnad that distinguishing real from "as if" thought on the basis of the presence or lack of consciousness (thus rejecting Turing (behavioral) testing as sufficient warrant for mental attribution) has the skeptical consequence Harnad accepts: there is in fact no evidence for me that anyone else but me has a mind. I disagree with his acceptance of it! It would be better to give up the neo-Cartesian faith in private conscious experience underlying Harnad's allegiance to Searle's controversial Chinese Room Experiment than to give up all claim to know that others think. It would be better to allow that passing Turing's Test evidences, even strongly evidences, thought.

9.
Abe, Naoki; Mamitsuka, Hiroshi. Machine Learning, 1997, 29(2–3): 275–301.
We propose a new method for predicting the protein secondary structure of a given amino acid sequence, based on a training algorithm for the probability parameters of a stochastic tree grammar. In particular, we concentrate on the problem of predicting β-sheet regions, which has previously been considered difficult because of the unbounded dependencies exhibited by sequences corresponding to β-sheets. To cope with this difficulty, we use a new family of stochastic tree grammars, which we call Stochastic Ranked Node Rewriting Grammars, which are powerful enough to capture the type of dependencies exhibited by the sequences of β-sheet regions, such as the parallel and anti-parallel dependencies and their combinations. The training algorithm we use is an extension of the inside-outside algorithm for stochastic context-free grammars, but with a number of significant modifications. We applied our method to real data obtained from the HSSP database (Homology-derived Secondary Structure of Proteins, Ver. 1.0), and the results were encouraging: our method was able to predict roughly 75 percent of the β-strands correctly in a systematic evaluation experiment, in which the test sequences not only have less than 25 percent identity to the training sequences, but are totally unrelated to them. This figure compares favorably to the predictive accuracy of the state-of-the-art prediction methods in the field, even though our experiment was on a restricted type of β-sheet structures and the test was done on a relatively small data size. We also stress that our method can predict the structure as well as the location of β-sheet regions, which was not possible with conventional methods for secondary structure prediction. Extended abstracts of parts of the work presented in this paper have appeared in (Abe & Mamitsuka, 1994) and (Mamitsuka & Abe, 1994).
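For context, here is a minimal Python sketch of the inside-probability computation for a stochastic context-free grammar in Chomsky normal form, the base algorithm that the authors extend to their ranked node rewriting tree grammars. It is not the paper's implementation, and the grammar representation is an illustrative choice.

```python
# Inside algorithm (CKY-style) for a stochastic CFG in Chomsky normal
# form: alpha[(i, j, A)] is the probability that nonterminal A derives
# the subsequence seq[i..j].
from collections import defaultdict

def inside(seq, unary, binary, start="S"):
    """P(seq | grammar). unary: (A, terminal) -> prob; binary: (A, B, C) -> prob."""
    n = len(seq)
    alpha = defaultdict(float)
    for i, a in enumerate(seq):                 # base case: A -> terminal
        for (A, term), p in unary.items():
            if term == a:
                alpha[(i, i, A)] += p
    for span in range(2, n + 1):                # longer spans from shorter ones
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):               # split point
                for (A, B, C), p in binary.items():
                    alpha[(i, j, A)] += p * alpha[(i, k, B)] * alpha[(k + 1, j, C)]
    return alpha[(0, n - 1, start)]
```

The inside-outside procedure re-estimates rule probabilities from these quantities; the paper's contribution is making an analogous computation feasible for tree grammars expressive enough to model the crossing dependencies of β-sheets.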

10.
Radical changes in class discussion using networked computers
This study examines the effects of conducting class discussion on a local area network. A real-time networking program (INTERCHANGE) was used for class discussion in freshman and senior literature courses and in a graduate humanities computing class. Pseudonyms, collaborative exams and essays, and computer-assisted reading were tested, along with organization of the students by sex and personality type. At the beginning and end of each semester in each class, students were asked 50 to 70 multiple-choice questions. Their answers revealed that the many advantages of computer-assisted class discussion (CACD) clearly outweigh the disadvantages.

Jerome Bump, Professor of English at the University of Texas, is the author of Gerard Manley Hopkins (1982) and "CAI in Writing at the University: Some Recommendations," Computers and Education 11, 2 (1987), 121–33. He explores the interface of CAI, psychology, and the humanities.

11.
A formal functional specification of a serializable interface for an interactive database is given and refined into two different versions with distinct strategies for solving read/write conflicts. The formalization is based on techniques of algebraic specification for defining the basic data structures, and on functional system specification by streams and stream-processing functions for defining the properties concerning interaction. It is demonstrated, in particular, how different specification techniques can be used side by side.

Manfred Broy finished his studies with the Diplom in Mathematics and Computer Science at the Technical University of Munich. Until 1983 he was a research and teaching assistant at the Institut für Informatik and the Sonderforschungsbereich 49 "Programmiertechnik". At the Technical University of Munich he also completed his Ph.D. (in February 1980, with the thesis "Transformation parallel ablaufender Programme") and qualified as a university lecturer (in 1982, with the thesis "A Theory for Nondeterminism, Parallelism, Communication and Concurrency"). In April 1983 he became a Full Professor for Computer Science at the Faculty of Mathematics and Computer Science at the University of Passau. Since October 1989 he has been Full Professor for Computer Science at the Technical University of Munich. His fields of interest are programming languages, program development, programming methodology, and distributed systems. This work was supported by the DFG project "Transformation paralleler Programme" and by the Sonderforschungsbereich 342 "Werkzeuge und Methoden für die Nutzung paralleler Architekturen".
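A loose Python illustration of a stream-processing function in this sense: a function from a stream of requests to a stream of responses that serializes access in arrival order. The data model and the conflict strategy are illustrative assumptions, not the paper's specification.

```python
# A stream-processing function: consume an input stream of database
# requests and yield an output stream of responses, one interaction
# at a time, so all accesses are trivially serialized.
from typing import Iterator, Tuple

Request = Tuple[str, str, str]   # (operation, key, value); value is "" for reads

def serialize(requests: Iterator[Request]) -> Iterator[str]:
    """Process requests in arrival order, yielding a response stream."""
    db: dict[str, str] = {}
    for op, key, value in requests:
        if op == "write":
            db[key] = value
            yield f"ok: {key} := {value}"
        elif op == "read":
            yield f"{key} = {db.get(key, '<undefined>')}"

# list(serialize(iter([("write", "x", "1"), ("read", "x", "")])))
# -> ['ok: x := 1', 'x = 1']
```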

12.
The adaptiveness of agents is one of the basic conditions for autonomy. This paper describes an approach to adaptiveness for Monitoring Cognitive Agents based on the notion of generic spaces. This notion allows the definition of virtual generic processes, so that any particular actual process is then a simple configuration of the generic process, that is to say, a set of parameter values. Consequently, a generic domain ontology containing the generic knowledge for solving problems concerning the generic process can be developed. This leads to the design of the Generic Monitoring Cognitive Agent, a class of agent in which the whole knowledge corpus is generic. In other words, modeling a process within a generic space becomes configuring a generic process, and adaptiveness becomes genericity, that is to say, independence with regard to technology. In this paper, we present an application of this approach in Sachem, a Generic Monitoring Cognitive Agent designed to help operators run a blast furnace. Specifically, the NeuroGaz module of Sachem is used to present the notion of a generic blast furnace. The adaptiveness of Sachem can then be seen in the low cost of deploying a Sachem instance on different blast furnaces and in the ability of NeuroGaz to solve problems and learn from various top-gas instrumentation.
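A minimal Python sketch of the genericity idea: the generic process is a parameterized template, and a particular plant is just a configuration, i.e., a set of parameter values. All names and parameters here are invented for illustration and are not taken from Sachem.

```python
# Genericity as configuration: monitoring knowledge is written once
# against the generic process; deploying on a new plant means
# supplying parameter values, not re-engineering the knowledge.
from dataclasses import dataclass

@dataclass
class GenericBlastFurnace:
    """Generic process: rules refer only to these parameters."""
    hearth_diameter_m: float
    n_tuyeres: int
    top_gas_sensors: list[str]

    def monitor(self, reading: dict[str, float]) -> str:
        # A generic rule, stated once, valid for every configuration:
        # flag a fault if any expected top-gas sensor is silent.
        missing = set(self.top_gas_sensors) - reading.keys()
        return "sensor fault" if missing else "nominal"

# Deployment on two different furnaces is just two configurations:
furnace_a = GenericBlastFurnace(11.0, 28, ["CO", "CO2", "H2"])
furnace_b = GenericBlastFurnace(14.5, 36, ["CO", "CO2", "H2", "N2"])
```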

13.
This article uses recent work on the computer-aided analysis of texts by the French writer Céline as a framework to discuss Olsen's paper on the current state of computer-aided literary analysis. Drawing on analysis of syntactic structures, lexical creativity, and the use of proper names, it makes two points: (1) given a rich theoretical framework and sufficiently precise models, even simple computer tools such as text editors and concordances can make a valuable contribution to literary scholarship; (2) it is important to view the computer not as a device for finding what we as readers have failed to notice, but rather as a means of focussing more closely on what we have already felt as readers, and of verifying hypotheses we have produced as researchers.

Johanne Bénard is an Assistant Professor of French. She finished her Ph.D. thesis at the Université de Montréal in 1989 and is working on a book which can be described as an autobiographical reading of Céline's work. She has published various articles on Céline's correspondence (the latest being "La lettre du/au père," Colloque international de Toulouse L.-F. Céline, 1990) and on the theory of autobiography ("Le contexte de l'autobiographie," RSSI 11 [1991]). Her present interest is the linguistic aspects of Céline's text and the theory of orality. Greg Lessard is an Associate Professor in the French Studies and Computing and Information Science departments. His research areas include natural language generation, computer-aided text analysis, and the linguistic analysis of second-language performance errors. Recent publications include articles in Research in Humanities Computing: 1989 on orality in Canadian French novels, and in Literary and Linguistic Computing, 6, 4 (1991) on repeated structures in literary texts.

14.
This study compares pencil-and-paper and computer-assisted versions of a college process/model program in critical thinking and academic writing to a traditional composition program. Students in the experimental sections used more linguistic markers of argument and comparison/contrast, attempted more arguments, and made stronger arguments. CAI students also did better than students in the pencil-and-paper sections on some measures. Applications of computer-assisted techniques in studying meaning in poetry, teaching technical writing, and teaching practice suggest new approaches to collaborative thinking and writing.

Thomas Bacig is a Professor of Humanities at the University of Minnesota, Duluth, with research interests in computer-assisted instruction in writing, reading, and thinking; science fiction; and forest history. His most recent publication is "How Computer-Assisted Instruction Informs the Design of Conventional Classroom Activities in English Composition" (with D. Larmouth), Proceedings of the 19th Annual Small College Computing Symposium, 1987. Robert Evans is Associate Professor of Philosophy at the University of Minnesota, Duluth, with research interests in logic, American philosophy, and philosophy of law. His most recent publication is a review of Volume 11 of John Dewey, The Later Works, containing Dewey's major political work Liberalism and Social Action, in Transactions of the Charles S. Peirce Society, 15, Winter 1989. Donald Larmouth is Professor of Linguistics and Dean of Arts, Sciences and Graduate Programs at the University of Wisconsin, Green Bay. He has research interests in dialect geography, computers in composition, and language policy. His most recent publication is "Does Linguistic Heterogeneity Erode National Unity?" in Thomas Tonneson's Ethnicity and Language, Institute on Race and Ethnicity of the University of Wisconsin System, 1988. Kenneth Risdon is Assistant Professor of Composition at the University of Minnesota, Duluth, with research interests in writing with computers, use of computer networks, and computer analysis of text.

15.
The number of virtual connections in the nodal space of an ATM network of arbitrary structure and topology is computed by a method based on a new concept, the covering domain, which has a concrete physical meaning. The method rests on a model of network information sources and boundary switches developed for an ATM transfer network by the entropy approach. Computations involve the solution of systems of linear equations. The optimization model used to compute the number of virtual connections for multi-category traffic in an ATM network component is useful in estimating the resources of nodal equipment and communication channels. The variable parameters of the model are the transmission bands for the different traffic categories.
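A minimal sketch of the computational core, solving a linear system with NumPy. The coefficients and traffic values are illustrative placeholders, not values derived from the paper's entropy model.

```python
# The method reduces to solving linear systems: here, A x = b, where x
# holds the number of virtual connections per boundary-switch pair and
# b the traffic offered by the information sources (illustrative data).
import numpy as np

A = np.array([[1.0, 0.4, 0.2],
              [0.4, 1.0, 0.3],
              [0.2, 0.3, 1.0]])
b = np.array([120.0, 80.0, 60.0])

x = np.linalg.solve(A, b)   # virtual connections per pair
print(np.round(x, 1))
```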

16.
Critics have condemned English Romantic tragedies as a series of poor imitations of Renaissance tragedy. This paper tests such literary-critical questions through statistical comparisons of ten plays from each group. The measures chosen give evidence of a strong and consistent difference between the groups, going beyond historical changes in the language. The Romantic tragedies are more expository; the Renaissance ones include more commonplace interactions between characters. The later plays do not show the marked variations in function-word frequencies of their predecessors. Of the Renaissance plays, Shakespeare's show the closest affinity to the Romantic tragedies, and the most telling contrasts.

After retiring from his Chair of English in 1989, John Burrows became Honorary Director of the Centre for Literary and Linguistic Computing at the University of Newcastle, N.S.W. His publications include Computation into Criticism: A Study of Jane Austen and an Experiment in Method (Oxford: Clarendon, 1987). He is now working on another book. D. H. Craig is an Associate Professor in the English Department at the University of Newcastle, N.S.W. He has edited Ben Jonson: The Critical Heritage (London: Routledge, 1990) and is writing a book on Jonson's style, based on frequency counts of very common words.
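A minimal Python sketch of the kind of function-word test described: compare mean relative frequencies of common function words across two groups of plays. The word list and the texts are illustrative, not the study's actual measures.

```python
# Function-word profiling: very common words are used at rates that
# vary by author and period, so their relative frequencies can be
# compared across groups of plays.
from collections import Counter

FUNCTION_WORDS = ["the", "and", "of", "to", "in", "a", "i", "you", "it", "not"]

def profile(text: str) -> dict[str, float]:
    """Relative frequency of each function word in one text."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words)
    return {w: counts[w] / total for w in FUNCTION_WORDS}

def mean_profile(texts: list[str]) -> dict[str, float]:
    """Average the per-text profiles for a group of plays."""
    profiles = [profile(t) for t in texts]
    return {w: sum(p[w] for p in profiles) / len(profiles) for w in FUNCTION_WORDS}

# Comparing mean_profile(renaissance_plays) against
# mean_profile(romantic_plays) word by word exposes the kind of
# consistent group differences the paper reports.
```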

17.
This paper indicates the possible utility of isotonic spaces as a background language for discussing systems. In isotonic spaces the basic duality between neighborhood and convergent first achieves a proper background, permitting applications beyond the scope of topological spaces. A generalization of continuity of mappings based on ancestral relations is presented, and this definition is applied to establish a necessary and sufficient condition that mappings preserve connectedness. Fortunately for systems theory, it is not necessary to have infinite sets or infinitary operators to apply the definitions of neighborhood, convergent, continuity, and connectedness. This work was supported in part by a grant from the National Science Foundation.
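As background, one standard axiomatization of isotonic spaces, and of continuity between them, can be stated as follows. This is offered for orientation only; the paper's own axioms may differ in detail.

```latex
% One standard formulation (an illustrative assumption): an isotonic
% space is a set $X$ with a closure operator
% $\mathrm{cl}\colon \mathcal{P}(X)\to\mathcal{P}(X)$ satisfying
\[
\mathrm{cl}(\emptyset) = \emptyset,
\qquad
A \subseteq B \;\Longrightarrow\; \mathrm{cl}(A) \subseteq \mathrm{cl}(B).
\]
% Continuity of $f\colon X \to Y$ then generalizes as
\[
f\bigl(\mathrm{cl}_X(A)\bigr) \subseteq \mathrm{cl}_Y\bigl(f(A)\bigr)
\quad \text{for all } A \subseteq X,
\]
% and maps continuous in this sense send connected sets to connected
% sets, the kind of preservation result the paper characterizes.
```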

18.
This paper discusses some of the key drivers that will enable businesses to operate effectively on-line, and looks at how the notion of a website will become that of an on-line presence supporting the main activities of an organisation. This is placed in the context of the development of the information society, which will allow individuals, as consumers or employees, quick, inexpensive, and on-demand access to vast quantities of entertainment, services, and information. The paper draws on an example of these developments in Australasia.

19.
In this paper, we propose a two-layer sensor fusion scheme for multiple-hypothesis multisensor systems. To reflect reality in decision making, uncertain decision regions are introduced in the hypothesis testing process. The entire decision space is partitioned into distinct correct, uncertain, and incorrect regions. The first layer of decisions is made by each sensor independently, based on a set of optimal decision rules. The fusion process is performed by treating the fusion center as an additional virtual sensor in the system. This virtual sensor makes its decision based on the decisions reached by the set of sensors in the system. The optimal decision rules are derived by minimizing the Bayes risk function. As a consequence, the performance of the system, as well as of the individual sensors, can be quantified by the probabilities of correct, incorrect, and uncertain decisions. Numerical examples of three-hypothesis, two- and four-sensor systems are presented to illustrate the proposed scheme.
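A minimal Python sketch of a decision rule with an explicit uncertain region: a hypothesis is chosen only when its posterior clears a threshold, and the decision is declared uncertain otherwise. The threshold and priors are illustrative, not the paper's Bayes-risk-optimal values.

```python
# Per-sensor decision with an uncertain region: rather than forcing a
# choice among hypotheses, report "uncertain" when no posterior is
# dominant enough. A fusion center could then combine these outputs.
import numpy as np

def decide(likelihoods: np.ndarray, priors: np.ndarray, tau: float = 0.7):
    """Return the index of the chosen hypothesis, or None if uncertain."""
    post = likelihoods * priors
    post = post / post.sum()              # posterior over hypotheses
    k = int(np.argmax(post))
    return k if post[k] >= tau else None  # None marks the uncertain region

# Three hypotheses, one sensor's likelihoods for an observation:
print(decide(np.array([0.6, 0.3, 0.1]), np.array([1/3, 1/3, 1/3])))  # None
print(decide(np.array([0.9, 0.05, 0.05]), np.array([1/3, 1/3, 1/3])))  # 0
```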

20.
We consider two formalisations of the notion of a compositional semantics for a language, and find some equivalent statements in terms of substitutions. We prove a theorem stating necessary and sufficient conditions for the existence of a canonical compositional semantics extending a given partial semantics, after discussing what features one would want such an extension to have. The theorem involves some assumptions about semantical categories in the spirit of Husserl and Tarski.
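For orientation, one common statement of compositionality, given below, makes the link to substitutions explicit. The paper compares two formalisations, so its exact statements may differ from this one.

```latex
% A common formalization (illustrative, not necessarily the paper's):
% the meaning function $\mu$ is a homomorphism from syntax to semantics,
\[
\mu\bigl(\sigma(e_1, \dots, e_n)\bigr)
  = r_\sigma\bigl(\mu(e_1), \dots, \mu(e_n)\bigr),
\]
% where $\sigma$ is a syntactic operation and $r_\sigma$ its semantic
% counterpart. The substitution connection: if $\mu(e_i) = \mu(e_i')$,
% then replacing $e_i$ by $e_i'$ leaves the meaning of the whole
% expression unchanged.
```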
