Similar Documents
 20 similar documents found.
1.
In this paper we develop and analyze a new superconvergent local discontinuous Galerkin (LDG) method for approximating solutions to the fourth-order Euler–Bernoulli beam equation in one space dimension. We prove the $L^2$ stability of the scheme and several optimal $L^2$ error estimates for the solution and for the three auxiliary variables that approximate derivatives of different orders. Our numerical experiments demonstrate optimal rates of convergence. We also prove superconvergence results towards particular projections of the exact solutions. More precisely, we prove that the LDG solution and its spatial derivatives (up to third order) are $\mathcal O (h^{k+3/2})$ superclose to particular projections of the exact solutions for $k$th-degree polynomial spaces, while computational results show a higher $\mathcal O (h^{k+2})$ convergence rate. Our proofs are valid for arbitrary regular meshes, for $P^k$ polynomials with $k\ge 1$, and for periodic, Dirichlet, and mixed boundary conditions. These superconvergence results will be used to construct asymptotically exact a posteriori error estimates by solving a local steady problem on each element. This will be reported in Part II of this work, where we will prove that the a posteriori LDG error estimates for the solution and its derivatives converge to the true errors in the $L^2$-norm under mesh refinement.
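As a point of reference, LDG methods for fourth-order problems are built on a first-order rewriting of the equation; a minimal sketch in standard notation (the paper's exact equation, signs, and numerical fluxes may differ) is:

```latex
% Illustrative LDG rewriting of the Euler--Bernoulli beam equation
% u_t + u_{xxxx} = 0 as a first-order system:
\begin{aligned}
  q &= u_x, \qquad r = q_x, \qquad s = r_x, \\
  u_t + s_x &= 0,
\end{aligned}
% so the auxiliary variables q, r, s approximate u_x, u_{xx}, u_{xxx};
% each equation is then discretized with discontinuous piecewise
% polynomials of degree k and numerical fluxes at element interfaces.
```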

2.
Methods for the computer-aided design and analysis of color-woven fabrics are presented. The relations among the color pattern, the colored warp, and the colored weft, together with a model of regular weave texture, are given based on their formative principles. Efficient algorithms are developed to determine the color order of warp and weft threads that yields a given color pattern and regular weave texture. Finally, the structure of the design system and its data-processing flow are introduced.
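As an illustration of the formative principle relating color pattern, colored warp, colored weft, and weave texture, here is a minimal sketch (the function name, color encoding, and interlacing convention are assumptions for illustration, not the paper's notation):

```python
def color_pattern(warp_colors, weft_colors, weave):
    """Derive the visible color pattern of a color-woven fabric.

    warp_colors: one color per warp (vertical) thread
    weft_colors: one color per weft (horizontal) thread
    weave: 0/1 matrix; weave[i][j] == 1 means warp thread j passes over
           weft thread i at that intersection (the weave repeat is tiled).
    Returns a grid of the colors visible on the fabric face.
    """
    rows, cols = len(weave), len(weave[0])
    pattern = []
    for i, weft_color in enumerate(weft_colors):
        row = []
        for j, warp_color in enumerate(warp_colors):
            # Warp on top shows the warp color, otherwise the weft color shows.
            on_top = weave[i % rows][j % cols]
            row.append(warp_color if on_top else weft_color)
        pattern.append(row)
    return pattern

# Example: a 2/2 twill repeat with a striped warp and a solid white weft
twill = [[1, 1, 0, 0], [0, 1, 1, 0], [0, 0, 1, 1], [1, 0, 0, 1]]
print(color_pattern(["red", "red", "blue", "blue"], ["white"] * 4, twill))
```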

3.
Citizen science broadly describes citizen involvement in science. Citizen science has gained significant momentum in recent years, brought about by widespread availability of smartphones and other Internet and communications technologies (ICT) used for collecting and sharing data. Not only are more projects being launched and more members of the public participating, but more human–computer interaction (HCI) researchers are focusing on the design, development, and use of these tools. Together, citizen science and HCI researchers can leverage each other’s skills to speed up science, accelerate learning, and amplify society’s well-being globally as well as locally. The focus of this article is on HCI and biodiversity citizen science as seen primarily through the lens of research in the author’s laboratory. The article is framed around five topics: community, data, technology, design, and a call to save all species, including ourselves. The article ends with a research agenda that focuses on these areas and identifies productive ways for HCI specialists, science researchers, and citizens to collaborate. In a nutshell, while species are disappearing at an alarming rate, citizen scientists who document species’ distributions help to support conservation and educate the public. HCI researchers can empower citizen scientists to dramatically increase what they do and how they do it.

4.
In this paper new a posteriori error estimates for the local discontinuous Galerkin (LDG) method for the one-dimensional fourth-order Euler–Bernoulli partial differential equation are presented and analyzed. These error estimates are computationally simple and are obtained by solving a local steady problem with no boundary condition on each element. We use the optimal error estimates and the superconvergence results proved in Part I to show that the significant parts of the discretization errors for the LDG solution and its spatial derivatives (up to third order) are proportional to \((k+1)\)-degree Radau polynomials, when polynomials of total degree not exceeding \(k\) are used. These results allow us to prove that the \(k\)-degree LDG solution and its derivatives are \(\mathcal O (h^{k+3/2})\) superconvergent at the roots of \((k+1)\)-degree Radau polynomials. We use these results to construct asymptotically exact a posteriori error estimates. We further apply the results proved in Part I to prove that, for smooth solutions, these a posteriori LDG error estimates for the solution and its spatial derivatives at a fixed time \(t\) converge to the true errors at an \(\mathcal O (h^{k+5/4})\) rate. We also prove that the global effectivity indices, for the solution and its derivatives up to third order, in the \(L^2\)-norm converge to unity at an \(\mathcal O (h^{1/2})\) rate. Our proofs are valid for arbitrary regular meshes and for \(P^k\) polynomials with \(k\ge 1\), and for periodic and other classical mixed boundary conditions. Our computational results indicate that the observed numerical convergence rates are higher than the theoretical rates. Finally, we present a local adaptive procedure that makes use of our local a posteriori error estimate.
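For reference, a global effectivity index is conventionally defined as the ratio of the estimated error to the true error; in that standard notation (not necessarily the paper's symbols) the claimed result reads:

```latex
% Standard definition of the global effectivity index (illustrative notation):
\Theta(t) \;=\; \frac{\| E_h(\cdot,t) \|_{L^2}}{\| u(\cdot,t) - u_h(\cdot,t) \|_{L^2}},
% where u_h is the LDG solution and E_h the a posteriori error estimate;
% the result above states \Theta(t) = 1 + \mathcal O(h^{1/2}) for the solution
% and its spatial derivatives up to third order.
```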

5.
The Computer Laboratory of the University of Cambridge was originally founded to provide a computing service for different disciplines across the university. As computer science developed as a discipline in its own right, boundaries necessarily arose between it and other disciplines, in a way that is now often detrimental to progress. Therefore, it is necessary to reinvigorate the relationship between computer science and other academic disciplines and celebrate exploration and creativ...

6.
Integrated modelling and assessment can facilitate exploration of complex social–ecological interactions and quantify trade-offs in regional policy, planning, and management options. However, there have been challenges in its acceptance and adoption for supporting decisions. Here we overcome this implementation gap through the development of an interactive online tool called the Landscape Futures Analysis Tool (LFAT) (http://www.lfat.org.au/). Identifying four high-priority regional management issues (agricultural production, carbon sequestration, biodiversity conservation, and weed management), we developed a series of simple models to explore them under a range of environmental and economic scenarios, including climate change, carbon price, agricultural commodity prices, and production costs. These models were implemented within the LFAT to allow users to select, query, and explore combinations of key variables and examine their impact on each of the management issues through a range of interactive maps and summary statistics.

7.
Available statistical data show that the cost of repairing software faults rises dramatically in later development stages. In particular, the new technology of generating implementation code from architectural specifications requires highly reliable designs. Much research has been done at this stage using verification and validation techniques to prove correctness in terms of certain properties. A prominent approach in this category is model checking (Atlee, J.M., and Gannon, J. 1993. State-based model checking of event-driven systems requirements. IEEE Trans. Software Eng., 19(1): 24–40). Such approaches and the approach of software testing are complementary: testing reveals some errors that cannot be easily identified through verification, and vice versa. This paper presents the technology, and the accompanying tool suite, for the testing of software architecture specifications. We apply our state-of-the-art technology in software coverage testing, program diagnosis, and program understanding to software architectural designs. Our technology is based on both the control flow and the data flow of the executable architectural specifications. It first generates a program flow diagram from the specification and then automatically analyses the coverage features of the diagram. It then collects the corresponding flow data during design simulation and maps it onto the flow diagram. The coverage information for the original specification is then obtained from the coverage information of the flow diagram. This technology has been used for C, C++, and Java, and has proven effective (Agrawal, H., Alberti, J., Li, J.J., et al. 1998. Mining system tests to aid software maintenance, IEEE Computer, July, pp. 64–73).
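The coverage mapping described here (executing the design, recording flow data, and relating it back to the flow diagram) can be sketched generically as follows; this is an illustration of the idea, not the authors' tool or its data formats:

```python
def coverage_report(flow_edges, trace):
    """Compute node and branch coverage of a program flow diagram.

    flow_edges: iterable of (src, dst) pairs forming the flow diagram
    trace: sequence of node identifiers visited during design simulation
    """
    nodes = {n for edge in flow_edges for n in edge}
    branches = set(flow_edges)
    visited_nodes = set(trace) & nodes
    visited_branches = set(zip(trace, trace[1:])) & branches
    return {
        "node_coverage": len(visited_nodes) / len(nodes),
        "branch_coverage": len(visited_branches) / len(branches),
        "uncovered_branches": sorted(branches - visited_branches),
    }

# Example: a tiny flow diagram where the "skip" branch is never exercised
edges = [("entry", "check"), ("check", "handle"), ("check", "skip"),
         ("handle", "exit"), ("skip", "exit")]
print(coverage_report(edges, ["entry", "check", "handle", "exit"]))
```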

8.
Given a graph G=(V,E), a vertex v of G is a median vertex if it minimizes the sum of the distances to all other vertices of G. The median problem consists of finding the set of all median vertices of G. In this note, we present self-stabilizing algorithms for the median problem in partial rectangular grids and relatives. Our algorithms are based on the fact that partial rectangular grids can be isometrically embedded into the Cartesian product of two trees, to which we apply the algorithm proposed by Antonoiu and Srimani (J. Comput. Syst. Sci. 58:215–221, 1999) and Bruell et al. (SIAM J. Comput. 29:600–614, 1999) for computing the medians in trees. Then we extend our approach from partial rectangular grids to a more general class of plane quadrangulations. We also show that the characterization of medians of trees given by Gerstel and Zaks (Networks 24:23–29, 1994) extends to cube-free median graphs, a class of graphs which includes these quadrangulations.
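The definition above can be made concrete with a brute-force sketch that computes distance sums by breadth-first search; the paper's self-stabilizing algorithms are distributed and far more refined than this centralized illustration:

```python
from collections import deque

def median_vertices(adj):
    """Return the set of median vertices of a connected, unweighted graph.

    adj: dict mapping each vertex to an iterable of its neighbours.
    A median vertex minimizes the sum of distances to all other vertices.
    """
    def distance_sum(source):
        dist = {source: 0}
        queue = deque([source])
        while queue:
            v = queue.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
        return sum(dist.values())

    sums = {v: distance_sum(v) for v in adj}
    best = min(sums.values())
    return {v for v, s in sums.items() if s == best}

# Example: on the path a-b-c-d the median vertices are the two middle ones
path = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
print(median_vertices(path))  # {'b', 'c'}
```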

9.
This paper derives from an interdisciplinary research project which is studying the engagement of young people with different aspects of techno-popular culture. The focus is on the young person and the significance of digital technologies in their lives as a whole. Drawing on cultural studies research, we are investigating the ways in which the contexts for computer use are structured by the different discourses present within the family, and the ways in which these discourses may provide a framing context for children’s interactions with digital technology. Drawing on socio-cultural research, we take the view that learning is learning to do something with a cultural or cognitive tool. Our analysis of data from case studies of 16 families shows that the context of home computer use amongst young people is far from a simple and uniform phenomenon and is structured by the different discourses present within the family. What young people learn through interaction with computers is thus as much framed by the context of use as by the affordance of the technology.

10.
A real-time synchronization algorithm for interprocessor communication is presented, based on the techniques of Reference 2, except that buffered communication is used. Upper and lower bounds on the mean response time of this algorithm are derived.

11.
Introduction. A number of universities have recently started to add baccalaureate programs in Information Technology (IT) to their existing programs in Computer Science (CS) and (Management) Information Systems (IS). While some have welcomed this development, others are less accommodating. The argument that IT baccalaureate programs are not sufficiently distinct is most often heard from faculty in programs in Computer Science (CS) and (Management) Information Systems (IS). The argument is often two-fold. First…

12.
Short-term beat-to-beat heart rate data collected from the general population are often interrupted by artifacts, and an arbitrary exclusion of such individuals from analysis may significantly reduce the sample size and/or introduce selection bias. A computer algorithm was developed to label as artifacts any data points outside the upper and lower limits generated by a 5-beat moving average ±25% (or set manually by an operator using a mouse) and to impute beat-to-beat heart rate throughout an artifact period to preserve the timing relationships of the adjacent, uncorrupted heart rate data. The algorithm applies the Fast Fourier Transform to the smoothed data to estimate low-frequency (LF; 0.025–0.15 Hz) and high-frequency (HF; 0.16–0.35 Hz) spectral powers and the HF/LF ratio as conventional indices of the sympathetic, vagal, and vagal–sympathetic balance components, respectively. We applied this algorithm to resting, supine, 2-min beat-to-beat heart rate data collected in the population-based Atherosclerosis Risk in Communities study to assess the performance (success rate) of the algorithm (N = 526) and the inter- and intra-data-operator repeatability of using this computer algorithm (N = 108). Eighty-eight percent (88%) of the records could be smoothed by the computer-generated limits, an additional 4.8% by manually set limits, and 7.4% of the records could not be processed due to a large number of artifacts at the beginning or end of the records. For the repeatability study, 108 records were selected at random, and two trained data operators applied this algorithm to the same records twice, with a 6-month interval between passes (blinded to each other's results and to their own prior results). The inter-data-operator reliability coefficients were 0.86, 0.92, and 0.90 for the HF, LF, and HF/LF components, respectively. The average intra-data-operator reliability coefficients were 0.99, 0.99, and 0.98 for the HF, LF, and HF/LF components, respectively. These results indicate that this computer algorithm is efficient and highly repeatable in processing short-term beat-to-beat heart rate data collected from the general population, provided that the data operators are trained according to a standardized protocol.
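A minimal sketch of this kind of processing (flagging points outside a 5-beat moving average ±25%, interpolating across artifact runs, and estimating LF/HF band powers from the spectrum) is shown below; the window handling, resampling, and spectral details are simplifications for illustration, not the study's implementation:

```python
import numpy as np

def clean_and_band_powers(hr, fs=2.0, tolerance=0.25, window=5):
    """Artifact-smooth a beat-to-beat heart rate series and estimate band powers.

    hr: 1D array of heart rate values, assumed already resampled at fs Hz
    fs: sampling frequency in Hz (an assumption of this sketch)
    tolerance: points outside moving average * (1 +/- tolerance) are artifacts
    window: moving-average length in beats (5, as in the abstract)
    """
    hr = np.asarray(hr, dtype=float)
    # Upper/lower artifact limits from a centered 5-beat moving average.
    baseline = np.convolve(hr, np.ones(window) / window, mode="same")
    artifacts = (hr > baseline * (1 + tolerance)) | (hr < baseline * (1 - tolerance))

    # Impute artifact samples by linear interpolation from clean neighbours,
    # preserving the timing of the adjacent uncorrupted data.
    clean = hr.copy()
    good = ~artifacts
    clean[artifacts] = np.interp(np.flatnonzero(artifacts),
                                 np.flatnonzero(good), hr[good])

    # Crude FFT periodogram and LF/HF band powers.
    detrended = clean - clean.mean()
    freqs = np.fft.rfftfreq(detrended.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(detrended)) ** 2 / detrended.size
    lf = psd[(freqs >= 0.025) & (freqs < 0.15)].sum()
    hf = psd[(freqs >= 0.16) & (freqs <= 0.35)].sum()
    return clean, lf, hf, hf / lf
```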

13.
14.
15.
We report two experiments which tested whether cognitive capacities are limited to those functions that are computationally tractable (the PTIME-Cognition Hypothesis). In particular, we investigated the semantic processing of reciprocal sentences with generalized quantifiers, i.e., sentences of the form Q dots are directly connected to each other, where Q stands for a generalized quantifier, e.g., all or most. Sentences of this type are notoriously ambiguous, and it has been claimed in the semantic literature that the logically strongest reading is preferred (the Strongest Meaning Hypothesis). Depending on the quantifier, the verification of their strongest interpretations is computationally intractable, whereas the verification of the weaker readings is tractable. We conducted a picture completion experiment and a picture verification experiment to investigate whether comprehenders shift from an intractable reading to a tractable reading that should be dispreferred according to the Strongest Meaning Hypothesis. The results from the picture completion experiment suggest that intractable readings do occur in language comprehension. Their verification, however, rapidly exceeds cognitive capacities when the verification problem cannot be solved using simple heuristics. In particular, we argue that during verification, guessing strategies are used to reduce computational complexity.

16.
Future Trends in Computer Graphics: How Much is Enough?
Over the forty-year history of interactive computer graphics there have been continuous advances, but at some stage this progression must terminate, with images being sufficiently realistic for all practical purposes. How much detail do we really need? Polygon counts over a few million imply that on average each polygon paints less than a single pixel, making the use of polygon shading hardware wasteful. We consider the problem of determining how much realism is required for a variety of applications. We discuss how current trends in computer graphics hardware, and in particular graphics cards targeted at the computer games industry, will help or hinder achievement of these requirements. With images now being so convincingly realistic in many cases, critical faculties are often suspended and the images are accepted as correct and truthful, although they may well be incorrect and sometimes misleading or untruthful. Display resolution has remained largely constant in spatial terms for the last twenty years and, in terms of the number of pixels, has increased by less than an order of magnitude. If the long-promised breakthroughs in display technology are finally realised, how should we use the increased resolution?
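The pixel-per-polygon observation follows from simple arithmetic; a quick sketch with an assumed display resolution (the figures here are illustrative, not taken from the article):

```python
def pixels_per_polygon(width, height, polygon_count):
    """Average number of screen pixels available per rendered polygon."""
    return (width * height) / polygon_count

# An assumed 1600x1200 display (~1.9 million pixels) with 3 million visible
# polygons leaves roughly 0.64 pixels per polygon on average.
print(pixels_per_polygon(1600, 1200, 3_000_000))
```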

17.
18.
The rapid diversification of communities in Ontario has prompted the provincial government to reevaluate public school curricula and policies to make schools more inclusive and reflective of its diverse population. This article critically analyzes the content of the latest revised science curricula for Grades 1 to 10 and assesses the degree to which the principles of multiculturalism, including antiracism, found in provincial equity and inclusion policies are implemented. Though some progress has been made toward supporting multicultural science education in the current compulsory science curricula, very little change was observed in the curriculum expectations, the knowledge that students are required to acquire.

19.
We discuss the possibility that the singularity inside a Schwarzschild black hole could be replaced by a regular bounce, described as a regular minimum of the spherical radius (instead of zero) and a regular maximum of the longitudinal scale (instead of infinity) in the corresponding Kantowski–Sachs metric. Such a metric in the vicinity of the bounce is shown to be a solution to the Einstein equations with a stress-energy tensor representing the vacuum polarization of quantum matter fields, described by a combination of curvature-quadratic terms in the effective action. The undetermined parameters of the model can be chosen in such a way that it remains a few orders of magnitude away from the Planck scale (say, at the GUT scale), that is, in a semiclassical regime.
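For orientation, the Kantowski–Sachs line element referred to above can be written in one common convention (the paper's notation and signature may differ) as:

```latex
% Kantowski--Sachs metric inside the horizon (one common convention):
ds^2 \;=\; dt^2 \;-\; a^2(t)\,dx^2 \;-\; r^2(t)\,d\Omega^2 ,
% where r(t) is the spherical radius and a(t) the longitudinal scale.
% A regular bounce replaces the Schwarzschild singularity if, at some t_0,
%   r(t_0) = r_min > 0        (regular minimum of the spherical radius) and
%   a(t_0) = a_max < \infty   (regular maximum of the longitudinal scale).
```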

20.
“Usability” is a construct conceived by the human–computer interaction (HCI) community to denote a desired quality of interactive systems and products. Despite its prominence and intensive use in HCI research, the usefulness of the usability construct to HCI theories and to our understanding of HCI has been meager. In this article I propose and discuss two reasons for this state of affairs. The first is that usability is an umbrella construct. Umbrella constructs are prevalent in scientific fields that are broad, diverse, and lack a unifying research paradigm. Accordingly, umbrella constructs, such as usability, tend to be vague and loose, characteristics that challenge our ability to accumulate and communicate knowledge and to capture real-world phenomena. The second reason involves the nature of the relations between the usability construct and its measures, a topic rarely discussed in HCI research. There appears to be a mismatch between how the HCI community has (implicitly) conceptualized these relations and how it has empirically examined them. The relations have been conceptualized according to a formative measurement model but have mostly been tested according to a reflective measurement model. The trouble is that representing the usability construct by the reflective model appears inappropriate, and representing it by the formative model involves considerable difficulties. Possible ways of addressing these issues are discussed, each with its advantages and drawbacks. I conclude that for scientific research on this subject to progress, the usability construct ought to be unbundled and replaced by well-defined constructs. The issues discussed in this article are relevant to other HCI umbrella concepts and constructs such as user experience.
