20 similar documents found (search time: 0 ms)
1.
Particulate matter (PM) with an aerodynamic diameter of less than 2.5 μm (PM2.5) has become the primary air pollutant in most major cities in China. Some studies have indicated a positive correlation between aerosol optical thickness (AOT) and surface-level PM2.5 concentration. In order to estimate PM2.5 concentration over large areas, a model relating PM2.5 concentration to AOT was established. The aerosol scale height and relative humidity, as well as the effects of surface temperature and wind velocity, were introduced to enhance the model. Moderate Resolution Imaging Spectroradiometer (MODIS) AOT data for the full year 2013 and ground measurements of PM2.5 concentration in the Beijing–Tianjin–Hebei region were used to fit a seasonal multivariate linear equation relating PM2.5 concentration to AOT, and the accuracy of the model was assessed. Comparing MODIS-estimated PM2.5 with measurements from ground monitoring stations during spring, summer, autumn and winter yielded R2 values of 0.45, 0.45, 0.37, and 0.31, respectively. Based on this model, the spatial distribution of PM2.5 concentration during four typical haze events, sampled by season, was derived and displayed together with backward air trajectories calculated using the Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) model. We undertook a preliminary analysis of the sources of surface-level PM and the processes of its accumulation and dispersion during the haze episodes by analysing the effects of terrain and topography in the Beijing–Tianjin–Hebei region. The spatial distribution of PM2.5 concentration showed that the high-value region generally lay in the southeast of the study area, approximately overlapping an area of lower vegetation coverage, and the temporal variation of PM2.5 concentration indicated that air pollution was more severe during winter and spring than during summer and autumn.
The results of the analysis of backward air trajectories suggested that the hazy weather in the Beijing–Tianjin–Hebei region was mainly caused by unfavourable terrain and weather conditions.
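The seasonal fitting step described above can be sketched as an ordinary least-squares regression of PM2.5 on AOT plus meteorological covariates. The data, coefficient values, and variable set below are synthetic illustrations under assumed relationships, not the paper's actual values:

```python
import numpy as np

# Hypothetical illustration: fit one multivariate linear equation
# (as the abstract fits per season) relating PM2.5 to AOT, relative
# humidity, surface temperature and wind speed. Data are synthetic.
rng = np.random.default_rng(0)
n = 200
aot = rng.uniform(0.1, 1.5, n)      # aerosol optical thickness
rh = rng.uniform(20, 90, n)         # relative humidity (%)
temp = rng.uniform(-5, 35, n)       # surface temperature (deg C)
wind = rng.uniform(0, 10, n)        # wind speed (m/s)
pm25 = 80 * aot + 0.5 * rh - 0.3 * temp - 2.0 * wind + rng.normal(0, 5, n)

# Ordinary least squares: design matrix with an intercept column.
X = np.column_stack([np.ones(n), aot, rh, temp, wind])
coef, *_ = np.linalg.lstsq(X, pm25, rcond=None)

# Coefficient of determination (the R2 the abstract reports per season).
pred = X @ coef
r2 = 1 - np.sum((pm25 - pred) ** 2) / np.sum((pm25 - pm25.mean()) ** 2)
```

In the actual study, one such equation would be fitted per season and then applied pixel-wise to MODIS AOT grids to map PM2.5.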
2.
In a keiretsu, i.e., a set of companies with interlocking business relationships, it is important for corporate management to identify the companies over which they reciprocally exercise greater influence and power. In this article we employ the DEcision MAking Trial and Evaluation Laboratory (DEMATEL) method to illustrate the reciprocal influence of each company in Mazda's Yokokai keiretsu, as measured by the number of transactions and cross-shareholdings. Furthermore, we calculate the centrality index of each company and then analyze the relationship between centrality and influence in order to identify the determinants of influence. Based on the findings, we identify some characteristics of effective relationships which have important managerial implications.
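The core DEMATEL computation the article relies on can be sketched as follows; the 4x4 direct-influence matrix here is invented for illustration and is not taken from the Yokokai transaction data:

```python
import numpy as np

# Minimal DEMATEL sketch on a made-up 4-company direct-influence
# matrix. Row i, column j: direct influence company i exerts on j.
D = np.array([
    [0, 3, 2, 1],
    [1, 0, 3, 2],
    [2, 1, 0, 3],
    [1, 2, 1, 0],
], dtype=float)

# Normalize by the largest row sum so the power series converges.
X = D / D.sum(axis=1).max()

# Total-relation matrix T = X (I - X)^-1 accumulates all indirect
# influence paths (X + X^2 + X^3 + ...).
n = D.shape[0]
T = X @ np.linalg.inv(np.eye(n) - X)

r = T.sum(axis=1)   # total influence given by each company
c = T.sum(axis=0)   # total influence received by each company
prominence = r + c  # overall involvement in the network
relation = r - c    # net influence (positive = net cause/driver)
```

Companies with high prominence and positive relation would be the reciprocally influential ones the abstract seeks to identify.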
3.
Neural Computing and Applications - Seismic catalogs are vital to understanding and analyzing the progress of active fault systems. The background seismicity rate in a seismic catalog, strongly...
4.
Many students find it difficult to engage with mathematical concepts. As a relatively new class of learning tools, visualization tools may be able to promote higher levels of engagement with mathematical concepts. Often, the development of new tools outpaces empirical evaluation of their effectiveness, especially in educational contexts. This seems to be the case with educational visualization tools: much of the evidence about their effectiveness appears to be suggestive rather than based on empirical evaluation. In this paper, we attempt to fill this gap and provide empirical evidence for the use of visualization tools in supporting exploratory and other learning-related activities. In particular, we investigate whether visualization tools can be used to engage pre-university students in exploring non-trivial mathematical concepts. We focus on this age group and content domain because of the difficulty these students may encounter when trying to investigate more challenging mathematical concepts, and because it is during the formative years before university that students' predisposition and liking towards mathematical ideas are formed. We report a study assessing whether a visualization tool, whose design was informed explicitly by research from information visualization and human-computer interaction, could engage pre-university students in their exploration and learning of more advanced mathematical concepts. The students who participated in this study came from multiple grade levels and had diverse cognitive and language skills as well as varying attitudes towards mathematics. The results indicate that visualization tools can effectively engage these students and support their exploration of non-trivial mathematical concepts, provided the tool is designed to cater to the diverse needs of these students.
5.
Field geological observations have both spatial and non-spatial aspects and recording them directly on a personal computer using a digital mapping tool has become a practical and effective alternative to traditional methods of field data collection and mapping. This paper presents the design of a cost-effective, stand-alone digital field-mapping tool named GRDM that caters to special requirements of field-based studies concerned with spatial disposition of the statistics of field measurements. Such studies require recording multiple observations for individual attributes at each field location to capture the inter-site variability and automatic computation of their statistics. Field observations include directional data that are circular in nature. Therefore, computation of their exclusive statistics within the field system is also necessary. To meet these requirements, GRDM was designed for field personnel lacking expertise in customizing a GIS. Its design automatically accommodates a list of values for each non-spatial attribute attached to individual location points and generates statistics from the lists. The system treats the orientation values as a distinct numeric data type and computes circular statistics for them. It makes both the original data as well as their statistics simultaneously available for extraction of thematic information.
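The circular statistics GRDM computes for orientation data can be illustrated with a minimal sketch; the function name and sample angles below are hypothetical. Directional measurements cannot be averaged arithmetically (the mean of 350° and 10° should be 0°, not 180°), so angles are first converted to unit vectors:

```python
import math

# Circular mean direction and mean resultant length for a list of
# field orientation measurements given in degrees.
def circular_mean_deg(angles_deg):
    s = sum(math.sin(math.radians(a)) for a in angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg)
    mean = math.degrees(math.atan2(s, c)) % 360
    # Mean resultant length R in [0, 1]; values near 1 indicate
    # tightly clustered orientations.
    R = math.hypot(s, c) / len(angles_deg)
    return mean, R

# Four strike readings clustered around north (0/360 degrees).
mean_dir, R = circular_mean_deg([350, 10, 5, 355])
```

A field system like the one described would compute these per attribute list at each location point.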
6.
Cognition, Technology & Work - Examining different team dynamics and understanding collective activities in home environments are two important challenges for ergonomics and its related fields....
7.
We designed a vibrotactile vest and The Humming Wall, a vibroacoustic interactive furniture set in an urban environment, to interact with each other. We developed the vibrotactile patterns in the vest as a form of vibrotactile language to convey information to the wearer. In addition, we designed a set of interactive movements on The Humming Wall that would trigger patterns on the vest, eliciting sensations on the wearer's body and encouraging body movement. We invited people to interact in pairs at The Humming Wall, with one at the wall and one wearing the vest (they later swapped roles). Actions by the person at the wall, such as swiping or knocking on it, were repeated on the vest wearer's body. In addition, participants could 'feel' (vibroacoustically) and hear their own heartbeats and breathing rates at the wall. We conducted a field trial with 39 participants over a 5-week period. Participants wearing the vest (and their partners) completed a set of tasks. We logged use and responses, recorded all activities on video, and conducted post-experiment interviews and questionnaires. The results captured the participants' experience, communication and connection while wearing the vibrotactile vest and interacting with the wall. The findings show convincing, strong and positive responses to novel interactions between the responsive vibroacoustic environment and the vibrotactile vest. This work constitutes the first field trial with people 'working' in pairs with a vibrotactile wearable responding to and driving an interactive vibroacoustic environment.
10.
Human performance comparisons on interactive systems were drawn between output displays (CRT and LCD) across settings of control-display gain. Empirical evidence was sought in light of the common feeling in the user community that motor-sensory tasks are more difficult on a system equipped with an LCD display than with a CRT display. In a routine target acquisition task using a mouse, movement times were 34% longer and motor-sensory bandwidth was 25% less when the output display was an LCD rather than a CRT. No significant difference in error rates was found. Control-display (C-D) gain was tested as a possible confounding factor; however, no interaction effect was found. There was a significant, opposing main effect of C-D gain on movement time and error rates, illustrating the difficulty of optimizing C-D gain on the basis of movement time alone.
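A sketch of the standard Fitts'-law quantities behind the reported figures: with the Shannon formulation of the index of difficulty, a 34% longer movement time lowers throughput ("bandwidth") by a factor of 1/1.34, i.e. roughly the 25% loss the study reports. The target distances and times below are made up for illustration:

```python
import math

# Shannon-form index of difficulty for a target acquisition task.
def index_of_difficulty(distance, width):
    """ID in bits for a target of the given width at the given distance."""
    return math.log2(distance / width + 1)

# Throughput ("motor-sensory bandwidth") in bits per second.
def throughput(distance, width, movement_time_s):
    return index_of_difficulty(distance, width) / movement_time_s

id_bits = index_of_difficulty(240, 16)    # hypothetical: 240 px away, 16 px wide
tp_crt = throughput(240, 16, 0.9)         # hypothetical CRT movement time
tp_lcd = throughput(240, 16, 0.9 * 1.34)  # 34% longer MT on the LCD
```

Since ID is fixed by the task geometry, a 34% increase in movement time yields throughput reduced to 1/1.34 of the CRT value, about a 25% drop.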
11.
Philosophers and architects have been investigating how human presence is integrated into 'constructed realities' for many years. Architects now need to look to cinema for ways of connecting disparate spaces and relating them through movement, and in time. Framing interactive experience with the help of Stanislavskian categories and building story-blocks into games brings narrative, dramatic and emotional 'added value' to the process of immersion in cinematic fiction. By prototyping an easily adaptable model of architecture for audiovisual space, the experimental project House of Affects aims to create a matrix of construction for spaces for cinematic interactive experience.
12.
The digital transformation imposes new requirements on all classes of enterprise systems in companies. ERP systems in particular, which represent the dominant class of enterprise systems, are struggling to meet the new requirements at all levels of the architecture. There is therefore an urgent need to reconsider the overall architecture of these systems and address the root of the related issues. Given that many of the restrictions ERP systems impose on adaptability are related to the standardization of data, this paper addresses the database layer of ERP systems. Since databases serve as the foundation for data storage and retrieval, they limit the flexibility of enterprise systems and their ability to adapt to new requirements. So far, relational databases have been widely used. Using a systematic literature review, recent requirements for ERP systems were identified, and prominent database approaches were assessed against the 23 requirements found. The results reveal the strengths and weaknesses of recent database approaches and highlight the need to combine multiple database approaches to fulfill recent business requirements. From a conceptual point of view, this paper supports the idea of interoperable federated databases that can fulfill future requirements and support business operations. This research forms the basis for a renewal of the current generation of ERP systems and proposes that ERP vendors use different database concepts in the future.
13.
One of the most important characteristics of chance discovery is that it focuses on the specific events or patterns in which the essential nature of an applied domain is implicitly included. The understanding and forecasting of such patterns and events will have a significant impact on decision making in the applied domain. This paper discusses the meaning of chance discovery from the viewpoint of medicine. Since chance discovery in medicine can be viewed as a way to find a suitable occasion for some critical action or to check dangerous possibilities, called rare risky events, the detection and interpretation of rare but important events are among the components that support chance discovery. Based on this observation, several approaches for detecting rare events were introduced and evaluated on a small dataset of neurological diseases. Experimental results show that a set of events which includes rare risky events can be detected by the introduced detection method, though interpretation by domain experts is required for the selection of such events.
Shusaku Tsumoto, Ph.D.: He graduated from Osaka University School of Medicine in 1989. After a residency in neurology at Chiba University Hospital, he was involved in developing the hospital information system there. He moved to Tokyo Medical University in 1993 and started his research on rough sets and data mining in medicine. He received his Ph.D. in Computer Science from Tokyo Institute of Technology in 1997, and is now a Professor in the Department of Medical Informatics, Shimane Medical University. His interests include approximate reasoning, data mining, fuzzy sets, knowledge acquisition, mathematical theory of data mining, and rough sets (alphabetical order).
14.
Thermal engineering deals with the estimation of temperature at different spatial points and different instants for a given set of boundary and initial conditions. For this purpose, the reference model is a numerical simulation model, but it is time-consuming; consequently, we build a surrogate model to replace it. This surrogate model is a recursive multilayer perceptron, independent of the boundary conditions and parametrized by statistical learning on multidimensional temporal trajectories computed with the reference model. It emulates the outputs of the reference model over time from knowledge of the initial conditions and exogenous variables alone. Moreover, this model is able to predict these outputs in steady state, even though its formulation is time-dependent. A new methodology is proposed to overcome the learning problem associated with the very small number of trajectories available for the surrogate model's construction. The first step builds a more robust surrogate model by taking the average of the local models resulting from the V-fold cross-validation technique. This kind of multilayer perceptron is much more robust and accurate, in particular when the learning dataset is very small. The second step consists in the creation of a new learning dataset made up of every time observation from every trajectory; in this way, we artificially obtain a sizeable sample enabling all the classic neural network constructions. Furthermore, many approaches exist for selecting the best number of hidden neurons, but most of them are costly or require many observations. We consider here a non-asymptotic approach based on the minimization of a penalized criterion, which provides accurate results at economical computational cost. To calibrate the penalty term precisely, we use the slope heuristic or the dimension jump, recently introduced in a regression framework.
The validation of the method is performed on a toy function. The prediction ability of the surrogate model built with the new methodology is successfully compared to usual constructions on a simplified problem and then applied to thermal engineering.
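The "average of local models over V folds" idea can be sketched as follows. A linear least-squares model stands in for the paper's recursive multilayer perceptron, and the data are synthetic, so this shows only the cross-validation averaging structure, not the actual surrogate:

```python
import numpy as np

# Synthetic regression problem standing in for the thermal trajectories.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (60, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.05, 60)

# Train one local model per V-fold split, each on the other V-1 folds.
V = 5
folds = np.array_split(np.arange(60), V)
coefs = []
for v in range(V):
    train = np.concatenate([folds[i] for i in range(V) if i != v])
    A = np.column_stack([np.ones(len(train)), X[train]])
    w, *_ = np.linalg.lstsq(A, y[train], rcond=None)
    coefs.append(w)

# The surrogate is the average of the V local models.
w_avg = np.mean(coefs, axis=0)
pred = np.column_stack([np.ones(60), X]) @ w_avg
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```

Averaging the fold-wise models reduces the variance of any single fit, which is the robustness benefit the abstract claims for very small learning datasets.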
15.
Artificial intelligence applications in large-scale industry, such as fossil power plants, require the ability to manage uncertainty and time. In this paper, we present an intelligent system to assist an operator of a power plant. This system, called SEDRET, is based on a novel knowledge representation of uncertainty and time called Temporal Nodes Bayesian Networks (TNBN), a type of probabilistic temporal network. A TNBN is defined by a set of temporal nodes and a set of edges; each temporal node is defined by a value of a variable and a time interval associated with the change of that variable's value. A TNBN provides a formal and systematic structure for modeling the temporal evolution of a process under uncertainty, with an inference mechanism based on probabilistic reasoning. A TNBN can be used to recognize events and state variables with respect to current plant conditions and to predict the future propagation of disturbances. SEDRET was validated on the diagnosis and prediction of events in a steam generator using a power plant training simulator. The results obtained in this work indicate that SEDRET can potentially improve plant availability through early diagnosis and prediction of disturbances that could lead to plant shutdown.
16.
This study explores the range of experiences students have when making two kinds of decisions in relation to high school mathematics courses: which course to take, and how, and how much, to apply themselves. Looking at the choices of students about to enter Grade 10, the first decision is their choice of courses. In mathematics, students leaving Grade 9 selected one course (usually) from five possibilities for their Grade 10 year: an advanced-placement Honours stream, an academic course with a traditional symbol-manipulation approach, an academic course with a technology-based applications approach, a non-academic mathematics-for-citizenship course, and (as an imposed choice) repeating the Grade 9 mathematics course. The second point of decision-making occurred within their mathematics and science courses: students constantly made choices about how, and how much, to apply themselves to the challenges of succeeding in the courses they had chosen. These students' course choices conformed, to a considerable extent, to expectations based on the influence of socioeconomic status and prior achievement. Overwhelmingly, students were concerned more with the credentialing value of courses than with their educational value or structural nature. Within their courses, most students focused their attention on doing the work rather than on the content or the learning process. Students reported being encouraged by teachers to do their work, but could not point to any tactical support in becoming effective learners. The students' final marks suggest that, in the context of the study, Grade 10 mathematics courses are much more effective as gate-keeping mechanisms than as opportunities for students to improve and succeed.
17.
In the present study, a comprehensive assessment of the spatio-temporal variation of day-time and night-time land surface temperature (LST) and the normalized difference vegetation index (NDVI) of Vadodara district of Gujarat, India, from 2001 to 2012 was carried out using satellite data. A significant cooling trend was observed in the day-time LST, whereas the night-time LST showed a distinct warming trend. The entire geographical extent of Vadodara was classified into different night-time LST classes to quantify the extent of the hot pockets, and this showed a clear warming pattern for all months of the year, with an increase in the geographical area under the higher temperature range. Further analysis of the diurnal temperature range (DTR) also revealed a strong impact of the urbanization process, with the annual DTR decreasing at a rate of 0.29°C year⁻¹. An analysis of the vegetation cover of the district showed that, on average, the NDVI of the district increased during the study period. However, a micro-level examination of NDVI values showed that the type of vegetation cover had changed drastically. The maximum NDVI values for the months from May to December in 2012 were much lower than those of 2001 and 2006, indicating a change in the vegetation pattern of the district. An assessment of the area under different NDVI values showed that for all months of the year (except September), the total area with NDVI values in the higher range (i.e. +0.5 and above) had substantially decreased from 2001 to 2012. For some months, such as February, the share of the district exhibiting NDVI values above +0.5 fell from 45% in 2001 to only 18% by 2012, indicating a drastic change in vegetation type and a deterioration of the extent of thick dense vegetation.
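The two satellite-derived quantities the study tracks, NDVI and DTR, can be computed as in this minimal sketch; the band reflectances and temperatures below are invented stand-ins for the real imagery:

```python
import numpy as np

# Tiny synthetic red and near-infrared reflectance grids standing in
# for satellite bands over the district.
red = np.array([[0.10, 0.20], [0.30, 0.05]])
nir = np.array([[0.50, 0.40], [0.35, 0.45]])

# NDVI = (NIR - Red) / (NIR + Red), ranging over [-1, 1]; thick dense
# vegetation is typically associated with values above +0.5.
ndvi = (nir - red) / (nir + red)
dense_fraction = float(np.mean(ndvi > 0.5))  # area share above +0.5

# Diurnal temperature range: day-time minus night-time LST (deg C).
lst_day = np.array([35.0, 33.5])
lst_night = np.array([22.0, 24.0])
dtr = lst_day - lst_night
```

The study's area-share comparisons (e.g. 45% vs 18% of the district above +0.5) correspond to computing `dense_fraction` on the NDVI grid of each year.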
18.
A new method of ICA, TVICA, is proposed. Compared to conventional ICA, the TVICA method allows the mixing matrix to be time-dependent. Estimation is conducted under a local-homogeneity assumption: at any particular time point, there exists an interval over which the mixing matrix can be well approximated as constant. A sequential log likelihood-ratio testing procedure is used to automatically identify such local intervals. Numerical analysis demonstrates that TVICA performs well in homogeneous situations and improves accuracy in nonstationary settings with possible structural change. In real-data analysis applied to risk management, TVICA shows superior performance compared to several alternatives, including ICA, PCA and DCC-based models.
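The local-homogeneity testing idea can be sketched with a much simpler stand-in than the paper's procedure: a Gaussian log likelihood-ratio statistic comparing "one variance for the whole window" against "separate variances for its two halves" on a univariate series. A large statistic signals a structural change, so the homogeneous interval should not be extended further:

```python
import numpy as np

# Illustrative log likelihood-ratio statistic for a variance change
# at the midpoint of a window (not the paper's exact multivariate test).
def llr_variance_change(x):
    n = len(x)
    a, b = x[: n // 2], x[n // 2:]

    def nll(seg):
        # Gaussian negative log-likelihood at the MLE mean and variance.
        v = np.mean((seg - seg.mean()) ** 2)
        return 0.5 * len(seg) * (np.log(2 * np.pi * v) + 1)

    # Twice the log-likelihood gain from allowing two regimes.
    return 2 * (nll(x) - nll(a) - nll(b))

rng = np.random.default_rng(2)
homogeneous = rng.normal(0, 1, 400)
changed = np.concatenate([rng.normal(0, 1, 200), rng.normal(0, 3, 200)])
stat_homog = llr_variance_change(homogeneous)
stat_change = llr_variance_change(changed)
```

In a sequential scheme, the window grows until such a statistic exceeds a critical value, which yields the data-driven local interval over which the model (here, the mixing matrix in TVICA) is treated as constant.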
20.
As a new form of sustainable development, the concept of "Smart Cities" has expanded considerably in recent years. It represents an urban model referring to alternative approaches to metropolitan ICT use intended to enhance the quality and performance of urban services and enable better interaction between citizens and government. However, smart cities are based on a distributed and autonomous information infrastructure containing millions of information sources; more than 50 billion devices were expected to be connected through the IoT or similar technologies by 2020. In information technology, we often need to process and reason with information coming from various sources (sensors, experts, models). This information is almost always tainted with various kinds of imperfection, such as imprecision, uncertainty and ambiguity, so we need a theoretical framework general enough to allow for the representation, propagation and combination of all kinds of imperfect information; the theory of belief functions is one such framework. Real-time data generated from autonomous and distributed sources can contain all sorts of quality imperfections, e.g. imprecision, uncertainty, ignorance and/or incompleteness, and any imperfection in smart-city data can adversely affect the performance of urban services and decision making. In this context, we address the problem of imperfection in smart-city data. The expected outcomes of this paper are (1) to handle imperfection during the process of information retrieval and data integration, and (2) to create an evidential database using evidence theory in order to improve the efficiency of the smart city.
As experimentation, we present a special case of modeling imperfect data in the field of healthcare. An evidential database is built containing both perfect and imperfect data, coming from several heterogeneous sources in a smart-city context. The imperfect aspects of the evidential database are expressed using the theory of belief functions presented in this paper.
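The combination step at the heart of such an evidential database is Dempster's rule from the theory of belief functions; a minimal sketch follows, with invented mass assignments from two hypothetical healthcare sensors:

```python
from itertools import product

# Dempster's rule of combination. Each mass function maps focal
# elements (frozensets over the frame of discernment) to masses
# summing to 1.
def dempster_combine(m1, m2):
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to the empty set
    # Normalize by the non-conflicting mass (1 - K).
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Invented example: two sources reporting on a patient's state over
# the frame {"fever", "healthy"}; mass on the full frame = ignorance.
F = frozenset
m_sensor1 = {F({"fever"}): 0.6, F({"fever", "healthy"}): 0.4}
m_sensor2 = {F({"fever"}): 0.5, F({"healthy"}): 0.3,
             F({"fever", "healthy"}): 0.2}
m12 = dempster_combine(m_sensor1, m_sensor2)
```

Ignorance and imprecision are represented by mass on non-singleton sets, which is exactly the kind of imperfect information the evidential database is meant to store and combine.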