In an empirical user study, we assessed two approaches to ranking the results from a keyword search using semantic contextual match based on Latent Semantic Analysis. These techniques involved searches initiated from words found in a seed document within a corpus. The first approach used the sentence around the search query in the document as context while the second used the entire document. With a corpus of 20,000 documents and a small proportion of relevant documents (<0.1%), both techniques outperformed a conventional keyword search on a recall-based information retrieval (IR) task. These context-based techniques were associated with a reduction in the number of searches conducted, an increase in users’ precision and, to a lesser extent, an increase in recall. This improvement was strongest when the ranking was based on document, rather than sentence, context. Individuals were more effective on the IR task when the lists returned by the techniques were ranked better. User performance on the task also correlated with achievement on a generalized IQ test but not on a linguistic ability test.
To alleviate traffic congestion in urban networks, most current research has focused on signal optimization models and traffic assignment models, or has sought to capture the interaction between signal control and traffic assignment. However, these methods may not provide fast and accurate route guidance because they lack individual traffic demands, real-time traffic data and dynamic cooperation between vehicles. To address these problems, this paper proposes a dynamic and real-time route selection model for urban traffic networks (DR2SM), which supplies a more accurate and personalized strategy for vehicles in urban traffic networks. Combining the preference for alternative routes with real-time traffic conditions, each vehicle updates its route selection before passing through each intersection. Based on its historical experience and its estimate of the route choices of other vehicles, each vehicle uses a self-adaptive learning algorithm to play a congestion game with the others until a Nash equilibrium is reached. In the route selection process, each vehicle selects the user-optimal route, which maximizes the utility of each driving vehicle. Experiments on both synthetic and real-world road networks show that, compared with non-cooperative route selection algorithms and three state-of-the-art equilibrium algorithms, DR2SM effectively reduces the average travel time in dynamic and uncertain urban traffic networks.
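As an illustration of the congestion-game idea in this abstract, the following is a minimal sketch of best-response dynamics in a routing game, iterated until no vehicle can unilaterally improve (a Nash equilibrium). It is not the DR2SM algorithm itself; the route names, latency functions and parameters are illustrative assumptions.

```python
def latency(route, load):
    # Travel time grows with the number of vehicles on the route
    # (illustrative linear latency functions, not from the paper).
    base = {"arterial": 10.0, "bypass": 14.0}
    slope = {"arterial": 1.0, "bypass": 0.5}
    return base[route] + slope[route] * load

def best_response_equilibrium(n_vehicles, routes=("arterial", "bypass")):
    # All vehicles start on the first route, then repeatedly best-respond.
    choice = {v: routes[0] for v in range(n_vehicles)}
    changed = True
    while changed:  # stop once no vehicle can strictly improve
        changed = False
        for v in range(n_vehicles):
            loads = {r: sum(1 for c in choice.values() if c == r) for r in routes}
            current = choice[v]
            # Switching adds the vehicle itself to the candidate route's load.
            best = min(routes, key=lambda r: latency(
                r, loads[r] + (0 if r == current else 1)))
            if best != current:
                choice[v] = best
                changed = True
    return choice

equilibrium = best_response_equilibrium(30)
```

At the fixed point, every vehicle's current route is at least as cheap as any alternative given everyone else's choice, which is the Nash condition the paper's self-adaptive learning algorithm also targets.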
Research on assistive technology, rehabilitation, and prosthetics requires the understanding of human machine interaction, in which human muscular properties play a pivotal role. This paper studies a nonlinear agonistic‐antagonistic muscle system based on the Hill muscle model. To investigate the characteristics of the muscle model, the problem of estimating the state variables and activation signals of the dual muscle system is considered. In this work, parameter uncertainty and unknown inputs are taken into account for the estimation problem. Three observers are presented: a high gain observer, a sliding mode observer, and an adaptive sliding mode observer. Theoretical analysis shows the convergence of the three observers. Numerical simulations reveal that the three observers are comparable and provide reliable estimates.
Structural and Multidisciplinary Optimization - The maximum size constraint restricts the amount of material within a test region at each point of the design domain, leading to a highly constrained...
There is an ocean current in the actual underwater working environment. An improved self-organizing neural network task allocation model for multiple autonomous underwater vehicles (AUVs) is proposed for a three-dimensional underwater workspace with an ocean current. The AUVs in the model compete for each task, and the one with the shortest path under the ocean current and different azimuths is selected for task assignment and path planning while guaranteeing the least total consumption. First, the initial position and orientation of each AUV are determined, along with the velocity and azimuth of the constant ocean current. Then the AUV task assignment problem in the constant-ocean-current environment is considered, and the AUV with the shortest path is selected for task assignment and path planning. Finally, simulation results are given to demonstrate the effectiveness of the proposed method.
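The current-aware selection step described in this abstract can be sketched as follows. This is a simplified 2D illustration, not the paper's neural network model: it assumes straight-line transit, a constant current vector and a fixed still-water speed, with the AUV heading adjusted to cancel cross-track drift.

```python
import math

def transit_time(start, goal, speed, current):
    """Straight-line travel time for an AUV of still-water `speed`
    under a constant `current` vector (heading cancels drift)."""
    dx, dy = goal[0] - start[0], goal[1] - start[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return 0.0
    ux, uy = dx / dist, dy / dist                 # unit vector toward goal
    c_along = current[0] * ux + current[1] * uy   # current along the track
    c_cross = -current[0] * uy + current[1] * ux  # current across the track
    if speed <= abs(c_cross):
        return math.inf                           # cannot hold the track
    ground = math.sqrt(speed**2 - c_cross**2) + c_along
    return dist / ground if ground > 0 else math.inf

def assign(auv_positions, task, speed, current):
    # Select the AUV with the shortest current-aware transit time.
    return min(auv_positions, key=lambda p: transit_time(p, task, speed, current))
```

A following current shortens the transit (higher ground speed) while an opposing or strong cross current lengthens or forbids it, which is why the azimuth relative to the current matters in the selection.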
The evolution of the entrance channel of the Snowy River estuary in response to river regulation and climate change is predicted. The predictions are made in terms of the physical attractors that define possible long‐term states of the estuary entrance condition. The classification of these attractors shows the dependence of the entrance stability on the catchment inflows and the present entrance depth. The Snowy River estuary in south‐eastern Australia is a barrier estuary with an unstable entrance that tends to closure. The classification from the attractor map shows that the estuary entrance has changed from predominantly stable to a predominantly unstable state attributable to diversion of water from the upper catchment. The introduction of a series of environmental flow regimes, commencing in 2002, has returned 8% rising to 21% of the mean annual natural flow, but this study shows that the releases provide limited improvement in entrance stability. Additionally, the predicted effects of climate change for this region include increased mean sea level (MSL), decreased annual rainfall, and increased incidence of storms. These changes will decrease stability, primarily through the rise in MSL. The rise in sea level will increase the plan area of the tidal basin, increasing the tidal prism, and hence drawing in more marine sand. The application to the Snowy River estuary provides a proof of concept of the attractor classification to support estuary management.
This study investigated the growth, mortality and recruitment of Lates niloticus in Lake Victoria based on length–frequency data collected during 2014–2015. The asymptotic length (L∞) was 124 cm TL, the growth curvature (K) 0.22 year−1, total mortality (Z) 0.96 year−1, natural mortality (M) 0.42 year−1, fishing mortality (F) 0.54 year−1, the exploitation rate (E) 0.57 and the growth performance index (φ′) 3.53. A logistic selection model showed that 50% of fish of 46.09 cm TL encountering the gear are retained. There were two peak recruitment periods, a minor one in March and a major one in July, accounting for 12.04% and 22.04%, respectively, of the total fish catch. The Beverton and Holt relative yield-per-recruit model gave sustainable-yield indices of 0.32 for optimum sustainable yield (E0.5), 0.60 for maximum sustainable yield (Emax) and 0.51 for economic yield (E0.1). Compared with previous findings, there is a great decline in the sizes of Nile perch stocks in Lake Victoria. Thus, managing the fishery requires strict adherence to the slot size of 50–85 cm TL, and restrictions on illegal gear and methods, enforced by the devolved governments through monitoring, control and surveillance in liaison with the Beach Management Units (BMUs).
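The stock-assessment quantities reported in this abstract follow standard relationships, sketched below with the reported estimates (L∞ = 124 cm TL, K = 0.22 year−1, Z = 0.96 year−1, F = 0.54 year−1). The t0 parameter of the growth curve is not reported and is assumed to be 0 here.

```python
import math

L_INF, K, T0 = 124.0, 0.22, 0.0  # t0 = 0 is an assumption, not from the study

def vbgf_length(age):
    # von Bertalanffy growth function: L(t) = L_inf * (1 - exp(-K * (t - t0)))
    return L_INF * (1.0 - math.exp(-K * (age - T0)))

def exploitation_rate(F, Z):
    # E = F / Z; values above ~0.5 are commonly read as overexploitation
    return F / Z

def growth_performance_index(K, L_inf):
    # phi' = log10(K) + 2 * log10(L_inf)
    return math.log10(K) + 2.0 * math.log10(L_inf)
```

With the reported K and L∞ the index evaluates to about 3.53, matching the abstract, and F/Z = 0.54/0.96 ≈ 0.56, consistent with the reported exploitation rate of 0.57 (which was presumably computed from unrounded estimates).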
Verification of business processes typically relies on Petri net–based process models. While these allow natural modeling and analysis of aspects such as parallelism and message exchange, such a process model is seldom complete and precise, mainly because the available techniques for deriving a Petri net model from the original model neglect process data in favor of feasible verification. In this paper, we present an approach for deriving more precise process models by leveraging a process-to-Petri-net compiler, which takes a business process as input and generates a Petri net model for the process as output; this model can subsequently be used for verification. In contrast to a conventional compiler, however, our compiler's objective is not to create the most efficient code but rather to produce the most precise Petri net–based process model that is still effectively verifiable.
Burnt area is a critical parameter for estimating emissions of greenhouse gases associated with biomass burning. Several burnt area products (BAPs) derived from Earth Observation satellites/sensors have been released; these are based on different spatial resolutions and derived using different methodologies, so accuracies can vary amongst them. This study validates a global (MODIS) and a national (AVHRR) BAP across Australian southern forests using two reference datasets: state fire histories (SFHs) from 2000 to 2013 and a forest cover map derived through high-resolution air photo interpretation (API). The spatial and temporal agreement between fires in the BAPs and reference SFHs were analysed based on 2610 sample points representative of Australian southern forest types (successful detection was evaluated according to fire type: planned burn vs. wildfire, size of fire, and land tenure). Results show that both BAPs were most successful when identifying large wildfires (>5000 ha). Overall accuracy for AVHRR and MODIS was 73.9% and 62.5%, respectively. When compared to the API-derived forest cover map as reference dataset, both products achieved higher overall accuracies (94.1% for AVHRR and 87.1% for MODIS); an expected result given that the fires detected in this dataset are known to be observable using Earth observation data. But regardless of reference dataset, the AVHRR BAP, which is tailored to Australian conditions, achieved better results than the MODIS global BAP. Also, the AVHRR archive in Australia goes back to 1988, which is an important consideration for calculating wildfire history for greenhouse gas accounting.
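The overall accuracy figures quoted in this abstract are the standard confusion-matrix statistic: the proportion of sample points where the product and the reference agree. A minimal sketch, with illustrative counts rather than the study's actual confusion matrix:

```python
def overall_accuracy(confusion):
    """Overall accuracy from a confusion matrix given as
    {(reference_class, mapped_class): count}: agreeing cells over all cells."""
    total = sum(confusion.values())
    correct = sum(n for (ref, mapped), n in confusion.items() if ref == mapped)
    return correct / total if total else float("nan")

# Hypothetical counts for a two-class burnt/unburnt comparison.
example = {
    ("burnt", "burnt"): 50,
    ("burnt", "unburnt"): 10,   # omission error (missed fire)
    ("unburnt", "burnt"): 5,    # commission error (false detection)
    ("unburnt", "unburnt"): 35,
}
acc = overall_accuracy(example)  # 85/100 = 0.85
```

Because overall accuracy pools all classes, it can mask the omission/commission asymmetry between planned burns and wildfires that the study examined by fire type.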
The zebrafish embryo is a vertebrate well suited for visualizing nanoparticles at high resolution in live animals. Its optical transparency and genetic versatility allow noninvasive, real-time observation of the vascular flow of nanoparticles and their interactions with cells throughout the body. As a consequence, this system enables the acquisition of quantitative data that are difficult to obtain in rodents. Until now, only a few studies using the zebrafish model have been reported, and these have described only semiquantitative results on key nanoparticle parameters. Here, a macro dedicated to automated quantitative methods is described for analyzing important parameters of nanoparticle behavior, such as circulation time and interactions with key target cells, macrophages and endothelial cells. Direct comparison of four nanoparticle (NP) formulations in zebrafish embryos and mice reveals that data obtained in zebrafish can be used to predict NP behavior in the mouse model. NPs with long or short blood circulation in rodents behave similarly in the zebrafish embryo, with low circulation times being a consequence of NP uptake into macrophages or endothelial cells. It is proposed that the zebrafish embryo has the potential to become an important intermediate screening system for nanoparticle research, bridging the gap between cell culture studies and preclinical rodent models such as the mouse.