20 similar documents found (search time: 15 ms)
1.
2.
The same scene can be depicted by multiple visual media. For example, the same event can be captured by a comic image or a movie frame; the same object can be represented by a photograph or by a 3D computer graphics model. In order to extract the visual analogies that are at the heart of cross-media analysis, spatial matching is required. This matching is commonly achieved by extracting key points and scoring multiple, randomly generated mapping hypotheses. The more consensus a hypothesis can draw, the higher its score. In this paper, we go beyond the conventional set-size measure of match quality and present a more general hypothesis score that attempts to reflect how likely each hypothesized transformation is to be the correct one for the matching task at hand. This is achieved by considering additional, contextual cues for the relevance of a hypothesized transformation. This context changes from one matching task to another and reflects different properties of the match beyond the size of a consensus set. We demonstrate that by learning how to correctly score each hypothesis based on these features, we are able to deal much more robustly with the challenges of cross-media analysis, producing correct matches where conventional methods fail.
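The scoring idea lends itself to a short sketch. The snippet below is a minimal illustration, not the authors' method: `consensus_size` is the conventional RANSAC-style inlier count, and `score_hypothesis` combines it with contextual features (hypothetical, e.g., plausibility of the transformation's scale or rotation) under weights that would be learned offline from labeled matching tasks.

```python
import numpy as np

def consensus_size(H, src_pts, dst_pts, tol=3.0):
    """Count correspondences mapped by homography H to within tol pixels."""
    ones = np.ones((len(src_pts), 1))
    proj = np.hstack([src_pts, ones]) @ H.T
    proj = proj[:, :2] / proj[:, 2:3]            # perspective divide
    errs = np.linalg.norm(proj - dst_pts, axis=1)
    return int((errs < tol).sum())

def score_hypothesis(H, src_pts, dst_pts, context_feats, weights):
    """Generalized score: inlier count plus weighted contextual cues."""
    feats = np.array([consensus_size(H, src_pts, dst_pts), *context_feats])
    return float(weights @ feats)
```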
3.
4.
Computers in Human Behavior, 2000, 16(3): 271-286
Establishing a Fifth Dimension activity system in a community institution and an associated university course may be enough to get activity going, but it may not be sufficient to keep it going. We provide two examples of systems that existed for some time and then failed, despite evidence of their effectiveness. One was in California, the other in Michigan.
5.
6.
7.
This study reports the impact of high sensitivity to early exchange in 11th-grade CSCL triads solving well- and ill-structured problems in Newtonian kinematics. A mixed-method analysis of the evolution of participation inequity (PI) in group discussions suggested that participation levels tended to get locked in relatively early in the discussion. Similarly, high (low) quality member contributions made earlier in a discussion did more good (harm) than those made later on. Both PI and the differential impact of member contributions suggest a high sensitivity to early exchange; both significantly predicted eventual group performance, as measured by solution quality. Consequently, eventual group performance could be predicted from what happened in the first 30–40% of a discussion. In addition to theoretical and methodological implications, implications for scaffolding CSCL groups are discussed.
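As one concrete operationalization (ours, not necessarily the authors'), participation inequity can be read as the coefficient of variation of members' talk-turn shares, computed over only the early fraction of a discussion:

```python
import numpy as np

def participation_inequity(turn_counts):
    """Coefficient of variation of members' turn shares (0 = perfectly even)."""
    shares = np.asarray(turn_counts, dtype=float)
    shares /= shares.sum()
    return shares.std() / shares.mean()

def early_pi(turn_sequence, n_members, frac=0.4):
    """PI computed over only the first `frac` of a discussion's turns."""
    cutoff = int(len(turn_sequence) * frac)
    counts = np.bincount(turn_sequence[:cutoff], minlength=n_members)
    return participation_inequity(counts)

# Member ids of successive talk turns in a hypothetical triad discussion.
turns = np.array([0, 0, 1, 0, 0, 2, 0, 1, 0, 0])
print(early_pi(turns, n_members=3))   # high value: member 0 dominates early
```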
8.
This is an introduction to a special issue on computer-supported collaborative learning.
9.
Reward only is not enough: Evaluating and improving the fairness policy of the P2P file sharing network eMule/eDonkey
Yunzhao Li, Don Gruenbacher, Caterina Scoglio. Peer-to-Peer Networking and Applications, 2012, 5(1): 40-57
Limiting the threat of free-riding behavior is an important design issue for peer-to-peer (P2P) file sharing networks. However, the fairness policy that rewards contributors with credit in one of the most popular P2P file sharing networks, eMule/eDonkey, has not been thoroughly studied. In this paper, motivated by our experiments with the eMule/eDonkey network, we first analyze the credit-based content exchange process in eMule/eDonkey theoretically and then verify the mathematical model with an agent-based simulation. Both the numerical and simulation-based results confirm our experimental finding that eMule/eDonkey's local credit strategy cannot provide enough fairness, as it does not explicitly punish free-riders. To overcome this drawback, we propose a new free-riding control scheme that retains the current local credit structure while taking advantage of the credit policy. Extensive numerical evaluation and simulation indicate that this scheme significantly improves system fairness.
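For context, the credit rule the paper examines is, as commonly documented for the eMule client, a per-client modifier that scales a peer's waiting score in the upload queue. The sketch below reproduces that documented rule; treat the exact constants as our reading of the client documentation rather than the paper's model.

```python
import math

def emule_credit_modifier(uploaded_mb, downloaded_mb):
    """Credit modifier a client grants a peer, per documented eMule rules.

    The modifier scales the peer's waiting score in this client's
    upload queue; it is tracked locally per <client, peer> pair.
    """
    if uploaded_mb < 1.0:                    # under 1 MB uploaded: no bonus
        return 1.0
    ratio1 = 10.0 if downloaded_mb == 0 else 2.0 * uploaded_mb / downloaded_mb
    ratio2 = math.sqrt(uploaded_mb + 2.0)
    return max(1.0, min(ratio1, ratio2, 10.0))
```

Because each client keeps its own per-peer upload/download tallies, the modifier is purely local, which is exactly why, as the abstract argues, free-riders are never explicitly punished network-wide.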
10.
The generic term 'non-traditional machining' (NTM) refers to a variety of thermal, chemical, electrical and mechanical material removal processes that have been developed because traditional machining processes cannot efficiently generate complex and intricate shapes in materials with high strength-to-weight ratios. For effective utilization of the capabilities of different NTM processes, utmost care is needed in selecting the most suitable process for a given machining application. Due to the lack of experienced experts in the domain of NTM processes, there is a need for a simple scientific/mathematical tool for selecting the most suitable NTM process when a particular shape feature is to be generated on a given work material. This paper focuses on the development of a two-phase decision model for this purpose. In the first phase, the most efficient NTM processes for a given shape feature and work material combination, i.e., those with the best combination of performance parameters, are selected using the input-minimizing Charnes, Cooper and Rhodes (CCR) model of data envelopment analysis (DEA). In the second phase, those efficient NTM processes are ranked in descending order of priority using the weighted overall efficiency ranking method of multi-attribute decision-making (MADM) theory. Two real machining applications are cited that demonstrate the applicability, versatility and adaptability of this two-phase NTM process selection model, as its results are quite consistent with those derived by past researchers.
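The first-phase workhorse is the standard input-oriented CCR envelopment linear program. A minimal sketch with SciPy (ours, not the authors' exact formulation), where columns of `X` and `Y` hold each candidate process's performance parameters:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k (theta in (0, 1]).

    X: (m, n) input matrix, Y: (s, n) output matrix; columns are DMUs
    (here, candidate NTM processes). theta == 1 means CCR-efficient.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimize theta
    A_in = np.hstack([-X[:, [k]], X])           # sum_j lam_j x_ij <= theta x_ik
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # sum_j lam_j y_rj >= y_rk
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, k]],
                  bounds=[(None, None)] + [(0.0, None)] * n)
    return float(res.x[0])
```

Running `ccr_efficiency(X, Y, k)` for every `k` identifies the efficient processes (theta exactly 1.0) that the second phase then ranks.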
11.
This paper reports an exploratory analysis of the relation between Internet addiction and patterns of use among Portuguese adolescents (n = 2617) from the WHO 2010 Health Behavior in School-aged Children study, using a short version of Young's Internet Addiction Test (the brief Internet Addiction Questionnaire – bIAQ) and self-reports on online behaviors and access. Two-step cluster analysis identified two clusters of users based on their usage pattern: a minority of high-frequency users, with higher bIAQ scores, and a majority of low-frequency users, with lower bIAQ scores. Low- and high-frequency users are particularly distinct in specific activities, which converges with previous research showing addiction to specific Internet activities rather than to the Internet as a whole.
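The original analysis used SPSS-style two-step clustering; as a rough stand-in, a two-cluster k-means on hypothetical usage-frequency features reproduces the high/low split the abstract describes:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical weekly-hours features: gaming, social media, browsing.
rng = np.random.default_rng(0)
usage = np.vstack([rng.normal(5, 2, (400, 3)),    # low-frequency majority
                   rng.normal(25, 5, (60, 3))])   # high-frequency minority

X = StandardScaler().fit_transform(usage)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
# Next step: compare mean bIAQ scores across the two clusters.
```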
12.
Cormode G., Datar M., Indyk P., Muthukrishnan S. IEEE Transactions on Knowledge and Data Engineering, 2003, 15(3): 529-540
Massive data streams are now fundamental to many data processing applications. For example, Internet routers produce large-scale diagnostic data streams. Such streams are rarely stored in traditional databases and instead must be processed "on the fly" as they are produced. Similarly, sensor networks produce multiple data streams of observations from their sensors. There is growing focus on manipulating data streams; hence, there is a need to identify basic operations of interest in managing data streams and to support them efficiently. We propose computation of the Hamming norm as a basic operation of interest. The Hamming norm formalizes ideas that are used throughout data processing. When applied to a single stream, the Hamming norm gives the number of distinct items present in that data stream, a statistic of great interest in databases. When applied to a pair of streams, the Hamming norm gives an important measure of (dis)similarity: the number of unequal item counts in the two streams. Hamming norms have many uses in comparing data streams. We present a novel approximation technique for estimating the Hamming norm for massive data streams; it relies on what we call the "l_0 sketch", and we prove its accuracy. We test our approximation method on a large quantity of synthetic and real stream data, and show that the estimation is accurate to within a few percentage points.
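To make the quantity being approximated concrete, here is an exact full-memory computation of both uses of the Hamming norm; the paper's contribution, the small-space l_0 sketch, approximates exactly these values without storing per-item counters:

```python
from collections import Counter

def hamming_norm(stream):
    """Number of distinct items with a nonzero net count."""
    counts = Counter()
    for item, delta in stream:          # stream of (item, +1/-1) updates
        counts[item] += delta
    return sum(1 for c in counts.values() if c != 0)

def hamming_distance(stream_a, stream_b):
    """Number of items whose net counts differ between two streams."""
    negated_b = [(item, -delta) for item, delta in stream_b]
    return hamming_norm(list(stream_a) + negated_b)

print(hamming_norm([("a", 1), ("b", 1), ("a", 1)]))        # 2 distinct items
print(hamming_distance([("a", 1), ("b", 1)], [("a", 1)]))  # 1 unequal count
```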
13.
14.
15.
Ibrahim F., Taib M.N., Abas W.A., Guan C.C., Sulaiman S. Computer Methods and Programs in Biomedicine, 2005, 79(3): 273-281
Dengue fever (DF) is an acute febrile viral disease frequently presenting with headache, bone, joint and muscular pains, and rash. A significant percentage of DF patients develop a more severe form of the disease, known as dengue haemorrhagic fever (DHF), a complication of DF. The main pathophysiology of DHF is the development of plasma leakage from the capillaries, resulting in haemoconcentration, ascites, and pleural effusion that may lead to shock following defervescence of fever. Therefore, accurate prediction of the day of defervescence of fever is critical for clinicians deciding on a patient management strategy. To date, no known literature describes any attempt to predict the day of defervescence of fever in DF patients. This paper describes a non-invasive system for predicting the day of defervescence of fever in dengue patients using an artificial neural network. The developed system bases its prediction solely on clinical symptoms and signs and uses a multilayer feed-forward neural network (MFNN). The results show that the proposed system is able to predict the day of defervescence in dengue patients with 90% prediction accuracy.
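As a rough stand-in for the paper's MFNN (the exact architecture and feature set are not given in the abstract), a scikit-learn feed-forward classifier over hypothetical symptom/sign indicators looks like this:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hypothetical binary symptom/sign indicators per patient-day
# (headache, rash, joint pain, ...); placeholder labels stand in for
# "defervescence tomorrow: yes/no".
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(252, 12)).astype(float)
y = rng.integers(0, 2, size=252)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=1)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```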
16.
Collaboration analysis methods should be adapted to the needs of their users in order to be more effective. This paper presents DESPRO, a method that guides learning designers and teachers through the steps needed to provide support adapted to the participants in CSCL situations, based on a semi-automatic process of role detection. DESPRO is supported by a structured characterization of participatory roles defined by means of social network analysis metrics, and by a framework that guides the definition of the roles to be identified and supported. This paper describes the method and its application in a case study to identify and support the roles played by the teacher and the students in a university course. The case study shows how the method defines a flexible approach to meet the desired goal of providing support adapted to the needs of different users (i.e., roles) in CSCL settings.
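The abstract does not name DESPRO's metric set, but role characterizations of this kind are typically built from centrality measures over the interaction graph. A minimal illustration with networkx (the graph and the role reading are hypothetical):

```python
import networkx as nx

# Hypothetical interaction graph: an edge (u, v) means u replied to v
# in the course forum; weights count replies.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("teacher", "s1", 5), ("s1", "teacher", 3),
    ("s2", "s1", 4), ("s3", "s1", 2), ("s2", "s3", 1),
])

metrics = {
    "in_degree": nx.in_degree_centrality(G),
    "out_degree": nx.out_degree_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
}
# A member with high in-degree and high betweenness might be read as a
# facilitator; uniformly low out-degree might flag a lurker.
print(metrics["betweenness"])
```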
17.
18.
Liren Zhang, Abderrahmane Lakas, Hesham El-Sayed, Ezedin Barka. Journal of Network and Computer Applications, 2013, 36(3): 1050-1056
This paper focuses on vehicle mobility analysis in VANETs. Vehicle mobility, in terms of the average inter-vehicle link available time and the average number of inter-vehicle link changes needed to maintain an active link, is analyzed using a handover model and a random moving model, respectively. The theoretical analysis is verified by simulation experiments. The numerical results indicate that the analytical random moving model is able to represent vehicle movement behavior appropriately under different conditions, especially when vehicles are moving relatively fast. The effect of traffic conditions on the accuracy of the theoretical analysis is also investigated.
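A toy Monte Carlo conveys what "link available time" measures; the road model here (straight road, constant speeds, fixed radio range) is our simplification, not the paper's:

```python
import numpy as np

def link_time_stats(n=100_000, R=250.0, v_lo=20.0, v_hi=35.0, seed=0):
    """Median time two same-direction vehicles stay within radio range R (m).

    Speeds are drawn uniformly from [v_lo, v_hi] m/s; the faster vehicle
    enters range at distance R behind and leaves at R ahead (a 2R window).
    The median is reported because the mean diverges as the relative
    speed approaches zero.
    """
    rng = np.random.default_rng(seed)
    v1, v2 = rng.uniform(v_lo, v_hi, (2, n))
    rel = np.abs(v1 - v2)                 # relative speed (m/s)
    rel = rel[rel > 1e-6]                 # drop near-equal speed pairs
    return float(np.median(2 * R / rel))

print(link_time_stats())                  # seconds of link availability
```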
19.
Linux malware can pose a significant threat because little is known or understood about Linux OS vulnerabilities, while Linux's penetration is increasing exponentially. We believe that now is the right time to devise non-signature-based zero-day (previously unknown) malware detection strategies, before Linux intruders take us by surprise. Therefore, in this paper, we first perform a forensic analysis of Linux executable and linkable format (ELF) files. Our forensic analysis provides insight into different features that have the potential to discriminate malicious executables from benign ones. As a result, we select a set of 383 features extracted from ELF headers. We quantify the classification potential of the features using information gain and then remove redundant features by employing preprocessing filters. Finally, we perform an extensive evaluation among classical rule-based machine learning classifiers (RIPPER, PART, C4.5 Rules, and the decision tree J48) and bio-inspired classifiers (cAnt Miner, UCS, XCS, and GAssist) to select the best classifier for our system. We evaluate our approach on an available collection of 709 Linux malware samples from VX Heavens and Offensive Computing. Our experiments show that ELF-Miner provides more than 99% detection accuracy with less than a 0.1% false alarm rate.
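The feature-scoring step, information gain, is simple to state. This sketch computes it for a discrete feature against malicious/benign labels; the header feature and labels are a toy, hypothetical example:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """Reduction in class entropy from splitting on one discrete feature."""
    n = len(labels)
    conditional = 0.0
    for v in set(feature_values):
        subset = [y for x, y in zip(feature_values, labels) if x == v]
        conditional += len(subset) / n * entropy(subset)
    return entropy(labels) - conditional

# Toy example: a hypothetical ELF-header flag vs. malicious/benign labels.
feat   = [0, 0, 1, 1, 1, 0, 1, 0]
labels = ["mal", "mal", "ben", "ben", "ben", "mal", "ben", "ben"]
print(information_gain(feat, labels))
```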
20.
The classic Data Envelopment Analysis (DEA) models were developed under the assumption that all inputs and outputs are non-negative, whereas negative data may arise in the actual business world. Hence, adapting DEA models so that they are applicable to cases involving inputs and outputs that can take both negative and non-negative values has been an open issue. It can be readily demonstrated that the assumption of constant returns to scale (CRS) is not tenable in technologies with negative data. One of the interesting and challenging questions, then, is how to determine the state of returns to scale (RTS) in the presence of negative data under variable returns to scale (VRS) technology. Accordingly, in this contribution, we first address the efficiency measure and then suggest a method to discover the state of RTS in the presence of negative input and output values, which has so far received little attention in the DEA literature. Finally, the main results are elaborated through some illustrative examples.
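For reference, the VRS technology set over which the RTS question is posed is the standard one below; the convexity constraint, requiring the intensity weights to sum to one, rules out the proportional rescaling that, as the abstract notes, is not tenable when some data are negative:

```latex
T_{\mathrm{VRS}} = \Bigl\{ (x, y) \;\Bigm|\;
  x \ge \textstyle\sum_{j=1}^{n} \lambda_j x_j,\quad
  y \le \textstyle\sum_{j=1}^{n} \lambda_j y_j,\quad
  \textstyle\sum_{j=1}^{n} \lambda_j = 1,\quad
  \lambda_j \ge 0 \Bigr\}
```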