Similar Documents
Found 20 similar documents (search time: 0 ms)
1.
2.
Search engine optimization strategies based on the PageRank algorithm   Total citations: 5 (self-citations: 0, others: 5)
Zhang Wei, Li Zhishu. Computer Applications (《计算机应用》), 2005, 25(7): 1711-1712, 1718
Building on an introduction to PageRank, the result-ranking algorithm most commonly used by Google and other search engines, this paper elaborates on how various page-link structures may affect a site's ranking under PageRank-based search engines, analyzes the optimization strategies that sites apply in practice against the PageRank algorithm, and discusses the advantages of each.
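The PageRank iteration the abstract refers to can be sketched as follows. This is an illustrative sketch, not the paper's implementation; the toy four-page link graph and the damping factor of 0.85 are assumptions.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # every page starts with the teleportation share
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

links = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(links)
# "C" is linked to by three pages, so it ends up ranked highest.
```

Link-structure optimizations of the kind the paper analyzes amount to reshaping `links` so that rank flows toward the pages a site wants promoted.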

3.
Users of the internet often wish to follow certain news events, and the interests of these users often overlap. General search engines (GSEs) cannot be used for this task due to incomplete coverage and lack of freshness. Instead, a broker is used to regularly query the built-in search engines (BSEs) of news and social media sites. Each user defines an event profile consisting of a set of query rules called event rules (ERs). To ensure that queries match the semantics of BSEs, ERs are transformed into disjunctive normal form and separated into conjunctive clauses (atomic event rules, AERs). Processing all AERs on BSEs is slow and can violate query submission rate limits. Accordingly, the set of AERs is reduced to eliminate AERs that are duplicates of, or logically contained by, other AERs. Five types of event are selected for experimental comparison and analysis: natural disasters, accident disasters, public health events, social security events, and negative events involving public servants. Using 12 BSEs, 85 ERs for the five event types are defined by five users. The experimental comparison covers three aspects: the event-rule reduction ratio, the number of collected events, and the number of related events. Experimental results show that event-rule reduction effectively improves crawling efficiency.
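The reduction step described above can be sketched as a subset test over conjunctive rules. This is an illustrative sketch, not the paper's implementation: each AER is modeled as a set of required keywords, and an AER is redundant if it duplicates another or if a more general AER (a subset of its keywords) already covers everything it would return.

```python
def reduce_aers(aers):
    """aers: iterable of keyword sets; returns a minimal covering set."""
    unique = {frozenset(a) for a in aers}      # drop exact duplicates
    kept = []
    for rule in sorted(unique, key=len):       # consider general rules first
        # a proper subset matches a superset of this rule's results,
        # so the rule is logically contained and can be dropped
        if not any(k < rule for k in kept):
            kept.append(rule)
    return kept

aers = [{"earthquake"}, {"earthquake", "rescue"},
        {"flood", "warning"}, {"earthquake"}]
reduced = reduce_aers(aers)
# {"earthquake", "rescue"} is contained by the broader {"earthquake"},
# and the duplicate is dropped, leaving two rules to submit to each BSE.
```

Fewer submitted AERs means fewer queries per BSE, which is what keeps the broker within the submission rate limits mentioned above.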

4.
In this paper, we present an auxiliary algorithm that, in terms of the speed of obtaining the optimal solution, is effective in helping the simplex method commence from a better initial basic feasible solution. The idea of choosing a direction towards an optimal point presented in this paper is new and easily implemented. In our experiments, the algorithm reaches a corner point of the feasible region within a few iterations, independent of the starting point. The computational results show that when the auxiliary algorithm is adopted as the phase I process, the simplex method consistently requires about 40% fewer iterations.

Scope and purpose: Recent progress in implementations of simplex and interior point methods, together with advances in computer hardware, has extended the capability of linear programming with today's computing technology. It is well known that solution times for the interior point method improve with problem size. However, experimental evidence suggests that interior point methods dominate simplex-based methods only on very large scale linear programs. For medium-sized problems, how to combine the best features of the two methods into an effective algorithm for solving linear programs remains an interesting question. In this research we present a new, effective ε-optimality search direction, based on the interior point method, that starts the simplex method from an initial basic feasible solution near the optimal point.

5.
When a metasearch engine integrates results from multiple member search engines, the ranking of those results largely determines the metasearch engine's quality of service. To integrate search results effectively, current techniques mainly combine factors such as the query, document content, initial rankings, and/or weights assigned to the member search engines. When engine weights are used, they are typically assigned subjectively, based on the experience of users and engineers, and therefore do not reflect real user search preferences. This paper proposes a method that dynamically assigns a weight to each member search engine by mining users' search and browsing behavior. From users' browsing, clicks, and downloads, the degree of match between user browsing behavior and the returned results is obtained, and the feasibility and effectiveness of the method are demonstrated.
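One simple way to derive such behavior-based weights is sketched below. This is an assumption-laden illustration, not the paper's method: each member engine is weighted by the fraction of its returned results that users actually clicked or downloaded, and the weights are normalized for use in merged ranking.

```python
def engine_weights(stats):
    """stats: dict engine -> (clicks_or_downloads, results_returned)."""
    # raw score: observed click-through ratio per member engine
    raw = {e: c / max(n, 1) for e, (c, n) in stats.items()}
    total = sum(raw.values())
    # normalize so the weights sum to 1 for rank merging
    return {e: v / total for e, v in raw.items()}

stats = {"engineA": (30, 100), "engineB": (10, 100), "engineC": (20, 50)}
weights = engine_weights(stats)
# engineC has the best click-through ratio (0.4), so it receives the
# largest share of the merged-ranking weight.
```

Because the weights are recomputed from logged behavior rather than set by hand, they track real user preferences as those preferences change.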

6.
7.
A necessary condition for feedback stabilizability of a linear system with a controller of fixed order is proposed. The condition is useful for designing low-order regulators and, for single-input or single-output systems, can be easily checked by solving a standard linear-programming problem.

8.
Jansen, B.J. Computer, 2006, 39(7): 88-90
With paid search, the content provider, search engine, and user have mutually supporting goals. In paid (or sponsored) search, content providers pay Web search engines to display sponsored links in response to user queries, alongside the algorithmic links, also known as organic or nonsponsored links.

9.
This paper discusses economic applications of a recently developed artificial intelligence technique: Koza's genetic programming (GP). GP is an evolutionary search method related to genetic algorithms. In GP, populations of potential solutions consist of executable computer algorithms rather than coded strings. The paper provides an overview of how GP works and illustrates it with two applications: solving for the policy function in a simple optimal growth model, and estimating an unusual regression function. Results suggest that the GP search method can be an interesting and effective tool for economists.

10.
Controllability analysis is concerned with determining the limitations on achievable dynamic performance. This paper proposes the use of linear programming to determine the best linear controller and corresponding dynamic performance for problems of the form: . That is, a controller, K, and a reference operating point, u0, are selected to minimise a specified objective, J, while ensuring feasibility for all disturbances, w, within a specified set, W. When K is a linear time invariant (LTI) controller and the objective function J and the constraints c can be expressed as linear functions, the above problem can be solved by linear programming. This formulation encompasses a wide range of problems, from minimising the maximum deviation in the regulated outputs subject to disturbances of magnitude less than one (the l1 optimal control problem) to optimising the expected value of a linear economic objective (the Optimal Linear Dynamic Economics (OLDE) problem). The relationship of this work to other approaches to controllability analysis is discussed. A highly flexible framework for addressing typical process performance requirements through appropriate selection of J, c and W is presented. The relative merits of alternative approaches for defining the achievable closed loop transfer functions based on the -parameterisation are carefully discussed. The feasibility of the proposed approach is demonstrated on an industrial reactor example. Needs for further work are discussed.

11.
Kingoff, A. Computer, 1997, 30(4): 117-118
Search engines are sophisticated utilities designed expressly to find information on the global Internet. An expensive combination of high-speed computer networks and specialized software, they are usually created by large corporations and occasionally by universities. They are freely available to anyone with Internet access, and there are no search restrictions. With more than 150 search engines available, choosing the right one (or ones) is important. As with most products, no single engine is best for all searches and all users all the time. After comparing 50 of the most popular and powerful engines, I narrowed the field down to the four I found most useful: Alta Vista, Deja News, Excite, and Yahoo.

12.
Theory of search engines   Total citations: 4 (self-citations: 0, others: 4)
Four different stochastic matrices, useful for ranking the pages of the web, are defined. The theory is illustrated with examples.

13.
A new model for evolving Evolutionary Algorithms is proposed in this paper. The model is based on the Linear Genetic Programming (LGP) technique. Every LGP chromosome encodes an EA which is used for solving a particular problem. Several Evolutionary Algorithms for function optimization, the Traveling Salesman Problem, and the Quadratic Assignment Problem are evolved using the proposed model. Numerical experiments show that the evolved Evolutionary Algorithms perform similarly to, and sometimes better than, standard approaches on several well-known benchmark problems.

14.
This work presents an interactive fuzzy linear programming (FLP) approach for solving project management (PM) decision problems in a fuzzy environment. The proposed approach attempts to minimize total costs with reference to direct, indirect and penalty costs, durations of activities, specified project completion time and total allocated budget. A numerical example demonstrates the feasibility of applying the proposed FLP approach to actual PM decision problems. Accordingly, the proposed approach yields an efficient solution and determines the overall degree of decision maker (DM) satisfaction. Moreover, the proposed approach offers a systematic framework that facilitates the decision-making process, enabling a DM to interactively modify the range of the results when the environment data are vague until a satisfactory solution is obtained. In particular, several significant characteristics of the proposed FLP approach are elucidated in contrast to those of the main PM decision methods.

15.
Computers & Structures, 1987, 25(5): 661-664
A new variant of the simplex algorithm is used to distinguish between contact and non-contact points in elastic contact problems. The variant develops a unique solution of the problem in a finite number of analysis cycles. Applications of the method are shown for both contact and non-contact problems. Use of the modified algorithm in other classes of scleronomic analysis is described. It is concluded that the reformulation not only leads to a proof of solution uniqueness but also provides a solution method using the Phase I stage of the simplex algorithm.

16.
In this work, we investigate consumer reaction to web search engine logos. Our research is motivated by a small number of search engines dominating a market in which switching costs are low. The major research goal is to investigate the effect that brand logos have on search engine brand knowledge, which includes brand image and brand awareness. To investigate this goal, we surveyed 207 participants and used a mixed-method approach of sentiment analysis and a mutual information statistic to investigate our research questions. Our findings reveal that some search engines have logos that do not communicate a clear meaning, resulting in a confused brand message. Brand image varies among the top search engines, with consumers generally holding extremely positive or extremely negative brand opinions. Google elicited a string of positive comments from the participants, including several uses of the term 'love.' This is in line with the ultimate brand equity that Google has achieved (i.e., becoming the generic term for web search). Most of the other search engines, including Microsoft, had primarily negative terms associated with them, although AOL, Ask, and Yahoo! drew a mix of positive and negative comments. The implication is that the brand logo may be an important component interacting with the technology, both for established search engines and for those entering the market.

17.
In this article, we show the existence of a formal convergence between the matrix models of biological memories and the vector space models designed to extract information from large collections of documents. We first show that, formally, the term-by-document matrix (a mathematical representation of a set of codified documents) can be interpreted as an associative memory. In this framework, the dimensionality reduction of term-by-document matrices produced by latent semantic analysis (LSA) has a common factor with matrix biological memories. This factor consists of generating a statistical 'conceptualisation' of the data using weighted averages with little dispersion. We then present a class of matrix memory that builds up thematic blocks using multiplicative contexts. The thematic memories define modular networks that can be accessed using contexts as passwords. This mathematical structure emphasises the contacts between LSA and matrix memory models and invites us to interpret LSA, and similar procedures, as reverse engineering applied to context-deprived cognitive products, or to biological objects (e.g. genomes) selected during long evolutionary processes.

18.
Stereo by intra- and inter-scanline search using dynamic programming   Total citations: 14 (self-citations: 0, others: 14)
This paper presents a stereo matching algorithm using the dynamic programming technique. The stereo matching problem, that is, obtaining a correspondence between right and left images, can be cast as a search problem. When a pair of stereo images is rectified, pairs of corresponding points can be searched for within the same scanlines. We call this search intra-scanline search. This intra-scanline search can be treated as the problem of finding a matching path on a two-dimensional (2D) search plane whose axes are the right and left scanlines. Vertically connected edges in the images provide consistency constraints across the 2D search planes. Inter-scanline search in a three-dimensional (3D) search space, which is a stack of the 2D search planes, is needed to utilize this constraint. Our stereo matching algorithm uses edge-delimited intervals as the elements to be matched, and employs the two searches mentioned above: inter-scanline search for possible correspondences of connected edges in the right and left images, and intra-scanline search for correspondences of edge-delimited intervals on each scanline pair. Dynamic programming is used for both searches, which proceed simultaneously: the former supplies the consistency constraint to the latter while the latter supplies the matching score to the former. An interval-based similarity metric is used to compute the score. The algorithm has been tested with different types of images, including urban aerial images, synthesized images, and block scenes, and its computational requirements are discussed.
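The intra-scanline search described above can be sketched as a classic DP path search. This is an illustrative simplification, not the paper's interval-based algorithm: pixels (rather than edge-delimited intervals) are matched between two scanlines, matching a pair costs their intensity difference, and skipping a pixel (an occlusion) costs a fixed penalty, which is an assumed cost model.

```python
def match_scanlines(left, right, occlusion=3.0):
    """DP over the 2D plane whose axes are the left and right scanlines;
    returns the minimal total matching cost."""
    n, m = len(left), len(right)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            if i > 0 and j > 0:  # match left[i-1] with right[j-1]
                cost[i][j] = min(cost[i][j],
                                 cost[i-1][j-1] + abs(left[i-1] - right[j-1]))
            if i > 0:            # occlude a left pixel
                cost[i][j] = min(cost[i][j], cost[i-1][j] + occlusion)
            if j > 0:            # occlude a right pixel
                cost[i][j] = min(cost[i][j], cost[i][j-1] + occlusion)
    return cost[n][m]

left = [10, 10, 50, 90]
right = [10, 50, 90]
# The best path matches the three shared intensities and treats the
# extra left pixel as an occlusion (total cost = one occlusion penalty).
```

The paper's inter-scanline search stacks many such 2D planes into a 3D search space and couples them through edge-connectivity constraints, which this per-scanline sketch omits.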

19.
20.
Multimedia Tools and Applications - Traditional multimedia search engines retrieve results based mostly on the query submitted by the user, or using a log of previous searches to provide...


Copyright © Beijing Qinyun Technology Development Co., Ltd. (北京勤云科技发展有限公司)  京ICP备09084417号