Heuristic algorithms (HAs) are widely used in multi-objective reservoir optimal operation (MOROO) because of their computational speed and simplicity of design. The literature usually focuses on one or two categories of HAs and offers only a brief review of the state of the art. To provide both an overall understanding and a concrete comparison of HAs in MOROO, differential evolution (DE), particle swarm optimisation (PSO), and artificial physics optimisation (APO), typical examples of the three categories of HAs, are compared in terms of their development and applications using a designed experiment. In addition, the general model with constraints and fitness function, and the solution process using a hybrid of a feasible-domain restoration method and a penalty function method, are presented. In a designed experiment with multiple scenarios, the mean of the optimal objective function values, the standard deviation of the optimal objective function values, the mean computational time, and population diversity are used for comparison. The results show that (a) optimal long-term operation of a multipurpose reservoir is a mathematical programming problem with a narrow feasible region and a monotonic objective function; (b) HAs easily obtain the same optimal objective function value but different optimal solutions; and (c) the comparisons do not yield a clear winner, although DE appears more appropriate for MOROO.
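The penalty-function constraint handling described in this abstract can be illustrated with a minimal DE loop on a toy constrained problem. This is a generic sketch, not the paper's implementation: the test problem, DE variant (rand/1/bin), population size, penalty coefficient `rho`, and bound-clipping repair are all assumptions introduced for illustration.

```python
import numpy as np

def penalized_fitness(x, obj, constraints, rho=1e3):
    """Penalty function method: add rho times the total violation of
    inequality constraints g(x) <= 0 to the objective."""
    violation = sum(max(0.0, g(x)) for g in constraints)
    return obj(x) + rho * violation

def de_optimize(obj, constraints, bounds, pop_size=30, gens=200,
                F=0.5, CR=0.9, seed=0):
    """Minimal DE/rand/1/bin with penalty-based constraint handling."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([penalized_fitness(x, obj, constraints) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # pick three distinct individuals other than i
            idx = rng.choice([j for j in range(pop_size) if j != i],
                             size=3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)  # bound repair by clipping
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True  # ensure at least one mutated gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = penalized_fitness(trial, obj, constraints)
            if f_trial <= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

As a usage example, minimising the sphere function subject to x0 + x1 >= 1 (narrow-boundary optimum at x0 = x1 = 0.5) converges close to the penalised optimum of 0.5.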
Entity linking is a fundamental task in natural language processing. Entity linking with knowledge graphs aims to link mentions in text to their correct entities in a knowledge graph such as DBpedia or YAGO2. Most existing methods rely on hand-designed features to model the contexts of mentions and entities, which are sparse and hard to calibrate. In this paper, we present a neural model that is the first to combine a co-attention mechanism with a graph convolutional network for entity linking with knowledge graphs, extracting features of mentions and entities from their contexts automatically. Specifically, given the context of a mention and the context of one of its candidate entities, we introduce a co-attention mechanism to learn the relatedness between the mention context and the candidate entity context, and build the mention representation in consideration of that relatedness. Moreover, we propose a context-aware graph convolutional network for entity representation, which takes into account both the graph structure of the candidate entity and its relatedness with the mention context. Experimental results show that our model consistently outperforms the baseline methods on five widely used datasets.
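The co-attention step between the mention context and a candidate entity context can be sketched with a simple affinity-matrix formulation. This is a common way to realise co-attention and is only an illustration: the max-pooling over the affinity matrix, the embedding dimensions, and the pooled representations are assumptions, not necessarily the paper's exact design.

```python
import numpy as np

def co_attention(mention_ctx, entity_ctx):
    """Affinity-based co-attention sketch.

    mention_ctx: (m, d) array of mention-context word embeddings
    entity_ctx:  (n, d) array of entity-context word embeddings
    Returns attention-pooled mention and entity representations, each (d,).
    """
    # affinity matrix: pairwise relatedness between the two contexts
    A = mention_ctx @ entity_ctx.T  # shape (m, n)

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    # weight each word by its strongest relatedness to the other context
    alpha = softmax(A.max(axis=1))  # (m,) attention over mention words
    beta = softmax(A.max(axis=0))   # (n,) attention over entity words

    mention_repr = alpha @ mention_ctx  # (d,) relatedness-aware mention vector
    entity_repr = beta @ entity_ctx    # (d,) relatedness-aware entity vector
    return mention_repr, entity_repr
```

In a full model, such pooled vectors would feed a scoring layer over candidate entities; here the point is only how co-attention ties each representation to its relatedness with the other context.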
The Journal of Supercomputing - For sophisticated applications, engineers should always consider multi-objective, multi-task or multi-modal problems, especially in the Internet of Things, such as...
Microsystem Technologies - Artificial intelligence (AI), together with its applications, has received world-wide attention and is expected to exert force on the development of global economy and...