20 similar documents retrieved
1.
L. F. Pau 《Journal of Intelligent and Robotic Systems》1988,1(2):103-116
This paper reviews some knowledge representation approaches devoted to the sensor fusion problem, as encountered whenever images, signals, and text must be combined to provide the input to a controller or to an inference procedure. The basic steps involved in deriving the knowledge representation scheme are:
- locate a representation, based on exogenous context information;
- compare two representations to find out whether they refer to the same object/entity;
- merge sensor-based features from the various representations of the same object into a new set of features or attributes;
- aggregate the representations into a joint fused representation, usually more abstract than each of the sensor-related representations.
2.
A human tracking system based on the integration of measurements from an inertial motion capture system and a UWB (Ultra-Wide Band) location system has been developed. On the one hand, the rotational measurements from the inertial system are used to precisely track all limbs of the human body. On the other hand, the translational measurements from both systems are combined by three different fusion algorithms (a Kalman filter, a particle filter, and a combination of both) in order to obtain a precise global localization of the human in the environment. Several experiments have been performed to compare their accuracy and computational efficiency.
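The abstract compares a Kalman filter, a particle filter, and their combination; purely as a hedged, one-dimensional illustration of the first of these, the sketch below fuses inertial dead-reckoning (prediction) with sparse UWB position fixes (correction) in a plain Kalman filter. The constant-velocity model, noise values, and update rates are assumptions, not the authors' implementation.

```python
import numpy as np

# Minimal 1-D constant-velocity Kalman filter that fuses inertial
# dead-reckoning (prediction) with sparse UWB position fixes (update).
# All parameters are illustrative assumptions.
dt = 0.01                                   # inertial sample period [s]
F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition (pos, vel)
B = np.array([[0.5 * dt**2], [dt]])         # control input: acceleration
Q = np.diag([1e-4, 1e-3])                   # process noise (inertial drift)
H = np.array([[1.0, 0.0]])                  # UWB measures position only
R = np.array([[0.05**2]])                   # UWB noise, roughly 5 cm std

x = np.zeros((2, 1))                        # state estimate [pos, vel]
P = np.eye(2)                               # estimate covariance

def predict(accel):
    """Propagate the state with an inertial acceleration sample."""
    global x, P
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

def update(uwb_pos):
    """Correct the state with a UWB position fix."""
    global x, P
    y = np.array([[uwb_pos]]) - H @ x       # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

# Usage: predict at the inertial rate, update whenever a UWB fix arrives.
for step in range(1000):
    predict(accel=0.0)
    if step % 100 == 0:                     # UWB fixes arrive much more slowly
        update(uwb_pos=1.0)
```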
3.
J. Z. Sasiadek 《Annual Reviews in Control》2002,26(2):213
Sensor fusion is a method of integrating signals from multiple sources. It allows information extracted from several different sources to be integrated into a single signal or piece of information. In many cases the sources of information are sensors or other devices that allow perception or measurement of a changing environment. Information received from multiple sensors is processed using “sensor fusion” or “data fusion” algorithms. These algorithms can be classified into three groups: first, fusion based on probabilistic models; second, fusion based on least-squares techniques; and third, intelligent fusion. The probabilistic-model methods are Bayesian reasoning, evidence theory, robust statistics and recursive operators. The least-squares techniques are Kalman filtering, optimal theory, regularization and uncertainty ellipsoids. The intelligent fusion methods are fuzzy logic, neural networks and genetic algorithms. This paper presents three different methods of intelligent information fusion for different engineering applications. Chapter 2 is based on the Sasiadek and Wang (2001) paper and presents an application of adaptive Kalman filtering to the problem of information fusion for guidance, navigation, and control. Chapter 3 is based on Sasiadek and Hartana (2000) and Chapter 4 on the Sasiadek and Khe (2001) paper.
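The review cites adaptive Kalman filtering for guidance, navigation, and control. The fragment below is only a hedged, generic sketch of one common adaptation scheme (re-estimating the measurement-noise covariance from recent innovation statistics); it is not Sasiadek and Wang's algorithm, and the window length and noise values are assumptions.

```python
import numpy as np
from collections import deque

# Hedged sketch: innovation-based adaptive estimation for a scalar state.
# The measurement-noise variance r is re-estimated from a sliding window
# of innovations, so the filter de-weights a sensor whose errors grow.
q, r = 1e-4, 1.0            # initial process / measurement noise variances
x, p = 0.0, 1.0             # state estimate and its variance
window = deque(maxlen=30)   # recent innovations

def adaptive_step(z):
    global x, p, r
    p_pred = p + q                      # predict (random-walk model)
    nu = z - x                          # innovation
    window.append(nu)
    if len(window) == window.maxlen:    # adapt r from innovation statistics
        c_nu = float(np.mean(np.square(list(window))))
        r = max(c_nu - p_pred, 1e-6)
    k = p_pred / (p_pred + r)           # update
    x = x + k * nu
    p = (1.0 - k) * p_pred
    return x

measurements = 1.0 + 0.5 * np.random.randn(200)
for z in measurements:
    adaptive_step(z)
```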
4.
5.
XML is rapidly becoming a standard for data representation and exchange. It provides a common format for expressing both data structures and contents. As such, it can help in integrating structured, semistructured, and unstructured data over the Web. Still, it is well recognized that XML alone cannot provide a comprehensive solution to the multifaceted problem of data integration. There are still several challenges to face, including: developing a formal foundation for Web metadata standards; developing techniques and tools for the creation, extraction, and storage of metadata; investigating the area of semantic interoperability frameworks; and developing semantic-based tools for knowledge discovery.
6.
We report on a hybrid 12-dimensional full body state estimator for a hexapod robot executing a jogging gait in steady state on level terrain with regularly alternating ground contact and aerial phases of motion. We use a repeating sequence of continuous time dynamical models that are switched in and out of an extended Kalman filter to fuse measurements from a novel leg pose sensor and inertial sensors. Our inertial measurement unit supplements the traditionally paired three-axis rate gyro and three-axis accelerometer with a set of three additional three-axis accelerometer suites, thereby providing additional angular acceleration measurement, avoiding the need for localization of the accelerometer at the center of mass on the robot's body, and simplifying installation and calibration. We implement this estimation procedure offline, using data extracted from numerous repeated runs of the hexapod robot RHex (bearing the appropriate sensor suite) and evaluate its performance with reference to a visual ground-truth measurement system, comparing as well the relative performance of different fusion approaches implemented via different model sequences.
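The paper's 12-dimensional switched-model filter is not reproduced here; the snippet below is only a hedged, one-dimensional sketch of the switching idea: a single Kalman filter whose process model is swapped according to the current phase (ground contact versus aerial). The vertical spring model, phase schedule, and noise levels are invented for illustration.

```python
import numpy as np

# Hedged sketch: swap the process model of one Kalman filter between a
# "stance" (leg-spring) model and an "aerial" (ballistic) model.
dt, g, k_over_m = 0.001, 9.81, 400.0
MODELS = {
    "aerial": {                        # ballistic flight: z'' = -g
        "F": np.array([[1.0, dt], [0.0, 1.0]]),
        "u": np.array([[-0.5 * g * dt**2], [-g * dt]]),
        "Q": np.diag([1e-6, 1e-4]),
    },
    "stance": {                        # linearized leg spring about rest height
        "F": np.array([[1.0, dt], [-k_over_m * dt, 1.0]]),
        "u": np.zeros((2, 1)),
        "Q": np.diag([1e-6, 1e-3]),
    },
}
H = np.array([[1.0, 0.0]])             # leg-pose sensor observes height
R = np.array([[1e-4]])

x, P = np.array([[0.2], [0.0]]), np.eye(2) * 0.01

def step(phase, z_meas=None):
    """One predict step with the phase-specific model, plus an optional update."""
    global x, P
    m = MODELS[phase]
    x = m["F"] @ x + m["u"]
    P = m["F"] @ P @ m["F"].T + m["Q"]
    if z_meas is not None:              # leg-pose fix only during stance
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z_meas]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P

for t in range(2000):
    phase = "stance" if (t // 500) % 2 == 0 else "aerial"
    step(phase, z_meas=0.18 if phase == "stance" else None)
```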
7.
Situational data integration with data services and nested table
Yanbo Han, Guiling Wang, Guang Ji, Peng Zhang 《Service Oriented Computing and Applications》2013,7(2):129-150
Situational data integration is often ad hoc, involves active participation of business users, and requires just-in-time treatment. Agility and end-user programming are therefore of prime importance. The paper presents a spreadsheet-like programming environment called Mashroom, which offers the agility and expressive power required to support situational data integration by non-professional users. In Mashroom, various data sources are encapsulated as data services, with nested tables as their unified data model both for internal processing and for external use. Users can operate on the nested tables interactively. Mashroom also supports the basic control-flow patterns. The expressive power of Mashroom is analyzed and proved to be richer than N1NF relational algebra; all XQuery expressions can be mapped to Mashroom operations and formulas. Experiments have revealed the potential of Mashroom for situational data integration.
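Mashroom itself is not available here; purely as a hedged illustration of the nested-table (non-first-normal-form) data model the abstract describes, the snippet below represents a nested table with plain Python structures and applies two classic N1NF-style operators, unnest and nest. The column names and the exact operator set are assumptions.

```python
# Hedged illustration of a nested table: a list of rows whose cells may
# themselves hold nested tables (lists of inner rows).
customers = [
    {"name": "Acme",   "orders": [{"item": "bolt", "qty": 10},
                                  {"item": "nut",  "qty": 25}]},
    {"name": "Globex", "orders": [{"item": "gear", "qty": 3}]},
]

def unnest(rows, column):
    """Flatten one nested column: each inner row joins its outer row."""
    flat = []
    for row in rows:
        outer = {k: v for k, v in row.items() if k != column}
        for inner in row[column]:
            flat.append({**outer, **inner})
    return flat

def nest(rows, key, column):
    """Inverse operation: group rows by `key` back into a nested column."""
    grouped = {}
    for row in rows:
        inner = {k: v for k, v in row.items() if k != key}
        grouped.setdefault(row[key], []).append(inner)
    return [{key: k, column: v} for k, v in grouped.items()]

flat_orders = unnest(customers, "orders")      # one row per (customer, order)
renested = nest(flat_orders, "name", "orders") # back to the nested form
```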
8.
《Data Processing》1984,26(10):23-24
Speech-plus-data systems offer financial advantages to companies with medium data traffic flows. Systems with data rates of 1200–2400 bit/s are cheaper than high-speed data channels and offer the possibility of transmitting speech as well as data. Lower-rate data transmission systems, while not allowing fast data transfer, are more flexible than higher-rate systems. Examples of organizations using each type of system are discussed.
9.
H. Varma, K. Fadaie, M. Habbane, J. Stockhausen 《International journal of remote sensing》2013,34(4):627-636
Data fusion is a rapidly emerging technology. Numerous diverse definitions are being promoted and adopted for various application techniques. The term 'data fusion' is being loosely used to signify the combination of often large amounts of diverse data into a consistent, accurate and intelligible whole. There are several distinct types of data fusion. In some, the data correspond to different attributes associated with the same geometry within one architecture. In others, the data consist effectively of repeated measurements of different types of attributes that are assembled together using overlay techniques, in what was formerly known as data compilation or data assimilation. In the former case, the data have to be fused in an intelligent manner, taking into account the different natures of the attributes, to gain as complete a picture as possible of the object from its component attributes. In the latter, the data are merely overlaid to produce a mosaic of different types of attribution at the application level. The term data fusion can thus be broken into two components: true fusion, where one geometry is shared by multiple attributes within a single architecture or file; and data assimilation, where multiple redundant geometries with attributes are brought within the same context using overlay techniques.
10.
An Improved Data Fusion Method for Intelligent Sensors
This paper proposes an intelligent-sensor data fusion method that uses fuzzy set theory and evidence theory. The main idea is as follows: taking the characteristics of intelligent sensors into account, the membership functions obtained by each sensor are first converted into basic probability assignments, and an improved combination rule is then used to combine the evidence, yielding the fusion result. The method specifies how measurement data are converted into basic probability assignments and also resolves the evidence-conflict problem frequently encountered when combining evidence. Finally, an example is used to illustrate the advantages of this method over conventional approaches and to demonstrate its effectiveness in practice.
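The paper's improved combination rule is not reproduced here; as a hedged sketch of the general pipeline it describes, the code below normalizes fuzzy membership degrees into basic probability assignments and combines two bodies of evidence with the standard Dempster rule. The hypotheses and membership values are invented for illustration.

```python
from itertools import product

def memberships_to_bpa(memberships):
    """Normalize fuzzy membership degrees into a basic probability
    assignment over singleton hypotheses (a common, simple conversion)."""
    total = sum(memberships.values())
    return {frozenset([h]): m / total for h, m in memberships.items()}

def dempster_combine(m1, m2):
    """Standard Dempster rule: multiply masses of intersecting focal
    elements and renormalize by 1 - K, where K is the conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Illustrative sensor readings expressed as membership degrees over three
# fault hypotheses (values are made up).
sensor1 = memberships_to_bpa({"normal": 0.7, "drift": 0.2, "fault": 0.1})
sensor2 = memberships_to_bpa({"normal": 0.6, "drift": 0.3, "fault": 0.1})
fused = dempster_combine(sensor1, sensor2)
```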
11.
《Advanced Robotics》2013,27(6):537-549
This paper describes how to design a data fusion module in a skill transfer system. The data fusion paradigm is addressed; it consists of two independent modules for optimal fusion and filtering. A new interpretation of the Kalman filter equations is given, yielding a 'model-free' formulation capable of tracking arbitrary variables. An engineering approach is used to tune the parameters of interest for a given task. The fusion algorithm presented here is global and can easily be extended to any arbitrary system. It was successfully tested in a human-robot skill transfer of the peg-in-hole task at DLR.
12.
Optimal mean square linear estimators are determined for general uncorrelated noise. We allow the noise variance matrix in the observation process to be singular. This requires properties of generalized inverses, which are developed in Section II. The proofs appear to be new. When there are two observation sequences, the optimal method of recursively fusing the two is determined. We derive a new formula for the covariance of the two estimates, which then provides exact dynamics for a fused estimate.
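As a hedged numerical sketch of the fusion step the abstract refers to: for two unbiased estimates of the same quantity with known error covariances and, in the simplest case, no cross-correlation, the minimum-variance linear combination weights each estimate by the inverse of its covariance. The paper's treatment of the cross-covariance between the two estimates is not reproduced; the numbers below are illustrative only.

```python
import numpy as np

def fuse_independent(x1, P1, x2, P2):
    """Minimum-variance fusion of two uncorrelated unbiased estimates:
    P_f = (P1^-1 + P2^-1)^-1,  x_f = P_f (P1^-1 x1 + P2^-1 x2)."""
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    Pf = np.linalg.inv(P1i + P2i)
    xf = Pf @ (P1i @ x1 + P2i @ x2)
    return xf, Pf

# Two estimates of the same 2-D state obtained from separate observation
# sequences (covariances are invented for illustration).
x1, P1 = np.array([1.0, 0.5]), np.diag([0.04, 0.09])
x2, P2 = np.array([1.2, 0.4]), np.diag([0.10, 0.02])
x_fused, P_fused = fuse_independent(x1, P1, x2, P2)
```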
13.
Li Menggang, Wang Fang, Jia Xiaojun, Li Wenrui, Li Ting, Rui Guangwei 《Neural computing & applications》2021,33(10):4729-4739
Neural Computing and Applications - Economic data include data of various types and characteristics such as macro-data, meso-data, and micro-data. The source of economic data can be the data...
14.
Research on Data Integration in Web Data Mining
Based on an analysis of the characteristics of data sources in the Web environment, the problem of data integration in Web data mining is studied in depth, and an integration scheme based on XML technology is presented. The scheme uses a Web data access approach to bring different data sources together, providing a unified and effective data set for Web data mining and solving the difficult problem of integrating heterogeneous Web data sources. The process of Web data integration is illustrated with a concrete example.
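The paper's concrete scheme is not reproduced; the hedged sketch below only illustrates the general idea of wrapping heterogeneous sources (stubbed here as a CSV string and a JSON string) into a single unified XML document that a Web mining step could consume. The source names and target element layout are assumptions.

```python
import csv, io, json
import xml.etree.ElementTree as ET

# Two heterogeneous "Web" sources, stubbed as in-memory strings.
csv_source = "id,name,price\n1,bolt,0.10\n2,nut,0.05"
json_source = '[{"id": 3, "name": "gear", "price": 2.50}]'

def rows_from_csv(text):
    return list(csv.DictReader(io.StringIO(text)))

def rows_from_json(text):
    return [{k: str(v) for k, v in rec.items()} for rec in json.loads(text)]

def to_unified_xml(sources):
    """Wrap rows from every source into one XML dataset with provenance."""
    root = ET.Element("dataset")
    for name, rows in sources.items():
        for row in rows:
            rec = ET.SubElement(root, "record", source=name)
            for key, value in row.items():
                ET.SubElement(rec, key).text = value
    return root

unified = to_unified_xml({
    "catalog_csv": rows_from_csv(csv_source),
    "catalog_json": rows_from_json(json_source),
})
print(ET.tostring(unified, encoding="unicode"))
```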
15.
Today, the Web is the largest source of information worldwide. There is currently a strong trend for decision-making applications such as Data Warehousing (DW) and Business Intelligence (BI) to move onto the Web, especially into the cloud. Integrating data into DW/BI applications is a critical and time-consuming task. To support better decisions in DW/BI applications, next-generation data integration poses new requirements to data integration systems, beyond those posed by traditional data integration. In this paper, we propose a generic, metadata-based, service-oriented, and event-driven approach for integrating Web data in a timely and autonomous manner. Besides handling data heterogeneity, distribution and interoperability, our approach satisfies near real-time requirements and realizes active data integration. To this end, we design and develop a framework that utilizes Web standards (e.g., XML and Web services) to tackle data heterogeneity, distribution and interoperability issues. Moreover, our framework utilizes Active XML (AXML) to warehouse passive data, as well as services to integrate active and dynamic data on the fly. AXML embedded services and change-detection services ensure near real-time data integration. Furthermore, the idea of integrating Web data actively and autonomously revolves around mining events logged by the data integration environment. Therefore, we propose an incremental XML-based algorithm for mining association rules from logged events. We then define active rules dynamically upon the mined data to automate and reactivate integration tasks. Finally, as a proof of concept, we implement a framework prototype as a Web application using open-source tools.
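The incremental XML-based algorithm in the paper is not reproduced; as a hedged, non-incremental illustration of mining association rules from logged integration events, the sketch below counts frequent itemsets over small event transactions and derives rules above a confidence threshold. The event names and thresholds are invented.

```python
from itertools import combinations

# Each transaction is the set of event types logged for one integration run.
transactions = [
    {"source_changed", "fetch", "transform", "load"},
    {"source_changed", "fetch", "load"},
    {"schema_changed", "fetch", "transform"},
    {"source_changed", "fetch", "transform", "load"},
]
MIN_SUPPORT, MIN_CONFIDENCE = 0.5, 0.8

def support(itemset):
    """Fraction of transactions containing every item of the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Frequent itemsets up to size 3 (brute force is fine at this scale).
items = sorted(set().union(*transactions))
frequent = [frozenset(c)
            for size in (1, 2, 3)
            for c in combinations(items, size)
            if support(set(c)) >= MIN_SUPPORT]

# Rules X -> Y with confidence = support(X ∪ Y) / support(X).
rules = []
for itemset in (f for f in frequent if len(f) > 1):
    for k in range(1, len(itemset)):
        for lhs in map(frozenset, combinations(itemset, k)):
            rhs = itemset - lhs
            conf = support(itemset) / support(lhs)
            if conf >= MIN_CONFIDENCE:
                rules.append((set(lhs), set(rhs), conf))

for lhs, rhs, conf in rules:
    print(f"{lhs} => {rhs}  (confidence {conf:.2f})")
```

Rules mined in this way could then be turned into active rules that trigger the corresponding integration tasks automatically, which is the role they play in the approach described above.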
16.
Model integration is becoming increasingly important as our impacts on the environment become more severe and the systems we analyze become more complex. There are numerous attempts to make different models work in concert. However, model integration usually treats models as software components only, ignoring the evolving nature of models and their constant modification and re-calibration to better represent reality. As a result, changes that used to affect only self-contained models of subsystems now propagate throughout the integrated system, across multiple model components. This makes it harder to keep the overall complexity under control and, in a way, defeats the purpose of modularity, where efficiency is supposed to be gained from independent development of modules. We argue that the data available for module calibration can serve as an intermediate linkage tool, sitting between modules and providing a module-independent baseline, which is then adjusted when scenarios are run. In this case, it is not the model output that is fed directly into the next model. Rather, model output is presented as a variation around the baseline trajectory, and it is this variation that is then fed into the next module down the chain. The Chesapeake Bay Program suite of models is used to illustrate these problems and a possible remedy.
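As a hedged sketch of the linkage idea only (the Chesapeake Bay Program models are obviously not reproduced): the upstream module's scenario output is expressed as a variation around a shared, data-derived calibration baseline, and the downstream module consumes that baseline plus the variation, so re-calibrating the upstream module does not change the baseline the downstream module was calibrated against. All series and coefficients are invented.

```python
# Hedged sketch: modules exchange variations around a shared, data-derived
# baseline instead of raw outputs. Numbers are invented.
observed_baseline = [100.0, 120.0, 90.0, 110.0]   # calibration data (shared)

def module_a_scenario_run():
    """Upstream model output for a scenario (absolute values)."""
    return [108.0, 119.0, 99.0, 121.0]

def variation(run, baseline):
    """Express the run as a deviation from the baseline trajectory."""
    return [r - b for r, b in zip(run, baseline)]

def module_b(input_series):
    """Downstream model, calibrated against the same observed baseline."""
    return [0.02 * v for v in input_series]        # e.g. load = 2% of flow

# Module B is driven by baseline-plus-variation, so replacing or
# re-calibrating module A only changes the variation term, not the
# baseline that B was calibrated against.
delta = variation(module_a_scenario_run(), observed_baseline)
scenario_input = [b + d for b, d in zip(observed_baseline, delta)]
scenario_loads = module_b(scenario_input)
```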
17.
The extension of dataflow testing to interprocedural testing is described. This was done by developing both an analysis technique that computes the required interprocedural definition-use information, for both direct and indirect dependencies, and a testing technique that uses this information in selecting and executing subpaths across procedure boundaries. A testing tool that implements this technique is presented. For the interprocedural dataflow analysis, the technique summarizes the individual procedures' definition and use information at call sites and then propagates this information throughout the interacting procedures. By efficiently computing the interprocedural data dependencies before testing, the approach lets the testing tool use existing dataflow-based path-selection techniques for interprocedural testing. To track the execution path, the technique recognizes calls to and returns from procedures and handles the association of various names with a definition as the execution path is inspected. The technique handles recursive procedures and supports separate compilation of procedures.
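The analysis in the paper is far more general than this; purely as a hedged toy illustration of what an interprocedural definition-use pair is, the snippet below pairs a definition of an actual parameter in the caller with uses of the corresponding formal parameter in the callee. The dictionary-based program representation and the example program are invented.

```python
# Hedged toy illustration of interprocedural definition-use pairs.
caller = {
    "name": "main",
    "defs": {"x": 1, "y": 2},           # variable -> line where defined
    "call": {"line": 3, "callee": "scale", "actuals": ["x", "y"]},
}
callee = {
    "name": "scale",
    "formals": ["a", "b"],
    "uses": {"a": [10], "b": [11, 12]},  # formal -> lines where used
}

def interprocedural_du_pairs(caller, callee):
    """Pair each caller-side definition with the callee-side uses it
    reaches through the parameter binding at the call site."""
    binding = dict(zip(callee["formals"], caller["call"]["actuals"]))
    pairs = []
    for formal, use_lines in callee["uses"].items():
        actual = binding[formal]
        def_line = caller["defs"].get(actual)
        if def_line is not None:
            for use_line in use_lines:
                pairs.append((f"{caller['name']}:{def_line}",
                              f"{callee['name']}:{use_line}", actual))
    return pairs

# A dataflow-adequate test suite would need to exercise each such pair.
for d, u, var in interprocedural_du_pairs(caller, callee):
    print(f"def of {var} at {d} reaches use at {u}")
```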
18.
19.
The success or failure of high-tech combat operations depends to a large extent on how quickly battlefield information can be acquired, how accurately it can be analyzed, and how securely it can be transmitted. Supported by technologies such as multi-channel information aggregation, information sensing, information fusion, data mining, information security, and intelligent push, an intelligent platform for information aggregation, fusion, and decision distribution is constructed. The platform acquires information from different spaces and from the ground, integrates and fuses it, and then analyzes and mines it; according to the specific information needs of each end user, it packages and distributes the information proactively, quickly, accurately, and securely, so as to improve the effectiveness and shareability of battlefield information and the speed and security of its transmission.
20.
An Improved Consistency Data Fusion Algorithm
To address two problems of current digital filtering algorithms, namely their demanding requirements on prior information and the need to preset a threshold when defining the degree of support between data, the ideas of relative distance and confidence distance are introduced on top of a measurement-variance-weighted algorithm to improve its suboptimal fusion estimate. Simulation results clearly demonstrate the effectiveness of the estimation algorithm.
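The paper's specific improvement is not reproduced; as a hedged sketch of the ingredients it names, the code below combines measurement-variance weighting with a support weight derived from pairwise confidence distances, so an outlying sensor is down-weighted without a hand-set threshold. The confidence-distance and weighting formulas are a common textbook-style choice, not necessarily the authors' formulation.

```python
import numpy as np
from math import erf, sqrt

def confidence_distance(xi, si, xj):
    """d_ij = 2 * |Phi((xj - xi) / si) - 0.5|: how far sensor j's reading
    lies in sensor i's error distribution (0 = identical, near 1 = outlier)."""
    return abs(erf((xj - xi) / (sqrt(2.0) * si)))

def fuse(readings, sigmas):
    """Weight each sensor by its mutual support (1 - confidence distance,
    averaged over the other sensors) divided by its measurement variance,
    then form the weighted mean of the readings."""
    n = len(readings)
    support = np.zeros(n)
    for i in range(n):
        support[i] = np.mean([1.0 - confidence_distance(readings[i], sigmas[i], readings[j])
                              for j in range(n) if j != i])
    weights = support / np.square(sigmas)
    weights /= weights.sum()
    return float(np.dot(weights, readings))

# Illustrative readings of the same quantity; the last sensor is an outlier.
readings = np.array([10.1, 9.9, 10.2, 13.0])
sigmas = np.array([0.2, 0.2, 0.3, 0.2])
print(fuse(readings, sigmas))
```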