Recent advances in both anthropomorphic robots and bimanual industrial manipulators have led to an increased interest in the specific problems pertaining to dual arm manipulation. For the future, we foresee robots performing human-like tasks in both domestic and industrial settings. It is therefore natural to study the specifics of dual arm manipulation in humans and methods for using the resulting knowledge in robot control. The related scientific problems range from low-level control to high-level task planning and execution. This review aims to summarize the current state of the art across the heterogeneous range of fields that study the different aspects of these problems, specifically in dual arm manipulation.
An interatomic potential for the vanadium–hydrogen binary system has been developed based on the second nearest-neighbor modified embedded-atom method (2NN MEAM) potential formalism, in combination with the previously developed potentials for V and H. First-principles calculations have also been carried out to provide data on the physical properties of this system, which are necessary for the optimization of the potential parameters. The developed potential reasonably reproduces the fundamental physical properties (thermodynamic, diffusion, elastic and volumetric) of the V-rich bcc solid solution and some of the vanadium hydride phases. The applicability of this potential to the development of V-based alloys for hydrogen applications is discussed.
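For context, the generic embedded-atom energy expression from the (M)EAM literature (not the specific V–H parameterization developed in the paper) has the form:

```latex
E_{\mathrm{tot}} = \sum_i \left[ F_i(\bar{\rho}_i)
  + \frac{1}{2} \sum_{j \neq i} S_{ij}\, \phi_{ij}(R_{ij}) \right],
```

where \(F_i\) is the embedding energy as a function of the background electron density \(\bar{\rho}_i\) at atom \(i\), \(\phi_{ij}\) is the pair interaction at separation \(R_{ij}\), and \(S_{ij}\) is the many-body screening function characteristic of the MEAM formalism. Fitting the potential amounts to optimizing the parameters of these functions against the first-principles data.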
The purpose of this paper is to present a new approach for measurand uncertainty characterization. Markov chain Monte Carlo (MCMC) is applied to the estimation of the measurand probability density function (pdf), which is treated as an inverse problem. The measurement characterization is driven by the pdf estimation in a nonlinear Gaussian framework with unknown variance and with limited observed data. These techniques are applied to a realistic measurand problem: groove dimensioning using remote field eddy current (RFEC) inspection. The application of resampling methods such as the bootstrap and perfect sampling for convergence diagnostics yields large improvements in the accuracy of the MCMC estimates.
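As an illustrative sketch only (not the paper's RFEC implementation): random-walk Metropolis–Hastings estimating a measurand pdf from limited data, under the simplifying assumptions of a flat prior and a Gaussian likelihood with known noise scale. All names (`log_post`, `metropolis`) and the toy data are ours.

```python
# Hedged sketch: MCMC estimation of a measurand pdf from few observations.
# Assumed model (not from the paper): y_i = m + Gaussian noise, flat prior on m.
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.5
y = np.array([2.1, 1.9, 2.3, 2.0, 2.2])   # limited observed data

def log_post(m):
    # log posterior up to an additive constant
    return -0.5 * np.sum((y - m) ** 2) / sigma**2

def metropolis(n_steps=20000, step=0.3, m0=0.0):
    samples, m, lp = [], m0, log_post(m0)
    for _ in range(n_steps):
        cand = m + rng.normal(scale=step)
        lp_cand = log_post(cand)
        if np.log(rng.uniform()) < lp_cand - lp:   # accept/reject
            m, lp = cand, lp_cand
        samples.append(m)
    return np.array(samples[n_steps // 2:])        # discard burn-in

posterior = metropolis()
print(posterior.mean())   # close to the sample mean of y
```

The retained samples approximate the measurand pdf; quantiles of `posterior` give the uncertainty interval. Bootstrap resampling of the chain, as the abstract mentions, can then be used to diagnose convergence.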
Scheduling constitutes an integral feature of Grid computing infrastructures and is key to realizing several of the Grid's promises. In particular, scheduling can maximize the resources available to end users, accelerate the execution of jobs, and support scalable and autonomic management of the resources comprising a Grid. Grid scheduling functionality hinges on middleware components called meta-schedulers, which automatically distribute jobs across the dispersed heterogeneous resources of a Grid. In this paper we present the design and implementation of a Grid meta-scheduler, which we call EMPEROR. EMPEROR provides a framework for implementing scheduling algorithms based on performance criteria. In implementing a particular instantiation of this framework, we have devised models for predicting host load and memory resources, and accordingly for estimating the running time of a task. These models hinge on time series analysis techniques and take into account results from the cluster computing literature. Apart from incorporating these models, EMPEROR provides fully fledged Grid scheduling functionality that complies with OGSA standards as the latter are reflected in the Globus toolkit. Specifically, EMPEROR interfaces to Globus middleware services (i.e., GSI, MDS, GRAM) to discover resources, implement the scheduling algorithm, and ultimately submit jobs to local scheduling systems. By and large, EMPEROR is one of the few standards-based meta-schedulers making use of dynamic scheduling information.
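The abstract does not specify EMPEROR's exact prediction model, so the sketch below shows one common time-series approach of the kind described: an autoregressive AR(2) predictor of host load fitted by least squares. The AR order, the synthetic load trace, and all names are assumptions of ours, purely for illustration.

```python
# Hedged sketch: one-step-ahead host-load prediction with an AR(2) model.
import numpy as np

rng = np.random.default_rng(1)
# synthetic CPU-load history in [0, 1] with a slow periodic trend
load = 0.5 + 0.3 * np.sin(np.arange(200) / 10.0) + rng.normal(0, 0.02, 200)

p = 2  # AR order (an assumption, not from the paper)
# regressor columns: load[t-1], load[t-2] for each target load[t]
X = np.column_stack([load[p - k - 1 : len(load) - k - 1] for k in range(p)])
X = np.column_stack([np.ones(len(X)), X])      # intercept term
coef, *_ = np.linalg.lstsq(X, load[p:], rcond=None)

# one-step-ahead prediction from the two most recent samples
next_load = coef[0] + coef[1] * load[-1] + coef[2] * load[-2]
print(round(next_load, 3))
```

A meta-scheduler can then turn such a load forecast into a running-time estimate for a task and rank candidate hosts accordingly.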
The directed self‐assembly of diblock copolymer chains (poly(1,1‐dimethyl silacyclobutane)‐block‐polystyrene, PDMSB‐b‐PS) into a thin‐film double‐gyroid structure is described. A slowing of the formation kinetics of the typical double‐wave pattern within the 3D nanostructure is reported when the film thickness on mesas is smaller than the gyroid unit cell. However, optimization of the solvent‐vapor annealing process yields very large grains (over 10 µm²) with a specific orientation (i.e., parallel to the air and substrate interfaces) and direction (i.e., along the groove direction) of the characteristic (211) plane, as demonstrated by templating sub‐100‐nm‐thick PDMSB‐b‐PS films.
We intend to show the potential of numerical simulation of atmospheric turbulence to help find optimal sites for astronomical observation. We present results obtained with an atmospheric model in which a representation of turbulence has been included. The model simulates the atmospheric flow over any given area, including the gross characteristics of the turbulence, from which maps of the astronomical seeing can be retrieved. A validation of the approach is obtained with actual measurements of the seeing, taken during field campaigns at two different sites. We find a good correlation in time between the observed and simulated values of the seeing, and we argue that this result can be extrapolated to spatial correlations.
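For reference, the standard relation from optical-turbulence theory (not specific to this paper's model) by which seeing is retrieved from a simulated refractive-index structure constant \(C_n^2\) goes through the Fried parameter:

```latex
r_0 = \left[ 0.423\, k^2 \int C_n^2(h)\, dh \right]^{-3/5},
\qquad
\varepsilon_{\mathrm{FWHM}} \approx 0.98\, \frac{\lambda}{r_0},
```

where \(k = 2\pi/\lambda\) and the integral runs over altitude \(h\) along the line of sight. A map of the vertically integrated \(C_n^2\) over the simulated area therefore translates directly into a map of the seeing angle \(\varepsilon_{\mathrm{FWHM}}\).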
We developed a formal framework for conflict-driven clause learning (CDCL) using the Isabelle/HOL proof assistant. Through a chain of refinements, an abstract CDCL calculus is connected first to a more concrete calculus, then to a SAT solver expressed in a functional programming language, and finally to a SAT solver in an imperative language, with total correctness guarantees. The framework offers a convenient way to prove metatheorems and experiment with variants, including the Davis–Putnam–Logemann–Loveland (DPLL) calculus. The imperative program relies on the two-watched-literal data structure and other optimizations found in modern solvers. We used Isabelle’s Refinement Framework to automate the most tedious refinement steps. The most noteworthy aspects of our work are the inclusion of rules for forget, restart, and incremental solving and the application of stepwise refinement. 相似文献
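As a toy illustration only (not the verified Isabelle/HOL development), the core DPLL loop that CDCL refines can be written in a few lines, on CNF clauses given as lists of integer literals (a negative integer denotes a negated variable):

```python
# Toy DPLL SAT solver: unit propagation plus binary branching.
def dpll(clauses, assignment=()):
    # simplify clauses under the current partial assignment
    simplified = []
    for clause in clauses:
        if any(lit in assignment for lit in clause):
            continue                     # clause already satisfied
        rest = [lit for lit in clause if -lit not in assignment]
        if not rest:
            return None                  # conflict: empty clause
        simplified.append(rest)
    if not simplified:
        return assignment                # all clauses satisfied: a model
    # unit propagation: a one-literal clause forces its literal
    for clause in simplified:
        if len(clause) == 1:
            return dpll(simplified, assignment + (clause[0],))
    # branch on the first unassigned literal, trying both polarities
    lit = simplified[0][0]
    return (dpll(simplified, assignment + (lit,))
            or dpll(simplified, assignment + (-lit,)))

# (x1 or x2) and (not x1 or x2) is satisfiable, e.g. with x2 true
model = dpll([[1, 2], [-1, 2]])
print(model)
```

CDCL extends this skeleton with clause learning on conflicts, non-chronological backjumping, and the two-watched-literal scheme that the verified imperative solver in the paper implements.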
Mining hidden knowledge from available datasets is an extremely time-consuming and demanding process, especially in our era of vast volumes of high-complexity data. Additionally, validation of the results requires the adoption of appropriate multifactor criteria, exhaustive testing and advanced error measurement techniques. This paper proposes a novel Hybrid Fuzzy Semi-Supervised Forecasting Framework. It combines fuzzy logic, semi-supervised clustering and semi-supervised classification in order to model Big Data sets in a faster and simpler manner. It uses as little pre-classified data as possible while providing a simple method of safe process validation; its advantages are clearly shown and discussed in the paper. This innovative approach is applied herein to model the air quality of the city of Athens. More specifically, it forecasts extreme air pollutant values and explores the parameters that affect their concentrations. It also establishes a correlation between pollution and general climatic conditions. Overall, it relates the built model to the disruptions that this serious environmental problem causes to city life.
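The paper's hybrid semi-supervised framework is more elaborate than any short snippet can convey, but its fuzzy-clustering ingredient can be sketched with plain fuzzy c-means. The synthetic one-dimensional "pollutant concentration" data and all names below are ours, purely for illustration.

```python
# Minimal fuzzy c-means sketch (illustrative, not the paper's framework).
import numpy as np

rng = np.random.default_rng(2)
# two synthetic clusters of "pollutant concentration" readings
data = np.concatenate([rng.normal(10, 1, 50), rng.normal(40, 1, 50)])

c, m = 2, 2.0                       # number of clusters, fuzzifier
centers = data[[0, -1]].copy()      # seed one center in each cluster
for _ in range(50):
    d = np.abs(data[:, None] - centers[None, :]) + 1e-9    # distances
    w = d ** (-2.0 / (m - 1.0))
    u = w / w.sum(axis=1, keepdims=True)                   # fuzzy memberships
    centers = ((u ** m).T @ data) / (u ** m).sum(axis=0)   # weighted update

print(np.sort(centers))             # roughly the two cluster means
```

In a semi-supervised setting, the few pre-classified points would additionally constrain the membership matrix `u`, which is where the framework economizes on labeled data.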
A mixed approximation method called DQA-GMMA is presented in this paper. The approximation combines the Diagonal Quadratic Approximation (DQA) and the Generalized Method of Moving Asymptotes (GMMA), and has the flexibility to deal with both monotonic and non-monotonic design functions. The convexity and separable form of this approximation ensure the efficient solution of the constructed optimization problem by a dual approach. Truss geometry and configuration problems are solved with this method.
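As background, the classical MMA form that GMMA generalizes (symbols follow the standard notation from the structural-optimization literature, not necessarily this paper's) replaces each design function near the current iterate by a convex, separable approximation:

```latex
\tilde{f}(\mathbf{x}) = r + \sum_{i=1}^{n}
  \left( \frac{p_i}{U_i - x_i} + \frac{q_i}{x_i - L_i} \right),
```

where \(L_i < x_i < U_i\) are the moving asymptotes and \(p_i, q_i \ge 0\) are fixed from the function value and gradient at the current point. Separability is what makes the dual subproblem cheap to solve, which is the efficiency the abstract refers to.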