The accuracy of optical flow estimation algorithms has been improving steadily, as evidenced by results on the Middlebury optical flow benchmark. The typical formulation, however, has changed little since the work of Horn and Schunck. We attempt to uncover what has made recent advances possible through a thorough analysis of how the objective function, the optimization method, and modern implementation practices influence accuracy. We discover that “classical” flow formulations perform surprisingly well when combined with modern optimization and implementation techniques. One key implementation detail is the median filtering of intermediate flow fields during optimization. While this improves the robustness of classical methods, it actually leads to higher-energy solutions, meaning that these methods are not optimizing the original objective function. To understand the principles behind this phenomenon, we derive a new objective function that formalizes the median filtering heuristic. This objective function includes a non-local smoothness term that robustly integrates flow estimates over large spatial neighborhoods. By modifying this new term to include information about flow and image boundaries, we develop a method that better preserves motion details. To take advantage of the trend towards wide-screen video, we further introduce an asymmetric pyramid downsampling scheme that enables the estimation of longer-range horizontal motions. The methods are evaluated on the Middlebury, MPI Sintel, and KITTI datasets using the same parameter settings.
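As an illustrative sketch only (the notation below is ours, not necessarily the paper's exact formulation), a classical data-plus-pairwise-smoothness objective can be augmented with a weighted non-local term over a large neighborhood $N_{i,j}$:

```latex
E(\mathbf{u},\mathbf{v}) \;=\; \sum_{i,j} \Big\{ \rho_D\big(I_1(i,j) - I_2(i+u_{i,j},\, j+v_{i,j})\big)
  \;+\; \lambda\big[\rho_S(u_{i,j}-u_{i+1,j}) + \rho_S(u_{i,j}-u_{i,j+1})
  \;+\; \rho_S(v_{i,j}-v_{i+1,j}) + \rho_S(v_{i,j}-v_{i,j+1})\big] \Big\}
  \;+\; \lambda_N \sum_{i,j}\;\sum_{(i',j')\in N_{i,j}} \big( |\hat{u}_{i,j}-\hat{u}_{i',j'}| + |\hat{v}_{i,j}-\hat{v}_{i',j'}| \big)
```

Because the minimizer of a sum of absolute deviations over a neighborhood is the median, coupling an auxiliary flow field $(\hat{u},\hat{v})$ to $(u,v)$ and minimizing the non-local term reproduces median filtering of the intermediate flow fields as a formal part of the objective rather than a heuristic.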
Implementing information and communications technology (ICT) at scale requires evaluation processes that capture the impact on users as well as on the infrastructure into which it is being introduced. For older adults living with cognitive impairment, this requires evaluation that can accommodate different levels of cognitive impairment, alongside input from family and formal caregivers and from stakeholder organisations. The European Horizon 2020 project INdependent LIving support Functions for the Elderly (IN LIFE) set out to integrate 17 technologies into a single digital platform for older people living with cognitive impairment, their families, care providers and stakeholders. The IN LIFE evaluation took place across six national pilot sites and examined several variables, including impact on the users, user acceptance of the individual services and of the overall platform, and the economic case for the IN LIFE platform. The results confirmed the interest in, and need for, ICT among older adults, family caregivers, formal caregivers and stakeholders. Relative to baseline, quality of life improved and cognition stabilised; however, there was an overall reluctance to pay for the platform. The findings provide insights into existing barriers and challenges to the adoption of ICT for older people living with cognitive impairment.
Three-dimensional integration is an emerging fabrication technology that vertically stacks multiple integrated chips. The benefits include an increase in device density; much greater flexibility in routing signals, power, and clock; the ability to integrate disparate technologies; and the potential for new 3D circuit and microarchitecture organizations. This article provides a technical introduction to the technology and its impact on processor design. Although our discussions here primarily focus on high-performance processor design, most of the observations and conclusions apply to other microprocessor market segments.
The need for thermophysical properties of components and their mixtures has grown as computer simulation of processes has developed and expanded. Although equations of state require fewer input data, they are not yet generally applicable to all types of systems. Accordingly, in many cases, liquid activity models are still very much required. A long-standing disadvantage of the liquid activity method for systems containing supercritical components is overcome if the Henry constant is utilized. A van Laar-type interpolative equation provides the Henry constant in liquid mixtures from the values in the pure liquid components. The addition of a ternary interaction term to the usual binary ones provides improved MVL prediction of phase equilibria, especially VLLE involving three phases. Examination of the consistency of thermal properties is made feasible with the aid of a generalized reduced Frost-Kalkwarf vapor pressure equation. It is also useful for extending and supplementing sparse data and for predicting properties from structure and boiling point. Possible trends in the properties needed and their availability to simulators are discussed in view of available computer facilities. Invited paper presented at the Ninth Symposium on Thermophysical Properties, June 24–27, 1985, Boulder, Colorado, U.S.A.
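For reference, the Frost-Kalkwarf vapor pressure equation referred to here is commonly written in the following form (substance-specific constants $A$ through $D$; this is the standard form, not necessarily the generalized reduced variant used in the paper):

```latex
\ln P \;=\; A \;+\; \frac{B}{T} \;+\; C\ln T \;+\; \frac{D\,P}{T^{2}}
```

Since $P$ appears on both sides, the equation is implicit and is normally solved iteratively; a reduced form replaces $P$ and $T$ by the reduced variables $P_r = P/P_c$ and $T_r = T/T_c$.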
A high-resolution hard x-ray microscope is described. This system is capable of detecting line features as small as 0.6 µm in width, and resolving line pairs 1.2-µm wide and 1.2-µm apart. Three types of two-dimensional image detectors are discussed and compared for use with hard x rays in high resolution. Principles of x-ray image magnification are discussed based on x-ray optics and diffraction physics. Examples of applications are shown in microradiography with fiber reinforced composite materials (SiC in Ti3Al Nb) and in diffraction imaging (topography) with device patterns on a silicon single crystal. High-resolution tomography has now become a reality.
A ragged array is an irregularly shaped data structure that is an extremely convenient and natural means of implementing storage schemes that exploit the symmetry and sparsity of the different stiffness matrices involved in the finite-element method. Ragged arrays have the potential for improving the programmer’s productivity as well as enhancing code maintainability. Additionally, no performance degradation was detected when ragged arrays were used; the performance of the Gauss elimination procedure, implemented in C++ using ragged arrays, was comparable to the performance of the same procedure implemented in FORTRAN using traditional data structures.
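A minimal C++ sketch of the idea (ours, not the paper's code): the lower triangle of a symmetric stiffness matrix stored as a ragged array, with row i holding only its i+1 entries so the upper triangle is never stored.

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// Ragged storage for a symmetric matrix: row i holds entries K(i,0..i).
// Symmetry lets any access with j > i be redirected to the stored K(j,i).
class RaggedLower {
public:
    explicit RaggedLower(std::size_t n) : rows_(n) {
        for (std::size_t i = 0; i < n; ++i)
            rows_[i].assign(i + 1, 0.0);  // ragged: row i has i+1 entries
    }
    double& operator()(std::size_t i, std::size_t j) {
        return (j <= i) ? rows_[i][j] : rows_[j][i];  // K(i,j) == K(j,i)
    }
    std::size_t size() const { return rows_.size(); }

private:
    std::vector<std::vector<double>> rows_;  // rows of differing lengths
};

int main() {
    RaggedLower K(3);
    K(0, 0) = 4.0; K(1, 0) = 1.0; K(1, 1) = 3.0;
    K(2, 1) = 2.0; K(2, 2) = 5.0;
    std::cout << K(1, 2) << '\n';  // prints 2, fetched from the stored K(2,1)
}
```

Row lengths need not grow linearly; they can follow the matrix's skyline profile instead, which is where the storage savings for sparse stiffness matrices come from.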
The genes of the trithorax group (trxG) in Drosophila melanogaster are required to maintain the pattern of homeotic gene expression that is established early in embryogenesis by the transient expression of the segmentation genes. The precise role of each of the diverse trxG members and the functional relationships among them are not well understood. Here, we report on the isolation of the trxG gene moira (mor) and its molecular characterization. mor encodes a fruit fly homolog of the human and yeast chromatin-remodeling factors BAF170, BAF155, and SWI3. mor is widely expressed throughout development, and its 170-kDa protein product is present in many embryonic tissues. In vitro, MOR can bind to itself and it interacts with Brahma (BRM), an SWI2-SNF2 homolog, with which it is associated in embryonic nuclear extracts. The leucine zipper motif of MOR is likely to participate in self-oligomerization; the equally conserved SANT domain, for which no function is known, may be required for optimal binding to BRM. MOR thus joins BRM and Snf5-related 1 (SNR1), two known Drosophila SWI-SNF subunits that act as positive regulators of the homeotic genes. These observations provide a molecular explanation for the phenotypic and genetic relationships among several of the trxG genes by suggesting that they encode evolutionarily conserved components of a chromatin-remodeling complex.
Validation studies are a crucial requirement before implementation of new genetic typing systems for clinical diagnostics or forensic identity. Two different fluorescence-based multiplex DNA profiling systems composed of amelogenin, HumD21S11 and HumFGA (referred to as multiplex 1A), and HumD3S1358, HumD21S11 and HumFGA (multiplex 1B) have been evaluated for use in forensic identification using the Applied Biosystems Model 373A and Prism 377 DNA Sequencers, respectively. Experiments were aimed at defining the limit of target DNA required for reliable profiling, the level of degradation that would still permit amplification of the short tandem repeat (STR) loci examined, and the robustness of each locus in the multiplexes after samples were exposed to environmental insults. In addition, the specificity of the multiplexes was demonstrated using nonhuman DNAs. Forensically relevant samples such as cigarette butts, chewing gum, fingernails and envelope flaps were processed using both an organic extraction procedure and a QIAamp protocol. DNAs and resultant multiplex STR profiles were compared. The validation of the triplex STR systems was extended to include over 140 nonprobative casework specimens and was followed with a close monitoring of initial casework (over 300 exhibits). Our results document the robustness of these multiplex STR profiling systems which, when combined with other multiplex systems, could provide a power of discrimination of approximately 0.9999. 相似文献
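As an illustration of how such a combined figure arises (standard forensic statistics, not a calculation from the paper): with independent loci, the per-locus matching probabilities multiply, so the combined power of discrimination over $L$ loci is

```latex
PD_{\text{combined}} \;=\; 1 \;-\; \prod_{k=1}^{L}\big(1 - PD_k\big)
```

For example, four independent loci each with $PD_k = 0.9$ give $1 - 0.1^4 = 0.9999$.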
The objective of this study was to validate retrospective caregiver interviews for diagnosing major causes of severe neonatal illness and death. A convenience sample of 149 infants aged < 28 days with one or more suspected diagnoses of interest (low birthweight/severe malnutrition, preterm birth, birth asphyxia, birth trauma, neonatal tetanus, pneumonia, meningitis, septicaemia, diarrhoea, congenital malformation or injury) was taken from patients admitted to two hospitals in Dhaka, Bangladesh. Study paediatricians performed a standardised history and physical examination and ordered laboratory and radiographic tests according to study criteria. With a median interval of 64.5 days after death or hospital discharge, caregivers of 118 (79%) infants were interviewed about their child's illness. Using reference diagnoses based on predefined clinical and laboratory criteria, the sensitivity and specificity of particular combinations of signs (algorithms) reported by the caregivers were ascertained. Sufficient numbers of children with five reference standard diagnoses were studied to validate caregiver reports. Algorithms with sensitivity and specificity > 80% were identified for neonatal tetanus, low birthweight/severe malnutrition and preterm delivery. Algorithms with specificities > 80% for birth asphyxia and pneumonia had sensitivities < 70%, or alternatively had high sensitivity with lower specificity. In settings with limited access to medical care, retrospective caregiver interviews provide a valid means of diagnosing several of the most common causes of severe neonatal illness and death. 相似文献
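The validation criteria here are the standard ones: with true positives ($TP$), false positives ($FP$), true negatives ($TN$) and false negatives ($FN$) counted against the reference diagnoses,

```latex
\text{sensitivity} = \frac{TP}{TP + FN}, \qquad \text{specificity} = \frac{TN}{TN + FP}
```

so the >80% and <70% figures above describe, respectively, how often an algorithm detects true cases and how often it correctly rules out non-cases.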