Background removal for an identity (ID) picture consists of separating the foreground (face, body, hair, and clothes) from the background of the image. It is necessary groundwork for all modern identity documents and also offers many benefits for improving ID security. State-of-the-art image processing techniques encounter several segmentation issues and offer only partial solutions, owing to erratic components such as hair, poor contrast, luminosity variation, shadows, and color overlap between clothes and background. In this paper, a knowledge-infused approach is proposed that hybridizes smart image processing tasks and prior knowledge. The research is based on a divide-and-conquer strategy that simulates the sequential attention of a human performing manual segmentation. Knowledge is infused by considering the spatial relations between anatomic elements of the ID image (facial features, forehead, body, and hair) as well as their "signal properties". The process first determines a convex hull around the person's body that includes all of the foreground while staying very close to the contour between background and foreground. Then, a body map generated from biometric analysis, combined with an automatic grab-cut process, is applied to reach a finer segmentation. Finally, a heuristic-based post-processing step that corrects potential hair and fine-boundary issues leads to the final segmentation. Experimental results show that the proposed architecture outperforms the tested state-of-the-art methodologies, including active contours, popular generalist deep learning techniques, and two methods considered among the best for portrait segmentation. This technology has been adopted by an international company as its industrial ID foreground-extraction solution.
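The first stage of the pipeline builds a convex hull around candidate foreground points. As a minimal illustrative sketch (not the authors' implementation), the classic monotone-chain algorithm computes such a hull from a set of foreground-candidate pixel coordinates:

```python
def convex_hull(points):
    """Andrew's monotone chain: returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # last point of each half is the first point of the other half
    return lower[:-1] + upper[:-1]

# Hypothetical foreground-candidate pixels (x, y); interior points are discarded.
mask_pts = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1), (1, 2)]
hull = convex_hull(mask_pts)  # [(0, 0), (4, 0), (4, 3), (0, 3)]
```

In the actual pipeline the hull would be computed over the detected body region and then used to seed the grab-cut refinement.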
Ultralow-expansion (ULE) glasses are of special interest for temperature-stabilized systems, for example in precision metrology. Nowadays, ULE materials are mainly used in macroscopic rather than micromechanical systems, owing to a lack of technologies for the parallel fabrication of high-quality released microstructures with high accuracy. As a result, there is high demand for transferring these materials into miniaturized applications, for realistic system modeling, and for the investigation of microscopic material properties. Herein, a technological base is established for fabricating released micromechanical structures and systems with a structure height above 100 μm in ULE 7972 glass. The main fabrication parameters that are important for system design, and that thus contribute to the introduction of titanium silicate as a material for glass-based micromechanical systems, are discussed. To study the mechanical properties in combination with the respective simulation models, microcantilevers are used as basic mechanical elements to evaluate technological parameters and other impact factors. The implemented models predict the micromechanical system properties with a deviation of only ±5% and can thus effectively support micromechanical system design at an early stage of development.
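The kind of analytical model used alongside such simulations can be sketched with standard Euler-Bernoulli beam theory for the first flexural resonance of a rectangular microcantilever. This is a generic textbook formula, not the paper's calibrated model, and the material values below are illustrative placeholders rather than the measured ULE 7972 data:

```python
import math

def cantilever_f1(E, rho, L, t):
    """First flexural resonance [Hz] of a rectangular cantilever (Euler-Bernoulli).

    E: Young's modulus [Pa], rho: density [kg/m^3],
    L: length [m], t: thickness [m].
    The width cancels out of the I/A ratio for a rectangular cross-section
    (I/A = t^2 / 12), so it does not appear as a parameter.
    """
    lam1 = 1.8751  # first eigenvalue of the clamped-free beam equation
    return (lam1**2 / (2 * math.pi)) * math.sqrt(E / (12 * rho)) * t / L**2

# Illustrative values only (assumed, not taken from the paper):
f = cantilever_f1(E=67.6e9, rho=2210.0, L=1e-3, t=100e-6)  # ~90 kHz
```

Comparing such closed-form estimates against measured resonances is one common way to back out effective material parameters for released microstructures.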
In a CuZnAl alloy containing 2.2 wt% nickel, it was found that quenching followed by low-temperature ageing causes precipitation of disperse NiAl particles, which raises the characteristic temperatures of the martensitic transformation. The ease with which this phase precipitates indicates the low thermal stability of the alloy. At the same time, the precipitation of NiAl particles offers an opportunity to control the shape-recovery temperature by varying the quenching temperature, the cooling rate, and especially the ageing time, since the variation of the characteristic temperatures as a function of ageing time can be expressed by the relation T = A·t^0.5. It was also ascertained that the two-way shape memory effect, as opposed to the one-way effect, is sensitive to processes associated with alloy ageing and also to the matrix phase transition into bainite.
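The parabolic ageing law T = A·t^0.5 implies that quadrupling the ageing time only doubles the temperature shift. A small numeric sketch, with an illustrative coefficient A (not a value fitted in the paper):

```python
def temp_shift(A, t_hours):
    """Shift of a characteristic transformation temperature after ageing
    time t, following the parabolic law T = A * t**0.5."""
    return A * t_hours ** 0.5

# Illustrative coefficient only (assumed for the example):
A = 4.0  # kelvin per sqrt(hour)
shifts = [temp_shift(A, t) for t in (1, 4, 9, 16)]  # [4.0, 8.0, 12.0, 16.0]
```

The diminishing returns with ageing time are what make the ageing duration such a convenient control knob for the shape-recovery temperature.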
We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) on fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the DNA extracts were returned. The PCR was set up in 96-well microtiter plates. The methods were validated for the kits AmpFℓSTR Identifiler, SGM Plus, and Yfiler (Applied Biosystems, Foster City, CA), and GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed extraction and addition of PCR master mix for 96 samples within 3.5 h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited forensic genetic short-tandem-repeat typing can be implemented on a simple automated liquid handler, reducing manual work and increasing quality and throughput.
We present in this article a way to produce test suites for the POSIX mini-challenge based on a behavioral model of a file system manager written in UML/OCL. We illustrate the limitations of a fully automated test generation approach, which justifies the use of test scenarios as a complement to a functional testing approach. Scenarios are expressed through regular expressions describing sequences of operations, possibly punctuated by intermediate states that have to be reached by the execution of the model. Scenarios are unfolded into extended sequences of operations that are played on the model using symbolic animation techniques. We evaluated our approach by testing the conformance of two different file systems with respect to the POSIX standard: a recent Linux distribution and a customized Java implementation of POSIX, used to assess the relevance of our approach and its complementarity with a structural test generation approach.
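The unfolding of a regular-expression scenario into concrete operation sequences can be illustrated in miniature. The scenario encoding and operation names below are hypothetical, chosen only to mimic a POSIX-like call sequence; the paper's scenarios are richer (intermediate states, symbolic animation):

```python
from itertools import product

def unfold(scenario, max_rep=2):
    """Unfold a scenario into concrete operation sequences.

    A scenario is a list of steps; each step is either an operation name,
    a tuple of alternatives such as ('open', 'creat'), or (op, '*') meaning
    0..max_rep repetitions of op. This is a bounded, toy version of
    regular-expression unfolding.
    """
    choices = []
    for step in scenario:
        if isinstance(step, tuple) and step[-1] == '*':
            op = step[0]
            choices.append([(op,) * n for n in range(max_rep + 1)])
        elif isinstance(step, tuple):
            choices.append([(alt,) for alt in step])
        else:
            choices.append([(step,)])
    # Cartesian product over per-step choices, flattened into one sequence each
    return [sum(combo, ()) for combo in product(*choices)]

# Hypothetical scenario: (open | creat) write* close
seqs = unfold([('open', 'creat'), ('write', '*'), 'close'])
```

Each unfolded sequence would then be played on the model, with the oracle comparing model behavior against the implementation under test.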
We introduce a new geometric method to generate sphere packings with restricted overlap values. Sample generation is an important but time-consuming step that precedes any calculation performed with the discrete element method (DEM). At present, no software dedicated to DEM exists that is comparable to the meshing software available for finite element methods (FEM). A practical objective of the method is to build very large sphere packings (several hundreds of thousands of spheres) in a few minutes instead of the several days required by current dynamic methods. The developed algorithm uses a new geometric procedure to position polydisperse spheres very efficiently in a tetrahedral mesh. The algorithm, implemented in YADE-OPEN DEM (open-source software), consists of filling tetrahedral meshes with spheres. In addition to the features of the tetrahedral mesh, the input parameters are the minimum and maximum radii (or their size ratio) and the magnitude of the authorized overlaps. The filling procedure is stopped when a target solid fraction or number of spheres is reached. Based on this method, an efficient tool can be designed for DEM practitioners, both researchers and engineers. The generated packings can be isotropic, and the number of contacts per sphere is very high thanks to the geometric procedure. In this paper, different properties of the generated packings are characterized, and examples from real industrial problems are presented to show how the method can be used. The current C++ version of this packing algorithm is part of YADE-OPEN DEM [20], available on the web (https://yade-dem.org).
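One geometric primitive that any tetrahedron-filling procedure needs is the largest sphere that fits inside a given tetrahedron. As an illustrative sketch (the YADE implementation is in C++ and the actual filling algorithm is more elaborate), the insphere has radius 3V divided by the total face area, with its center at the face-area-weighted barycenter of the vertices:

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def norm(a):
    return math.sqrt(sum(x * x for x in a))

def insphere(p0, p1, p2, p3):
    """Center and radius of the sphere inscribed in tetrahedron p0..p3."""
    # area of the face opposite each vertex
    a0 = 0.5 * norm(cross(sub(p2, p1), sub(p3, p1)))
    a1 = 0.5 * norm(cross(sub(p2, p0), sub(p3, p0)))
    a2 = 0.5 * norm(cross(sub(p1, p0), sub(p3, p0)))
    a3 = 0.5 * norm(cross(sub(p1, p0), sub(p2, p0)))
    total = a0 + a1 + a2 + a3
    # volume via the scalar triple product
    d = cross(sub(p1, p0), sub(p2, p0))
    vol = abs(sum(x * y for x, y in zip(d, sub(p3, p0)))) / 6.0
    center = tuple((a0 * p0[i] + a1 * p1[i] + a2 * p2[i] + a3 * p3[i]) / total
                   for i in range(3))
    return center, 3.0 * vol / total

# Unit right-corner tetrahedron
c, r = insphere((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1))
```

A filler can compare this radius against the requested size range to decide whether (and how) to place a sphere in each tetrahedron, then subdivide and recurse.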
This article proposes a tabu search approach to solve a mathematical programming formulation of the linear classification problem, which consists of determining a hyperplane that separates two groups of points in ℝ^m as well as possible. The proposed tabu search is based on a non-standard formulation using linear-system infeasibility. The search space is the set of bases defined on the matrix that describes the linear system, and moves are performed by pivoting on a specified row and column. On real machine learning databases, our approach compares favorably with implementations based on parametric programming and irreducible infeasible constraint sets. Additional computational results for randomly generated instances confirm that our method provides a suitable alternative to the mixed integer programming formulation solved by a commercial code when the number of attributes m increases.
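The overall tabu-search loop (neighborhood evaluation, tabu list, aspiration criterion) can be sketched on a toy problem. This generic skeleton uses bit-flip moves on a toy objective, not the paper's basis-pivoting moves over the linear system:

```python
from collections import deque

def tabu_search(n, cost, iters=200, tenure=5):
    """Generic tabu search over n-bit vectors with single-bit-flip moves.

    Recently flipped positions are tabu, unless flipping one would improve
    the best solution found so far (aspiration criterion).
    """
    x = [0] * n
    best, best_cost = x[:], cost(x)
    tabu = deque(maxlen=tenure)
    for _ in range(iters):
        move, move_cost = None, None
        for i in range(n):
            x[i] ^= 1          # tentatively flip bit i
            c = cost(x)
            x[i] ^= 1          # undo
            if i in tabu and c >= best_cost:
                continue       # tabu and no aspiration
            if move_cost is None or c < move_cost:
                move, move_cost = i, c
        if move is None:
            break              # every move is tabu: stop
        x[move] ^= 1           # accept best admissible move, even if worse
        tabu.append(move)
        if move_cost < best_cost:
            best, best_cost = x[:], move_cost
    return best, best_cost

# Toy objective: Hamming distance to a hidden target vector
target = [1, 0, 1, 1, 0, 1]
best, c = tabu_search(6, lambda v: sum(a != b for a, b in zip(v, target)))
```

In the paper's setting the "move" is instead a simplex-style pivot between bases, and the cost measures the infeasibility of the associated linear system; the control structure is the same.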
In this paper we introduce novel regularization techniques for level set segmentation that specifically target the problem of multiphase segmentation. When the multiphase model is used to partition the image into more than two regions, a new set of issues arises, relative to the single-phase case, in terms of regularization strategies. For example, while smoothing or shrinking each contour individually may be a good model in the single-phase case, this is not necessarily true in the multiphase scenario.