Several Fusarium species produce mycotoxins during cultivation, and these toxins are found in wheat and maize grain. Since 2000, Syngenta has organised a large field survey in France and Belgium, collecting agronomic and climatic data together with grain samples for mycotoxin analysis. The importance of the agro-climatic factors, and of their interactions, for the mycotoxin levels in grain has been estimated. For deoxynivalenol (DON) in wheat, the climate around the flowering stage is the major factor; the main agronomic criteria are residue management and varietal sensitivity to this mycotoxin. For DON, zearalenone and fumonisins in maize, the climate from the flowering stage until harvest is the major factor; then, depending on the mycotoxin, the main agronomic criteria are the harvest conditions (date and grain moisture), corn borer infestation and varietal sensitivity. Over the years, the database has been used to build models that predict the mycotoxin risk before harvest. Grain purchasers enter the required agronomic data via the Syngenta Internet site and define their grain purchasing areas; they also specify the flowering period for wheat and the corn borer infestation for maize. After a calculation that integrates climatic data, the purchasers receive reports forecasting mycotoxin levels. Prediction is based on different agro-climatic statistical models configured specifically for the different production regions of France and Belgium. This approach, called Qualimètre®, was the first service in France and Belgium to forecast grain mycotoxin levels, for wheat in 2004 and for maize in 2006.
Images obtained with catadioptric sensors contain significant deformations that prevent the direct use of classical image-processing techniques. Thus Markov random fields (MRFs), whose usefulness is now well established for projective image processing, cannot be applied directly to catadioptric images because the usual neighborhood is inadequate. In this paper, we propose a new neighborhood for MRFs, based on the equivalence theorem developed for central catadioptric sensors. We show the importance of this adaptation for segmentation, image restoration and motion detection.
We propose a new encryption algorithm relying on reversible cellular automata (CA). The behavioural complexity of CA and their parallel nature make them interesting candidates for cryptography. The proposed algorithm belongs to the class of symmetric-key systems.
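As an illustration of the general idea (not the authors' exact scheme), a symmetric cipher can be built on a second-order reversible CA in the Fredkin style, where the update s[t+1] = f(s[t]) XOR s[t-1] is invertible by symmetry: s[t-1] = f(s[t]) XOR s[t+1]. In this hedged sketch, the elementary-CA rule number and the round count stand in for the secret key; all names and parameters are assumptions for illustration.

```python
# Toy second-order reversible CA cipher (Fredkin construction) -- a sketch,
# not the algorithm proposed in the paper. The state is a pair of binary
# configurations; evolution is run forward to encrypt, backward to decrypt.

def step(state, rule):
    """One elementary-CA step (radius 1, periodic boundary, Wolfram rule number)."""
    n = len(state)
    return [(rule >> (state[(i - 1) % n] * 4 + state[i] * 2 + state[(i + 1) % n])) & 1
            for i in range(n)]

def encrypt(prev, curr, rule, rounds):
    """Evolve the configuration pair forward: s[t+1] = f(s[t]) XOR s[t-1]."""
    for _ in range(rounds):
        prev, curr = curr, [f ^ p for f, p in zip(step(curr, rule), prev)]
    return prev, curr

def decrypt(prev, curr, rule, rounds):
    """Run the same recurrence backwards: s[t-1] = f(s[t]) XOR s[t+1]."""
    for _ in range(rounds):
        curr, prev = prev, [f ^ c for f, c in zip(step(prev, rule), curr)]
    return prev, curr
```

Because the backward recurrence is exactly the forward one read in reverse, decryption recovers the plaintext pair with no information loss, which is what makes reversible CA attractive for this class of cipher.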
Marcin Seredynski: He is a Ph.D. student at the University of Luxembourg and the Polish Academy of Sciences. He received his M.S. in 2004 from the Faculty of Electronics and Information Technology at the Warsaw University of Technology. His research interests include cryptography, cellular automata, nature-inspired algorithms and network security. He is currently working on intrusion-detection algorithms for ad-hoc networks.
Pascal Bouvry, Ph.D.: He earned his undergraduate degree in Economic & Social Sciences and his Master's degree in Computer Science with distinction ('91) from the University of Namur, Belgium. He went on to obtain his Ph.D. degree ('94) in Computer Science with great distinction at the University of Grenoble (INPG), France. His research at the IMAG laboratory focused on mapping and scheduling task graphs onto distributed-memory parallel computers. He then performed post-doctoral research on coordination languages and multi-agent evolutionary computing at CWI in Amsterdam. He gained industrial experience as manager of the technology consultant team for FICS in the banking sector (Brussels, Belgium), and subsequently worked as CEO and CTO of SDC (Ho Chi Minh City, Vietnam) in the telecom, semiconductor and space industries. He then moved to Montreal, Canada, as VP Production of Lat45 and Development Director for MetaSolv Software in the telecom industry. He is currently serving as Professor in the Computer Science and Communications (CSC) group of the Faculty of Sciences, Technology and Communication of the University of Luxembourg, where he heads the Intelligent & Adaptive Systems lab. His current research interests include ad-hoc networks, grid computing, evolutionary algorithms and multi-agent systems.
Over the last decade, lattice Boltzmann methods have proven to be reliable and efficient tools for the numerical simulation of complex flows. The specifics of such methods as turbulence solvers, however, are not yet completely documented. This paper provides results of direct numerical simulations (DNS), by a lattice Boltzmann scheme, of fully developed, incompressible, pressure-driven turbulence between two parallel plates. These are validated against results from simulations using a standard Chebyshev pseudo-spectral method. Detailed comparisons, in terms of classical one-point turbulence statistics at moderate Reynolds number, with both numerical and experimental data show remarkable agreement.
Consequently, in sufficiently resolved DNS computations, the choice of numerical method has no dominant effect, at least on simple statistical quantities such as the mean flow and the Reynolds stresses. Since only method-independent statistics can be considered credible, the choice of numerical method for DNS should be determined mainly by considerations of computational efficiency. The expected practical advantages of the lattice Boltzmann method over, for instance, pseudo-spectral methods are found to be significant even for the simple geometry and the moderate Reynolds number considered here. This permits the conclusion that the lattice Boltzmann approach is a promising DNS tool for incompressible turbulence.
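For readers unfamiliar with the method class being compared, the core of a lattice Boltzmann scheme is a local collide-and-stream update. The following is a minimal sketch of one BGK step on a D2Q9 lattice with periodic boundaries; it illustrates the structure of the method only, and is in no way the turbulence solver used in the paper (the grid size and relaxation time are placeholder assumptions).

```python
# Minimal D2Q9 BGK collide-and-stream step on a periodic grid (sketch only).
import numpy as np

# D2Q9 discrete velocities and quadrature weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, u):
    """Second-order Maxwellian equilibrium populations f_eq[q, x, y]."""
    cu = np.einsum('qd,xyd->qxy', c, u)            # c_q . u at each node
    usq = np.einsum('xyd,xyd->xy', u, u)           # |u|^2 at each node
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def collide_and_stream(f, tau=0.8):
    """One BGK relaxation toward equilibrium, then streaming along c_q."""
    rho = f.sum(axis=0)                                        # density
    u = np.einsum('qd,qxy->xyd', c, f) / rho[..., None]        # velocity
    f = f - (f - equilibrium(rho, u)) / tau                    # collision
    # streaming: shift each population one lattice link along its velocity
    return np.stack([np.roll(f[q], shift=c[q], axis=(0, 1)) for q in range(9)])
```

Both the collision (node-local) and the streaming (nearest-neighbour shift) are trivially parallel, which is the source of the computational-efficiency advantage the abstract refers to; mass and momentum are conserved by construction at every step.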
In this work, we introduce a new framework able to deal with reasoning that is simultaneously nonmonotonic and uncertain. In order to take into account a certainty level associated with each piece of knowledge, we use possibility theory to extend the nonmonotonic stable-model semantics of logic programs with default negation. By means of a possibility distribution, we define a clear semantics for such programs by introducing the notion of a possibilistic stable model. We also propose a syntactic process, based on a fixpoint operator, to compute these particular models, which represent the deductions of the program together with their certainty. We then show how this introduction of a certainty level on each rule of a program can be used to restore its consistency when the program has no model at all. Furthermore, we explain how possibilistic stable models can be computed using available Answer Set Programming solvers, and we describe the main lines of the system we have developed to achieve this goal.
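To make the fixpoint idea concrete, here is a hedged sketch of a possibilistic immediate-consequence operator iterated to its least fixpoint, for the simpler case of a *definite* (negation-free) program; the operator in the paper additionally handles default negation via stable models. The rule encoding and function name are illustrative assumptions.

```python
# Sketch: least fixpoint of a possibilistic immediate-consequence operator
# for a definite program. Each rule is (head, body_atoms, certainty), with
# certainty in (0, 1]. A rule fires with degree min(rule certainty,
# weakest body atom), following the min-based combination of possibility theory.

def possibilistic_fixpoint(rules):
    val = {}                     # atom -> best certainty derived so far
    changed = True
    while changed:               # iterate until no degree can be improved
        changed = False
        for head, body, alpha in rules:
            if all(b in val for b in body):
                d = min([alpha] + [val[b] for b in body])
                if d > val.get(head, 0.0):
                    val[head] = d
                    changed = True
    return val
```

Each pass can only raise an atom's degree, and degrees are drawn from the finite set of minima of rule certainties, so the iteration terminates at the least fixpoint, which assigns every derivable atom the certainty of its best derivation.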
Although particle detachment is a common phenomenon associated with most tribological processes, it is seldom the case that each piece of elemental debris can be traced to a single event. Such an association has been revealed by the systematic study of a specific system in which a graphite pin is made to rub against thoroughly polished steel. While the discontinuous nature of the transfer film allows a quantitative assessment of the transferred volume h_e by 3D optical profilometry, the linear dependence of the rate of particle detachment dh_e/dn (n = number of rubbing cycles) on the logarithm of the sliding speed v strongly suggests the existence of a particular type of stick–slip, in which each stick may lead to the detachment of a debris particle. The variations in the size of these debris particles with environment, as revealed by AFM, further suggest that the global rate of particle detachment is of the form dh_e/dn = N·x·i, where N is the number of stick–slip events per rubbing cycle, x the proportion of stick events leading to a cohesive rupture, and i the mean volume of an elemental particle. While this relation is apparently supported by most experimental results, its actual validation can only come from experiments at the level of single (nanoscale) asperities, carried out under well-controlled experimental conditions.
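The proposed rate law is a simple product, which the following numerical sketch makes explicit. All three input values are invented placeholders chosen only to show the units and the arithmetic; they are not measurements from the paper.

```python
# Illustration of the proposed rate law dh_e/dn = N * x * i
# with assumed placeholder values (NOT data from the study).
N = 200        # stick-slip events per rubbing cycle (assumed)
x = 0.05       # fraction of stick events causing cohesive rupture (assumed)
i_mean = 1e-21 # mean volume of an elemental debris particle, m^3 (assumed)

rate = N * x * i_mean   # detached volume per rubbing cycle, m^3/cycle
```

With these placeholder numbers, roughly ten elemental particles of 1e-21 m^3 detach per cycle; the point of the law is that environment acts through i_mean while speed acts through the stick–slip statistics N and x.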