The dietary selenium (Se) intake in Belgium has been re-evaluated. Duplicate meal collection, wet acid digestion, and flow injection hydride generation atomic absorption spectrometry were used as techniques. The daily intake ranged from 28.4 μg (Liège, Walloon part of the country) to 61.1 μg (Vilvoorde, central part of the country). Compared with intakes recently published for other countries, the current Belgian value falls within the intermediate range of Se intake.
The current daily dietary selenium intake in Belgium, determined by duplicate portion sampling
Summary: The dietary selenium intake in Belgium has been re-determined. The techniques used were duplicate sampling of meals, wet digestion, and flow injection AAS with hydride generation. The daily intake varies between 28.4 and 61.1 μg. Compared with data recently published from other countries, the current Belgian values lie in the intermediate range.
We consider deterministic distributed broadcasting on multiple access channels in the framework of adversarial queuing. Packets
are injected dynamically by an adversary that is constrained by the injection rate and the number of packets that may be injected
simultaneously; the latter we call burstiness. A protocol is stable when the number of packets in queues at the stations stays
bounded. The maximum injection rate that a protocol can handle in a stable manner is called the throughput of the protocol.
We consider adversaries of injection rate 1, that is, of one packet per round, to address the question of whether the maximum throughput
of 1 can be achieved, and if so, with what quality of service. We develop a protocol that achieves throughput 1 for any number
of stations against leaky-bucket adversaries. The protocol has
O(n² + burstiness) packets queued simultaneously at any time, where n is the number of stations; this upper bound is proved to be best possible. A protocol is called fair when each packet is
eventually broadcast. We show that no protocol can be both stable and fair for a system of at least two stations against leaky-bucket adversaries. We study in detail small systems of exactly
two and three stations against window adversaries to exhibit differences in quality of broadcast among classes of protocols.
A protocol is said to have fair latency if the waiting time of packets is O(burstiness). For two stations, we show that fair latency can be achieved by a full sensing protocol, while there is no stable acknowledgment-based protocol. For three stations, we show that fair latency can be achieved by a general protocol, while no full sensing protocol can be stable. Finally, we show that protocols that either are fair or do not let queue sizes affect the order of transmissions cannot be stable in systems of at least four stations against window adversaries.
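The leaky-bucket adversary constraint described above (with injection rate ρ and burstiness b, at most ρ·T + b packets may be injected in any window of T rounds) can be sketched as an admissibility check. This is a minimal illustrative sketch; the function name and the rate-1 default are assumptions, not taken from the paper:

```python
def is_leaky_bucket_admissible(injections, b, rho=1.0):
    """Check whether an injection pattern is admissible for a
    leaky-bucket adversary with rate rho and burstiness b.

    injections[t] = number of packets injected in round t.
    Admissible iff every contiguous window of T rounds contains
    at most rho*T + b packets.
    """
    n = len(injections)
    for start in range(n):
        total = 0
        for end in range(start, n):
            total += injections[end]
            if total > rho * (end - start + 1) + b:
                return False
    return True
```

For example, with rate 1 and burstiness 2, injecting one packet per round is admissible, while injecting four packets in a single round is not.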
This paper addresses the constrained motion planning problem for nonholonomic systems represented by driftless control systems with output. The problem consists of defining a control function that drives the system output to a desired point at a given time instant, while the state and control variables remain within prescribed bounds over the control horizon. The state and control constraints are handled by extending the control system with a pair of state equations driven by the violation of constraints, and by adding regularizing perturbations. For the regularized system, a Jacobian motion planning algorithm, called imbalanced, is designed. Solutions of example constrained motion planning problems for the rolling ball illustrate the theoretical concepts.
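The core idea of Jacobian motion planning, inverting an output map by a Newton-like pseudoinverse iteration, can be illustrated on a toy problem. This sketch is an assumption for illustration only: the output map, step size, and numerical Jacobian are hypothetical, and it does not implement the paper's imbalanced algorithm or its constraint-extension machinery:

```python
import numpy as np

def jacobian_motion_planning(y, y_target, x0, gamma=0.5, tol=1e-8, max_iter=200):
    """Iteratively adjust a parameter vector x so that the output map
    y(x) reaches y_target, using a damped pseudoinverse Newton step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        err = y(x) - y_target
        if np.linalg.norm(err) < tol:
            break
        # central-difference numerical Jacobian of the output map
        h = 1e-6
        J = np.zeros((len(err), len(x)))
        for j in range(len(x)):
            dx = np.zeros_like(x)
            dx[j] = h
            J[:, j] = (y(x + dx) - y(x - dx)) / (2 * h)
        # damped Moore-Penrose pseudoinverse update
        x = x - gamma * np.linalg.pinv(J) @ err
    return x
```

Given a toy output map such as y(x) = (x₀², x₀ + x₁), the iteration converges from a nearby initial guess to parameters whose output matches the target.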
One of the first steps in drug discovery involves identification of novel compounds that interfere with therapeutically relevant biological processes.
Identification of ‘lead’ compounds in all therapeutic areas included in a drug discovery program requires labor-intensive evaluation of numerous samples in a battery of therapy-targeted biological assays. To accelerate the identification of ‘lead’ compounds, Janssen Research Foundation (JRF) previously developed an automated high throughput screening (HTS) system based on the unattended operation of a custom Zymark tracked robot. Automation of enzymatic and cellular assays was realized with this system, adapted to the handling of microtiter plates. The microtiter plate technology is the basis of our screening. All compounds within our chemical library are stored and distributed in Micronic tube racks or microtiter plates for screening. An efficient in-house-developed, mainframe-based laboratory information management system supported all screening activities. Our experience at JRF has shown that the preparation of test compounds and serial dilutions has been a rate-limiting step in the overall screening process. In order to increase compound throughput, it was necessary both to optimize the robotized assays and to automate the compound supply processes. In HTS applications, one of the primary requirements is highly accurate and precise pipetting of microliter volumes of samples into microplates. The SciClone™ is an automated liquid handling workstation capable of both 96- and 384-channel high precision pipetting. For high throughput applications, the SciClone™ instrumentation is able to pipette a variety of liquid solutions with a high degree of accuracy and precision between microplates (inter-plate variability) and tip-to-tip (intra-plate variability) within a single plate. The focus of this presentation is to review the liquid handling performance of the SciClone™ system as a multipurpose instrument for pipetting aqueous or organic solutions, and virus suspensions, into 96- and 384-well microplates.
The capabilities of the system and the resulting benefits for our screening activities will be described.
Outsourcing continues to capture the attention of researchers as more companies move to outsourcing models as part of their
business practice. Two areas frequently researched and reported in the literature are the reasons why a company decides to
outsource, and outsourcing success factors. This paper describes an in-depth, longitudinal case study that explores both the
reasons why the company decided to outsource and factors that impact on success. The paper describes how Alpha, a very large
Australian communications company, approached outsourcing and how its approach matured over a period of 9 years. The paper
concludes that although a number of reasons are proposed for a company's decision to outsource, lowering costs was the predominant
driver in this case. We also describe other factors identified as important for outsourcing success such as how contracts
are implemented, the type of outsourcing partner arrangement, and outsourcing vendor capabilities.
Video visualization is a computation process that extracts meaningful information from original video data sets and conveys the extracted information to users in appropriate visual representations. This paper presents a broad treatment of the subject, following a typical research pipeline involving concept formulation, system development, a path-finding user study, and a field trial with real application data. In particular, we have conducted a fundamental study on the visualization of motion events in videos. We have, for the first time, deployed flow visualization techniques in video visualization. We have compared the effectiveness of different abstract visual representations of videos. We have conducted a user study to examine whether users are able to learn to recognize visual signatures of motions, and to assist in the evaluation of different visualization techniques. We have applied our understanding and the developed techniques to a set of application video clips. Our study has demonstrated that video visualization is both technically feasible and cost-effective. It has provided the first set of evidence confirming that ordinary users can become accustomed to the visual features depicted in video visualizations, and can learn to recognize visual signatures of a variety of motion events.
The modelling of plasma formation during microwave breakdown is a difficult task because of the strong non-linear coupling between Maxwell's equations and plasma equations, and of the large plasma density gradients that form during breakdown. An original Finite Volume Time Domain (FVTD) method has been developed to solve Maxwell's equations coupled with a simplified fluid plasma model and is described in this paper. This method is illustrated with the study of the shielding of a metallic aperture by the plasma generated by an incident high power electromagnetic wave. Typical results obtained with the FVTD method for this shielding problem are shown.
Feature selection is a problem of finding relevant features. When the number of features of a dataset is large and its number of patterns is huge, an effective method of feature selection can help in dimensionality reduction. An incremental probabilistic algorithm is designed and implemented as an alternative to the exhaustive and heuristic approaches. Theoretical analysis is given to support the idea of the probabilistic algorithm in finding an optimal or near-optimal subset of features. Experimental results suggest that (1) the probabilistic algorithm is effective in obtaining optimal/suboptimal feature subsets; (2) its incremental version expedites feature selection further when the number of patterns is large and can scale up without sacrificing the quality of selected features.
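The idea of a probabilistic (random-sampling) alternative to exhaustive and heuristic feature selection can be sketched as follows. This is a minimal generic sketch under stated assumptions: the scoring callback and the smaller-subset preference are illustrative and do not reproduce the paper's algorithm or its incremental variant:

```python
import random

def probabilistic_feature_selection(n_features, evaluate, n_trials=1000, seed=0):
    """Sample random feature subsets and keep the best one found.

    `evaluate(subset)` returns a quality score (higher is better).
    Among equally good subsets, smaller ones are preferred, so the
    search tends toward a near-minimal relevant subset.
    """
    rng = random.Random(seed)
    best_subset = list(range(n_features))  # start from the full feature set
    best_score = evaluate(best_subset)
    for _ in range(n_trials):
        k = rng.randint(1, n_features)
        subset = sorted(rng.sample(range(n_features), k))
        score = evaluate(subset)
        if score > best_score or (score == best_score and len(subset) < len(best_subset)):
            best_subset, best_score = subset, score
    return best_subset, best_score
```

With a scoring function that rewards only the truly relevant features, random sampling quickly recovers a subset containing them, illustrating why such probabilistic search can reach optimal or near-optimal subsets without enumerating all 2ⁿ candidates.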