Similar Documents
20 similar documents found.
1.
Fluorescence resonance energy transfer (FRET) is widely used in spectral codification of information at the molecular level, and can be used to generate several layers of information on a DNA chip. We used two oligonucleotides (probes) labeled with different donor (harvesting) molecules in hybridization experiments with complementary oligonucleotides labeled with four different acceptors (targets). By looking at the fluorescence response of the sample after “specific” excitation of each donor molecule (by “specific” we mean a wavelength where one of the donors is predominantly excited), we investigated the possibility of identifying the complementary oligonucleotide hybridized to the probe in mixtures containing two donor probe/acceptor target pairs. In most samples (13 out of the 16 possible), it is straightforward to identify the complementary target that is hybridized to the excited donor probe in the mixtures. The major limitations of the chosen system arise when very different concentrations of donor probe/acceptor target pairs are present in the same sample.

2.
Recent hardware advances reduce to one common denominator: The “Miracle” of the Chip. Large-scale integration continues its astounding progress and commercially available densities of only two years ago are already obsolete; the 100 K chip is said to be on several drawing boards. Future system architectures will exhibit increasing parallelism and modularity. An avalanche of new “hard-wired” components will include hierarchical and associative memories, pipeline and array processors; data security will be provided through cryptographic hardware. Finally, computer networks will eclipse the meteoric rise of their time-sharing ancestors as microprocessors take over the burdens of protocols and network operating systems.

3.
DNA-based fluorescent microarrays (sometimes called “biochips”) are fast becoming the preferred tool for studying a variety of complex biochemical phenomena ranging from multiplex mutation detection, to gene mapping and expression monitoring, to high-throughput screening for new drug candidates. Fluorescence is a low-energy phenomenon. The need for rapid, high-resolution, wide-field imaging of fluorescent microarrays calls for a specialized microscope architecture. We describe the design of a “Flying Objective” epi-fluorescence microscope that is ideally suited to this application, and compare the performance of this novel instrument with two other commercial epi-fluorescence microscopes designed to read DNA microarrays.

4.
5.
A.  M.  A.  M.  A.  M.  R. Mayrhofer 《Pervasive and Mobile Computing》2008,4(3):448-479
An integrated, autonomous stick-on computing platform is proposed, consisting of (i) the Peer-it stick-on, multi-sensor, multi-actuator computer hardware, (ii) the Peer-it component-based software framework, and (iii) the Peer-it profile markup language PeerML, supporting spontaneous interaction among such platforms. The platform implements Peer-to-Peer computing principles in a self-contained, miniaturized, universal and scalable way, giving rise to application scenarios in which real-world artefacts such as machines, tools or appliances (literally every thing), equipped with Peer-it technology, can operate in spontaneously interacting, goal-oriented ensembles. Technically, preferences (like capabilities and goals) and context (like time, geo-position, owner, environmental conditions, etc.) of peers are kept as a profile encoded in PeerML in the local memory of Peer-its, and carried along wherever they move in space. Once peers come into spatial proximity of each other, profiles are exchanged via wireless communication, and the “similarity” of preferences is analyzed. In the case of “matching” preferences, an associated application is notified on both peers. Besides a fully functional autonomous hardware platform (the Peer-it stick-on computer) integrating multiple sensors, actuator arrays and wireless communication technologies, a low-memory-footprint, OSGi-compliant Peer-it software framework has been implemented. We demonstrate in a flexible manufacturing systems (FMS) scenario how the Peer-it technology can improve over centralized FMSs with respect to fault tolerance, scalability, flexibility in reconfiguration, productivity and efficiency.
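A minimal sketch of the proximity-triggered profile matching described above, assuming (since the abstract does not specify them) a flat key/value profile, a simple shared-key similarity measure and a `notify_application` callback; none of these names come from PeerML or the Peer-it framework.

```python
# Illustrative sketch (not the authors' implementation): how two exchanged Peer-it
# profiles might be compared once peers come into proximity. Field names, the
# similarity rule and the threshold are assumptions for illustration only.

def similarity(profile_a: dict, profile_b: dict) -> float:
    """Fraction of preference keys on which both peers agree."""
    keys = set(profile_a) | set(profile_b)
    if not keys:
        return 0.0
    matches = sum(1 for k in keys if profile_a.get(k) == profile_b.get(k))
    return matches / len(keys)

def notify_application(local, remote):
    print("Matching preferences found:", local, remote)

def on_proximity(local_profile: dict, remote_profile: dict, threshold: float = 0.5):
    """Called after two peers have exchanged profiles over the wireless link."""
    if similarity(local_profile, remote_profile) >= threshold:
        notify_application(local_profile, remote_profile)  # hypothetical callback

# Example: a tool looking for a machine on the same line that shares its goal.
tool = {"goal": "drill", "owner": "line-3"}
machine = {"goal": "drill", "owner": "line-3", "status": "idle"}
on_proximity(tool, machine)
```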

6.
Induction based fluidics (IBF), a new, simple, patented approach for transporting liquids in the micro and the macro world, is discussed. Electric fields are shown to energize one or more liquids in one or more containers to execute an array of purposes. IBF is shown to be unique in energizing N liquids inductively in simple off-the-shelf devices. We discuss calibration and other issues as we demonstrate how simple devices can dispense nanoliters and microliters with high precision and accuracy. Furthermore, we show preliminary single and eight-channel data for the Zip Tip™ made by Millipore, where the device transports liquids “electrically.” We briefly consider how such new devices, “electric” Zip Tips™, might automate desalting and the placement of digests for MALDI-TOF analysis.

7.
Visualization and Analysis of Gene Expression Data
Producing microarray data starts with scanning the glass, gel or plastic slides with a specialized scanner to obtain digital images of the results of an experiment after hybridization. With the help of image analysis software the DNA expression levels are then quantified. After the image processing and analysis step is completed we end up with a large number of quantified gene expression values. The data typically represent hundreds or thousands, in certain cases tens of thousands, of gene expressions across multiple experiments. To make sense of this much information, various visualization and statistical analysis techniques are indispensable. One of the most typical microarray data analysis goals is to find statistically significant up- or down-regulated genes, in other words outliers or ‘interestingly’ behaving genes in the data. Other possible goals could be to find functional groupings of genes by discovering similarity or dissimilarity among gene expression profiles, or predicting the biochemical and physiological pathways of previously uncharacterized genes.
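As a toy illustration of the first analysis goal mentioned above (flagging up- or down-regulated outlier genes), the following sketch scores each gene by the z-score of its mean value across experiments; the synthetic data, the 3-sigma cutoff and the use of a plain z-score are assumptions for illustration, not a prescribed analysis pipeline.

```python
# Minimal sketch: flag genes whose expression is unusually high or low across experiments.
import numpy as np

rng = np.random.default_rng(0)
expression = rng.normal(0.0, 1.0, size=(1000, 6))   # synthetic: 1000 genes x 6 experiments
expression[42] += 3.0                                # make one gene strongly up-regulated

gene_means = expression.mean(axis=1)
z = (gene_means - gene_means.mean()) / gene_means.std()

outliers = np.where(np.abs(z) > 3.0)[0]              # 'interestingly' behaving genes
print("Candidate up/down-regulated genes:", outliers)
```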

8.
Technology benefits last for years beyond the horizon of a standard ROI valuation analysis but are rarely enumerated. In this paper, we utilize a nonconstant dividend growth model to “capture” lasting marginal productivity gained through the “reinvestment” of labor capital, rather than the standard one-time gain of reducing the labor force to realize labor productivity gains. This innovative methodology for capturing the productivity value of retained employees enables the valuation of continuing marginal productivity gains and the management of workload for the affected employees at Intel. This methodology is applied to the valuation of a standard operating system and hardware upgrade.
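A hedged sketch of how a nonconstant (two-stage) dividend growth valuation could be applied to a recurring productivity gain, in the spirit of the abstract; the cash-flow figure, growth rates, discount rate and horizon below are invented for illustration and are not the paper's numbers.

```python
# Two-stage (nonconstant) growth valuation of an annual productivity gain.
def two_stage_value(gain, g_high, years_high, g_stable, r):
    """Present value of a gain growing at g_high for years_high, then at g_stable forever."""
    pv, cash = 0.0, gain
    for t in range(1, years_high + 1):
        cash *= (1 + g_high)
        pv += cash / (1 + r) ** t
    terminal = cash * (1 + g_stable) / (r - g_stable)   # Gordon-growth terminal value
    return pv + terminal / (1 + r) ** years_high

# e.g. a hypothetical $500k first-year labour-productivity gain from an OS/hardware upgrade
print(round(two_stage_value(500_000, g_high=0.10, years_high=3, g_stable=0.02, r=0.12)))
```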

9.
The results of a study of elementary and secondary school usage of microcomputers for one southern state are reported. In addition to summarizing the availability of hardware, software, and personnel resources, evidence is presented to suggest that various sub-groups of students, particularly the “regular” classroom student and the socioeconomically disadvantaged, are severely limited in their access to computer technology, and that access, when afforded, is restricted to drill and practice applications. School-, district-, and state-level recommendations are provided to address these inequities.

10.
The development of rapid and ultra-sensitive detection technologies is a long-standing goal for researchers in the bio-detection fields. Nanowire field-effect transistor (nano-FET) devices have shown great promise in label-free and ultra-sensitive detection of biological agents. However, critical application problems in using this technology have not been addressed, particularly the difficulty of modifying the FET sensing surface for various targets and the low detection specificity in real biological samples. A novel molecular signal transduction system reported herein overcomes such problems. With this system, various complicated bio-molecular interactions are “translated” into simple signal molecules with universal sequences. These signal molecules are captured on the sensing surface of nano-FET devices through sequence-specific recognition and generate detectable electronic signals. Using this system, nano-FET devices become universal for detecting almost any bio-agent. Staphylococcus aureus was used as the target to demonstrate the detection of DNA, RNA and protein on the same nano-FET device. Detection sensitivity at the 25 fM level was achieved in pure samples.

11.
Automatic detection of the level of human interest is of high relevance for many technical applications, such as automatic customer care or tutoring systems. However, the recognition of spontaneous interest in natural conversations, independently of the subject, remains a challenge. Identification of human affective states relying on single modalities only is often impossible, even for humans, since different modalities contain partially disjunctive cues. Multimodal approaches to human affect recognition are generally shown to boost recognition performance, yet are usually evaluated only in restrictive laboratory settings. Herein we introduce a fully automatic processing combination of Active-Appearance-Model-based facial expression, vision-based eye-activity estimation, acoustic features, linguistic analysis, non-linguistic vocalisations, and temporal context information in an early feature fusion process. We provide detailed subject-independent results for classification and regression of the Level of Interest using Support-Vector Machines on an audiovisual interest corpus (AVIC) consisting of spontaneous, conversational speech, demonstrating the “theoretical” effectiveness of the approach. Further, to evaluate the approach with regard to real-life usability, a user study is conducted as proof of “practical” effectiveness.
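The following sketch illustrates the early (feature-level) fusion idea described above: per-modality feature vectors are simply concatenated into one vector before a single Support-Vector Regressor estimates the Level of Interest. The feature dimensions and data are synthetic assumptions; this is not the AVIC corpus or the authors' feature extractors (scikit-learn stands in for the SVM implementation).

```python
# Early feature fusion for Level-of-Interest regression with an SVR.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n = 200
face  = rng.normal(size=(n, 20))    # stand-in for Active-Appearance-Model facial features
eyes  = rng.normal(size=(n, 4))     # stand-in for vision-based eye-activity estimates
audio = rng.normal(size=(n, 30))    # stand-in for acoustic features
words = rng.normal(size=(n, 10))    # stand-in for linguistic / vocalisation features
level_of_interest = rng.uniform(0, 1, size=n)

fused = np.hstack([face, eyes, audio, words])   # early (feature-level) fusion
model = SVR(kernel="rbf").fit(fused, level_of_interest)
print("Predicted LOI for first clip:", model.predict(fused[:1])[0])
```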

12.
How may we discriminate between the multitude of point-to-point communication facilities currently available? To take just one aspect of communication, how can we assess the fluency of coordination that results from using some communication technology? This paper describes two groups of measures with this general purpose. The measures described have been devised to be used in a particular approach to evaluation for the design of communication systems that borrows from experimental and ethnographic methods. This approach is promoted as a practical and rigorous way of assessing design alternatives. The first group of measures are subjective ratings that assess someone's awareness of the attentional status of their conversational partner; such awareness is necessary for the successful coordination of conversation. The rating scales are shown to be sensitive in that they distinguish between video- and audio-mediated conversation in a short experiment. The second group comprises measures derived from video records of communicative behaviour using “activity set” analysis, which can be used to assess coordination in communication directly. An activity set is a mutually exclusive and exhaustive set of behavioural states. A publicly available tool, Action Recorder, makes it possible to score the tapes in near real time. “Simple statistics” are extracted from a single activity set; examples are the proportion of time spent looking towards the video monitor and the average duration of these glances. “Contingent statistics” are extracted from two or more activity sets, for example the proportion of time both members of a pair are looking towards their video monitors. A way of assessing the synchronization evident in two people's behaviour is presented that makes use of these contingent statistics. Inter-observer reliabilities are given for all the measures generated.
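A small sketch of the kind of “simple” and “contingent” statistics described above, computed from per-second activity-set codes for two conversational partners; the binary looking-at-monitor coding, one-second sampling and the example data are assumptions made for illustration, not the Action Recorder format.

```python
# Simple and contingent statistics from two activity sets.
import numpy as np

# 1 = looking at the video monitor, 0 = looking elsewhere, one sample per second
person_a = np.array([1, 1, 0, 1, 1, 1, 0, 0, 1, 1])
person_b = np.array([1, 0, 0, 1, 1, 0, 0, 1, 1, 1])

# Simple statistics: computed from a single activity set
prop_a = person_a.mean()                                # proportion of time A looks at monitor
edges = np.diff(np.concatenate(([0], person_a, [0])))
durations = np.where(edges == -1)[0] - np.where(edges == 1)[0]
print("A looks at monitor", prop_a, "of the time; mean glance:", durations.mean(), "s")

# Contingent statistic: computed from two activity sets jointly
both_looking = (person_a & person_b).mean()
print("Both looking at their monitors", both_looking, "of the time")
```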

13.
Microelectronic chip-based systems are available for a wide variety of applications. Many of these systems rely on NON-INTEGRATED optical detection schemes to collect data from the chips. A magnetoresistive detection format, however, can be completely integrated. This paper presents some basic concepts for optimizing micron-sized magnetoresistive sensors for single nucleotide polymorphism (SNP) analysis and DNA diagnostics. Magnetoresistive sensors are nano-fabricated thin film resistors whose resistance changes as a function of magnetic field. The magnetic DNA assay replaces the EXTERNAL optical reader apparatus with an INTEGRATED magnetoresistive sensor at each “pixel” of the array. The EXTERNAL light source can be replaced by an INTEGRATED magnetic field generation strap, or by a simple external coil. Magnetoresistive pixel sizes could presently be ~3 microns on a side, and decrease to ~100 nm with technological improvements. It is shown that, taking reasonable values for critical parameters, a signal-to-noise ratio of 10,000:1 is achievable using 10 nm paramagnetic beads as the assay label. As early demonstrations of the feasibility of this system, data have been collected using NVE's magnetoresistive sensors (non-optimized) to easily detect single micron-sized magnetic beads. Presently NVE is working on 1 million bit arrays of magnetoresistive sensors which are being fabricated into magnetoresistive random access memory (MRAM) chips. These arrays have many similarities to what is required for the magnetoresistive DNA assay, including sub-micron bit size and single bit addressability.

14.
The paper presents an approach to characterizing a “stop–flow” mode of sensor array operation. The considered operation mode involves three successive phases of sensor exposure: flow (in a stream of measured gas), stop (in zero-flow conditions) and recovery (in a stream of pure air). The mode was characterized by describing the distribution of information relevant for the classification of measured gases within the response of the sensor array. The input data for the classifier were sets of sensor output values acquired at discrete time points of the measurement. Discriminant Function Analysis was used for data analysis. Organic vapours of ethanol, acetic acid and ethyl acetate in air were measured and classified. Our attention was focused on data sets which allowed for 100% correct recognition of the analytes. The number, size and composition of those data sets were examined versus the time of the sensor array response. This methodology made it possible to observe the distribution of classification-relevant information in the response of the sensor array obtained in the “stop–flow” mode, and hence provided a characterization of this mode.
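As a rough illustration of the analysis described above, the sketch below builds feature vectors from sensor outputs sampled at discrete time points of a measurement cycle and classifies the three analytes with linear discriminant analysis (a common realization of Discriminant Function Analysis, here via scikit-learn); the array size, number of time points and the synthetic responses are assumptions, not the paper's data.

```python
# Classify sensor-array responses sampled at discrete time points with LDA.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
analytes = ["ethanol", "acetic acid", "ethyl acetate"]
n_sensors, n_timepoints, n_reps = 6, 10, 20

X, y = [], []
for label, analyte in enumerate(analytes):
    base = rng.normal(loc=label + 1.0, scale=0.1, size=(n_sensors, n_timepoints))
    for _ in range(n_reps):
        response = base + rng.normal(scale=0.2, size=base.shape)   # one measurement cycle
        X.append(response.ravel())                                  # sensors x time -> vector
        y.append(label)

clf = LinearDiscriminantAnalysis().fit(np.array(X), np.array(y))
print("Training accuracy:", clf.score(np.array(X), np.array(y)))
```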

15.
Many of the problems addressed through engineering analysis include a set of regulatory (or other) probabilistic requirements that must be demonstrated with some degree of confidence through the analysis. Problems cast in this environment can pose new challenges for computational analyses in both model validation and model-based prediction. The “regulatory problems” given for the “Sandia challenge problems exercise”, while relatively simple, provide an opportunity to demonstrate methods that address these challenges. This paper describes and illustrates methods that can be useful in analysis of the regulatory problem. Specifically, we discuss:
(1) an approach for quantifying variability and uncertainty separately to assess the regulatory requirements and provide a statement of confidence; and
(2) a general validation metric to focus the validation process on a specific range of the predictive distribution (the predictions near the regulatory threshold).
These methods are illustrated using the challenge problems. Solutions are provided for both the static frame and structural dynamics problems.
Keywords: Regulatory problem; Calibration; Model validation; Model-based prediction
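A hedged sketch of point (1) above: a nested (double-loop) Monte Carlo separates epistemic uncertainty (outer loop) from aleatory variability (inner loop), so each outer sample yields one estimate of the probability of exceeding a regulatory threshold, and the spread of those estimates supports a statement of confidence. The response model, distributions, threshold and sample sizes are all invented for illustration and are not the Sandia challenge problems.

```python
# Nested Monte Carlo: outer loop = epistemic uncertainty, inner loop = aleatory variability.
import numpy as np

rng = np.random.default_rng(3)
THRESHOLD = 3.0                 # hypothetical regulatory limit on the response
N_OUTER, N_INNER = 200, 2000

exceed_probs = []
for _ in range(N_OUTER):
    mean_load = rng.uniform(0.8, 1.2)             # epistemic: poorly known model parameter
    loads = rng.normal(mean_load, 0.3, N_INNER)   # aleatory: unit-to-unit variability
    response = 2.0 * loads                        # stand-in for the physics model
    exceed_probs.append(np.mean(response > THRESHOLD))

# Statement of confidence: 95th percentile of the exceedance-probability estimates
print("P(exceed) is below", np.percentile(exceed_probs, 95), "with 95% confidence")
```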

16.
Push technology automates the information delivery process by not requiring users to request the information they need. Wireless has experienced explosive growth in recent years, and “push” will be the predominant wireless service delivery paradigm of the future. A wide variety of services, alerts and messages, such as promotional content, will be delivered to consumers’ phones or PDAs. However, pushing information to a wireless device can be challenging because of intermittent communication links, resource constraints on wireless devices, and limited bandwidth. This paper explores an efficient multicasting mechanism that “pushes” pre-specified information to groups of wireless devices. The mechanism operates with limited bandwidth and also overcomes the connectivity problem. Based on the above concept, we have designed and implemented a system to multicast sales information via wireless technology. The system is message-oriented and JMS-compliant.
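The system itself is JMS-based; since the abstract gives no code, the sketch below illustrates only the underlying topic-based push/multicast idea with a toy in-process broker in Python. The class and topic names are invented, and this is not the authors' JMS implementation.

```python
# Toy topic-based push: subscribers receive messages without requesting them.
from collections import defaultdict

class PushBroker:
    def __init__(self):
        self.subscribers = defaultdict(list)      # topic -> list of delivery callbacks

    def subscribe(self, topic, deliver):
        self.subscribers[topic].append(deliver)

    def push(self, topic, message):
        for deliver in self.subscribers[topic]:   # multicast to the whole group
            deliver(message)

broker = PushBroker()
broker.subscribe("sales/promotions", lambda m: print("phone-1 received:", m))
broker.subscribe("sales/promotions", lambda m: print("pda-7 received:", m))
broker.push("sales/promotions", "20% off today only")
```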

17.
The parallel resources time and hardware, and the complexity classes defined by them, are studied using the aggregate model. The equivalence of complexity classes defined by sequential space and uniform aggregate hardware is established. Aggregate time is related to (bounded fan-in) circuit depth and, similarly, aggregate hardware is related to circuit width. Interrelationships between aggregate time and hardware follow as corollaries. Aggregate time is related to the sequential resource reversal. Simultaneous relationships from aggregate hardware and time to sequential space and reversal are shown (and conversely), and these are used as evidence for an “extended parallel computation thesis.” These simultaneous relationships provide new characterizations for the simultaneous parallel complexity class NC and for the complementary class SC. The evaluation of monotone planar circuits is shown to be in NC, in fact in LOGCFL.

18.
The emergence of parallel array processing, both in software methodology and in hardware technology, opens new avenues for the implementation and optimization of systems for interactive computer graphics. The Q-spline interpolation method is presented, designed for incremental curve definition, local curve modification, “on-the-curve” control points and computational efficiency in an array processing environment. The implementation and performance of the algorithms in the environment of a general-purpose interactive computer graphics system are described.
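The Q-spline formulation itself is not given in the abstract, so the sketch below evaluates a Catmull-Rom segment, a different but well-known interpolating scheme, purely to illustrate what “on-the-curve” control points mean (the curve passes through the interior control points); the example points are arbitrary and this is not the paper's method.

```python
# Catmull-Rom segment: an interpolating spline whose control points lie on the curve.
import numpy as np

def catmull_rom(p0, p1, p2, p3, t):
    """Point on the segment between p1 and p2 for t in [0, 1]; passes through p1 and p2."""
    t2, t3 = t * t, t * t * t
    return 0.5 * ((2 * p1)
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t3)

pts = [np.array(p, float) for p in [(0, 0), (1, 2), (3, 3), (4, 1)]]
curve = [catmull_rom(*pts, t) for t in np.linspace(0, 1, 5)]
print(curve[0], curve[-1])   # equals the two middle control points (1, 2) and (3, 3)
```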

19.
“Walkthrough” and “Jogthrough” techniques are well-known, expert-based methodologies for the evaluation of user interface design. In this paper we describe the use of the “Graphical” Jogthrough method for evaluating the interface design of the Network Simulator, an educational simulation program that enables users to virtually build a computer network, install hardware and software components, make the necessary settings and test the functionality of the network. Graphical Jogthrough is a further modification of a typical Jogthrough method, in which evaluators' ratings produce evidence in the form of a graph presenting the estimated proportion of users who use the interface effectively versus the time they have to work with it in order to reach that effectiveness. We comment on the question: “What are the possible benefits and limitations of the Graphical Jogthrough method when applied in the case of educational software interface design?” We present the results of the evaluation session, and concluding from our experience we argue that the method could offer designers quantitative and qualitative data for formulating a useful (though rough in some aspects) estimate of the novice-becoming-expert pace that end users might follow when working with the evaluated interface.

20.
As digital interfaces increasingly mediate our access to information, the design of these interfaces becomes increasingly important. Designing digital interfaces requires writers to make rhetorical choices that are sometimes technical in nature and often correspond with principles taught in the computer science subfield of human-computer interaction. We propose that an HCI-informed writing pedagogy can complicate for both writing and computer science students the important role audience should play when designing traditional and digital interfaces. Although it is a subtle shift in many ways, this pedagogy seemed to complicate student understanding of the relationship between audience and the texts/interfaces they created: it was not just the “human” (beliefs, attitudes, values, demographics) or the “computer” (the software or hardware or other types of mediation) that mattered but rather the “interaction” between the two. First we explore some of the ways in which writing code and writing prose have merged and paved the way for an HCI-informed writing pedagogy. Next we examine some parallels between human-computer interaction principles and composition principles. Finally, we refer to assignments, student responses, and anecdotal evidence from our classes where an HCI-informed writing pedagogy drew—or could have drawn—student attention more acutely to various audience-related technical and rhetorical interface design choices.
