Abstract: | A series of projects by the MIT Media Lab's Responsive Environments Group explores ways of bridging the rapidly expanding reach of networked electronic sensors with the limited realm of human perception. These include various implementations of cross-reality, which render and manifest phenomena between the real world and shared online virtual environments via densely embedded sensor and actuator networks. We visualize information from ubiquitously deployed real-world smart power strips and sensor-rich media portals at different levels of abstraction through analogous Second Life constructs. Conversely, we manifest virtual-world events in physical space using on-platform actuators and displays. We also present a set of simpler 2D visualizations that let mobile devices efficiently browse and interact with sensor network data. Finally, we touch on a recently developed system that uses a small badge to passively manage dynamic privacy in environments such as these, which stream potentially revealing information across the real/virtual divide. Application areas for these technologies include fluid browsing of, and interaction with, the geographically dispersed real world from within an unconstrained virtual environment, as well as ubiquitous multiscale telepresence.