Metadata-driven interactive web video assembly

Authors: Rene Kaiser, Michael Hausenblas, Martin Umgeher

Affiliation: (1) Institute of Information Systems & Information Management, Joanneum Research Forschungsgesellschaft mbH, Graz, Austria; (2) Institute for Software Technology, Graz University of Technology, Graz, Austria
Abstract: The recent expansion of broadband Internet access has led to an exponential increase in potential consumers of video on the Web.
The huge success of video upload websites shows that the online world, with its virtually unlimited possibilities for active
user participation, is an ideal complement to traditional consumption-only media such as TV and DVD. It is evident that users
are willing to interact with content-providing systems in order to get the content they desire. In parallel to these developments,
innovative tools for producing interactive, non-linear audio-visual content are being created. They support the authoring
process alongside the management of media and metadata, enabling on-demand assembly of videos based on the consumer's wishes.
The quality of such a dynamic video remixing system depends mainly on the expressiveness of the associated metadata. To eliminate
the need for manual input as far as possible, we aim to design a system that can continuously and automatically enrich its own media
and metadata repositories. Currently, video content remixing is available on the Web mostly in very basic forms.
Most platforms offer upload and simple modification of content. Although several implementations exist, to the best of our
knowledge no solution uses metadata to its full extent to dynamically render a video stream based on consumers' wishes. With
the research presented in this paper, we propose a novel concept for interactive video assembly on the Web. In this approach,
consumers describe the desired content using a set of domain-specific parameters. Based on the metadata with which the video clips
are annotated, the system chooses clips fitting the user's criteria. The clips are aligned in an aesthetically pleasing manner,
and the user can furthermore interactively influence content selection at any time during playback. We use a practical
example to clarify the concept and further outline what it takes to implement such a system.
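The selection step the abstract describes, matching metadata-annotated clips against a consumer's domain-specific parameters and assembling a playlist, can be sketched roughly as follows. The `Clip` structure, tag-based matching, and greedy duration fill here are illustrative assumptions, not the authors' actual algorithm:

```python
from dataclasses import dataclass, field

@dataclass
class Clip:
    """A video clip with descriptive metadata annotations (hypothetical model)."""
    clip_id: str
    duration: float                        # length in seconds
    tags: set = field(default_factory=set) # domain-specific labels

def assemble(clips, wanted_tags, max_duration):
    """Pick clips whose metadata overlaps the consumer's requested tags,
    prefer clips covering more of the request, and fill greedily up to
    the requested total duration."""
    matching = [c for c in clips if c.tags & wanted_tags]
    # Clips satisfying more of the requested tags come first.
    matching.sort(key=lambda c: len(c.tags & wanted_tags), reverse=True)
    playlist, total = [], 0.0
    for clip in matching:
        if total + clip.duration <= max_duration:
            playlist.append(clip)
            total += clip.duration
    return playlist

clips = [
    Clip("intro", 10, {"overview"}),
    Clip("goal1", 25, {"goal", "highlight"}),
    Clip("interview", 40, {"interview"}),
    Clip("goal2", 20, {"goal"}),
]
selection = assemble(clips, {"goal", "highlight"}, max_duration=50)
print([c.clip_id for c in selection])  # → ['goal1', 'goal2']
```

In a real system the parameters would be richer than flat tags (e.g. structured descriptors per domain), and the alignment step would also consider aesthetic transition rules rather than a simple duration cap; the sketch only shows the metadata-filtering core.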
Rene Kaiser
graduated in Software Engineering from FH Hagenberg in 2005. Since 2006 he has been working at JOANNEUM RESEARCH, focussing on
various research aspects of multimedia semantics. Rene is especially interested in metadata representation, Semantic Web technologies,
and non-linear interactive video production.
Dr. Michael Hausenblas
is a senior researcher at JOANNEUM RESEARCH working in the area of multimedia semantics. He has been utilising Web of Data
technologies in a number of national and international projects. Additionally, he has been active in several W3C activities,
including the Semantic Web Deployment Working Group and the Video in the Web activity. Michael holds a PhD in Computer Science (Telematics)
from Graz University of Technology.
Martin Umgeher
is a PhD student at Graz University of Technology. His research is in the area of mobile multimedia applications, applying
agile development methodologies and focussing on usability aspects. Martin has been active in both national and international
multimedia-based projects.
Keywords: Non-linear video · Interactive video · Dynamic video assembly · Metadata-based video assembly
This article is indexed in SpringerLink and other databases.