Multi-user environment for semantic search of multimedia contents

This research project exploits new technologies (a multi-touch table and the iPhone) to develop a multi-user, multi-role and multi-modal system for multimedia content search, annotation and organization. As a use case we considered the field of broadcast journalism, where editors and archivists work together to create a film report from archive footage.

Multi user environment for semantic search of multimedia contents

The idea behind this work-in-progress project is to create a multi-touch system that allows one or more users to search multimedia content, especially video, exploiting an ontology-based structure for knowledge management. The system supports collaborative multi-role, multi-user and multi-modal interaction between two users performing different tasks within the application.

The first user plays the role of an archivist: by entering a keyword on the iPhone, he can search and select data through an ontology-structured interface designed ad hoc for the multi-touch table. At this stage the user can organize the results in folders and subfolders: the iPhone thus serves both as a text-input device and as storage for the folders.
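The archivist's workflow described above can be sketched in a few lines of code. This is a hypothetical illustration, not the project's actual implementation: the `Video`, `Folder` and `search` names are assumptions, and the ontology is reduced to a flat set of concept labels per video for simplicity.

```python
# Sketch of the archivist's workflow: a keyword typed on the iPhone is
# matched against the ontology concepts annotating each video, and the
# hits are filed into folders stored on the device.
# All class and function names here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Video:
    video_id: str
    title: str
    concepts: set          # ontology concepts annotating this video

@dataclass
class Folder:
    name: str
    videos: list = field(default_factory=list)
    subfolders: dict = field(default_factory=dict)   # name -> Folder

def search(archive, keyword):
    """Return the videos whose ontology concepts match the keyword."""
    key = keyword.lower()
    return [v for v in archive if key in {c.lower() for c in v.concepts}]

# Example: the archivist types "crowd" on the iPhone and files the hits.
archive = [
    Video("v1", "Election night report", {"politics", "crowd"}),
    Video("v2", "Stadium opening", {"sport", "crowd"}),
    Video("v3", "Studio interview", {"politics"}),
]
results = search(archive, "crowd")     # matches v1 and v2
report = Folder("report-footage")
report.videos.extend(results)
```

In the real system the keyword would be expanded through the ontology (synonyms, sub-concepts) rather than matched literally, but the flow — keyword in, candidate videos out, results filed into folders — is the same.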

The other user plays the role of an editor: he receives the results of the search carried out by the archivist, either through the system or on the iPhone. This user examines the retrieved videos and selects those most suitable for the final result, rating how appropriate each video is for his purposes (an assessment valid for the current work session) and giving his opinion on the overall quality of the video (a subjective assessment that can also influence future searches). In addition, this user also plays the role of an annotator: he can add further tags to a video if he considers them useful for retrieving that content in future searches.
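The editor's two kinds of judgement — a per-session suitability rating and a subjective quality rating that can bias future searches — plus the annotator role can be sketched as follows. Again this is a hedged illustration: the names, the 1-5 scale and the ranking function are assumptions of this sketch, not the project's API.

```python
# Sketch of the editor/annotator side: each selected video receives two
# scores, and the quality score is later used to re-rank candidates in
# future searches. Annotator-added tags are kept per video.
# All names and the 1-5 scale are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Assessment:
    session_score: int   # suitability for the current work session (1-5)
    quality_score: int   # subjective quality; can bias future searches (1-5)

assessments = {}   # video_id -> Assessment
extra_tags = {}    # video_id -> set of annotator-added tags

def assess(video_id, session_score, quality_score):
    """Record the editor's two judgements for one video."""
    assessments[video_id] = Assessment(session_score, quality_score)

def annotate(video_id, *tags):
    """Annotator role: attach extra tags to help future retrieval."""
    extra_tags.setdefault(video_id, set()).update(tags)

def rank_for_future(video_ids):
    """Order candidates by stored quality score, highest first."""
    return sorted(
        video_ids,
        key=lambda vid: assessments.get(vid, Assessment(0, 0)).quality_score,
        reverse=True,
    )

# Example: v1 fits this session best, but v2 is judged higher quality,
# so v2 is ranked first for future searches.
assess("v1", session_score=5, quality_score=3)
assess("v2", session_score=2, quality_score=5)
annotate("v1", "election", "night-shot")
ranked = rank_for_future(["v1", "v2"])
```

The two scores are deliberately kept separate: the session score only answers "do I use this clip now?", while the quality score persists and feeds back into how results are ordered the next time anyone searches the archive.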