Show simple item record

ICEIS

dc.contributor.author: Ehnes, Jochen
dc.date.accessioned: 2010-10-13T14:06:23Z
dc.date.available: 2010-10-13T14:06:23Z
dc.date.issued: 2009
dc.identifier.uri: http://hdl.handle.net/1842/3953
dc.description.abstract: We describe our approach to supporting ongoing meetings with an automated meeting assistant. The system, based on the AMIDA Content Linking Device, uses automatic speech recognition to provide documents from previous meetings that are relevant to the ongoing one. Once the content linking device finds documents linked to a discussion of a similar subject in a previous meeting, it assumes they may be relevant to the current discussion as well. We believe that the way these documents are offered to the meeting participants is as important as the way they are found. We developed a mixed-reality, projection-based user interface that makes the documents appear on the table tops in front of the meeting participants. They can easily hand documents over to others or bring them onto the shared projection screen if they consider them relevant, while irrelevant documents do not draw attention away from the discussion. In this paper we describe the concept and implementation of this user interface and present some preliminary results.
dc.title: An Automated Meeting Assistant: A Tangible Mixed Reality Interface for the AMIDA Automatic Content Linking Device
dc.type: Conference Paper
dc.identifier.doi: http://dx.doi.org/10.1007/978-3-642-01347-8_79
rps.title: ICEIS
dc.date.updated: 2010-10-13T14:06:23Z
dc.date.openingDate: 2009

