LREC 2000 2nd International Conference on Language Resources & Evaluation
Title | Dialogue and Prompting Strategies Evaluation in the DEMON System |
Authors | Lavelle Carine-Alexia (Institut de Recherche en Informatique de Toulouse, Université Paul Sabatier, 118, route de Narbonne, 31062 Toulouse, France, lavelle@irit.fr) De Calmès Martine (Institut de Recherche en Informatique, Université Paul Sabatier, 118, route de Narbonne, 31062 Toulouse Cedex, France, decalmes@irit.fr) Pérennou Guy (Institut de Recherche en Informatique, Université Paul Sabatier, 118, route de Narbonne, 31062 Toulouse Cedex, France, perennou@irit.fr) |
Keywords | Prompting Strategy, Spoken Dialogue Systems, Usability |
Session | Session SO2 - Dialogue Evaluation Methods |
Full Paper | 38.ps, 38.pdf |
Abstract | In order to improve the usability and efficiency of dialogue systems, a major issue is adapting them better to their intended users. This requires a good knowledge of users' behaviour when interacting with a dialogue system. In this regard, we based the evaluation of the dialogue and prompting strategies implemented in our system on how they influence users' answers. In this paper we describe the measure we used to evaluate the effect of the size of the welcome prompt, as well as a set of measures we defined to evaluate three different confirmation strategies. We then describe five criteria we used to evaluate the complexity of the system's questions and their effect on users' answers. The overall aim is to design a set of metrics that could be used to automatically decide which of the possible prompts at a given state in a dialogue should be uttered. |
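As a rough illustration of the aim stated at the end of the abstract (automatically deciding which candidate prompt to utter at a given dialogue state), the following minimal Python sketch scores candidate prompts against a set of weighted metrics and picks the best one. The metric names (brevity, confirmation need), the weights, and all data structures are illustrative assumptions, not details of the DEMON system or of the measures defined in the paper.

# Hypothetical sketch: metric-based prompt selection at a dialogue state.
# All names, features, and weights below are assumptions for illustration only.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class DialogueState:
    turn: int                      # number of exchanges so far
    unresolved_slots: List[str]    # task slots still to be filled
    last_recognition_score: float  # ASR confidence for the previous user turn


@dataclass
class CandidatePrompt:
    text: str
    # Pre-computed prompt features (e.g. length, whether it is a confirmation),
    # standing in for the complexity criteria the paper evaluates.
    features: Dict[str, float] = field(default_factory=dict)


# Each metric maps (state, prompt) to a score in [0, 1]; higher is better.
Metric = Callable[[DialogueState, CandidatePrompt], float]


def brevity(state: DialogueState, prompt: CandidatePrompt) -> float:
    """Shorter prompts score higher."""
    length = prompt.features.get("length_words", 10.0)
    return 1.0 / (1.0 + length / 10.0)


def confirmation_need(state: DialogueState, prompt: CandidatePrompt) -> float:
    """Favour explicit confirmation prompts when recognition confidence is low."""
    is_confirmation = prompt.features.get("is_confirmation", 0.0)
    return is_confirmation * (1.0 - state.last_recognition_score)


def select_prompt(state: DialogueState,
                  candidates: List[CandidatePrompt],
                  metrics: Dict[Metric, float]) -> CandidatePrompt:
    """Return the candidate prompt with the highest weighted metric score."""
    def score(p: CandidatePrompt) -> float:
        return sum(weight * metric(state, p) for metric, weight in metrics.items())
    return max(candidates, key=score)


if __name__ == "__main__":
    state = DialogueState(turn=3, unresolved_slots=["date"], last_recognition_score=0.4)
    candidates = [
        CandidatePrompt("Did you say Tuesday?",
                        {"length_words": 4, "is_confirmation": 1.0}),
        CandidatePrompt("Please state the date, time and destination of your trip.",
                        {"length_words": 10, "is_confirmation": 0.0}),
    ]
    best = select_prompt(state, candidates, {brevity: 0.5, confirmation_need: 0.5})
    print(best.text)  # with low ASR confidence, the confirmation prompt wins

In a design of this kind, the measures the paper derives from users' answers (welcome-prompt size effects, confirmation-strategy measures, question-complexity criteria) would supply the actual metric functions and weights; the sketch only shows the selection mechanism.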