Single-turn Post-training for Multi-turn Retrieval-based Dialogue Systems

Abstract

Retrieval-based dialogue systems have achieved strong performance using pre-trained language models such as BERT. In multi-turn response selection, BERT is trained on the relation between a context consisting of multiple utterances and a candidate response. However, this training scheme is insufficient for capturing the relations between the individual utterances within the context, so the model does not fully understand the conversational flow needed to select a response. To address this, I propose a new single-turn post-training method that learns from adjacent utterance pairs in a conversation. Unlike adjacent sentences in a general document, adjacent utterance pairs in a conversation play an important role in analyzing an utterance's intention and generating a response. With the proposed method, a model learns the internal interactions between utterances in a context by training on every adjacent utterance pair in single-turn units, while retaining the advantage of post-training: easy adaptation to unseen domain-specific words and phrases. Experimental results show that single-turn post-training improves the performance of multi-turn response selection, and my model outperforms the previous state-of-the-art model on three benchmark datasets.
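
To make the idea of "single-turn units" more concrete, here is a minimal sketch of how adjacent utterance pairs could be extracted from multi-turn dialogues for post-training. This is not the paper's released code: the function name `make_single_turn_pairs`, the NSP-style negative sampling, and the toy dialogues are assumptions for illustration only.

```python
# Sketch (illustrative, not the authors' implementation): turn multi-turn
# dialogues into (utterance, next utterance, label) pairs so that a BERT-style
# model can be post-trained on adjacent utterance pairs in single-turn units.
import random
from typing import List, Tuple


def make_single_turn_pairs(
    dialogues: List[List[str]],
    negative_ratio: float = 0.5,
    seed: int = 42,
) -> List[Tuple[str, str, int]]:
    """Build single-turn training pairs from multi-turn dialogues.

    label = 1 for a genuinely adjacent pair, 0 for a randomly sampled
    negative, mirroring a next-sentence-prediction style objective
    (an assumption here, not a detail taken from the abstract).
    """
    rng = random.Random(seed)
    all_utterances = [u for d in dialogues for u in d]
    pairs: List[Tuple[str, str, int]] = []
    for dialogue in dialogues:
        for i in range(len(dialogue) - 1):
            if rng.random() < negative_ratio:
                # Negative pair: replace the true next utterance with a random one.
                pairs.append((dialogue[i], rng.choice(all_utterances), 0))
            else:
                # Positive pair: two adjacent utterances from the same dialogue.
                pairs.append((dialogue[i], dialogue[i + 1], 1))
    return pairs


if __name__ == "__main__":
    dialogues = [
        ["my apt-get upgrade is stuck", "which package is it hanging on?", "looks like the kernel image"],
        ["how do I check my ubuntu version?", "run lsb_release -a in a terminal"],
    ]
    for left, right, label in make_single_turn_pairs(dialogues):
        print(label, "|", left, "->", right)
```

Each pair would then be fed to the language model as a single-turn example, so the model sees every adjacent utterance relation in the context rather than only the full context-response relation.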