This approach provides a unified framework that leverages the Transformer's self-attention mechanism to model session sequences while accounting for the user's main interest within the session. We empirically evaluate the proposed method on two benchmark datasets. The results show that DTER outperforms state-of-the-art session-based recommendation methods on common evaluation metrics.
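To make the idea concrete, below is a minimal NumPy sketch of the general pattern described here: one self-attention pass over a session's item embeddings, followed by an attention readout that treats the last clicked item as a proxy for the user's main interest. This is an illustrative assumption only; DTER's actual layers, parameter names, and readout are not specified in this excerpt.

    import numpy as np

    def softmax(x, axis=-1):
        x = x - x.max(axis=axis, keepdims=True)
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def self_attention(E, Wq, Wk, Wv):
        # Scaled dot-product self-attention over session item embeddings E (n x d).
        Q, K, V = E @ Wq, E @ Wk, E @ Wv
        scores = Q @ K.T / np.sqrt(K.shape[-1])
        return softmax(scores, axis=-1) @ V          # n x d contextualized item vectors

    def session_representation(E, Wq, Wk, Wv, W1, W2, v):
        # Combine attended items into one session vector, biased toward the last item,
        # which serves as a stand-in for the user's main (most recent) interest.
        H = self_attention(E, Wq, Wk, Wv)
        h_last = H[-1]
        scores = np.tanh(H @ W1 + h_last @ W2) @ v   # attention scores over session items
        alpha = softmax(scores)
        s_global = alpha @ H                         # weighted summary of the whole session
        return np.concatenate([s_global, h_last])    # session representation (2d,)

    # Usage with random toy weights (hypothetical shapes):
    d = 16
    rng = np.random.default_rng(0)
    E = rng.normal(size=(5, d))                      # embeddings of 5 clicked items
    Wq, Wk, Wv, W1, W2 = (rng.normal(size=(d, d)) for _ in range(5))
    v = rng.normal(size=d)
    s = session_representation(E, Wq, Wk, Wv, W1, W2, v)

Combining a weighted summary of all attended items with the last item's vector is a common way session-based recommenders balance overall session context against the user's current interest; whether DTER uses this exact readout is an assumption made here for illustration.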