Personalizing search results using hierarchical RNN with query-aware attention

S Ge, Z Dou, Z Jiang, JY Nie, JR Wen - Proceedings of the 27th ACM International Conference on Information and Knowledge Management, 2018 - dl.acm.org
Search results personalization has become an effective way to improve the quality of search engines. Previous studies extracted information such as past clicks, user topical interests, and query click entropy to tailor the original ranking. However, few studies have taken into account the sequential information underlying previous queries and sessions. Intuitively, the order of issued queries is important in inferring the real user interests, and more recent sessions should provide more reliable personal signals than older sessions. In addition, the previous search history and user behaviors should influence the personalization of the current query depending on their relatedness. To implement these intuitions, in this paper we employ a hierarchical recurrent neural network to exploit such sequential information and automatically generate a user profile from historical data. We propose a query-aware attention model to generate a dynamic user profile based on the input query. Significant improvements over several traditional personalization models are observed in experiments on data from a commercial search engine. Our analysis reveals that, once trained, the attention model is able to assign higher weights to more related past sessions.
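The abstract only outlines the architecture, so the following is a minimal sketch of the general idea rather than the authors' exact model: a query-level GRU encodes the ordered queries of each past session, a session-level GRU encodes the ordered sequence of session vectors, and a query-aware attention layer weights past sessions by their relatedness to the current query to form a dynamic user profile. All layer choices, dimensions, and the bilinear attention scoring here are illustrative assumptions.

```python
# Hypothetical sketch of a hierarchical RNN with query-aware attention.
# Not the published model; dimensions and scoring function are assumed.
import torch
import torch.nn as nn


class HierarchicalUserProfile(nn.Module):
    def __init__(self, query_dim=128, session_dim=128):
        super().__init__()
        # Query-level RNN: encodes the ordered queries within one session.
        self.query_rnn = nn.GRU(query_dim, session_dim, batch_first=True)
        # Session-level RNN: encodes the ordered sequence of session vectors.
        self.session_rnn = nn.GRU(session_dim, session_dim, batch_first=True)
        # Bilinear scoring for query-aware attention over past sessions.
        self.attn = nn.Bilinear(query_dim, session_dim, 1)

    def forward(self, sessions, current_query):
        # sessions: (num_sessions, queries_per_session, query_dim)
        # current_query: (query_dim,)
        _, session_states = self.query_rnn(sessions)          # (1, S, session_dim)
        session_vecs = session_states.squeeze(0)              # (S, session_dim)
        session_seq, _ = self.session_rnn(session_vecs.unsqueeze(0))
        session_seq = session_seq.squeeze(0)                  # (S, session_dim)
        # Attention: more related past sessions receive higher weight.
        q = current_query.unsqueeze(0).expand(session_seq.size(0), -1)
        scores = self.attn(q, session_seq).squeeze(-1)        # (S,)
        weights = torch.softmax(scores, dim=0)
        # Dynamic user profile: attention-weighted sum of session vectors.
        return (weights.unsqueeze(-1) * session_seq).sum(dim=0)


# Example: 3 past sessions of 5 queries each, with 128-dim query embeddings.
model = HierarchicalUserProfile()
profile = model(torch.randn(3, 5, 128), torch.randn(128))
print(profile.shape)  # torch.Size([128])
```

In this reading, the resulting profile vector would be combined with the original ranking features to re-rank results for the current query; the paper's actual scoring and training setup is not described in the abstract.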