Sequence-aware Reinforcement Learning over Knowledge Graphs

Published in REVEAL: A RecSys 2019 Workshop, 2019

Recommended citation: @article{guptasequence, title={Sequence-aware Reinforcement Learning over Knowledge Graphs}, author={Gupta, Ashish and Mehrotra, Rishabh}, journal={REVEAL: A RecSys 2019 Workshop}, year={2019}}

Abstract: We consider the task of generating explainable recommendations with knowledge graphs in a large-scale industrial e-commerce platform. We propose a Reinforcement Learning (RL) based approach to recommendation, which casts the item recommendation problem as a deterministic Markov Decision Process (MDP) over the knowledge graph, wherein an agent starts from a user and learns to navigate to potential items of interest. We hypothesize that the path history can serve as a genuine explanation for why an item is recommended to the user. Different from past work on RL over knowledge graphs, we leverage sequential neural modeling of the user's historical item interactions and a hierarchical softmax approach for sampling paths in the knowledge graph, and propose Sequence-aware Reinforcement Learning over Knowledge Graphs (SeqReLG). Experiments on a large-scale real-world dataset highlight the benefits offered by sequential modeling of the user's history and the action sampling technique. We observe a significant gain in performance compared to a state-of-the-art RL-based approach. We additionally discuss and address implementation details for large-scale deployment of the proposed RL-based solution.
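To make the setup in the abstract concrete, below is a minimal illustrative sketch (not the paper's implementation) of a sequence-aware path-following policy: a GRU encodes the user's historical item sequence, and the resulting state, combined with the current knowledge-graph node, scores the outgoing edges the agent may follow. All names, dimensions, and the toy graph are hypothetical, and the paper's hierarchical-softmax action sampling is replaced here by a flat softmax over candidate edges for brevity.

```python
# Hypothetical sketch of a sequence-aware policy for walking a knowledge graph.
import torch
import torch.nn as nn
from torch.distributions import Categorical


class SequenceAwarePolicy(nn.Module):
    def __init__(self, num_entities: int, emb_dim: int = 32):
        super().__init__()
        self.entity_emb = nn.Embedding(num_entities, emb_dim)              # shared item/entity embeddings
        self.history_encoder = nn.GRU(emb_dim, emb_dim, batch_first=True)  # encodes the user's item history
        self.score = nn.Bilinear(2 * emb_dim, emb_dim, 1)                  # scores (state, candidate-node) pairs

    def forward(self, history_ids, current_node, candidate_nodes):
        # history_ids: (1, T) item ids; current_node: (1,); candidate_nodes: (K,)
        hist = self.entity_emb(history_ids)                                 # (1, T, d)
        _, h_n = self.history_encoder(hist)                                 # (1, 1, d)
        state = torch.cat([h_n.squeeze(0), self.entity_emb(current_node)], dim=-1)  # (1, 2d)
        cand = self.entity_emb(candidate_nodes)                             # (K, d)
        logits = self.score(state.expand(cand.size(0), -1), cand).squeeze(-1)        # (K,)
        return Categorical(logits=logits)


def rollout(policy, graph, user_node, history_ids, max_hops=3):
    """Walk from the user node, sampling one outgoing edge per hop.
    Returns the visited path and the log-probabilities needed for a REINFORCE-style update."""
    node, path, log_probs = user_node, [user_node], []
    for _ in range(max_hops):
        candidates = graph.get(node, [])
        if not candidates:
            break
        dist = policy(history_ids, torch.tensor([node]), torch.tensor(candidates))
        action = dist.sample()
        log_probs.append(dist.log_prob(action))
        node = candidates[action.item()]
        path.append(node)
    return path, log_probs


if __name__ == "__main__":
    # Toy knowledge graph: node id -> neighbouring node ids (users, items, attributes).
    toy_graph = {0: [1, 2], 1: [3, 4], 2: [4, 5], 4: [6]}
    policy = SequenceAwarePolicy(num_entities=7)
    history = torch.tensor([[3, 5]])                                        # items the user interacted with
    path, log_probs = rollout(policy, toy_graph, user_node=0, history_ids=history)
    print("sampled path:", path)  # the path itself serves as the recommendation's explanation
    # A policy-gradient update would weight -sum(log_probs) by the terminal reward,
    # e.g. whether the path ends at an item the user later interacted with.
```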

Download paper here