dictionary compilation and downstream NLP. However, it is a challenging task due to the
varying degrees of frozenness that lexical collocations exhibit. In this paper, we put forward a
BERT-based sequence-tagging model enhanced with a graph-aware transformer
architecture, which we evaluate on the task of collocation recognition in context. Our results
suggest that explicitly encoding syntactic dependencies in the model architecture is helpful …