

Poster
in
Workshop: Deep Learning for Code

NS3: Neuro-Symbolic Semantic Code Search

Shushan Arakelyan · Anna Hakhverdyan · Miltiadis Allamanis · Christophe Hauser · Luis Garcia · Xiang Ren


Abstract: Semantic code search is the task of retrieving a code snippet given a textual description of its functionality. Recent work has focused on similarity metrics between neural embeddings of text and code. However, current language models are known to struggle with long, compositional sentences and multi-step reasoning. To overcome this limitation, we propose supplementing the query sentence with a layout of its semantic structure. The semantic layout is used to break down the final reasoning decision into a series of lower-level decisions. We implement this idea with a Neural Module Network architecture. We compare our model, $NS^3$ (Neuro-Symbolic Semantic Search), to a number of baselines, including state-of-the-art semantic code retrieval methods such as CodeBERT, CuBERT, and GraphCodeBERT, and evaluate on two datasets: CodeSearchNet (CSN) and Code Search and Question Answering (CoSQA). On these datasets, we demonstrate that our approach achieves higher performance. We also perform additional studies to show the effectiveness of our modular design when handling compositional queries. [Note: This contribution also has a spotlight talk. Please find the paper here: https://openreview.net/forum?id=rubeJ2ObyWc]
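The embedding-similarity setup that the baselines rely on can be sketched as follows. This is a minimal illustration of the general idea (ranking code snippets by cosine similarity between a query embedding and snippet embeddings), not the $NS^3$ model itself; the toy vectors stand in for the outputs of a real neural encoder.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two 1-D embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_snippets(query_emb: np.ndarray, snippet_embs: list) -> list:
    """Return snippet indices sorted by descending similarity to the query."""
    scores = [cosine_similarity(query_emb, e) for e in snippet_embs]
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)

# Hand-picked stand-ins for neural embeddings of a query and three snippets.
query = np.array([1.0, 0.0, 1.0])
snippets = [
    np.array([0.9, 0.1, 1.1]),  # nearly parallel to the query
    np.array([0.0, 1.0, 0.0]),  # orthogonal to the query
    np.array([1.0, 1.0, 1.0]),  # partially aligned
]
print(rank_snippets(query, snippets))  # → [0, 2, 1]
```

The limitation the abstract points to is that a single similarity score over whole-sentence embeddings makes one monolithic decision; $NS^3$ instead decomposes the query's semantic layout so that the final match is assembled from a series of lower-level module decisions.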
