

Poster

Generalized Neural Sorting Networks with Error-Free Differentiable Swap Functions

Jungtaek Kim · Jeongbeen Yoon · Minsu Cho

Halle B #218
Tue 7 May 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

Sorting is a fundamental operation in all computer systems and has long been a significant research topic. Beyond the problem formulation of traditional sorting algorithms, we consider sorting problems for more abstract yet expressive inputs, e.g., multi-digit images and image fragments, through a neural sorting network. To learn a mapping from a high-dimensional input to an ordinal variable, the differentiability of the sorting network needs to be guaranteed. In this paper, we define the softening error incurred by a differentiable swap function and develop an error-free swap function that satisfies both a non-decreasing condition and differentiability. Furthermore, a permutation-equivariant Transformer network with multi-head attention is adopted to capture dependencies between given inputs and to leverage its model capacity through self-attention. Experiments on diverse sorting benchmarks show that our methods perform better than or comparably to baseline methods.
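To make the swap-function terminology concrete, the sketch below contrasts a conventional hard comparator with a sigmoid-relaxed "soft" swap (whose outputs deviate from the true min/max, i.e., it incurs a softening error) and a straight-through-style variant whose forward outputs match the hard swap exactly while gradients flow through the relaxation. This is only a minimal illustration of the general idea, assuming PyTorch; it is not the paper's exact error-free construction.

```python
# Minimal, illustrative sketch (not the paper's exact construction) of a
# differentiable soft swap and a straight-through variant whose forward
# outputs are error-free (identical to the hard swap).
import torch

def hard_swap(a, b):
    # Conventional comparator: returns (min, max); its ordering decision is
    # non-differentiable.
    return torch.minimum(a, b), torch.maximum(a, b)

def soft_swap(a, b, tau=1.0):
    # Sigmoid-relaxed comparator: differentiable, but its outputs deviate from
    # the true (min, max); the softening error grows as |a - b| shrinks or as
    # the temperature tau grows.
    alpha = torch.sigmoid((b - a) / tau)           # -> 1 when b > a
    soft_min = alpha * a + (1.0 - alpha) * b
    soft_max = (1.0 - alpha) * a + alpha * b
    return soft_min, soft_max

def straight_through_swap(a, b, tau=1.0):
    # Forward values equal the hard swap (zero softening error); gradients are
    # taken from the soft relaxation via the straight-through trick.
    s_min, s_max = soft_swap(a, b, tau)
    h_min, h_max = hard_swap(a, b)
    out_min = s_min + (h_min - s_min).detach()
    out_max = s_max + (h_max - s_max).detach()
    return out_min, out_max

if __name__ == "__main__":
    a = torch.tensor(0.3, requires_grad=True)
    b = torch.tensor(0.1, requires_grad=True)
    lo, hi = straight_through_swap(a, b)
    print(lo.item(), hi.item())    # exactly 0.1 and 0.3 in the forward pass
    (lo + 2.0 * hi).backward()     # gradients come from the soft relaxation
    print(a.grad.item(), b.grad.item())
```

Stacking such comparators in a fixed wiring pattern (e.g., a bitonic or odd-even sorting network) and feeding them scores predicted from high-dimensional inputs is the standard way a neural sorting network is trained end-to-end with permutation supervision.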
