Affinity Posters
Blog Track Session 3
David Dobre · Leo Schwinn · Claire Vernade · Charlie Gauthier · Fabian Pedregosa · Gauthier Gidel
Halle B
Schedule
Wed 1:45 a.m. - 3:45 a.m.

What exactly has TabPFN learned to do? (Poster #2)
Poster Location: Halle B #2

TabPFN [Hollmann et al., 2023], a Transformer model pretrained to perform in-context learning on fresh tabular classification problems, was presented at the last ICLR conference. To better understand its behavior, we treat it as a black-box function approximator generator and observe its generated function approximations on a varied selection of training datasets. Exploring its learned inductive biases in this manner, we observe behavior that is at turns either brilliant or baffling. We conclude this post with thoughts on how these results might inform the development, evaluation, and application of prior-data fitted networks (PFNs) in the future.

Calvin McCarter
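A minimal sketch of the probing setup the abstract describes: fit a classifier on a small labeled dataset and evaluate the function it induces on a dense 2-D grid. The 1-nearest-neighbour stand-in below is hypothetical — the post probes TabPFN itself (which needs its pretrained weights); only the grid-probing pattern is being illustrated here.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 2))             # a small toy "training dataset"
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # linearly separable labels

def black_box_predict(X_train, y_train, X_query):
    """Stand-in black-box classifier (1-nearest-neighbour).
    Hypothetical: the post would instead fit TabPFN on (X_train, y_train)
    and query its predictions."""
    dists = np.linalg.norm(X_query[:, None, :] - X_train[None, :, :], axis=-1)
    return y_train[np.argmin(dists, axis=1)]

# Evaluate the learned function approximation over a dense 2-D grid,
# as one would to visualize the induced decision surface.
xs = np.linspace(-3, 3, 50)
xx, yy = np.meshgrid(xs, xs)
grid = np.column_stack([xx.ravel(), yy.ravel()])
surface = black_box_predict(X, y, grid).reshape(xx.shape)
```

Plotting `surface` (e.g. as a heatmap) is how one would eyeball the inductive biases of the fitted model on each dataset.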
Wed 1:45 a.m. - 3:45 a.m.

Elaborating on the Value of Flow Matching for Density Estimation (Poster #1)
Poster Location: Halle B #1

Flow matching provides a simulation-free method for training continuous normalizing flows. Its key ingredients are an implicit definition of the target flow, via a direct definition of the conditional flows with respect to a single target sample, and a loss function that directly regresses the time-dependent vector field against the conditional vector fields with respect to single samples. In this post, the origin of the flow matching formulation for continuous normalizing flows, its generalizations, and its value for density estimation are discussed. In particular, light is shed on the ability of these models to scale well to higher dimensions, which enables new applications in the growing research field of Simulation-based Inference.

Maternus Herold
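The regression objective described above can be sketched in a few lines, assuming the simple linear conditional path x_t = (1 - t) x0 + t x1, for which the conditional vector field is x1 - x0 (the sigma_min = 0 case of the optimal-transport path in Lipman et al.). The zero `vector_field` in the demo call is a placeholder for a trained model, not part of the method.

```python
import numpy as np

rng = np.random.default_rng(0)

def conditional_path(x0, x1, t):
    """Linear interpolant between base sample x0 and target sample x1 at time t."""
    return (1.0 - t) * x0 + t * x1

def conditional_target(x0, x1):
    """Conditional vector field u_t(x | x1) for the linear path: x1 - x0."""
    return x1 - x0

def flow_matching_loss(vector_field, x1, rng):
    """Monte-Carlo estimate of the conditional flow-matching loss:
    regress the model's field at (x_t, t) onto the conditional target."""
    x0 = rng.standard_normal(x1.shape)      # samples from the base density
    t = rng.uniform(size=(x1.shape[0], 1))  # one time per sample
    xt = conditional_path(x0, x1, t)
    ut = conditional_target(x0, x1)
    pred = vector_field(xt, t)
    return np.mean(np.sum((pred - ut) ** 2, axis=1))

# Demo with a placeholder (zero) vector field on toy "data" samples.
x1 = rng.standard_normal((128, 2))
loss = flow_matching_loss(lambda x, t: np.zeros_like(x), x1, rng)
```

In training, `vector_field` would be a neural network and this loss would be minimized over its parameters; no ODE simulation is needed, which is the point of the method.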