

In-Person Poster presentation / poster accept

HyperDeepONet: learning operator with complex target function space using the limited resources via hypernetwork

Jae Yong Lee · Sung Woong Cho · Hyung Ju Hwang

MH1-2-3-4 #73

Keywords: [ Machine Learning for Sciences ] [ DeepONet ] [ hypernetwork ] [ Operator learning ] [ Deep operator network ]


Abstract:

Fast and accurate prediction of complex physical dynamics is a major challenge across various applications, and real-time prediction on resource-constrained hardware is even more crucial in real-world problems. The deep operator network (DeepONet) has recently been proposed as a framework for learning nonlinear mappings between function spaces. However, DeepONet requires many parameters and incurs a high computational cost when learning operators, particularly those with complex (discontinuous or non-smooth) target functions. In this study, we propose HyperDeepONet, which uses the expressive power of a hypernetwork to learn a complex operator with a smaller set of parameters. DeepONet and its variant models can be thought of as methods of injecting the input-function information into the target function; from this perspective, these models can be viewed as special cases of HyperDeepONet. We analyze the complexity of DeepONet and conclude that HyperDeepONet requires relatively lower complexity to obtain the desired accuracy for operator learning. HyperDeepONet was successfully applied to various operator learning problems using low computational resources compared to other benchmarks.
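The core idea of the abstract can be sketched in code: a hypernetwork consumes the input function (sampled at sensor points) and emits all parameters of a small target network, which is then evaluated at query points. The sketch below is a minimal, untrained NumPy illustration under assumed dimensions (sensor count `m`, hidden width `h`, scalar query `y`); the layer sizes, names, and architecture are illustrative choices, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (assumptions for illustration, not from the paper):
m = 20               # number of sensor points sampling the input function u
h = 16               # hidden width of the (small) target network
p = 3 * h + 1        # total parameter count of the target network below

def hypernetwork(u_sensors, W1, b1, W2, b2):
    """A small MLP mapping sensor values of u to ALL parameters of the
    target network -- the defining feature of a hypernetwork."""
    z = np.tanh(u_sensors @ W1 + b1)
    return z @ W2 + b2                      # flat vector of target-net parameters

def target_network(y, theta):
    """One-hidden-layer network s(y) whose weights are supplied by theta."""
    W = theta[:h]                           # input weights (scalar query y)
    b = theta[h:2 * h]                      # hidden biases
    v = theta[2 * h:3 * h]                  # output weights
    c = theta[3 * h]                        # output bias
    return np.tanh(y * W + b) @ v + c       # scalar prediction s(y) = G(u)(y)

# Randomly initialized hypernetwork weights (untrained sketch).
W1 = rng.normal(0, 0.1, (m, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.1, (32, p)); b2 = np.zeros(p)

u = np.sin(np.linspace(0, np.pi, m))        # example input function samples
theta = hypernetwork(u, W1, b1, W2, b2)     # target-net parameters for this u
s_y = target_network(0.5, theta)            # evaluate the output function at y = 0.5
```

Because the hypernetwork generates every weight of the target network (rather than only modulating its last layer, as in the branch/trunk split of DeepONet), the target network itself can stay small, which is what allows operator learning with a reduced parameter budget.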
