Aligning Gradient and Hessian for Neural Signed Distance Function

Dec 10, 2023


The Signed Distance Function (SDF), as an implicit surface representation, provides a crucial method for reconstructing a manifold surface from an unorganized point cloud. The SDF has a fundamental relationship with the principles of surface vector calculus: given a smooth surface, there exists a thin-shell neighborhood in which the SDF is differentiable, and within it the gradient of the SDF is an eigenvector of its Hessian with a corresponding eigenvalue of zero. In this paper, we propose to learn the SDF directly from point clouds without normals, motivated by the key observation that aligning the gradient and the Hessian of the SDF gives more effective control over the direction of the gradients, compensating for the weakness of regularizing only the gradient norm. Extensive experimental results demonstrate that the method accurately recovers the underlying shape while effectively suppressing ghost geometry.
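The eigenvector property stated in the abstract follows from the eikonal equation: a true SDF satisfies |∇f| = 1, and differentiating |∇f|² = 1 gives 2H∇f = 0, so ∇f is an eigenvector of the Hessian H with eigenvalue zero. A minimal numerical sanity check of this property on the exact sphere SDF (an illustrative sketch using finite differences, not the paper's training code):

```python
import numpy as np

def sdf_sphere(p, r=1.0):
    # Exact signed distance to a sphere of radius r centered at the origin.
    return np.linalg.norm(p) - r

def gradient(f, p, h=1e-5):
    # Central-difference gradient of a scalar field f at point p.
    g = np.zeros(3)
    for i in range(3):
        e = np.zeros(3)
        e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2 * h)
    return g

def hessian(f, p, h=1e-4):
    # Central-difference Hessian, built from finite differences of the gradient.
    H = np.zeros((3, 3))
    for i in range(3):
        e = np.zeros(3)
        e[i] = h
        H[:, i] = (gradient(f, p + e) - gradient(f, p - e)) / (2 * h)
    return H

# Any point off the surface and away from the center lies in the
# thin-shell region where the sphere SDF is differentiable.
p = np.array([0.6, -0.3, 0.9])
g = gradient(sdf_sphere, p)
H = hessian(sdf_sphere, p)

print(np.linalg.norm(g))      # eikonal property: |grad f| is approximately 1
print(np.linalg.norm(H @ g))  # alignment property: H grad f is approximately 0
```

In a learned setting the same two quantities become losses: the eikonal term penalizes deviation of |∇f| from 1, while the gradient-Hessian alignment term penalizes H∇f, constraining the gradient direction rather than just its norm.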


NeurIPS 2023