Implicit Geometric Regularization for Learning Shapes


<IGR> Implicit geometric regularization for learning shapes

Motivation

  • Learn a DeepSDF directly from raw point clouds, with or without normal data
  • With just an implicit shape prior, plausible solutions can be obtained
    It really comes down to a simple loss function: encourage the function value to be zero at the input points, and encourage the gradient at points scattered through space to have unit norm
    https://longtimenohack.com/posts/paper_reading/2020icml_gropp_implicit/image-20201228164827136.png

Overview

  • given a raw input point cloud $\mathcal{X}=\lbrace x_i\rbrace_{i\in I} \subset \mathbb{R}^3$, with or without normal data $\mathcal{N}=\lbrace n_i\rbrace_{i\in I} \subset \mathbb{R}^3$, learn a plausible surface $\mathcal{M}$ from it
  • The standard loss terms for learning an SDF:
    at the data points, the function value should be 0 and the gradient should match the given normal;
    (away from the data) at points distributed in space, the 2-norm of the gradient should be 1 — see the loss written out after this list
    https://longtimenohack.com/posts/paper_reading/2020icml_gropp_implicit/image-20201228172709924.png
  • However, this loss alone is problematic:
    • first, it does not guarantee that the learned function is an SDF
    • second, even if an SDF is learned, it is not guaranteed to be a plausible one
  • This paper shows theoretically that running gradient descent on the loss above avoids such bad critical solutions — see the training sketch after this list
    • the analysis is carried out for the linear model on planar data, and this property is called plane reproduction
      https://longtimenohack.com/posts/paper_reading/2020icml_gropp_implicit/image-20201228173842422.png
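
For reference, the full loss as I read it in the paper (the linked image above should show the same thing), with $\tau$ weighting the normal term (dropped when $\mathcal{N}$ is unavailable) and $\lambda$ weighting the eikonal term:

$$
\ell(\theta) \;=\; \frac{1}{|I|}\sum_{i\in I}\Big(\,|f(x_i;\theta)| \;+\; \tau\,\lVert\nabla_x f(x_i;\theta) - n_i\rVert\,\Big) \;+\; \lambda\,\mathbb{E}_{x}\Big(\lVert\nabla_x f(x;\theta)\rVert - 1\Big)^2
$$

The expectation in the eikonal term is over points $x$ sampled in space; the paper mixes uniform samples with Gaussians centered at the data points.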
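
Below is a minimal PyTorch sketch of what "gradient descent on this loss" looks like; the tiny network, the purely uniform off-surface sampling, and the weights `tau` / `lam` are simplifications of my own for illustration, not the authors' implementation (the paper uses a deeper Softplus MLP with a geometric initialization).

```python
import torch
import torch.nn as nn

# Small MLP standing in for f(x; theta). The paper uses a deeper network
# with a skip connection and a special "geometric" initialization.
class ImplicitNet(nn.Module):
    def __init__(self, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.Softplus(beta=100),
            nn.Linear(hidden, hidden), nn.Softplus(beta=100),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

def value_and_grad(f, x):
    """f(x) and its spatial gradient, computed with autograd."""
    x = x.requires_grad_(True)
    y = f(x)
    (g,) = torch.autograd.grad(y, x, torch.ones_like(y), create_graph=True)
    return y, g

def igr_loss(f, points, normals=None, tau=1.0, lam=0.1):
    # Data term: f should vanish at the input points, and (if normals are
    # given) its gradient there should match them.
    y, g = value_and_grad(f, points)
    loss = y.abs().mean()
    if normals is not None:
        loss = loss + tau * (g - normals).norm(dim=-1).mean()
    # Eikonal term: unit-norm gradient at points scattered through space
    # (uniform in [-1, 1]^3 here; the paper also samples Gaussians
    # centered at the data points).
    free = torch.rand_like(points) * 2.0 - 1.0
    _, g_free = value_and_grad(f, free)
    loss = loss + lam * ((g_free.norm(dim=-1) - 1.0) ** 2).mean()
    return loss

# Gradient-based training loop: the paper's analysis is for gradient
# descent; Adam is what is commonly used in practice.
f = ImplicitNet()
opt = torch.optim.Adam(f.parameters(), lr=1e-4)
points = torch.rand(1024, 3) * 2.0 - 1.0   # stand-in for a real point cloud
for step in range(1000):
    opt.zero_grad()
    loss = igr_loss(f, points)
    loss.backward()
    opt.step()
```

The key detail is `create_graph=True` in the gradient computation, so that the eikonal penalty on $\lVert\nabla_x f\rVert$ can itself be backpropagated to the network weights.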