
CVPR 2022 | NeRF-Editing: A Geometry Editing Method for Neural Radiance Fields

2022-06-11 13:52:00 · Alibaba Taobao Technology team official blog


Researchers from Alibaba Taobao Technology and the Institute of Computing Technology, Chinese Academy of Sciences, have proposed a method that allows users to freely edit the geometric content of a neural radiance field. The work, described in the paper "NeRF-Editing: Geometry Editing of Neural Radiance Fields", was presented at the top-tier conference IEEE CVPR 2022.

Image-based 3D scene modeling and rendering is a widely studied topic in computer vision and computer graphics. Traditional methods rely on mesh-based scene representations and optimize the mesh against real images via differentiable rasterization or path tracing. However, mesh optimization supervised only by 2D images is prone to getting stuck in local optima, and the topology of the mesh cannot change during optimization.

Neural radiance fields (NeRF) [1] are currently a powerful tool for this problem. NeRF implicitly models the geometry and appearance of a scene with a fully connected network (MLP) and produces highly realistic images through volume rendering. However, because the representation is implicit, it is difficult for users to interactively edit or modify objects in a scene encoded by a NeRF network. Therefore, building on the novel-view synthesis ability of implicit representations such as neural radiance fields, how to edit such implicit representations has become a new research direction, and geometry editing of NeRF is an effective approach to geometric modeling.
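As background on the rendering step just described, here is a minimal NumPy sketch of the discretized volume-rendering integral from the original NeRF paper [1]. It is illustrative only: `nerf_mlp` is a stand-in for the trained network and is assumed to return a density and an RGB color for every sample point along the ray.

```python
import numpy as np

def render_ray(nerf_mlp, ray_o, ray_d, near=2.0, far=6.0, n_samples=64):
    """Composite one pixel color by alpha-blending samples along a ray."""
    t = np.linspace(near, far, n_samples)                    # sample depths
    pts = ray_o[None, :] + t[:, None] * ray_d[None, :]       # (n_samples, 3) points
    sigma, rgb = nerf_mlp(pts, np.broadcast_to(ray_d, pts.shape))  # densities, colors

    delta = np.diff(t, append=t[-1] + 1e10)                  # spacing between samples
    alpha = 1.0 - np.exp(-sigma * delta)                     # per-segment opacity
    trans = np.cumprod(np.concatenate(([1.0], 1.0 - alpha[:-1] + 1e-10)))  # transmittance
    weights = alpha * trans                                   # contribution of each sample
    return (weights[:, None] * rgb).sum(axis=0)               # composited RGB
```

During training, this composited color is compared against the observed pixel in the input images, which is how the MLP learns the geometry and appearance of the scene.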

To address these issues, researchers from the Institute of Computing Technology, Chinese Academy of Sciences, and Alibaba Taobao Technology proposed a method that allows users to freely edit the geometric content of a neural radiance field. The work was published as the paper NeRF-Editing: Geometry Editing of Neural Radiance Fields at the top-tier conference IEEE CVPR 2022.

1befb6d129953b27f762495e32a2552a.png

Figure 1: Illustration of geometry editing results on a neural radiance field

Technical approach

The goal of this work is a method that lets users define their own deformation constraints and then uses those constraints to edit the neural radiance field so that images of the deformed scene can be rendered. The algorithm pipeline is shown in Figure 2. First, an explicit triangular mesh representation is extracted from the trained neural radiance field. The user specifies control points on this explicit mesh and drags them to perform intuitive deformation editing. From the meshes before and after deformation, the method propagates the vertex deformation of the discrete mesh to the whole continuous space. Rays cast during rendering are bent as they pass through the deformed volume, so the neural radiance field renders results that match the user's editing intent.
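As an illustration of the first step, the sketch below evaluates the trained field's density on a regular grid and extracts an iso-surface with marching cubes (using scikit-image). The function `density_fn`, the bounding box, and the iso-level `threshold` are assumptions made for the example, not the paper's exact surface-extraction setup.

```python
import numpy as np
from skimage.measure import marching_cubes

def extract_mesh(density_fn, bbox_min, bbox_max, resolution=256, threshold=25.0):
    """Evaluate the learned density on a grid and extract an iso-surface mesh."""
    axes = [np.linspace(lo, hi, resolution) for lo, hi in zip(bbox_min, bbox_max)]
    grid = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1)       # (R, R, R, 3)
    density = density_fn(grid.reshape(-1, 3)).reshape(resolution, resolution, resolution)

    verts, faces, _, _ = marching_cubes(density, level=threshold)     # voxel-space mesh
    scale = (np.asarray(bbox_max) - np.asarray(bbox_min)) / (resolution - 1)
    verts = verts * scale + np.asarray(bbox_min)                      # back to world coords
    return verts, faces
```

The extracted triangular mesh is only a proxy for editing; the appearance is still rendered by the neural radiance field itself.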

421b82d5f509a0d82edf5975be86b8a5.png

Figure 2: Overview of the deformation method for neural radiance fields

The core of this work is propagating the vertex deformation of the discrete mesh to the whole continuous space. To achieve this, the researchers use a spatial tetrahedral mesh as a proxy: the deformation of the tetrahedral mesh is solved by minimizing an ARAP (as-rigid-as-possible) [2] deformation energy, the displacement of any point in space is then interpolated from it, and the rendering rays are bent accordingly, as shown in Figure 3. Concretely, the user first captures multi-view pictures of the object, and a neural radiance field network is trained to reconstruct it. Once the explicit representation of the scene has been extracted, the user can edit it with existing mesh editing tools. To propagate the offsets of the mesh vertices into the whole space, the researchers construct a tetrahedral mesh that wraps the triangular mesh; the displacement of the tetrahedral vertices is obtained by requiring the barycentric interpolation of the deformed tetrahedra to reproduce the deformed triangle mesh vertices, while preserving the local rigidity of the tetrahedral mesh. Denote the triangle mesh vertices before and after deformation by $V$ and $V'$, and the tetrahedral mesh vertices before and after deformation by $T$ and $T'$. The deformed tetrahedral vertices $T'$ are obtained by minimizing an energy of the form

$$E(T') = \sum_{v'_i \in V'} \Big\| \sum_j b_{ij}\, t'_j - v'_i \Big\|^2 + w\, E_{\mathrm{ARAP}}(T, T'),$$

where $b_{ij}$ are the barycentric coordinates of surface vertex $v_i$ with respect to its enclosing tetrahedron, and $E_{\mathrm{ARAP}}$ is the ARAP energy that constrains the local rigidity of each tetrahedron. Please refer to the paper for further details.
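To make the ray-bending step concrete, the following sketch shows one way to map a ray sample point from the edited (deformed) space back to the original space, where the unmodified NeRF is queried: locate the deformed tetrahedron containing the point, compute its barycentric coordinates there, and apply the same weights to the corresponding undeformed tetrahedron. The helper `locate_tet` and all variable names are hypothetical; the paper's implementation may differ in details.

```python
import numpy as np

def barycentric_coords(p, tet):
    """Barycentric coordinates of a 3D point p w.r.t. a tetrahedron given as (4, 3)."""
    A = (tet[1:] - tet[0]).T                 # columns are the three edge vectors
    b123 = np.linalg.solve(A, p - tet[0])    # weights of vertices 1..3
    return np.concatenate(([1.0 - b123.sum()], b123))

def bend_sample_point(p, tets, verts_deformed, verts_rest, locate_tet):
    """Map a sample point from the edited (deformed) space back to the rest space,
    where the original, unmodified NeRF is then queried."""
    k = locate_tet(p)                        # index of deformed tet containing p, or None
    if k is None:
        return p                             # outside the proxy mesh: leave unchanged
    idx = tets[k]                            # the tet's four vertex indices
    b = barycentric_coords(p, verts_deformed[idx])   # weights in the deformed tet
    return b @ verts_rest[idx]               # same weights applied to the rest tet
```

Applying this mapping to every sample point along a ray effectively bends the ray, so the trained radiance field renders the deformed scene without being retrained.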

be38e5d20ba2decf6df13e7d529e6e5e.png

Figure 3: Propagating the vertex deformation of the discrete mesh into continuous space

Results

Figure 4 shows editing results on synthetic scenes. The first set of results (first two rows) is a Lego bulldozer model: the user edits the bulldozer to lower its shovel, demonstrating editing of complex synthetic data. The second set of results (last two rows) is a chair model: the user stretches the back and legs of the chair, demonstrating editing of local parts of an object.

60dd77308d61f00d3b7a9f6171e5adbb.png

Figure 4: Editing results on synthetic data

Figure 5 shows editing results on a real scene. The user can edit a giraffe into different poses, or scale local regions of it.

9bfece57e93c801551b820b5f50c2705.png

Figure 5: Editing results on real data

Figure 6 shows more editing results on real scenes. The user can make a toy dinosaur flap its wings, or close the lid of a laptop.

aa551c9f7314bfcf46e135d3e4088659.png

762c755b39f75e17d0cf6fbaabf6f827.png

Figure 6: Editing results on additional real data

Figure 7 shows animated rendering results obtained by interpolating the deformation sequence between the model before and after editing.

7059e7a721a80dbfbf9c3fcdab1f6fb7.png

Figure 7: Rendering results from interpolating between the model before and after editing

To summarize, this work proposes the first user-controllable geometric deformation method for neural radiance field networks. By establishing a correspondence between the explicit mesh representation and the implicit volumetric representation, the method uses mesh deformation techniques to guide the deformation of the neural radiance field, thereby synthesizing novel-view renderings that satisfy the editing requirements.

Paper information

Yu-Jie Yuan#, Yang-Tian Sun#, Yu-Kun Lai, Yuewen Ma, Rongfei Jia, Lin Gao*. NeRF-Editing: Geometry Editing of Neural Radiance Fields, IEEE CVPR 2022.

Technical details

http://geometrylearning.com/NeRFEditing/

References

  1. Ben Mildenhall, Pratul P Srinivasan, Matthew Tancik, Jonathan T Barron, Ravi Ramamoorthi, and Ren Ng, NeRF: Representing scenes as neural radiance fields for view synthesis, ECCV, 2020, 405-421.

  2. Olga Sorkine-Hornung and Marc Alexa, As-rigid-as-possible surface modeling, Symposium on Geometry Processing, 2007.

Project information

Object Drawer (official website: https://objectdrawer.alibaba.com) is a low-cost, high-quality modeling product and the industry's first 3D modeling product based on neural rendering. Building on NeRF neural rendering technology, it creatively proposes a new "explicit + implicit" 3D model paradigm, overcoming the limitations of NeRF-style neural rendering in modeling robustness, detail resolution, inference speed, lighting transfer, and more. It achieves low-cost, high-quality automatic 3D modeling of merchandise, with modeling fidelity far higher than comparable products. You are welcome to try it out and share feedback.

Team introduction

The Commodity 3D Modeling & AI Design team builds on advanced AI technology, new XR hardware, and new engines to create an immersive shopping experience for consumers, letting users fully experience goods and services before purchase while giving merchants more competitive presentations of their products and scenes. The team already offers products such as the 3D modeling tool Tracing Workshop (Object Drawer) and the 3D static and dynamic scene generation tool 3D Magic Pen. Alongside product development in modeling and design, the team has contributed actively to academia: publishing at top conferences such as ICCV, NeurIPS, KDD, and CVPR, releasing the 3D-FRONT dataset to researchers, and winning the first data award at ChinaGraph. To build full-stack R&D capability, the team continues to attract outstanding professionals in 3D/XR engines, vision and graphics algorithms, and related fields to join and move together toward the new era of XR.


Author | Sunyangtian

Editor | Orange King

