High-Quality Geometry and Texture Editing of Neural Radiance Field
dc.contributor.author | Kim, Soongjin | en_US |
dc.contributor.author | Son, Jooeun | en_US |
dc.contributor.author | Ju, Gwangjin | en_US |
dc.contributor.author | Lee, Joo Ho | en_US |
dc.contributor.author | Lee, Seungyong | en_US |
dc.contributor.editor | Chen, Renjie | en_US |
dc.contributor.editor | Ritschel, Tobias | en_US |
dc.contributor.editor | Whiting, Emily | en_US |
dc.date.accessioned | 2024-10-13T18:05:36Z | |
dc.date.available | 2024-10-13T18:05:36Z | |
dc.date.issued | 2024 | |
dc.description.abstract | Recent advances in Neural Radiance Field (NeRF) have demonstrated impressive rendering quality reconstructed from input images. However, the density-based radiance field representation entangles geometry and texture, limiting editability. To address this issue, NeuMesh proposed a mesh-based NeRF editing method supporting deformation and texture editing. Still, it fails to reconstruct and render fine details of the input images, and the dependency between its rendering scheme and the geometry limits editability for target scenes. In this paper, we propose an intermediate scene representation in which a near-surface volume is associated with a guide mesh. Our key idea is to separate a given scene into geometry, a parameterized texture space, and a radiance field. We define a mapping between the world coordinate space and a surface coordinate space, given by the combination of the mesh parameterization and the height from the mesh surface, to efficiently encode the near-surface volume. With a surface-aligned radiance field defined in the near-surface volume, our method generates high-quality rendering results with high-frequency details. Our method also supports various geometry and appearance editing operations while preserving high rendering quality. We demonstrate the performance of our method by comparing it with state-of-the-art methods both qualitatively and quantitatively, and we show its applications, including shape deformation, texture filling, and texture painting. | en_US |
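The abstract's core idea is a mapping from world coordinates to a surface coordinate space built from the guide mesh's UV parameterization plus the height above the surface. The sketch below is not the authors' code; it is a minimal, hypothetical illustration of such a mapping for a single triangle, assuming the query point projects inside that triangle and that per-vertex UVs are given.

```python
# Conceptual sketch (not the paper's implementation): map a world-space point
# near one triangle of a guide mesh to surface coordinates (u, v) plus height h.
# Triangle vertices, UVs, and the query point are made-up example values.
import numpy as np

def world_to_uvh(p, tri_xyz, tri_uv):
    """Project p onto the triangle's plane, express the foot point in
    barycentric coordinates, interpolate per-vertex UVs, and return
    (u, v, h), where h is the signed height above the surface."""
    a, b, c = tri_xyz
    n = np.cross(b - a, c - a)
    n /= np.linalg.norm(n)                 # unit normal of the triangle
    h = np.dot(p - a, n)                   # signed height above the plane
    foot = p - h * n                       # projection onto the plane

    # Barycentric coordinates of the foot point (assumed inside the triangle).
    v0, v1, v2 = b - a, c - a, foot - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    w1 = (d11 * d20 - d01 * d21) / denom
    w2 = (d00 * d21 - d01 * d20) / denom
    w0 = 1.0 - w1 - w2
    uv = w0 * tri_uv[0] + w1 * tri_uv[1] + w2 * tri_uv[2]
    return uv[0], uv[1], h

# Toy example: one triangle in the z=0 plane with axis-aligned UVs.
tri_xyz = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
tri_uv  = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
print(world_to_uvh(np.array([0.25, 0.25, 0.05]), tri_xyz, tri_uv))
# -> (0.25, 0.25, 0.05): the point lies 0.05 above the surface at UV (0.25, 0.25)
```

A full system would locate the nearest triangle on the mesh first; this snippet only illustrates how a near-surface point can be re-expressed in a surface-aligned (u, v, h) frame.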
dc.description.sectionheaders | Neural Radiance Fields and Gaussian Splatting | |
dc.description.seriesinformation | Pacific Graphics Conference Papers and Posters | |
dc.identifier.doi | 10.2312/pg.20241318 | |
dc.identifier.isbn | 978-3-03868-250-9 | |
dc.identifier.pages | 10 pages | |
dc.identifier.uri | https://doi.org/10.2312/pg.20241318 | |
dc.identifier.uri | https://diglib.eg.org/handle/10.2312/pg20241318 | |
dc.publisher | The Eurographics Association | en_US |
dc.rights | Attribution 4.0 International License | |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | |
dc.subject | CCS Concepts: Computing methodologies->Image-based rendering | |
dc.subject | Computing methodologies | |
dc.subject | Image-based rendering |
dc.title | High-Quality Geometry and Texture Editing of Neural Radiance Field | en_US |