Abstract: We investigate the use of neural fields for modeling diverse mesoscale structures, such as fur, fabric, and grass. Instead of using
classical graphics primitives to model the structure, we propose to employ a versatile volumetric primitive represented by a
neural reflectance field (NeRF-Tex), which jointly models the geometry of the material and its response to lighting. The NeRF-Tex
primitive can be instantiated over a base mesh to “texture” it with the desired meso- and microscale appearance. We condition
the reflectance field on user-defined parameters that control the appearance. A single NeRF texture thus captures an entire space
of reflectance fields rather than one specific structure. This increases the gamut of appearances that can be modeled and provides
a solution for combating repetitive texturing artifacts. We also demonstrate that NeRF textures naturally facilitate continuous
level-of-detail rendering. Our approach unites the versatility and modeling power of neural networks with the artistic control
needed for precise modeling of virtual scenes. While all our training data is currently synthetic, our work provides a recipe that
can be further extended to extract complex, hard-to-model appearances from real images.
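To make the core idea concrete, below is a minimal sketch (not the authors' implementation) of a conditioned reflectance-field query: a single MLP maps a 3D sample point, a view direction, and a user-defined appearance vector to a volume density and an RGB response. The function names (`query_nerf_tex`, `init_mlp`), layer sizes, encoding frequencies, and the meaning of the `appearance` vector are all illustrative assumptions.

```python
# Sketch of a parameter-conditioned neural reflectance field query (assumed design).
import jax
import jax.numpy as jnp


def positional_encoding(x, num_freqs=6):
    """Standard NeRF-style sinusoidal encoding of the input coordinates."""
    freqs = 2.0 ** jnp.arange(num_freqs)
    angles = x[..., None] * freqs                       # (..., dim, num_freqs)
    enc = jnp.concatenate([jnp.sin(angles), jnp.cos(angles)], axis=-1)
    return enc.reshape(*x.shape[:-1], -1)


def init_mlp(key, sizes):
    """Initialize a plain fully connected network with the given layer sizes."""
    params = []
    for fan_in, fan_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (fan_in, fan_out)) * jnp.sqrt(2.0 / fan_in)
        params.append((w, jnp.zeros(fan_out)))
    return params


def mlp(params, h):
    for w, b in params[:-1]:
        h = jax.nn.relu(h @ w + b)
    w, b = params[-1]
    return h @ w + b


def query_nerf_tex(params, position, view_dir, appearance):
    """Evaluate the conditioned reflectance field at one sample point.

    `appearance` is a hypothetical user-controlled vector (e.g. fiber length,
    curliness) concatenated with the encoded inputs so that one network spans
    a whole family of reflectance fields rather than a single structure.
    """
    h = jnp.concatenate([
        positional_encoding(position),
        positional_encoding(view_dir, num_freqs=4),
        appearance,
    ])
    out = mlp(params, h)
    density = jax.nn.softplus(out[0])                   # non-negative volume density
    rgb = jax.nn.sigmoid(out[1:4])                      # reflected radiance in [0, 1]
    return density, rgb


# Example usage with assumed dimensions: 36 (position enc) + 24 (view enc) + 8 (appearance).
key = jax.random.PRNGKey(0)
params = init_mlp(key, [36 + 24 + 8, 128, 128, 4])
density, rgb = query_nerf_tex(params, jnp.zeros(3), jnp.array([0.0, 0.0, 1.0]), jnp.zeros(8))
```

In this reading, instancing the primitive over a base mesh amounts to evaluating such a field inside a thin volumetric shell around the surface, with the appearance vector varied per instance to avoid repetitive texturing artifacts.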