Scraping Textures from Natural Images for Synthesis and Editing

1UC Merced 2NVIDIA 3UC San Diego 4UC Berkeley

Abstract

Existing texture synthesis methods focus on generating large texture images given a small texture sample. However, such samples are typically assumed to be highly curated: rectangular, clean, and stationary. This paper aims to scrape textures directly from natural images of everyday objects and scenes, build texture models, and employ them for tasks such as texture synthesis and texture editing.

The key idea is to jointly learn image grouping and texture modeling. The image grouping module discovers clean texture segments, each of which is represented by the texture modeling module as a texton code and a parametric sine wave. By requiring the model to reconstruct the input image from the texton codes and sine waves, it can be trained in a self-supervised manner on a set of cluttered natural images, without requiring any annotations or carefully cropped texture images.
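To make the sine-wave component of this representation concrete, the sketch below renders a 2D parametric sine wave from a frequency, orientation, and phase. This is purely illustrative: the function name, the parameterization, and the coordinate normalization are assumptions for exposition, not the paper's actual implementation, which additionally combines such waves with learned texton codes in a neural decoder.

```python
import numpy as np

def sine_wave_map(h, w, freq, theta, phase):
    """Render a 2D parametric sine wave over an h x w grid.

    freq  -- spatial frequency in cycles across the (normalized) image
    theta -- wave orientation in radians
    phase -- phase offset in radians

    Hypothetical illustration of a parametric sine wave; the paper's
    exact parameterization may differ.
    """
    # Normalized pixel coordinates in [0, 1).
    ys, xs = np.mgrid[0:h, 0:w] / float(max(h, w))
    # Project each pixel onto the wave's direction vector.
    proj = xs * np.cos(theta) + ys * np.sin(theta)
    return np.sin(2.0 * np.pi * freq * proj + phase)

# Example: a 64x64 wave with 8 cycles, oriented at 45 degrees.
wave = sine_wave_map(64, 64, freq=8.0, theta=np.pi / 4, phase=0.0)
```

In a reconstruction-style objective, such a wave would be decoded together with a texton code into a texture patch and compared against the corresponding input segment.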

We show that the learned texture features faithfully depict textures in natural images and can be readily applied to a variety of tasks, including texture synthesis, editing, and swapping, outperforming existing state-of-the-art methods.

Overview Video

Texture Synthesis Hugging Face Demo

Texture Editing Hugging Face Demo

BibTeX

@inproceedings{li2022texture,
  author    = {Li, Xueting and Wang, Xiaolong and Yang, Ming-Hsuan and Efros, Alexei and Liu, Sifei},
  title     = {Scraping Textures from Natural Images for Synthesis and Editing},
  booktitle = {European Conference on Computer Vision (ECCV)},
  year      = {2022},
}