
Example-Based Feature Painting on Textures (SIGGRAPH Asia 2025)

This is the official implementation of Example-Based Feature Painting on Textures.

Teaser figure

🚀 Installation

We provide a pyproject.toml file for an easy environment setup using uv. To get started, simply clone the repository and run uv sync to create the virtual environment.

cd FeaturePainting
uv sync

To add both parts of this project to the PYTHONPATH, set the uv environment file:

export UV_ENV_FILE=.uvenv

📂 Data

We provide one of our captures as a template under datasets/blueb.

To run our method on MVTecAD, please download it from here and link the mvtec_anomaly_detection directory as a subfolder of datasets.
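As a sketch (the source path below is a placeholder for wherever you extracted the archive), the symlink can be created like this:

```shell
# Placeholder path: replace with the directory where you extracted MVTecAD
MVTEC_SRC="$HOME/data/mvtec_anomaly_detection"

# Create the datasets folder if needed and link the dataset into it
mkdir -p datasets
ln -sfn "$MVTEC_SRC" datasets/mvtec_anomaly_detection
```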

To try the method on your own textures, just place the image folder in datasets.
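For example (the folder and image names here are hypothetical), the expected layout is simply a folder of images inside datasets:

```shell
# Hypothetical folder name; any image folder works
mkdir -p datasets/my_texture
# Expected layout:
# datasets/
# └── my_texture/
#     ├── img_000.png
#     ├── img_001.png
#     └── ...
```

The folder name is then what you pass as <folder_name> to the scripts below.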

📊 How to run

The repository is structured in two parts. The first one (anomaly_segmentation) contains the logic for anomaly localization, contrastive learning, and semantic segmentation. The second one (synthesis) contains the scripts for training the diffusion model, arbitrarily-large texture generation, and feature transfer.

🔍 Anomaly/Feature Segmentation

The first step is to train a VAE on the images and extract the residuals.

# For an image folder dataset run
uv run anomaly_segmentation/train_vae.py simple <folder_name>
# For MVTec use
uv run anomaly_segmentation/train_vae.py mvtec <object_name>

After extracting residual maps, we use FCA to get anomaly scores for all images.

# For an image folder dataset
uv run anomaly_segmentation/anomaly_segmentation.py dataset=simple dataset.name=blueb features=cached features.fe_tag=VAE save_alpha_tiff=True track=fca_va
# For MVTec
uv run anomaly_segmentation/dataset_inference.py dataset=mvtec dataset.object_name=<object_name> features=cached features.fe_tag=VAE save_alpha_tiff=True track=vae_save

We then create the dataset of positive and negative pairs of connected regions, to be used in contrastive learning.

uv run anomaly_segmentation/create_cl_dataset.py dataset=simple dataset.name=blueb anomaly_out=<Name of output dir from previous step> n_clusters=3
# Similarly adapt for MVTec
uv run anomaly_segmentation/create_cl_dataset.py dataset=mvtec dataset.object_name=<object_name> anomaly_out=<Name of output dir from previous step> n_clusters=3

Finally, we train a feature embedding network using contrastive learning and cluster the pixels using KMeans on the new features.

uv run anomaly_segmentation/train_cl_dataset.py dataset=simple dataset.name=blueb n_clusters=3 compute_metrics=False

The additional scripts visualize_groups.py and multi_segmentation.py have a similar interface and can be used for debugging purposes.

🖼️ Synthesis

To improve the convergence of the network training for each material individually, we recommend pretraining the model on DTD. This only has to be done once, and the saved weights can then be reused as a starting point for training each model.

uv run synthesis/train.py --outdir=synthesis-runs --data-class=training.dtd_dataset.DTDataset --data=datasets/DTD/images \
--cond=1 --precond=spe --arch=spe --batch=32 --duration=2.2 --snap=5 --dump=10 --augment=0 --ema=0.02 --workers=4

To run on a particular texture from an image folder, adapt the command to use the right data class, path, and optionally the checkpoint of the DTD pretraining.

uv run synthesis/train.py --outdir=synthesis-runs --data-class=training.gen_folder_dataset.GenFolderDataset \
--object-name=single --data=datasets/<folder_name> --cond=1 --precond=spe --arch=spe --batch=32 --duration=1.1 \
--snap=5 --dump=10 --augment=0 --ema=0.02 --spc-labels=outputs/<anomaly-segmentation-folder>/tiff_labels \
--transfer=synthesis-runs/<DTD-run-folder>/<snapshot>.pkl --workers=8

For MVTecAD, modify the command above, setting --data-class=training.gen_mvtec_dataset.GenMvtecDataset --object-name=<object_name> --data=datasets/mvtec_anomaly_detection.

⚡ Gradio Demo

We include a Gradio demo in the repository, which can be launched using

uv run synthesis/gradio_ui/main.py

📚 Citation

Should you find our work useful in your research, please cite:

@article{ardelean2025examplebased,
  author = {Ardelean, Andrei-Timotei and Weyrich, Tim},
  title = {Example-Based Feature Painting on Textures},
  journal = {ACM Transactions on Graphics (Proc. SIGGRAPH Asia)},
  year = 2025,
  month = dec,
  location = {Hong Kong},
  publisher = {ACM},
  address = {New York, NY, USA},
}

Acknowledgements

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 956585 (PRIME ITN).

📄 License

Please see our LICENSE.
