Inpaint4Drag: Repurposing Inpainting Models
for Drag-Based Image Editing via Bidirectional Warping

ICCV 2025

Jingyi Lu, Kai Han
Visual AI Lab, The University of Hong Kong

Inpaint4Drag is a novel framework that decomposes drag-based editing into pixel-space bidirectional warping followed by image inpainting. Our method achieves real-time warping previews (0.01s) and efficient inpainting (0.3s) at 512×512 resolution, significantly improving the interactive editing experience while serving as an adapter for any inpainting model.

Real-time Interactive Editing

Our bidirectional warping algorithm provides immediate visual feedback with real-time previews (0.01s), followed by efficient inpainting (0.3s). This is 14× faster than FastDrag and nearly 600× faster than DragDiffusion.

Try our demo

Physics-Inspired Deformation Framework

Our approach treats image regions as elastic materials, enabling natural transformations through user-specified control points and region masks.
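To make the control-point idea concrete, here is a minimal numpy sketch of a dense displacement field driven by drag handles. It uses simple inverse-distance weighting as an elastic-like stand-in; the function name, weighting scheme, and `eps` regularizer are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def elastic_displacement(shape, handles, targets, eps=1e-6):
    """Illustrative dense displacement field from drag control points.

    Each pixel's displacement is a blend of the handle -> target drags,
    with weights falling off with squared distance to each handle
    (inverse-distance weighting; an assumption, not the paper's method).
    """
    h, w = shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    disp = np.zeros((h, w, 2))   # per-pixel (dy, dx)
    wsum = np.zeros((h, w))
    for (hy, hx), (ty, tx) in zip(handles, targets):
        d2 = (yy - hy) ** 2 + (xx - hx) ** 2 + eps  # squared distance to handle
        wgt = 1.0 / d2
        disp[..., 0] += wgt * (ty - hy)
        disp[..., 1] += wgt * (tx - hx)
        wsum += wgt
    return disp / wsum[..., None]
```

With a single handle, every pixel inherits that handle's drag vector; with several handles, nearby pixels follow the closest handle most strongly, giving the smooth, elastic-looking falloff the framework is after.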

Bidirectional Warping Algorithm

Our core innovation lies in decomposing drag-based editing into pixel-space bidirectional warping and standard image inpainting. We establish dense pixel correspondences in two passes: forward warping of the region contours determines the deformed shape, and backward mapping then fills that shape completely, ensuring smooth deformation without holes or artifacts.
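The two-pass idea can be sketched in a few lines of numpy. This toy version handles only a rigid translation of the masked region (the paper supports general deformations): a forward pass finds where the region lands, a backward pass samples a source pixel for every covered target pixel so the moved region has no holes, and the vacated pixels are flagged for the inpainting model. The function name and interface are assumptions for illustration.

```python
import numpy as np

def drag_warp(image, mask, shift):
    """Toy bidirectional warp for a pure translation (illustrative sketch).

    Forward pass: warp the masked pixels by `shift` to find the target extent.
    Backward pass: map every target pixel back to a source pixel (dense,
    hole-free coverage). Returns the warped image and the mask of vacated
    pixels that an inpainting model should fill.
    """
    h, w = mask.shape
    out = image.copy()

    # Forward pass: where does each masked source pixel land?
    ys, xs = np.nonzero(mask)
    ty, tx = ys + shift[0], xs + shift[1]
    valid = (ty >= 0) & (ty < h) & (tx >= 0) & (tx < w)
    target = np.zeros_like(mask)
    target[ty[valid], tx[valid]] = True   # extent of the warped region

    # Backward pass: sample a source pixel for every covered target pixel.
    tys, txs = np.nonzero(target)
    out[tys, txs] = image[tys - shift[0], txs - shift[1]]

    # Pixels the region vacated (in the old mask, not re-covered) need inpainting.
    inpaint_mask = mask & ~target
    return out, inpaint_mask
```

The returned `inpaint_mask` is exactly what gets handed to an off-the-shelf inpainting model, which is what lets the framework act as an adapter rather than requiring model-specific training.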

Read the paper

BibTeX

@inproceedings{lu2025inpaint4drag,
    author    = {Jingyi Lu and Kai Han},
    title     = {Inpaint4Drag: Repurposing Inpainting Models for Drag-Based Image Editing via Bidirectional Warping},
    booktitle = {International Conference on Computer Vision (ICCV)},
    year      = {2025},
}