Real-Time Per-Garment Virtual Try-On with Temporal Consistency for Loose-Fitting Garments

dc.contributor.author: Wu, Zaiqiang
dc.contributor.author: Shen, I-Chao
dc.contributor.author: Igarashi, Takeo
dc.contributor.editor: Christie, Marc
dc.contributor.editor: Pietroni, Nico
dc.contributor.editor: Wang, Yu-Shuen
dc.date.accessioned: 2025-10-07T05:03:36Z
dc.date.available: 2025-10-07T05:03:36Z
dc.date.issued: 2025
dc.description.abstract: Per-garment virtual try-on methods collect garment-specific datasets and train networks tailored to each garment to achieve superior results. However, these approaches often struggle with loose-fitting garments due to two key limitations: (1) They rely on human body semantic maps to align garments with the body, but these maps become unreliable when body contours are obscured by loose-fitting garments, resulting in degraded outcomes; (2) They train garment synthesis networks on a per-frame basis without utilizing temporal information, leading to noticeable jittering artifacts. To address the first limitation, we propose a two-stage approach for robust semantic map estimation. First, we extract a garment-invariant representation from the raw input image. This representation is then passed through an auxiliary network to estimate the semantic map. This enhances the robustness of semantic map estimation under loose-fitting garments during garment-specific dataset generation. To address the second limitation, we introduce a recurrent garment synthesis framework that incorporates temporal dependencies to improve frame-to-frame coherence while maintaining real-time performance. We conducted qualitative and quantitative evaluations to demonstrate that our method outperforms existing approaches in both image quality and temporal coherence. Ablation studies further validate the effectiveness of the garment-invariant representation and the recurrent synthesis framework.
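The abstract's second contribution, recurrent garment synthesis, feeds each frame's output back into the next frame's computation so the sequence stays temporally coherent. The paper's actual framework is a learned recurrent network; the following is only a minimal toy sketch of the recurrence pattern, with a fixed blend (`alpha`) standing in for the learned temporal model. All names here (`synthesize`, `run_sequence`, `alpha`) are illustrative inventions, not the authors' API.

```python
# Toy sketch of temporally dependent (recurrent) synthesis.
# NOTE: this is NOT the paper's method; a fixed exponential blend
# replaces the learned recurrent network purely to show how the
# previous output is threaded into the next frame.

def synthesize(frame_input, prev_output, alpha=0.7):
    """Blend the current per-frame estimate with the previous output.

    alpha weights the current frame; (1 - alpha) carries over the
    previous output, damping frame-to-frame jitter.
    """
    if prev_output is None:  # first frame: no history to reuse
        return frame_input
    return alpha * frame_input + (1 - alpha) * prev_output

def run_sequence(inputs):
    """Process a frame sequence, threading each output into the next step."""
    outputs, prev = [], None
    for x in inputs:
        prev = synthesize(x, prev)
        outputs.append(prev)
    return outputs
```

With a jittery scalar input such as `[1.0, 0.0, 1.0, 0.0]`, the outputs vary less from frame to frame than the inputs do, which is the coherence effect the abstract attributes to the recurrent framework; the per-frame cost stays constant, consistent with a real-time setting.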
dc.description.number: 7
dc.description.sectionheaders: Digital Clothing
dc.description.seriesinformation: Computer Graphics Forum
dc.description.volume: 44
dc.identifier.doi: 10.1111/cgf.70272
dc.identifier.issn: 1467-8659
dc.identifier.pages: 12 pages
dc.identifier.uri: https://doi.org/10.1111/cgf.70272
dc.identifier.uri: https://diglib.eg.org/handle/10.1111/cgf70272
dc.publisher: The Eurographics Association and John Wiley & Sons Ltd.
dc.rights: CC BY-NC Attribution-NonCommercial 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by-nc/4.0/
dc.subject: CCS Concepts: Computing methodologies → Image processing; Image-based rendering; Human-centered computing → Mixed / augmented reality
dc.subject: Computing methodologies → Image processing
dc.subject: Image-based rendering
dc.subject: Human-centered computing → Mixed / augmented reality
dc.title: Real-Time Per-Garment Virtual Try-On with Temporal Consistency for Loose-Fitting Garments
Files:
- cgf70272.pdf (7.5 MB, Adobe Portable Document Format)
- paper1005_mm1.mp4 (22.26 MB, Video MP4)