August 4, 2021
Knitting Virtual Yarn
Visual computing scientists at IST Austria publish a new yarn cloth animation method at the prestigious SIGGRAPH conference
Animation artists who want to level up the texture of their characters’ yarn clothing can now draw on a powerful algorithm from the visual computing research done at the Institute of Science and Technology Austria (IST Austria). Georg Sperl from the Wojtan group presents the group’s latest results on efficiently animating intricate yarn patterns at the SIGGRAPH conference.
Who does not remember the clumsy cowboy Woody or the strong ice princess Elsa? As the animation film industry gradually lets go of single spectacular explosions, it dedicates more and more time to the essence of a story: its characters. Their hair, skin, and clothing in particular have become objects of intense scrutiny. “When Toy Story started, protagonists were almost like rigid objects,” remembers Chris Wojtan, computer graphics professor at the Institute of Science and Technology Austria (IST Austria). “Now, it’s possible to simulate even individual threads in a sweater, but it’s expensive to do so. That’s where our algorithm strikes.” The new method accounts for the local mechanics of the yarn, its tightening and rearranging, while maintaining the efficiency required for real-time simulation.
Compared to a naïve animation, the new method captures tightening knit loops and the change in transparency of the cloth. © IST Austria.
Stretching the Power of Yarn Cloth Animation
Calculating the behavior of each individual thread is computationally very costly. Last year, Georg Sperl, lead author of the new paper, introduced an efficient mesh-based simulation that captures the behavior of knitted fabric. It splits the cloth surface into many little deforming and moving triangles. This grid then folds and curls according to the laws of physics.
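To make the idea of “deforming triangles” concrete, the following minimal sketch (illustrative only, not the authors’ code; the function name and array shapes are assumptions) measures how one cloth triangle has been deformed by comparing its flat rest shape with its current 3D shape:

```python
import numpy as np

def triangle_deformation(rest_uv, world_xyz):
    """Illustrative only: measure how one cloth triangle is deformed.

    rest_uv   : (3, 2) array, the triangle's corners in the flat rest pattern
    world_xyz : (3, 3) array, the same corners after the mesh has moved

    Returns the 3x2 deformation gradient F mapping rest-space edge
    vectors to world-space edge vectors (F = d @ D^-1).
    """
    # Edge vectors relative to the first corner, in rest and deformed state.
    D = np.column_stack([rest_uv[1] - rest_uv[0], rest_uv[2] - rest_uv[0]])           # 2x2
    d = np.column_stack([world_xyz[1] - world_xyz[0], world_xyz[2] - world_xyz[0]])   # 3x2
    return d @ np.linalg.inv(D)

# Example: a unit triangle stretched by 20% along its first axis.
rest = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
deformed = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0], [0.0, 1.0, 0.0]])
print(triangle_deformation(rest, deformed))
```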
But naïvely projecting an image of a flat yarn pattern onto this grid does not weave the magic. “Then the local yarn physics is missing,” Sperl says of methods used in acclaimed animated movies. Only by including the local behavior of the yarn loops can one realistically reproduce the tightening and rearranging of threads in knitted fabrics. “Our method is a better approximation of what’s truly happening.”
In the first step, the new method computes how a tiny patch of the repeating yarn pattern reacts to different deformations. This is like collecting a library of possible local movements and positions of the threads for a certain knitting pattern. Next, the algorithm tiles these pre-computed patches onto the grid of the mesh-based simulation. When a twisted sleeve is animated, each deformed triangle serves as an indicator, telling the computer which yarn deformation from the precomputed library to put there.
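In spirit, the pipeline resembles the toy sketch below (heavily simplified and purely illustrative: the function names, the three-parameter strain description, and the nearest-neighbour table are assumptions for this example; the published method works with full yarn geometry and smooth interpolation):

```python
import numpy as np

# --- Offline: build a small "library" of deformed yarn patches. -----------
# Each entry maps a strain sample (stretch_u, stretch_v, shear) to the 3D
# control points of one periodic patch of the knitting pattern.
# simulate_yarn_patch is a stand-in for a detailed yarn-level simulation.
def simulate_yarn_patch(stretch_u, stretch_v, shear):
    # Placeholder: in reality this would be an expensive physics solve.
    base = np.random.default_rng(0).random((64, 3))   # fake loop geometry
    return base * [stretch_u, stretch_v, 1.0] + [0.0, shear, 0.0]

strain_samples = [(su, sv, sh)
                  for su in (0.8, 1.0, 1.2)
                  for sv in (0.8, 1.0, 1.2)
                  for sh in (-0.2, 0.0, 0.2)]
library = {s: simulate_yarn_patch(*s) for s in strain_samples}

# --- Runtime: each deformed triangle picks the best-matching patch. -------
def lookup_patch(triangle_strain):
    """Nearest sample in the library; the paper interpolates smoothly instead."""
    key = min(library, key=lambda s: np.linalg.norm(np.subtract(s, triangle_strain)))
    return library[key]

patch = lookup_patch((1.15, 0.9, 0.05))   # a stretched, slightly sheared triangle
print(patch.shape)                        # (64, 3) yarn control points to place there
```

The expensive yarn-level physics thus happens once, offline; at runtime only a cheap per-triangle lookup remains.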
With only milliseconds of animation time per frame, even full moving garments with millions of yarn loops can be displayed in real time. © IST Austria.
Solving the Buckling Challenge Efficiently
One considerable challenge the scientists overcame was buckling. Since the algorithm averages two yarn deformations to determine intermediate states, it is important that the two possible arrangements merge smoothly. “In certain configurations though, yarn may pop one way or the other,” explains Sperl. “Both solutions are physically viable, but averaging between them gives unrealistic results.” By constraining strong deformations, the scientists reduced buckling in a user-controllable way.
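A one-dimensional caricature of the issue (illustrative only, not the paper’s formulation; the helper below is hypothetical): when a compressed piece of yarn can buckle either up or down, both shapes are valid, but their average is a flat shape the yarn never takes. Limiting how far the deformation fed into the lookup may go is one simple, user-tunable way to stay out of that ambiguous regime:

```python
import numpy as np

# Two physically plausible shapes of a compressed yarn segment: it can
# buckle upward or downward by the same amount.
buckled_up   = np.array([0.0, +0.3, 0.0])   # heights of three sample points
buckled_down = np.array([0.0, -0.3, 0.0])

# Naive averaging of the two valid states yields a flat segment that the
# real yarn never assumes -- the "unrealistic result" mentioned above.
print(0.5 * (buckled_up + buckled_down))    # -> [0. 0. 0.]

# A simple user-controllable remedy in the same spirit as the paper's
# constraint: clamp the compressive strain before it is used for the
# patch lookup, so the ambiguous regime is never entered.
def clamp_strain(strain, lower=0.9, upper=1.5):
    """Hypothetical helper: keep per-triangle stretch in a safe range."""
    return float(np.clip(strain, lower, upper))

print(clamp_strain(0.6))   # strong compression is limited to 0.9
```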
Another challenge appeared because the animation generated so much detail. So-called aliasing shows up as flickering in the texture when there is too much detail per pixel. Future work will circumvent this by tailoring the level of animated detail to the available resolution. Another topic for future work is irregular patches such as seams and edges, which were not treated in this work.
The algorithm’s power also stems from being parallelizable, which means that the movement of many yarn segments can be looked up simultaneously on the graphics card (GPU). This allows for real-time animation, even for millions of yarn stitches. “Georg’s method is a way to do super high-fidelity simulation down to the individual thread level,” says his supervisor Wojtan, “and it simulates yarn-level detail hundreds or thousands of times faster.”
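Because each triangle’s lookup is independent of all the others, the work maps naturally onto the GPU. The sketch below shows the same idea as one batched NumPy operation on the CPU (toy numbers and an assumed nearest-neighbour lookup, not the actual GPU implementation):

```python
import numpy as np

# Strain samples of the precomputed library and per-triangle strains of a
# deformed garment (toy data; shapes: (num_samples, 3) and (num_triangles, 3)).
library_strains  = np.random.default_rng(1).random((27, 3))
triangle_strains = np.random.default_rng(2).random((100_000, 3))

# One batched nearest-neighbour query for ALL triangles at once: each row of
# the distance matrix is independent of the others, which is exactly what
# makes the lookup easy to parallelize on a GPU.
distances = np.linalg.norm(
    triangle_strains[:, None, :] - library_strains[None, :, :], axis=-1)
best_patch_index = distances.argmin(axis=1)   # (num_triangles,)
print(best_patch_index.shape)
```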
Real-time interactive simulation of realistic knitted garments becomes possible. Seams and edges of the regular pattern are left to future work. © IST Austria.
Pursuing a Seamless Simulation of Nature
Apart from animated movies, the textile industry offers another application. There is a constant search for better fabric simulation, and even digital fashion fantasies are gradually becoming more grounded. One day, shoppers may enter an online store with a spotless avatar of themselves and try on virtual clothing simulated in real time. The IST Austria scientists, however, are inspired by more fundamental questions.
“We looked at tiny things – sand grains in sand piles, yarn threads in cloths – that influence the large-scale behavior,” says Sperl, describing the start of his PhD project. Understanding and connecting phenomena across different scales is an inspiring challenge of today’s science. There are multiple theories focusing on single aspects, but how to knit them together remains a major interdisciplinary quest. “What is the right way to simulate at a certain scale?” Wojtan asks himself and his group. “How can we fit all of nature into a computer, and simulate it faster than real-time, such that we can learn from it?” The new algorithm is the group’s contribution to building a framework in which small-scale and large-scale simulations connect seamlessly.
Publication
Georg Sperl, Rahul Narain, and Chris Wojtan. 2021. Mechanics-aware Deformation of Yarn Pattern Geometry. ACM Transactions on Graphics 40(4) (SIGGRAPH 2021). DOI: 10.1145/3450626.3459816