Retiming Smoke Simulation Using Machine Learning
Main Author:
Format: Others
Published: BYU ScholarsArchive, 2020
Subjects:
Online Access: https://scholarsarchive.byu.edu/etd/8106
https://scholarsarchive.byu.edu/cgi/viewcontent.cgi?article=9106&context=etd
Summary: Art-directability is a crucial aspect of creating aesthetically pleasing visual effects that help tell stories. A particularly common method of art direction is the retiming of a simulation. Unfortunately, retiming an existing simulation sequence while preserving the desired shapes is an ill-defined problem. Naively interpolating values between frames leads to visual artifacts such as choppy frames or jittering intensities. Because it is difficult to formulate a proper interpolation method analytically, we elect to use a machine learning approach to approximate this function. Our model is based on the ODE-net structure and reproduces a set of desired time samples (in our case equivalent to time steps) that achieves the desired new sequence speed, based on training from frames in the original sequence. The flexibility of the updated sequence's duration provided by the time-samples input makes this a visually effective and intuitively directable way to retime a simulation.
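To make the "time samples" idea concrete, the sketch below shows the naive linear-interpolation baseline the summary criticizes, applied to scalar per-frame values: a set of new time samples on the original time axis fixes the retimed sequence's speed and duration, and values are blended between neighboring frames. This is an illustration only, not the thesis's ODE-net model; the helper names `retime_samples` and `naive_retime` are hypothetical.

```python
import math

def retime_samples(n_orig, n_new):
    # Hypothetical helper: evenly place n_new time samples on the
    # original time axis [0, n_orig - 1]. Choosing n_new > n_orig
    # slows the sequence down; n_new < n_orig speeds it up.
    if n_new == 1:
        return [0.0]
    return [i * (n_orig - 1) / (n_new - 1) for i in range(n_new)]

def naive_retime(frames, n_new):
    # Naive baseline: linearly interpolate scalar frame values at the
    # new time samples. The summary notes this kind of interpolation
    # produces choppy frames or jittering intensities on real fields.
    samples = retime_samples(len(frames), n_new)
    out = []
    for t in samples:
        i = min(int(math.floor(t)), len(frames) - 2)
        a = t - i  # fractional position between frames i and i+1
        out.append((1 - a) * frames[i] + a * frames[i + 1])
    return out

# Slow a 4-frame sequence to 7 frames (roughly half speed).
orig = [0.0, 1.0, 4.0, 9.0]
slow = naive_retime(orig, 7)
```

The learned model in the summary replaces the per-sample blend with an ODE-net-style integration trained on the original frames, but the directable control stays the same: the time-sample list alone sets the new speed.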