Definition
A technique for training generative models as continuous normalizing flows, potentially offering faster training and sampling than diffusion.
Detailed Explanation
A newer generative modeling technique, closely related to diffusion, that learns a time-dependent velocity field transforming a simple prior distribution (e.g., a standard Gaussian) into the target data distribution via a continuous normalizing flow. Rather than simulating the flow during training, the model is regressed directly onto per-sample target velocities along a chosen probability path between prior and data, which can make training simpler and sampling faster than with diffusion models.
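A minimal sketch of how such training targets can be constructed, assuming the common linear probability path between paired prior and data samples (the distributions, names, and 1-D setup here are illustrative, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D example: the simple prior is a standard normal,
# the "data" distribution is N(4, 0.5^2).
x0 = rng.standard_normal(1024)               # samples from the prior
x1 = 4.0 + 0.5 * rng.standard_normal(1024)   # samples from the data

# Linear probability path: interpolate each prior/data pair at a random time t.
t = rng.uniform(size=1024)
xt = (1.0 - t) * x0 + t * x1                 # point on the path at time t
ut = x1 - x0                                 # conditional target velocity along the path

# A velocity model v_theta(x, t) would be trained by squared-error
# regression onto these targets; no flow simulation is needed.
def fm_loss(v_pred, target):
    return float(np.mean((v_pred - target) ** 2))

# Sanity check: the exact conditional velocity gives zero loss.
print(fm_loss(ut, ut))  # 0.0
```

The key point the sketch illustrates is that the training signal (`xt`, `t`, `ut`) is computed in closed form per sample, so each training step is an ordinary regression step.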
Use Cases
Training generative models for images, audio, and video; faster sampling than diffusion models via few-step ODE integration; an alternative approach to generative modeling with theoretical benefits such as simulation-free training.
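Sampling from a trained model amounts to integrating the learned ODE from the prior at t=0 to the data distribution at t=1, often with only a handful of solver steps. A toy sketch, using a hand-written constant velocity field in place of a trained network (the constant-drift field and all names here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy closed-form velocity field: a constant drift MU transports the
# prior N(0, 1) at t=0 to N(MU, 1) at t=1. A trained model v_theta(x, t)
# would replace this function.
MU = 4.0
def velocity(x, t):
    return np.full_like(x, MU)

# Sampling = integrating dx/dt = velocity(x, t) from t=0 to t=1.
# A few Euler steps can suffice, versus many steps for diffusion samplers.
def euler_sample(n_samples=2048, n_steps=8):
    x = rng.standard_normal(n_samples)       # draw from the simple prior
    dt = 1.0 / n_steps
    for i in range(n_steps):
        x = x + dt * velocity(x, i * dt)     # one explicit Euler step
    return x

samples = euler_sample()
print(round(float(samples.mean()), 1))  # close to MU = 4.0
```

With straighter learned paths, fewer integration steps are needed, which is the source of the potential sampling speedup over diffusion.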