Accelerating Diffusion Sampling with Optimized Time Steps

S Xue, Z Liu, F Chen, S Zhang, T Hu… - Proceedings of the …, 2024 - openaccess.thecvf.com
Diffusion probabilistic models (DPMs) have shown remarkable performance in high-resolution image synthesis, but their sampling efficiency still leaves much to be desired due to the …
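
The snippet is cut off before the paper's method; as background only, the sketch below shows how a diffusion sampler's time steps can be spaced non-uniformly over a small sampling budget. The quadratic spacing is an illustrative assumption, not the optimized schedule proposed in the paper.

    import numpy as np

    def quadratic_timestep_schedule(num_train_steps=1000, num_sample_steps=10):
        # Spread a small sampling budget over the training timesteps,
        # concentrating steps near t = 0 via quadratic spacing.
        # Illustrative only; not the paper's optimized schedule.
        u = np.linspace(0.0, 1.0, num_sample_steps + 1)
        t = np.rint((u ** 2) * (num_train_steps - 1)).astype(int)
        return t[::-1]  # descending order: sample from t = T-1 down to t = 0

    print(quadratic_timestep_schedule())  # 11 timesteps, from 999 down to 0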

Lightweight diffusion models: a survey

W Song, W Ma, M Zhang, Y Zhang, X Zhao - Artificial Intelligence Review, 2024 - Springer
Diffusion models (DMs) are a promising class of generative models that have achieved better results than traditional methods in many fields. DMs consist of two main processes …
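
The two processes the snippet refers to are the forward (noising) and reverse (denoising) processes. As a reminder of the standard DDPM formulation that such surveys build on, a minimal sketch of the closed-form forward process, assuming a linear beta schedule:

    import numpy as np

    def forward_diffuse(x0, t, betas):
        # Standard DDPM forward (noising) process in closed form:
        #   x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps,
        # where alpha_bar_t is the cumulative product of (1 - beta_s).
        alpha_bar = np.cumprod(1.0 - betas)
        eps = np.random.randn(*x0.shape)
        xt = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps
        return xt, eps

    betas = np.linspace(1e-4, 0.02, 1000)   # linear noise schedule
    x0 = np.random.randn(3, 32, 32)         # toy "image"
    xt, eps = forward_diffuse(x0, t=500, betas=betas)

The reverse process then trains a network to predict eps from xt and t.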

Implicit biases in multitask and continual learning from a backward error analysis perspective

B Dherin - arXiv preprint arXiv:2311.00235, 2023 - arxiv.org
Using backward error analysis, we compute implicit training biases in multitask and
continual learning settings for neural networks trained with stochastic gradient descent. In …
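
For context on what backward error analysis yields in the simplest setting: for full-batch gradient descent with step size h, the earlier implicit-gradient-regularization result shows that the iterates approximately follow gradient flow on a modified loss

    \tilde{L}(\theta) = L(\theta) + \frac{h}{4}\,\lVert \nabla L(\theta) \rVert^2,

i.e. the discretization implicitly penalizes large gradient norms. How this modified loss changes when the objective is a sum of task losses (multitask) or the tasks arrive sequentially (continual learning) is the subject of the paper and is not reproduced here.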

Majority Kernels: An Approach to Leverage Big Model Dynamics for Efficient Small Model Training

H Mazzawi, P Awasthi, J Gonzalvo… - Workshop on Machine … - openreview.net
Recent breakthroughs and successful deployments of large language and vision models in constrained environments predominantly follow a two-phase approach. First, large models …
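
The snippet is truncated before describing the two phases; one common instance of the second phase (transferring a large pretrained model into a small one) is knowledge distillation, sketched minimally below. This is background only, not the Majority Kernels method, and the function and hyperparameters are illustrative assumptions.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # A small student mimics a large pretrained teacher (soft targets)
        # while also fitting the ground-truth labels.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    student_logits = torch.randn(8, 100)
    teacher_logits = torch.randn(8, 100)
    labels = torch.randint(0, 100, (8,))
    loss = distillation_loss(student_logits, teacher_logits, labels)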
