
"Dataset Distillation: A Comprehensive Review," arXiv, 2023.

paper: https://arxiv.org/pdf/2301.07014.pdf

code: https://github.com/Guang000/Awesome-Dataset-Distillation


"Dataset Distillation"是一种知识蒸馏(distillation)方法,它旨在通过在大型训练数据集中提取关键样本或特征来减少深度神经网络的体积。这种方法可以帮助缓解由于海量数据带来的存储和传输压力,并且可以加速模型推理的速度。

Fig. 1. An overview of dataset distillation. Dataset distillation aims to generate a small informative dataset such that models trained on these samples have test performance similar to those trained on the original dataset.
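To make the idea concrete, below is a minimal toy sketch of one common dataset-distillation approach, gradient matching: the synthetic points are optimized so that the loss gradient they induce matches the gradient induced by the full dataset. This is an illustrative NumPy example on a linear-regression task, not the paper's method; all sizes, learning rates, and the probe-weight scheme are assumptions chosen for the toy setting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "real" dataset: 200 points from a noiseless linear model.
n, d, m = 200, 3, 4                      # real size, feature dim, synthetic size
w_true = np.array([1.0, -2.0, 0.5])
X = rng.standard_normal((n, d))
y = X @ w_true

def grad_w(Xd, yd, w):
    """Gradient of the MSE loss (1/N)||Xd w - yd||^2 with respect to w."""
    return 2.0 / len(yd) * Xd.T @ (Xd @ w - yd)

# Learnable synthetic dataset: 4 points instead of 200.
Xs = rng.standard_normal((m, d))
ys = np.zeros(m)

# Probe weights at which real and synthetic gradients are matched.
# The origin plus each coordinate direction pins down X^T X and X^T y.
probes = [np.zeros(d)] + list(np.eye(d))

lr = 0.01
for step in range(10000):
    w = probes[step % len(probes)]
    r = grad_w(Xs, ys, w) - grad_w(X, y, w)   # gradient-matching residual
    e = Xs @ w - ys
    # Analytic gradients of the matching loss ||r||^2 w.r.t. the synthetic data.
    g_Xs = (4.0 / m) * (np.outer(e, r) + np.outer(Xs @ r, w))
    g_ys = -(4.0 / m) * (Xs @ r)
    Xs -= lr * g_Xs
    ys -= lr * g_ys

# Train on the distilled set only (least squares), then evaluate on real data.
w_distilled, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
mse = np.mean((X @ w_distilled - y) ** 2)
print(f"real-data MSE of model trained on 4 distilled points: {mse:.4f}")
```

A model fitted on just the 4 distilled points ends up with a small error on the full real dataset, which is the property sketched in Fig. 1. Real dataset-distillation methods apply the same bilevel idea to neural networks and image data, where the inner training loop and the matching objective are far more expensive.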

Tags: Dataset distillation, Comprehensive review