Abstract: Dataset distillation (DD) aims to accelerate the training of neural networks (NNs) by synthesizing a reduced dataset. NNs trained on the smaller dataset are expected to obtain almost ...
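To make the idea concrete, below is a minimal sketch of one common flavor of dataset distillation (gradient matching on a randomly initialized network), written in PyTorch. The toy data, network, and hyperparameters are purely illustrative assumptions and are not taken from the paper whose abstract is shown; the sketch only shows the general pattern of treating a small synthetic dataset as learnable parameters.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Hypothetical toy setup: 10-class problem with 64-dim inputs (assumed, not from the paper).
    real_x = torch.randn(1024, 64)              # stand-in for a real dataset
    real_y = torch.randint(0, 10, (1024,))

    # Synthetic (distilled) dataset: one learnable example per class.
    syn_x = torch.randn(10, 64, requires_grad=True)
    syn_y = torch.arange(10)

    opt_syn = torch.optim.Adam([syn_x], lr=0.1)

    def grad_vector(model, x, y):
        """Flattened gradient of the classification loss w.r.t. model parameters."""
        loss = F.cross_entropy(model(x), y)
        grads = torch.autograd.grad(loss, list(model.parameters()), create_graph=True)
        return torch.cat([g.reshape(-1) for g in grads])

    for step in range(200):
        # Fresh randomly initialized network each outer step, in the spirit of
        # gradient-matching style distillation.
        model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10))

        idx = torch.randint(0, real_x.size(0), (128,))
        g_real = grad_vector(model, real_x[idx], real_y[idx]).detach()
        g_syn = grad_vector(model, syn_x, syn_y)

        # Update the synthetic examples so the gradients they induce match
        # the gradients produced by real data on the same network.
        match_loss = 1.0 - F.cosine_similarity(g_real, g_syn, dim=0)
        opt_syn.zero_grad()
        match_loss.backward()
        opt_syn.step()

After such a loop, a network trained from scratch on the 10 synthetic examples would be evaluated on the real data to check how much of the original performance is retained; the exact objective and training schedule vary across DD methods.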