PaddleClas
3.2. distillation
3.2.1. Introduction of model compression methods
3.2.2. SSLD
3.2.2.1. Introduction
3.2.2.2. Data selection
3.2.3. Experiments
3.2.3.1. Choice of teacher model
3.2.3.2. Distillation using large-scale dataset
3.2.3.3. Fine-tuning using ImageNet1k
3.2.3.4. Data augmentation and Fix strategy
3.2.4. Application of the distillation model
3.2.4.1. Instructions
3.2.4.2. Transfer learning
3.2.4.3. Object detection
3.2.5. Practice
3.2.5.1. Configuration
3.2.5.1.1. Distill ResNet50_vd using ResNeXt101_32x16d_wsl
3.2.5.1.2. Distill MobileNetV3_large_x1_0 using ResNet50_vd_ssld
3.2.5.2. Begin to train the network
3.2.5.3. Note
3.2.6. Reference
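The SSLD entries above revolve around knowledge distillation, where a student network is trained to match a teacher's softened output distribution. As a point of reference, here is a minimal NumPy sketch of the classic temperature-scaled soft-label distillation loss (KL divergence between teacher and student distributions). This is a generic illustration only: SSLD's actual objective in PaddleClas differs in its details, and the function names here are our own.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    Classic soft-label distillation objective; illustrative stand-in,
    not the exact SSLD loss used in PaddleClas.
    """
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return float(np.mean(kl) * temperature ** 2)

# A student that exactly matches its teacher incurs zero loss.
logits = np.array([[2.0, 0.5, -1.0]])
print(distillation_loss(logits, logits))  # -> 0.0
```

A higher temperature flattens both distributions, exposing the teacher's relative confidences over the non-target classes, which is where much of the distillation signal comes from.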