- NAVER AI LAB
- South Korea
- https://sites.google.com/view/byeongho-heo/home
Popular repositories
- AB_distillation (Public)
  Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
- BSS_distillation (Public)
  Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019)
- Knowledge_distillation_methods_wtih_Tensorflow (Public, forked from sseung0703/KD_methods_with_TF)
  Knowledge distillation methods implemented with TensorFlow (currently 8 methods, with more to be added)
- attention-feature-distillation (Public, forked from clovaai/attention-feature-distillation)
  Official implementation of "Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching" (AAAI 2021)
  Python · 2
- vit-pytorch (Public, forked from lucidrains/vit-pytorch)
  Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
  Python · 1