Popular repositories
Question-Answering-with-BERT-and-Knowledge-Distillation (Public, forked from AristotelisPap/Question-Answering-with-BERT-and-Knowledge-Distillation)
Fine-tuned BERT on the SQuAD 2.0 dataset. Applied knowledge distillation (KD), fine-tuning DistilBERT (the student) with BERT as the teacher model, which reduced the size of the original BERT by 40%.
Jupyter Notebook
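The knowledge-distillation setup described above typically combines a "soft" loss (matching the teacher's temperature-softened output distribution) with a "hard" cross-entropy loss against the ground-truth labels. The sketch below illustrates that combined objective on raw logits; it is a minimal, dependency-free illustration, not the repository's actual training code, and the function names and hyperparameters (`temperature`, `alpha`) are assumptions chosen for clarity.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; T > 1 softens the distribution,
    # exposing the teacher's "dark knowledge" about wrong classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=2.0, alpha=0.5):
    # Soft loss: cross-entropy between the softened teacher and student
    # distributions, scaled by T^2 so its gradient magnitude stays
    # comparable to the hard loss (as in Hinton et al.'s KD formulation).
    t_probs = softmax(teacher_logits, temperature)
    s_probs = softmax(student_logits, temperature)
    soft = -sum(t * math.log(s) for t, s in zip(t_probs, s_probs)) * temperature ** 2
    # Hard loss: standard cross-entropy against the ground-truth label.
    hard = -math.log(softmax(student_logits)[true_label])
    # alpha blends imitation of the teacher with fitting the true labels.
    return alpha * soft + (1 - alpha) * hard
```

A student whose logits agree with the teacher (and with the true label) incurs a lower loss than one that disagrees, which is what drives the compressed DistilBERT model toward the larger BERT's behavior during fine-tuning.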