This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
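As a minimal sketch of that two-stage view, the snippet below pairs knowledge elicitation (a teacher producing completions) with a simple distillation algorithm (fine-tuning the student on those completions). It assumes a black-box teacher reachable only through generated text; the `teacher_generate` stub, the `gpt2` student, and the hyperparameters are illustrative placeholders, not the survey's prescribed method.

```python
# Sketch of sequence-level (black-box) KD: elicit completions from a teacher,
# then fine-tune a smaller student on them with ordinary next-token loss.
# `teacher_generate`, the `gpt2` student, and all hyperparameters are
# illustrative placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")      # placeholder student
student = AutoModelForCausalLM.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(student.parameters(), lr=1e-5)

def teacher_generate(prompt: str) -> str:
    """Stand-in for a call to a larger teacher model or its API."""
    return prompt + " Knowledge distillation transfers a large model's behavior to a smaller one."

prompts = ["Explain knowledge distillation in one sentence."]
for prompt in prompts:
    # 1. Knowledge elicitation: the teacher produces a completion.
    target_text = teacher_generate(prompt)
    # 2. Distillation: train the student to reproduce it. Passing input_ids
    #    as labels makes the model compute the shifted next-token loss.
    batch = tokenizer(target_text, return_tensors="pt")
    loss = student(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```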
Leading artificial intelligence firms including OpenAI, Microsoft, and Meta are turning to a process called “distillation” in the global race to create AI models that are cheaper for consumers ...
One of the most promising techniques in this race is AI model distillation, a process that lets developers build smaller models that retain much of the capability of their larger counterparts.
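For readers unfamiliar with the mechanics, the canonical recipe behind this idea is the soft-target loss of Hinton et al. (2015): the student matches the teacher's temperature-softened output distribution while also fitting the ground-truth labels. The temperature and the 50/50 weighting below are illustrative choices, not values used by any of the firms mentioned.

```python
# Classic soft-target distillation loss (Hinton et al., 2015). The
# temperature and alpha weighting are illustrative choices.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard targets: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage: a batch of 8 examples over 10 classes.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
kd_loss(student_logits, teacher_logits, labels).backward()
```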
We present Distill-Any-Depth, a new SOTA monocular depth estimation model trained with our proposed knowledge distillation algorithms. Models of various sizes are available in this repo.
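The repo snippet does not spell out its algorithm, but a generic pseudo-labeling setup for depth distillation might look like the sketch below: a frozen teacher produces dense depth maps and the student regresses onto them. The per-pixel L1 loss on median-normalized (affine-invariant) depth and the stand-in convolutional networks are assumptions for illustration, not Distill-Any-Depth's actual method.

```python
# Generic depth-map distillation with a frozen teacher producing pseudo-labels.
# The stand-in networks and the affine-invariant L1 loss are illustrative
# assumptions, not the Distill-Any-Depth algorithm.
import torch
import torch.nn as nn

def normalize_depth(d: torch.Tensor) -> torch.Tensor:
    """Remove per-image scale/shift: subtract the median, divide by the
    mean absolute deviation (a common affine-invariant normalization)."""
    med = d.flatten(1).median(dim=1).values.view(-1, 1, 1)
    scale = (d - med).abs().flatten(1).mean(dim=1).view(-1, 1, 1)
    return (d - med) / (scale + 1e-6)

teacher = nn.Conv2d(3, 1, 3, padding=1).eval()   # stand-in depth networks
student = nn.Conv2d(3, 1, 3, padding=1)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-4)

images = torch.rand(2, 3, 64, 64)                # toy unlabeled RGB batch
with torch.no_grad():
    pseudo = teacher(images).squeeze(1)          # teacher pseudo-labels
pred = student(images).squeeze(1)
loss = (normalize_depth(pred) - normalize_depth(pseudo)).abs().mean()
loss.backward()
optimizer.step()
```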