Reverse Distillation: Consistently Scaling Protein Language Model Representations
This paper introduces Reverse Distillation, a framework that decomposes the representations of a large protein language model into nested, orthogonal subspaces guided by a sequence of smaller models. The decomposition yields consistent performance scaling with model size and outperforms existing baselines on ProteinGym benchmarks.
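The paper's exact construction is not reproduced here; as a rough illustration of the core idea only, the sketch below assumes the subspaces are linear and extracted by least-squares projection in sample space. The function name `nested_orthogonal_decomposition`, the use of `np.linalg.lstsq`, and the SVD-based deflation are all illustrative choices, not the paper's method.

```python
import numpy as np

def nested_orthogonal_decomposition(h_large, small_embeds, tol=1e-8):
    """Illustrative sketch (not the paper's algorithm): split h_large
    (n, D) into mutually orthogonal components, each aligned with one
    smaller model's embeddings (n, d_k), plus a residual that only the
    large model explains."""
    residual = h_large.astype(float).copy()
    n = h_large.shape[0]
    basis = np.zeros((n, 0))  # orthonormal directions claimed so far
    components = []
    for h_small in small_embeds:  # ordered from smallest model upward
        # Deflate directions already claimed by earlier (smaller) models,
        # so each new component is orthogonal to all previous ones.
        x = h_small - basis @ (basis.T @ h_small)
        # Least-squares fit: the part of the residual that is linearly
        # predictable from this smaller model's deflated representation.
        w, *_ = np.linalg.lstsq(x, residual, rcond=None)
        comp = x @ w
        residual = residual - comp
        components.append(comp)
        # Extend the basis with an orthonormal span of the new directions.
        u, s, _ = np.linalg.svd(x, full_matrices=False)
        basis = np.hstack([basis, u[:, s > tol]])
    components.append(residual)  # subspace unique to the large model
    return components
```

In this toy version the components sum back to `h_large` and are pairwise orthogonal across samples, so each smaller model "claims" a nested slice of the large representation and the final residual holds what only the largest model encodes; how the paper actually learns or guides these subspaces may differ.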