Navigating Model Phase Transitions to Enable Extreme Lossless Compression: A Perspective
Updated Aug 3, 2025
A library of Transformer models with compression options
Official code for "Vanishing Contributions: A Unified Approach to Smoothly Transition Neural Models into Compressed Form", including training and evaluation scripts.
Code repository accompanying the paper "Beyond linear summation: Three-Body RNNs for modeling complex neural and biological systems" 🧠
A comprehensive implementation of post-training pruning methods for large language models (LLMs)