29 Layers: 3 layers removed from the 32-layer foundation model with no loss of output and no fine-tuning.
license: cc-by-4.0

What if I told you I found 4 layers out of the 32 that do absolutely nothing?

Cutting any one of these layers out does not change the output, but cutting all 4 at once breaks the model. Let's take 3 of the 4 layers out and see if the model still works after removing this redundancy.
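The intuition behind why a layer can be removed without changing the output can be sketched in plain Python. This is a toy illustration, not the actual model: in a residual (transformer-style) stack, each layer *adds* its contribution to the running state, so a layer whose contribution is near zero can be dropped and the output is unchanged. The layer functions and indices below are entirely made up for demonstration.

```python
def run_stack(layers, x):
    # Residual-style stack: each layer adds its contribution to x.
    for f in layers:
        x = x + f(x)
    return x

# Toy 6-layer stack; layers at indices 2 and 4 are no-ops (contribute 0),
# standing in for the "redundant" layers found in the real model.
layers = [
    lambda x: 0.5 * x,
    lambda x: 0.1 * x,
    lambda x: 0.0,        # redundant layer: zero residual contribution
    lambda x: 0.2 * x,
    lambda x: 0.0,        # redundant layer
    lambda x: 0.3 * x,
]

full = run_stack(layers, 1.0)
pruned = run_stack([f for i, f in enumerate(layers) if i not in (2, 4)], 1.0)
print(full == pruned)  # True: dropping no-op layers leaves the output intact
```

Of course, in the real model the "redundant" layers are not exactly zero; the surprising part is that their contributions only matter jointly, which is why removing all 4 at once breaks the model while removing them individually does not.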