danielhanchen
posted an update 7 days ago

Maybe I am not understanding this correctly, but in the picture you say that you managed to squeeze the model down to a size of 34GB, but then you say that it can now run on GPUs with "just" 24GB of VRAM.

Was this a typo, or am I missing something here?
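The arithmetic behind the question can be made explicit. If the quantized model occupies 34GB but the GPU has only 24GB of VRAM, the remainder has to live somewhere else (typically offloaded to system RAM, as llama.cpp-style runtimes do). The sketch below is purely illustrative; the function name and numbers are hypothetical and not from the original post.

```python
def vram_shortfall_gb(model_size_gb: float, vram_gb: float) -> float:
    """Return how many GB cannot fit in VRAM (0 if the model fits entirely).

    Hypothetical helper for illustration; a real runtime would decide
    per-layer what to offload to system RAM.
    """
    return max(0.0, model_size_gb - vram_gb)

# The numbers from the question: a 34GB model on a 24GB GPU.
print(vram_shortfall_gb(34, 24))  # 10.0 GB would need to be offloaded
print(vram_shortfall_gb(20, 24))  # 0.0 -- a 20GB model fits outright
```

So the two figures are not necessarily contradictory if part of the model is held outside VRAM, though only the original author can confirm that is what was meant.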