I am trying to create an xlarge ZeroGPU Space for a fairly large (22B / ~80 GiB) model. Unfortunately, I seem to be running out of disk space during the tensor packing step. I attempted to purchase more storage, only to find that this is no longer possible for new Spaces.
Is there any way to load such a large model with ZeroGPU now? If not, is it possible with paid Spaces? I actually couldn't find any documentation stating whether paid Spaces provide more ephemeral disk space.
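For context, my loading code follows the standard ZeroGPU pattern, roughly like the sketch below (the model id and generation settings are placeholders, not my actual values):

```python
import spaces
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "my-org/my-22b-model"  # placeholder for the ~80 GiB checkpoint

# The checkpoint is downloaded to the Space's ephemeral disk here.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
)

# Moving the model to CUDA at startup is what triggers ZeroGPU's
# tensor packing, and this is where I run out of disk space.
model.to("cuda")

@spaces.GPU
def generate(prompt: str) -> str:
    inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
    output = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```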