jwgcurrie/synthetic-distance
Safetensors · License: mit
Files and versions (branch: main)
synthetic-distance · 1.25 GB · 1 contributor · History: 47 commits
This model has 1 file scanned as unsafe.
Latest commit by jwgcurrie: "Training in progress, epoch 2" (8de4a70, verified, 4 months ago)
.gitattributes · Safe · 1.52 kB · initial commit · 4 months ago
README.md · Safe · 24 Bytes · initial commit · 4 months ago
model.safetensors · Safe · 1.02 GB · "Training in progress, epoch 2" · 4 months ago
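model.safetensors is marked Safe because the safetensors format is pure data that is parsed, never executed: an 8-byte little-endian header length, a JSON header describing each tensor, then raw tensor bytes. A minimal stdlib sketch of parsing that header, assuming the published safetensors layout (the one-tensor blob built here is illustrative, not this repository's actual file):

```python
import json
import struct

def read_safetensors_header(blob: bytes) -> dict:
    """Parse the JSON header of a .safetensors blob.

    The format starts with an 8-byte little-endian unsigned length N,
    followed by N bytes of JSON metadata mapping tensor names to their
    dtype, shape, and byte offsets. No code ever runs on load, which is
    why this file needs no pickle scanning.
    """
    (n,) = struct.unpack("<Q", blob[:8])
    return json.loads(blob[8 : 8 + n])

# Build a minimal one-tensor blob in memory to demonstrate round-tripping.
header = {"w": {"dtype": "F32", "shape": [2], "data_offsets": [0, 8]}}
raw = json.dumps(header).encode()
blob = struct.pack("<Q", len(raw)) + raw + struct.pack("<2f", 1.0, 2.0)
print(read_safetensors_header(blob)["w"]["shape"])  # [2]
```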
training_args.bin · pickle · 5.71 kB · "Training in progress, epoch 1" · 4 months ago
Detected Pickle imports (10): "torch.device", "accelerate.state.PartialState", "transformers.trainer_utils.IntervalStrategy", "transformers.training_args.TrainingArguments", "transformers.trainer_utils.SchedulerType", "accelerate.utils.dataclasses.DistributedType", "transformers.trainer_utils.HubStrategy", "transformers.trainer_utils.SaveStrategy", "transformers.training_args.OptimizerNames", "transformers.trainer_pt_utils.AcceleratorConfig"
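"Detected Pickle imports" lists like the one above are produced by scanning the pickle's opcode stream, not by loading it: every GLOBAL (protocol 2 and earlier) or STACK_GLOBAL (protocol 4 and later) opcode names a callable the unpickler would import. A minimal stdlib sketch of that idea; it is an illustration, not the Hub scanner's actual implementation:

```python
import pickle
import pickletools
from collections import OrderedDict

def detect_pickle_imports(data: bytes) -> set[str]:
    """List the module.name globals a pickle would import when loaded.

    Walks the opcode stream with pickletools.genops: GLOBAL carries
    "module name" directly; STACK_GLOBAL takes the two most recent
    string opcodes as module and name.
    """
    string_ops = {"SHORT_BINUNICODE", "BINUNICODE", "BINUNICODE8", "UNICODE"}
    imports, recent_strings = set(), []
    for op, arg, _pos in pickletools.genops(data):
        if op.name in string_ops:
            recent_strings.append(arg)
        elif op.name == "GLOBAL":
            imports.add(arg.replace(" ", "."))
        elif op.name == "STACK_GLOBAL":
            imports.add(f"{recent_strings[-2]}.{recent_strings[-1]}")
    return imports

payload = pickle.dumps(OrderedDict(a=1))
print(detect_pickle_imports(payload))  # {'collections.OrderedDict'}
```

Because the scan never calls `pickle.load`, it is safe to run on untrusted files; it only reports what *would* be imported.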
vlm_regression_full_model.pt · Unsafe · pickle · 224 MB · "Upload vlm_regression_full_model.pt" · 4 months ago
Detected Pickle imports (60): "transformers.models.idefics3.modeling_idefics3.Idefics3EncoderLayer", "torch.float32", "torch.nn.modules.conv.Conv2d", "peft.tuners.lora.bnb.Linear4bit", "transformers.models.llama.modeling_llama.LlamaAttention", "torch.ByteStorage", "torch.uint8", "torch.Size", "torch.nn.modules.linear.Linear", "transformers.models.idefics3.modeling_idefics3.Idefics3Connector", "transformers.models.llama.modeling_llama.LlamaModel", "transformers.utils.quantization_config.QuantizationMethod", "transformers.generation.configuration_utils.GenerationConfig", "peft.utils.peft_types.PeftType", "torch._utils._rebuild_tensor_v2", "torch.HalfStorage", "peft.peft_model.PeftModelForCausalLM", "transformers.models.idefics3.modeling_idefics3.Idefics3SimpleMLP", "__main__.VLMRegressionWrapper", "torch.float16", "transformers.models.idefics3.modeling_idefics3.Idefics3VisionEmbeddings", "__builtin__.getattr", "transformers.models.idefics3.modeling_idefics3.Idefics3VisionTransformer", "torch._utils._rebuild_parameter", "transformers.models.idefics3.modeling_idefics3.Idefics3ForConditionalGeneration", "torch.nn.modules.container.ParameterDict", "transformers.models.llama.modeling_llama.LlamaRotaryEmbedding", "transformers.models.idefics3.configuration_idefics3.Idefics3Config", "peft.tuners.lora.config.LoraConfig", "torch.nn.modules.activation.SiLU", "bitsandbytes.nn.modules.Linear4bit", "transformers.models.llama.configuration_llama.LlamaConfig", "torch.nn.modules.container.Sequential", "torch.FloatStorage", "torch.nn.modules.dropout.Dropout", "__builtin__.set", "torch.nn.modules.container.ModuleDict", "peft.tuners.lora.config.LoraRuntimeConfig", "transformers.models.idefics3.configuration_idefics3.Idefics3VisionConfig", "transformers.modeling_rope_utils._compute_default_rope_parameters", "collections.OrderedDict", "transformers.generation.configuration_utils.CompileConfig", "torch.bfloat16", "torch.nn.modules.normalization.LayerNorm", "torch.nn.modules.activation.ReLU", "torch.nn.modules.container.ModuleList", "transformers.models.idefics3.modeling_idefics3.Idefics3VisionAttention", "transformers.models.idefics3.modeling_idefics3.Idefics3Encoder", "transformers.models.llama.modeling_llama.LlamaMLP", "transformers.models.llama.modeling_llama.LlamaDecoderLayer", "torch._utils._rebuild_parameter_with_state", "transformers.models.idefics3.modeling_idefics3.Idefics3Model", "torch.nn.modules.sparse.Embedding", "transformers.models.idefics3.modeling_idefics3.Idefics3VisionMLP", "transformers.utils.quantization_config.BitsAndBytesConfig", "peft.tuners.lora.model.LoraModel", "bitsandbytes.functional.QuantState", "transformers.activations.PytorchGELUTanh", "transformers.models.llama.modeling_llama.LlamaRMSNorm", "transformers.quantizers.quantizer_bnb_4bit.Bnb4BitHfQuantizer"
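The file above is flagged Unsafe because loading a full pickled model imports and calls every one of those 60 globals, including a custom "__main__.VLMRegressionWrapper" class that only exists in the uploader's script. The standard mitigation is restricted unpickling; a minimal stdlib sketch of the idea (torch.load with weights_only=True applies the same principle with a curated allowlist; this is not the Hub's or PyTorch's actual code, and the tiny allowlist here is purely illustrative):

```python
import io
import pickle
from collections import OrderedDict

class AllowlistUnpickler(pickle.Unpickler):
    """Only resolve globals on an explicit allowlist.

    Pickle is dangerous because GLOBAL opcodes can name arbitrary
    callables that the unpickler imports and may invoke; overriding
    find_class closes that hole for everything not allowlisted.
    """
    ALLOWED = {("collections", "OrderedDict")}

    def find_class(self, module, name):
        if (module, name) in self.ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked global: {module}.{name}")

def safe_loads(data: bytes):
    return AllowlistUnpickler(io.BytesIO(data)).load()

print(safe_loads(pickle.dumps(OrderedDict(x=1))))  # loads fine
# safe_loads(pickle.dumps(print)) would raise UnpicklingError,
# because builtins.print is not on the allowlist.
```

For this repository, the practical takeaway is to prefer model.safetensors (marked Safe) and avoid calling plain torch.load on vlm_regression_full_model.pt from an untrusted source.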