dm32 / config.json
Upload final model (step 75000) and all checkpoints at 2024-10-18T04:41:26.654759
d454c52 verified
{
  "architectures": [
    "HFHookedTransformer"
  ],
  "hidden_size": 32,
  "num_attention_heads": 8,
  "num_hidden_layers": 2,
  "torch_dtype": "float32",
  "transformers_version": "4.45.2",
  "vocab_size": 5000
}
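Since `HFHookedTransformer` is not a stock `transformers` architecture, `AutoModel` would need the repository's custom model code to instantiate it. As a minimal sketch, the config can instead be parsed directly with the standard `json` module; the per-head dimension computed below assumes the usual convention that `hidden_size` is split evenly across attention heads:

```python
import json

# The config.json contents from above, inlined so the snippet is self-contained.
CONFIG_JSON = """
{
  "architectures": ["HFHookedTransformer"],
  "hidden_size": 32,
  "num_attention_heads": 8,
  "num_hidden_layers": 2,
  "torch_dtype": "float32",
  "transformers_version": "4.45.2",
  "vocab_size": 5000
}
"""

config = json.loads(CONFIG_JSON)

# Assumed convention: hidden_size divided evenly across attention heads.
head_dim = config["hidden_size"] // config["num_attention_heads"]
print(head_dim)  # 32 // 8 = 4
```

This gives a very small model: 2 layers, 8 heads of 4 dimensions each, and a 5000-token vocabulary.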