modernbert-embed-large / special_tokens_map.json

Commit History

Add new SentenceTransformer model
369c3c3 (verified) · committed by tomaarsen (HF staff)

Add tokenizer files
e5eded7 (verified) · committed by NohTow