---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:9623924
- loss:MSELoss
base_model: BAAI/bge-m3
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- negative_mse
model-index:
- name: SentenceTransformer based on BAAI/bge-m3
  results:
  - task:
      type: semantic-similarity
      name: Semantic Similarity
    dataset:
      name: sts dev
      type: sts-dev
    metrics:
    - type: pearson_cosine
      value: 0.9378885799751235
      name: Pearson Cosine
    - type: spearman_cosine
      value: 0.930037764519436
      name: Spearman Cosine
  - task:
      type: knowledge-distillation
      name: Knowledge Distillation
    dataset:
      name: Unknown
      type: unknown
    metrics:
    - type: negative_mse
      value: -0.010874464351218194
      name: Negative Mse
  - task:
      type: semantic-similarity
      name: Semantic Similarity
    dataset:
      name: sts test
      type: sts-test
    metrics:
    - type: pearson_cosine
      value: 0.9378994572414889
      name: Pearson Cosine
    - type: spearman_cosine
      value: 0.9300802695581766
      name: Spearman Cosine
---

# SentenceTransformer based on BAAI/bge-m3

This is a [sentence-transformers](https://www.SBERT.net) model distilled from [BAAI/bge-m3](https://huggingface.co/BAAI/bge-m3) on the [tr-sentences](https://huggingface.co/datasets/altaidevorg/tr-sentences) dataset. It maps sentences and paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. Refer to the [blog post](https://medium.com/altai-dev/distilling-efficiency-experiments-in-compressing-baai-bge-m3-using-a-synthetic-dataset-9430e21c6b8f) and the [8l variant](https://huggingface.co/altaidevorg/bge-m3-distill-8l) for more information.
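
## Usage

A minimal sketch of loading the model with the standard sentence-transformers API. The repo id below points to the linked 8l variant as a placeholder; substitute this repository's actual id when using this model.

```python
from sentence_transformers import SentenceTransformer

# Placeholder repo id: replace with this model's actual Hugging Face id.
model = SentenceTransformer("altaidevorg/bge-m3-distill-8l")

sentences = [
    "Bu bir örnek cümledir.",
    "Bu cümle bir örnektir.",
    "Hava bugün çok güzel.",
]

# Encode sentences into 1024-dimensional dense vectors.
embeddings = model.encode(sentences)
print(embeddings.shape)  # (3, 1024)

# Cosine similarities between all pairs of embeddings.
similarities = model.similarity(embeddings, embeddings)
print(similarities)
```

The embeddings can then be used directly for semantic search, clustering, or as features for downstream classifiers, as described above.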