---
base_model: BAAI/bge-base-en-v1.5
datasets: []
language:
- en
library_name: sentence-transformers
license: apache-2.0
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:1500
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: The depreciation and amortization expense for the year 2021 was
recorded at $3,103.
sentences:
- In what sequence do the signature pages appear relative to the financial documents
in this report?
- What was the depreciation and amortization expense in 2021?
- What was the net impact on other comprehensive income (loss), net of tax, for
the fiscal year ended March 31, 2023?
- source_sentence: 'Actual Asset Returns: U.S. Plans: (21.20)%, Non-U.S. Plans: (25.40)%.'
sentences:
- What were the total other current liabilities for the fiscal year ending in 2023
compared to 2022?
- What was the percentage of proprietary brand product sales as part of the front
store revenues in 2023?
- By how much did actual asset returns vary between U.S. and Non-U.S. pension plans
in 2023?
- source_sentence: Intellectual property rights are important to Nike's brand, success,
and competitive position. The company strategically pursues protections of these
rights and vigorously protects them against third-party theft and infringement.
sentences:
- What types of legal issues are generally categorized under Commitments and Contingencies
in a Form 10-K?
- What role does intellectual property play in Nike's competitive position?
- How is the revenue from sales of Online-Hosted Service Games recognized?
- source_sentence: Item 3, titled 'Legal Proceedings' in a 10-K filing, directs to
Note 16 where specific information is further detailed in Item 8 of Part II.
sentences:
- How does Garmin manage the costs of manufacturing its products?
- What is indicated by Item 3, 'Legal Proceedings', in a 10-K filing?
- How much did UnitedHealthcare's cash provided by operating activities amount to
in 2023?
- source_sentence: During 2023, FedEx ranked 18th in FORTUNE magazine's 'World's Most
Admired Companies' list and maintained its position as the highest-ranked delivery
company on the list.
sentences:
- What was the total depreciation and amortization expense for the company in 2023?
- What was the valuation allowance against deferred tax assets at the end of 2023,
and what changes may affect its realization?
- What recognition did FedEx receive from FORTUNE magazine in 2023?
model-index:
- name: BGE base Financial Matryoshka
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 768
type: dim_768
metrics:
- type: cosine_accuracy@1
value: 0.7766666666666666
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.86
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.89
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9333333333333333
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7766666666666666
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2866666666666667
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17799999999999996
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09333333333333332
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7766666666666666
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.86
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.89
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.9333333333333333
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8519532537710081
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.8263650793650793
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.8285686593594938
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 512
type: dim_512
metrics:
- type: cosine_accuracy@1
value: 0.7566666666666667
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.87
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8933333333333333
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9333333333333333
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7566666666666667
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.29
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17866666666666664
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09333333333333332
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7566666666666667
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.87
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8933333333333333
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.9333333333333333
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8462349355848354
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.8183306878306877
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.8207466430359656
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.76
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.86
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.89
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9266666666666666
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.76
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2866666666666666
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17799999999999996
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09266666666666666
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.76
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.86
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.89
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.9266666666666666
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8433224215661056
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.8166931216931217
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.8190592083326618
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.7066666666666667
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.84
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8633333333333333
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.91
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7066666666666667
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.27999999999999997
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17266666666666666
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09099999999999998
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7066666666666667
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.84
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8633333333333333
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.91
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8099084142081584
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7776230158730157
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7810311049771785
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.6833333333333333
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.7933333333333333
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8366666666666667
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.88
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6833333333333333
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.26444444444444437
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.1673333333333333
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.088
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6833333333333333
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.7933333333333333
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8366666666666667
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.88
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7796467165928374
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7475780423280424
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.751941519893099
name: Cosine Map@100
---
# BGE base Financial Matryoshka
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
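The three modules compose simply: the transformer produces per-token hidden states, the pooling layer keeps only the `[CLS]` vector (`pooling_mode_cls_token: True`), and `Normalize()` rescales it to unit length so dot products equal cosine similarities. A minimal numpy sketch with toy tensors standing in for real `BertModel` output:

```python
import numpy as np

# Toy token embeddings shaped like BertModel output: (batch, seq_len, hidden).
rng = np.random.default_rng(0)
token_embs = rng.normal(size=(2, 5, 768))

# (1) Pooling with pooling_mode_cls_token=True keeps only the first ([CLS]) token.
cls = token_embs[:, 0, :]

# (2) Normalize() rescales each vector to unit L2 norm, so a plain dot
# product between two outputs is their cosine similarity.
sentence_embs = cls / np.linalg.norm(cls, axis=1, keepdims=True)
print(sentence_embs.shape)  # (2, 768)
```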
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("adarshheg/bge-base-financial-matryoshka")
# Run inference
sentences = [
"During 2023, FedEx ranked 18th in FORTUNE magazine's 'World's Most Admired Companies' list and maintained its position as the highest-ranked delivery company on the list.",
'What recognition did FedEx receive from FORTUNE magazine in 2023?',
'What was the valuation allowance against deferred tax assets at the end of 2023, and what changes may affect its realization?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
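Because the model was trained with MatryoshkaLoss, embeddings can be truncated to the smaller evaluated dimensions (512, 256, 128, 64) and re-normalized with modest quality loss, as the evaluation tables below show. A minimal numpy sketch of that truncation step (random unit vectors stand in for real `model.encode` output):

```python
import numpy as np

def truncate_and_normalize(emb: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` Matryoshka dimensions and re-normalize to unit length."""
    cut = emb[:, :dim]
    return cut / np.linalg.norm(cut, axis=1, keepdims=True)

# Random unit vectors stand in for real 768-d model.encode(...) output.
rng = np.random.default_rng(0)
full = rng.normal(size=(3, 768))
full /= np.linalg.norm(full, axis=1, keepdims=True)

small = truncate_and_normalize(full, 256)
print(small.shape)  # (3, 256)

# After re-normalization, cosine similarity is a plain matrix product.
sims = small @ small.T
print(sims.shape)   # (3, 3)
```

Recent sentence-transformers releases (v2.7+) can also do this at encode time via the `truncate_dim` argument of the `SentenceTransformer` constructor.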
## Evaluation
### Metrics
#### Information Retrieval
* Dataset: `dim_768`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.7767 |
| cosine_accuracy@3 | 0.86 |
| cosine_accuracy@5 | 0.89 |
| cosine_accuracy@10 | 0.9333 |
| cosine_precision@1 | 0.7767 |
| cosine_precision@3 | 0.2867 |
| cosine_precision@5 | 0.178 |
| cosine_precision@10 | 0.0933 |
| cosine_recall@1 | 0.7767 |
| cosine_recall@3 | 0.86 |
| cosine_recall@5 | 0.89 |
| cosine_recall@10 | 0.9333 |
| cosine_ndcg@10 | 0.852 |
| cosine_mrr@10 | 0.8264 |
| **cosine_map@100** | **0.8286** |
#### Information Retrieval
* Dataset: `dim_512`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.7567 |
| cosine_accuracy@3 | 0.87 |
| cosine_accuracy@5 | 0.8933 |
| cosine_accuracy@10 | 0.9333 |
| cosine_precision@1 | 0.7567 |
| cosine_precision@3 | 0.29 |
| cosine_precision@5 | 0.1787 |
| cosine_precision@10 | 0.0933 |
| cosine_recall@1 | 0.7567 |
| cosine_recall@3 | 0.87 |
| cosine_recall@5 | 0.8933 |
| cosine_recall@10 | 0.9333 |
| cosine_ndcg@10 | 0.8462 |
| cosine_mrr@10 | 0.8183 |
| **cosine_map@100** | **0.8207** |
#### Information Retrieval
* Dataset: `dim_256`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.76 |
| cosine_accuracy@3 | 0.86 |
| cosine_accuracy@5 | 0.89 |
| cosine_accuracy@10 | 0.9267 |
| cosine_precision@1 | 0.76 |
| cosine_precision@3 | 0.2867 |
| cosine_precision@5 | 0.178 |
| cosine_precision@10 | 0.0927 |
| cosine_recall@1 | 0.76 |
| cosine_recall@3 | 0.86 |
| cosine_recall@5 | 0.89 |
| cosine_recall@10 | 0.9267 |
| cosine_ndcg@10 | 0.8433 |
| cosine_mrr@10 | 0.8167 |
| **cosine_map@100** | **0.8191** |
#### Information Retrieval
* Dataset: `dim_128`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:----------|
| cosine_accuracy@1 | 0.7067 |
| cosine_accuracy@3 | 0.84 |
| cosine_accuracy@5 | 0.8633 |
| cosine_accuracy@10 | 0.91 |
| cosine_precision@1 | 0.7067 |
| cosine_precision@3 | 0.28 |
| cosine_precision@5 | 0.1727 |
| cosine_precision@10 | 0.091 |
| cosine_recall@1 | 0.7067 |
| cosine_recall@3 | 0.84 |
| cosine_recall@5 | 0.8633 |
| cosine_recall@10 | 0.91 |
| cosine_ndcg@10 | 0.8099 |
| cosine_mrr@10 | 0.7776 |
| **cosine_map@100** | **0.781** |
#### Information Retrieval
* Dataset: `dim_64`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.6833 |
| cosine_accuracy@3 | 0.7933 |
| cosine_accuracy@5 | 0.8367 |
| cosine_accuracy@10 | 0.88 |
| cosine_precision@1 | 0.6833 |
| cosine_precision@3 | 0.2644 |
| cosine_precision@5 | 0.1673 |
| cosine_precision@10 | 0.088 |
| cosine_recall@1 | 0.6833 |
| cosine_recall@3 | 0.7933 |
| cosine_recall@5 | 0.8367 |
| cosine_recall@10 | 0.88 |
| cosine_ndcg@10 | 0.7796 |
| cosine_mrr@10 | 0.7476 |
| **cosine_map@100** | **0.7519** |
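In this benchmark each query has exactly one relevant passage, which explains the pattern in the tables above: recall@k equals accuracy@k, and precision@k is accuracy@k divided by k (e.g. 0.86 / 3 ≈ 0.2867). A small self-contained sketch of those metrics for the single-relevant-document case (the rank list is made up for illustration):

```python
def ir_metrics(ranks, k):
    """Accuracy@k, precision@k, and MRR@k when each query has one relevant doc.

    ranks[q] is the 1-based rank of query q's gold document in the retrieved
    list, or None if it was not retrieved at all.
    """
    hits = [r for r in ranks if r is not None and r <= k]
    n = len(ranks)
    accuracy = len(hits) / n   # identical to recall@k in this setting
    precision = accuracy / k   # at most one of the top-k docs is relevant
    mrr = sum(1.0 / r for r in hits) / n
    return accuracy, precision, mrr

acc, prec, mrr = ir_metrics([1, 1, 2, 4, None], k=3)
print(acc)  # 0.6
print(mrr)  # 0.5
```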
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 1,500 training samples
* Columns: <code>positive</code> and <code>anchor</code>
* Approximate statistics based on the first 1000 samples:
  |         | positive | anchor |
  |:--------|:---------|:-------|
  | type    | string   | string |
* Samples:
  | positive | anchor |
  |:---------|:-------|
  | <code>In the U.S., Visa Inc.'s total nominal payments volume increased by 17% from $4,725 billion in 2021 to $5,548 billion in 2022.</code> | <code>What is the total percentage increase in Visa Inc.'s nominal payments volume in the U.S. from 2021 to 2022?</code> |
  | <code>The section titled 'Financial Wtatement and Supplementary Data' is labeled with the number 39 in the document.</code> | <code>What is the numerical label associated with the section on Financial Statements and Supplementary Data in the document?</code> |
  | <code>The consolidated financial statements and accompanying notes are incorporated by reference herein.</code> | <code>Are the consolidated financial statements and accompanying notes incorporated by reference in the Annual Report on Form 10-K?</code> |
* Loss: [MatryoshkaLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
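With equal weights, MatryoshkaLoss is just the base loss summed over truncated prefixes of the embeddings, which is what teaches the leading dimensions to stand on their own. A toy sketch of that weighting scheme (the base loss here is a simple 1 − cosine stand-in, not the real in-batch-negatives objective of MultipleNegativesRankingLoss):

```python
import numpy as np

def toy_pair_loss(a: np.ndarray, b: np.ndarray) -> float:
    """Stand-in base loss: mean (1 - cosine) over anchor/positive pairs."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return float(np.mean(1.0 - np.sum(a * b, axis=1)))

def matryoshka_loss(anchor, positive, dims=(768, 512, 256, 128, 64), weights=None):
    """Weighted sum of the base loss on truncated embedding prefixes."""
    weights = weights or [1] * len(dims)
    return sum(w * toy_pair_loss(anchor[:, :d], positive[:, :d])
               for d, w in zip(dims, weights))

rng = np.random.default_rng(0)
anchor = rng.normal(size=(4, 768))
# Identical anchor/positive pairs: every truncated prefix is perfectly
# aligned, so the loss vanishes at all five dimensions.
print(matryoshka_loss(anchor, anchor.copy()) < 1e-9)  # True
```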
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 2
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `tf32`: False
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
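For reference, a hedged sketch of how the non-default values above might map onto `SentenceTransformerTrainingArguments` in sentence-transformers v3-style training (argument names are taken from the list; `output_dir` is assumed, and `save_strategy` is added because `load_best_model_at_end` requires it to match `eval_strategy`):

```python
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-financial-matryoshka",  # assumed, not in the list above
    eval_strategy="epoch",
    save_strategy="epoch",  # must match eval_strategy for best-model loading
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    num_train_epochs=2,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    tf32=False,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler="no_duplicates",
)
```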
#### All Hyperparameters