anonamename committed (verified)
Commit e087902 · 1 Parent(s): 05fac30

Upload turing-motors/Heron-NVILA-Lite-2B

Files changed (1): README.md (+8 -11)
README.md CHANGED
@@ -118,7 +118,7 @@ I used [llm-jp-eval-mm](https://github.com/llm-jp/llm-jp-eval-mm) for this evalu
 | Model | LLM Size | Heron-Bench overall LLM (%) | JA-VLM-Bench-In-the-Wild LLM (/5.0) | JA-VG-VQA-500 LLM (/5.0) |
 |--------------------------------|----------|------------------------------|-------------------------------------|--------------------------|
 | **Heron NVILA-Lite 2B** | 1.5B | 52.8 | 3.52 | 3.50 |
-| **Heron NVILA-Lite 15B** | 14B | 59.6 | 4.2 | 3.82 |
+| **[Heron NVILA-Lite 15B](https://huggingface.co/turing-motors/Heron-NVILA-Lite-15B)** | 14B | 59.6 | 4.2 | 3.82 |
 | [LLaVA-CALM2-SigLIP](https://huggingface.co/cyberagent/llava-calm2-siglip) | 7B | 43.3 | 3.15 | 3.21 |
 | [Llama-3-EvoVLM-JP-v2](https://huggingface.co/SakanaAI/Llama-3-EvoVLM-JP-v2) | 8B | 39.3 | 2.92 | 2.96 |
 | [VILA-jp](https://huggingface.co/llm-jp/llm-jp-3-vila-14b) | 13B | 57.2 | 3.69 | 3.62 |
@@ -135,16 +135,9 @@ This model is experimental and has not been thoroughly calibrated for ethical co
 - Model weights are licensed under [Apache License 2.0](https://huggingface.co/Qwen/Qwen2.5-1.5B-Instruct/blob/main/LICENSE).
 - Users must comply with [OpenAI terms of use](https://openai.com/policies/terms-of-use) due to the inclusion of GPT-4-generated synthetic data.
 
-## How to cite
-
-```bibtex
-@misc{HeronNVILALite2B,
-    title = {Heron NVILA-Lite 2B},
-    author = {Shingo Yokoi},
-    year = {2025},
-    url = {https://huggingface.co/turing-motors/Heron-NVILA-Lite-2B},
-}
-```
+## Acknowledgements
+
+This model is based on the results obtained in the project, subsidized by the [Generative AI Accelerator Challenge (GENIAC)](https://www.meti.go.jp/policy/mono_info_service/geniac/index.html).
 
 ## Citations
 
@@ -158,4 +151,8 @@ This model is experimental and has not been thoroughly calibrated for ethical co
   primaryClass={cs.CV},
   url={https://arxiv.org/abs/2412.04468},
 }
 ```
+
+## Model Card Authors
+
+Shingo Yokoi