modelId (string, 5–139 chars) | author (string, 2–42 chars) | last_modified (timestamp[us, tz=UTC], 2020-02-15 11:33:14 – 2025-06-02 08:43:47) | downloads (int64, 0–223M) | likes (int64, 0–11.7k) | library_name (string, 462 classes) | tags (sequence, lengths 1–4.05k) | pipeline_tag (string, 54 classes) | createdAt (timestamp[us, tz=UTC], 2022-03-02 23:29:04 – 2025-06-02 08:40:46) | card (string, 11–1.01M chars)
---|---|---|---|---|---|---|---|---|---|
hmahmoud/flan-t5-large-lfqa-fr-v3 | hmahmoud | 2023-06-20T12:50:53Z | 120 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"t5",
"text2text-generation",
"flan-t5",
"qa",
"lfqa",
"information retrieval",
"fr",
"dataset:vblagoje/lfqa",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2023-04-26T14:00:38Z | ---
license: apache-2.0
language:
- fr
tags:
- flan-t5
- qa
- lfqa
- information retrieval
datasets:
- vblagoje/lfqa
metrics:
- rouge
model-index:
- name: flan-t5-large-lfqa-fr-v3
results: []
widget:
- text: >-
Please answer to the following question : Comment fonctionne un modèle de langue ? Que signifi un modèle
de question réponse générative ? context : Les modèles de langage basés
sur le deep learning sont des modèles dapprentissage automatique qui
utilisent des techniques dapprentissage profond pour effectuer des tâches
de langage.En traitement automatique des langues, un modèle de langage est
un modèle statistique qui modélise la distribution de séquences de mots,
plus généralement de séquences de symboles discrets (lettres, phonèmes,
mots), dans une langue naturelle. Un modèle de langage peut par exemple
prédire le mot suivant une séquence de mots1.BERT, GPT-3 et Bloom sont des
modèles de langage.Les modèles de Question Réponse (QA) permette
d'automatiser la réponse aux questions fréquemment posées en utilisant une
base de connaissances (documents) comme contexte. Les réponses aux
questions des clients peuvent être tirées de ces documents.Il existe
différentes variantes de modèle de question réponse : question réponse
extractive : le modèle extrait la réponse d'un contexte. Le contexte ici
peut être un texte fourni, un tableau ou même du HTML ! Ceci est
généralement résolu avec des modèles de type BERT. question réponse
générative ouverte : le modèle génère du texte libre directement en
fonction du contexte. question réponse générative fermée : dans ce cas,
aucun contexte n'est fourni. La réponse est entièrement générée par un
modèle.Les modèles de langage basés sur le deep learning sont des modèles
dapprentissage automatique qui utilisent des techniques dapprentissage
profond pour effectuer des tâches de langage.En traitement automatique des
langues, un modèle de langage est un modèle statistique qui modélise la
distribution de séquences de mots, plus généralement de séquences de
symboles discrets (lettres, phonèmes, mots), dans une langue naturelle. Un
modèle de langage peut par exemple prédire le mot suivant une séquence de
mots.Les modèles de Question Réponse (QA) permette d'automatiser la
réponse aux questions fréquemment posées en utilisant une base de
connaissances (documents) comme contexte. Les réponses aux questions des
clients peuvent être tirées de ces documents.Il existe différentes
variantes de modèle de question réponse : question réponse extractive : le
modèle extrait la réponse d'un contexte. Le contexte ici peut être un
texte fourni, un tableau ou même du HTML ! Ceci est généralement résolu
avec des modèles de type BERT. question réponse générative ouverte : le
modèle génère du texte libre directement en fonction du contexte. question
réponse générative fermée : dans ce cas, aucun contexte n'est fourni. La
réponse est entièrement générée par un modèle.
example_title: Les modèles de langage
inference:
parameters:
max_length: 512
num_return_sequences: 1
min_length: 80
no_repeat_ngram_size: 4
do_sample: false
num_beams: 8
early_stopping: true
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# flan-t5-large-lfqa-fr
This model is a fine-tuned version of [google/flan-t5-large](https://huggingface.co/google/flan-t5-large) on 50,000 examples of the vblagoje/lfqa dataset, automatically translated to French with the Helsinki-NLP/opus-mt-en-fr model.
The main task this model can perform is therefore abstractive (long-form) question answering, given context paragraphs that can be used to answer the question.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1.0
### Training results
### Usage
```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# use a GPU when available; `device` is used for the model and inputs below
device = "cuda" if torch.cuda.is_available() else "cpu"
model_name = "hmahmoud/flan-t5-large-lfqa-fr-v3"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name).to(device)
query = "Comment fonctionne un modèle de langue ? Que signifi un modèle de question réponse générative ?"
document = "Les modèles de langage basés sur le deep learning sont des modèles dapprentissage automatique qui utilisent des techniques dapprentissage profond pour effectuer des tâches de langage.En traitement automatique des langues, un modèle de langage est un modèle statistique qui modélise la distribution de séquences de mots, plus généralement de séquences de symboles discrets (lettres, phonèmes, mots), dans une langue naturelle. Un modèle de langage peut par exemple prédire le mot suivant une séquence de mots1.BERT, GPT-3 et Bloom sont des modèles de langage.Les modèles de Question Réponse (QA) permette d'automatiser la réponse aux questions fréquemment posées en utilisant une base de connaissances (documents) comme contexte. Les réponses aux questions des clients peuvent être tirées de ces documents.Il existe différentes variantes de modèle de question réponse : question réponse extractive : le modèle extrait la réponse d'un contexte. Le contexte ici peut être un texte fourni, un tableau ou même du HTML ! Ceci est généralement résolu avec des modèles de type BERT. question réponse générative ouverte : le modèle génère du texte libre directement en fonction du contexte. question réponse générative fermée : dans ce cas, aucun contexte n'est fourni. La réponse est entièrement générée par un modèle.Les modèles de langage basés sur le deep learning sont des modèles dapprentissage automatique qui utilisent des techniques dapprentissage profond pour effectuer des tâches de langage.En traitement automatique des langues, un modèle de langage est un modèle statistique qui modélise la distribution de séquences de mots, plus généralement de séquences de symboles discrets (lettres, phonèmes, mots), dans une langue naturelle. Un modèle de langage peut par exemple prédire le mot suivant une séquence de mots.Les modèles de Question Réponse (QA) permette d'automatiser la réponse aux questions fréquemment posées en utilisant une base de connaissances (documents) comme contexte. Les réponses aux questions des clients peuvent être tirées de ces documents.Il existe différentes variantes de modèle de question réponse : question réponse extractive : le modèle extrait la réponse d'un contexte. Le contexte ici peut être un texte fourni, un tableau ou même du HTML ! Ceci est généralement résolu avec des modèles de type BERT. question réponse générative ouverte : le modèle génère du texte libre directement en fonction du contexte. question réponse générative fermée : dans ce cas, aucun contexte n'est fourni. La réponse est entièrement générée par un modèle."
query_and_docs = "Please answer to the following question : {} context: {}".format(query, document)
model_input = tokenizer(query_and_docs, truncation=True, padding=True, return_tensors="pt")
generated_answers_encoded = model.generate(input_ids=model_input["input_ids"].to(device),
attention_mask=model_input["attention_mask"].to(device),
min_length=80,
max_length=512,
do_sample=False,
early_stopping=True,
num_beams=8,
temperature=None,
top_k=None,
top_p=None,
eos_token_id=tokenizer.eos_token_id,
no_repeat_ngram_size=4,
num_return_sequences=1)
answers = tokenizer.batch_decode(generated_answers_encoded, skip_special_tokens=True, clean_up_tokenization_spaces=True)
print(answers)
```
|
Yandexxxx/DrawListner | Yandexxxx | 2023-06-20T12:44:52Z | 0 | 0 | keras | [
"keras",
"tf-keras",
"region:us"
] | null | 2023-06-08T13:32:31Z | ---
library_name: keras
---
# A model for handwritten digit recognition, trained on the MNIST dataset
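A minimal loading sketch with `huggingface_hub`; the 28×28 grayscale input shape and [0, 1] scaling are assumptions based on MNIST, not details confirmed by this repository.

```python
# Minimal usage sketch; the input shape and preprocessing are MNIST-based assumptions.
import numpy as np
from huggingface_hub import from_pretrained_keras

model = from_pretrained_keras("Yandexxxx/DrawListner")

# A dummy 28x28 grayscale digit image, scaled to [0, 1] as is typical for MNIST.
image = np.random.rand(1, 28, 28, 1).astype("float32")
probabilities = model.predict(image)
print(probabilities.argmax(axis=-1))  # index of the most likely digit class
```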
 |
gokuls/bert-base-Massive-intent_48 | gokuls | 2023-06-20T11:54:36Z | 132 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:massive",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-06-20T11:48:55Z | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- massive
metrics:
- accuracy
model-index:
- name: bert-base-Massive-intent_48
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: massive
type: massive
config: en-US
split: validation
args: en-US
metrics:
- name: Accuracy
type: accuracy
value: 0.8622725036891293
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-Massive-intent_48
This model is a fine-tuned version of [gokuls/bert_base_48](https://huggingface.co/gokuls/bert_base_48) on the massive dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6203
- Accuracy: 0.8623
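The card does not include a usage example; the following is a minimal inference sketch with the `transformers` pipeline (the example utterance is illustrative only).

```python
from transformers import pipeline

# Loads the fine-tuned intent classifier and predicts a MASSIVE intent label.
classifier = pipeline("text-classification", model="gokuls/bert-base-Massive-intent_48")
print(classifier("wake me up at nine am on friday"))
```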
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 33
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.6304 | 1.0 | 180 | 0.8747 | 0.7821 |
| 0.6654 | 2.0 | 360 | 0.6418 | 0.8347 |
| 0.4063 | 3.0 | 540 | 0.5890 | 0.8529 |
| 0.2592 | 4.0 | 720 | 0.6132 | 0.8446 |
| 0.1832 | 5.0 | 900 | 0.6417 | 0.8519 |
| 0.1357 | 6.0 | 1080 | 0.6203 | 0.8623 |
| 0.0969 | 7.0 | 1260 | 0.6742 | 0.8534 |
| 0.0735 | 8.0 | 1440 | 0.7212 | 0.8436 |
| 0.0532 | 9.0 | 1620 | 0.7192 | 0.8529 |
| 0.0378 | 10.0 | 1800 | 0.7625 | 0.8564 |
| 0.0298 | 11.0 | 1980 | 0.7275 | 0.8588 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.14.0a0+410ce96
- Datasets 2.13.0
- Tokenizers 0.13.3
|
boleshirish/Marathi_DistilBert_Pretrained | boleshirish | 2023-06-20T11:23:43Z | 104 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"fill-mask",
"generated_from_trainer",
"mr",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2023-06-13T08:38:40Z | ---
tags:
- generated_from_trainer
model-index:
- name: Mara_DistilBert_Pretrained
results: []
language:
- mr
pipeline_tag: fill-mask
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Mara_DistilBert_Pretrained
DistilBERT, a variant of BERT, was used to pre-train a Marathi language model from scratch on one million sentences. This compact yet powerful model uses a distilled version of BERT's transformer architecture.
It achieves the following result on the evaluation set:
- Loss: 7.4249
## Examples
माझं प्रिय मित्र [MASK] आहे
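A minimal sketch for trying the example above with the fill-mask pipeline (illustrative only):

```python
from transformers import pipeline

# Predicts candidate tokens for the [MASK] position in the Marathi example sentence.
fill_mask = pipeline("fill-mask", model="boleshirish/Marathi_DistilBert_Pretrained")
for prediction in fill_mask("माझं प्रिय मित्र [MASK] आहे"):
    print(prediction["token_str"], prediction["score"])
```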
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 7.9421 | 0.84 | 1000 | 7.4249 |
### Framework versions
- Transformers 4.18.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.12.1 |
tux/poca-SoccerTwos | tux | 2023-06-20T10:51:59Z | 15 | 0 | ml-agents | [
"ml-agents",
"tensorboard",
"onnx",
"SoccerTwos",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-SoccerTwos",
"region:us"
] | reinforcement-learning | 2023-06-20T10:50:32Z | ---
library_name: ml-agents
tags:
- SoccerTwos
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-SoccerTwos
---
# **poca** Agent playing **SoccerTwos**
This is a trained model of a **poca** agent playing **SoccerTwos**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: tux/poca-SoccerTwos
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
hoangtubongcuoi/wikineural-multilingual-ner | hoangtubongcuoi | 2023-06-20T10:18:39Z | 105 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"token-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2023-06-07T10:58:54Z | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: wikineural-multilingual-ner
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wikineural-multilingual-ner
This model is a fine-tuned version of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2354
- Precision: 0.7143
- Recall: 0.7547
- F1: 0.7339
- Accuracy: 0.9725
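A minimal inference sketch (the example sentence is illustrative; the exact entity labels depend on the fine-tuning data, which is not documented here):

```python
from transformers import pipeline

# Runs named-entity recognition and groups sub-word tokens into whole entities.
ner = pipeline(
    "token-classification",
    model="hoangtubongcuoi/wikineural-multilingual-ner",
    aggregation_strategy="simple",
)
print(ner("My name is Wolfgang and I live in Berlin."))
```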
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
### Training results
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.0
- Tokenizers 0.13.3
|
gokuls/hbertv2-Massive-intent | gokuls | 2023-06-20T09:05:06Z | 47 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"hybridbert",
"text-classification",
"generated_from_trainer",
"dataset:massive",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-06-20T08:56:06Z | ---
tags:
- generated_from_trainer
datasets:
- massive
metrics:
- accuracy
model-index:
- name: hbertv2-Massive-intent
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: massive
type: massive
config: en-US
split: validation
args: en-US
metrics:
- name: Accuracy
type: accuracy
value: 0.8514510575504181
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# hbertv2-Massive-intent
This model is a fine-tuned version of [gokuls/bert_12_layer_model_v2_complete_training_new](https://huggingface.co/gokuls/bert_12_layer_model_v2_complete_training_new) on the massive dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9457
- Accuracy: 0.8515
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 33
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.1277 | 1.0 | 180 | 1.0263 | 0.7364 |
| 0.9042 | 2.0 | 360 | 0.8013 | 0.7875 |
| 0.6379 | 3.0 | 540 | 0.8182 | 0.7914 |
| 0.4865 | 4.0 | 720 | 0.8074 | 0.7973 |
| 0.3637 | 5.0 | 900 | 0.7780 | 0.8190 |
| 0.3019 | 6.0 | 1080 | 0.7656 | 0.8288 |
| 0.2218 | 7.0 | 1260 | 0.8253 | 0.8254 |
| 0.1741 | 8.0 | 1440 | 0.8295 | 0.8239 |
| 0.1316 | 9.0 | 1620 | 0.8590 | 0.8308 |
| 0.1011 | 10.0 | 1800 | 0.8465 | 0.8431 |
| 0.078 | 11.0 | 1980 | 0.9007 | 0.8401 |
| 0.0573 | 12.0 | 2160 | 0.9133 | 0.8470 |
| 0.0382 | 13.0 | 2340 | 0.9233 | 0.8470 |
| 0.0247 | 14.0 | 2520 | 0.9365 | 0.8490 |
| 0.0148 | 15.0 | 2700 | 0.9457 | 0.8515 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.14.0a0+410ce96
- Datasets 2.13.0
- Tokenizers 0.13.3
|
ireneli1024/bigbird-pegasus-large-pubmed-elife-finetuned | ireneli1024 | 2023-06-20T08:51:37Z | 16,963 | 0 | transformers | [
"transformers",
"pytorch",
"bigbird_pegasus",
"text2text-generation",
"license:other",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2023-06-20T05:30:08Z | ---
license: other
---
This is a fine-tuned version of the [google/bigbird-pegasus-large-pubmed](https://huggingface.co/google/bigbird-pegasus-large-pubmed) model.
The training data comes from BioLaySumm 2023 [shared task 1](https://biolaysumm.org/#data).
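The card provides no usage example; below is a minimal sketch assuming standard `transformers` seq2seq generation (the generation settings are illustrative, not the authors' settings).

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "ireneli1024/bigbird-pegasus-large-pubmed-elife-finetuned"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

article = "..."  # a biomedical article to turn into a lay summary
inputs = tokenizer(article, truncation=True, max_length=4096, return_tensors="pt")
summary_ids = model.generate(**inputs, num_beams=4, max_length=256)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```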
|
kc621/LunarLander-v2 | kc621 | 2023-06-20T07:23:03Z | 4 | 0 | stable-baselines3 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-20T07:22:42Z | ---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 261.61 +/- 21.13
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
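A minimal loading sketch, assuming the checkpoint follows the usual `huggingface_sb3` naming; the filename below is an assumption, so check the repository's files.

```python
import gymnasium as gym
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

# Assumed filename; adjust to the actual .zip file in the repository.
checkpoint = load_from_hub(repo_id="kc621/LunarLander-v2", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)

# Run one greedy action in the environment the agent was trained on.
env = gym.make("LunarLander-v2")
obs, _ = env.reset()
action, _ = model.predict(obs, deterministic=True)
```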
|
thisispublic/flan-t5-small-cnndm | thisispublic | 2023-06-20T05:06:22Z | 106 | 0 | transformers | [
"transformers",
"pytorch",
"rust",
"t5",
"text2text-generation",
"generated_from_trainer",
"dataset:cnn_dailymail",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2023-06-20T04:57:37Z | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- cnn_dailymail
metrics:
- rouge
model-index:
- name: small
results:
- task:
name: Summarization
type: summarization
dataset:
name: cnn_dailymail 3.0.0
type: cnn_dailymail
config: 3.0.0
split: validation
args: 3.0.0
metrics:
- name: Rouge1
type: rouge
value: 39.3083
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# small
This model is a fine-tuned version of [google/flan-t5-small](https://huggingface.co/google/flan-t5-small) on the cnn_dailymail 3.0.0 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6841
- Rouge1: 39.3083
- Rouge2: 17.5532
- Rougel: 27.97
- Rougelsum: 36.4953
- Gen Len: 77.6173
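No usage example is given in the card; a minimal sketch with the summarization pipeline (illustrative only):

```python
from transformers import pipeline

# Summarizes a news-style article with the fine-tuned FLAN-T5 small model.
summarizer = pipeline("summarization", model="thisispublic/flan-t5-small-cnndm")
article = "..."  # a news article to summarize
print(summarizer(article, max_length=128, min_length=30)[0]["summary_text"])
```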
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.27.0.dev0
- Pytorch 1.13.0+cu117
- Datasets 2.7.1
- Tokenizers 0.12.1
|
joohwan/jjjtt1 | joohwan | 2023-06-20T03:35:33Z | 3 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2023-06-20T02:40:11Z | ---
tags:
- generated_from_trainer
model-index:
- name: waaaaabh
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# waaaaabh
This model is a fine-tuned version of [Hyuk/wav2vec2-korean-v2](https://huggingface.co/Hyuk/wav2vec2-korean-v2) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2211
- Cer: 0.2478
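A minimal transcription sketch (the audio path is a placeholder; 16 kHz mono audio is assumed, as is typical for wav2vec2 models):

```python
from transformers import pipeline

# Transcribes a Korean speech clip; 16 kHz mono audio assumed.
asr = pipeline("automatic-speech-recognition", model="joohwan/jjjtt1")
print(asr("path/to/korean_speech.wav")["text"])
```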
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 20.3296 | 3.25 | 500 | 3.1922 | 0.9414 |
| 2.8739 | 6.49 | 1000 | 3.1014 | 0.7727 |
| 2.1991 | 9.74 | 1500 | 2.9818 | 0.6527 |
| 1.025 | 12.99 | 2000 | 1.1202 | 0.2879 |
| 0.2377 | 16.23 | 2500 | 1.2061 | 0.2660 |
| 0.1402 | 19.48 | 3000 | 1.2518 | 0.2629 |
| 0.1144 | 22.73 | 3500 | 1.2232 | 0.2553 |
| 0.0985 | 25.97 | 4000 | 1.2062 | 0.2478 |
| 0.0886 | 29.22 | 4500 | 1.2211 | 0.2478 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.0
- Tokenizers 0.13.3
|
Brandulio/Reinforce-Pixelcopter | Brandulio | 2023-06-20T01:29:14Z | 0 | 0 | null | [
"Pixelcopter-PLE-v0",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-20T01:29:10Z | ---
tags:
- Pixelcopter-PLE-v0
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-Pixelcopter
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Pixelcopter-PLE-v0
type: Pixelcopter-PLE-v0
metrics:
- type: mean_reward
value: 50.10 +/- 43.88
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**
This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0** .
To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
Brendan/refpydst-5p-icdst-split-v3 | Brendan | 2023-06-19T20:49:28Z | 2 | 0 | sentence-transformers | [
"sentence-transformers",
"pytorch",
"mpnet",
"feature-extraction",
"sentence-similarity",
"transformers",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2023-06-19T19:26:23Z | ---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# Brendan/refpydst-5p-icdst-split-v3
This model was initialized with `sentence-transformers/all-mpnet-base-v2` and then fine-tuned using a 5% few-shot split of the MultiWOZ dataset and a supervised contrastive loss. It is fine-tuned to be used as an in-context example retriever using this few-shot training set, which is provided in the linked repository. More details are available [in the repo](https://github.com/jlab-nlp/RefPyDST) and the paper linked within. To cite this model, please consult the citation in the [linked GitHub repository README](https://github.com/jlab-nlp/RefPyDST).
The remainder of this README is automatically generated from `sentence_transformers` and is accurate, though this model is not intended as a general-purpose sentence encoder: it expects in-context examples from MultiWOZ to be formatted in a particular way; see the linked repo for details.
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('Brendan/refpydst-5p-icdst-split-v3')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('Brendan/refpydst-5p-icdst-split-v3')
model = AutoModel.from_pretrained('Brendan/refpydst-5p-icdst-split-v3')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=Brendan/refpydst-5p-icdst-split-v3)
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 2233 with parameters:
```
{'batch_size': 24, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.OnlineContrastiveLoss.OnlineContrastiveLoss`
Parameters of the fit()-Method:
```
{
"epochs": 15,
"evaluation_steps": 800,
"evaluator": "refpydst.retriever.code.st_evaluator.RetrievalEvaluator",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 100,
"weight_decay": 0.01
}
```
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: MPNetModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information --> |
shyamsn97/Mario-GPT2-700-context-length | shyamsn97 | 2023-06-19T19:49:50Z | 946 | 13 | transformers | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"arxiv:2302.05981",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-02-14T05:28:18Z | GPT model for the paper: https://arxiv.org/abs/2302.05981 |
Curiolearner/ppo-LunarLander-v2 | Curiolearner | 2023-06-19T17:50:19Z | 0 | 0 | stable-baselines3 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-19T17:50:02Z | ---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 248.95 +/- 23.44
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
|
CodyKilpatrick/a2c-PandaReachDense-v2 | CodyKilpatrick | 2023-06-19T17:47:06Z | 3 | 0 | stable-baselines3 | [
"stable-baselines3",
"PandaReachDense-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-19T17:23:42Z | ---
library_name: stable-baselines3
tags:
- PandaReachDense-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: A2C
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: PandaReachDense-v2
type: PandaReachDense-v2
metrics:
- type: mean_reward
value: -3.16 +/- 0.33
name: mean_reward
verified: false
---
# **A2C** Agent playing **PandaReachDense-v2**
This is a trained model of a **A2C** agent playing **PandaReachDense-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
|
hhhiro/BLOOM_3B_lora_test | hhhiro | 2023-06-19T17:03:29Z | 0 | 0 | peft | [
"peft",
"region:us"
] | null | 2023-06-19T17:03:24Z | ---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.4.0.dev0
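The card lists only the PEFT version; below is a minimal loading sketch, assuming the LoRA adapter was trained on top of `bigscience/bloom-3b` (the base model is an assumption inferred from the repository name).

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model_name = "bigscience/bloom-3b"  # assumed base model, inferred from the repo name
tokenizer = AutoTokenizer.from_pretrained(base_model_name)
base_model = AutoModelForCausalLM.from_pretrained(base_model_name)

# Attach the LoRA adapter weights from this repository to the base model.
model = PeftModel.from_pretrained(base_model, "hhhiro/BLOOM_3B_lora_test")
```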
|
gilang21/Sarah | gilang21 | 2023-06-19T16:57:50Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-06-19T16:51:56Z | ---
license: creativeml-openrail-m
---
|
teddy0413/Accounting_glm0619 | teddy0413 | 2023-06-19T14:55:02Z | 1 | 0 | peft | [
"peft",
"region:us"
] | null | 2023-06-19T14:54:58Z | ---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.4.0.dev0
|
draziert/SpaceInvadersNoFrameskip-v4 | draziert | 2023-06-19T03:12:09Z | 2 | 0 | stable-baselines3 | [
"stable-baselines3",
"SpaceInvadersNoFrameskip-v4",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-19T03:11:30Z | ---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 576.00 +/- 223.45
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga draziert -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do:
```
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga draziert -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga draziert
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1500000),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
# Environment Arguments
```python
{'render_mode': 'rgb_array'}
```
|
minoosh/videomae-base-finetuned-IEMOCAP_4 | minoosh | 2023-06-18T23:55:50Z | 60 | 0 | transformers | [
"transformers",
"pytorch",
"videomae",
"video-classification",
"generated_from_trainer",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] | video-classification | 2023-06-18T19:02:34Z | ---
license: cc-by-nc-4.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: videomae-base-finetuned-IEMOCAP_4
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# videomae-base-finetuned-IEMOCAP_4
This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3971
- Accuracy: 0.2747
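A minimal inference sketch using the video-classification pipeline (the video path is a placeholder; decoding a local video additionally requires a backend such as `decord`):

```python
from transformers import pipeline

# Scores a video clip against the labels this checkpoint was fine-tuned on.
classifier = pipeline("video-classification", model="minoosh/videomae-base-finetuned-IEMOCAP_4")
print(classifier("path/to/clip.mp4"))
```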
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 4490
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.3364 | 0.1 | 450 | 1.4142 | 0.2882 |
| 1.3951 | 1.1 | 900 | 1.3692 | 0.3058 |
| 1.2918 | 2.1 | 1350 | 1.3544 | 0.3357 |
| 1.2283 | 3.1 | 1800 | 1.3673 | 0.3298 |
| 1.2638 | 4.1 | 2250 | 1.3652 | 0.3404 |
| 1.2674 | 5.1 | 2700 | 1.3265 | 0.3538 |
| 1.2737 | 6.1 | 3150 | 1.3092 | 0.3802 |
| 1.1625 | 7.1 | 3600 | 1.2969 | 0.3884 |
| 1.35 | 8.1 | 4050 | 1.3067 | 0.3726 |
| 1.1373 | 9.1 | 4490 | 1.2835 | 0.3972 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.0
- Tokenizers 0.13.3
|
BenjaminOcampo/model-bert__trained-in-sbic__seed-0 | BenjaminOcampo | 2023-06-18T22:05:50Z | 105 | 0 | transformers | [
"transformers",
"pytorch",
"bert",
"text-classification",
"en",
"arxiv:1910.09700",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-06-18T22:04:45Z | ---
language: en
---
# Model Card for BenjaminOcampo/model-bert__trained-in-sbic__seed-0
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
**Classification results dev set**
```
precision recall f1-score support
0 0.8686 0.8503 0.8594 8756
1 0.8346 0.8545 0.8445 7741
accuracy 0.8523 16497
macro avg 0.8516 0.8524 0.8519 16497
weighted avg 0.8527 0.8523 0.8524 16497
```
**Classification results test set**
```
precision recall f1-score support
0 0.8641 0.8559 0.8600 8471
1 0.8625 0.8704 0.8664 8798
accuracy 0.8633 17269
macro avg 0.8633 0.8631 0.8632 17269
weighted avg 0.8633 0.8633 0.8633 17269
```
- **Developed by:** Benjamin Ocampo
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** en
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/huggingface/huggingface_hub
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
### How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Data Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
LarryAIDraw/mai_kawakami_DG | LarryAIDraw | 2023-06-18T19:21:21Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-06-18T18:56:55Z | ---
license: creativeml-openrail-m
---
https://civitai.com/models/92718/mai-kawakami-musaigen-no-phantom-world |
ItchyB/poca-SoccerTwos | ItchyB | 2023-06-18T18:48:50Z | 51 | 0 | ml-agents | [
"ml-agents",
"tensorboard",
"onnx",
"SoccerTwos",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-SoccerTwos",
"region:us"
] | reinforcement-learning | 2023-06-18T18:48:27Z | ---
library_name: ml-agents
tags:
- SoccerTwos
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-SoccerTwos
---
# **poca** Agent playing **SoccerTwos**
This is a trained model of a **poca** agent playing **SoccerTwos**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: ItchyB/poca-SoccerTwos
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
husienburgir/gabagthaupdate | husienburgir | 2023-06-18T11:13:31Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-06-18T11:10:04Z | ---
license: creativeml-openrail-m
---
|
Middelz2/roberta-large-aphasia-narration_eps_6 | Middelz2 | 2023-06-18T10:17:42Z | 3 | 0 | transformers | [
"transformers",
"tf",
"roberta",
"fill-mask",
"generated_from_keras_callback",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2023-06-18T09:08:44Z | ---
license: mit
tags:
- generated_from_keras_callback
model-index:
- name: Middelz2/roberta-large-aphasia-narration_eps_6
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Middelz2/roberta-large-aphasia-narration_eps_6
This model is a fine-tuned version of [roberta-large](https://huggingface.co/roberta-large) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 1.0917
- Validation Loss: 1.0443
- Epoch: 4
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 1.5148 | 1.3056 | 0 |
| 1.2794 | 1.1696 | 1 |
| 1.1875 | 1.0934 | 2 |
| 1.1245 | 1.0617 | 3 |
| 1.0917 | 1.0443 | 4 |
### Framework versions
- Transformers 4.30.2
- TensorFlow 2.12.0
- Datasets 2.13.0
- Tokenizers 0.13.3
|
minoosh/videomae-base-finetuned-IEMOCAP_2 | minoosh | 2023-06-17T19:22:21Z | 59 | 0 | transformers | [
"transformers",
"pytorch",
"videomae",
"video-classification",
"generated_from_trainer",
"license:cc-by-nc-4.0",
"endpoints_compatible",
"region:us"
] | video-classification | 2023-06-17T14:36:03Z | ---
license: cc-by-nc-4.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: videomae-base-finetuned-IEMOCAP_2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# videomae-base-finetuned-IEMOCAP_2
This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3381
- Accuracy: 0.3434
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 4500
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.3215 | 0.1 | 451 | 1.4351 | 0.2622 |
| 1.3236 | 1.1 | 902 | 1.3517 | 0.3579 |
| 1.2642 | 2.1 | 1353 | 1.4280 | 0.2982 |
| 1.2741 | 3.1 | 1804 | 1.3943 | 0.3012 |
| 1.2655 | 4.1 | 2255 | 1.3665 | 0.3311 |
| 1.1476 | 5.1 | 2706 | 1.3808 | 0.3293 |
| 1.2231 | 6.1 | 3157 | 1.3216 | 0.3573 |
| 1.2715 | 7.1 | 3608 | 1.3162 | 0.3720 |
| 1.3088 | 8.1 | 4059 | 1.2985 | 0.3982 |
| 1.2636 | 9.1 | 4500 | 1.2666 | 0.4098 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.0
- Tokenizers 0.13.3
|
Enterprize1/q-FrozenLake-v1-4x4-noSlippery | Enterprize1 | 2023-06-17T11:09:18Z | 0 | 0 | null | [
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-17T11:08:59Z | ---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: FrozenLake-v1-4x4-no_slippery
type: FrozenLake-v1-4x4-no_slippery
metrics:
- type: mean_reward
value: 1.00 +/- 0.00
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1** .
## Usage
```python
model = load_from_hub(repo_id="Enterprize1/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
nolanaatama/dmnslyrkmtsnybnmstyllr | nolanaatama | 2023-06-17T02:31:22Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-06-17T02:27:58Z | ---
license: creativeml-openrail-m
---
|
TacLucas/Test | TacLucas | 2023-06-16T23:07:19Z | 0 | 0 | null | [
"arxiv:1910.09700",
"region:us"
] | null | 2023-06-16T23:06:14Z | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
{}
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
This model card aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1).
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Data Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
sambanovasystems/starcoder-toolbench | sambanovasystems | 2023-06-16T18:23:22Z | 23 | 4 | transformers | [
"transformers",
"pytorch",
"gpt_bigcode",
"text-generation",
"arxiv:2305.16504",
"arxiv:2305.06161",
"license:bigcode-openrail-m",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-05-24T04:41:14Z | ---
license: bigcode-openrail-m
---
# starcoder-toolbench
<!-- Provide a quick summary of what the model is/does. -->
starcoder-toolbench is a 15-billion-parameter model for API-based action generation. It is instruction-tuned from [starcoder](https://huggingface.co/bigcode/starcoder) on API-based action generation datasets.
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [SambaNova Systems](https://sambanova.ai/)
- **Model type:** Language Model
- **Language(s):** English
- **License:** bigcode-openrail-m
- **Finetuned from model:** [starcoder](https://huggingface.co/bigcode/starcoder)
### Basic Information
<!-- Provide the basic links for the model. -->
- **Paper**: [link](https://arxiv.org/abs/2305.16504)
- **Github**: [link](https://github.com/sambanova/toolbench)
## Uses
<details>
<summary>Click to expand</summary>
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
This model is intended for commercial and research use.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
starcoder-toolbench should NOT be used for purposes other than API-based action generation.
</details>
---
## How to Get Started with the Model
<details>
<summary>Click to expand</summary>
### Loading the model with Hugging Face
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("sambanovasystems/starcoder-toolbench")
model = AutoModelForCausalLM.from_pretrained("sambanovasystems/starcoder-toolbench", device_map="auto", torch_dtype="auto")
```
### Example Prompts To Try in GPU Tutorial
Prompt 1:
```
I have the following set of API:\n\n# To set the maximum commute time in minute to your office location, assuming the office location is already defined\nAPI.set_max_commute_time(value: int)\n\n# To set the maximum home size in square feet\nAPI.set_max_square_feet(value: int)\n\n# To set the minimum home price in dollars\nAPI.set_min_price(value: int)\n\n# To set the number of garage(s)\nAPI.set_num_garages(value: int)\n\n# To set home types for search. For home buying, home_types choices are: \"House\", \"Townhouse\", \"Condo\", \"Land\", \"Multi-family\", \"Mobile\", \"Co-op\"; for home renting, home_types choices are: \"House\", \"Townhouse\", \"Condo\", \"Apartment\".\nAPI.select_home_type(home_types: List[str])\n\n# To set the number of balconies\nAPI.set_num_balconies(value: int)\n\n# Submit criterion to get search results. This function should be called after setting all the criterion.\nAPI.search()\n\n# To set the floor number\nAPI.set_floor_number(value: int)\n\n# To set the number of bedroom(s)\nAPI.set_num_beds(value: int)\n\n# To set the number of swimming pool(s)\nAPI.set_num_swimming_pools(value: int)\n\n# To set the maximum home price in dollars\nAPI.set_max_price(value: int)\n\n# To specify whether to search homes for buying or renting. 'value' can be chosen from ['buy', 'rent']. This function must be called after setting the location and before setting any other criteria.\nAPI.set_buy_or_rent(value: str)\n\n# To set the number of bathroom(s)\nAPI.set_num_baths(value: float)\n\n# To set the location for the search area. This function must be called before setting any criteria.\nAPI.set_location(value: string)\n\n# To set the minimum home size in square feet\nAPI.set_min_square_feet(value: int)\n\n-------------\n\nTask: Looking for homes to rent in Santa Clarita with a price range between $110000 and $1753000, a minimum of 1700 square feet, at least 2 balconies, and 3.5 bathrooms.\nAction:\n
```
Prompt 2:
```
I have the following set of API:\n\n# To set the location for hotel search, given a Loc object. This function must be called if booking type is 'hotels' or 'both'.\nAPI.set_hotel_location(Loc)\n\n# To set the number of hotel rooms to book.\nAPI.set_num_rooms(value)\n\n# To set the location for departure, given a Loc object. This function must be called if booking type is 'trip tickets' or 'both'.\nAPI.set_origin(Loc)\n\n# To select the transportation type from ['flight', 'train', 'bus', 'cruise']. This function must be called if booking type is 'trip tickets' or 'both'.\nAPI.select_transportation(transportation_type)\n\n# To set the return date of the trip, given a Date object. If booking type is 'both' and this function is not called explicitly, 'return_date' will be set to 'hotel_checkout_date' implicitly.\nAPI.set_return_date(Date)\n\n# To set the hotel check-in date, given a Date object. This function must be called if booking type is 'hotels' or 'both'.\nAPI.set_checkin_date(Date)\n\n# To define a date.\ndate = Date(month, day, year)\n\n# To set the departure date of the trip, given a Date object. This function must be called if booking type is 'trip tickets'. If booking type is 'both' and this function is not called explicitly, 'departure_date' will be set to 'hotel_checkin_date' implicitly.\nAPI.set_departure_date(Date)\n\n# To set the location for arrival, given a Loc object. This function must be called if booking type is 'trip tickets' or 'both'.\nAPI.set_destination(Loc)\n\n# To define a location of a given city 'City'.\nlocation = Loc('City')\n\n# To set maximum hotel room price.\nAPI.set_max_room_price(value)\n\n# To set minimum ticket price.\nAPI.set_min_ticket_price(value)\n\n# To select the booking type from ['hotels', 'trip tickets', 'both']. This function must be called before setting any criteria.\nAPI.select_booking_type(booking_type)\n\n# To set minimum hotel room price.\nAPI.set_min_room_price(value)\n\n# To set the number of child tickets to purchase.\nAPI.set_num_children(value)\n\n# To set the number of adult tickets to purchase.\nAPI.set_num_adults(value)\n\n# To select the hotel room type from ['King Bed', 'Queen Bed', 'Double', 'Luxury'].\nAPI.select_room_type(room_type)\n\n# To set maximum ticket price.\nAPI.set_max_ticket_price(value)\n\n# Submit criterion to get search results. This function should be called after setting all the criterion.\nAPI.search()\n\n# To set the hotel check-out date, given a Date object. This function must be called if booking type is 'hotels' or 'both'.\nAPI.set_checkout_date(Date)\n\n-------------\n\nTask: Looking to book 2 adult and 4 child tickets from Stockton to Baltimore by cruise, on 2023-07-29.\nAction:\n
```
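The prompts above are raw strings. Below is a minimal generation sketch, assuming `model` and `tokenizer` were loaded as shown earlier; the decoding settings are illustrative and not prescribed by this card.
```python
# Minimal sketch: feed one of the example prompts to the model and decode only the new tokens.
prompt = "I have the following set of API: ...\n\nTask: ...\nAction:\n"  # stand-in for a full example prompt above

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Strip the prompt tokens so only the generated action sequence remains.
new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```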
</details>
---
## Training Details
<details>
<summary>Click to expand</summary>
### Training Data
<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
The training data is curated for the 8 tasks in ToolBench. See Appendix A of the [paper](https://arxiv.org/abs/2305.16504) for task details and Appendix C.1 for the training data curation details. In total, there are 9704 training samples, organized in all-shot format as described in Appendix C.2. Here is the [download link](https://drive.google.com/file/d/1lUatLGnSVhfy1uVIPEQ7qCoLtnCIXi2O/view?usp=sharing) to the training data.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
We trained starcoder-toolbench on 4 80GB A100 GPUs. We started from [starcoder](https://huggingface.co/bigcode/starcoder) and fine-tuned it on the dataset mentioned above.
### Hyperparameters
- Hardware: A100 GPU
- Optimizer: AdamW
- Grad accumulation: 1
- Epochs: 8
- Global Batch size: 16
- Batch tokens: 16 * 2048 = 32,768 tokens
- Learning Rate: 1e-5
- Learning Rate Scheduler: Fixed LR
- Weight decay: 0.1
</details>
## Acknowledgment
We would like to express our gratitude for the great work done in [StarCoder: may the source be with you!](https://arxiv.org/abs/2305.06161)
## Cite starcoder-toolbench
```
@misc{xu2023tool,
title={On the Tool Manipulation Capability of Open-source Large Language Models},
author={Qiantong Xu and Fenglu Hong and Bo Li and Changran Hu and Zhengyu Chen and Jian Zhang},
year={2023},
eprint={2305.16504},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
LarryAIDraw/mitsuki_nase-07 | LarryAIDraw | 2023-06-16T17:23:09Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-06-16T17:13:47Z | ---
license: creativeml-openrail-m
---
https://civitai.com/models/90270/mitsuki-nase-kyoukai-no-kanata |
studio-ousia/mluke-large-lite | studio-ousia | 2023-06-16T13:55:27Z | 1,461 | 2 | transformers | [
"transformers",
"pytorch",
"luke",
"fill-mask",
"named entity recognition",
"relation classification",
"question answering",
"multilingual",
"ar",
"bn",
"de",
"el",
"en",
"es",
"fi",
"fr",
"hi",
"id",
"it",
"ja",
"ko",
"nl",
"pl",
"pt",
"ru",
"sv",
"sw",
"te",
"th",
"tr",
"vi",
"zh",
"arxiv:2010.01057",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | fill-mask | 2022-04-13T10:48:26Z | ---
language:
- multilingual
- ar
- bn
- de
- el
- en
- es
- fi
- fr
- hi
- id
- it
- ja
- ko
- nl
- pl
- pt
- ru
- sv
- sw
- te
- th
- tr
- vi
- zh
thumbnail: https://github.com/studio-ousia/luke/raw/master/resources/luke_logo.png
tags:
- luke
- named entity recognition
- relation classification
- question answering
license: apache-2.0
---
## mLUKE
**mLUKE** (multilingual LUKE) is a multilingual extension of LUKE.
Please check the [official repository](https://github.com/studio-ousia/luke) for
more details and updates.
This is the mLUKE large model with 24 hidden layers and a hidden size of 1024. The total number
of parameters in this model is 561M.
The model was initialized with the weights of XLM-RoBERTa (large) and trained using the December 2020 version of Wikipedia in 24 languages.
This model is a lightweight version of [studio-ousia/mluke-large](https://huggingface.co/studio-ousia/mluke-large), without Wikipedia entity embeddings and with only special entities such as `[MASK]`.
## Note
When you load the model from `AutoModel.from_pretrained` with the default configuration, you will see the following warning:
```
Some weights of the model checkpoint at studio-ousia/mluke-base-lite were not used when initializing LukeModel: [
'luke.encoder.layer.0.attention.self.w2e_query.weight', 'luke.encoder.layer.0.attention.self.w2e_query.bias',
'luke.encoder.layer.0.attention.self.e2w_query.weight', 'luke.encoder.layer.0.attention.self.e2w_query.bias',
'luke.encoder.layer.0.attention.self.e2e_query.weight', 'luke.encoder.layer.0.attention.self.e2e_query.bias',
...]
```
These weights are the weights for entity-aware attention (as described in [the LUKE paper](https://arxiv.org/abs/2010.01057)).
This is expected because `use_entity_aware_attention` is set to `false` by default; the pretrained checkpoint still contains these weights so that they can be loaded into the model if you enable `use_entity_aware_attention`.
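As a minimal loading sketch (passing `use_entity_aware_attention` as a `from_pretrained` keyword is an assumption about the configuration interface; consult the official repository for the canonical usage):
```python
from transformers import AutoTokenizer, AutoModel

# Default load: entity-aware attention stays disabled, so the warning above is expected and harmless.
tokenizer = AutoTokenizer.from_pretrained("studio-ousia/mluke-large-lite")
model = AutoModel.from_pretrained("studio-ousia/mluke-large-lite")

# Sketch: enable entity-aware attention so the extra query weights are actually loaded.
model_ea = AutoModel.from_pretrained(
    "studio-ousia/mluke-large-lite",
    use_entity_aware_attention=True,
)
```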
### Citation
If you find mLUKE useful for your work, please cite the following paper:
```latex
@inproceedings{ri-etal-2022-mluke,
title = "m{LUKE}: {T}he Power of Entity Representations in Multilingual Pretrained Language Models",
author = "Ri, Ryokan and
Yamada, Ikuya and
Tsuruoka, Yoshimasa",
booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
year = "2022",
url = "https://aclanthology.org/2022.acl-long.505",
}
```
|
gokuls/sa_BERT_48_mnli | gokuls | 2023-06-16T12:38:17Z | 131 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-06-16T06:16:12Z | ---
language:
- en
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: sa_BERT_48_mnli
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE MNLI
type: glue
config: mnli
split: validation_matched
args: mnli
metrics:
- name: Accuracy
type: accuracy
value: 0.7034174125305126
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# sa_BERT_48_mnli
This model is a fine-tuned version of [gokuls/bert_base_48](https://huggingface.co/gokuls/bert_base_48) on the GLUE MNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7082
- Accuracy: 0.7034
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an approximate `TrainingArguments` sketch follows the list):
- learning_rate: 4e-05
- train_batch_size: 96
- eval_batch_size: 96
- seed: 10
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
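As an approximate `TrainingArguments` sketch only (the multi-GPU distributed setup and the exact `Trainer` wiring are not captured here, and the original training script is not shown in this card):
```python
from transformers import TrainingArguments

# Rough mapping of the hyperparameters listed above; illustrative, not the original script.
training_args = TrainingArguments(
    output_dir="sa_BERT_48_mnli",
    learning_rate=4e-5,
    per_device_train_batch_size=96,
    per_device_eval_batch_size=96,
    seed=10,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```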
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9145 | 1.0 | 4091 | 0.8006 | 0.6536 |
| 0.7442 | 2.0 | 8182 | 0.7245 | 0.6903 |
| 0.6631 | 3.0 | 12273 | 0.7323 | 0.6979 |
| 0.5942 | 4.0 | 16364 | 0.7073 | 0.7076 |
| 0.5241 | 5.0 | 20455 | 0.7475 | 0.7016 |
| 0.4526 | 6.0 | 24546 | 0.8377 | 0.7088 |
| 0.3842 | 7.0 | 28637 | 0.8736 | 0.6956 |
| 0.3213 | 8.0 | 32728 | 0.9334 | 0.6945 |
| 0.2669 | 9.0 | 36819 | 1.0196 | 0.7027 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.14.0a0+410ce96
- Datasets 2.13.0
- Tokenizers 0.13.3
|
KHEW/Daimondoll | KHEW | 2023-06-16T10:46:55Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-06-16T10:45:55Z | ---
license: creativeml-openrail-m
---
|
aditya9729/starcoder_sc_final_2_epochs | aditya9729 | 2023-06-16T03:32:35Z | 0 | 0 | null | [
"pytorch",
"generated_from_trainer",
"license:bigcode-openrail-m",
"region:us"
] | null | 2023-06-16T00:51:59Z | ---
license: bigcode-openrail-m
tags:
- generated_from_trainer
model-index:
- name: starcoder_sc_final_2_epochs
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# starcoder_sc_final_2_epochs
This model is a fine-tuned version of [bigcode/starcoder](https://huggingface.co/bigcode/starcoder) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5576
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.6682 | 1.0 | 152 | 0.6275 |
| 0.495 | 1.99 | 304 | 0.5576 |
### Framework versions
- Transformers 4.30.0.dev0
- Pytorch 1.13.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
|
Carina124/plant-vit-model-4-final | Carina124 | 2023-06-15T23:42:20Z | 252 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"vit",
"image-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | image-classification | 2023-06-15T21:50:32Z | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: plant-vit-model-4-final
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# plant-vit-model-4-final
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0741
- Precision: 1.0
- Recall: 1.0
- F1: 1.0
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 1.2499 | 1.0 | 83 | 0.9951 | 0.9157 | 0.9157 | 0.9157 | 0.9157 |
| 0.4403 | 2.0 | 166 | 0.3535 | 0.9845 | 0.9845 | 0.9845 | 0.9845 |
| 0.2681 | 3.0 | 249 | 0.2108 | 0.9973 | 0.9973 | 0.9973 | 0.9973 |
| 0.2026 | 4.0 | 332 | 0.1501 | 0.9984 | 0.9984 | 0.9984 | 0.9984 |
| 0.158 | 5.0 | 415 | 0.1212 | 0.9984 | 0.9984 | 0.9984 | 0.9984 |
| 0.1382 | 6.0 | 498 | 0.1024 | 0.9984 | 0.9984 | 0.9984 | 0.9984 |
| 0.1233 | 7.0 | 581 | 0.0882 | 0.9995 | 0.9995 | 0.9995 | 0.9995 |
| 0.1026 | 8.0 | 664 | 0.0790 | 0.9995 | 0.9995 | 0.9995 | 0.9995 |
| 0.1046 | 9.0 | 747 | 0.0741 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0881 | 10.0 | 830 | 0.0731 | 1.0 | 1.0 | 1.0 | 1.0 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.13.0
- Tokenizers 0.13.3
|
NBRZ/gpt2-dp | NBRZ | 2023-06-15T22:05:33Z | 134 | 0 | transformers | [
"transformers",
"pytorch",
"gpt2",
"text-generation",
"generated_from_trainer",
"dataset:generator",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-06-15T20:50:09Z | ---
license: mit
tags:
- generated_from_trainer
datasets:
- generator
model-index:
- name: gpt2-dp
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# gpt2-dp
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 4.1923
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 1000
- num_epochs: 9
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 6.303 | 2.14 | 500 | 5.0458 |
| 4.5667 | 4.27 | 1000 | 4.4729 |
| 3.9756 | 6.41 | 1500 | 4.2455 |
| 3.5295 | 8.55 | 2000 | 4.1923 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.11.0+cu113
- Datasets 2.13.0
- Tokenizers 0.13.3
|
hangeol/3 | hangeol | 2023-06-15T18:58:18Z | 0 | 0 | diffusers | [
"diffusers",
"tensorboard",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"textual_inversion",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:adapter:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | text-to-image | 2023-06-14T19:07:04Z |
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- textual_inversion
inference: true
---
# Textual inversion text2image fine-tuning - hangeol/3
These are textual inversion adaptation weights for runwayml/stable-diffusion-v1-5. You can find some example images below.
|
panpannn/pitloraaa | panpannn | 2023-06-15T15:10:03Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-06-15T15:08:25Z | ---
license: creativeml-openrail-m
---
|
anth0nyhak1m/CFGFP_BasicTypeCalssifier | anth0nyhak1m | 2023-06-15T15:00:14Z | 18 | 0 | transformers | [
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-06-15T14:59:34Z | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: CFGFP_BasicTypeCalssifier
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# CFGFP_BasicTypeCalssifier
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9680
- Accuracy: 0.8450
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.4133 | 1.0 | 3321 | 1.2102 | 0.8081 |
| 0.9236 | 2.0 | 6642 | 0.9680 | 0.8450 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.13.0
- Tokenizers 0.13.3
|
chencjiajy/ppo-Huggy | chencjiajy | 2023-06-15T13:26:06Z | 7 | 0 | ml-agents | [
"ml-agents",
"tensorboard",
"onnx",
"Huggy",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Huggy",
"region:us"
] | reinforcement-learning | 2023-06-15T13:25:56Z | ---
library_name: ml-agents
tags:
- Huggy
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: chencjiajy/ppo-Huggy
3. Select your *.nn or *.onnx file
4. Click on Watch the agent play 👀
|
gokuls/sa_BERT_no_pretrain_stsb | gokuls | 2023-06-15T08:03:24Z | 129 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-05-29T14:26:57Z | ---
language:
- en
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- spearmanr
model-index:
- name: sa_BERT_no_pretrain_stsb
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE STSB
type: glue
config: stsb
split: validation
args: stsb
metrics:
- name: Spearmanr
type: spearmanr
value: 0.12459536879199183
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# sa_BERT_no_pretrain_stsb
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the GLUE STSB dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5396
- Pearson: 0.1394
- Spearmanr: 0.1246
- Combined Score: 0.1320
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 96
- eval_batch_size: 96
- seed: 10
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Pearson | Spearmanr | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:---------:|:--------------:|
| 2.257 | 1.0 | 60 | 3.1111 | 0.0528 | 0.0709 | 0.0619 |
| 2.0476 | 2.0 | 120 | 2.5396 | 0.1394 | 0.1246 | 0.1320 |
| 1.8905 | 3.0 | 180 | 2.5928 | 0.1553 | 0.1593 | 0.1573 |
| 1.5383 | 4.0 | 240 | 3.1130 | 0.1930 | 0.2086 | 0.2008 |
| 1.3384 | 5.0 | 300 | 2.8651 | 0.1788 | 0.2014 | 0.1901 |
| 1.1299 | 6.0 | 360 | 2.9651 | 0.1818 | 0.1947 | 0.1883 |
| 1.0952 | 7.0 | 420 | 2.6404 | 0.2100 | 0.2124 | 0.2112 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.14.0a0+410ce96
- Datasets 2.12.0
- Tokenizers 0.13.3
|
octipuw/unit1LunarLander | octipuw | 2023-06-15T04:23:55Z | 0 | 0 | stable-baselines3 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-15T04:23:37Z | ---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 252.30 +/- 16.42
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
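A minimal loading sketch follows; the checkpoint filename below is hypothetical, so check the repository's file list for the actual name.
```python
import gymnasium as gym  # with older stable-baselines3 versions, use `import gym` instead
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy

# NOTE: the filename is a guess; replace it with the real .zip name from this repo.
checkpoint = load_from_hub(repo_id="octipuw/unit1LunarLander", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)

eval_env = gym.make("LunarLander-v2")
mean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10, deterministic=True)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```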
|
nolanaatama/shnhlrcllctnftrtrs | nolanaatama | 2023-06-15T00:53:13Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-06-15T00:51:20Z | ---
license: creativeml-openrail-m
---
|
Stevenhpy/unsup-simcse-bert-base-uncased-focal | Stevenhpy | 2023-06-14T23:11:49Z | 103 | 0 | transformers | [
"transformers",
"pytorch",
"bert",
"feature-extraction",
"sentence-similarity",
"endpoints_compatible",
"region:us"
] | sentence-similarity | 2023-06-14T22:55:56Z | ---
pipeline_tag: sentence-similarity
---
## How to Get Started with the Model
please refer to https://github.com/puerrrr/Focal-InfoNCE
## Model Card Contact
[email protected] |
LarryAIDraw/arima_kana_v08 | LarryAIDraw | 2023-06-14T20:31:19Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-06-14T20:29:10Z | ---
license: creativeml-openrail-m
---
https://civitai.com/models/80779/arima-kana-oshi-no-ko |
cambridgeltl/SapBERT-UMLS-2020AB-all-lang-from-XLMR | cambridgeltl | 2023-06-14T19:00:30Z | 299,120 | 3 | transformers | [
"transformers",
"pytorch",
"safetensors",
"xlm-roberta",
"feature-extraction",
"arxiv:2010.11784",
"endpoints_compatible",
"region:us"
] | feature-extraction | 2022-03-02T23:29:05Z | ---
language: multilingual
tags:
- biomedical
- lexical-semantics
- cross-lingual
datasets:
- UMLS
---
**[news]** A cross-lingual extension of SapBERT will appear in the main conference of **ACL 2021**! <br>
**[news]** SapBERT will appear in the conference proceedings of **NAACL 2021**!
### SapBERT-XLMR
SapBERT [(Liu et al. 2020)](https://arxiv.org/pdf/2010.11784.pdf) trained with [UMLS](https://www.nlm.nih.gov/research/umls/licensedcontent/umlsknowledgesources.html) 2020AB, using [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) as the base model. Please use [CLS] as the representation of the input.
#### Extracting embeddings from SapBERT
The following script converts a list of strings (entity names) into embeddings.
```python
import numpy as np
import torch
from tqdm.auto import tqdm
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("cambridgeltl/SapBERT-UMLS-2020AB-all-lang-from-XLMR")
model = AutoModel.from_pretrained("cambridgeltl/SapBERT-UMLS-2020AB-all-lang-from-XLMR").cuda()
# replace with your own list of entity names
all_names = ["covid-19", "Coronavirus infection", "high fever", "Tumor of posterior wall of oropharynx"]
bs = 128 # batch size during inference
all_embs = []
for i in tqdm(np.arange(0, len(all_names), bs)):
toks = tokenizer.batch_encode_plus(all_names[i:i+bs],
padding="max_length",
max_length=25,
truncation=True,
return_tensors="pt")
toks_cuda = {}
for k,v in toks.items():
toks_cuda[k] = v.cuda()
cls_rep = model(**toks_cuda)[0][:,0,:] # use CLS representation as the embedding
all_embs.append(cls_rep.cpu().detach().numpy())
all_embs = np.concatenate(all_embs, axis=0)
```
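As a small usage example (not part of the original script), the embeddings can be compared with cosine similarity to link a query mention to its closest entity name:
```python
# Usage sketch: find the entity name closest to the first mention ("covid-19") by cosine similarity.
query_emb = all_embs[0]
cos_sim = (all_embs @ query_emb) / (np.linalg.norm(all_embs, axis=1) * np.linalg.norm(query_emb))
nearest = np.argsort(-cos_sim)[1]  # index 0 is the query itself
print(all_names[nearest], cos_sim[nearest])
```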
For more details about training and eval, see SapBERT [github repo](https://github.com/cambridgeltl/sapbert).
### Citation
```bibtex
@inproceedings{liu2021learning,
title={Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking},
author={Liu, Fangyu and Vuli{\'c}, Ivan and Korhonen, Anna and Collier, Nigel},
booktitle={Proceedings of ACL-IJCNLP 2021},
month = aug,
year={2021}
}
``` |
gokuls/hBERTv1_new_pretrain_48_KD_sst2 | gokuls | 2023-06-14T17:00:34Z | 45 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"hybridbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-06-14T16:10:40Z | ---
language:
- en
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: hBERTv1_new_pretrain_48_KD_sst2
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE SST2
type: glue
config: sst2
split: validation
args: sst2
metrics:
- name: Accuracy
type: accuracy
value: 0.8165137614678899
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# hBERTv1_new_pretrain_48_KD_sst2
This model is a fine-tuned version of [gokuls/bert_12_layer_model_v1_complete_training_new_48_KD](https://huggingface.co/gokuls/bert_12_layer_model_v1_complete_training_new_48_KD) on the GLUE SST2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4098
- Accuracy: 0.8165
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 10
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.3947 | 1.0 | 527 | 0.4098 | 0.8165 |
| 0.2426 | 2.0 | 1054 | 0.4796 | 0.8257 |
| 0.1948 | 3.0 | 1581 | 0.4835 | 0.8188 |
| 0.1702 | 4.0 | 2108 | 0.5116 | 0.8028 |
| 0.1484 | 5.0 | 2635 | 0.5547 | 0.8085 |
| 0.1355 | 6.0 | 3162 | 0.6598 | 0.7993 |
### Framework versions
- Transformers 4.30.2
- Pytorch 1.14.0a0+410ce96
- Datasets 2.12.0
- Tokenizers 0.13.3
|
ABrinkmann/deberta-v3-large-ad-opentag-finetuned-ner-2epochs | ABrinkmann | 2023-06-14T12:10:32Z | 105 | 0 | transformers | [
"transformers",
"pytorch",
"deberta-v2",
"token-classification",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | token-classification | 2023-06-14T08:33:49Z | ---
license: mit
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: deberta-v3-large-ad-opentag-finetuned-ner-2epochs
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deberta-v3-large-ad-opentag-finetuned-ner-2epochs
This model is a fine-tuned version of [microsoft/deberta-v3-large](https://huggingface.co/microsoft/deberta-v3-large) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0023
- Precision: 0.9843
- Recall: 0.9911
- F1: 0.9877
- Accuracy: 0.9996
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0041 | 1.0 | 8909 | 0.0036 | 0.9746 | 0.9838 | 0.9792 | 0.9993 |
| 0.0032 | 2.0 | 17818 | 0.0023 | 0.9843 | 0.9911 | 0.9877 | 0.9996 |
### Framework versions
- Transformers 4.30.1
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
|
Jeibros/a2c-v1 | Jeibros | 2023-06-14T09:40:42Z | 0 | 0 | stable-baselines3 | [
"stable-baselines3",
"PandaReachDense-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-05T09:17:12Z | ---
library_name: stable-baselines3
tags:
- PandaReachDense-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: A2C
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: PandaReachDense-v2
type: PandaReachDense-v2
metrics:
- type: mean_reward
value: -0.80 +/- 0.27
name: mean_reward
verified: false
---
# **A2C** Agent playing **PandaReachDense-v2**
This is a trained model of an **A2C** agent playing **PandaReachDense-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
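A minimal loading sketch (the checkpoint filename below is hypothetical, and the `panda-gym` package is assumed to provide the PandaReachDense environment):
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import A2C

# NOTE: the filename is a guess; replace it with the real .zip name from this repo.
checkpoint = load_from_hub(repo_id="Jeibros/a2c-v1", filename="a2c-PandaReachDense-v2.zip")
model = A2C.load(checkpoint)
# Evaluating it requires an environment, e.g. gym.make("PandaReachDense-v2") with panda-gym installed.
```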
|
Honkware/Manticore-13b-Landmark | Honkware | 2023-06-14T05:37:48Z | 6 | 8 | transformers | [
"transformers",
"pytorch",
"llama",
"text-generation",
"custom_code",
"arxiv:2305.16300",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2023-06-13T01:02:34Z | ---
license: other
---
# Manticore-13b-Landmark
## Key Features
- **[Landmark Attention](https://arxiv.org/pdf/2305.16300v1.pdf)**
- **[Large Context Size (~18k)](https://i.ibb.co/tLLGLNc/image.jpg)**
## Composition
Manticore-13b-Landmark is a blend of:
- [Manticore-13B](https://huggingface.co/openaccess-ai-collective/manticore-13b)
- [Manticore-13B-Landmark-QLoRA](https://huggingface.co/Honkware/Manticore-13b-Landmark-QLoRA)
## Using [Oobabooga](https://github.com/oobabooga/text-generation-webui)
- Trust Remote Code - **(Enabled)** (the Landmark attention implementation ships as custom code; see the loading sketch below)
- Add the bos_token to the beginning of prompts - **(Disabled)**
- Truncate the prompt up to this length - **(Increased)**
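Outside of Oobabooga, the same requirement applies when loading with `transformers`. A minimal sketch (the prompt format and generation settings here are illustrative only):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Honkware/Manticore-13b-Landmark"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # required: the Landmark attention implementation is custom code
    device_map="auto",
)

# add_special_tokens=False mirrors the "bos_token disabled" setting above; the prompt format is illustrative.
inputs = tokenizer("USER: Introduce yourself\nASSISTANT:", return_tensors="pt", add_special_tokens=False).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```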
## Landmark Training Code
See [GitHub](https://github.com/eugenepentland/landmark-attention-qlora) for the training code. |
Lunetta/Zxchb | Lunetta | 2023-06-14T04:44:36Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-06-14T04:43:39Z | ---
license: creativeml-openrail-m
---
|
mayonek/mayonek1 | mayonek | 2023-06-14T03:11:01Z | 3 | 0 | transformers | [
"transformers",
"pytorch",
"t5",
"text2text-generation",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2023-06-14T02:57:41Z | torchrun --nproc_per_node=1 --master_port=9778 fastchat/train/train_flant5.py \
--model_name_or_path lmsys/fastchat-t5-3b-v1.0 \
--data_path /workspace/processed_data.json \
--bf16 True \
--output_dir ./workspace/RYRMODEL \
--num_train_epochs 3 \
--per_device_train_batch_size 1 \
--per_device_eval_batch_size 1 \
--gradient_accumulation_steps 4 \
--evaluation_strategy "no" \
--save_strategy "steps" \
--save_steps 300 \
--save_total_limit 1 \
--learning_rate 2e-5 \
--weight_decay 0. \
--warmup_ratio 0.03 \
--lr_scheduler_type "cosine" \
--logging_steps 1 \
--fsdp "full_shard auto_wrap" \
--fsdp_transformer_layer_cls_to_wrap T5Block \
--tf32 True \
--model_max_length 2048 \
--preprocessed_path ./preprocessed_data/processed.json \
    --gradient_checkpointing True
```
|
vhahvhah/my_Portugalian_model | vhahvhah | 2023-06-13T16:08:00Z | 105 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"dataset:opus_books",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2023-06-07T17:13:02Z | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- opus_books
metrics:
- bleu
model-index:
- name: my_Portugalian_model
results:
- task:
name: Sequence-to-sequence Language Modeling
type: text2text-generation
dataset:
name: opus_books
type: opus_books
config: en-pt
split: train
args: en-pt
metrics:
- name: Bleu
type: bleu
value: 0.8381
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_Portugalian_model
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the opus_books dataset.
It achieves the following results on the evaluation set:
- Loss: 3.5905
- Bleu: 0.8381
- Gen Len: 17.1601
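A usage sketch (not part of the auto-generated card; the task prefix is an assumption about how the T5 checkpoint was fine-tuned):
```python
from transformers import pipeline

# Illustrative only: the prefix and example sentence are assumptions, not taken from the training setup.
translator = pipeline("text2text-generation", model="vhahvhah/my_Portugalian_model")
print(translator("translate English to Portuguese: The book is on the table.")[0]["generated_text"])
```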
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
| No log | 1.0 | 71 | 3.6818 | 0.8636 | 17.0356 |
| No log | 2.0 | 142 | 3.5905 | 0.8381 | 17.1601 |
### Framework versions
- Transformers 4.30.1
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
leo1452/q-Taxi-v3 | leo1452 | 2023-06-13T13:24:07Z | 0 | 0 | null | [
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-13T11:38:16Z | ---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-Taxi-v3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.54 +/- 2.73
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
model = load_from_hub(repo_id="leo1452/q-Taxi-v3", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
yoninazarathy/distilbert-base-uncased-finetuned-cola | yoninazarathy | 2023-06-13T11:52:00Z | 109 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-06-13T11:45:21Z | ---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: distilbert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.5363967157085073
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-cola
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8120
- Matthews Correlation: 0.5364
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5227 | 1.0 | 535 | 0.5222 | 0.4210 |
| 0.3466 | 2.0 | 1070 | 0.5042 | 0.4832 |
| 0.2335 | 3.0 | 1605 | 0.5640 | 0.5173 |
| 0.1812 | 4.0 | 2140 | 0.7634 | 0.5200 |
| 0.1334 | 5.0 | 2675 | 0.8120 | 0.5364 |
### Framework versions
- Transformers 4.30.1
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
hasu234/image_classifier | hasu234 | 2023-06-13T10:15:18Z | 0 | 0 | null | [
"image-classification",
"en",
"license:apache-2.0",
"region:us"
] | image-classification | 2023-06-13T09:50:12Z | ---
license: apache-2.0
language:
- en
pipeline_tag: image-classification
---
This is the model for the [SDPDSSample](https://github.com/hasu234/SDPDSSample) repository.
This classifier can classify ```dog```, ```berry```, ```flower``` and ```bird``` images.

## Dependencies
* Python 3.9
## Setting up conda Environment
* Clone the repository by running
```
git clone https://github.com/hasu234/SDPDSSample.git
```
* Change current directory to SDPDSSample
```
cd SDPDSSample
```
* Create a conda environment
```
conda create -n myenv python=3.9
```
* Activate the environment
```
conda activate myenv
```
* Install the required libraries from ```requirment.txt``` by running
```
pip install -r requirment.txt
```
* Or, create a conda environment from ```environment.yml``` by running
```
conda env create -f environment.yml
```
## Training your data
* To train on your own dataset
Make sure you have a data folder with the same folder hierarchy as below
```
├── dataset
| ├── train
│ │ ├── class1
│ │ │ ├──image1.jpg
│ │ │ ├──image2.jpg
│ │ ├── class2
│ │ │ ├──image1.jpg
│ │ │ ├──image2.jpg
│ │ ├── class3
│ │ │ ├──image1.jpg
│ │ │ ├──image2.jpg
│ │ ├── class4
│ │ │ ├──image1.jpg
│ │ │ ├──image2.jpg
| ├── test
│ │ ├── class1
│ │ │ ├──image1.jpg
│ │ │ ├──image2.jpg
│ │ ├── class2
│ │ │ ├──image1.jpg
│ │ │ ├──image2.jpg
│ │ ├── class3
│ │ │ ├──image1.jpg
│ │ │ ├──image2.jpg
│ │ ├── class4
│ │ │ ├──image1.jpg
│ │ │ ├──image2.jpg
```
or make some changes to ```train.py``` according to your dataset directory.
* Make sure you are in the project directory and run the ```train.py``` script with the directory of your dataset
```
python train.py /path/to/dataset_directory
```
## Running inference
* To run inference on your test data, make sure you have downloaded the pretrained model from [this link](https://drive.google.com/uc?id=197Kuuo4LhHunYLgGKfGeouNTL0WguP0T&export=download).
* Then run the ```infer.py``` script from the terminal, specifying the test image location and the downloaded pretrained model location
```
python infer.py path/to/image.jpg path/to/model.pth
```
## Running on Docker
* Clone the repository by running
```
git clone https://github.com/hasu234/SDPDSSample.git
```
* Change current directory to SDPDSSample
```
cd SDPDSSample
```
Before building the Docker image, transfer the files (model, test image, dataset) to the current working directory if you don't want to deal with Docker volumes
* Build the Docker image by running
```
docker build -t sdpdsample .
```
* Run the docker image
```
docker run -d sdpdsample
```
If the container fails to run in the background, run it in the foreground using ```docker run -it sdpdsample```, then exit to get the running container id
* Get the container id
```
docker ps
```
* Getting inside the container
```
docker exec -it <container id> bash
```
You will get a Linux-like command-line interface
* Running the project
```
# for training your data
python train.py /path/to/dataset_directory
# for running inference
python infer.py path/to/image.jpg path/to/model.pth
``` |
undrwolf/q-FrozenLake-v1-4x4-noSlippery | undrwolf | 2023-06-13T08:18:29Z | 0 | 0 | null | [
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-13T08:18:26Z | ---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: FrozenLake-v1-4x4-no_slippery
type: FrozenLake-v1-4x4-no_slippery
metrics:
- type: mean_reward
value: 1.00 +/- 0.00
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
## Usage
```python
model = load_from_hub(repo_id="undrwolf/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
timdettmers/qlora-flan-33b | timdettmers | 2023-06-13T01:23:59Z | 0 | 0 | null | [
"arxiv:2305.14314",
"arxiv:2302.13971",
"region:us"
] | null | 2023-05-22T20:36:41Z | # QLoRA Instruction Tuned Models
| [Paper](https://arxiv.org/abs/2305.14314) | [Code](https://github.com/artidoro/qlora) | [Demo](https://huggingface.co/spaces/uwnlp/guanaco-playground-tgi) |
**The `QLoRA Instruction Tuned Models` are open-source models obtained through 4-bit QLoRA tuning of LLaMA base models on various instruction tuning datasets. They are available in 7B, 13B, 33B, and 65B parameter sizes.**
**Note: The best performing chatbot models are named [Guanaco](https://huggingface.co/datasets/timdettmers/openassistant-guanaco) and finetuned on OASST1. This model card is for the other models finetuned on other instruction tuning datasets.**
⚠️ These models are purely intended for research purposes and could produce problematic outputs.
## What are QLoRA Instruction Tuned Models and why use them?
- **Strong performance on MMLU** following the QLoRA instruction tuning.
- **Replicable and efficient instruction tuning procedure** that can be extended to new use cases. QLoRA training scripts are available in the [QLoRA repo](https://github.com/artidoro/qlora).
- **Rigorous comparison to 16-bit methods** (both 16-bit full-finetuning and LoRA) in [our paper](https://arxiv.org/abs/2305.14314) demonstrates the effectiveness of 4-bit QLoRA finetuning.
- **Lightweight** checkpoints which only contain adapter weights.
## License and Intended Use
QLoRA Instruction Tuned adapter weights are available under the Apache 2 license. Note that the use of these adapter weights requires access to the LLaMA model weights, and they should therefore be used according to the LLaMA license.
## Usage
Here is an example of how you would load Flan v2 7B in 4-bits:
```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
model_name = "huggyllama/llama-7b"
adapters_name = 'timdettmers/qlora-flan-7b'
model = AutoModelForCausalLM.from_pretrained(
model_name,
load_in_4bit=True,
torch_dtype=torch.bfloat16,
device_map="auto",
max_memory= {i: '24000MB' for i in range(torch.cuda.device_count())},
quantization_config=BitsAndBytesConfig(
load_in_4bit=True,
bnb_4bit_compute_dtype=torch.bfloat16,
bnb_4bit_use_double_quant=True,
bnb_4bit_quant_type='nf4'
),
)
model = PeftModel.from_pretrained(model, adapters_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
Inference can then be performed as usual with HF models as follows:
```python
prompt = "Introduce yourself"
formatted_prompt = (
f"A chat between a curious human and an artificial intelligence assistant."
f"The assistant gives helpful, detailed, and polite answers to the user's questions.\n"
f"### Human: {prompt} ### Assistant:"
)
inputs = tokenizer(formatted_prompt, return_tensors="pt").to("cuda:0")
outputs = model.generate(inputs=inputs.input_ids, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
Expected output similar to the following:
```
A chat between a curious human and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.
### Human: Introduce yourself ### Assistant: I am an artificial intelligence assistant. I am here to help you with any questions you may have.
```
## Current Inference Limitations
Currently, 4-bit inference is slow. We recommend loading in 16 bits if inference speed is a concern. We are actively working on releasing efficient 4-bit inference kernels.
Below is how you would load the model in 16 bits:
```python
model_name = "huggyllama/llama-7b"
adapters_name = 'timdettmers/qlora-flan-7b'
model = AutoModelForCausalLM.from_pretrained(
model_name,
torch_dtype=torch.bfloat16,
device_map="auto",
max_memory= {i: '24000MB' for i in range(torch.cuda.device_count())},
)
model = PeftModel.from_pretrained(model, adapters_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
## Model Card
**Architecture**: The models released here are LoRA adapters to be used on top of LLaMA models. They are added to all layers. For all model sizes, we use $r=64$.
**Base Model**: These models use LLaMA as base model with sizes 7B, 13B, 33B, 65B. LLaMA is a causal language model pretrained on a large corpus of text. See [LLaMA paper](https://arxiv.org/abs/2302.13971) for more details. Note that these models can inherit biases and limitations of the base model.
**Finetuning Data**: These models are finetuned on various instruction tuning datasets. The datasets used are: Alpaca, HH-RLHF, Unnatural Instr., Chip2, Longform, Self-Instruct, FLAN v2.
**Languages**: The different datasets cover different languages. We direct to the various papers and resources describing the datasets for more details.
Next, we describe Training and Evaluation details.
### Training
QLoRA Instruction Tuned Models are the result of 4-bit QLoRA supervised finetuning on different instruction tuning datasets.
All models use NormalFloat4 datatype for the base model and LoRA adapters on all linear layers with BFloat16 as computation datatype. We set LoRA $r=64$, $\alpha=16$. We also use Adam beta2 of 0.999, max grad norm of 0.3 and LoRA dropout of 0.1 for models up to 13B and 0.05 for 33B and 65B models.
For the finetuning process, we use constant learning rate schedule and paged AdamW optimizer.
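As an illustration only (the authoritative scripts live in the [QLoRA repo](https://github.com/artidoro/qlora)), the settings described above roughly correspond to the following configuration; the dropout and learning rate shown are for the smaller models, and the exact `target_modules` list depends on the base model:
```python
import torch
from transformers import BitsAndBytesConfig, TrainingArguments
from peft import LoraConfig

# Rough sketch of the quantization, adapter, and optimizer settings described above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",              # NormalFloat4 base weights
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,  # BFloat16 compute datatype
)
lora_config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.1,   # 0.05 for the 33B/65B models
    bias="none",
    task_type="CAUSAL_LM",
)
training_args = TrainingArguments(
    output_dir="qlora-out",
    optim="paged_adamw_32bit",      # paged AdamW
    lr_scheduler_type="constant",   # constant learning-rate schedule
    learning_rate=2e-4,             # 1e-4 for the 33B/65B models (see table below)
    max_grad_norm=0.3,
    adam_beta2=0.999,
)
```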
### Training hyperparameters
| Parameters | Dataset | Batch size | LR | Steps | Source Length | Target Length |
|------------|----------|------------|------|-------|---------------|---------------|
| 7B | All | 16 | 2e-4 | 10000 | 384 | 128 |
| 7B | OASST1 | 16 | 2e-4 | 1875 | - | 512 |
| 7B | HH-RLHF | 16 | 2e-4 | 10000 | - | 768 |
| 7B | Longform | 16 | 2e-4 | 4000 | 512 | 1024 |
| 13B | All | 16 | 2e-4 | 10000 | 384 | 128 |
| 13B | OASST1 | 16 | 2e-4 | 1875 | - | 512 |
| 13B | HH-RLHF | 16 | 2e-4 | 10000 | - | 768 |
| 13B | Longform | 16 | 2e-4 | 4000 | 512 | 1024 |
| 33B | All | 32 | 1e-4 | 5000 | 384 | 128 |
| 33B | OASST1 | 16 | 1e-4 | 1875 | - | 512 |
| 33B | HH-RLHF | 32 | 1e-4 | 5000 | - | 768 |
| 33B | Longform | 32 | 1e-4 | 2343 | 512 | 1024 |
| 65B | All | 64 | 1e-4 | 2500 | 384 | 128 |
| 65B | OASST1 | 16 | 1e-4 | 1875 | - | 512 |
| 65B | HH-RLHF | 64 | 1e-4 | 2500 | - | 768 |
| 65B | Longform | 32 | 1e-4 | 2343 | 512 | 1024 |
### Evaluation
We use the MMLU benchmark to measure performance on a range of language understanding tasks. This is a multiple-choice benchmark covering 57 tasks including elementary mathematics, US history, computer science, law, and more. We report 5-shot test accuracy.
Dataset | 7B | 13B | 33B | 65B
---|---|---|---|---
LLaMA no tuning | 35.1 | 46.9 | 57.8 | 63.4
Self-Instruct | 36.4 | 33.3 | 53.0 | 56.7
Longform | 32.1 | 43.2 | 56.6 | 59.7
Chip2 | 34.5 | 41.6 | 53.6 | 59.8
HH-RLHF | 34.9 | 44.6 | 55.8 | 60.1
Unnatural Instruct | 41.9 | 48.1 | 57.3 | 61.3
OASST1 (Guanaco) | 36.6 | 46.4 | 57.0 | 62.2
Alpaca | 38.8 | 47.8 | 57.3 | 62.5
FLAN v2 | 44.5 | 51.4 | 59.2 | 63.9
We evaluate the generative language capabilities through automated evaluations on the Vicuna benchmark. We report the score of the QLoRA Instruction Finetuned Models relative to the score obtained by ChatGPT. The rater in this case is GPT-4, which is tasked to assign a score out of 10 to both ChatGPT and the model outputs for each prompt. We report scores for models ranging from 7B to 65B and compare them to both academic and commercial baselines.
| Model / Dataset | Params | Model bits | Memory | ChatGPT vs Sys | Sys vs ChatGPT | Mean | 95\% CI |
|------------------|--------|------------|--------|----------------|----------------|------------------|---------|
| GPT-4 | - | - | - | 119.4\% | 110.1\% | **114.5**\% | 2.6\% |
| Bard | - | - | - | 93.2\% | 96.4\% | 94.8\% | 4.1\% |
| Guanaco | 65B | 4-bit | 41 GB | 96.7\% | 101.9\% | **99.3**\% | 4.4\% |
| Alpaca | 65B | 4-bit | 41 GB | 63.0\% | 77.9\% | 70.7\% | 4.3\% |
| FLAN v2 | 65B | 4-bit | 41 GB | 37.0\% | 59.6\% | 48.4\% | 4.6\% |
| Guanaco | 33B | 4-bit | 21 GB | 96.5\% | 99.2\% | **97.8**\% | 4.4\% |
| Open Assistant | 33B | 16-bit | 66 GB | 73.4\% | 85.7\% | 78.1\% | 5.3\% |
| Alpaca | 33B | 4-bit | 21 GB | 67.2\% | 79.7\% | 73.6\% | 4.2\% |
| FLAN v2 | 33B | 4-bit | 21 GB | 26.3\% | 49.7\% | 38.0\% | 3.9\% |
| Vicuna | 13B | 16-bit | 26 GB | 91.2\% | 98.7\% | **94.9**\% | 4.5\% |
| Guanaco | 13B | 4-bit | 10 GB | 87.3\% | 93.4\% | 90.4\% | 5.2\% |
| Alpaca | 13B | 4-bit | 10 GB | 63.8\% | 76.7\% | 69.4\% | 4.2\% |
| HH-RLHF | 13B | 4-bit | 10 GB | 55.5\% | 69.1\% | 62.5\% | 4.7\% |
| Unnatural Instr. | 13B | 4-bit | 10 GB | 50.6\% | 69.8\% | 60.5\% | 4.2\% |
| Chip2 | 13B | 4-bit | 10 GB | 49.2\% | 69.3\% | 59.5\% | 4.7\% |
| Longform | 13B | 4-bit | 10 GB | 44.9\% | 62.0\% | 53.6\% | 5.2\% |
| Self-Instruct | 13B | 4-bit | 10 GB | 38.0\% | 60.5\% | 49.1\% | 4.6\% |
| FLAN v2 | 13B | 4-bit | 10 GB | 32.4\% | 61.2\% | 47.0\% | 3.6\% |
| Guanaco | 7B | 4-bit | 5 GB | 84.1\% | 89.8\% | **87.0**\% | 5.4\% |
| Alpaca | 7B | 4-bit | 5 GB | 57.3\% | 71.2\% | 64.4\% | 5.0\% |
| FLAN v2 | 7B | 4-bit | 5 GB | 33.3\% | 56.1\% | 44.8\% | 4.0\% |
## Citation
```bibtex
@article{dettmers2023qlora,
title={QLoRA: Efficient Finetuning of Quantized LLMs},
author={Dettmers, Tim and Pagnoni, Artidoro and Holtzman, Ari and Zettlemoyer, Luke},
journal={arXiv preprint arXiv:2305.14314},
year={2023}
}
``` |
timdettmers/qlora-hh-rlhf-13b | timdettmers | 2023-06-13T01:23:52Z | 0 | 0 | null | [
"arxiv:2305.14314",
"arxiv:2302.13971",
"region:us"
] | null | 2023-05-22T18:55:09Z | # QLoRA Instruction Tuned Models
| [Paper](https://arxiv.org/abs/2305.14314) | [Code](https://github.com/artidoro/qlora) | [Demo](https://huggingface.co/spaces/uwnlp/guanaco-playground-tgi) |
**The `QLoRA Instruction Tuned Models` are open-source models obtained through 4-bit QLoRA tuning of LLaMA base models on various instruction tuning datasets. They are available in 7B, 13B, 33B, and 65B parameter sizes.**
**Note: The best performing chatbot models are named [Guanaco](https://huggingface.co/datasets/timdettmers/openassistant-guanaco) and finetuned on OASST1. This model card is for the other models finetuned on other instruction tuning datasets.**
⚠️ These models are purely intended for research purposes and could produce problematic outputs.
## What are QLoRA Instruction Tuned Models and why use them?
- **Strong performance on MMLU** following the QLoRA instruction tuning.
- **Replicable and efficient instruction tuning procedure** that can be extended to new use cases. QLoRA training scripts are available in the [QLoRA repo](https://github.com/artidoro/qlora).
- **Rigorous comparison to 16-bit methods** (both 16-bit full-finetuning and LoRA) in [our paper](https://arxiv.org/abs/2305.14314) demonstrates the effectiveness of 4-bit QLoRA finetuning.
- **Lightweight** checkpoints which only contain adapter weights.
## License and Intended Use
QLoRA Instruction Tuned adapter weights are available under the Apache 2.0 license. Note that using these adapter weights requires access to the LLaMA model weights; they should therefore be used in accordance with the LLaMA license.
## Usage
Here is an example of how you would load Flan v2 7B in 4-bits:
```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
model_name = "huggyllama/llama-7b"
adapters_name = 'timdettmers/qlora-flan-7b'
model = AutoModelForCausalLM.from_pretrained(
model_name,
load_in_4bit=True,
torch_dtype=torch.bfloat16,
device_map="auto",
max_memory= {i: '24000MB' for i in range(torch.cuda.device_count())},
quantization_config=BitsAndBytesConfig(
load_in_4bit=True,
bnb_4bit_compute_dtype=torch.bfloat16,
bnb_4bit_use_double_quant=True,
bnb_4bit_quant_type='nf4'
),
)
model = PeftModel.from_pretrained(model, adapters_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
Inference can then be performed as usual with HF models as follows:
```python
prompt = "Introduce yourself"
formatted_prompt = (
f"A chat between a curious human and an artificial intelligence assistant."
f"The assistant gives helpful, detailed, and polite answers to the user's questions.\n"
f"### Human: {prompt} ### Assistant:"
)
inputs = tokenizer(formatted_prompt, return_tensors="pt").to("cuda:0")
outputs = model.generate(inputs=inputs.input_ids, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
Expected output similar to the following:
```
A chat between a curious human and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.
### Human: Introduce yourself ### Assistant: I am an artificial intelligence assistant. I am here to help you with any questions you may have.
```
## Current Inference Limitations
Currently, 4-bit inference is slow. We recommend loading in 16 bits if inference speed is a concern. We are actively working on releasing efficient 4-bit inference kernels.
Below is how you would load the model in 16 bits:
```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "huggyllama/llama-7b"
adapters_name = 'timdettmers/qlora-flan-7b'
model = AutoModelForCausalLM.from_pretrained(
model_name,
torch_dtype=torch.bfloat16,
device_map="auto",
max_memory= {i: '24000MB' for i in range(torch.cuda.device_count())},
)
model = PeftModel.from_pretrained(model, adapters_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
## Model Card
**Architecture**: The models released here are LoRA adapters to be used on top of LLaMA models. They are added to all layers. For all model sizes, we use $r=64$.
**Base Model**: These models use LLaMA as base model with sizes 7B, 13B, 33B, 65B. LLaMA is a causal language model pretrained on a large corpus of text. See [LLaMA paper](https://arxiv.org/abs/2302.13971) for more details. Note that these models can inherit biases and limitations of the base model.
**Finetuning Data**: These models are finetuned on various instruction tuning datasets. The datasets used are: Alpaca, HH-RLHF, Unnatural Instr., Chip2, Longform, Self-Instruct, FLAN v2.
**Languages**: The different datasets cover different languages. We refer the reader to the papers and resources describing each dataset for more details.
Next, we describe Training and Evaluation details.
### Training
QLoRA Instruction Tuned Models are the result of 4-bit QLoRA supervised finetuning on different instruction tuning datasets.
All models use the NormalFloat4 (NF4) datatype for the base model and LoRA adapters on all linear layers, with BFloat16 as the computation datatype. We set LoRA $r=64$ and $\alpha=16$. We also use an Adam beta2 of 0.999, a max grad norm of 0.3, and a LoRA dropout of 0.1 for models up to 13B (0.05 for the 33B and 65B models).
For the finetuning process, we use a constant learning rate schedule and the paged AdamW optimizer.
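For illustration, the quantization and adapter settings described above map roughly onto the following configuration. This is a minimal sketch using the `transformers` and `peft` APIs, not the authors' training script; the `target_modules` list is an assumption for LLaMA-style architectures, and the dropout value shown is the one used for models up to 13B.

```python
import torch
from transformers import BitsAndBytesConfig
from peft import LoraConfig

# 4-bit NF4 quantization of the base model with double quantization
# and BFloat16 as the computation datatype, as described above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# LoRA adapters with r=64 and alpha=16 on all linear layers.
# The target_modules list is an assumption for LLaMA-style models.
lora_config = LoraConfig(
    r=64,
    lora_alpha=16,
    lora_dropout=0.1,  # 0.05 for the 33B and 65B models
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)
```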
### Training hyperparameters
| Parameters | Dataset | Batch size | LR | Steps | Source Length | Target Length |
|------------|----------|------------|------|-------|---------------|---------------|
| 7B | All | 16 | 2e-4 | 10000 | 384 | 128 |
| 7B | OASST1 | 16 | 2e-4 | 1875 | - | 512 |
| 7B | HH-RLHF | 16 | 2e-4 | 10000 | - | 768 |
| 7B | Longform | 16 | 2e-4 | 4000 | 512 | 1024 |
| 13B | All | 16 | 2e-4 | 10000 | 384 | 128 |
| 13B | OASST1 | 16 | 2e-4 | 1875 | - | 512 |
| 13B | HH-RLHF | 16 | 2e-4 | 10000 | - | 768 |
| 13B | Longform | 16 | 2e-4 | 4000 | 512 | 1024 |
| 33B | All | 32 | 1e-4 | 5000 | 384 | 128 |
| 33B | OASST1 | 16 | 1e-4 | 1875 | - | 512 |
| 33B | HH-RLHF | 32 | 1e-4 | 5000 | - | 768 |
| 33B | Longform | 32 | 1e-4 | 2343 | 512 | 1024 |
| 65B | All | 64 | 1e-4 | 2500 | 384 | 128 |
| 65B | OASST1 | 16 | 1e-4 | 1875 | - | 512 |
| 65B | HH-RLHF | 64 | 1e-4 | 2500 | - | 768 |
| 65B | Longform | 32 | 1e-4 | 2343 | 512 | 1024 |
### Evaluation
We use the MMLU benchmark to measure performance on a range of language understanding tasks. This is a multiple-choice benchmark covering 57 tasks including elementary mathematics, US history, computer science, law, and more. We report 5-shot test accuracy.
Dataset | 7B | 13B | 33B | 65B
---|---|---|---|---
LLaMA no tuning | 35.1 | 46.9 | 57.8 | 63.4
Self-Instruct | 36.4 | 33.3 | 53.0 | 56.7
Longform | 32.1 | 43.2 | 56.6 | 59.7
Chip2 | 34.5 | 41.6 | 53.6 | 59.8
HH-RLHF | 34.9 | 44.6 | 55.8 | 60.1
Unnatural Instruct | 41.9 | 48.1 | 57.3 | 61.3
OASST1 (Guanaco) | 36.6 | 46.4 | 57.0 | 62.2
Alpaca | 38.8 | 47.8 | 57.3 | 62.5
FLAN v2 | 44.5 | 51.4 | 59.2 | 63.9
We evaluate the generative language capabilities through automated evaluations on the Vicuna benchmark. We report the score of the QLoRA Instruction Finetuned Models relative to the score obtained by ChatGPT. The rater in this case is GPT-4, which is tasked to assign a score out of 10 to both ChatGPT and the model outputs for each prompt. We report scores for models ranging from 7B to 65B and compare them to both academic and commercial baselines.
| Model / Dataset | Params | Model bits | Memory | ChatGPT vs Sys | Sys vs ChatGPT | Mean | 95\% CI |
|------------------|--------|------------|--------|----------------|----------------|------------------|---------|
| GPT-4 | - | - | - | 119.4\% | 110.1\% | **114.5**\% | 2.6\% |
| Bard | - | - | - | 93.2\% | 96.4\% | 94.8\% | 4.1\% |
| Guanaco | 65B | 4-bit | 41 GB | 96.7\% | 101.9\% | **99.3**\% | 4.4\% |
| Alpaca | 65B | 4-bit | 41 GB | 63.0\% | 77.9\% | 70.7\% | 4.3\% |
| FLAN v2 | 65B | 4-bit | 41 GB | 37.0\% | 59.6\% | 48.4\% | 4.6\% |
| Guanaco | 33B | 4-bit | 21 GB | 96.5\% | 99.2\% | **97.8**\% | 4.4\% |
| Open Assistant | 33B | 16-bit | 66 GB | 73.4\% | 85.7\% | 78.1\% | 5.3\% |
| Alpaca | 33B | 4-bit | 21 GB | 67.2\% | 79.7\% | 73.6\% | 4.2\% |
| FLAN v2 | 33B | 4-bit | 21 GB | 26.3\% | 49.7\% | 38.0\% | 3.9\% |
| Vicuna | 13B | 16-bit | 26 GB | 91.2\% | 98.7\% | **94.9**\% | 4.5\% |
| Guanaco | 13B | 4-bit | 10 GB | 87.3\% | 93.4\% | 90.4\% | 5.2\% |
| Alpaca | 13B | 4-bit | 10 GB | 63.8\% | 76.7\% | 69.4\% | 4.2\% |
| HH-RLHF | 13B | 4-bit | 10 GB | 55.5\% | 69.1\% | 62.5\% | 4.7\% |
| Unnatural Instr. | 13B | 4-bit | 10 GB | 50.6\% | 69.8\% | 60.5\% | 4.2\% |
| Chip2 | 13B | 4-bit | 10 GB | 49.2\% | 69.3\% | 59.5\% | 4.7\% |
| Longform | 13B | 4-bit | 10 GB | 44.9\% | 62.0\% | 53.6\% | 5.2\% |
| Self-Instruct | 13B | 4-bit | 10 GB | 38.0\% | 60.5\% | 49.1\% | 4.6\% |
| FLAN v2 | 13B | 4-bit | 10 GB | 32.4\% | 61.2\% | 47.0\% | 3.6\% |
| Guanaco | 7B | 4-bit | 5 GB | 84.1\% | 89.8\% | **87.0**\% | 5.4\% |
| Alpaca | 7B | 4-bit | 5 GB | 57.3\% | 71.2\% | 64.4\% | 5.0\% |
| FLAN v2 | 7B | 4-bit | 5 GB | 33.3\% | 56.1\% | 44.8\% | 4.0\% |
## Citation
```bibtex
@article{dettmers2023qlora,
title={QLoRA: Efficient Finetuning of Quantized LLMs},
author={Dettmers, Tim and Pagnoni, Artidoro and Holtzman, Ari and Zettlemoyer, Luke},
journal={arXiv preprint arXiv:2305.14314},
year={2023}
}
``` |
natope/mT5-bm25-10pass-all-questions-QA | natope | 2023-06-12T16:24:55Z | 3 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"mt5",
"text2text-generation",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text2text-generation | 2023-06-12T11:55:13Z | ---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: mT5-bm25-10pass-all-questions-QA
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mT5-bm25-10pass-all-questions-QA
This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on the None dataset.
It achieves the following results on the evaluation set:
- eval_loss: 2.3310
- eval_rouge1: 0.0688
- eval_rouge2: 0.0121
- eval_rougeL: 0.0614
- eval_rougeLsum: 0.0615
- eval_gen_len: 7.6507
- eval_runtime: 495.0904
- eval_samples_per_second: 7.574
- eval_steps_per_second: 7.574
- epoch: 5.0
- step: 56240
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
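As a rough sketch, these hyperparameters would map onto `Seq2SeqTrainingArguments` along the following lines; the output directory and the `predict_with_generate` flag are assumptions, since the original training script is not included in this card.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mT5-bm25-10pass-all-questions-QA",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    predict_with_generate=True,  # assumed, since ROUGE metrics are reported
)
```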
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
cartesinus/slurp-intent_baseline-xlm_r-en | cartesinus | 2023-06-12T14:44:18Z | 103 | 1 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"safetensors",
"xlm-roberta",
"text-classification",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-04-27T08:55:37Z | ---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: slurp-intent_baseline-xlm_r-en
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# slurp-intent_baseline-xlm_r-en
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the SLURP dataset.
It achieves the following results on the test set:
- Loss: 0.68222
- Accuracy: 0.8746
- F1: 0.8746
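A minimal usage sketch is shown below; the returned label names depend on the intent label mapping saved with the model, and the example utterance is only illustrative.

```python
from transformers import pipeline

# Load the fine-tuned intent classifier from the Hub
classifier = pipeline("text-classification", model="cartesinus/slurp-intent_baseline-xlm_r-en")

# Example SLURP-style utterance
print(classifier("wake me up at seven in the morning"))
```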
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 2.9687 | 1.0 | 720 | 1.3267 | 0.6955 | 0.6955 |
| 1.4534 | 2.0 | 1440 | 0.8053 | 0.8219 | 0.8219 |
| 0.6775 | 3.0 | 2160 | 0.6912 | 0.8421 | 0.8421 |
| 0.5624 | 4.0 | 2880 | 0.6377 | 0.8623 | 0.8623 |
| 0.3756 | 5.0 | 3600 | 0.6188 | 0.8746 | 0.8746 |
| 0.3346 | 6.0 | 4320 | 0.6548 | 0.8711 | 0.8711 |
| 0.2541 | 7.0 | 5040 | 0.6618 | 0.8751 | 0.8751 |
| 0.2243 | 8.0 | 5760 | 0.6662 | 0.8780 | 0.8780 |
| 0.212 | 9.0 | 6480 | 0.6673 | 0.8810 | 0.8810 |
| 0.1664 | 10.0 | 7200 | 0.6783 | 0.8810 | 0.8810 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3
|
joneslaura/donut-base-sroie | joneslaura | 2023-06-12T07:02:45Z | 3 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"vision-encoder-decoder",
"image-text-to-text",
"generated_from_trainer",
"dataset:imagefolder",
"license:mit",
"endpoints_compatible",
"region:us"
] | image-text-to-text | 2023-06-11T07:47:45Z | ---
license: mit
tags:
- generated_from_trainer
datasets:
- imagefolder
model-index:
- name: donut-base-sroie
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# donut-base-sroie
This model is a fine-tuned version of [naver-clova-ix/donut-base](https://huggingface.co/naver-clova-ix/donut-base) on the imagefolder dataset.
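A minimal inference sketch is shown below. The exact task prompt token used during fine-tuning is not documented in this card, so the `task_prompt` value and the input image path are assumptions.

```python
import torch
from PIL import Image
from transformers import DonutProcessor, VisionEncoderDecoderModel

processor = DonutProcessor.from_pretrained("joneslaura/donut-base-sroie")
model = VisionEncoderDecoderModel.from_pretrained("joneslaura/donut-base-sroie")

image = Image.open("receipt.png").convert("RGB")  # placeholder path to a receipt scan
pixel_values = processor(image, return_tensors="pt").pixel_values

# Assumed task prompt; SROIE fine-tunes usually define a dataset-specific start token.
task_prompt = "<s_sroie>"
decoder_input_ids = processor.tokenizer(
    task_prompt, add_special_tokens=False, return_tensors="pt"
).input_ids

with torch.no_grad():
    outputs = model.generate(pixel_values, decoder_input_ids=decoder_input_ids, max_length=512)
print(processor.batch_decode(outputs, skip_special_tokens=True)[0])
```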
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
### Framework versions
- Transformers 4.31.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
Retrial9842/a2c-AntBulletEnv-v0 | Retrial9842 | 2023-06-12T06:58:24Z | 1 | 0 | stable-baselines3 | [
"stable-baselines3",
"AntBulletEnv-v0",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-12T06:57:13Z | ---
library_name: stable-baselines3
tags:
- AntBulletEnv-v0
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: A2C
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: AntBulletEnv-v0
type: AntBulletEnv-v0
metrics:
- type: mean_reward
value: 1597.99 +/- 89.85
name: mean_reward
verified: false
---
# **A2C** Agent playing **AntBulletEnv-v0**
This is a trained model of a **A2C** agent playing **AntBulletEnv-v0**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
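A minimal loading sketch is given below; the checkpoint filename inside the repository is an assumption, so check the repo's files for the actual `.zip` name.

```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import A2C

# Download the checkpoint from the Hub (filename is assumed) and load it
checkpoint = load_from_hub("Retrial9842/a2c-AntBulletEnv-v0", "a2c-AntBulletEnv-v0.zip")
model = A2C.load(checkpoint)
```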
|
alea31415/world-dai-star | alea31415 | 2023-06-11T14:42:56Z | 0 | 2 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-06-11T14:41:15Z | ---
license: creativeml-openrail-m
---
|
draziert/ppo-LunarLander-v2 | draziert | 2023-06-11T08:00:09Z | 1 | 0 | stable-baselines3 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-11T07:59:51Z | ---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO-MlpPolicy
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 267.08 +/- 20.30
name: mean_reward
verified: false
---
# **PPO-MlpPolicy** Agent playing **LunarLander-v2**
This is a trained model of a **PPO-MlpPolicy** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
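A minimal loading sketch, assuming the usual checkpoint naming (verify the actual `.zip` filename in the repository):

```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

# Download the checkpoint from the Hub (filename is assumed) and load it
checkpoint = load_from_hub("draziert/ppo-LunarLander-v2", "ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```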
|
DunnBC22/mpnet-base-News_About_Gold | DunnBC22 | 2023-06-10T22:33:53Z | 105 | 1 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"mpnet",
"text-classification",
"generated_from_trainer",
"en",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-06-06T15:46:06Z | ---
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- recall
- precision
model-index:
- name: mpnet-base-News_About_Gold
results: []
language:
- en
pipeline_tag: text-classification
---
# mpnet-base-News_About_Gold
This model is a fine-tuned version of [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base).
It achieves the following results on the evaluation set:
- Loss: 0.3098
- Accuracy: 0.9068
- Weighted f1: 0.9068
- Micro f1: 0.9068
- Macro f1: 0.8351
- Weighted recall: 0.9068
- Micro recall: 0.9068
- Macro recall: 0.8406
- Weighted precision: 0.9071
- Micro precision: 0.9068
- Macro precision: 0.8309
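A minimal usage sketch for this sentiment classifier is shown below; the label names returned depend on the label mapping saved with the model, and the headline is only illustrative.

```python
from transformers import pipeline

classifier = pipeline("text-classification", model="DunnBC22/mpnet-base-News_About_Gold")
print(classifier("Gold prices rallied after the central bank announcement."))
```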
## Model description
For more information on how it was created, check out the following link: https://github.com/DunnBC22/NLP_Projects/blob/main/Sentiment%20Analysis/Sentiment%20Analysis%20of%20Commodity%20News%20-%20Gold%20(Transformer%20Comparison)/News%20About%20Gold%20-%20Sentiment%20Analysis%20-%20MPNet-Base%20with%20W%26B.ipynb
This project is part of a comparison of seven (7) transformers. Here is the README page for the comparison: https://github.com/DunnBC22/NLP_Projects/tree/main/Sentiment%20Analysis/Sentiment%20Analysis%20of%20Commodity%20News%20-%20Gold%20(Transformer%20Comparison)
## Intended uses & limitations
This model is intended to demonstrate my ability to solve a complex problem using technology.
## Training and evaluation data
Dataset Source: https://www.kaggle.com/datasets/ankurzing/sentiment-analysis-in-commodity-market-gold
_Input Word Length:_



_Class Distribution:_


## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Weighted f1 | Micro f1 | Macro f1 | Weighted recall | Micro recall | Macro recall | Weighted precision | Micro precision | Macro precision |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-----------:|:--------:|:--------:|:---------------:|:------------:|:------------:|:------------------:|:---------------:|:---------------:|
| 0.8316 | 1.0 | 133 | 0.5146 | 0.8742 | 0.8604 | 0.8742 | 0.6541 | 0.8742 | 0.8742 | 0.6583 | 0.8487 | 0.8742 | 0.6515 |
| 0.4675 | 2.0 | 266 | 0.3833 | 0.8898 | 0.8857 | 0.8898 | 0.7813 | 0.8898 | 0.8898 | 0.7542 | 0.8862 | 0.8898 | 0.8298 |
| 0.3276 | 3.0 | 399 | 0.3464 | 0.8997 | 0.8985 | 0.8997 | 0.8302 | 0.8997 | 0.8997 | 0.8212 | 0.8984 | 0.8997 | 0.8408 |
| 0.2767 | 4.0 | 532 | 0.3098 | 0.9101 | 0.9103 | 0.9101 | 0.8412 | 0.9101 | 0.9101 | 0.8462 | 0.9106 | 0.9101 | 0.8367 |
| 0.2429 | 5.0 | 665 | 0.3098 | 0.9068 | 0.9068 | 0.9068 | 0.8351 | 0.9068 | 0.9068 | 0.8406 | 0.9071 | 0.9068 | 0.8309 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0
- Datasets 2.11.0
- Tokenizers 0.13.3 |
Parthi/poca-SoccerTwos | Parthi | 2023-06-10T13:43:33Z | 3 | 0 | ml-agents | [
"ml-agents",
"tensorboard",
"onnx",
"SoccerTwos",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-SoccerTwos",
"region:us"
] | reinforcement-learning | 2023-06-10T13:43:26Z | ---
library_name: ml-agents
tags:
- SoccerTwos
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-SoccerTwos
---
# **poca** Agent playing **SoccerTwos**
This is a trained model of a **poca** agent playing **SoccerTwos**
using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your
browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works:
https://huggingface.co/learn/deep-rl-course/unit5/introduction
### Resume the training
```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**
1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity
2. Find your model_id: Parthi/poca-SoccerTwos
3. Select your *.nn / *.onnx file
4. Click on Watch the agent play 👀
|
Perno/Mitsuba | Perno | 2023-06-10T09:35:30Z | 0 | 0 | null | [
"Hanako kun",
"Mitsuba Sousuke ",
"Mitsuba ",
"audio-to-audio",
"ru",
"en",
"uk",
"es",
"dataset:fka/awesome-chatgpt-prompts",
"dataset:tiiuae/falcon-refinedweb",
"license:openrail",
"region:us"
] | audio-to-audio | 2023-06-10T09:25:14Z | ---
license: openrail
datasets:
- fka/awesome-chatgpt-prompts
- tiiuae/falcon-refinedweb
language:
- ru
- en
- uk
- es
metrics:
- character
pipeline_tag: audio-to-audio
tags:
- Hanako kun
- 'Mitsuba Sousuke '
- 'Mitsuba '
--- |
YakovElm/IntelDAOS5SetFitModel_Train_balance_ratio_4 | YakovElm | 2023-06-10T05:44:58Z | 3 | 0 | sentence-transformers | [
"sentence-transformers",
"pytorch",
"mpnet",
"setfit",
"text-classification",
"arxiv:2209.11055",
"license:apache-2.0",
"region:us"
] | text-classification | 2023-06-10T05:44:22Z | ---
license: apache-2.0
tags:
- setfit
- sentence-transformers
- text-classification
pipeline_tag: text-classification
---
# YakovElm/IntelDAOS5SetFitModel_Train_balance_ratio_4
This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Usage
To use this model for inference, first install the SetFit library:
```bash
python -m pip install setfit
```
You can then run inference as follows:
```python
from setfit import SetFitModel
# Download from Hub and run inference
model = SetFitModel.from_pretrained("YakovElm/IntelDAOS5SetFitModel_Train_balance_ratio_4")
# Run inference
preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"])
```
## BibTeX entry and citation info
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
|
cnll0075/ppo-LunarLander-v2 | cnll0075 | 2023-06-09T19:05:01Z | 0 | 0 | stable-baselines3 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-09T19:04:43Z | ---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 268.31 +/- 19.58
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
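A minimal loading sketch, with the checkpoint filename assumed (check the repository files for the actual name):

```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

checkpoint = load_from_hub("cnll0075/ppo-LunarLander-v2", "ppo-LunarLander-v2.zip")  # filename assumed
model = PPO.load(checkpoint)
```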
|
Nika7664/digits | Nika7664 | 2023-06-09T18:19:52Z | 0 | 0 | keras | [
"keras",
"tf-keras",
"region:us"
] | null | 2023-06-06T15:56:38Z | ---
library_name: keras
---
# Digit recognition model
Trained on the MNIST dataset
 |
manadopeee/segformer-b0-scene-parse-150_epoch_800_230609 | manadopeee | 2023-06-09T03:25:50Z | 36 | 0 | transformers | [
"transformers",
"pytorch",
"segformer",
"generated_from_trainer",
"dataset:scene_parse_150",
"license:other",
"endpoints_compatible",
"region:us"
] | null | 2023-06-09T02:33:52Z | ---
license: other
tags:
- generated_from_trainer
datasets:
- scene_parse_150
model-index:
- name: segformer-b0-scene-parse-150_epoch_800_230609
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-scene-parse-150_epoch_800_230609
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset.
It achieves the following results on the evaluation set:
- Loss: 3.8271
- Mean Iou: 0.0744
- Mean Accuracy: 0.2072
- Overall Accuracy: 0.4220
- Per Category Iou: [0.501232010557704, 0.5516945572424972, 0.30339537001175504, 0.3561498486096415, 0.17926212776450925, 0.41521929122712, 0.022357896363847135, 0.14456232554795104, 0.6096779598538923, 0.1801962533452275, 0.07653157994562812, 0.39338028513437506, 0.23253295537338806, 0.0, 0.0, 0.028717201166180758, 0.0, 0.0026822920824485495, 0.02444039190445541, 0.07505241709240174, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.004311856875584659, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10147744945567652, 0.0, 0.0, 0.41222728555873694, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
- Per Category Accuracy: [0.767430856324387, 0.601685910937265, 0.3102056412394053, 0.9619138846887642, 0.2639985473522805, 0.8389792568273392, 0.022379515742244914, 0.7226651852820077, 0.7291534743541445, 0.2678771804549628, 0.1563852464311091, 0.9120623671155209, 0.24849598788281044, nan, 0.0, 0.03176778875226769, nan, 0.002978195355433434, 0.033374838252887035, 0.27018551001966723, nan, nan, nan, 0.0, nan, nan, nan, 0.004835669207442013, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.20243019648397104, nan, nan, 0.5247163903290385, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
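A minimal inference sketch is given below; the image path is a placeholder, and the image processor is assumed to be saved alongside the model (otherwise it can be loaded from `nvidia/mit-b0`).

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

repo = "manadopeee/segformer-b0-scene-parse-150_epoch_800_230609"
processor = AutoImageProcessor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo)

image = Image.open("scene.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (batch, num_labels, height/4, width/4)
pred_map = logits.argmax(dim=1)[0]   # per-pixel class indices
```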
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 800
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|
| 3.2296 | 10.0 | 100 | 3.6973 | 0.0677 | 0.1732 | 0.4418 | [0.5067850430027909, 0.5344790715654584, 0.19384134857688534, 0.4612318640193205, 0.5504812610875678, 0.36840530269024746, 0.0, 0.08109675081531587, 0.03207866748040513, 0.04368635437881874, 0.005856090914361918, 0.34650094677893717, 0.034397318790416534, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.022905088221011168, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7585875422023668, 0.6110134271099744, 0.40614839516465195, 0.9902944133503273, 0.8775236391515461, 0.9185942396657215, 0.0, 0.8775897000196066, 0.03749689396186186, 0.043762113638682036, 0.006524126348427104, 0.9752515946137491, 0.034670989371931346, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.04512836868123107, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 2.2743 | 20.0 | 200 | 3.2022 | 0.0749 | 0.1635 | 0.3922 | [0.5324160681256488, 0.6193403067818657, 0.16116128226542925, 0.45012014431837155, 0.18364659273075126, 0.49944490307153633, 0.0, 0.08774446711662219, 0.0, 0.169232327324286, 0.039146816527236734, 0.360638436804004, 0.06496340238042185, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0531513666938839, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8613971967397606, 0.7006450278321047, 0.28689037098791165, 0.9817129647954781, 0.2827518258974807, 0.8727553101527136, 0.0, 0.913371675053918, 0.0, 0.1704580230541671, 0.03959692526322589, 0.9655090952043468, 0.0657425607774062, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.0726625205974592, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 2.1657 | 30.0 | 300 | 2.9946 | 0.0781 | 0.1593 | 0.3891 | [0.5321718964802125, 0.6220417211489336, 0.26247065819536164, 0.4036110594810692, 0.1858677946786585, 0.4173911700665565, 0.0, 0.08788243169158916, 0.000230206233032213, 0.2129794034090909, 0.0, 0.36426040258649034, 0.05935813926316624, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.053858946254441, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8696522354465778, 0.675436287046788, 0.2913088786994581, 0.9798605118786564, 0.2950623427979609, 0.672262846341342, 0.0, 0.9481406444023266, 0.00023245188646729242, 0.2447210037743548, 0.0, 0.942206472950626, 0.06052212833669449, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.07574549513634189, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 1.5913 | 40.0 | 400 | 2.9703 | 0.0884 | 0.1810 | 0.4092 | [0.5452514378712536, 0.6503614860629362, 0.21253727663649769, 0.5137556173764177, 0.18892181364587546, 0.36876593070217295, 0.0, 0.09017860543051906, 0.12581673246164496, 0.4768626309662398, 0.033241946538725156, 0.36799133427162933, 0.16516527528224936, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.060512895590854744, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8504863929338744, 0.7316505566420942, 0.3114422676115048, 0.9502845969331613, 0.28748974403809163, 0.7880415858329602, 0.0, 0.8511861969805895, 0.13258574669156842, 0.668570845659492, 0.05639170596214715, 0.9438034490904795, 0.17247423820828972, nan, 0.0, 0.0, nan, 0.0, 0.0, 0.13182373890394938, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 1.3105 | 50.0 | 500 | 2.9799 | 0.0890 | 0.1762 | 0.3974 | [0.5089479875940252, 0.6846175629484567, 0.2134251187398545, 0.4270691091621236, 0.18903741237912028, 0.43369075801884605, 0.0, 0.08407402822480813, 0.3064203253391068, 0.3567489607194367, 0.027794292508917955, 0.36557916593037637, 0.1670422781967461, 0.0, 0.0, 0.0, 0.0, 7.090941322460557e-05, 0.0, 0.06055227222300664, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7303055621866794, 0.8069805927486084, 0.24662359316381827, 0.9779763931000087, 0.2699638183114315, 0.6971347560065662, 0.0, 0.8877197568786354, 0.3380651987463629, 0.42895032133020505, 0.03623796912344164, 0.9383699503897944, 0.17849028667704797, nan, 0.0, 0.0, nan, 7.090941322460557e-05, 0.0, 0.15957050975389359, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 1.1609 | 60.0 | 600 | 2.7672 | 0.0971 | 0.1850 | 0.4187 | [0.5430925303975345, 0.6786886076590786, 0.23576660253812032, 0.5033148830222844, 0.18894547184606303, 0.5146605225017652, 0.0, 0.08336613505259587, 0.4345118450655552, 0.45425960499164786, 0.022937625754527163, 0.3746513236400672, 0.1704665758470395, 0.0, 0.0, 0.0, 0.0, 0.004503796805446452, 0.0, 0.06472969957172765, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8034115711216451, 0.7556275387392809, 0.286918160344588, 0.968207474726684, 0.2884514506301532, 0.688902153907377, 0.0, 0.8028233448794196, 0.49383200942632477, 0.4715903294909721, 0.02945546153349267, 0.9315851641861564, 0.18154511185425334, nan, 0.0, 0.0, nan, 0.004573657152987059, 0.0, 0.3237654813161112, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.9025 | 70.0 | 700 | 2.9312 | 0.0916 | 0.1996 | 0.4132 | [0.5151394922387289, 0.6238110964332892, 0.2563785280773084, 0.4643489671231526, 0.18874172185430463, 0.47769865674057294, 0.0, 0.11482144466909634, 0.3510014534368464, 0.38918597742127153, 0.11413561591042778, 0.36849566038085013, 0.23772491434165444, 0.0, 0.0, 0.0, 0.0, 0.003868962138235145, 0.0, 0.10904418988804077, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7610365583330492, 0.7104332781706033, 0.30227872724746424, 0.969292030494225, 0.26603628929210327, 0.7341441575884197, 0.0, 0.874060518920332, 0.39682743252883607, 0.4677139651127206, 0.3634778115108843, 0.9408173871958422, 0.25829440345816423, nan, 0.0, 0.0, nan, 0.004059563907108669, 0.0, 0.5368628076330197, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.7327 | 80.0 | 800 | 2.8415 | 0.0944 | 0.2006 | 0.4153 | [0.5345656955413646, 0.6784546780083235, 0.2820506524914986, 0.5223088223465011, 0.19393971634486395, 0.3824550849079822, 0.0, 0.11781012410872124, 0.2214453980630674, 0.665730845861257, 0.06367007325676043, 0.3753507495160782, 0.2778013194571332, 0.0, 0.0, 0.0009873657484836883, 0.0, 0.03385373044323093, 0.0, 0.08008391004063085, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.00037442778978899184, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.005737664022351943, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.750618115472496, 0.7503337972017452, 0.3192302348200639, 0.9766780927651422, 0.28195824982850687, 0.6957170571556485, 0.0, 0.847134174236978, 0.25769295510472356, 0.7498724880138733, 0.09095019701569666, 0.945504370422868, 0.3040274338617585, nan, 0.0, 0.0009877040919169522, nan, 0.035826981031731965, 0.0, 0.6108010418327752, nan, nan, nan, 0.0, nan, nan, nan, 0.0005081550692566183, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0059031877213695395, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.5396 | 90.0 | 900 | 2.9683 | 0.0975 | 0.1971 | 0.4129 | [0.527361734969971, 0.6539254866572128, 0.2806172855769104, 0.535851377523424, 0.19172870067551007, 0.39952192776411416, 0.0, 0.09640475316687969, 0.32768254153990173, 0.4265555372173626, 0.13919916125650228, 0.3769124731117642, 0.23602246365069893, 0.0, 0.0, 0.0018454577925943565, 0.0, 0.005485415759686548, 0.0, 0.07707186116072844, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.20891488652771092, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7602372710841319, 0.7154308334587032, 0.2959913852994303, 0.9724981990041086, 0.29195192812083876, 0.7191712679699548, 0.0, 0.7468139337298216, 0.35914618017425876, 0.527287565031113, 0.22298301143336993, 0.9404772029293645, 0.2540908278661323, nan, 0.0, 0.001874622052005644, nan, 0.005584116291437688, 0.0, 0.3876043161643544, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.2896668548842462, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.4233 | 100.0 | 1000 | 2.8725 | 0.1074 | 0.2190 | 0.4346 | [0.5427704533568795, 0.6860671313636614, 0.2958787323466725, 0.5335683842128972, 0.19378123649739942, 0.4936863913378801, 0.0, 0.11369801412884227, 0.34249730653490357, 0.5345016429353778, 0.12806125816885602, 0.3732073697129115, 0.319589494762785, 0.0, 0.0, 0.009386696239380557, 0.0, 0.01352755827573123, 0.0, 0.0630176876617774, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.296577373473388, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7808780650001705, 0.762120129381676, 0.30958732805335554, 0.9751977137247761, 0.2820086889854332, 0.8119683629309058, 0.0, 0.7683811515587217, 0.3694782657486153, 0.7964908701417933, 0.25062980427620957, 0.9224190881171745, 0.35309609509951584, nan, 0.0, 0.009554525297319089, nan, 0.01453642971104414, 0.0, 0.31058310742571626, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.6058210564139418, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.4377 | 110.0 | 1100 | 2.9822 | 0.1031 | 0.2231 | 0.4214 | [0.513876017271641, 0.6456354512146417, 0.3086308209639618, 0.5510319708619992, 0.18929543883737435, 0.42168031530047817, 0.0, 0.12145227311260202, 0.3704750516360474, 0.5476564957533816, 0.09725908914670882, 0.3808212434037447, 0.28266283748550436, 0.0, 0.0, 0.009217035090431288, 0.0, 0.019630932275607752, 0.0, 0.05831007425351519, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.32763133295943164, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7351140742761655, 0.6992158116443509, 0.31563151313047105, 0.9701153429017012, 0.27444617805694915, 0.9233945182311097, 0.0, 0.8192928566760342, 0.4097565667657927, 0.8879934713863104, 0.21064530715070087, 0.9356201275691, 0.3090436440065011, nan, 0.0, 0.009614996976416046, nan, 0.022460556638893812, 0.0, 0.2354223143570935, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.7196755813356603, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.3132 | 120.0 | 1200 | 3.0958 | 0.1045 | 0.2131 | 0.4164 | [0.5080410500750633, 0.6332342109085938, 0.30540321426868233, 0.5457427698940184, 0.19506725425974314, 0.4545866520186762, 0.0, 0.11699195273013406, 0.42553959124769203, 0.5397208214572827, 0.09497027892089621, 0.37089690873901116, 0.221525468126146, 0.0, 0.0, 0.002448408534452606, 0.0, 0.020647472289808056, 0.0, 0.0512952870279225, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.32194583659018805, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7472611090270436, 0.6967946065894388, 0.31143532027233567, 0.9620563810669812, 0.2851796306508669, 0.8233348256479133, 0.0, 0.7919417031566565, 0.4821613216092083, 0.7533408140365194, 0.13416445966022866, 0.901809591306402, 0.23747649316280772, nan, 0.0, 0.0026809111066317276, nan, 0.021662825740117, 0.0, 0.22713017594216764, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.7188542682613829, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.5676 | 130.0 | 1300 | 3.0132 | 0.1058 | 0.2216 | 0.4366 | [0.5494331820597236, 0.6801446164852573, 0.30748785444170806, 0.5038758422188788, 0.19470465632944733, 0.42181896104481986, 0.0, 0.12211634349030472, 0.5070865051903114, 0.5412557943531395, 0.07391626528343831, 0.3719786649862718, 0.2512288869424013, 0.0, 0.0, 0.014870438529972071, 0.0, 0.013362515839189033, 0.0, 0.06562596894602524, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.0009910335063709297, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.009202453987730062, nan, nan, 0.34137487363293817, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7669214439177438, 0.7508885587483075, 0.31747255801028207, 0.9756568687212533, 0.28585551535367937, 0.9302342933890464, 0.0, 0.7202797202797203, 0.5873337768622201, 0.6551055799245129, 0.12886764420903044, 0.9371226080793763, 0.2672461474314792, nan, 0.0, 0.016206409997984277, nan, 0.01439461088459493, 0.0, 0.29251049805985224, nan, nan, nan, 0.0, nan, nan, nan, 0.0010327022375215145, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.009307135470527405, nan, nan, 0.7626918536009445, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.4535 | 140.0 | 1400 | 3.0065 | 0.1026 | 0.2137 | 0.4146 | [0.5108796779065107, 0.6315908694894581, 0.3075031504376554, 0.4874114605398086, 0.19322197417574405, 0.39467227068340044, 0.0, 0.10843431198812203, 0.38258161587179396, 0.5907152345661749, 0.14003785895525395, 0.37485434630622233, 0.24673589796599926, 0.0, 0.0, 0.01659479439396273, 0.0, 0.008485984493482151, 0.0, 0.06607963246554364, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.002715770305484914, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.01112797167425392, nan, nan, 0.348792510152835, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7361840534733827, 0.6816420941778246, 0.3136237321106016, 0.9745643964882559, 0.28958801296622594, 0.8361189872158384, 0.0, 0.7398209267368145, 0.4382038683200141, 0.8618790166275629, 0.2245978941928816, 0.9119773210489015, 0.2640424101634629, nan, 0.0, 0.017375529127192098, nan, 0.008828221946463393, 0.0, 0.22936267474618616, nan, nan, nan, 0.0, nan, nan, nan, 0.0028358331284320955, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.011375387797311272, nan, nan, 0.5775370874185104, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.2613 | 150.0 | 1500 | 3.0882 | 0.0972 | 0.2080 | 0.4266 | [0.5184199167614468, 0.6743078082783505, 0.30394687296751444, 0.52852039566165, 0.18882410332651922, 0.3942247244380492, 0.0, 0.14419640641286544, 0.5955062345660194, 0.26072805139186295, 0.034381952708645934, 0.3689255487127868, 0.25585649816666145, 0.0, 0.0, 0.025883478356849832, 0.0, 0.019285197130257354, 0.0, 0.05751895340236687, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.009383358207852468, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.05396825396825397, nan, nan, 0.3287772807039225, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7014417010537803, 0.7485848879193621, 0.30843406975128523, 0.9732977620152155, 0.27332979138364694, 0.7018604188429588, 0.0, 0.7594601660022221, 0.782849860128089, 0.31051718861572986, 0.08584716749563982, 0.9074226317032837, 0.2778827253465397, nan, 0.0, 0.028361217496472486, nan, 0.02077645807480943, 0.0, 0.264551108276192, nan, nan, nan, 0.0, nan, nan, nan, 0.010408982870256536, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.06592554291623579, nan, nan, 0.6828191571274576, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.2634 | 160.0 | 1600 | 3.0367 | 0.0918 | 0.2024 | 0.4236 | [0.5147127928901629, 0.6484200659577853, 0.29042929051795874, 0.5364288559435172, 0.19246559291939686, 0.42864949695933635, 0.0, 0.12450020115107678, 0.4491244806680631, 0.37878278412911903, 0.041319548997959116, 0.37791711683901835, 0.3005961792988744, 0.0, 0.0, 0.050467357437866824, 0.0, 0.005256833884049264, 0.0, 0.049259199317315754, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.007390429861078629, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.07043879907621248, nan, nan, 0.30959325879111976, 0.0, 0.0, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7470778058179586, 0.7089899578757334, 0.29582464915937196, 0.9701786746253532, 0.2872240978116131, 0.7608565885688703, 0.0, 0.4955231684203647, 0.5207643659273627, 0.4596552075895134, 0.07977520831987597, 0.9254996456413891, 0.32757256273453655, nan, 0.0, 0.05615803265470671, nan, 0.005584116291437688, 0.0, 0.17796204752033168, nan, nan, nan, 0.0, nan, nan, nan, 0.0077698549299237765, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.07885211995863495, nan, nan, 0.7845593142035829, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1752 | 170.0 | 1700 | 3.0697 | 0.0964 | 0.2161 | 0.4330 | [0.5351392549779288, 0.6485878045309286, 0.27373499550885977, 0.4964212781897347, 0.19269703866326252, 0.3794665273038417, 0.0, 0.14226547443770124, 0.6100702354433415, 0.2238968585436717, 0.03395823055418517, 0.369336010566418, 0.2844740805481205, 0.0, 0.0, 0.021870428030160934, 0.0, 0.011601869280197312, 0.0012000369242130527, 0.05624676668391102, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.005234375, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, 0.1723175965665236, nan, nan, 0.36201253393550353, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7183354363468949, 0.7070388897246879, 0.2794706127553147, 0.9756885345830794, 0.28983348352993393, 0.9210316868129135, 0.0, 0.7767466178681132, 0.7749144336590331, 0.25155564623074567, 0.08003358956139785, 0.9142641152846681, 0.31020090368365966, nan, 0.0, 0.02350332594235033, nan, 0.012675057613898245, 0.0015869528064650015, 0.23116993568277255, nan, nan, nan, 0.0, nan, nan, nan, 0.005491353167773133, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.2076008273009307, nan, nan, 0.7324059339869616, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.2313 | 180.0 | 1800 | 3.0846 | 0.0931 | 0.2049 | 0.4328 | [0.5426649244025863, 0.668195561052704, 0.28912930854678426, 0.4831876538073576, 0.19142022404833728, 0.3867258799350379, 0.0, 0.11577615629045546, 0.6262971796134197, 0.34484301588485344, 0.03201963725623892, 0.3654458973674709, 0.2667484202551024, 0.0, 0.0, 0.01479402169789849, 0.0, 0.005365028981507038, 0.0005188548133049198, 0.051112724429676376, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0065839840759454905, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.010389081279282949, nan, nan, 0.44006104292486525, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7506266412031511, 0.7314766059876636, 0.29854800611365845, 0.9745802294191689, 0.2846483381979098, 0.66925334527185, 0.0, 0.6429318345206195, 0.785134301081302, 0.4716923390798735, 0.07583489438666753, 0.8799433026222537, 0.2905998179017861, nan, 0.0, 0.015722636565208628, nan, 0.005513206878213083, 0.0006836104397080007, 0.1945994790836124, nan, nan, nan, 0.0, nan, nan, nan, 0.007048602573559545, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.013185108583247156, nan, nan, 0.6957035059801858, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1648 | 190.0 | 1900 | 3.2173 | 0.0905 | 0.2189 | 0.4246 | [0.5084934667963441, 0.6138006125122546, 0.28688594338078877, 0.4148205302721444, 0.19179147894088835, 0.35000153513933946, 0.0, 0.13654472073948704, 0.5743584779106191, 0.569439610163592, 0.05187126660977691, 0.3804110676812298, 0.2797740646710358, 0.0, 0.0, 0.020129457103303775, 0.0, 0.019297081154587104, 0.0017959152480846756, 0.050444490386603265, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0012674866209745565, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0, 0.015992689056431347, nan, nan, 0.4187761845754411, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7176576407598131, 0.6652296148638484, 0.30449492844240655, 0.9746356446773644, 0.2860001076035348, 0.8505944386409988, 0.0, 0.6854453957257696, 0.6657902963360773, 0.8344384372130981, 0.10994121826755378, 0.9325490196078431, 0.30241067401866933, nan, 0.0, 0.021689175569441645, nan, 0.021131005140932458, 0.0022949779047340023, 0.19454632435018337, nan, nan, nan, 0.0, nan, nan, nan, 0.0013277600196705188, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.01809720785935884, nan, nan, 0.7286073610184282, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1051 | 200.0 | 2000 | 3.3141 | 0.0825 | 0.2061 | 0.4163 | [0.5118282175515311, 0.5848123831470934, 0.26380866307778955, 0.45906519019252673, 0.19336888267589597, 0.3481965576318686, 0.012555117196565328, 0.1418940696175333, 0.38343606764659394, 0.4910780120926117, 0.06705253249484289, 0.3780839549969195, 0.22393222902470616, 0.0, 0.0, 0.024555362069302863, 0.0, 0.014416162156738134, 0.0003916887380161898, 0.05576543869671784, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0001558044965177695, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.46598639455782315, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.8013483443031068, 0.6323388370693546, 0.2798527164096151, 0.9730681845169769, 0.2883841984209181, 0.7632940357160622, 0.012555117196565328, 0.690543101758055, 0.42780765808732174, 0.6793838620830358, 0.21626509915380143, 0.9104275927238366, 0.23685531701256818, nan, 0.0, 0.025297319088893367, nan, 0.0152809785499025, 0.0005127078297810005, 0.16084622335618987, nan, nan, nan, 0.0, nan, nan, nan, 0.0001639209900827801, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.7173143062471126, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1505 | 210.0 | 2100 | 3.3377 | 0.0846 | 0.2017 | 0.4230 | [0.5323386224438127, 0.6443095201694804, 0.2817177749118314, 0.4304698457223001, 0.1924120517374379, 0.4353848164118646, 0.0019958226966813645, 0.1168390845681723, 0.5395218218434458, 0.27942720047983205, 0.06712938982699679, 0.35527322172176423, 0.21510108540554407, 0.0, 0.0, 0.016442223324304735, 0.0, 0.005680940530335536, 0.0010841675400238516, 0.05609421978686687, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, 0.002006668313163744, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.022764609246655715, 0.0, nan, 0.4596130278159468, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7730748900180745, 0.693475440048142, 0.32020980964290674, 0.971904464094871, 0.2885287906707735, 0.7458588270407401, 0.0019958226966813645, 0.7477289066074113, 0.6205904277916269, 0.38018973783535653, 0.16743104450616886, 0.8690857547838412, 0.22832903615585565, nan, 0.0, 0.017375529127192098, nan, 0.005867753944336111, 0.0014648795136600016, 0.16759687450167438, nan, nan, nan, 0.0, nan, nan, nan, 0.0021309728710761414, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.025077559462254394, nan, nan, 0.6352856629536472, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0672 | 220.0 | 2200 | 3.2337 | 0.0856 | 0.2136 | 0.4437 | [0.5324180569688989, 0.6797928606656349, 0.2963288159317936, 0.45763127236258205, 0.1890342587327423, 0.4590018085292573, 0.0, 0.13272089272481583, 0.5979436829699074, 0.41361440491875273, 0.0673603706291135, 0.379242591446182, 0.25470814857311325, 0.0, 0.0, 0.007673879594102409, 0.0, 0.02672021774488001, 0.0005049466065717865, 0.05048265886866019, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0018874873400239389, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0035896467031928964, 0.0, nan, 0.4151369152893822, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7962371687753641, 0.7517018955919964, 0.32356537446158123, 0.9608847441794188, 0.2761073076250555, 0.7827438690742675, 0.0, 0.7959610482974969, 0.7136914161129235, 0.48036315413648883, 0.11833860861701441, 0.9320103945192535, 0.27200707970626026, nan, 0.0, 0.007941947188066923, nan, 0.02854103882290374, 0.0006591957811470006, 0.17067984904055705, nan, nan, nan, 0.0, nan, nan, nan, 0.0020162281780181954, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.004912099276111685, nan, nan, 0.6988347620758688, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1569 | 230.0 | 2300 | 3.1466 | 0.0892 | 0.2177 | 0.4406 | [0.5296333834684223, 0.6621664634272368, 0.25378680313457275, 0.5095053529152961, 0.1848228060883782, 0.46038865983535004, 0.01137154792295196, 0.14593582522873538, 0.6217603614327454, 0.3372355548122753, 0.05085820044114291, 0.3855883571906781, 0.27441858657463547, 0.0, 0.0, 0.05276968457626812, 0.0, 0.03101813198065502, 0.0037916547992025065, 0.061143740340030915, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0021600919188050557, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.08489178398823283, 0.0, 0.0, 0.42229204069419507, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7640674555809432, 0.7531029035655182, 0.2918160344588023, 0.9626105336489365, 0.2698932034917347, 0.8526339352335472, 0.01137154792295196, 0.6291092085484609, 0.7368804956836089, 0.4114046720391717, 0.12957819262321554, 0.9143208126624144, 0.30041525200180397, nan, 0.0, 0.05877847208224148, nan, 0.03490515865981209, 0.005200322273493005, 0.31542018816775635, nan, nan, nan, 0.0, nan, nan, nan, 0.0023112859601671994, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.10444674250258532, nan, nan, 0.7244494635798984, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0924 | 240.0 | 2400 | 3.2397 | 0.0843 | 0.2076 | 0.4177 | [0.4968533234038832, 0.5592773954473266, 0.2868292810048732, 0.4602194916049151, 0.18583482646045796, 0.3738584295345643, 0.04209793455558134, 0.14075280169946447, 0.5971028560258302, 0.24678255565721258, 0.057126968479057655, 0.3632706850759281, 0.21866243153738377, 0.0, 0.0, 0.03390383987398913, 0.0, 0.010861018281895026, 0.002194013477511362, 0.06073107226186048, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0009567488761997327, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.0319617927994122, 0.0, nan, 0.46822165482023326, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.750839784469529, 0.5960254626147135, 0.3058635542587189, 0.9710336528946556, 0.27830981747750416, 0.8868328110232304, 0.04209793455558134, 0.7231553493235736, 0.7100363105877826, 0.4127307966948893, 0.145985401459854, 0.8465674462556106, 0.23372816310554037, nan, 0.0, 0.035577504535375934, nan, 0.011753235241978372, 0.0030762469786860033, 0.19093180247701058, nan, nan, nan, 0.0, nan, nan, nan, 0.0010327022375215145, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.04498448810754912, nan, nan, 0.6965761511216056, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1651 | 250.0 | 2500 | 3.3382 | 0.0811 | 0.2041 | 0.4270 | [0.5233985885168868, 0.5745486031751613, 0.3026345158781156, 0.4744664678711337, 0.19277621527623745, 0.4460298532012007, 0.011510791366906475, 0.14656090106795783, 0.6239889342642603, 0.21035525321239606, 0.04601589175734754, 0.3598276475977484, 0.26257665819383275, 0.0, 0.0, 0.03020706455542022, 0.0, 0.00848001025936969, 0.0027774670991313425, 0.06781540411151789, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.00260978442878909, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.027502750275027504, 0.0, nan, 0.3882392026578073, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7606486375882413, 0.6145582593651271, 0.31084479644296237, 0.9636080082964558, 0.29225120045193487, 0.8093568124160573, 0.011510791366906475, 0.8368080517613228, 0.7593481728480166, 0.2838926859124758, 0.09314643756863251, 0.8498936924167257, 0.2843625286123946, nan, 0.0, 0.03249344890143116, nan, 0.009377769898954086, 0.003637784125589004, 0.20480518790198268, nan, nan, nan, 0.0, nan, nan, nan, 0.0028358331284320955, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.03231644260599793, nan, nan, 0.5998665366254299, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1387 | 260.0 | 2600 | 3.4070 | 0.0833 | 0.2099 | 0.4270 | [0.5202995710888713, 0.6105514372910166, 0.2904397834912043, 0.4605412431161299, 0.18876949506518004, 0.408959252188796, 0.02483174750522163, 0.14103278993232088, 0.6060280827442884, 0.22147960359529845, 0.0654350585248907, 0.376735896526014, 0.2358539340141898, 0.0, 0.0, 0.01803897639296747, 0.0, 0.0053106744556558685, 0.0026094198204867215, 0.07567737375323962, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0038972220963429554, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.05134575569358178, nan, nan, 0.35955526123174514, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7485250485966647, 0.668732134797653, 0.29822842851188, 0.9705349155708959, 0.28266103541501336, 0.9271004327712282, 0.02483174750522163, 0.8430167962878243, 0.710589385765929, 0.29409364480261146, 0.1798333440992184, 0.8928703047484053, 0.2523209013010662, nan, 0.0, 0.019068736141906874, nan, 0.005672753057968445, 0.0034424668571010034, 0.2560995056609791, nan, nan, nan, 0.0, nan, nan, nan, 0.0042291615441357264, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0641158221302999, nan, nan, 0.5295416046404189, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0593 | 270.0 | 2700 | 3.3596 | 0.0773 | 0.1983 | 0.4199 | [0.5187306268021166, 0.5622120212555227, 0.28905586629826807, 0.44506316639917104, 0.19227707094653784, 0.4475505617977528, 0.043590933704649185, 0.14780668714144984, 0.5596318675468483, 0.09082813891362422, 0.04678014522080705, 0.346103734439834, 0.21620495482089072, 0.0, 0.0, 0.01697619621885378, 0.0, 0.007656992949663879, 0.0032964867334324666, 0.0672103137620379, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0040075868820362215, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, 0.027530204000792237, nan, 0.0, 0.3735974424873065, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7768731030249293, 0.6138013389499022, 0.2979922189801306, 0.9724981990041086, 0.29154168964450483, 0.8668606675620554, 0.043590933704649185, 0.8335076138814457, 0.6740864240082721, 0.10404978067938386, 0.12692978489761644, 0.7881974958658162, 0.23180506981849744, nan, 0.0, 0.01844386212457166, nan, 0.0082786739939727, 0.004492297175224005, 0.2300005315473343, nan, nan, nan, 0.0, nan, nan, nan, 0.004294729940168838, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0359358841778697, nan, nan, 0.6118782403367383, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1972 | 280.0 | 2800 | 3.3263 | 0.0816 | 0.2085 | 0.4329 | [0.5366076806476567, 0.5980282186494071, 0.27392181451393055, 0.46709103599257035, 0.18633103297770237, 0.42231041431169214, 0.03431380113890647, 0.14992140032527754, 0.550659269298097, 0.3235050770966529, 0.044718018587903266, 0.3818846953462338, 0.23827091249496393, 0.0, 0.0, 0.02956502457592668, 0.0, 0.026325238281760022, 0.002529001044587388, 0.06005039791574721, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.004398958613879163, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.03798882681564246, 0.0, nan, 0.28411931915476557, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.801327029976469, 0.6542002030991425, 0.3141308878699458, 0.9695136915270071, 0.28136306777677644, 0.8548724071034174, 0.034400866403651274, 0.9006600875759754, 0.6253196213438925, 0.4387432418647353, 0.1187261804792972, 0.9025939050318923, 0.2566563704592449, nan, 0.0, 0.03225156218504334, nan, 0.028594220882822194, 0.0033692228814180035, 0.224206665603572, nan, nan, nan, 0.0, nan, nan, nan, 0.004819277108433735, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.052740434332988625, nan, nan, 0.42585082901288435, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.2243 | 290.0 | 2900 | 3.4376 | 0.0833 | 0.2104 | 0.4247 | [0.5221423764694836, 0.5435663604813709, 0.28044583795358086, 0.4563456948028287, 0.18572120395187586, 0.4221764763204054, 0.004989556741703411, 0.1498291696705639, 0.5994130495053522, 0.27792665726375176, 0.06053951470787583, 0.3708370624159724, 0.23988610280720296, 0.0, 0.0, 0.03764803882510594, 0.0, 0.0038002622014968717, 0.00250009328706295, 0.05770272843104231, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.004145016358010984, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0771594953089615, 0.0, 0.0, 0.4493485871016637, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7822464447703168, 0.5795095531818866, 0.30573155481450603, 0.9716353042693499, 0.2798364426271403, 0.862035517087002, 0.004989556741703411, 0.7809947062283511, 0.7105252611075932, 0.4020197898602469, 0.14424132807958143, 0.8888069926765887, 0.258792195304589, nan, 0.0, 0.04190687361419069, nan, 0.004059563907108669, 0.0032715642471740033, 0.2094296497103067, nan, nan, nan, 0.0, nan, nan, nan, 0.004589787722317842, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.1233195449844881, nan, nan, 0.6391355679893229, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1086 | 300.0 | 3000 | 3.4696 | 0.0820 | 0.2126 | 0.4330 | [0.5174925531702218, 0.627213347316568, 0.29365643676614717, 0.43124854750452474, 0.18489062489501468, 0.4724372300116473, 0.011827956989247311, 0.14800645547075428, 0.616306205080813, 0.28371518726848677, 0.06530276972046102, 0.3454970987556841, 0.2724308785378737, 0.0, 0.0, 0.0377334496039532, 0.0, 0.005787644281217209, 0.003075489282385834, 0.08193073802748684, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0043870265036911535, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.07008980866848886, 0.0, 0.0, 0.4449345578132909, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7724290659209494, 0.6776224236497668, 0.31848686952897043, 0.9695453573888331, 0.27759021883868884, 0.7969954733124409, 0.011827956989247311, 0.7522057381870466, 0.7430284473015543, 0.4219116596960114, 0.16433046960790645, 0.8062839593668792, 0.29756464912056774, nan, 0.0, 0.041866559161459385, nan, 0.0062577557170714415, 0.004028418662565004, 0.28772657205124114, nan, nan, nan, 0.0, nan, nan, nan, 0.004753708712400623, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.09281282316442606, nan, nan, 0.631692418253683, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0722 | 310.0 | 3100 | 3.3817 | 0.0877 | 0.2142 | 0.4327 | [0.5256100870236518, 0.6528359804783831, 0.2860540048096568, 0.4202690792113503, 0.18980270323195297, 0.455835011350665, 0.0625821923106676, 0.13899216319628863, 0.5455221288126172, 0.3684809533373502, 0.055798638734199595, 0.3858243652381554, 0.22783124145047878, 0.0, 0.0, 0.023400155630488754, 0.0, 0.006775627571846441, 0.004645503815949563, 0.06897757354777484, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.003943044906900329, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.053034347748709736, 0.0, nan, 0.3489267209474463, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7720262251474952, 0.7119001053106665, 0.29667222453800196, 0.9701232593671577, 0.2900722288727185, 0.8839725414117295, 0.0625821923106676, 0.838049800666623, 0.6293594748190482, 0.5614607773130674, 0.1631031587106776, 0.8947224190881172, 0.24234804584790545, nan, 0.0, 0.025458576899818584, nan, 0.00726821485552207, 0.0061524939573720065, 0.22022006059639612, nan, nan, nan, 0.0, nan, nan, nan, 0.004425866732235063, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.07704239917269907, nan, nan, 0.48395872901801756, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1221 | 320.0 | 3200 | 3.3717 | 0.0821 | 0.2103 | 0.4297 | [0.5200888187367714, 0.6418608022993505, 0.29350612635650447, 0.4292671492395803, 0.18190697134903144, 0.4083059449289837, 0.01280266109692891, 0.14888921867841864, 0.5827550628173744, 0.245645918521884, 0.04873845610003977, 0.3827041233363708, 0.21935664423869042, 0.0, 0.0, 0.022483778481967708, 0.0, 0.004537396932784725, 0.004364829946225295, 0.0696068841350175, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.003177101117923898, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.08897082398204553, 0.0, nan, 0.37350195094760313, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7683217951778467, 0.7076359635925982, 0.3173614005835765, 0.9713265621165462, 0.2636656489165669, 0.8744466000099488, 0.01280266109692891, 0.754035683942226, 0.6926665437610715, 0.33091910639600125, 0.14249725469930882, 0.924677533664068, 0.2367744790204137, nan, 0.0, 0.024027413827857287, nan, 0.004945931572416238, 0.006103664640250006, 0.2609897411364482, nan, nan, nan, 0.0, nan, nan, nan, 0.003507909187771494, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.1434850051706308, nan, nan, 0.5503310918330682, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1307 | 330.0 | 3300 | 3.3948 | 0.0830 | 0.2124 | 0.4344 | [0.5344828847653706, 0.632533365510168, 0.2900707344483374, 0.41337265872050594, 0.18409163601471293, 0.45896610955094475, 0.055231950987445176, 0.14185919094146654, 0.595131529363542, 0.27612989621694006, 0.04924924924924925, 0.39319339499873995, 0.22234201097124012, 0.0, 0.0, 0.02136450314287732, 0.0, 0.0051252296683106085, 0.005748628168278025, 0.08711603588616244, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0010471517472474868, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.06476856476856477, 0.0, 0.0, 0.2995762379345523, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7785356205026771, 0.6900669474951105, 0.3048422954008615, 0.9686191309304222, 0.2786965176806058, 0.8373625826990997, 0.05523323276862381, 0.9137638062871708, 0.7215547023413516, 0.42068754462919516, 0.12182675537755959, 0.8993716040633121, 0.2386677898892945, nan, 0.0, 0.024390243902439025, nan, 0.005637298351356143, 0.007519714836788007, 0.3117525115611545, nan, nan, nan, 0.0, nan, nan, nan, 0.0011474469305794606, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.10056876938986556, nan, nan, 0.3919203326317951, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.104 | 340.0 | 3400 | 3.4451 | 0.0834 | 0.2140 | 0.4302 | [0.5241374474254009, 0.61151023660502, 0.3100157159527038, 0.40974731675745646, 0.18471868509401726, 0.3730898617004903, 0.0555194554034192, 0.14825912978068798, 0.5691204582406444, 0.33826269912834384, 0.05932806243775533, 0.3825024211371481, 0.2308624376336422, 0.0, 0.0, 0.018897608300713575, 0.0, 0.004736617825417118, 0.008011249051786342, 0.08279622421482263, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.004588214924684019, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.028315946348733235, 0.0, nan, 0.4072450484198781, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7806393445418273, 0.6710875206860237, 0.3193136028900931, 0.9622780420997633, 0.2697486112418793, 0.8440779983087101, 0.0555194554034192, 0.785765636232926, 0.6705755989643868, 0.5740079567479343, 0.18855371100058135, 0.8733380581148122, 0.24805350624154393, nan, 0.0, 0.020338641402942954, nan, 0.005123205105477752, 0.010571547156913011, 0.27927496943602825, nan, nan, nan, 0.0, nan, nan, nan, 0.005212687484632407, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.044208893485005174, nan, nan, 0.5245623941276115, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0778 | 350.0 | 3500 | 3.5680 | 0.0792 | 0.2075 | 0.4165 | [0.49561161450912594, 0.5152816932921644, 0.2968995468768258, 0.4268240350834348, 0.18062333812057244, 0.3939759735360687, 0.02899357933008432, 0.14701717843100542, 0.589789538925867, 0.2859130665253114, 0.0641777088512327, 0.3853812609528467, 0.24315088601054524, 0.0, 0.0, 0.021505376344086023, 0.0, 0.004356829288494989, 0.012228357046789835, 0.07914394625732893, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0031720616890473243, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0546448087431694, 0.0, nan, 0.36537534158495416, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7503175834669031, 0.5413626448021663, 0.3177365568987078, 0.9685083004140311, 0.2761577467819818, 0.8442272297667015, 0.02899357933008432, 0.8210901248284426, 0.7017321673332959, 0.4401713761093543, 0.18177120341063238, 0.8935601228443184, 0.2642891787710923, nan, 0.0, 0.0231808103204999, nan, 0.004662293919517816, 0.016870529065651018, 0.2461064157763249, nan, nan, nan, 0.0, nan, nan, nan, 0.0034423407917383822, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.09565667011375388, nan, nan, 0.46671115445819006, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1175 | 360.0 | 3600 | 3.4477 | 0.0811 | 0.2020 | 0.4260 | [0.5058399294926865, 0.5640702238862114, 0.30025518928731154, 0.41355301208075457, 0.17972330898782135, 0.4360531259094637, 0.04693277635955752, 0.1488526535595696, 0.5932028556692636, 0.15678280207561157, 0.08271117418399457, 0.3942196721715452, 0.25188375440784805, 0.0, 0.0, 0.02224580215197885, 0.0, 0.004431555280393626, 0.0082361385250314, 0.07577932235643213, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0030940684800938485, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0887827376050074, 0.0, nan, 0.2640903133059913, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7755857176960066, 0.5992317962990823, 0.31715992774767265, 0.9690941188578124, 0.2737198541972104, 0.8198527582947819, 0.04693277635955752, 0.852558656296974, 0.7006580793061712, 0.2157502805263695, 0.17318002713003036, 0.907016300496102, 0.27321539495741115, nan, 0.0, 0.024087885506954242, nan, 0.004821840099273178, 0.01120632827949901, 0.2497740923829267, nan, nan, nan, 0.0, nan, nan, nan, 0.00345873289074666, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.13934850051706307, nan, nan, 0.3206200913710795, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1257 | 370.0 | 3700 | 3.4510 | 0.0826 | 0.2117 | 0.4241 | [0.5091975859565346, 0.5872247122742442, 0.29964067696993174, 0.39076219687163233, 0.1818623793048866, 0.4272440787334617, 0.02952734586524329, 0.1552538555385241, 0.5799153152789278, 0.2199093741149816, 0.05740476516232434, 0.3947780219076549, 0.23210183540556542, 0.0, 0.0, 0.018710191082802547, 0.0, 0.004100511743865634, 0.011916466081296402, 0.06943531246655148, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.004930390298815851, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10257146961643442, 0.0, nan, 0.43128017901307264, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.756383640828019, 0.6270403941627802, 0.31168542448242326, 0.9702499228144618, 0.2704379463865388, 0.8497487937123812, 0.02952734586524329, 0.8108947127638716, 0.6905183677068221, 0.316841783127614, 0.18971642658742977, 0.932123789274746, 0.2501850764557221, nan, 0.0, 0.020842572062084258, nan, 0.0044318383265378476, 0.017163504968383016, 0.24137564450114282, nan, nan, nan, 0.0, nan, nan, nan, 0.005671666256864191, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.18459152016546018, nan, nan, 0.5639340896257893, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0411 | 380.0 | 3800 | 3.4760 | 0.0816 | 0.2168 | 0.4276 | [0.5130715980000113, 0.5991079870990583, 0.30460726143733624, 0.3770378174559166, 0.18049638523108702, 0.4233997901364113, 0.021149624315883863, 0.1599175603984581, 0.5522647360115175, 0.35026890224612467, 0.07975037965072133, 0.3932518198592777, 0.2514428085352473, 0.0, 0.0, 0.024780618084700497, 0.0, 0.01601801222257961, 0.010379326948798888, 0.07485848135857896, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.00601557951930807, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.05700441784238278, 0.0, nan, 0.4203525768283967, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7690081164955836, 0.6523619678050249, 0.3142212032791441, 0.9689041236868563, 0.2632755861030035, 0.8630801372929413, 0.0211650034810861, 0.8215149336644664, 0.6395552954944412, 0.5647250841579109, 0.21710483818874748, 0.9163241200094495, 0.2700797317880513, nan, 0.0, 0.02618423704898206, nan, 0.017656443892926785, 0.014209331282502013, 0.24461808324031256, nan, nan, nan, 0.0, nan, nan, nan, 0.0071141709695926566, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.10341261633919338, nan, nan, 0.5434525948359941, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0812 | 390.0 | 3900 | 3.5357 | 0.0771 | 0.2052 | 0.4202 | [0.5086205545479284, 0.5481811238406983, 0.31139451292569503, 0.38525294209003286, 0.18095484012494772, 0.4174831113195351, 0.024352131198267193, 0.14885334847481152, 0.5710615524236601, 0.20508066011496384, 0.05652869348105957, 0.3874197445846924, 0.252980653795438, 0.0, 0.0, 0.03244296470054103, 0.0, 0.004995563151148651, 0.015609859229373316, 0.07776383568881258, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.007114926645694294, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.04396527583309997, 0.0, 0.0, 0.4489092475849245, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7608894894792484, 0.5905013539942832, 0.319591496456857, 0.9622463762379373, 0.266285122466273, 0.8192558324628165, 0.024352131198267193, 0.7709953597804066, 0.6859494858003959, 0.3384678159747016, 0.18719720948259155, 0.8878053390030711, 0.27344939967154247, nan, 0.0, 0.03456964321709333, nan, 0.005389115405070023, 0.021631387485046023, 0.25035879445064585, nan, nan, nan, 0.0, nan, nan, nan, 0.007933775920006558, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0811789038262668, nan, nan, 0.5080848005749191, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0218 | 400.0 | 4000 | 3.6522 | 0.0776 | 0.2045 | 0.4158 | [0.5066063421441083, 0.5635048337242695, 0.297512776376713, 0.40027323763384165, 0.17817917252315565, 0.4208121001164436, 0.009027616616384312, 0.14266192791975518, 0.5394599392346294, 0.3065267203784463, 0.05641328571812153, 0.3897372003598468, 0.1975139769211805, 0.0, 0.0, 0.022681516780258953, 0.0, 0.004545908682186033, 0.012802300906683503, 0.05854766615373342, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.004573237764390308, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.05968586387434555, 0.0, 0.0, 0.4088050314465409, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7761356273232616, 0.6042528584323755, 0.3098026955675976, 0.9555965452544748, 0.26599257535610044, 0.8359200119385166, 0.009027616616384312, 0.8119403960525455, 0.6333271880535761, 0.4230337651739263, 0.13571474710935985, 0.9251972596267423, 0.2113360392787549, nan, 0.0, 0.024753073977020763, nan, 0.00483956745257933, 0.017822700749530018, 0.21235316004890237, nan, nan, nan, 0.0, nan, nan, nan, 0.005114334890582739, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.1031540847983454, nan, nan, 0.5038242390021046, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.094 | 410.0 | 4100 | 3.5320 | 0.0811 | 0.2094 | 0.4255 | [0.5098057997037816, 0.5792602290750447, 0.29917905928268623, 0.3828659429722075, 0.18245962123661008, 0.43340489372653884, 0.010342693587065832, 0.14209644842340713, 0.6026448837911259, 0.21918335901386748, 0.06510588302719432, 0.3874413485023782, 0.22762324047933982, 0.0, 0.0, 0.011098880939277196, 0.0, 0.0065159508492235434, 0.015972791053812458, 0.07819832998110562, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.004860708067324434, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.07446283032847617, 0.0, nan, 0.38976361767728673, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7600710193363571, 0.6198567022717015, 0.30888564679727665, 0.9663471053444058, 0.275212853242229, 0.8586280654628662, 0.010342693587065832, 0.8345206195673486, 0.7681733289514817, 0.2902172804243599, 0.1467605451844196, 0.9113630994566502, 0.2452752320901301, nan, 0.0, 0.012195121951219513, nan, 0.006984577202623648, 0.021899948729217023, 0.27279009195768883, nan, nan, nan, 0.0, nan, nan, nan, 0.005491353167773133, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.15589451913133404, nan, nan, 0.48667932857656176, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0968 | 420.0 | 4200 | 3.5323 | 0.0846 | 0.2095 | 0.4290 | [0.5144775633729448, 0.593644380899913, 0.29926863151013294, 0.42372988746484375, 0.18313583607174075, 0.45091874339612914, 0.03544519223331013, 0.1458585473168347, 0.5823363165332077, 0.24281150159744408, 0.06835564053537285, 0.38747355537362915, 0.22953930074905626, 0.0, 0.0, 0.01814833357173509, 0.0, 0.005897993715252598, 0.012945041923263371, 0.0709436232382262, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.006799016938676399, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.06086840703249867, 0.0, nan, 0.40662220104811814, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.785189953278996, 0.6383189784865353, 0.30531471446436015, 0.9529603622574593, 0.2782896418147336, 0.870367606824852, 0.03544519223331013, 0.7994575517940004, 0.6936765071298604, 0.32561460777313067, 0.16626832891932045, 0.9034349161351287, 0.2470664318110263, nan, 0.0, 0.020459584761136868, nan, 0.00648821131005141, 0.017602968822481017, 0.2632753946738957, nan, nan, nan, 0.0, nan, nan, nan, 0.007573149741824441, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.11814891416752844, nan, nan, 0.5257430316718854, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0799 | 430.0 | 4300 | 3.6391 | 0.0802 | 0.2085 | 0.4217 | [0.49260082945934125, 0.5957218640261241, 0.30280764959232875, 0.403015277096858, 0.18048704095215723, 0.4028124785110138, 0.015533379747814652, 0.1506498272224091, 0.5944629461432368, 0.22912747793372884, 0.06545584045584045, 0.38992878700263783, 0.2408285963713357, 0.0, 0.0, 0.01570660955172284, 0.0, 0.005092772803944466, 0.017027118583629004, 0.07339340157721722, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.006763032625101591, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.09278797514091632, nan, nan, 0.3784535112479685, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7369598949629983, 0.6372423649766812, 0.3106502709462276, 0.9696403549743111, 0.2712785990019772, 0.8741730090036313, 0.015533379747814652, 0.8049147114567675, 0.7082408201543802, 0.32306436805059674, 0.1780892707189458, 0.9303094731868651, 0.25938784366783246, nan, 0.0, 0.016851441241685146, nan, 0.005566388938131537, 0.023315998925755022, 0.2290437463456121, nan, nan, nan, 0.0, nan, nan, nan, 0.007638718137857553, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.16597724922440538, nan, nan, 0.45423746214260047, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0662 | 440.0 | 4400 | 3.7599 | 0.0757 | 0.2086 | 0.4203 | [0.4926412308000367, 0.573800181104739, 0.29685009843442395, 0.39860827174083974, 0.1794377961462717, 0.44264216028333314, 0.05368920668187142, 0.1479309241297819, 0.595599989517545, 0.16642497065119813, 0.05472306944260266, 0.39135098700839666, 0.19992089302979696, 0.0, 0.0, 0.021776314579463316, 0.0, 0.003954079513548537, 0.014348425482513423, 0.07420236768062854, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.004547266472768075, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0939026311643296, 0.0, 0.0, 0.41152635577584024, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7551218326910616, 0.6256111779750263, 0.3079824927052939, 0.9631805191618046, 0.2641027882765949, 0.8361936029448341, 0.05390268430416957, 0.7854715378079864, 0.7286965861635019, 0.2458431092522697, 0.15993798850203475, 0.8997873848334514, 0.2128974889166858, nan, 0.0, 0.02386615601693207, nan, 0.0042191100868640315, 0.019311994921751018, 0.29718811460160527, nan, nan, nan, 0.0, nan, nan, nan, 0.0050487664945496275, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.15408479834539815, nan, nan, 0.583902263744161, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0617 | 450.0 | 4500 | 3.6476 | 0.0758 | 0.2095 | 0.4176 | [0.4980115095224274, 0.5149895782518071, 0.2988037158323866, 0.3699981573947374, 0.17840644170738712, 0.4340881520420542, 0.0175524096851551, 0.14420802377414563, 0.6038700628442647, 0.21671177266576455, 0.07842737828478305, 0.3880463399647608, 0.2417068263697188, 0.0, 0.0, 0.03248398152287289, 0.0, 0.0031865255491066347, 0.01187199530257991, 0.08092192256834466, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.003916353699346787, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.1126454417593658, 0.0, 0.0, 0.3941723346882923, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.760680609078198, 0.5459464796148639, 0.31061553425038213, 0.9696799373015936, 0.2653066028219027, 0.8543998408197782, 0.0175524096851551, 0.7928566760342461, 0.710910009057608, 0.3267367132510456, 0.17408436147535689, 0.9011292227734468, 0.2637530952441733, nan, 0.0, 0.03515420278169724, nan, 0.0034745612480056726, 0.015796284088967015, 0.2797533620368894, nan, nan, nan, 0.0, nan, nan, nan, 0.004393082534218507, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.22776628748707342, nan, nan, 0.5027462655921154, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0342 | 460.0 | 4600 | 3.7074 | 0.0799 | 0.2095 | 0.4218 | [0.5028446001811095, 0.5585803695565409, 0.30021348216855054, 0.3731737686352469, 0.17902986722174913, 0.402559539142449, 0.016863928212268894, 0.15198692201099623, 0.6031615503285919, 0.23618480652258314, 0.08036402442639473, 0.3853098452287265, 0.23592683013126703, 0.0, 0.0, 0.029742270923016716, 0.0, 0.00258307502583075, 0.018283240984309994, 0.07948157903189541, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.006129421221864952, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.07670142700329309, 0.0, nan, 0.39392981346822636, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7622195034614466, 0.5987569580261772, 0.31165763512574685, 0.9703924191926788, 0.2672300160060258, 0.8551211262000696, 0.016863928212268894, 0.7777596235540161, 0.7113909439951265, 0.3723349994899521, 0.17511788644144435, 0.9021875738247106, 0.25532892553544534, nan, 0.0, 0.03217093327958073, nan, 0.0027477397624534657, 0.023926365389780022, 0.26306277574017967, nan, nan, nan, 0.0, nan, nan, nan, 0.0069994262765347105, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.14451913133402275, nan, nan, 0.5116780452748833, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0657 | 470.0 | 4700 | 3.6987 | 0.0807 | 0.2092 | 0.4285 | [0.5121206372218667, 0.5842399933876532, 0.29763593221706136, 0.3729076774572705, 0.18197149957166597, 0.43436451637685985, 0.03788195250251412, 0.14975226855202362, 0.6136289793184847, 0.15479186138471493, 0.08083951069216924, 0.39209442127887784, 0.23890628640479566, 0.0, 0.0, 0.02587479698804075, 0.0, 0.004216237416655772, 0.017552269626461153, 0.08500752987920152, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0018531802965088475, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.09611932053583759, 0.0, 0.0, 0.39802283158761914, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7709796917095795, 0.6313985632616218, 0.30762123106850076, 0.969482025665181, 0.27356853672643144, 0.843729791573397, 0.03788195250251412, 0.7911247630873799, 0.741778016464006, 0.21962664490462103, 0.1711775725082359, 0.8912260807937633, 0.2573966762821331, nan, 0.0, 0.028260431364644224, nan, 0.004573657152987059, 0.023242754950072022, 0.2820390155743369, nan, nan, nan, 0.0, nan, nan, nan, 0.002032620277026473, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.17993795243019647, nan, nan, 0.5208151532262204, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0628 | 480.0 | 4800 | 3.5813 | 0.0789 | 0.2125 | 0.4285 | [0.5044925731553737, 0.5827042460686552, 0.3016070474812556, 0.365731154555206, 0.18079639395907252, 0.41206837214638065, 0.028351512338516283, 0.16746760878787667, 0.6125493291239148, 0.14705882352941177, 0.08216630505226971, 0.39327139077221634, 0.2420292162560204, 0.0, 0.0, 0.025348109236138595, 0.0, 0.004380048561407964, 0.0190767153945896, 0.08110579734044945, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.00506235702162202, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.105871535531328, 0.0, nan, 0.3946700507614213, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7607701292500767, 0.6332509026628554, 0.30992774767264136, 0.9690070377377908, 0.2705556377527002, 0.8933989951748496, 0.028351512338516283, 0.7868766747271421, 0.7465072100162716, 0.1994287463021524, 0.18835992506943997, 0.9115237420269312, 0.2587028480501025, nan, 0.0, 0.02840153194920379, nan, 0.004892749512497784, 0.026221343294514025, 0.28756710785095413, nan, nan, nan, 0.0, nan, nan, nan, 0.005622489959839358, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.20837642192347466, nan, nan, 0.5587495508444125, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0835 | 490.0 | 4900 | 3.6482 | 0.0800 | 0.2102 | 0.4330 | [0.5007355015792054, 0.6161499911966917, 0.3021304508247127, 0.3688357543658833, 0.17830281097652334, 0.44255768399094353, 0.06774156397558448, 0.1531792089230363, 0.6002667733760171, 0.1480657431995981, 0.08799611210108538, 0.3937319692548865, 0.23802889227694868, 0.0, 0.0, 0.021359187236492636, 0.0, 0.0030881343746936374, 0.017506505796072866, 0.07923873268069934, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.00692891608803107, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.09003636363636364, 0.0, nan, 0.4036197645551373, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7792389932817242, 0.6745571310365579, 0.30719744337918575, 0.9642334090675195, 0.2601113696584933, 0.865368352982142, 0.06825249477837085, 0.7961244363113522, 0.7214344686069719, 0.21044578190349894, 0.17544086299334669, 0.9066383179777935, 0.2544056705724181, nan, 0.0, 0.0231808103204999, nan, 0.003350469774862613, 0.023486901535682024, 0.26933503428480304, nan, nan, nan, 0.0, nan, nan, nan, 0.007720678632898942, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.16003102378490175, nan, nan, 0.5174272367948257, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0932 | 500.0 | 5000 | 3.6990 | 0.0843 | 0.2111 | 0.4280 | [0.5083389105295105, 0.5978525732096884, 0.3061921843837425, 0.3720533589475985, 0.1813041873981569, 0.44401315789473683, 0.01577318790129187, 0.1436313232181715, 0.6181803904118105, 0.18661257606490872, 0.07377964209600488, 0.3912757613945006, 0.21991067986120036, 0.0, 0.0, 0.029721473495058402, 0.0, 0.002938439688525393, 0.023155278633436013, 0.07598555312295185, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.00593498226924047, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10407632263660017, 0.0, nan, 0.4183602818392864, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7657534188179927, 0.6528697156612006, 0.3125121578435459, 0.9567048504183852, 0.26601275101887095, 0.8393025916529872, 0.01577318790129187, 0.7872688059603947, 0.757199996793767, 0.2721615831888197, 0.14088237193979716, 0.9447484053862509, 0.23296658412682206, nan, 0.0, 0.033340052408788554, nan, 0.0031909235951072504, 0.032300593276203035, 0.2896401424546856, nan, nan, nan, 0.0, nan, nan, nan, 0.006556839603311204, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.18614270941054809, nan, nan, 0.5272829936861557, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0665 | 510.0 | 5100 | 3.6691 | 0.0757 | 0.2114 | 0.4246 | [0.5000358489161219, 0.5442281741439321, 0.302040564531054, 0.3663952827892132, 0.18222311627105006, 0.42491217624292943, 0.05388865737525274, 0.1569765609736114, 0.602897638217209, 0.20973689144428473, 0.08917628736838541, 0.388390302979683, 0.22481293002314595, 0.0, 0.0, 0.028535980148883373, 0.0, 0.0024328913852785577, 0.021611828862130804, 0.07894736842105263, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.00879876508560202, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.08256206845893933, 0.0, 0.0, 0.42514147802929425, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.772983238413532, 0.5858799082292764, 0.30613450048631374, 0.9651279696641044, 0.27719006819374015, 0.8874794806745262, 0.05401872050746499, 0.8139010522188093, 0.7214745465184318, 0.34234418035295316, 0.17014404754214843, 0.8788282541932435, 0.2413354436303917, nan, 0.0, 0.031062285829469866, nan, 0.0026413756426165572, 0.028833711760541027, 0.27427842449370116, nan, nan, nan, 0.0, nan, nan, nan, 0.010277846078190312, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.1452947259565667, nan, nan, 0.5244597299933268, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0877 | 520.0 | 5200 | 3.7111 | 0.0763 | 0.2102 | 0.4210 | [0.49668371752010576, 0.5615962149800385, 0.30848538198890313, 0.38246477858588135, 0.18070557745165297, 0.4227186708938534, 0.013522085557360564, 0.14653196516097183, 0.6246860050758456, 0.2056387051862165, 0.07109080452425075, 0.38006611294758186, 0.21454665563019323, 0.0, 0.0, 0.02773475226415438, 0.0, 0.002885075094383171, 0.02019042601234382, 0.07990498812351544, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.006268402999119692, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.1016192783373709, 0.0, 0.0, 0.40866329389137435, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7479495617774443, 0.6038062283737025, 0.32137001528414616, 0.966663763962666, 0.27103649104873095, 0.8555936924837089, 0.013522085557360564, 0.8147506698908568, 0.7714276553620237, 0.30133632561460777, 0.14372456559653768, 0.9050224427120246, 0.22937993005386362, nan, 0.0, 0.030679298528522476, nan, 0.0031022868285764935, 0.026196928635953028, 0.2682187848827938, nan, nan, nan, 0.0, nan, nan, nan, 0.006769936890418818, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.2060496380558428, nan, nan, 0.5017196242492685, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0421 | 530.0 | 5300 | 3.6585 | 0.0766 | 0.2082 | 0.4281 | [0.5123497569629336, 0.5903871531645787, 0.3025471846010183, 0.36483517800894916, 0.18037864582156654, 0.41149912903340485, 0.02565947242206235, 0.15110792417073748, 0.6285871853331451, 0.16450822541127055, 0.07003968915718907, 0.38460129814637717, 0.23915240733914409, 0.0, 0.0, 0.03114525392712844, 0.0, 0.002186061801446417, 0.023191780315629965, 0.08216752257836019, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.007163467750039478, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.07952043063371667, 0.0, 0.0, 0.4198401904114247, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7670067012242949, 0.6478204453136753, 0.30837154369876335, 0.9636871729510208, 0.26847418187687466, 0.8637019350345719, 0.02565947242206235, 0.771256780602575, 0.7491683833372075, 0.23972253391818832, 0.1356501517989794, 0.8930781951334751, 0.25670742603323715, nan, 0.0, 0.034650272122555933, nan, 0.002357737989718135, 0.03064039649405503, 0.2713017594216765, nan, nan, nan, 0.0, nan, nan, nan, 0.008179657405130727, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.16804550155118925, nan, nan, 0.5070581592320723, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0507 | 540.0 | 5400 | 3.7101 | 0.0815 | 0.2115 | 0.4270 | [0.4974824017942398, 0.5720456241338876, 0.30314888413329255, 0.3655056852184321, 0.17869754152854492, 0.4239332448321639, 0.057302007926636124, 0.15676353044774097, 0.6200374973778058, 0.16726321585903084, 0.07578496956547942, 0.3980758857836362, 0.2334041928754432, 0.0, 0.0, 0.030038220675052487, 0.0, 0.0028783420749648203, 0.01730433687694494, 0.07859108881840918, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.004943880277926242, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.1187785848802798, 0.0, nan, 0.42254351157034536, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7640120383316851, 0.6307309688581315, 0.3100041684035015, 0.9670120884427521, 0.261207580669025, 0.8339800029846292, 0.057376034656146054, 0.7562577609306581, 0.7581378199219282, 0.24788330103029685, 0.16245720560687293, 0.892652964800378, 0.2503892987516912, nan, 0.0, 0.033743196936101595, nan, 0.0031909235951072504, 0.023486901535682024, 0.29899537553819167, nan, nan, nan, 0.0, nan, nan, nan, 0.005458568969756578, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.2282833505687694, nan, nan, 0.5520763821159078, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0932 | 550.0 | 5500 | 3.7090 | 0.0810 | 0.2116 | 0.4285 | [0.5073254998642902, 0.5980655310884674, 0.2975618927393958, 0.3733753325789779, 0.18044834938843712, 0.42536103921236607, 0.026626440782857586, 0.14937098412583924, 0.6102345160151593, 0.20635964912280702, 0.08033240997229917, 0.3919819002373415, 0.22244013627906603, 0.0, 0.0, 0.024436605559921772, 0.0, 0.0030642195916524206, 0.020905492330529442, 0.07742817667753928, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.004672828959857479, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.09352857817666509, 0.0, nan, 0.4043623451832439, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7649242915117825, 0.6630810892131789, 0.3078713352785883, 0.9665212675844489, 0.26436507189261166, 0.874695319106601, 0.026626440782857586, 0.8120711064636298, 0.7343876495908045, 0.28797306946853, 0.16859376009301724, 0.934826364280652, 0.23500455245534765, nan, 0.0, 0.02695021165087684, nan, 0.0033681971281687645, 0.02888254107766303, 0.28894913092010843, nan, nan, nan, 0.0, nan, nan, nan, 0.005245471682648963, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.15356773526370218, nan, nan, 0.491042554283661, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0798 | 560.0 | 5600 | 3.7925 | 0.0793 | 0.2133 | 0.4273 | [0.4951765512657083, 0.6000822488198151, 0.29813820751511855, 0.37068550037131903, 0.18114780966265298, 0.4194969931585226, 0.03585867867315998, 0.14795304820662972, 0.6054297692544893, 0.2341155866900175, 0.08440450460698441, 0.39587849418362014, 0.23529529153977866, 0.0, 0.0, 0.022588875025605692, 0.0, 0.0031117669703580137, 0.020952896589063345, 0.08079366947108313, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.00915078393658565, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.09277223664435032, 0.0, nan, 0.4249505589861565, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7526088735804658, 0.6585771776741387, 0.30482840072252326, 0.968112477141206, 0.26573701696100716, 0.8570860070636224, 0.03622650266883268, 0.7397555715312725, 0.7396699183212164, 0.3409160461083342, 0.1917188812092242, 0.9232600992204111, 0.2508232711306257, nan, 0.0, 0.02445071558153598, nan, 0.00343910654139337, 0.028345418589321027, 0.2919789507255621, nan, nan, nan, 0.0, nan, nan, nan, 0.01029423817719859, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.17554291623578078, nan, nan, 0.540475334941738, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0618 | 570.0 | 5700 | 3.6804 | 0.0794 | 0.2115 | 0.4267 | [0.4942523648835537, 0.5821488844671383, 0.2942625907850224, 0.36369266130160743, 0.18122735052757627, 0.4029482321037444, 0.04512260742562841, 0.15366743016612694, 0.6096287820829638, 0.1779097217254453, 0.09405419206587304, 0.4010141837549495, 0.2325442300561025, 0.0, 0.0, 0.03670373931817769, 0.0, 0.0028999125166056443, 0.019034210381964126, 0.08065516630235718, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.004685121414108571, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0908973531310523, 0.0, nan, 0.41445456306917267, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7617399311120963, 0.6340924477207763, 0.3003404196192858, 0.9670437543045781, 0.2702328271483718, 0.8647962990598418, 0.04540883422294423, 0.7629239918959545, 0.721755091898651, 0.25369784759767416, 0.19036237969123443, 0.9206520198440822, 0.24813008960253236, nan, 0.0, 0.04077806893771417, nan, 0.003173196241801099, 0.025415659562001026, 0.29027799925583375, nan, nan, nan, 0.0, nan, nan, nan, 0.005196295385624129, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.18200620475698034, nan, nan, 0.5489964580873672, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0721 | 580.0 | 5800 | 3.7025 | 0.0755 | 0.2094 | 0.4198 | [0.49315858792804557, 0.5697901362757052, 0.29030091348737236, 0.3649888665421635, 0.18010450947356108, 0.411802986297738, 0.017269113575549472, 0.13857836311344204, 0.5995576535833361, 0.2327894428915036, 0.07984911402035058, 0.3903314104192114, 0.22903595438992924, 0.0, 0.0, 0.03229199343088591, 0.0, 0.003269719483967044, 0.022079874033781848, 0.08402137736815156, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.003863757950425013, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10465910643530536, 0.0, 0.0, 0.43328835287172135, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7470117314053815, 0.6217560553633218, 0.30026399888842575, 0.9654367118169079, 0.2692307692307692, 0.8498980251703726, 0.017273922797246075, 0.6959349062152801, 0.7279110590988882, 0.36529633785575844, 0.1627155868483948, 0.9149633829435388, 0.2432202452369404, nan, 0.0, 0.03527514613989115, nan, 0.003580925367842581, 0.03012768866427403, 0.27911550523574125, nan, nan, nan, 0.0, nan, nan, nan, 0.004261945742152283, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.1980351602895553, nan, nan, 0.5274369898875828, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0527 | 590.0 | 5900 | 3.7035 | 0.0802 | 0.2123 | 0.4290 | [0.5070100232088128, 0.6042485962552574, 0.2953623483844664, 0.36947230781337964, 0.18118315917474692, 0.408192188843196, 0.0534406351941897, 0.14785952354933887, 0.5646835942991412, 0.19077656293804318, 0.07515401638853998, 0.39406755397037047, 0.25569382391069073, 0.0, 0.0, 0.02784652330931823, 0.0, 0.004082739262874184, 0.015982564475118054, 0.07915074553969029, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.00466529081766125, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.1384177133238379, 0.0, nan, 0.41690298358429523, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7640887699075811, 0.6653001353994283, 0.301979991663193, 0.9663391888789493, 0.26854815930703324, 0.8739491618166443, 0.05373249787266961, 0.7074374223906934, 0.6761384130750179, 0.277670100989493, 0.16232801498611202, 0.9104086935979211, 0.27441520094622995, nan, 0.0, 0.030618826849425518, nan, 0.0045382024463747565, 0.021484899533680022, 0.2928825811938553, nan, nan, nan, 0.0, nan, nan, nan, 0.005261863781657241, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.2763702171664943, nan, nan, 0.5358041168317849, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.046 | 600.0 | 6000 | 3.7546 | 0.0749 | 0.2085 | 0.4241 | [0.499135089402018, 0.5835027791987826, 0.29646241186334527, 0.36441729160377445, 0.17969397839446571, 0.4229806983405649, 0.03664099709284345, 0.14449766087155996, 0.6179875628473142, 0.15077143882310728, 0.07935297295579496, 0.40057949725788033, 0.20571350078175818, 0.0, 0.0, 0.032519736365611646, 0.0, 0.0031993857179421553, 0.018049369804720605, 0.07641576437506341, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.006564163217031342, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.11324786324786325, 0.0, nan, 0.41200126163065764, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7620084916277325, 0.6381403264630661, 0.3058357649020425, 0.9651913013877564, 0.26596567447240643, 0.8704422225538477, 0.03665970449446894, 0.679269328802039, 0.7487756198049007, 0.21432214628175048, 0.16129449002002455, 0.9380014174344437, 0.2166373097116211, nan, 0.0, 0.03620237855271115, nan, 0.003545470661230278, 0.024439073219561024, 0.2802317546377505, nan, nan, nan, 0.0, nan, nan, nan, 0.007278091959675437, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.2329369183040331, nan, nan, 0.536420101637493, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0698 | 610.0 | 6100 | 3.6660 | 0.0762 | 0.2103 | 0.4283 | [0.5104935762830499, 0.6036304951371975, 0.29713498547813577, 0.35243516468228664, 0.18068930160177668, 0.4222921468905882, 0.06034276727135357, 0.14988041638398333, 0.6027946127946128, 0.184774423221743, 0.07433460621895546, 0.4023887816365064, 0.22188441325386143, 0.0, 0.0, 0.03401702647365207, 0.0, 0.0032525742935567295, 0.020454084361570667, 0.07814396558346777, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0069433440412567165, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10457831325301205, 0.0, nan, 0.41350798198935734, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7710415032568291, 0.6664613735519783, 0.30420314019730443, 0.9626421995107625, 0.2678386484996032, 0.8611898721583843, 0.060547690879554424, 0.7146918502058689, 0.7175148488661959, 0.2753238804447618, 0.16723725857502744, 0.8961682022206473, 0.23567678417957946, nan, 0.0, 0.03817778673654505, nan, 0.003634107427761035, 0.027075856344149028, 0.2703449742199543, nan, nan, nan, 0.0, nan, nan, nan, 0.00790099172199, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.22440537745604963, nan, nan, 0.5185565422719572, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0235 | 620.0 | 6200 | 3.7796 | 0.0772 | 0.2052 | 0.4227 | [0.49988190885599726, 0.5697725067385445, 0.30120161850655985, 0.3638983167009842, 0.17974541830312243, 0.4112418957391325, 0.018236189563645075, 0.1473463687150838, 0.5982861140258832, 0.11266889437745169, 0.07351937984496124, 0.39174737691686845, 0.23493660855784468, 0.0, 0.0, 0.030201585176070356, 0.0, 0.003219038772601336, 0.02520233226002959, 0.07539380081300813, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0030925851940634204, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10229163999487902, 0.0, nan, 0.4153388570214574, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7623921495072128, 0.6211260719121409, 0.3071904960400167, 0.9667666780136005, 0.2656193255948458, 0.8550713823807392, 0.01825636265181403, 0.7722371086857068, 0.7129539825420618, 0.1582168723860043, 0.15315548091208578, 0.9173163241200094, 0.25229111888290406, nan, 0.0, 0.03364241080427333, nan, 0.00356319801453643, 0.035352425596328035, 0.2523786743209483, nan, nan, nan, 0.0, nan, nan, nan, 0.003425948692730104, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.20656670113753878, nan, nan, 0.5017709563164109, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0299 | 630.0 | 6300 | 3.8038 | 0.0712 | 0.2030 | 0.4158 | [0.4953845008047643, 0.5238679208359969, 0.29448047991116766, 0.3596473894847982, 0.1784929623865405, 0.419049371358479, 0.020620466121315018, 0.14468204764264003, 0.5759730814861487, 0.12710098624809002, 0.06578532304780477, 0.3840914229338376, 0.24010454089649347, 0.0, 0.0, 0.0319536544487357, 0.0, 0.0022659588242910787, 0.019069674874708255, 0.07944365512600014, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0060591902585741195, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10411502231036192, 0.0, 0.0, 0.4143185056197562, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7655722470415715, 0.5663504212426659, 0.30216062248158954, 0.9653812965587125, 0.26417004048583, 0.8496990498930508, 0.02066991568035894, 0.7061629958826221, 0.6887789863494633, 0.1866775476894828, 0.14818164201278988, 0.8829293645168911, 0.25875815825526083, nan, 0.0, 0.03535577504535376, nan, 0.0024818294628611948, 0.025733050123294025, 0.282357943974911, nan, nan, nan, 0.0, nan, nan, nan, 0.006802721088435374, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.2171664943123061, nan, nan, 0.5373954109131974, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0246 | 640.0 | 6400 | 3.7962 | 0.0713 | 0.2062 | 0.4192 | [0.49759388977385766, 0.5662343396912527, 0.3005323101449667, 0.3621106612846133, 0.17973588342440802, 0.4150895937140667, 0.02325904647176018, 0.1473192467105472, 0.5520546182345252, 0.18366523143164692, 0.06832279250790561, 0.38993779418252167, 0.23115957290395903, 0.0, 0.0, 0.031141304347826086, 0.0, 0.003610280552197977, 0.022773806530406983, 0.07817908084868615, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0034068312083284594, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.09308996088657105, 0.0, 0.0, 0.4123529652854625, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7623452579886096, 0.6149343688882203, 0.3090801722940114, 0.9637030058819338, 0.2654444698508346, 0.861936029448341, 0.023300069621721976, 0.7584471603163192, 0.6533260658720553, 0.2784861777007039, 0.14514566242490795, 0.9104370422867942, 0.24750040418996078, nan, 0.0, 0.034650272122555933, nan, 0.004024109200496366, 0.030957787055348032, 0.2677403922819327, nan, nan, nan, 0.0, nan, nan, nan, 0.0038193590689287764, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.18459152016546018, nan, nan, 0.5164519275191212, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0997 | 650.0 | 6500 | 3.8232 | 0.0742 | 0.2032 | 0.4233 | [0.5045426728804783, 0.5946193759024478, 0.29827082500152513, 0.3628505033935979, 0.17875316460054305, 0.4605787727564192, 0.023315541115494703, 0.140518647311741, 0.5598503655840843, 0.1450291585670647, 0.06762743228845398, 0.3930294993146296, 0.23362907978782713, 0.0, 0.0, 0.02345779823324178, 0.0, 0.0031516974370460436, 0.01882654971385784, 0.07653787772092077, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.00452589075742113, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.09655996812325673, 0.0, nan, 0.41728425059733526, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7767196398731371, 0.6524559951857981, 0.3057037654578296, 0.9666400145662964, 0.26187674015091394, 0.8217927672486693, 0.023315541115494703, 0.6813606953793869, 0.6597786096170957, 0.2130980312149342, 0.1558038886376849, 0.9157949444838176, 0.24961070124830878, nan, 0.0, 0.025478734126184236, nan, 0.003492288601311824, 0.025781879440416027, 0.259979801201297, nan, nan, nan, 0.0, nan, nan, nan, 0.005015982296533071, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.18795243019648397, nan, nan, 0.5289256198347108, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0255 | 660.0 | 6600 | 3.7623 | 0.0758 | 0.2102 | 0.4279 | [0.5063419773882061, 0.6092111995872679, 0.30079701752342053, 0.3578716514571681, 0.17891515788401624, 0.4536758355361302, 0.031021234819848024, 0.14639853245813264, 0.5900038168979526, 0.18642155561454737, 0.07354116706634692, 0.3959881099061073, 0.21661470381172684, 0.0, 0.0, 0.030301376839271747, 0.0, 0.0029966988237556487, 0.019386867661979997, 0.07775841869686684, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.004620884071027929, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.09623430962343096, 0.0, nan, 0.4232421713624721, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7784567574941172, 0.6772886264480217, 0.30493261081005973, 0.9623967890816109, 0.25991633825171157, 0.8622842361836541, 0.031043552254970217, 0.7667145938173976, 0.693852849940284, 0.28654493522391106, 0.16045475098507847, 0.9202078903850697, 0.229090615134574, nan, 0.0, 0.033581939125176374, nan, 0.00331501506825031, 0.026123684660270027, 0.2751820549619944, nan, nan, nan, 0.0, nan, nan, nan, 0.005212687484632407, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.1843329886246122, nan, nan, 0.5258970278733125, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0482 | 670.0 | 6700 | 3.7754 | 0.0763 | 0.2085 | 0.4224 | [0.502063126826474, 0.5762712593631549, 0.2965278295714199, 0.3573437766649209, 0.18001334782004588, 0.43666229840017645, 0.030818582360670945, 0.14851926074370964, 0.575, 0.195733834182989, 0.0665501375215279, 0.39583591318489897, 0.2284259299907809, 0.0, 0.0, 0.03055752485866463, 0.0, 0.002670439191874235, 0.022096015618067264, 0.07379306388227533, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.005629582218084487, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10678655006774233, 0.0, 0.0, 0.42292665243484917, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.771544521365481, 0.6297060704077028, 0.30383493122134225, 0.9614468132268305, 0.26574710479239244, 0.8370392478734517, 0.03084242283592481, 0.7192667145938174, 0.6775170932292376, 0.2976639804141589, 0.16723725857502744, 0.9061752893928656, 0.2435180694185621, nan, 0.0, 0.033884297520661154, nan, 0.002978195355433434, 0.03039624990844503, 0.273481103492266, nan, nan, nan, 0.0, nan, nan, nan, 0.00634374231620359, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.22414684591520165, nan, nan, 0.5389867049946101, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0399 | 680.0 | 6800 | 3.7680 | 0.0753 | 0.2100 | 0.4267 | [0.5075098825294796, 0.592645709285699, 0.2973987554830154, 0.3531670592071362, 0.17858826943305886, 0.45031788556119795, 0.03284076058799422, 0.14837098645274188, 0.5890219966159053, 0.21835641221905225, 0.07855248791140228, 0.39547628904856963, 0.23564132301891905, 0.0, 0.0, 0.029366923188616233, 0.0, 0.0027090770015298317, 0.02476146397037881, 0.07876219664748561, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0038490339973041784, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.106944632462434, 0.0, nan, 0.4216417910447761, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7727743580124816, 0.6527756882804273, 0.30381408920383496, 0.9603385080629201, 0.264893001735107, 0.8403223399492613, 0.0328537170263789, 0.7397555715312725, 0.6975800957060526, 0.32010608997245743, 0.16265099153801435, 0.9116938341601701, 0.25155506769118186, nan, 0.0, 0.03261439225962508, nan, 0.0030136500620457367, 0.03396079005835104, 0.2677403922819327, nan, nan, nan, 0.0, nan, nan, nan, 0.004212769445127449, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.20423991726990692, nan, nan, 0.5220471228376367, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0207 | 690.0 | 6900 | 3.8334 | 0.0744 | 0.2065 | 0.4211 | [0.5025129002071587, 0.5639846015743681, 0.30607679625664425, 0.35261096795795666, 0.17863105940015803, 0.42825162581290643, 0.028071341873535938, 0.1372938411152276, 0.5975412185902009, 0.1879914700786352, 0.07025469981807156, 0.3924673021863613, 0.22558070873995578, 0.0, 0.0, 0.024576784109458395, 0.0, 0.0026239958788776383, 0.023871330804000945, 0.07872541258145058, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.005274945482407025, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.09970201142289546, 0.0, 0.0, 0.40686938323279576, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.765721447328036, 0.6143843087106966, 0.31083784910379325, 0.9587314655752499, 0.26226007774355387, 0.8516888026662687, 0.02808849694437998, 0.6890399320305862, 0.7184125940828971, 0.2877690502907273, 0.1496673341515406, 0.9119111741081974, 0.24055684612700926, nan, 0.0, 0.027010683329973795, nan, 0.002889558588902677, 0.03210527600771503, 0.2748631265614203, nan, nan, nan, 0.0, nan, nan, nan, 0.005868371444963528, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.2076008273009307, nan, nan, 0.508957445716339, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0981 | 700.0 | 7000 | 3.8103 | 0.0729 | 0.2052 | 0.4196 | [0.5052781799955447, 0.5688994107202594, 0.3006372975323405, 0.3543077620098505, 0.17862790386752048, 0.42596050041277705, 0.03018208257916802, 0.14188269189025635, 0.5554154976348283, 0.19005847953216373, 0.07584514044245588, 0.3954363928883517, 0.21972074207616635, 0.0, 0.0, 0.02485146147195703, 0.0, 0.002422550576759558, 0.0232024253894818, 0.07840461495518407, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.004563555503880476, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0966557911908646, 0.0, 0.0, 0.4179904928289928, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7735203594448045, 0.6236272002407101, 0.3074128108934278, 0.9595864438445523, 0.2605754099022153, 0.8341541063522857, 0.03018488435058405, 0.7048558917717797, 0.6569250623211523, 0.29174742425788025, 0.15767715263871843, 0.9150956768249469, 0.23386005667168713, nan, 0.0, 0.027232412819995968, nan, 0.00267683034922886, 0.03232500793476403, 0.26875033221708394, nan, nan, nan, 0.0, nan, nan, nan, 0.005147119088599295, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.18381592554291623, nan, nan, 0.5281043067604333, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0162 | 710.0 | 7100 | 3.8549 | 0.0729 | 0.2066 | 0.4200 | [0.5054046672448164, 0.5403135498320268, 0.2965074403750764, 0.346386244138127, 0.18044234334612938, 0.4219967135384654, 0.02872421814022859, 0.1440656481419708, 0.5976206814642773, 0.19320551631348806, 0.06896452264205365, 0.39534362735743384, 0.23146301343737832, 0.0, 0.0, 0.02660140065266014, 0.0, 0.0025928164667705913, 0.024119842085338815, 0.07829170568459189, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.005161346921468571, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.09643213572854291, 0.0, 0.0, 0.40938446818689206, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7651736691334448, 0.5852499247780953, 0.3031610393219397, 0.9648271439767573, 0.26901556216121697, 0.8367407849574691, 0.0287537711766071, 0.7056401542382851, 0.725995334931106, 0.2929715393246965, 0.1544473871196951, 0.9005055516182376, 0.2478577932079068, nan, 0.0, 0.029248135456561176, nan, 0.002889558588902677, 0.03266681315461803, 0.27108914048796046, nan, nan, nan, 0.0, nan, nan, nan, 0.005786410949922137, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.1998448810754912, nan, nan, 0.5271289974847287, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0558 | 720.0 | 7200 | 3.8465 | 0.0771 | 0.2099 | 0.4250 | [0.5075710780285729, 0.589164891151342, 0.2986792109010657, 0.35261278391633527, 0.1795206640502644, 0.4245596376446905, 0.0264800877775889, 0.14369941576888032, 0.5946476025998889, 0.21131900212314225, 0.0766879954159106, 0.39588146130713553, 0.2312921372953189, 0.0, 0.0, 0.029099567414300774, 0.0, 0.0026899691210645273, 0.026277009392255385, 0.07749732845682636, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.005446464881194446, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10547770700636942, 0.0, nan, 0.4241235596960039, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7636198547215496, 0.6513182638784414, 0.3074544949284424, 0.9630380227835876, 0.2620986724413897, 0.8392777197433219, 0.026510404579562157, 0.7008692242337102, 0.7120642529076525, 0.3249005406508212, 0.15561010270654352, 0.9230238601464682, 0.2468196632033969, nan, 0.0, 0.03227171941140899, nan, 0.002995922708739585, 0.036133694670280034, 0.27369372242598206, nan, nan, nan, 0.0, nan, nan, nan, 0.006147037128104254, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.2140641158221303, nan, nan, 0.5328268569375288, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0827 | 730.0 | 7300 | 3.8448 | 0.0725 | 0.2066 | 0.4187 | [0.4974839378238342, 0.5354029984568034, 0.30146292271107444, 0.34764077578132385, 0.17841068499284896, 0.40816792974858657, 0.034615503301857684, 0.1394486065328139, 0.6045761043517855, 0.17785733585517427, 0.08029148504309054, 0.39771993526259736, 0.2183205741053565, 0.0, 0.0, 0.028753231987970588, 0.0, 0.0024086647268024572, 0.02285216339741424, 0.07947421315424251, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.004671078240560529, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10078203834510595, 0.0, 0.0, 0.40904973649538867, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.767430856324387, 0.5839382428163081, 0.3073711268584132, 0.9608055795248538, 0.262586250958344, 0.8439287668507188, 0.03471029627910575, 0.7603097836742696, 0.723710893977893, 0.26859124757727226, 0.15586848394806538, 0.8963571934798016, 0.23259643121537793, nan, 0.0, 0.03160653094134247, nan, 0.0026413756426165572, 0.03137283625088503, 0.2648168819433371, nan, nan, nan, 0.0, nan, nan, nan, 0.005311040078682075, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.20656670113753878, nan, nan, 0.5099840870591859, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0443 | 740.0 | 7400 | 3.8271 | 0.0776 | 0.2101 | 0.4272 | [0.5087987639360839, 0.5899426528762505, 0.29930417897164124, 0.35696465879739414, 0.1797385097022565, 0.43035002386819593, 0.021584291647010603, 0.13956567273302112, 0.6092215465619798, 0.207703359102415, 0.0814789802464967, 0.3991113529934976, 0.23328796733648227, 0.0, 0.0, 0.027861240098333788, 0.0, 0.002589928057553957, 0.024752655317910665, 0.0764445593770652, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.00471231601605678, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.11380400421496312, 0.0, nan, 0.42583236706973665, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7678635371551342, 0.6548489920264781, 0.30839933305543976, 0.9616843071905256, 0.26390775686981316, 0.8296274187932149, 0.021590469559836, 0.6924057251160055, 0.7363915451637985, 0.30970111190451904, 0.15586848394806538, 0.9285896527285613, 0.24795990435589138, nan, 0.0, 0.03084055633944769, nan, 0.0028718312355965254, 0.033228350301521034, 0.28283633657577206, nan, nan, nan, 0.0, nan, nan, nan, 0.005311040078682075, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.2233712512926577, nan, nan, 0.5278476464247215, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0309 | 750.0 | 7500 | 3.8323 | 0.0726 | 0.2101 | 0.4226 | [0.5045083630764924, 0.5632677634359936, 0.29661806399351964, 0.34995296331138287, 0.17958512910070107, 0.4062931732647325, 0.023730437066448312, 0.14477466989960316, 0.6080041802883327, 0.2106049775813893, 0.08098320728111781, 0.3956758707518424, 0.2201100028863731, 0.0, 0.0, 0.030137876272287593, 0.0, 0.0027136826736915607, 0.02446944010907199, 0.07926354779468983, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.004409338453227432, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10452120904241809, 0.0, 0.0, 0.4149140343862455, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7682663779285884, 0.620199902211524, 0.30527303042934556, 0.9600455988410295, 0.2638371420501163, 0.8484803263194548, 0.023764214434903688, 0.7355728383765767, 0.7274782176551216, 0.33061307762929715, 0.16323234933143854, 0.9117127332860855, 0.23360477880172567, nan, 0.0, 0.03366256803063898, nan, 0.003031377415351888, 0.033301594277204034, 0.27071705735395735, nan, nan, nan, 0.0, nan, nan, nan, 0.004950413900499959, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.2127714581178904, nan, nan, 0.5326728607361019, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0337 | 760.0 | 7600 | 3.8178 | 0.0737 | 0.2078 | 0.4229 | [0.5014347491657869, 0.5553838804222921, 0.3034435021034664, 0.3579621185030632, 0.17969277071236303, 0.41706279709782335, 0.02410392741773443, 0.14268762401276117, 0.6100376077478616, 0.1927638592750533, 0.07909133364467276, 0.3939000065397947, 0.23343269345415524, 0.0, 0.0, 0.026169548162138602, 0.0, 0.0027546016177313545, 0.025574075398812506, 0.07690053066765395, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0037852215596866595, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10240733664501338, 0.0, 0.0, 0.4130232181206694, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7683857381577601, 0.6066834662253648, 0.3136932055022926, 0.958873961953467, 0.264210391811371, 0.8292294682385714, 0.0241277945385627, 0.7190706489771911, 0.7346201014772719, 0.295113740691625, 0.15315548091208578, 0.9106543822348216, 0.24989150690526638, nan, 0.0, 0.029046563192904655, nan, 0.0030668321219641907, 0.034912961742230034, 0.2649763461436241, nan, nan, nan, 0.0, nan, nan, nan, 0.004245553643144004, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.2078593588417787, nan, nan, 0.5232277603819105, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0635 | 770.0 | 7700 | 3.8449 | 0.0726 | 0.2084 | 0.4233 | [0.49918442748472447, 0.5607539650159545, 0.30449444301203676, 0.3557284382625385, 0.17941550365856182, 0.41509433962264153, 0.03296304587807533, 0.14735331485711653, 0.6091311499525255, 0.18503118503118504, 0.07787058402938023, 0.39652961862083463, 0.22788849059681482, 0.0, 0.0, 0.027110556850837717, 0.0, 0.002732022175712163, 0.024375870566805957, 0.07680547911698159, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.004103612618968769, 0.0, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10246753246753247, 0.0, 0.0, 0.41513391416585993, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7677441769259625, 0.6138624567474048, 0.3117757398916215, 0.9625076195980019, 0.26244165870848857, 0.8448490275083321, 0.03303163920476522, 0.731455460427423, 0.7302035156343932, 0.2814444557788432, 0.15477036367159744, 0.9089062130876447, 0.24370527318986718, nan, 0.0, 0.030235839548478128, nan, 0.003031377415351888, 0.03332600893576503, 0.26704938074735557, nan, nan, nan, 0.0, nan, nan, nan, 0.004671748217359233, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.20398138572905894, nan, nan, 0.5283096350290026, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0498 | 780.0 | 7800 | 3.8291 | 0.0735 | 0.2087 | 0.4229 | [0.5022543260390749, 0.5590297908158226, 0.3024086062452399, 0.35384731930317326, 0.17970538083470872, 0.4098139400366336, 0.023894899536321484, 0.14839261949187113, 0.6052715846775808, 0.18922623535055857, 0.07586686914705137, 0.39468503136262684, 0.23252584662407777, 0.0, 0.0, 0.029775240297752404, 0.0, 0.0026618211957474616, 0.023192630177408326, 0.07676828352605497, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.004621764045267371, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10620915032679738, 0.0, 0.0, 0.4127314166899916, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7650223374143164, 0.6119960132390552, 0.30895512018896765, 0.9614151473650044, 0.26507794531050344, 0.845843903894941, 0.023918929372630928, 0.7250833278870662, 0.7272457657686543, 0.2816484749566459, 0.15489955429235838, 0.9168627450980392, 0.24793437656889525, nan, 0.0, 0.033219109050594636, nan, 0.0029604680021272822, 0.031470494885129034, 0.2717269972891086, nan, nan, nan, 0.0, nan, nan, nan, 0.005261863781657241, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.21845915201654603, nan, nan, 0.529849597043273, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0449 | 790.0 | 7900 | 3.8492 | 0.0732 | 0.2075 | 0.4223 | [0.5014079093590461, 0.5596447135921497, 0.3024270818415543, 0.36038014519929984, 0.17921434098176564, 0.41699855735440994, 0.021078749652165846, 0.14494021580635755, 0.6073220366303318, 0.1779608650875386, 0.07636588763349327, 0.3933589124924219, 0.22892724126105576, 0.0, 0.0, 0.02855013993384945, 0.0, 0.0026251340579130185, 0.02343974175035868, 0.07456745880210322, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0045611189945266575, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.09872448979591837, 0.0, 0.0, 0.41213113444874816, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.767430856324387, 0.6134769444862344, 0.30809365013199946, 0.96002184944466, 0.2640288108464363, 0.8411431129682136, 0.02109538175910884, 0.7308345859747729, 0.7248170443341857, 0.26440885443231665, 0.15935663070861056, 0.9135553980628396, 0.2442541206102843, nan, 0.0, 0.03166700262043943, nan, 0.0029072859422088282, 0.03190995873922703, 0.26986658161909316, nan, nan, nan, 0.0, nan, nan, nan, 0.0051635111876075735, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.20010341261633918, nan, nan, 0.5297982649761306, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0342 | 800.0 | 8000 | 3.8271 | 0.0744 | 0.2072 | 0.4220 | [0.501232010557704, 0.5516945572424972, 0.30339537001175504, 0.3561498486096415, 0.17926212776450925, 0.41521929122712, 0.022357896363847135, 0.14456232554795104, 0.6096779598538923, 0.1801962533452275, 0.07653157994562812, 0.39338028513437506, 0.23253295537338806, 0.0, 0.0, 0.028717201166180758, 0.0, 0.0026822920824485495, 0.02444039190445541, 0.07505241709240174, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.004311856875584659, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.10147744945567652, 0.0, 0.0, 0.41222728555873694, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.767430856324387, 0.601685910937265, 0.3102056412394053, 0.9619138846887642, 0.2639985473522805, 0.8389792568273392, 0.022379515742244914, 0.7226651852820077, 0.7291534743541445, 0.2678771804549628, 0.1563852464311091, 0.9120623671155209, 0.24849598788281044, nan, 0.0, 0.03176778875226769, nan, 0.002978195355433434, 0.033374838252887035, 0.27018551001966723, nan, nan, nan, 0.0, nan, nan, nan, 0.004835669207442013, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.20243019648397104, nan, nan, 0.5247163903290385, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
### Framework versions
- Transformers 4.29.2
- Pytorch 2.0.1
- Datasets 2.12.0
- Tokenizers 0.13.3
|
bestemoon/ppo-LunarLander-v2 | bestemoon | 2023-06-08T20:01:50Z | 0 | 0 | stable-baselines3 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-03-22T13:53:56Z | ---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 253.68 +/- 33.16
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# The checkpoint filename inside the repo is an assumption; adjust if it differs.
checkpoint = load_from_hub(repo_id="bestemoon/ppo-LunarLander-v2", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
|
Dantenator/Reinforce-Cartpole-v1 | Dantenator | 2023-06-08T14:28:06Z | 0 | 0 | null | [
"CartPole-v1",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-08T14:27:58Z | ---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-Cartpole-v1
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 500.00 +/- 0.00
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1** .
To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
Parthi/a2c-AntBulletEnv-v0 | Parthi | 2023-06-08T12:45:48Z | 1 | 0 | stable-baselines3 | [
"stable-baselines3",
"AntBulletEnv-v0",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-08T12:35:35Z | ---
library_name: stable-baselines3
tags:
- AntBulletEnv-v0
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: A2C
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: AntBulletEnv-v0
type: AntBulletEnv-v0
metrics:
- type: mean_reward
value: 1896.18 +/- 226.38
name: mean_reward
verified: false
---
# **A2C** Agent playing **AntBulletEnv-v0**
This is a trained model of a **A2C** agent playing **AntBulletEnv-v0**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the checkpoint filename inside the repo is an assumption):
```python
from stable_baselines3 import A2C
from huggingface_sb3 import load_from_hub

# Download the checkpoint from the Hub and restore the A2C policy.
checkpoint = load_from_hub(repo_id="Parthi/a2c-AntBulletEnv-v0", filename="a2c-AntBulletEnv-v0.zip")
model = A2C.load(checkpoint)
```
|
VaianiLorenzo/ViPER-V | VaianiLorenzo | 2023-06-08T09:35:13Z | 0 | 0 | null | [
"region:us"
] | null | 2023-06-08T09:18:43Z | # ViPER-VATF
## (Vision Audio Text FAU)
This repository contains the checkpoints for the ViPER model.
It is a Perceiver-based model fine-tuned on the concatenation of visual, acoustic, textual and FAU-related features.
For more information on how to use this model, please refer to the following [repository](https://github.com/VaianiLorenzo/ViPER).
If you find this useful, please cite:
```
@inproceedings{vaiani2022viper,
title={ViPER: Video-based Perceiver for Emotion Recognition},
author={Vaiani, Lorenzo and La Quatra, Moreno and Cagliero, Luca and Garza, Paolo},
booktitle={Proceedings of the 3rd International on Multimodal Sentiment Analysis Workshop and Challenge},
pages={67--73},
year={2022}
}
```
For any other questions, feel free to contact me at [email protected]
|
SiraH/ppo-LunarLander-v2 | SiraH | 2023-06-07T09:58:55Z | 2 | 0 | stable-baselines3 | [
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] | reinforcement-learning | 2023-06-07T09:58:37Z | ---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 257.01 +/- 13.60
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch — the checkpoint filename inside the repo is assumed:
```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# Download the checkpoint from the Hub and restore the PPO policy.
checkpoint = load_from_hub(repo_id="SiraH/ppo-LunarLander-v2", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
|
ChristianMDahl/segFormer-b3-version2-horizontal | ChristianMDahl | 2023-06-06T17:39:51Z | 31 | 0 | transformers | [
"transformers",
"tf",
"segformer",
"generated_from_keras_callback",
"license:other",
"endpoints_compatible",
"region:us"
] | null | 2023-06-06T05:50:31Z | ---
license: other
tags:
- generated_from_keras_callback
model-index:
- name: ChristianMDahl/segFormer-b3-version2-horizontal
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# ChristianMDahl/segFormer-b3-version2-horizontal
This model is a fine-tuned version of [ChristianMDahl/segFormer-b3-horizontal](https://huggingface.co/ChristianMDahl/segFormer-b3-horizontal) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1370
- Validation Loss: 0.1425
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 6e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
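For reference, a minimal Keras sketch of the optimizer configuration listed above (only the optimizer is shown; the loss, data pipeline and other training details are not specified in this card):
```python
import tensorflow as tf

# Mirrors the Adam settings from the hyperparameter list above.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=6e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
```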
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.1527 | 0.1461 | 0 |
| 0.1423 | 0.1400 | 1 |
| 0.1370 | 0.1425 | 2 |
### Framework versions
- Transformers 4.28.1
- TensorFlow 2.10.1
- Datasets 2.12.0
- Tokenizers 0.13.0.dev0
|
mnavas/roberta-finetuned-token-reqadjinsiders | mnavas | 2023-06-06T12:44:35Z | 24 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"roberta",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2023-06-05T13:45:14Z | ---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: roberta-finetuned-token-reqadjinsiders
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-finetuned-token-reqadjinsiders
This model is a fine-tuned version of [PlanTL-GOB-ES/roberta-base-bne](https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0597
- F1 B-cadj: 0
- F1 I-cadj: 0.4734
- F1 B-peso: 0
- F1 I-peso: 0
- Macro F1: 0.1183
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 200
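For reference, a hedged sketch of `TrainingArguments` matching the values above (the output directory and any options not listed here are assumptions):
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="roberta-finetuned-token-reqadjinsiders",  # assumed
    learning_rate=2e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=200,
)
```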
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 B-cadj | F1 I-cadj | F1 B-peso | F1 I-peso | Macro F1 |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:---------:|:---------:|:---------:|:--------:|
| 0.2188 | 1.0 | 10 | 0.0758 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0711 | 2.0 | 20 | 0.0678 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0656 | 3.0 | 30 | 0.0639 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0634 | 4.0 | 40 | 0.0629 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0624 | 5.0 | 50 | 0.0615 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0602 | 6.0 | 60 | 0.0598 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0573 | 7.0 | 70 | 0.0628 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0537 | 8.0 | 80 | 0.0531 | 0 | 0.1373 | 0 | 0 | 0.0343 |
| 0.0595 | 9.0 | 90 | 0.0565 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0435 | 10.0 | 100 | 0.0659 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0507 | 11.0 | 110 | 0.0558 | 0 | 0.4549 | 0 | 0 | 0.1137 |
| 0.0313 | 12.0 | 120 | 0.0561 | 0 | 0.3955 | 0 | 0 | 0.0989 |
| 0.0278 | 13.0 | 130 | 0.0629 | 0 | 0.3756 | 0 | 0 | 0.0939 |
| 0.0248 | 14.0 | 140 | 0.0634 | 0 | 0.3726 | 0 | 0 | 0.0932 |
| 0.0282 | 15.0 | 150 | 0.0607 | 0 | 0.3303 | 0 | 0 | 0.0826 |
| 0.0302 | 16.0 | 160 | 0.0628 | 0 | 0.4428 | 0 | 0 | 0.1107 |
| 0.0205 | 17.0 | 170 | 0.0551 | 0 | 0.3855 | 0 | 0 | 0.0964 |
| 0.0186 | 18.0 | 180 | 0.0627 | 0 | 0.4419 | 0 | 0 | 0.1105 |
| 0.0171 | 19.0 | 190 | 0.0721 | 0 | 0.3524 | 0 | 0 | 0.0881 |
| 0.0152 | 20.0 | 200 | 0.0574 | 0 | 0.3281 | 0 | 0 | 0.0820 |
| 0.0152 | 21.0 | 210 | 0.0597 | 0 | 0.1515 | 0 | 0 | 0.0379 |
| 0.0157 | 22.0 | 220 | 0.0675 | 0 | 0.3633 | 0 | 0 | 0.0908 |
| 0.0135 | 23.0 | 230 | 0.0728 | 0 | 0.3135 | 0 | 0 | 0.0784 |
| 0.0128 | 24.0 | 240 | 0.0703 | 0 | 0.4114 | 0 | 0 | 0.1028 |
| 0.0126 | 25.0 | 250 | 0.0605 | 0 | 0.3695 | 0 | 0 | 0.0924 |
| 0.0228 | 26.0 | 260 | 0.0490 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0266 | 27.0 | 270 | 0.0819 | 0 | 0.2214 | 0 | 0 | 0.0554 |
| 0.0512 | 28.0 | 280 | 0.0598 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0625 | 29.0 | 290 | 0.0595 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0601 | 30.0 | 300 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 31.0 | 310 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 32.0 | 320 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 33.0 | 330 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0599 | 34.0 | 340 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0601 | 35.0 | 350 | 0.0599 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 36.0 | 360 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 37.0 | 370 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 38.0 | 380 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 39.0 | 390 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 40.0 | 400 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 41.0 | 410 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 42.0 | 420 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 43.0 | 430 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 44.0 | 440 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 45.0 | 450 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 46.0 | 460 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 47.0 | 470 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 48.0 | 480 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0596 | 49.0 | 490 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 50.0 | 500 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 51.0 | 510 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 52.0 | 520 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 53.0 | 530 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 54.0 | 540 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 55.0 | 550 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0601 | 56.0 | 560 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 57.0 | 570 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 58.0 | 580 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 59.0 | 590 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 60.0 | 600 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 61.0 | 610 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 62.0 | 620 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 63.0 | 630 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 64.0 | 640 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 65.0 | 650 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0581 | 66.0 | 660 | 0.0580 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0503 | 67.0 | 670 | 0.0608 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0338 | 68.0 | 680 | 0.0653 | 0 | 0.2763 | 0 | 0 | 0.0691 |
| 0.0311 | 69.0 | 690 | 0.0727 | 0 | 0.3019 | 0 | 0 | 0.0755 |
| 0.0305 | 70.0 | 700 | 0.0683 | 0 | 0.3515 | 0 | 0 | 0.0879 |
| 0.0236 | 71.0 | 710 | 0.0757 | 0 | 0.2626 | 0 | 0 | 0.0656 |
| 0.0261 | 72.0 | 720 | 0.0597 | 0 | 0.4734 | 0 | 0 | 0.1183 |
| 0.0242 | 73.0 | 730 | 0.0621 | 0 | 0.4411 | 0 | 0 | 0.1103 |
| 0.0293 | 74.0 | 740 | 0.0722 | 0 | 0.3305 | 0 | 0 | 0.0826 |
| 0.0465 | 75.0 | 750 | 0.0600 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0598 | 76.0 | 760 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 77.0 | 770 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 78.0 | 780 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0602 | 79.0 | 790 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0598 | 80.0 | 800 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 81.0 | 810 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 82.0 | 820 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 83.0 | 830 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 84.0 | 840 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 85.0 | 850 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 86.0 | 860 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 87.0 | 870 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 88.0 | 880 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.06 | 89.0 | 890 | 0.0595 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 90.0 | 900 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 91.0 | 910 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 92.0 | 920 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 93.0 | 930 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 94.0 | 940 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 95.0 | 950 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 96.0 | 960 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 97.0 | 970 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0596 | 98.0 | 980 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 99.0 | 990 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 100.0 | 1000 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0596 | 101.0 | 1010 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 102.0 | 1020 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 103.0 | 1030 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 104.0 | 1040 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 105.0 | 1050 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 106.0 | 1060 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 107.0 | 1070 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0599 | 108.0 | 1080 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0598 | 109.0 | 1090 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 110.0 | 1100 | 0.0594 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 111.0 | 1110 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0596 | 112.0 | 1120 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 113.0 | 1130 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 114.0 | 1140 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 115.0 | 1150 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 116.0 | 1160 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 117.0 | 1170 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 118.0 | 1180 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 119.0 | 1190 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 120.0 | 1200 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 121.0 | 1210 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 122.0 | 1220 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 123.0 | 1230 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 124.0 | 1240 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 125.0 | 1250 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 126.0 | 1260 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 127.0 | 1270 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 128.0 | 1280 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 129.0 | 1290 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 130.0 | 1300 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 131.0 | 1310 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 132.0 | 1320 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 133.0 | 1330 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 134.0 | 1340 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0596 | 135.0 | 1350 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 136.0 | 1360 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0596 | 137.0 | 1370 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 138.0 | 1380 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 139.0 | 1390 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 140.0 | 1400 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 141.0 | 1410 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 142.0 | 1420 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 143.0 | 1430 | 0.0593 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 144.0 | 1440 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 145.0 | 1450 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 146.0 | 1460 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 147.0 | 1470 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 148.0 | 1480 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 149.0 | 1490 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 150.0 | 1500 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 151.0 | 1510 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 152.0 | 1520 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 153.0 | 1530 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 154.0 | 1540 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 155.0 | 1550 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 156.0 | 1560 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 157.0 | 1570 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 158.0 | 1580 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 159.0 | 1590 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 160.0 | 1600 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 161.0 | 1610 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 162.0 | 1620 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0594 | 163.0 | 1630 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 164.0 | 1640 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 165.0 | 1650 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 166.0 | 1660 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 167.0 | 1670 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 168.0 | 1680 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0597 | 169.0 | 1690 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 170.0 | 1700 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 171.0 | 1710 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 172.0 | 1720 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 173.0 | 1730 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 174.0 | 1740 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 175.0 | 1750 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0589 | 176.0 | 1760 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 177.0 | 1770 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 178.0 | 1780 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 179.0 | 1790 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 180.0 | 1800 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 181.0 | 1810 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 182.0 | 1820 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 183.0 | 1830 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 184.0 | 1840 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 185.0 | 1850 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 186.0 | 1860 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 187.0 | 1870 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0598 | 188.0 | 1880 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 189.0 | 1890 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0591 | 190.0 | 1900 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 191.0 | 1910 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0599 | 192.0 | 1920 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 193.0 | 1930 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0595 | 194.0 | 1940 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0593 | 195.0 | 1950 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 196.0 | 1960 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.059 | 197.0 | 1970 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0589 | 198.0 | 1980 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0592 | 199.0 | 1990 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
| 0.0589 | 200.0 | 2000 | 0.0592 | 0 | 0 | 0 | 0 | 0.0 |
### Framework versions
- Transformers 4.30.0.dev0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3
|
kejolong/xinyan | kejolong | 2023-06-05T14:10:56Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-06-05T14:07:25Z | ---
license: creativeml-openrail-m
---
|
facebook/fasttext-ru-vectors | facebook | 2023-06-03T22:15:42Z | 98 | 3 | fasttext | [
"fasttext",
"feature-extraction",
"ru",
"arxiv:1607.04606",
"arxiv:1802.06893",
"arxiv:1607.01759",
"arxiv:1612.03651",
"license:cc-by-sa-3.0",
"region:us"
] | feature-extraction | 2023-03-21T01:44:56Z |
---
license: cc-by-sa-3.0
tags:
- feature-extraction
library_name: fasttext
language: ru
widget:
- text: apple
example_title: apple
---
# fastText (Russian)
fastText is an open-source, free, lightweight library that allows users to learn text representations and text classifiers. It works on standard, generic hardware. Models can later be reduced in size to even fit on mobile devices. It was introduced in [this paper](https://arxiv.org/abs/1607.04606). The official website can be found [here](https://fasttext.cc/).
## Model description
fastText is a library for efficient learning of word representations and sentence classification. fastText is designed to be simple to use for developers, domain experts, and students. It's dedicated to text classification and learning word representations, and was designed to allow for quick model iteration and refinement without specialized hardware. fastText models can be trained on more than a billion words on any multicore CPU in less than a few minutes.
It includes pre-trained models learned on Wikipedia and in over 157 different languages. fastText can be used as a command line, linked to a C++ application, or used as a library for use cases from experimentation and prototyping to production.
## Intended uses & limitations
You can use pre-trained word vectors for text classification or language identification. See the [tutorials](https://fasttext.cc/docs/en/supervised-tutorial.html) and [resources](https://fasttext.cc/docs/en/english-vectors.html) on its official website to look for tasks that interest you.
### How to use
Here is how to load and use a pre-trained vectors
```python
>>> import fasttext
>>> from huggingface_hub import hf_hub_download
>>> model_path = hf_hub_download(repo_id="facebook/fasttext-ru-vectors", filename="model.bin")
>>> model = fasttext.load_model(model_path)
>>> model.words
['the', 'of', 'and', 'to', 'in', 'a', 'that', 'is', ...]
>>> len(model.words)
145940
>>> model['bread']
array([ 4.89417791e-01, 1.60882145e-01, -2.25947708e-01, -2.94273376e-01,
-1.04577184e-01, 1.17962055e-01, 1.34821936e-01, -2.41778508e-01, ...])
```
Here is how to use this model to query nearest neighbors of an English word vector:
```python
>>> import fasttext
>>> from huggingface_hub import hf_hub_download
>>> model_path = hf_hub_download(repo_id="facebook/fasttext-en-nearest-neighbors", filename="model.bin")
>>> model = fasttext.load_model(model_path)
>>> model.get_nearest_neighbors("bread", k=5)
[(0.5641006231307983, 'butter'),
(0.48875734210014343, 'loaf'),
(0.4491206705570221, 'eat'),
(0.42444291710853577, 'food'),
(0.4229326844215393, 'cheese')]
```
Here is how to use this model to detect the language of a given text:
```python
>>> import fasttext
>>> from huggingface_hub import hf_hub_download
>>> model_path = hf_hub_download(repo_id="facebook/fasttext-language-identification", filename="model.bin")
>>> model = fasttext.load_model(model_path)
>>> model.predict("Hello, world!")
(('__label__eng_Latn',), array([0.81148803]))
>>> model.predict("Hello, world!", k=5)
(('__label__eng_Latn', '__label__vie_Latn', '__label__nld_Latn', '__label__pol_Latn', '__label__deu_Latn'),
array([0.61224753, 0.21323682, 0.09696738, 0.01359863, 0.01319415]))
```
### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions.
Cosine similarity can be used to measure the similarity between two different word vectors. If two vectors are identical, the cosine similarity will be 1. For two completely unrelated vectors, the value will be 0. If two vectors have an opposite relationship, the value will be -1.
```python
>>> import numpy as np
>>> def cosine_similarity(word1, word2):
...     return np.dot(model[word1], model[word2]) / (np.linalg.norm(model[word1]) * np.linalg.norm(model[word2]))
>>> cosine_similarity("man", "boy")
0.061653383
>>> cosine_similarity("man", "ceo")
0.11989131
>>> cosine_similarity("woman", "ceo")
-0.08834904
```
## Training data
Pre-trained word vectors for 157 languages were trained on [Common Crawl](http://commoncrawl.org/) and [Wikipedia](https://www.wikipedia.org/) using fastText. These models were trained using CBOW with position-weights, in dimension 300, with character n-grams of length 5, a window of size 5 and 10 negatives. We also distribute three new word analogy datasets, for French, Hindi and Polish.
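As a rough illustration, training vectors with these settings via the fastText Python API would look roughly like the sketch below; the input file name is a placeholder, and the position-weighted CBOW variant used by the authors is not exposed by the standard API, so plain CBOW is shown:
```python
import fasttext

# Settings taken from the paragraph above; "corpus.txt" is a placeholder.
model = fasttext.train_unsupervised(
    "corpus.txt",
    model="cbow",
    dim=300,   # vector dimension
    minn=5,    # character n-grams of length 5
    maxn=5,
    ws=5,      # window size
    neg=10,    # number of negatives
)
```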
## Training procedure
### Tokenization
We used the [Stanford word segmenter](https://nlp.stanford.edu/software/segmenter.html) for Chinese, [Mecab](http://taku910.github.io/mecab/) for Japanese and [UETsegmenter](https://github.com/phongnt570/UETsegmenter) for Vietnamese. For languages using the Latin, Cyrillic, Hebrew or Greek scripts, we used the tokenizer from the [Europarl](https://www.statmt.org/europarl/) preprocessing tools. For the remaining languages, we used the ICU tokenizer.
More information about the training of these models can be found in the article [Learning Word Vectors for 157 Languages](https://arxiv.org/abs/1802.06893).
### License
The word vectors are distributed under the [*Creative Commons Attribution-Share-Alike License 3.0*](https://creativecommons.org/licenses/by-sa/3.0/).
### Evaluation datasets
The analogy evaluation datasets described in the paper are available here: [French](https://dl.fbaipublicfiles.com/fasttext/word-analogies/questions-words-fr.txt), [Hindi](https://dl.fbaipublicfiles.com/fasttext/word-analogies/questions-words-hi.txt), [Polish](https://dl.fbaipublicfiles.com/fasttext/word-analogies/questions-words-pl.txt).
### BibTeX entry and citation info
Please cite [1] if using this code for learning word representations or [2] if using for text classification.
[1] P. Bojanowski\*, E. Grave\*, A. Joulin, T. Mikolov, [*Enriching Word Vectors with Subword Information*](https://arxiv.org/abs/1607.04606)
```markup
@article{bojanowski2016enriching,
title={Enriching Word Vectors with Subword Information},
author={Bojanowski, Piotr and Grave, Edouard and Joulin, Armand and Mikolov, Tomas},
journal={arXiv preprint arXiv:1607.04606},
year={2016}
}
```
[2] A. Joulin, E. Grave, P. Bojanowski, T. Mikolov, [*Bag of Tricks for Efficient Text Classification*](https://arxiv.org/abs/1607.01759)
```markup
@article{joulin2016bag,
title={Bag of Tricks for Efficient Text Classification},
author={Joulin, Armand and Grave, Edouard and Bojanowski, Piotr and Mikolov, Tomas},
journal={arXiv preprint arXiv:1607.01759},
year={2016}
}
```
[3] A. Joulin, E. Grave, P. Bojanowski, M. Douze, H. Jégou, T. Mikolov, [*FastText.zip: Compressing text classification models*](https://arxiv.org/abs/1612.03651)
```markup
@article{joulin2016fasttext,
title={FastText.zip: Compressing text classification models},
author={Joulin, Armand and Grave, Edouard and Bojanowski, Piotr and Douze, Matthijs and J{\'e}gou, Herv{\'e} and Mikolov, Tomas},
journal={arXiv preprint arXiv:1612.03651},
year={2016}
}
```
If you use these word vectors, please cite the following paper:
[4] E. Grave\*, P. Bojanowski\*, P. Gupta, A. Joulin, T. Mikolov, [*Learning Word Vectors for 157 Languages*](https://arxiv.org/abs/1802.06893)
```markup
@inproceedings{grave2018learning,
title={Learning Word Vectors for 157 Languages},
author={Grave, Edouard and Bojanowski, Piotr and Gupta, Prakhar and Joulin, Armand and Mikolov, Tomas},
booktitle={Proceedings of the International Conference on Language Resources and Evaluation (LREC 2018)},
year={2018}
}
```
(\* These authors contributed equally.)
|
prognosis/BLOOM-alpaca-lora_qa | prognosis | 2023-06-02T18:23:36Z | 0 | 0 | null | [
"pytorch",
"generated_from_trainer",
"license:bigscience-bloom-rail-1.0",
"region:us"
] | null | 2023-06-02T13:49:03Z | ---
license: bigscience-bloom-rail-1.0
tags:
- generated_from_trainer
model-index:
- name: BLOOM-alpaca-lora_qa
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# BLOOM-alpaca-lora_qa
This model is a fine-tuned version of [bigscience/bloom-3b](https://huggingface.co/bigscience/bloom-3b) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 32
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 3
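As a sketch only, these settings map roughly onto the following `TrainingArguments`; the Adam optimizer and linear scheduler are `Trainer` defaults, and the actual training script (including the LoRA configuration) is not published on this card:
```python
from transformers import TrainingArguments

# Rough mapping of the hyperparameters listed above; output_dir is a placeholder.
args = TrainingArguments(
    output_dir="BLOOM-alpaca-lora_qa",
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=32,  # 4 * 32 = 128 effective train batch size
    warmup_steps=100,
    num_train_epochs=3,
    seed=42,
)
```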
### Training results
### Framework versions
- Transformers 4.30.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
aneeshmb02/marian-finetuned-kde4-en-to-fr | aneeshmb02 | 2023-06-02T06:12:48Z | 103 | 0 | transformers | [
"transformers",
"pytorch",
"marian",
"text2text-generation",
"translation",
"generated_from_trainer",
"dataset:kde4",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | translation | 2023-06-01T09:53:24Z | ---
license: apache-2.0
tags:
- translation
- generated_from_trainer
datasets:
- kde4
model-index:
- name: marian-finetuned-kde4-en-to-fr
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# marian-finetuned-kde4-en-to-fr
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset.
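Given the `translation` pipeline tag, the checkpoint can presumably be used like any Marian model; a minimal usage sketch (the example sentence is illustrative):
```python
from transformers import pipeline

# Minimal sketch: English-to-French translation with this fine-tuned checkpoint.
translator = pipeline("translation", model="aneeshmb02/marian-finetuned-kde4-en-to-fr")
print(translator("Unable to import the calendar file into the default folder."))
```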
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP
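As a sketch only, these settings correspond roughly to the following `Seq2SeqTrainingArguments`; "Native AMP" maps to `fp16=True`, and the optimizer and scheduler are `Trainer` defaults:
```python
from transformers import Seq2SeqTrainingArguments

# Rough mapping of the hyperparameters above; output_dir is a placeholder.
args = Seq2SeqTrainingArguments(
    output_dir="marian-finetuned-kde4-en-to-fr",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    num_train_epochs=1,
    seed=42,
    fp16=True,  # "Native AMP" mixed-precision training
)
```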
### Training results
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
|
RedRayz/MyLora | RedRayz | 2023-05-31T10:42:55Z | 0 | 8 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-04-02T13:40:17Z | ---
license: creativeml-openrail-m
---
# LoRA storage (also used for checkpoints)
Models that are likely to be in low demand are uploaded here.
# Uploaded LoRAs
## Koikatsu! art style
File name: koikatsu.safetensors
No trigger word; just applying it is enough.
Applying it at a negative weight reduces the flat, solid-color look.
## Koikatsu! checkpoint
File name: KKMix.safetensors
Using this model for character LoRA training on Koikatsu! screenshots greatly reduces the influence of the game's art style.
Created for training use; it can also be used for generation as a novelty.
## High contrast
File name: high_contrast.safetensors
No trigger word; just applying it is enough.
Applying it at a negative weight lowers the contrast, though not as strongly as an image editor would.
## Stylish composition
File name: syaretakouzu.safetensors
No trigger word; just applying it is enough.
Applying it produces slightly more stylish compositions; backgrounds and framing feel a bit more artistic. |
Brazilia/Bombampublic | Brazilia | 2023-05-27T16:26:04Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-27T16:20:05Z | ---
license: creativeml-openrail-m
---
|
vnykr/q-Taxi-v3 | vnykr | 2023-05-27T10:59:29Z | 0 | 0 | null | [
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] | reinforcement-learning | 2023-05-27T10:59:27Z | ---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-Taxi-v3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.56 +/- 2.71
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
import gym  # or `import gymnasium as gym`; `load_from_hub` is the Deep RL course notebook helper

model = load_from_hub(repo_id="vnykr/q-Taxi-v3", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
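The pickled dictionary from the course notebooks typically also stores the Q-table itself; assuming it is available under a `qtable` key and that the environment follows the newer reset/step API, a greedy rollout could look roughly like this:
```python
import numpy as np

# Hedged sketch: act greedily with the downloaded Q-table.
state, info = env.reset()
done = False
while not done:
    action = int(np.argmax(model["qtable"][state]))  # pick the highest-value action
    state, reward, terminated, truncated, info = env.step(action)
    done = terminated or truncated
env.close()
```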
|
satyamverma/Pre-requisite_Model_2 | satyamverma | 2023-05-23T13:58:55Z | 9 | 0 | transformers | [
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2023-05-23T07:25:35Z | ---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: Pre-requisite_Model_2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Pre-requisite_Model_2
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7157
- Accuracy: 0.5741
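The card does not document what the labels mean, but given the `text-classification` pipeline tag the model can presumably be queried as follows (a minimal sketch; the example sentence is illustrative and the returned `LABEL_0`/`LABEL_1` semantics are not specified here):
```python
from transformers import pipeline

# Minimal usage sketch; label meanings are not documented on this card.
classifier = pipeline("text-classification", model="satyamverma/Pre-requisite_Model_2")
print(classifier("Knowledge of limits is required before studying derivatives."))
```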
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5839 | 1.0 | 648 | 0.6894 | 0.5702 |
| 0.5469 | 2.0 | 1296 | 0.7157 | 0.5741 |
| 0.5156 | 3.0 | 1944 | 0.7157 | 0.5741 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|