ppo-LunarLander-v2 / results.json
{"mean_reward": 269.976781, "std_reward": 12.080924502294422, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2024-02-21T05:31:32.786342"}