ppo-LunarLander-v2 / results.json
LunarLander model with PPO for RL tutorial
74ef372
{
  "mean_reward": 266.24234790916836,
  "std_reward": 12.581200596263752,
  "is_deterministic": true,
  "n_eval_episodes": 10,
  "eval_datetime": "2023-01-16T15:18:13.998719"
}
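A minimal sketch of how these results might be consumed with Python's standard `json` module; the leaderboard score formula (`mean_reward - std_reward`, used by the Hugging Face Deep RL course) is an assumption about how this file is used, and the JSON literal simply mirrors the file's contents.

```python
import json

# Parse the evaluation results (the literal below mirrors results.json).
results = json.loads(
    '{"mean_reward": 266.24234790916836, "std_reward": 12.581200596263752, '
    '"is_deterministic": true, "n_eval_episodes": 10, '
    '"eval_datetime": "2023-01-16T15:18:13.998719"}'
)

# Assumed scoring rule: the Deep RL course leaderboard ranks agents
# by mean_reward minus std_reward over the evaluation episodes.
score = results["mean_reward"] - results["std_reward"]

print(f"evaluated over {results['n_eval_episodes']} episodes")
print(f"mean reward: {results['mean_reward']:.2f} +/- {results['std_reward']:.2f}")
print(f"score (mean - std): {score:.2f}")
```

With these numbers the score comfortably clears the 200-point threshold that LunarLander-v2 treats as "solved".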