|
[2025-04-29 02:53:24,272][03043] Saving configuration to /content/train_dir/default_experiment/config.json... |
|
[2025-04-29 02:53:24,273][03043] Rollout worker 0 uses device cpu |
|
[2025-04-29 02:53:24,275][03043] Rollout worker 1 uses device cpu |
|
[2025-04-29 02:53:24,275][03043] Rollout worker 2 uses device cpu |
|
[2025-04-29 02:53:24,276][03043] Rollout worker 3 uses device cpu |
|
[2025-04-29 02:53:24,278][03043] Rollout worker 4 uses device cpu |
|
[2025-04-29 02:53:24,278][03043] Rollout worker 5 uses device cpu |
|
[2025-04-29 02:53:24,279][03043] Rollout worker 6 uses device cpu |
|
[2025-04-29 02:53:24,280][03043] Rollout worker 7 uses device cpu |
|
[2025-04-29 02:53:24,363][03043] Using GPUs [0] for process 0 (actually maps to GPUs [0]) |
|
[2025-04-29 02:53:24,364][03043] InferenceWorker_p0-w0: min num requests: 2 |
|
[2025-04-29 02:53:24,368][03043] Starting seed is not provided |
|
[2025-04-29 02:53:24,368][03043] Using GPUs [0] for process 0 (actually maps to GPUs [0]) |
|
[2025-04-29 02:53:24,369][03043] Initializing actor-critic model on device cuda:0 |
|
[2025-04-29 02:53:24,371][03043] RunningMeanStd input shape: (3, 72, 128) |
|
[2025-04-29 02:53:24,375][03043] RunningMeanStd input shape: (1,) |
|
[2025-04-29 02:53:24,387][03043] ConvEncoder: input_channels=3 |
|
[2025-04-29 02:53:24,666][03043] Conv encoder output size: 512 |
|
[2025-04-29 02:53:24,667][03043] Policy head output size: 512 |
|
[2025-04-29 02:53:24,723][03043] Created Actor Critic model with architecture: |
|
[2025-04-29 02:53:24,724][03043] ActorCriticSharedWeights(
  (obs_normalizer): ObservationNormalizer(
    (running_mean_std): RunningMeanStdDictInPlace(
      (running_mean_std): ModuleDict(
        (obs): RunningMeanStdInPlace()
      )
    )
  )
  (returns_normalizer): RecursiveScriptModule(original_name=RunningMeanStdInPlace)
  (encoder): VizdoomEncoder(
    (basic_encoder): ConvEncoder(
      (enc): RecursiveScriptModule(
        original_name=ConvEncoderImpl
        (conv_head): RecursiveScriptModule(
          original_name=Sequential
          (0): RecursiveScriptModule(original_name=Conv2d)
          (1): RecursiveScriptModule(original_name=ELU)
          (2): RecursiveScriptModule(original_name=Conv2d)
          (3): RecursiveScriptModule(original_name=ELU)
          (4): RecursiveScriptModule(original_name=Conv2d)
          (5): RecursiveScriptModule(original_name=ELU)
        )
        (mlp_layers): RecursiveScriptModule(
          original_name=Sequential
          (0): RecursiveScriptModule(original_name=Linear)
          (1): RecursiveScriptModule(original_name=ELU)
        )
      )
    )
  )
  (core): ModelCoreRNN(
    (core): GRU(512, 512)
  )
  (decoder): MlpDecoder(
    (mlp): Identity()
  )
  (critic_linear): Linear(in_features=512, out_features=1, bias=True)
  (action_parameterization): ActionParameterizationDefault(
    (distribution_linear): Linear(in_features=512, out_features=5, bias=True)
  )
)
|
[2025-04-29 02:53:25,079][03043] Using optimizer <class 'torch.optim.adam.Adam'> |
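
Note: below is a minimal PyTorch sketch of a network with the same layer shapes as the ActorCriticSharedWeights model printed above (conv encoder output 512, GRU(512, 512) core, a 512->1 value head, 512->5 action logits), plus an Adam optimizer as named in the log. The Conv2d kernel sizes/strides and the learning rate are assumptions, not values read from this log.

import torch
import torch.nn as nn

class TinyActorCritic(nn.Module):
    def __init__(self, num_actions: int = 5):
        super().__init__()
        # Conv stack for (3, 72, 128) observations; filter sizes/strides are assumed.
        self.conv_head = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=8, stride=4), nn.ELU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ELU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2), nn.ELU(),
        )
        with torch.no_grad():
            conv_out = self.conv_head(torch.zeros(1, 3, 72, 128)).flatten(1).shape[1]
        self.mlp_layers = nn.Sequential(nn.Linear(conv_out, 512), nn.ELU())  # encoder output size: 512
        self.core = nn.GRU(512, 512)                                         # recurrent core, as in the log
        self.critic_linear = nn.Linear(512, 1)                               # value head
        self.distribution_linear = nn.Linear(512, num_actions)               # action logits (5 actions)

    def forward(self, obs, rnn_state=None):
        x = self.mlp_layers(self.conv_head(obs).flatten(1))
        x, rnn_state = self.core(x.unsqueeze(0), rnn_state)
        x = x.squeeze(0)
        return self.distribution_linear(x), self.critic_linear(x), rnn_state

model = TinyActorCritic()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # Adam per the log; lr is an assumption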
|
[2025-04-29 02:53:29,567][03043] No checkpoints found |
|
[2025-04-29 02:53:29,568][03043] Did not load from checkpoint, starting from scratch! |
|
[2025-04-29 02:53:29,570][03043] Initialized policy 0 weights for model version 0 |
|
[2025-04-29 02:53:29,571][03043] LearnerWorker_p0 finished initialization! |
|
[2025-04-29 02:53:29,572][03043] Using GPUs [0] for process 0 (actually maps to GPUs [0]) |
|
[2025-04-29 02:53:29,622][03043] Fps is (10 sec: nan, 60 sec: nan, 300 sec: nan). Total num frames: 0. Throughput: 0: nan. Samples: 0. Policy #0 lag: (min: -1.0, avg: -1.0, max: -1.0) |
|
[2025-04-29 02:53:29,624][03043] Inference worker 0-0 is ready! |
|
[2025-04-29 02:53:29,626][03043] All inference workers are ready! Signal rollout workers to start! |
|
[2025-04-29 02:53:29,632][03043] Doom resolution: 160x120, resize resolution: (128, 72) |
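
Note: the resize logged above takes native 160x120 RGB Doom frames to 128x72, which matches the RunningMeanStd input shape (3, 72, 128) reported during model initialization. A minimal sketch of that step (the interpolation mode and the /255 scaling are assumptions):

import torch
import torch.nn.functional as F

frame = torch.randint(0, 256, (1, 3, 120, 160), dtype=torch.uint8)  # native frame: H=120, W=160
obs = F.interpolate(frame.float(), size=(72, 128), mode="nearest")  # resized to (1, 3, 72, 128)
obs = obs / 255.0                                                   # scale before running-mean-std normalization (assumed)
print(obs.shape)  # torch.Size([1, 3, 72, 128])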
|
[2025-04-29 02:53:29,996][03043] Decorrelating experience for 0 frames... |
|
[2025-04-29 02:53:30,233][03043] Decorrelating experience for 32 frames... |
|
[2025-04-29 02:53:30,522][03043] Decorrelating experience for 64 frames... |
|
[2025-04-29 02:53:30,798][03043] Decorrelating experience for 96 frames... |
|
[2025-04-29 02:53:31,117][03043] Decorrelating experience for 0 frames... |
|
[2025-04-29 02:53:31,373][03043] Decorrelating experience for 32 frames... |
|
[2025-04-29 02:53:31,647][03043] Decorrelating experience for 64 frames... |
|
[2025-04-29 02:53:32,024][03043] Decorrelating experience for 96 frames... |
|
[2025-04-29 02:53:32,505][03043] Decorrelating experience for 0 frames... |
|
[2025-04-29 02:53:32,873][03043] Decorrelating experience for 32 frames... |
|
[2025-04-29 02:53:33,299][03043] Decorrelating experience for 64 frames... |
|
[2025-04-29 02:53:33,775][03043] Decorrelating experience for 96 frames... |
|
[2025-04-29 02:53:34,249][03043] Decorrelating experience for 0 frames... |
|
[2025-04-29 02:53:34,483][03043] Decorrelating experience for 32 frames... |
|
[2025-04-29 02:53:34,759][03043] Decorrelating experience for 64 frames... |
|
[2025-04-29 02:53:35,035][03043] Decorrelating experience for 96 frames... |
|
[2025-04-29 02:53:35,351][03043] Decorrelating experience for 0 frames... |
|
[2025-04-29 02:53:35,599][03043] Decorrelating experience for 32 frames... |
|
[2025-04-29 02:53:35,868][03043] Decorrelating experience for 64 frames... |
|
[2025-04-29 02:53:36,149][03043] Decorrelating experience for 96 frames... |
|
[2025-04-29 02:53:36,468][03043] Decorrelating experience for 0 frames... |
|
[2025-04-29 02:53:36,718][03043] Decorrelating experience for 32 frames... |
|
[2025-04-29 02:53:36,991][03043] Decorrelating experience for 64 frames... |
|
[2025-04-29 02:53:37,273][03043] Decorrelating experience for 96 frames... |
|
[2025-04-29 02:53:37,600][03043] Decorrelating experience for 0 frames... |
|
[2025-04-29 02:53:37,846][03043] Decorrelating experience for 32 frames... |
|
[2025-04-29 02:53:38,192][03043] Decorrelating experience for 64 frames... |
|
[2025-04-29 02:53:38,639][03043] Decorrelating experience for 96 frames... |
|
[2025-04-29 02:53:39,140][03043] Decorrelating experience for 0 frames... |
|
[2025-04-29 02:53:39,386][03043] Decorrelating experience for 32 frames... |
|
[2025-04-29 02:53:39,666][03043] Decorrelating experience for 64 frames... |
|
[2025-04-29 02:53:39,962][03043] Decorrelating experience for 96 frames... |
|
[2025-04-29 02:53:40,999][03043] Fps is (10 sec: 0.0, 60 sec: 0.0, 300 sec: 0.0). Total num frames: 0. Throughput: 0: 0.0. Samples: 0. Policy #0 lag: (min: -1.0, avg: -1.0, max: -1.0) |
|
[2025-04-29 02:53:41,037][03043] Fps is (10 sec: 0.0, 60 sec: 0.0, 300 sec: 0.0). Total num frames: 0. Throughput: 0: 0.0. Samples: 0. Policy #0 lag: (min: -1.0, avg: -1.0, max: -1.0) |
|
[2025-04-29 02:53:44,359][03043] Fps is (10 sec: 1219.4, 60 sec: 277.9, 300 sec: 277.9). Total num frames: 4096. Throughput: 0: 43.4. Samples: 640. Policy #0 lag: (min: 0.0, avg: 0.0, max: 0.0) |
|
[2025-04-29 02:53:44,361][03043] Avg episode reward: [(0, '1.189')] |
|
[2025-04-29 02:53:44,453][03043] Heartbeat connected on Batcher_0 |
|
[2025-04-29 02:53:44,456][03043] Heartbeat connected on LearnerWorker_p0 |
|
[2025-04-29 02:53:44,457][03043] Heartbeat connected on InferenceWorker_p0-w0 |
|
[2025-04-29 02:53:44,458][03043] Heartbeat connected on RolloutWorker_w0 |
|
[2025-04-29 02:53:44,459][03043] Heartbeat connected on RolloutWorker_w1 |
|
[2025-04-29 02:53:44,460][03043] Heartbeat connected on RolloutWorker_w2 |
|
[2025-04-29 02:53:44,460][03043] Heartbeat connected on RolloutWorker_w3 |
|
[2025-04-29 02:53:44,461][03043] Heartbeat connected on RolloutWorker_w4 |
|
[2025-04-29 02:53:44,462][03043] Heartbeat connected on RolloutWorker_w5 |
|
[2025-04-29 02:53:44,463][03043] Heartbeat connected on RolloutWorker_w6 |
|
[2025-04-29 02:53:44,463][03043] Heartbeat connected on RolloutWorker_w7 |
|
[2025-04-29 02:53:48,861][03043] Fps is (10 sec: 2093.9, 60 sec: 851.6, 300 sec: 851.6). Total num frames: 16384. Throughput: 0: 164.7. Samples: 3168. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:53:48,862][03043] Avg episode reward: [(0, '2.979')] |
|
[2025-04-29 02:53:53,868][03043] Fps is (10 sec: 3015.0, 60 sec: 1351.4, 300 sec: 1351.4). Total num frames: 32768. Throughput: 0: 356.3. Samples: 8640. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:53:53,869][03043] Avg episode reward: [(0, '3.614')] |
|
[2025-04-29 02:53:58,866][03043] Fps is (10 sec: 3275.1, 60 sec: 1680.7, 300 sec: 1680.7). Total num frames: 49152. Throughput: 0: 388.4. Samples: 11360. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:53:58,868][03043] Avg episode reward: [(0, '4.245')] |
|
[2025-04-29 02:54:03,854][03043] Fps is (10 sec: 3281.6, 60 sec: 1914.5, 300 sec: 1914.5). Total num frames: 65536. Throughput: 0: 473.0. Samples: 16192. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:54:03,854][03043] Avg episode reward: [(0, '4.480')] |
|
[2025-04-29 02:54:08,856][03043] Fps is (10 sec: 3690.4, 60 sec: 2192.4, 300 sec: 2192.4). Total num frames: 86016. Throughput: 0: 549.7. Samples: 21568. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:54:08,857][03043] Avg episode reward: [(0, '4.491')] |
|
[2025-04-29 02:54:13,866][03043] Fps is (10 sec: 3681.6, 60 sec: 2314.4, 300 sec: 2314.4). Total num frames: 102400. Throughput: 0: 540.3. Samples: 23904. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:54:13,867][03043] Avg episode reward: [(0, '4.466')] |
|
[2025-04-29 02:54:13,873][03043] Saving new best policy, reward=4.466! |
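
Note: the "Saving new best policy" messages follow a simple keep-the-best pattern: whenever the average episode reward exceeds the best value seen so far, a separate best-policy snapshot is written. A small sketch of that pattern (not Sample Factory's actual bookkeeping):

import torch

best_reward = float("-inf")

def maybe_save_best(policy: torch.nn.Module, avg_episode_reward: float, path: str) -> None:
    """Save a snapshot only when the average episode reward improves on the best so far."""
    global best_reward
    if avg_episode_reward > best_reward:
        best_reward = avg_episode_reward
        torch.save(policy.state_dict(), path)
        print(f"Saving new best policy, reward={avg_episode_reward:.3f}!")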
|
[2025-04-29 02:54:18,844][03043] Fps is (10 sec: 3280.5, 60 sec: 2413.2, 300 sec: 2413.2). Total num frames: 118784. Throughput: 0: 766.1. Samples: 28992. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:54:18,845][03043] Avg episode reward: [(0, '4.531')] |
|
[2025-04-29 02:54:18,882][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000029_118784.pth... |
|
[2025-04-29 02:54:18,934][03043] Saving new best policy, reward=4.531! |
|
[2025-04-29 02:54:23,854][03043] Fps is (10 sec: 3280.8, 60 sec: 2492.4, 300 sec: 2492.4). Total num frames: 135168. Throughput: 0: 796.7. Samples: 34112. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:54:23,855][03043] Avg episode reward: [(0, '4.567')] |
|
[2025-04-29 02:54:23,913][03043] Saving new best policy, reward=4.567! |
|
[2025-04-29 02:54:28,853][03043] Fps is (10 sec: 3273.9, 60 sec: 2558.6, 300 sec: 2558.6). Total num frames: 151552. Throughput: 0: 802.6. Samples: 36352. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:54:28,856][03043] Avg episode reward: [(0, '4.349')] |
|
[2025-04-29 02:54:33,865][03043] Fps is (10 sec: 3273.2, 60 sec: 3176.6, 300 sec: 2614.0). Total num frames: 167936. Throughput: 0: 856.8. Samples: 41728. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:54:33,866][03043] Avg episode reward: [(0, '4.367')] |
|
[2025-04-29 02:54:38,854][03043] Fps is (10 sec: 3276.5, 60 sec: 3188.0, 300 sec: 2662.3). Total num frames: 184320. Throughput: 0: 843.6. Samples: 46592. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:54:38,855][03043] Avg episode reward: [(0, '4.521')] |
|
[2025-04-29 02:54:43,865][03043] Fps is (10 sec: 3277.0, 60 sec: 3304.0, 300 sec: 2703.3). Total num frames: 200704. Throughput: 0: 839.9. Samples: 49152. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:54:43,866][03043] Avg episode reward: [(0, '4.638')] |
|
[2025-04-29 02:54:43,958][03043] Saving new best policy, reward=4.638! |
|
[2025-04-29 02:54:48,847][03043] Fps is (10 sec: 3689.1, 60 sec: 3414.2, 300 sec: 2791.8). Total num frames: 221184. Throughput: 0: 849.9. Samples: 54432. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:54:48,851][03043] Avg episode reward: [(0, '4.553')] |
|
[2025-04-29 02:54:53,865][03043] Fps is (10 sec: 3276.7, 60 sec: 3345.3, 300 sec: 2771.4). Total num frames: 233472. Throughput: 0: 835.4. Samples: 59168. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:54:53,866][03043] Avg episode reward: [(0, '4.481')] |
|
[2025-04-29 02:54:58,844][03043] Fps is (10 sec: 3277.7, 60 sec: 3414.6, 300 sec: 2846.3). Total num frames: 253952. Throughput: 0: 841.0. Samples: 61728. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:54:58,845][03043] Avg episode reward: [(0, '4.550')] |
|
[2025-04-29 02:55:03,841][03043] Fps is (10 sec: 3695.2, 60 sec: 3414.0, 300 sec: 2869.2). Total num frames: 270336. Throughput: 0: 844.1. Samples: 66976. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:55:03,842][03043] Avg episode reward: [(0, '4.602')] |
|
[2025-04-29 02:55:08,862][03043] Fps is (10 sec: 3270.8, 60 sec: 3344.7, 300 sec: 2889.1). Total num frames: 286720. Throughput: 0: 845.4. Samples: 72160. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:55:08,863][03043] Avg episode reward: [(0, '4.429')] |
|
[2025-04-29 02:55:13,847][03043] Fps is (10 sec: 3274.9, 60 sec: 3346.2, 300 sec: 2908.2). Total num frames: 303104. Throughput: 0: 854.9. Samples: 74816. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:55:13,848][03043] Avg episode reward: [(0, '4.380')] |
|
[2025-04-29 02:55:18,840][03043] Fps is (10 sec: 3284.1, 60 sec: 3345.3, 300 sec: 2925.2). Total num frames: 319488. Throughput: 0: 841.7. Samples: 79584. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:55:18,841][03043] Avg episode reward: [(0, '4.450')] |
|
[2025-04-29 02:55:18,881][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000078_319488.pth... |
|
[2025-04-29 02:55:23,857][03043] Fps is (10 sec: 3682.8, 60 sec: 3413.2, 300 sec: 2976.0). Total num frames: 339968. Throughput: 0: 854.0. Samples: 85024. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:55:23,858][03043] Avg episode reward: [(0, '4.396')] |
|
[2025-04-29 02:55:28,849][03043] Fps is (10 sec: 3683.2, 60 sec: 3413.6, 300 sec: 2988.8). Total num frames: 356352. Throughput: 0: 854.3. Samples: 87584. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:55:28,850][03043] Avg episode reward: [(0, '4.356')] |
|
[2025-04-29 02:55:33,859][03043] Fps is (10 sec: 3276.0, 60 sec: 3413.7, 300 sec: 3000.2). Total num frames: 372736. Throughput: 0: 844.6. Samples: 92448. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:55:33,860][03043] Avg episode reward: [(0, '4.289')] |
|
[2025-04-29 02:55:38,840][03043] Fps is (10 sec: 3279.6, 60 sec: 3414.1, 300 sec: 3011.3). Total num frames: 389120. Throughput: 0: 859.5. Samples: 97824. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:55:38,842][03043] Avg episode reward: [(0, '4.433')] |
|
[2025-04-29 02:55:43,851][03043] Fps is (10 sec: 3279.3, 60 sec: 3414.1, 300 sec: 3021.0). Total num frames: 405504. Throughput: 0: 856.7. Samples: 100288. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:55:43,853][03043] Avg episode reward: [(0, '4.437')] |
|
[2025-04-29 02:55:48,858][03043] Fps is (10 sec: 3270.9, 60 sec: 3344.4, 300 sec: 3030.0). Total num frames: 421888. Throughput: 0: 852.3. Samples: 105344. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:55:48,859][03043] Avg episode reward: [(0, '4.516')] |
|
[2025-04-29 02:55:53,844][03043] Fps is (10 sec: 3689.0, 60 sec: 3482.8, 300 sec: 3067.3). Total num frames: 442368. Throughput: 0: 853.7. Samples: 110560. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:55:53,845][03043] Avg episode reward: [(0, '4.444')] |
|
[2025-04-29 02:55:58,870][03043] Fps is (10 sec: 3272.8, 60 sec: 3343.6, 300 sec: 3046.3). Total num frames: 454656. Throughput: 0: 842.2. Samples: 112736. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:55:58,871][03043] Avg episode reward: [(0, '4.370')] |
|
[2025-04-29 02:56:03,868][03043] Fps is (10 sec: 3269.2, 60 sec: 3411.8, 300 sec: 3080.4). Total num frames: 475136. Throughput: 0: 854.9. Samples: 118080. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:56:03,869][03043] Avg episode reward: [(0, '4.490')] |
|
[2025-04-29 02:56:08,858][03043] Fps is (10 sec: 3691.0, 60 sec: 3413.6, 300 sec: 3086.7). Total num frames: 491520. Throughput: 0: 842.6. Samples: 122944. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:56:08,861][03043] Avg episode reward: [(0, '4.585')] |
|
[2025-04-29 02:56:13,868][03043] Fps is (10 sec: 3276.7, 60 sec: 3412.1, 300 sec: 3092.3). Total num frames: 507904. Throughput: 0: 843.0. Samples: 125536. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:56:13,869][03043] Avg episode reward: [(0, '4.577')] |
|
[2025-04-29 02:56:18,852][03043] Fps is (10 sec: 3278.6, 60 sec: 3412.6, 300 sec: 3098.1). Total num frames: 524288. Throughput: 0: 853.5. Samples: 130848. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:56:18,853][03043] Avg episode reward: [(0, '4.578')] |
|
[2025-04-29 02:56:18,894][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000128_524288.pth... |
|
[2025-04-29 02:56:18,982][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000029_118784.pth |
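
Note: the save/remove pair above shows the checkpoint rotation: periodic checkpoints land in checkpoint_p0/ as checkpoint_<version>_<env-frames>.pth and the oldest one is deleted, so only the most recent few are kept. Resuming therefore means loading the newest file; a sketch is below (the "model" key used for the state dict is an assumption about the checkpoint layout, not something this log shows):

import glob
import os
import torch

ckpt_dir = "/content/train_dir/default_experiment/checkpoint_p0"
checkpoints = sorted(glob.glob(os.path.join(ckpt_dir, "checkpoint_*.pth")))  # zero-padded names sort chronologically
if checkpoints:
    state = torch.load(checkpoints[-1], map_location="cpu")
    # model.load_state_dict(state["model"])  # "model" key is assumed; inspect state.keys() first
    print(f"Resuming from {checkpoints[-1]}")
else:
    print("No checkpoints found, starting from scratch")  # matches the message earlier in this log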
|
[2025-04-29 02:56:23,850][03043] Fps is (10 sec: 3282.8, 60 sec: 3345.5, 300 sec: 3103.2). Total num frames: 540672. Throughput: 0: 841.1. Samples: 135680. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:56:23,851][03043] Avg episode reward: [(0, '4.486')] |
|
[2025-04-29 02:56:28,850][03043] Fps is (10 sec: 3687.4, 60 sec: 3413.3, 300 sec: 3130.9). Total num frames: 561152. Throughput: 0: 846.3. Samples: 138368. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:56:28,851][03043] Avg episode reward: [(0, '4.588')] |
|
[2025-04-29 02:56:33,860][03043] Fps is (10 sec: 3682.5, 60 sec: 3413.3, 300 sec: 3134.7). Total num frames: 577536. Throughput: 0: 854.7. Samples: 143808. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:56:33,863][03043] Avg episode reward: [(0, '4.593')] |
|
[2025-04-29 02:56:38,857][03043] Fps is (10 sec: 3274.4, 60 sec: 3412.4, 300 sec: 3138.5). Total num frames: 593920. Throughput: 0: 846.7. Samples: 148672. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:56:38,858][03043] Avg episode reward: [(0, '4.471')] |
|
[2025-04-29 02:56:43,861][03043] Fps is (10 sec: 3276.4, 60 sec: 3412.8, 300 sec: 3142.0). Total num frames: 610304. Throughput: 0: 861.3. Samples: 151488. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:56:43,863][03043] Avg episode reward: [(0, '4.341')] |
|
[2025-04-29 02:56:48,837][03043] Fps is (10 sec: 3283.5, 60 sec: 3414.5, 300 sec: 3145.8). Total num frames: 626688. Throughput: 0: 848.9. Samples: 156256. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:56:48,838][03043] Avg episode reward: [(0, '4.453')] |
|
[2025-04-29 02:56:53,902][03043] Fps is (10 sec: 3263.5, 60 sec: 3341.8, 300 sec: 3148.0). Total num frames: 643072. Throughput: 0: 858.2. Samples: 161600. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:56:53,903][03043] Avg episode reward: [(0, '4.342')] |
|
[2025-04-29 02:56:58,847][03043] Fps is (10 sec: 3682.7, 60 sec: 3483.0, 300 sec: 3171.5). Total num frames: 663552. Throughput: 0: 862.3. Samples: 164320. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:56:58,848][03043] Avg episode reward: [(0, '4.447')] |
|
[2025-04-29 02:57:03,861][03043] Fps is (10 sec: 3701.8, 60 sec: 3413.7, 300 sec: 3173.7). Total num frames: 679936. Throughput: 0: 849.6. Samples: 169088. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:57:03,862][03043] Avg episode reward: [(0, '4.418')] |
|
[2025-04-29 02:57:08,864][03043] Fps is (10 sec: 3271.2, 60 sec: 3413.0, 300 sec: 3176.0). Total num frames: 696320. Throughput: 0: 860.2. Samples: 174400. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:57:08,865][03043] Avg episode reward: [(0, '4.453')] |
|
[2025-04-29 02:57:13,855][03043] Fps is (10 sec: 3278.7, 60 sec: 3414.1, 300 sec: 3178.4). Total num frames: 712704. Throughput: 0: 856.8. Samples: 176928. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:57:13,856][03043] Avg episode reward: [(0, '4.465')] |
|
[2025-04-29 02:57:18,860][03043] Fps is (10 sec: 3278.0, 60 sec: 3412.9, 300 sec: 3180.5). Total num frames: 729088. Throughput: 0: 846.9. Samples: 181920. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:57:18,862][03043] Avg episode reward: [(0, '4.620')] |
|
[2025-04-29 02:57:18,900][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000178_729088.pth... |
|
[2025-04-29 02:57:18,954][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000078_319488.pth |
|
[2025-04-29 02:57:23,886][03043] Fps is (10 sec: 3266.8, 60 sec: 3411.3, 300 sec: 3182.2). Total num frames: 745472. Throughput: 0: 855.6. Samples: 187200. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:57:23,887][03043] Avg episode reward: [(0, '4.543')] |
|
[2025-04-29 02:57:28,858][03043] Fps is (10 sec: 3277.7, 60 sec: 3344.6, 300 sec: 3184.5). Total num frames: 761856. Throughput: 0: 843.4. Samples: 189440. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:57:28,859][03043] Avg episode reward: [(0, '4.612')] |
|
[2025-04-29 02:57:33,858][03043] Fps is (10 sec: 3696.7, 60 sec: 3413.5, 300 sec: 3203.2). Total num frames: 782336. Throughput: 0: 852.9. Samples: 194656. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:57:33,859][03043] Avg episode reward: [(0, '4.838')] |
|
[2025-04-29 02:57:33,894][03043] Saving new best policy, reward=4.838! |
|
[2025-04-29 02:57:38,857][03043] Fps is (10 sec: 3276.9, 60 sec: 3345.1, 300 sec: 3188.2). Total num frames: 794624. Throughput: 0: 842.8. Samples: 199488. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:57:38,859][03043] Avg episode reward: [(0, '5.007')] |
|
[2025-04-29 02:57:38,920][03043] Saving new best policy, reward=5.007! |
|
[2025-04-29 02:57:43,862][03043] Fps is (10 sec: 3275.3, 60 sec: 3413.3, 300 sec: 3206.0). Total num frames: 815104. Throughput: 0: 833.8. Samples: 201856. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:57:43,864][03043] Avg episode reward: [(0, '4.832')] |
|
[2025-04-29 02:57:48,859][03043] Fps is (10 sec: 3685.8, 60 sec: 3412.1, 300 sec: 3207.4). Total num frames: 831488. Throughput: 0: 845.5. Samples: 207136. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:57:48,860][03043] Avg episode reward: [(0, '4.933')] |
|
[2025-04-29 02:57:53,849][03043] Fps is (10 sec: 3281.0, 60 sec: 3416.3, 300 sec: 3208.9). Total num frames: 847872. Throughput: 0: 838.0. Samples: 212096. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:57:53,850][03043] Avg episode reward: [(0, '4.935')] |
|
[2025-04-29 02:57:58,846][03043] Fps is (10 sec: 3281.0, 60 sec: 3345.1, 300 sec: 3210.2). Total num frames: 864256. Throughput: 0: 844.3. Samples: 214912. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:57:58,848][03043] Avg episode reward: [(0, '5.234')] |
|
[2025-04-29 02:57:58,887][03043] Saving new best policy, reward=5.234! |
|
[2025-04-29 02:58:03,850][03043] Fps is (10 sec: 3276.8, 60 sec: 3345.7, 300 sec: 3211.3). Total num frames: 880640. Throughput: 0: 848.6. Samples: 220096. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:58:03,851][03043] Avg episode reward: [(0, '5.250')] |
|
[2025-04-29 02:58:03,899][03043] Saving new best policy, reward=5.250! |
|
[2025-04-29 02:58:08,902][03043] Fps is (10 sec: 3258.5, 60 sec: 3342.9, 300 sec: 3211.9). Total num frames: 897024. Throughput: 0: 841.6. Samples: 225088. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:58:08,904][03043] Avg episode reward: [(0, '5.477')] |
|
[2025-04-29 02:58:08,933][03043] Saving new best policy, reward=5.477! |
|
[2025-04-29 02:58:13,862][03043] Fps is (10 sec: 3681.7, 60 sec: 3412.9, 300 sec: 3227.9). Total num frames: 917504. Throughput: 0: 849.7. Samples: 227680. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:58:13,864][03043] Avg episode reward: [(0, '5.666')] |
|
[2025-04-29 02:58:13,870][03043] Saving new best policy, reward=5.666! |
|
[2025-04-29 02:58:18,839][03043] Fps is (10 sec: 3710.0, 60 sec: 3414.6, 300 sec: 3229.0). Total num frames: 933888. Throughput: 0: 841.6. Samples: 232512. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:58:18,840][03043] Avg episode reward: [(0, '5.589')] |
|
[2025-04-29 02:58:18,880][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000228_933888.pth... |
|
[2025-04-29 02:58:18,927][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000128_524288.pth |
|
[2025-04-29 02:58:23,866][03043] Fps is (10 sec: 3275.7, 60 sec: 3414.5, 300 sec: 3229.5). Total num frames: 950272. Throughput: 0: 852.5. Samples: 237856. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:58:23,867][03043] Avg episode reward: [(0, '5.781')] |
|
[2025-04-29 02:58:23,902][03043] Saving new best policy, reward=5.781! |
|
[2025-04-29 02:58:28,865][03043] Fps is (10 sec: 3268.4, 60 sec: 3412.9, 300 sec: 3358.0). Total num frames: 966656. Throughput: 0: 860.4. Samples: 240576. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:58:28,868][03043] Avg episode reward: [(0, '5.805')] |
|
[2025-04-29 02:58:28,915][03043] Saving new best policy, reward=5.805! |
|
[2025-04-29 02:58:33,855][03043] Fps is (10 sec: 3280.5, 60 sec: 3345.3, 300 sec: 3357.2). Total num frames: 983040. Throughput: 0: 849.2. Samples: 245344. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:58:33,856][03043] Avg episode reward: [(0, '5.898')] |
|
[2025-04-29 02:58:33,891][03043] Saving new best policy, reward=5.898! |
|
[2025-04-29 02:58:38,863][03043] Fps is (10 sec: 3686.9, 60 sec: 3481.3, 300 sec: 3393.6). Total num frames: 1003520. Throughput: 0: 860.2. Samples: 250816. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:58:38,864][03043] Avg episode reward: [(0, '6.047')] |
|
[2025-04-29 02:58:38,875][03043] Saving new best policy, reward=6.047! |
|
[2025-04-29 02:58:43,848][03043] Fps is (10 sec: 3278.9, 60 sec: 3345.9, 300 sec: 3388.0). Total num frames: 1015808. Throughput: 0: 849.0. Samples: 253120. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:58:43,849][03043] Avg episode reward: [(0, '6.212')] |
|
[2025-04-29 02:58:43,882][03043] Saving new best policy, reward=6.212! |
|
[2025-04-29 02:58:48,865][03043] Fps is (10 sec: 3276.2, 60 sec: 3413.0, 300 sec: 3401.8). Total num frames: 1036288. Throughput: 0: 845.2. Samples: 258144. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:58:48,866][03043] Avg episode reward: [(0, '5.934')] |
|
[2025-04-29 02:58:53,869][03043] Fps is (10 sec: 3678.8, 60 sec: 3412.2, 300 sec: 3401.7). Total num frames: 1052672. Throughput: 0: 856.8. Samples: 263616. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:58:53,870][03043] Avg episode reward: [(0, '6.050')] |
|
[2025-04-29 02:58:58,861][03043] Fps is (10 sec: 3278.2, 60 sec: 3412.5, 300 sec: 3401.7). Total num frames: 1069056. Throughput: 0: 845.5. Samples: 265728. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:58:58,862][03043] Avg episode reward: [(0, '6.047')] |
|
[2025-04-29 02:59:03,856][03043] Fps is (10 sec: 3281.1, 60 sec: 3413.0, 300 sec: 3387.9). Total num frames: 1085440. Throughput: 0: 860.1. Samples: 271232. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:59:03,857][03043] Avg episode reward: [(0, '6.451')] |
|
[2025-04-29 02:59:03,894][03043] Saving new best policy, reward=6.451! |
|
[2025-04-29 02:59:08,880][03043] Fps is (10 sec: 3270.6, 60 sec: 3414.6, 300 sec: 3387.7). Total num frames: 1101824. Throughput: 0: 851.6. Samples: 276192. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:59:08,881][03043] Avg episode reward: [(0, '6.928')] |
|
[2025-04-29 02:59:08,892][03043] Saving new best policy, reward=6.928! |
|
[2025-04-29 02:59:13,859][03043] Fps is (10 sec: 3685.2, 60 sec: 3413.5, 300 sec: 3401.6). Total num frames: 1122304. Throughput: 0: 846.3. Samples: 278656. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:59:13,860][03043] Avg episode reward: [(0, '7.405')] |
|
[2025-04-29 02:59:13,894][03043] Saving new best policy, reward=7.405! |
|
[2025-04-29 02:59:18,849][03043] Fps is (10 sec: 3697.8, 60 sec: 3412.8, 300 sec: 3401.8). Total num frames: 1138688. Throughput: 0: 859.8. Samples: 284032. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:59:18,850][03043] Avg episode reward: [(0, '7.473')] |
|
[2025-04-29 02:59:18,886][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000278_1138688.pth... |
|
[2025-04-29 02:59:18,938][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000178_729088.pth |
|
[2025-04-29 02:59:18,944][03043] Saving new best policy, reward=7.473! |
|
[2025-04-29 02:59:23,861][03043] Fps is (10 sec: 3276.3, 60 sec: 3413.6, 300 sec: 3401.7). Total num frames: 1155072. Throughput: 0: 844.8. Samples: 288832. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:59:23,862][03043] Avg episode reward: [(0, '7.521')] |
|
[2025-04-29 02:59:23,897][03043] Saving new best policy, reward=7.521! |
|
[2025-04-29 02:59:28,854][03043] Fps is (10 sec: 3275.0, 60 sec: 3413.9, 300 sec: 3401.9). Total num frames: 1171456. Throughput: 0: 851.1. Samples: 291424. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:59:28,856][03043] Avg episode reward: [(0, '7.514')] |
|
[2025-04-29 02:59:33,847][03043] Fps is (10 sec: 3281.2, 60 sec: 3413.7, 300 sec: 3401.8). Total num frames: 1187840. Throughput: 0: 860.1. Samples: 296832. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:59:33,849][03043] Avg episode reward: [(0, '7.382')] |
|
[2025-04-29 02:59:38,857][03043] Fps is (10 sec: 3275.9, 60 sec: 3345.4, 300 sec: 3401.9). Total num frames: 1204224. Throughput: 0: 848.6. Samples: 301792. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:59:38,858][03043] Avg episode reward: [(0, '7.501')] |
|
[2025-04-29 02:59:43,858][03043] Fps is (10 sec: 3682.5, 60 sec: 3481.0, 300 sec: 3401.6). Total num frames: 1224704. Throughput: 0: 859.8. Samples: 304416. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:59:43,859][03043] Avg episode reward: [(0, '8.024')] |
|
[2025-04-29 02:59:43,898][03043] Saving new best policy, reward=8.024! |
|
[2025-04-29 02:59:48,892][03043] Fps is (10 sec: 3673.4, 60 sec: 3411.8, 300 sec: 3415.3). Total num frames: 1241088. Throughput: 0: 844.8. Samples: 309280. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:59:48,895][03043] Avg episode reward: [(0, '8.432')] |
|
[2025-04-29 02:59:48,932][03043] Saving new best policy, reward=8.432! |
|
[2025-04-29 02:59:53,848][03043] Fps is (10 sec: 3280.2, 60 sec: 3414.5, 300 sec: 3401.7). Total num frames: 1257472. Throughput: 0: 853.9. Samples: 314592. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:59:53,849][03043] Avg episode reward: [(0, '7.912')] |
|
[2025-04-29 02:59:58,857][03043] Fps is (10 sec: 3288.4, 60 sec: 3413.6, 300 sec: 3401.6). Total num frames: 1273856. Throughput: 0: 859.1. Samples: 317312. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 02:59:58,860][03043] Avg episode reward: [(0, '7.329')] |
|
[2025-04-29 03:00:03,857][03043] Fps is (10 sec: 3273.8, 60 sec: 3413.3, 300 sec: 3401.8). Total num frames: 1290240. Throughput: 0: 847.5. Samples: 322176. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:00:03,858][03043] Avg episode reward: [(0, '7.385')] |
|
[2025-04-29 03:00:08,864][03043] Fps is (10 sec: 3683.9, 60 sec: 3482.5, 300 sec: 3415.5). Total num frames: 1310720. Throughput: 0: 861.8. Samples: 327616. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:00:08,865][03043] Avg episode reward: [(0, '7.989')] |
|
[2025-04-29 03:00:13,846][03043] Fps is (10 sec: 3280.2, 60 sec: 3345.8, 300 sec: 3401.7). Total num frames: 1323008. Throughput: 0: 858.5. Samples: 330048. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:00:13,848][03043] Avg episode reward: [(0, '8.401')] |
|
[2025-04-29 03:00:18,836][03043] Fps is (10 sec: 3285.9, 60 sec: 3414.1, 300 sec: 3402.0). Total num frames: 1343488. Throughput: 0: 847.1. Samples: 334944. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:00:18,837][03043] Avg episode reward: [(0, '8.758')] |
|
[2025-04-29 03:00:18,878][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000328_1343488.pth... |
|
[2025-04-29 03:00:18,926][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000228_933888.pth |
|
[2025-04-29 03:00:18,933][03043] Saving new best policy, reward=8.758! |
|
[2025-04-29 03:00:23,839][03043] Fps is (10 sec: 3689.1, 60 sec: 3414.6, 300 sec: 3401.9). Total num frames: 1359872. Throughput: 0: 857.9. Samples: 340384. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:00:23,840][03043] Avg episode reward: [(0, '8.252')] |
|
[2025-04-29 03:00:28,861][03043] Fps is (10 sec: 3268.6, 60 sec: 3412.9, 300 sec: 3401.7). Total num frames: 1376256. Throughput: 0: 846.2. Samples: 342496. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:00:28,865][03043] Avg episode reward: [(0, '8.246')] |
|
[2025-04-29 03:00:33,865][03043] Fps is (10 sec: 3677.0, 60 sec: 3480.6, 300 sec: 3415.4). Total num frames: 1396736. Throughput: 0: 865.2. Samples: 348192. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:00:33,866][03043] Avg episode reward: [(0, '8.056')] |
|
[2025-04-29 03:00:38,937][03043] Fps is (10 sec: 3658.7, 60 sec: 3477.0, 300 sec: 3414.7). Total num frames: 1413120. Throughput: 0: 855.2. Samples: 353152. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:00:38,939][03043] Avg episode reward: [(0, '8.061')] |
|
[2025-04-29 03:00:43,843][03043] Fps is (10 sec: 3283.9, 60 sec: 3414.2, 300 sec: 3415.8). Total num frames: 1429504. Throughput: 0: 852.9. Samples: 355680. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:00:43,844][03043] Avg episode reward: [(0, '7.849')] |
|
[2025-04-29 03:00:48,863][03043] Fps is (10 sec: 3301.2, 60 sec: 3415.0, 300 sec: 3401.5). Total num frames: 1445888. Throughput: 0: 865.3. Samples: 361120. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:00:48,864][03043] Avg episode reward: [(0, '8.231')] |
|
[2025-04-29 03:00:53,847][03043] Fps is (10 sec: 3275.5, 60 sec: 3413.4, 300 sec: 3415.9). Total num frames: 1462272. Throughput: 0: 851.5. Samples: 365920. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:00:53,848][03043] Avg episode reward: [(0, '8.265')] |
|
[2025-04-29 03:00:58,863][03043] Fps is (10 sec: 3686.2, 60 sec: 3481.2, 300 sec: 3415.7). Total num frames: 1482752. Throughput: 0: 859.4. Samples: 368736. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:00:58,866][03043] Avg episode reward: [(0, '8.129')] |
|
[2025-04-29 03:01:03,842][03043] Fps is (10 sec: 3688.1, 60 sec: 3482.4, 300 sec: 3415.8). Total num frames: 1499136. Throughput: 0: 867.4. Samples: 373984. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:01:03,844][03043] Avg episode reward: [(0, '8.061')] |
|
[2025-04-29 03:01:08,846][03043] Fps is (10 sec: 3282.6, 60 sec: 3414.4, 300 sec: 3415.9). Total num frames: 1515520. Throughput: 0: 856.1. Samples: 378912. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:01:08,847][03043] Avg episode reward: [(0, '8.537')] |
|
[2025-04-29 03:01:13,858][03043] Fps is (10 sec: 3271.6, 60 sec: 3480.9, 300 sec: 3415.6). Total num frames: 1531904. Throughput: 0: 869.7. Samples: 381632. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:01:13,859][03043] Avg episode reward: [(0, '9.516')] |
|
[2025-04-29 03:01:13,897][03043] Saving new best policy, reward=9.516! |
|
[2025-04-29 03:01:18,861][03043] Fps is (10 sec: 3271.8, 60 sec: 3411.9, 300 sec: 3415.5). Total num frames: 1548288. Throughput: 0: 849.1. Samples: 386400. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:01:18,862][03043] Avg episode reward: [(0, '9.792')] |
|
[2025-04-29 03:01:18,902][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000378_1548288.pth... |
|
[2025-04-29 03:01:18,949][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000278_1138688.pth |
|
[2025-04-29 03:01:18,955][03043] Saving new best policy, reward=9.792! |
|
[2025-04-29 03:01:23,845][03043] Fps is (10 sec: 3281.2, 60 sec: 3413.0, 300 sec: 3401.8). Total num frames: 1564672. Throughput: 0: 854.4. Samples: 391520. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:01:23,846][03043] Avg episode reward: [(0, '10.228')] |
|
[2025-04-29 03:01:23,878][03043] Saving new best policy, reward=10.228! |
|
[2025-04-29 03:01:28,867][03043] Fps is (10 sec: 3274.8, 60 sec: 3413.0, 300 sec: 3401.7). Total num frames: 1581056. Throughput: 0: 857.8. Samples: 394304. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:01:28,868][03043] Avg episode reward: [(0, '9.106')] |
|
[2025-04-29 03:01:33,850][03043] Fps is (10 sec: 3275.1, 60 sec: 3345.9, 300 sec: 3401.8). Total num frames: 1597440. Throughput: 0: 843.6. Samples: 399072. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:01:33,851][03043] Avg episode reward: [(0, '9.578')] |
|
[2025-04-29 03:01:38,865][03043] Fps is (10 sec: 3687.3, 60 sec: 3417.5, 300 sec: 3415.6). Total num frames: 1617920. Throughput: 0: 856.6. Samples: 404480. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:01:38,866][03043] Avg episode reward: [(0, '9.347')] |
|
[2025-04-29 03:01:43,865][03043] Fps is (10 sec: 3271.8, 60 sec: 3343.8, 300 sec: 3401.4). Total num frames: 1630208. Throughput: 0: 849.0. Samples: 406944. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:01:43,866][03043] Avg episode reward: [(0, '9.157')] |
|
[2025-04-29 03:01:48,839][03043] Fps is (10 sec: 3285.1, 60 sec: 3414.7, 300 sec: 3416.4). Total num frames: 1650688. Throughput: 0: 843.4. Samples: 411936. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:01:48,840][03043] Avg episode reward: [(0, '9.542')] |
|
[2025-04-29 03:01:53,852][03043] Fps is (10 sec: 3691.2, 60 sec: 3413.1, 300 sec: 3401.7). Total num frames: 1667072. Throughput: 0: 857.5. Samples: 417504. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:01:53,854][03043] Avg episode reward: [(0, '10.041')] |
|
[2025-04-29 03:01:58,850][03043] Fps is (10 sec: 3273.3, 60 sec: 3345.8, 300 sec: 3401.9). Total num frames: 1683456. Throughput: 0: 844.2. Samples: 419616. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:01:58,851][03043] Avg episode reward: [(0, '9.343')] |
|
[2025-04-29 03:02:03,846][03043] Fps is (10 sec: 3688.7, 60 sec: 3413.1, 300 sec: 3415.9). Total num frames: 1703936. Throughput: 0: 858.6. Samples: 425024. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:02:03,847][03043] Avg episode reward: [(0, '9.334')] |
|
[2025-04-29 03:02:08,879][03043] Fps is (10 sec: 3675.8, 60 sec: 3411.5, 300 sec: 3415.4). Total num frames: 1720320. Throughput: 0: 856.9. Samples: 430112. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:02:08,881][03043] Avg episode reward: [(0, '10.061')] |
|
[2025-04-29 03:02:13,856][03043] Fps is (10 sec: 3273.3, 60 sec: 3413.4, 300 sec: 3415.7). Total num frames: 1736704. Throughput: 0: 848.6. Samples: 432480. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:02:13,857][03043] Avg episode reward: [(0, '11.418')] |
|
[2025-04-29 03:02:13,890][03043] Saving new best policy, reward=11.418! |
|
[2025-04-29 03:02:18,846][03043] Fps is (10 sec: 3287.6, 60 sec: 3414.2, 300 sec: 3416.1). Total num frames: 1753088. Throughput: 0: 861.2. Samples: 437824. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:02:18,847][03043] Avg episode reward: [(0, '12.380')] |
|
[2025-04-29 03:02:18,890][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000428_1753088.pth... |
|
[2025-04-29 03:02:18,937][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000328_1343488.pth |
|
[2025-04-29 03:02:18,944][03043] Saving new best policy, reward=12.380! |
|
[2025-04-29 03:02:23,845][03043] Fps is (10 sec: 3280.6, 60 sec: 3413.3, 300 sec: 3415.8). Total num frames: 1769472. Throughput: 0: 845.9. Samples: 442528. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:02:23,846][03043] Avg episode reward: [(0, '11.608')] |
|
[2025-04-29 03:02:28,865][03043] Fps is (10 sec: 3270.7, 60 sec: 3413.5, 300 sec: 3401.7). Total num frames: 1785856. Throughput: 0: 850.5. Samples: 445216. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:02:28,866][03043] Avg episode reward: [(0, '9.621')] |
|
[2025-04-29 03:02:33,852][03043] Fps is (10 sec: 3274.6, 60 sec: 3413.2, 300 sec: 3415.7). Total num frames: 1802240. Throughput: 0: 859.5. Samples: 450624. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:02:33,853][03043] Avg episode reward: [(0, '8.909')] |
|
[2025-04-29 03:02:38,839][03043] Fps is (10 sec: 3695.9, 60 sec: 3414.8, 300 sec: 3415.9). Total num frames: 1822720. Throughput: 0: 847.2. Samples: 455616. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:02:38,840][03043] Avg episode reward: [(0, '9.000')] |
|
[2025-04-29 03:02:43,840][03043] Fps is (10 sec: 3690.7, 60 sec: 3483.1, 300 sec: 3415.9). Total num frames: 1839104. Throughput: 0: 858.5. Samples: 458240. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:02:43,842][03043] Avg episode reward: [(0, '9.770')] |
|
[2025-04-29 03:02:48,863][03043] Fps is (10 sec: 3269.0, 60 sec: 3412.0, 300 sec: 3415.5). Total num frames: 1855488. Throughput: 0: 842.3. Samples: 462944. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:02:48,864][03043] Avg episode reward: [(0, '11.682')] |
|
[2025-04-29 03:02:53,850][03043] Fps is (10 sec: 3273.4, 60 sec: 3413.4, 300 sec: 3415.6). Total num frames: 1871872. Throughput: 0: 844.6. Samples: 468096. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:02:53,851][03043] Avg episode reward: [(0, '12.194')] |
|
[2025-04-29 03:02:58,858][03043] Fps is (10 sec: 3278.6, 60 sec: 3412.9, 300 sec: 3415.6). Total num frames: 1888256. Throughput: 0: 850.5. Samples: 470752. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:02:58,862][03043] Avg episode reward: [(0, '13.677')] |
|
[2025-04-29 03:02:58,909][03043] Saving new best policy, reward=13.677! |
|
[2025-04-29 03:03:03,849][03043] Fps is (10 sec: 3277.2, 60 sec: 3344.9, 300 sec: 3416.3). Total num frames: 1904640. Throughput: 0: 837.6. Samples: 475520. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:03:03,850][03043] Avg episode reward: [(0, '13.791')] |
|
[2025-04-29 03:03:03,889][03043] Saving new best policy, reward=13.791! |
|
[2025-04-29 03:03:08,841][03043] Fps is (10 sec: 3282.2, 60 sec: 3347.2, 300 sec: 3402.0). Total num frames: 1921024. Throughput: 0: 854.8. Samples: 480992. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:03:08,842][03043] Avg episode reward: [(0, '13.098')] |
|
[2025-04-29 03:03:13,847][03043] Fps is (10 sec: 3277.6, 60 sec: 3345.6, 300 sec: 3401.7). Total num frames: 1937408. Throughput: 0: 850.8. Samples: 483488. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:03:13,848][03043] Avg episode reward: [(0, '13.453')] |
|
[2025-04-29 03:03:18,860][03043] Fps is (10 sec: 3679.2, 60 sec: 3412.5, 300 sec: 3415.7). Total num frames: 1957888. Throughput: 0: 841.8. Samples: 488512. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:03:18,862][03043] Avg episode reward: [(0, '12.426')] |
|
[2025-04-29 03:03:18,867][03043] Process: main process 3043 has queue size: 11 |
|
[2025-04-29 03:03:18,904][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000478_1957888.pth... |
|
[2025-04-29 03:03:18,956][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000378_1548288.pth |
|
[2025-04-29 03:03:23,854][03043] Fps is (10 sec: 3683.6, 60 sec: 3412.8, 300 sec: 3415.8). Total num frames: 1974272. Throughput: 0: 850.2. Samples: 493888. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:03:23,857][03043] Avg episode reward: [(0, '12.699')] |
|
[2025-04-29 03:03:28,847][03043] Fps is (10 sec: 3281.1, 60 sec: 3414.3, 300 sec: 3415.7). Total num frames: 1990656. Throughput: 0: 839.7. Samples: 496032. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:03:28,849][03043] Avg episode reward: [(0, '12.058')] |
|
[2025-04-29 03:03:33,837][03043] Fps is (10 sec: 3282.3, 60 sec: 3414.1, 300 sec: 3402.1). Total num frames: 2007040. Throughput: 0: 856.7. Samples: 501472. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:03:33,838][03043] Avg episode reward: [(0, '13.889')] |
|
[2025-04-29 03:03:33,874][03043] Saving new best policy, reward=13.889! |
|
[2025-04-29 03:03:38,874][03043] Fps is (10 sec: 3268.1, 60 sec: 3343.1, 300 sec: 3415.3). Total num frames: 2023424. Throughput: 0: 852.9. Samples: 506496. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:03:38,884][03043] Avg episode reward: [(0, '14.042')] |
|
[2025-04-29 03:03:38,895][03043] Saving new best policy, reward=14.042! |
|
[2025-04-29 03:03:43,846][03043] Fps is (10 sec: 3274.0, 60 sec: 3344.7, 300 sec: 3402.0). Total num frames: 2039808. Throughput: 0: 850.0. Samples: 508992. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:03:43,847][03043] Avg episode reward: [(0, '14.182')] |
|
[2025-04-29 03:03:43,887][03043] Saving new best policy, reward=14.182! |
|
[2025-04-29 03:03:48,840][03043] Fps is (10 sec: 3698.9, 60 sec: 3414.6, 300 sec: 3416.0). Total num frames: 2060288. Throughput: 0: 859.9. Samples: 514208. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:03:48,841][03043] Avg episode reward: [(0, '14.764')] |
|
[2025-04-29 03:03:48,880][03043] Saving new best policy, reward=14.764! |
|
[2025-04-29 03:03:53,869][03043] Fps is (10 sec: 3677.9, 60 sec: 3412.3, 300 sec: 3415.6). Total num frames: 2076672. Throughput: 0: 847.1. Samples: 519136. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:03:53,870][03043] Avg episode reward: [(0, '14.567')] |
|
[2025-04-29 03:03:58,864][03043] Fps is (10 sec: 3268.9, 60 sec: 3412.9, 300 sec: 3415.6). Total num frames: 2093056. Throughput: 0: 850.9. Samples: 521792. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:03:58,865][03043] Avg episode reward: [(0, '13.629')] |
|
[2025-04-29 03:04:03,857][03043] Fps is (10 sec: 3280.9, 60 sec: 3412.9, 300 sec: 3415.9). Total num frames: 2109440. Throughput: 0: 859.8. Samples: 527200. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:04:03,858][03043] Avg episode reward: [(0, '14.348')] |
|
[2025-04-29 03:04:08,841][03043] Fps is (10 sec: 3284.5, 60 sec: 3413.3, 300 sec: 3402.0). Total num frames: 2125824. Throughput: 0: 849.3. Samples: 532096. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:04:08,842][03043] Avg episode reward: [(0, '15.069')] |
|
[2025-04-29 03:04:08,881][03043] Saving new best policy, reward=15.069! |
|
[2025-04-29 03:04:13,854][03043] Fps is (10 sec: 3687.5, 60 sec: 3481.2, 300 sec: 3415.6). Total num frames: 2146304. Throughput: 0: 858.9. Samples: 534688. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:04:13,855][03043] Avg episode reward: [(0, '15.306')] |
|
[2025-04-29 03:04:13,890][03043] Saving new best policy, reward=15.306! |
|
[2025-04-29 03:04:18,866][03043] Fps is (10 sec: 3268.7, 60 sec: 3344.8, 300 sec: 3401.7). Total num frames: 2158592. Throughput: 0: 847.8. Samples: 539648. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:04:18,867][03043] Avg episode reward: [(0, '14.473')] |
|
[2025-04-29 03:04:18,913][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000527_2158592.pth... |
|
[2025-04-29 03:04:18,967][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000428_1753088.pth |
|
[2025-04-29 03:04:23,839][03043] Fps is (10 sec: 3281.7, 60 sec: 3414.2, 300 sec: 3415.8). Total num frames: 2179072. Throughput: 0: 853.3. Samples: 544864. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:04:23,840][03043] Avg episode reward: [(0, '14.503')] |
|
[2025-04-29 03:04:28,842][03043] Fps is (10 sec: 3695.2, 60 sec: 3413.6, 300 sec: 3415.7). Total num frames: 2195456. Throughput: 0: 856.3. Samples: 547520. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:04:28,847][03043] Avg episode reward: [(0, '14.904')] |
|
[2025-04-29 03:04:33,842][03043] Fps is (10 sec: 3275.8, 60 sec: 3413.1, 300 sec: 3415.8). Total num frames: 2211840. Throughput: 0: 851.2. Samples: 552512. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:04:33,843][03043] Avg episode reward: [(0, '15.372')] |
|
[2025-04-29 03:04:33,878][03043] Saving new best policy, reward=15.372! |
|
[2025-04-29 03:04:38,850][03043] Fps is (10 sec: 3683.4, 60 sec: 3483.0, 300 sec: 3415.7). Total num frames: 2232320. Throughput: 0: 868.6. Samples: 558208. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:04:38,851][03043] Avg episode reward: [(0, '15.161')] |
|
[2025-04-29 03:04:43,872][03043] Fps is (10 sec: 3675.3, 60 sec: 3480.1, 300 sec: 3415.9). Total num frames: 2248704. Throughput: 0: 871.7. Samples: 561024. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:04:43,874][03043] Avg episode reward: [(0, '15.448')] |
|
[2025-04-29 03:04:43,917][03043] Saving new best policy, reward=15.448! |
|
[2025-04-29 03:04:48,910][03043] Fps is (10 sec: 3664.4, 60 sec: 3477.6, 300 sec: 3428.8). Total num frames: 2269184. Throughput: 0: 864.4. Samples: 566144. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:04:48,911][03043] Avg episode reward: [(0, '14.818')] |
|
[2025-04-29 03:04:53,851][03043] Fps is (10 sec: 3694.2, 60 sec: 3482.7, 300 sec: 3429.6). Total num frames: 2285568. Throughput: 0: 884.4. Samples: 571904. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:04:53,852][03043] Avg episode reward: [(0, '14.523')] |
|
[2025-04-29 03:04:58,855][03043] Fps is (10 sec: 3294.9, 60 sec: 3482.1, 300 sec: 3429.6). Total num frames: 2301952. Throughput: 0: 879.6. Samples: 574272. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:04:58,856][03043] Avg episode reward: [(0, '14.675')] |
|
[2025-04-29 03:05:03,859][03043] Fps is (10 sec: 3683.5, 60 sec: 3549.7, 300 sec: 3429.6). Total num frames: 2322432. Throughput: 0: 894.0. Samples: 579872. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:05:03,860][03043] Avg episode reward: [(0, '15.123')] |
|
[2025-04-29 03:05:08,871][03043] Fps is (10 sec: 3680.4, 60 sec: 3548.1, 300 sec: 3443.1). Total num frames: 2338816. Throughput: 0: 896.8. Samples: 585248. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:05:08,873][03043] Avg episode reward: [(0, '16.302')] |
|
[2025-04-29 03:05:08,885][03043] Saving new best policy, reward=16.302! |
|
[2025-04-29 03:05:13,859][03043] Fps is (10 sec: 3686.2, 60 sec: 3549.5, 300 sec: 3443.1). Total num frames: 2359296. Throughput: 0: 895.7. Samples: 587840. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:05:13,860][03043] Avg episode reward: [(0, '17.592')] |
|
[2025-04-29 03:05:13,897][03043] Saving new best policy, reward=17.592! |
|
[2025-04-29 03:05:18,842][03043] Fps is (10 sec: 3697.2, 60 sec: 3619.6, 300 sec: 3443.4). Total num frames: 2375680. Throughput: 0: 910.2. Samples: 593472. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:05:18,843][03043] Avg episode reward: [(0, '18.513')] |
|
[2025-04-29 03:05:18,882][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000580_2375680.pth... |
|
[2025-04-29 03:05:18,930][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000478_1957888.pth |
|
[2025-04-29 03:05:18,937][03043] Saving new best policy, reward=18.513! |
|
[2025-04-29 03:05:23,856][03043] Fps is (10 sec: 3277.9, 60 sec: 3548.8, 300 sec: 3443.5). Total num frames: 2392064. Throughput: 0: 893.7. Samples: 598432. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:05:23,857][03043] Avg episode reward: [(0, '18.903')] |
|
[2025-04-29 03:05:23,890][03043] Saving new best policy, reward=18.903! |
|
[2025-04-29 03:05:28,853][03043] Fps is (10 sec: 3682.2, 60 sec: 3617.4, 300 sec: 3443.5). Total num frames: 2412544. Throughput: 0: 892.8. Samples: 601184. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:05:28,854][03043] Avg episode reward: [(0, '15.157')] |
|
[2025-04-29 03:05:33,859][03043] Fps is (10 sec: 3685.2, 60 sec: 3617.1, 300 sec: 3444.3). Total num frames: 2428928. Throughput: 0: 908.4. Samples: 606976. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:05:33,861][03043] Avg episode reward: [(0, '12.153')] |
|
[2025-04-29 03:05:38,913][03043] Fps is (10 sec: 3664.5, 60 sec: 3614.3, 300 sec: 3456.5). Total num frames: 2449408. Throughput: 0: 892.6. Samples: 612128. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:05:38,914][03043] Avg episode reward: [(0, '11.224')] |
|
[2025-04-29 03:05:43,853][03043] Fps is (10 sec: 3688.8, 60 sec: 3619.3, 300 sec: 3457.4). Total num frames: 2465792. Throughput: 0: 902.4. Samples: 614880. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:05:43,854][03043] Avg episode reward: [(0, '11.951')] |
|
[2025-04-29 03:05:48,856][03043] Fps is (10 sec: 3295.8, 60 sec: 3553.1, 300 sec: 3457.2). Total num frames: 2482176. Throughput: 0: 888.9. Samples: 619872. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:05:48,857][03043] Avg episode reward: [(0, '12.676')] |
|
[2025-04-29 03:05:53,860][03043] Fps is (10 sec: 3274.6, 60 sec: 3549.3, 300 sec: 3443.5). Total num frames: 2498560. Throughput: 0: 891.3. Samples: 625344. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:05:53,861][03043] Avg episode reward: [(0, '14.347')] |
|
[2025-04-29 03:05:58,844][03043] Fps is (10 sec: 3690.7, 60 sec: 3618.8, 300 sec: 3457.3). Total num frames: 2519040. Throughput: 0: 892.8. Samples: 628000. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:05:58,847][03043] Avg episode reward: [(0, '16.644')] |
|
[2025-04-29 03:06:03,863][03043] Fps is (10 sec: 3685.2, 60 sec: 3549.6, 300 sec: 3457.1). Total num frames: 2535424. Throughput: 0: 875.7. Samples: 632896. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:06:03,864][03043] Avg episode reward: [(0, '17.498')] |
|
[2025-04-29 03:06:08,842][03043] Fps is (10 sec: 3277.3, 60 sec: 3551.6, 300 sec: 3457.5). Total num frames: 2551808. Throughput: 0: 887.7. Samples: 638368. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:06:08,843][03043] Avg episode reward: [(0, '17.798')] |
|
[2025-04-29 03:06:13,843][03043] Fps is (10 sec: 3283.1, 60 sec: 3482.5, 300 sec: 3457.5). Total num frames: 2568192. Throughput: 0: 884.1. Samples: 640960. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:06:13,845][03043] Avg episode reward: [(0, '17.974')] |
|
[2025-04-29 03:06:18,843][03043] Fps is (10 sec: 3276.8, 60 sec: 3481.6, 300 sec: 3457.3). Total num frames: 2584576. Throughput: 0: 866.5. Samples: 645952. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:06:18,844][03043] Avg episode reward: [(0, '18.224')] |
|
[2025-04-29 03:06:18,885][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000631_2584576.pth... |
|
[2025-04-29 03:06:18,953][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000527_2158592.pth |
|
[2025-04-29 03:06:23,845][03043] Fps is (10 sec: 3685.8, 60 sec: 3550.5, 300 sec: 3471.4). Total num frames: 2605056. Throughput: 0: 872.4. Samples: 651328. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:06:23,846][03043] Avg episode reward: [(0, '18.257')] |
|
[2025-04-29 03:06:28,859][03043] Fps is (10 sec: 3680.3, 60 sec: 3481.3, 300 sec: 3471.1). Total num frames: 2621440. Throughput: 0: 858.2. Samples: 653504. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:06:28,860][03043] Avg episode reward: [(0, '18.267')] |
|
[2025-04-29 03:06:33,837][03043] Fps is (10 sec: 3279.5, 60 sec: 3482.9, 300 sec: 3457.6). Total num frames: 2637824. Throughput: 0: 865.8. Samples: 658816. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:06:33,838][03043] Avg episode reward: [(0, '18.291')] |
|
[2025-04-29 03:06:38,878][03043] Fps is (10 sec: 3270.5, 60 sec: 3415.3, 300 sec: 3471.0). Total num frames: 2654208. Throughput: 0: 859.4. Samples: 664032. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:06:38,880][03043] Avg episode reward: [(0, '17.604')] |
|
[2025-04-29 03:06:43,868][03043] Fps is (10 sec: 3266.6, 60 sec: 3412.5, 300 sec: 3457.0). Total num frames: 2670592. Throughput: 0: 852.9. Samples: 666400. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:06:43,870][03043] Avg episode reward: [(0, '16.722')] |
|
[2025-04-29 03:06:48,867][03043] Fps is (10 sec: 3690.5, 60 sec: 3480.9, 300 sec: 3471.0). Total num frames: 2691072. Throughput: 0: 864.6. Samples: 671808. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:06:48,868][03043] Avg episode reward: [(0, '16.047')] |
|
[2025-04-29 03:06:53,868][03043] Fps is (10 sec: 3686.7, 60 sec: 3481.1, 300 sec: 3471.0). Total num frames: 2707456. Throughput: 0: 852.1. Samples: 676736. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:06:53,870][03043] Avg episode reward: [(0, '15.688')] |
|
[2025-04-29 03:06:58,854][03043] Fps is (10 sec: 3281.0, 60 sec: 3412.7, 300 sec: 3457.2). Total num frames: 2723840. Throughput: 0: 851.7. Samples: 679296. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:06:58,855][03043] Avg episode reward: [(0, '16.405')] |
|
[2025-04-29 03:07:03,866][03043] Fps is (10 sec: 3277.3, 60 sec: 3413.1, 300 sec: 3457.5). Total num frames: 2740224. Throughput: 0: 862.8. Samples: 684800. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:07:03,870][03043] Avg episode reward: [(0, '16.935')] |
|
[2025-04-29 03:07:08,848][03043] Fps is (10 sec: 3278.8, 60 sec: 3413.0, 300 sec: 3457.4). Total num frames: 2756608. Throughput: 0: 852.6. Samples: 689696. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:07:08,849][03043] Avg episode reward: [(0, '17.851')] |
|
[2025-04-29 03:07:13,845][03043] Fps is (10 sec: 3694.2, 60 sec: 3481.5, 300 sec: 3471.2). Total num frames: 2777088. Throughput: 0: 866.4. Samples: 692480. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:07:13,846][03043] Avg episode reward: [(0, '19.030')] |
|
[2025-04-29 03:07:13,882][03043] Saving new best policy, reward=19.030! |
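The "Saving new best policy, reward=..." messages appear only when the reported average episode reward exceeds every value seen earlier in the run (19.030 here, then 19.341, and so on). A minimal sketch of that bookkeeping follows; `BestPolicyTracker` and its method names are hypothetical and serve only to illustrate the compare-and-save step.

```python
import torch


class BestPolicyTracker:
    """Keep the checkpoint with the highest average episode reward seen so far (sketch)."""

    def __init__(self, save_path):
        self.save_path = save_path
        self.best_reward = float("-inf")

    def update(self, avg_episode_reward, state_dict):
        if avg_episode_reward > self.best_reward:
            self.best_reward = avg_episode_reward
            torch.save(state_dict, self.save_path)
            print(f"Saving new best policy, reward={avg_episode_reward:.3f}!")
            return True
        return False
```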
|
[2025-04-29 03:07:18,840][03043] Fps is (10 sec: 3689.5, 60 sec: 3481.7, 300 sec: 3471.2). Total num frames: 2793472. Throughput: 0: 866.8. Samples: 697824. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:07:18,841][03043] Avg episode reward: [(0, '19.341')] |
|
[2025-04-29 03:07:18,879][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000682_2793472.pth... |
|
[2025-04-29 03:07:18,931][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000580_2375680.pth |
|
[2025-04-29 03:07:18,938][03043] Saving new best policy, reward=19.341! |
|
[2025-04-29 03:07:23,910][03043] Fps is (10 sec: 3662.4, 60 sec: 3477.8, 300 sec: 3484.5). Total num frames: 2813952. Throughput: 0: 871.9. Samples: 703296. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:07:23,913][03043] Avg episode reward: [(0, '17.854')] |
|
[2025-04-29 03:07:28,865][03043] Fps is (10 sec: 3677.3, 60 sec: 3481.3, 300 sec: 3484.9). Total num frames: 2830336. Throughput: 0: 884.0. Samples: 706176. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:07:28,866][03043] Avg episode reward: [(0, '17.018')] |
|
[2025-04-29 03:07:33,860][03043] Fps is (10 sec: 3293.5, 60 sec: 3480.3, 300 sec: 3470.9). Total num frames: 2846720. Throughput: 0: 876.9. Samples: 711264. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:07:33,861][03043] Avg episode reward: [(0, '16.756')] |
|
[2025-04-29 03:07:38,863][03043] Fps is (10 sec: 3687.0, 60 sec: 3550.8, 300 sec: 3484.8). Total num frames: 2867200. Throughput: 0: 896.8. Samples: 717088. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:07:38,864][03043] Avg episode reward: [(0, '16.225')] |
|
[2025-04-29 03:07:43,855][03043] Fps is (10 sec: 3688.0, 60 sec: 3550.7, 300 sec: 3485.2). Total num frames: 2883584. Throughput: 0: 903.1. Samples: 719936. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:07:43,859][03043] Avg episode reward: [(0, '15.628')] |
|
[2025-04-29 03:07:48,870][03043] Fps is (10 sec: 3684.0, 60 sec: 3549.7, 300 sec: 3498.7). Total num frames: 2904064. Throughput: 0: 891.7. Samples: 724928. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:07:48,872][03043] Avg episode reward: [(0, '17.970')] |
|
[2025-04-29 03:07:53,853][03043] Fps is (10 sec: 3687.2, 60 sec: 3550.7, 300 sec: 3499.0). Total num frames: 2920448. Throughput: 0: 903.0. Samples: 730336. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:07:53,854][03043] Avg episode reward: [(0, '18.744')] |
|
[2025-04-29 03:07:58,844][03043] Fps is (10 sec: 3285.2, 60 sec: 3550.5, 300 sec: 3499.0). Total num frames: 2936832. Throughput: 0: 889.6. Samples: 732512. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:07:58,845][03043] Avg episode reward: [(0, '18.064')] |
|
[2025-04-29 03:08:03,846][03043] Fps is (10 sec: 3279.3, 60 sec: 3551.1, 300 sec: 3498.9). Total num frames: 2953216. Throughput: 0: 885.9. Samples: 737696. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:08:03,847][03043] Avg episode reward: [(0, '17.750')] |
|
[2025-04-29 03:08:08,861][03043] Fps is (10 sec: 3271.3, 60 sec: 3549.1, 300 sec: 3498.8). Total num frames: 2969600. Throughput: 0: 875.6. Samples: 742656. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:08:08,864][03043] Avg episode reward: [(0, '16.744')] |
|
[2025-04-29 03:08:13,847][03043] Fps is (10 sec: 3276.2, 60 sec: 3481.5, 300 sec: 3485.2). Total num frames: 2985984. Throughput: 0: 865.0. Samples: 745088. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:08:13,849][03043] Avg episode reward: [(0, '14.861')] |
|
[2025-04-29 03:08:18,863][03043] Fps is (10 sec: 3685.6, 60 sec: 3548.5, 300 sec: 3498.8). Total num frames: 3006464. Throughput: 0: 874.6. Samples: 750624. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:08:18,864][03043] Avg episode reward: [(0, '14.500')] |
|
[2025-04-29 03:08:18,904][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000734_3006464.pth... |
|
[2025-04-29 03:08:18,958][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000631_2584576.pth |
|
[2025-04-29 03:08:23,844][03043] Fps is (10 sec: 3278.0, 60 sec: 3417.1, 300 sec: 3485.1). Total num frames: 3018752. Throughput: 0: 851.6. Samples: 755392. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:08:23,845][03043] Avg episode reward: [(0, '14.410')] |
|
[2025-04-29 03:08:28,870][03043] Fps is (10 sec: 3274.6, 60 sec: 3481.3, 300 sec: 3498.6). Total num frames: 3039232. Throughput: 0: 848.8. Samples: 758144. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:08:28,871][03043] Avg episode reward: [(0, '15.669')] |
|
[2025-04-29 03:08:33,849][03043] Fps is (10 sec: 3684.5, 60 sec: 3482.2, 300 sec: 3499.3). Total num frames: 3055616. Throughput: 0: 860.1. Samples: 763616. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:08:33,853][03043] Avg episode reward: [(0, '17.694')] |
|
[2025-04-29 03:08:38,843][03043] Fps is (10 sec: 3285.8, 60 sec: 3414.5, 300 sec: 3499.0). Total num frames: 3072000. Throughput: 0: 847.8. Samples: 768480. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:08:38,843][03043] Avg episode reward: [(0, '18.356')] |
|
[2025-04-29 03:08:43,845][03043] Fps is (10 sec: 3687.8, 60 sec: 3482.2, 300 sec: 3498.9). Total num frames: 3092480. Throughput: 0: 861.1. Samples: 771264. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:08:43,846][03043] Avg episode reward: [(0, '17.581')] |
|
[2025-04-29 03:08:48,863][03043] Fps is (10 sec: 3270.0, 60 sec: 3345.4, 300 sec: 3485.1). Total num frames: 3104768. Throughput: 0: 856.5. Samples: 776256. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:08:48,865][03043] Avg episode reward: [(0, '16.730')] |
|
[2025-04-29 03:08:53,846][03043] Fps is (10 sec: 3276.5, 60 sec: 3413.7, 300 sec: 3499.2). Total num frames: 3125248. Throughput: 0: 862.9. Samples: 781472. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:08:53,847][03043] Avg episode reward: [(0, '15.672')] |
|
[2025-04-29 03:08:58,850][03043] Fps is (10 sec: 3691.2, 60 sec: 3413.0, 300 sec: 3499.0). Total num frames: 3141632. Throughput: 0: 870.3. Samples: 784256. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:08:58,856][03043] Avg episode reward: [(0, '13.466')] |
|
[2025-04-29 03:09:03,846][03043] Fps is (10 sec: 3276.7, 60 sec: 3413.3, 300 sec: 3498.9). Total num frames: 3158016. Throughput: 0: 852.9. Samples: 788992. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:09:03,847][03043] Avg episode reward: [(0, '14.522')] |
|
[2025-04-29 03:09:08,853][03043] Fps is (10 sec: 3685.6, 60 sec: 3482.1, 300 sec: 3499.0). Total num frames: 3178496. Throughput: 0: 870.2. Samples: 794560. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:09:08,854][03043] Avg episode reward: [(0, '15.452')] |
|
[2025-04-29 03:09:13,875][03043] Fps is (10 sec: 3267.4, 60 sec: 3411.8, 300 sec: 3498.8). Total num frames: 3190784. Throughput: 0: 865.3. Samples: 797088. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:09:13,878][03043] Avg episode reward: [(0, '16.214')] |
|
[2025-04-29 03:09:18,852][03043] Fps is (10 sec: 3276.9, 60 sec: 3414.0, 300 sec: 3498.8). Total num frames: 3211264. Throughput: 0: 854.7. Samples: 802080. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:09:18,853][03043] Avg episode reward: [(0, '16.432')] |
|
[2025-04-29 03:09:18,891][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000784_3211264.pth... |
|
[2025-04-29 03:09:18,940][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000682_2793472.pth |
|
[2025-04-29 03:09:23,864][03043] Fps is (10 sec: 3690.6, 60 sec: 3480.5, 300 sec: 3498.7). Total num frames: 3227648. Throughput: 0: 862.9. Samples: 807328. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:09:23,865][03043] Avg episode reward: [(0, '17.005')] |
|
[2025-04-29 03:09:28,847][03043] Fps is (10 sec: 3278.4, 60 sec: 3414.6, 300 sec: 3498.9). Total num frames: 3244032. Throughput: 0: 852.6. Samples: 809632. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:09:28,848][03043] Avg episode reward: [(0, '17.641')] |
|
[2025-04-29 03:09:33,862][03043] Fps is (10 sec: 3277.4, 60 sec: 3412.6, 300 sec: 3484.9). Total num frames: 3260416. Throughput: 0: 859.1. Samples: 814912. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:09:33,863][03043] Avg episode reward: [(0, '17.715')] |
|
[2025-04-29 03:09:38,876][03043] Fps is (10 sec: 3267.6, 60 sec: 3411.5, 300 sec: 3485.0). Total num frames: 3276800. Throughput: 0: 856.3. Samples: 820032. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:09:38,877][03043] Avg episode reward: [(0, '18.495')] |
|
[2025-04-29 03:09:43,850][03043] Fps is (10 sec: 3690.9, 60 sec: 3413.1, 300 sec: 3485.8). Total num frames: 3297280. Throughput: 0: 850.5. Samples: 822528. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:09:43,851][03043] Avg episode reward: [(0, '18.740')] |
|
[2025-04-29 03:09:48,860][03043] Fps is (10 sec: 3692.2, 60 sec: 3481.8, 300 sec: 3485.0). Total num frames: 3313664. Throughput: 0: 863.0. Samples: 827840. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:09:48,861][03043] Avg episode reward: [(0, '19.994')] |
|
[2025-04-29 03:09:48,899][03043] Saving new best policy, reward=19.994! |
|
[2025-04-29 03:09:53,852][03043] Fps is (10 sec: 3276.2, 60 sec: 3413.0, 300 sec: 3485.1). Total num frames: 3330048. Throughput: 0: 845.5. Samples: 832608. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:09:53,852][03043] Avg episode reward: [(0, '21.312')] |
|
[2025-04-29 03:09:53,889][03043] Saving new best policy, reward=21.312! |
|
[2025-04-29 03:09:58,845][03043] Fps is (10 sec: 3281.6, 60 sec: 3413.6, 300 sec: 3471.3). Total num frames: 3346432. Throughput: 0: 853.2. Samples: 835456. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:09:58,847][03043] Avg episode reward: [(0, '19.786')] |
|
[2025-04-29 03:10:03,852][03043] Fps is (10 sec: 3686.2, 60 sec: 3481.3, 300 sec: 3485.3). Total num frames: 3366912. Throughput: 0: 863.3. Samples: 840928. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:10:03,853][03043] Avg episode reward: [(0, '19.851')] |
|
[2025-04-29 03:10:08,841][03043] Fps is (10 sec: 3688.0, 60 sec: 3414.0, 300 sec: 3471.4). Total num frames: 3383296. Throughput: 0: 857.3. Samples: 845888. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:10:08,842][03043] Avg episode reward: [(0, '19.635')] |
|
[2025-04-29 03:10:13,865][03043] Fps is (10 sec: 3272.7, 60 sec: 3482.2, 300 sec: 3470.9). Total num frames: 3399680. Throughput: 0: 865.1. Samples: 848576. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:10:13,866][03043] Avg episode reward: [(0, '18.983')] |
|
[2025-04-29 03:10:18,854][03043] Fps is (10 sec: 3272.7, 60 sec: 3413.3, 300 sec: 3471.2). Total num frames: 3416064. Throughput: 0: 857.8. Samples: 853504. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:10:18,855][03043] Avg episode reward: [(0, '19.776')] |
|
[2025-04-29 03:10:18,897][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000834_3416064.pth... |
|
[2025-04-29 03:10:18,951][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000734_3006464.pth |
|
[2025-04-29 03:10:23,837][03043] Fps is (10 sec: 3285.9, 60 sec: 3414.9, 300 sec: 3457.5). Total num frames: 3432448. Throughput: 0: 860.5. Samples: 858720. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:10:23,838][03043] Avg episode reward: [(0, '19.773')] |
|
[2025-04-29 03:10:28,866][03043] Fps is (10 sec: 3682.0, 60 sec: 3480.5, 300 sec: 3471.1). Total num frames: 3452928. Throughput: 0: 863.0. Samples: 861376. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:10:28,867][03043] Avg episode reward: [(0, '21.019')] |
|
[2025-04-29 03:10:33,872][03043] Fps is (10 sec: 3673.4, 60 sec: 3481.0, 300 sec: 3457.8). Total num frames: 3469312. Throughput: 0: 853.8. Samples: 866272. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:10:33,874][03043] Avg episode reward: [(0, '20.766')] |
|
[2025-04-29 03:10:38,839][03043] Fps is (10 sec: 3285.6, 60 sec: 3483.7, 300 sec: 3457.5). Total num frames: 3485696. Throughput: 0: 868.5. Samples: 871680. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:10:38,840][03043] Avg episode reward: [(0, '21.736')] |
|
[2025-04-29 03:10:38,880][03043] Saving new best policy, reward=21.736! |
|
[2025-04-29 03:10:43,872][03043] Fps is (10 sec: 3276.8, 60 sec: 3412.1, 300 sec: 3457.1). Total num frames: 3502080. Throughput: 0: 862.8. Samples: 874304. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:10:43,876][03043] Avg episode reward: [(0, '21.873')] |
|
[2025-04-29 03:10:43,925][03043] Saving new best policy, reward=21.873! |
|
[2025-04-29 03:10:48,849][03043] Fps is (10 sec: 3273.5, 60 sec: 3414.0, 300 sec: 3457.4). Total num frames: 3518464. Throughput: 0: 848.4. Samples: 879104. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:10:48,850][03043] Avg episode reward: [(0, '21.361')] |
|
[2025-04-29 03:10:53,864][03043] Fps is (10 sec: 3279.5, 60 sec: 3412.6, 300 sec: 3443.2). Total num frames: 3534848. Throughput: 0: 857.9. Samples: 884512. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:10:53,865][03043] Avg episode reward: [(0, '22.224')] |
|
[2025-04-29 03:10:53,959][03043] Saving new best policy, reward=22.224! |
|
[2025-04-29 03:10:58,863][03043] Fps is (10 sec: 3272.1, 60 sec: 3412.3, 300 sec: 3443.4). Total num frames: 3551232. Throughput: 0: 847.0. Samples: 886688. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:10:58,864][03043] Avg episode reward: [(0, '21.965')] |
|
[2025-04-29 03:11:03,840][03043] Fps is (10 sec: 3695.3, 60 sec: 3414.0, 300 sec: 3457.3). Total num frames: 3571712. Throughput: 0: 857.1. Samples: 892064. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:11:03,841][03043] Avg episode reward: [(0, '21.558')] |
|
[2025-04-29 03:11:08,837][03043] Fps is (10 sec: 3696.0, 60 sec: 3413.5, 300 sec: 3457.4). Total num frames: 3588096. Throughput: 0: 852.6. Samples: 897088. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:11:08,839][03043] Avg episode reward: [(0, '21.412')] |
|
[2025-04-29 03:11:13,842][03043] Fps is (10 sec: 3276.1, 60 sec: 3414.6, 300 sec: 3457.3). Total num frames: 3604480. Throughput: 0: 850.9. Samples: 899648. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:11:13,843][03043] Avg episode reward: [(0, '22.242')] |
|
[2025-04-29 03:11:13,880][03043] Saving new best policy, reward=22.242! |
|
[2025-04-29 03:11:18,934][03043] Fps is (10 sec: 3245.3, 60 sec: 3408.7, 300 sec: 3442.4). Total num frames: 3620864. Throughput: 0: 862.1. Samples: 905120. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:11:18,936][03043] Avg episode reward: [(0, '19.857')] |
|
[2025-04-29 03:11:18,952][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000885_3624960.pth... |
|
[2025-04-29 03:11:19,000][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000784_3211264.pth |
|
[2025-04-29 03:11:23,865][03043] Fps is (10 sec: 3269.2, 60 sec: 3411.7, 300 sec: 3443.3). Total num frames: 3637248. Throughput: 0: 850.7. Samples: 909984. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:11:23,866][03043] Avg episode reward: [(0, '18.959')] |
|
[2025-04-29 03:11:28,843][03043] Fps is (10 sec: 3720.3, 60 sec: 3414.6, 300 sec: 3457.2). Total num frames: 3657728. Throughput: 0: 850.3. Samples: 912544. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:11:28,844][03043] Avg episode reward: [(0, '19.465')] |
|
[2025-04-29 03:11:33,839][03043] Fps is (10 sec: 3696.1, 60 sec: 3415.2, 300 sec: 3457.8). Total num frames: 3674112. Throughput: 0: 864.2. Samples: 917984. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:11:33,847][03043] Avg episode reward: [(0, '18.804')] |
|
[2025-04-29 03:11:38,857][03043] Fps is (10 sec: 3272.3, 60 sec: 3412.3, 300 sec: 3457.4). Total num frames: 3690496. Throughput: 0: 849.9. Samples: 922752. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:11:38,862][03043] Avg episode reward: [(0, '18.268')] |
|
[2025-04-29 03:11:43,865][03043] Fps is (10 sec: 3268.3, 60 sec: 3413.7, 300 sec: 3443.4). Total num frames: 3706880. Throughput: 0: 860.4. Samples: 925408. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:11:43,866][03043] Avg episode reward: [(0, '19.235')] |
|
[2025-04-29 03:11:48,867][03043] Fps is (10 sec: 3273.6, 60 sec: 3412.3, 300 sec: 3443.4). Total num frames: 3723264. Throughput: 0: 851.4. Samples: 930400. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:11:48,868][03043] Avg episode reward: [(0, '20.179')] |
|
[2025-04-29 03:11:53,862][03043] Fps is (10 sec: 3278.0, 60 sec: 3413.5, 300 sec: 3443.3). Total num frames: 3739648. Throughput: 0: 856.4. Samples: 935648. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:11:53,863][03043] Avg episode reward: [(0, '18.748')] |
|
[2025-04-29 03:11:58,840][03043] Fps is (10 sec: 3696.6, 60 sec: 3483.0, 300 sec: 3457.6). Total num frames: 3760128. Throughput: 0: 857.6. Samples: 938240. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:11:58,841][03043] Avg episode reward: [(0, '18.126')] |
|
[2025-04-29 03:12:03,893][03043] Fps is (10 sec: 3674.9, 60 sec: 3410.3, 300 sec: 3456.8). Total num frames: 3776512. Throughput: 0: 843.4. Samples: 943040. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:12:03,894][03043] Avg episode reward: [(0, '17.384')] |
|
[2025-04-29 03:12:08,855][03043] Fps is (10 sec: 3271.7, 60 sec: 3412.3, 300 sec: 3443.3). Total num frames: 3792896. Throughput: 0: 855.7. Samples: 948480. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:12:08,856][03043] Avg episode reward: [(0, '16.848')] |
|
[2025-04-29 03:12:13,850][03043] Fps is (10 sec: 3290.9, 60 sec: 3412.9, 300 sec: 3443.3). Total num frames: 3809280. Throughput: 0: 857.5. Samples: 951136. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:12:13,852][03043] Avg episode reward: [(0, '16.980')] |
|
[2025-04-29 03:12:18,865][03043] Fps is (10 sec: 3273.7, 60 sec: 3417.3, 300 sec: 3430.1). Total num frames: 3825664. Throughput: 0: 844.3. Samples: 956000. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:12:18,866][03043] Avg episode reward: [(0, '18.963')] |
|
[2025-04-29 03:12:18,876][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000934_3825664.pth... |
|
[2025-04-29 03:12:18,925][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000834_3416064.pth |
|
[2025-04-29 03:12:23,841][03043] Fps is (10 sec: 3279.9, 60 sec: 3414.7, 300 sec: 3429.8). Total num frames: 3842048. Throughput: 0: 856.5. Samples: 961280. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:12:23,842][03043] Avg episode reward: [(0, '19.760')] |
|
[2025-04-29 03:12:28,866][03043] Fps is (10 sec: 3276.2, 60 sec: 3343.8, 300 sec: 3429.5). Total num frames: 3858432. Throughput: 0: 847.6. Samples: 963552. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:12:28,867][03043] Avg episode reward: [(0, '20.397')] |
|
[2025-04-29 03:12:33,865][03043] Fps is (10 sec: 3677.5, 60 sec: 3411.9, 300 sec: 3429.5). Total num frames: 3878912. Throughput: 0: 853.4. Samples: 968800. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:12:33,866][03043] Avg episode reward: [(0, '20.135')] |
|
[2025-04-29 03:12:38,849][03043] Fps is (10 sec: 3692.7, 60 sec: 3413.8, 300 sec: 3429.6). Total num frames: 3895296. Throughput: 0: 855.0. Samples: 974112. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:12:38,854][03043] Avg episode reward: [(0, '20.268')] |
|
[2025-04-29 03:12:43,839][03043] Fps is (10 sec: 3285.2, 60 sec: 3414.8, 300 sec: 3416.0). Total num frames: 3911680. Throughput: 0: 855.5. Samples: 976736. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:12:43,840][03043] Avg episode reward: [(0, '19.507')] |
|
[2025-04-29 03:12:48,858][03043] Fps is (10 sec: 3683.1, 60 sec: 3482.1, 300 sec: 3429.5). Total num frames: 3932160. Throughput: 0: 877.5. Samples: 982496. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:12:48,859][03043] Avg episode reward: [(0, '18.906')] |
|
[2025-04-29 03:12:53,836][03043] Fps is (10 sec: 3687.7, 60 sec: 3483.1, 300 sec: 3429.6). Total num frames: 3948544. Throughput: 0: 871.5. Samples: 987680. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:12:53,837][03043] Avg episode reward: [(0, '19.240')] |
|
[2025-04-29 03:12:58,848][03043] Fps is (10 sec: 3690.3, 60 sec: 3481.1, 300 sec: 3443.4). Total num frames: 3969024. Throughput: 0: 875.4. Samples: 990528. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:12:58,849][03043] Avg episode reward: [(0, '21.446')] |
|
[2025-04-29 03:13:03,869][03043] Fps is (10 sec: 3674.3, 60 sec: 3483.0, 300 sec: 3443.3). Total num frames: 3985408. Throughput: 0: 895.2. Samples: 996288. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:13:03,870][03043] Avg episode reward: [(0, '23.163')] |
|
[2025-04-29 03:13:03,915][03043] Saving new best policy, reward=23.163! |
|
[2025-04-29 03:13:08,863][03043] Fps is (10 sec: 3271.6, 60 sec: 3481.1, 300 sec: 3443.2). Total num frames: 4001792. Throughput: 0: 885.6. Samples: 1001152. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:13:08,864][03043] Avg episode reward: [(0, '23.780')] |
|
[2025-04-29 03:13:08,878][03043] Saving new best policy, reward=23.780! |
|
[2025-04-29 03:13:13,841][03043] Fps is (10 sec: 3696.4, 60 sec: 3550.4, 300 sec: 3443.7). Total num frames: 4022272. Throughput: 0: 895.1. Samples: 1003808. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:13:13,842][03043] Avg episode reward: [(0, '24.047')] |
|
[2025-04-29 03:13:13,881][03043] Saving new best policy, reward=24.047! |
|
[2025-04-29 03:13:18,867][03043] Fps is (10 sec: 3685.1, 60 sec: 3549.7, 300 sec: 3457.0). Total num frames: 4038656. Throughput: 0: 892.4. Samples: 1008960. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:13:18,868][03043] Avg episode reward: [(0, '23.141')] |
|
[2025-04-29 03:13:18,909][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000986_4038656.pth... |
|
[2025-04-29 03:13:18,968][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000885_3624960.pth |
|
[2025-04-29 03:13:23,863][03043] Fps is (10 sec: 3269.7, 60 sec: 3548.5, 300 sec: 3443.5). Total num frames: 4055040. Throughput: 0: 895.7. Samples: 1014432. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:13:23,864][03043] Avg episode reward: [(0, '21.992')] |
|
[2025-04-29 03:13:28,854][03043] Fps is (10 sec: 3691.4, 60 sec: 3618.9, 300 sec: 3457.2). Total num frames: 4075520. Throughput: 0: 901.4. Samples: 1017312. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:13:28,855][03043] Avg episode reward: [(0, '19.970')] |
|
[2025-04-29 03:13:33,859][03043] Fps is (10 sec: 3687.8, 60 sec: 3550.2, 300 sec: 3457.1). Total num frames: 4091904. Throughput: 0: 889.6. Samples: 1022528. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:13:33,860][03043] Avg episode reward: [(0, '19.426')] |
|
[2025-04-29 03:13:38,852][03043] Fps is (10 sec: 3686.8, 60 sec: 3617.9, 300 sec: 3457.2). Total num frames: 4112384. Throughput: 0: 902.1. Samples: 1028288. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:13:38,853][03043] Avg episode reward: [(0, '18.618')] |
|
[2025-04-29 03:13:43,866][03043] Fps is (10 sec: 3684.0, 60 sec: 3616.5, 300 sec: 3471.2). Total num frames: 4128768. Throughput: 0: 901.3. Samples: 1031104. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:13:43,867][03043] Avg episode reward: [(0, '19.045')] |
|
[2025-04-29 03:13:48,864][03043] Fps is (10 sec: 3682.1, 60 sec: 3617.8, 300 sec: 3471.0). Total num frames: 4149248. Throughput: 0: 888.3. Samples: 1036256. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:13:48,865][03043] Avg episode reward: [(0, '19.682')] |
|
[2025-04-29 03:13:53,849][03043] Fps is (10 sec: 3692.6, 60 sec: 3617.3, 300 sec: 3471.2). Total num frames: 4165632. Throughput: 0: 909.1. Samples: 1042048. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:13:53,850][03043] Avg episode reward: [(0, '19.233')] |
|
[2025-04-29 03:13:58,912][03043] Fps is (10 sec: 3668.7, 60 sec: 3614.2, 300 sec: 3484.3). Total num frames: 4186112. Throughput: 0: 903.1. Samples: 1044512. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:13:58,913][03043] Avg episode reward: [(0, '18.692')] |
|
[2025-04-29 03:14:03,854][03043] Fps is (10 sec: 3684.7, 60 sec: 3619.0, 300 sec: 3471.2). Total num frames: 4202496. Throughput: 0: 917.6. Samples: 1050240. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:14:03,855][03043] Avg episode reward: [(0, '18.976')] |
|
[2025-04-29 03:14:08,858][03043] Fps is (10 sec: 3706.6, 60 sec: 3686.8, 300 sec: 3499.2). Total num frames: 4222976. Throughput: 0: 916.7. Samples: 1055680. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:14:08,861][03043] Avg episode reward: [(0, '19.534')] |
|
[2025-04-29 03:14:13,848][03043] Fps is (10 sec: 3688.6, 60 sec: 3617.8, 300 sec: 3485.1). Total num frames: 4239360. Throughput: 0: 908.2. Samples: 1058176. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:14:13,849][03043] Avg episode reward: [(0, '19.446')] |
|
[2025-04-29 03:14:18,856][03043] Fps is (10 sec: 3687.1, 60 sec: 3687.1, 300 sec: 3499.0). Total num frames: 4259840. Throughput: 0: 919.5. Samples: 1063904. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:14:18,857][03043] Avg episode reward: [(0, '19.382')] |
|
[2025-04-29 03:14:18,895][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001040_4259840.pth... |
|
[2025-04-29 03:14:18,945][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000934_3825664.pth |
|
[2025-04-29 03:14:23,856][03043] Fps is (10 sec: 3683.3, 60 sec: 3686.8, 300 sec: 3498.9). Total num frames: 4276224. Throughput: 0: 905.2. Samples: 1069024. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:14:23,857][03043] Avg episode reward: [(0, '18.064')] |
|
[2025-04-29 03:14:28,858][03043] Fps is (10 sec: 3276.2, 60 sec: 3617.9, 300 sec: 3499.0). Total num frames: 4292608. Throughput: 0: 906.8. Samples: 1071904. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:14:28,859][03043] Avg episode reward: [(0, '18.111')] |
|
[2025-04-29 03:14:33,871][03043] Fps is (10 sec: 3680.8, 60 sec: 3685.7, 300 sec: 3512.9). Total num frames: 4313088. Throughput: 0: 919.3. Samples: 1077632. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:14:33,876][03043] Avg episode reward: [(0, '17.152')] |
|
[2025-04-29 03:14:38,843][03043] Fps is (10 sec: 3692.0, 60 sec: 3618.7, 300 sec: 3499.0). Total num frames: 4329472. Throughput: 0: 905.4. Samples: 1082784. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:14:38,844][03043] Avg episode reward: [(0, '17.913')] |
|
[2025-04-29 03:14:43,865][03043] Fps is (10 sec: 3688.8, 60 sec: 3686.5, 300 sec: 3512.8). Total num frames: 4349952. Throughput: 0: 916.2. Samples: 1085696. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:14:43,866][03043] Avg episode reward: [(0, '18.416')] |
|
[2025-04-29 03:14:48,849][03043] Fps is (10 sec: 3683.9, 60 sec: 3619.0, 300 sec: 3512.9). Total num frames: 4366336. Throughput: 0: 907.5. Samples: 1091072. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:14:48,850][03043] Avg episode reward: [(0, '19.061')] |
|
[2025-04-29 03:14:53,862][03043] Fps is (10 sec: 3687.3, 60 sec: 3685.6, 300 sec: 3526.5). Total num frames: 4386816. Throughput: 0: 910.1. Samples: 1096640. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:14:53,863][03043] Avg episode reward: [(0, '20.494')] |
|
[2025-04-29 03:14:58,855][03043] Fps is (10 sec: 3684.4, 60 sec: 3621.6, 300 sec: 3512.8). Total num frames: 4403200. Throughput: 0: 918.6. Samples: 1099520. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:14:58,856][03043] Avg episode reward: [(0, '21.857')] |
|
[2025-04-29 03:15:03,857][03043] Fps is (10 sec: 3278.5, 60 sec: 3617.9, 300 sec: 3512.6). Total num frames: 4419584. Throughput: 0: 906.6. Samples: 1104704. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:15:03,858][03043] Avg episode reward: [(0, '19.388')] |
|
[2025-04-29 03:15:08,850][03043] Fps is (10 sec: 3688.3, 60 sec: 3618.6, 300 sec: 3526.9). Total num frames: 4440064. Throughput: 0: 921.0. Samples: 1110464. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:15:08,851][03043] Avg episode reward: [(0, '20.148')] |
|
[2025-04-29 03:15:13,850][03043] Fps is (10 sec: 3689.1, 60 sec: 3618.0, 300 sec: 3526.8). Total num frames: 4456448. Throughput: 0: 919.6. Samples: 1113280. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:15:13,852][03043] Avg episode reward: [(0, '21.178')] |
|
[2025-04-29 03:15:18,862][03043] Fps is (10 sec: 3681.9, 60 sec: 3617.8, 300 sec: 3540.3). Total num frames: 4476928. Throughput: 0: 909.0. Samples: 1118528. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:15:18,863][03043] Avg episode reward: [(0, '20.444')] |
|
[2025-04-29 03:15:18,902][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001093_4476928.pth... |
|
[2025-04-29 03:15:18,953][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000000986_4038656.pth |
|
[2025-04-29 03:15:23,838][03043] Fps is (10 sec: 4101.0, 60 sec: 3687.5, 300 sec: 3540.9). Total num frames: 4497408. Throughput: 0: 925.3. Samples: 1124416. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:15:23,839][03043] Avg episode reward: [(0, '20.014')] |
|
[2025-04-29 03:15:28,845][03043] Fps is (10 sec: 3692.6, 60 sec: 3687.2, 300 sec: 3540.9). Total num frames: 4513792. Throughput: 0: 912.8. Samples: 1126752. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:15:28,846][03043] Avg episode reward: [(0, '20.282')] |
|
[2025-04-29 03:15:33,897][03043] Fps is (10 sec: 3664.7, 60 sec: 3684.8, 300 sec: 3553.8). Total num frames: 4534272. Throughput: 0: 917.8. Samples: 1132416. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:15:33,898][03043] Avg episode reward: [(0, '20.250')] |
|
[2025-04-29 03:15:38,841][03043] Fps is (10 sec: 3688.1, 60 sec: 3686.5, 300 sec: 3554.9). Total num frames: 4550656. Throughput: 0: 916.4. Samples: 1137856. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:15:38,847][03043] Avg episode reward: [(0, '19.399')] |
|
[2025-04-29 03:15:43,837][03043] Fps is (10 sec: 3296.6, 60 sec: 3619.8, 300 sec: 3554.6). Total num frames: 4567040. Throughput: 0: 908.5. Samples: 1140384. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:15:43,838][03043] Avg episode reward: [(0, '20.252')] |
|
[2025-04-29 03:15:48,844][03043] Fps is (10 sec: 3685.2, 60 sec: 3686.7, 300 sec: 3568.6). Total num frames: 4587520. Throughput: 0: 916.9. Samples: 1145952. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:15:48,845][03043] Avg episode reward: [(0, '22.020')] |
|
[2025-04-29 03:15:53,862][03043] Fps is (10 sec: 3677.1, 60 sec: 3618.2, 300 sec: 3568.4). Total num frames: 4603904. Throughput: 0: 902.9. Samples: 1151104. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:15:53,863][03043] Avg episode reward: [(0, '23.648')] |
|
[2025-04-29 03:15:58,877][03043] Fps is (10 sec: 3674.2, 60 sec: 3685.0, 300 sec: 3567.9). Total num frames: 4624384. Throughput: 0: 905.4. Samples: 1154048. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:15:58,884][03043] Avg episode reward: [(0, '23.523')] |
|
[2025-04-29 03:16:03,862][03043] Fps is (10 sec: 3686.3, 60 sec: 3686.1, 300 sec: 3568.1). Total num frames: 4640768. Throughput: 0: 916.6. Samples: 1159776. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:16:03,866][03043] Avg episode reward: [(0, '22.953')] |
|
[2025-04-29 03:16:08,860][03043] Fps is (10 sec: 3282.4, 60 sec: 3617.5, 300 sec: 3568.2). Total num frames: 4657152. Throughput: 0: 899.8. Samples: 1164928. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:16:08,861][03043] Avg episode reward: [(0, '24.399')] |
|
[2025-04-29 03:16:08,898][03043] Saving new best policy, reward=24.399! |
|
[2025-04-29 03:16:13,848][03043] Fps is (10 sec: 3691.5, 60 sec: 3686.5, 300 sec: 3583.3). Total num frames: 4677632. Throughput: 0: 909.5. Samples: 1167680. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:16:13,850][03043] Avg episode reward: [(0, '23.314')] |
|
[2025-04-29 03:16:18,866][03043] Fps is (10 sec: 3684.2, 60 sec: 3617.9, 300 sec: 3582.3). Total num frames: 4694016. Throughput: 0: 904.4. Samples: 1173088. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:16:18,867][03043] Avg episode reward: [(0, '22.344')] |
|
[2025-04-29 03:16:18,879][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001146_4694016.pth... |
|
[2025-04-29 03:16:18,928][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001040_4259840.pth |
|
[2025-04-29 03:16:23,850][03043] Fps is (10 sec: 3685.9, 60 sec: 3617.4, 300 sec: 3582.2). Total num frames: 4714496. Throughput: 0: 903.6. Samples: 1178528. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:16:23,851][03043] Avg episode reward: [(0, '22.328')] |
|
[2025-04-29 03:16:28,867][03043] Fps is (10 sec: 3686.2, 60 sec: 3616.9, 300 sec: 3581.9). Total num frames: 4730880. Throughput: 0: 911.0. Samples: 1181408. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:16:28,871][03043] Avg episode reward: [(0, '23.539')] |
|
[2025-04-29 03:16:33,838][03043] Fps is (10 sec: 3280.8, 60 sec: 3553.4, 300 sec: 3582.5). Total num frames: 4747264. Throughput: 0: 902.5. Samples: 1186560. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:16:33,839][03043] Avg episode reward: [(0, '23.154')] |
|
[2025-04-29 03:16:38,854][03043] Fps is (10 sec: 3691.2, 60 sec: 3617.3, 300 sec: 3596.3). Total num frames: 4767744. Throughput: 0: 918.9. Samples: 1192448. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:16:38,855][03043] Avg episode reward: [(0, '22.840')] |
|
[2025-04-29 03:16:43,934][03043] Fps is (10 sec: 4057.0, 60 sec: 3680.5, 300 sec: 3609.2). Total num frames: 4788224. Throughput: 0: 916.2. Samples: 1195328. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:16:43,936][03043] Avg episode reward: [(0, '21.848')] |
|
[2025-04-29 03:16:48,849][03043] Fps is (10 sec: 3688.2, 60 sec: 3617.8, 300 sec: 3610.2). Total num frames: 4804608. Throughput: 0: 903.4. Samples: 1200416. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:16:48,850][03043] Avg episode reward: [(0, '22.035')] |
|
[2025-04-29 03:16:53,851][03043] Fps is (10 sec: 3304.1, 60 sec: 3618.8, 300 sec: 3596.0). Total num frames: 4820992. Throughput: 0: 911.1. Samples: 1205920. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:16:53,852][03043] Avg episode reward: [(0, '22.615')] |
|
[2025-04-29 03:16:58,865][03043] Fps is (10 sec: 3271.6, 60 sec: 3550.6, 300 sec: 3596.5). Total num frames: 4837376. Throughput: 0: 898.5. Samples: 1208128. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:16:58,866][03043] Avg episode reward: [(0, '22.510')] |
|
[2025-04-29 03:17:03,852][03043] Fps is (10 sec: 3686.1, 60 sec: 3618.8, 300 sec: 3610.1). Total num frames: 4857856. Throughput: 0: 898.4. Samples: 1213504. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:17:03,853][03043] Avg episode reward: [(0, '22.027')] |
|
[2025-04-29 03:17:08,863][03043] Fps is (10 sec: 3686.9, 60 sec: 3617.9, 300 sec: 3609.9). Total num frames: 4874240. Throughput: 0: 890.8. Samples: 1218624. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:17:08,868][03043] Avg episode reward: [(0, '22.771')] |
|
[2025-04-29 03:17:13,858][03043] Fps is (10 sec: 3275.0, 60 sec: 3549.3, 300 sec: 3610.1). Total num frames: 4890624. Throughput: 0: 878.4. Samples: 1220928. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:17:13,859][03043] Avg episode reward: [(0, '23.357')] |
|
[2025-04-29 03:17:18,837][03043] Fps is (10 sec: 3285.5, 60 sec: 3551.6, 300 sec: 3610.1). Total num frames: 4907008. Throughput: 0: 886.8. Samples: 1226464. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:17:18,838][03043] Avg episode reward: [(0, '22.743')] |
|
[2025-04-29 03:17:18,879][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001198_4907008.pth... |
|
[2025-04-29 03:17:18,940][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001093_4476928.pth |
|
[2025-04-29 03:17:23,865][03043] Fps is (10 sec: 3274.5, 60 sec: 3480.7, 300 sec: 3610.1). Total num frames: 4923392. Throughput: 0: 858.8. Samples: 1231104. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:17:23,866][03043] Avg episode reward: [(0, '22.197')] |
|
[2025-04-29 03:17:28,862][03043] Fps is (10 sec: 3268.6, 60 sec: 3481.9, 300 sec: 3596.2). Total num frames: 4939776. Throughput: 0: 858.3. Samples: 1233888. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:17:28,864][03043] Avg episode reward: [(0, '21.506')] |
|
[2025-04-29 03:17:33,850][03043] Fps is (10 sec: 3691.7, 60 sec: 3549.1, 300 sec: 3610.0). Total num frames: 4960256. Throughput: 0: 862.6. Samples: 1239232. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:17:33,851][03043] Avg episode reward: [(0, '21.311')] |
|
[2025-04-29 03:17:38,836][03043] Fps is (10 sec: 3696.1, 60 sec: 3482.6, 300 sec: 3610.1). Total num frames: 4976640. Throughput: 0: 851.5. Samples: 1244224. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:17:38,837][03043] Avg episode reward: [(0, '21.920')] |
|
[2025-04-29 03:17:43,851][03043] Fps is (10 sec: 3276.4, 60 sec: 3418.0, 300 sec: 3596.2). Total num frames: 4993024. Throughput: 0: 860.7. Samples: 1246848. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:17:43,852][03043] Avg episode reward: [(0, '20.296')] |
|
[2025-04-29 03:17:48,867][03043] Fps is (10 sec: 3266.5, 60 sec: 3412.3, 300 sec: 3595.8). Total num frames: 5009408. Throughput: 0: 853.8. Samples: 1251936. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:17:48,868][03043] Avg episode reward: [(0, '19.158')] |
|
[2025-04-29 03:17:53,843][03043] Fps is (10 sec: 3279.4, 60 sec: 3413.8, 300 sec: 3582.3). Total num frames: 5025792. Throughput: 0: 856.6. Samples: 1257152. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:17:53,844][03043] Avg episode reward: [(0, '20.048')] |
|
[2025-04-29 03:17:58,877][03043] Fps is (10 sec: 3683.0, 60 sec: 3480.9, 300 sec: 3596.1). Total num frames: 5046272. Throughput: 0: 861.5. Samples: 1259712. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:17:58,881][03043] Avg episode reward: [(0, '21.344')] |
|
[2025-04-29 03:18:03,876][03043] Fps is (10 sec: 3674.3, 60 sec: 3411.9, 300 sec: 3596.0). Total num frames: 5062656. Throughput: 0: 847.6. Samples: 1264640. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:18:03,879][03043] Avg episode reward: [(0, '21.253')] |
|
[2025-04-29 03:18:08,837][03043] Fps is (10 sec: 3289.9, 60 sec: 3414.8, 300 sec: 3582.3). Total num frames: 5079040. Throughput: 0: 868.1. Samples: 1270144. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:18:08,838][03043] Avg episode reward: [(0, '19.966')] |
|
[2025-04-29 03:18:13,859][03043] Fps is (10 sec: 3282.5, 60 sec: 3413.3, 300 sec: 3582.4). Total num frames: 5095424. Throughput: 0: 866.2. Samples: 1272864. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:18:13,864][03043] Avg episode reward: [(0, '20.434')] |
|
[2025-04-29 03:18:18,848][03043] Fps is (10 sec: 3273.2, 60 sec: 3412.7, 300 sec: 3582.4). Total num frames: 5111808. Throughput: 0: 852.0. Samples: 1277568. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:18:18,849][03043] Avg episode reward: [(0, '20.509')] |
|
[2025-04-29 03:18:18,888][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001248_5111808.pth... |
|
[2025-04-29 03:18:18,937][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001146_4694016.pth |
|
[2025-04-29 03:18:23,840][03043] Fps is (10 sec: 3283.0, 60 sec: 3414.7, 300 sec: 3568.5). Total num frames: 5128192. Throughput: 0: 854.0. Samples: 1282656. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:18:23,841][03043] Avg episode reward: [(0, '19.494')] |
|
[2025-04-29 03:18:28,859][03043] Fps is (10 sec: 3273.1, 60 sec: 3413.5, 300 sec: 3568.4). Total num frames: 5144576. Throughput: 0: 845.4. Samples: 1284896. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:18:28,860][03043] Avg episode reward: [(0, '19.555')] |
|
[2025-04-29 03:18:33,895][03043] Fps is (10 sec: 3258.8, 60 sec: 3342.6, 300 sec: 3554.0). Total num frames: 5160960. Throughput: 0: 850.0. Samples: 1290208. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:18:33,896][03043] Avg episode reward: [(0, '19.945')] |
|
[2025-04-29 03:18:38,845][03043] Fps is (10 sec: 3691.6, 60 sec: 3412.8, 300 sec: 3568.6). Total num frames: 5181440. Throughput: 0: 849.0. Samples: 1295360. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:18:38,847][03043] Avg episode reward: [(0, '21.024')] |
|
[2025-04-29 03:18:43,862][03043] Fps is (10 sec: 3698.6, 60 sec: 3412.7, 300 sec: 3554.5). Total num frames: 5197824. Throughput: 0: 845.8. Samples: 1297760. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:18:43,863][03043] Avg episode reward: [(0, '20.498')] |
|
[2025-04-29 03:18:48,855][03043] Fps is (10 sec: 3273.5, 60 sec: 3414.0, 300 sec: 3554.4). Total num frames: 5214208. Throughput: 0: 856.6. Samples: 1303168. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:18:48,856][03043] Avg episode reward: [(0, '19.643')] |
|
[2025-04-29 03:18:53,849][03043] Fps is (10 sec: 3281.1, 60 sec: 3413.0, 300 sec: 3541.4). Total num frames: 5230592. Throughput: 0: 839.6. Samples: 1307936. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:18:53,850][03043] Avg episode reward: [(0, '20.056')] |
|
[2025-04-29 03:18:58,875][03043] Fps is (10 sec: 3270.4, 60 sec: 3345.2, 300 sec: 3540.4). Total num frames: 5246976. Throughput: 0: 842.4. Samples: 1310784. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:18:58,876][03043] Avg episode reward: [(0, '18.755')] |
|
[2025-04-29 03:19:03,875][03043] Fps is (10 sec: 3676.9, 60 sec: 3413.4, 300 sec: 3540.4). Total num frames: 5267456. Throughput: 0: 855.7. Samples: 1316096. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:19:03,881][03043] Avg episode reward: [(0, '17.993')] |
|
[2025-04-29 03:19:08,859][03043] Fps is (10 sec: 3692.3, 60 sec: 3412.1, 300 sec: 3540.5). Total num frames: 5283840. Throughput: 0: 849.4. Samples: 1320896. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:19:08,860][03043] Avg episode reward: [(0, '18.751')] |
|
[2025-04-29 03:19:13,865][03043] Fps is (10 sec: 3280.2, 60 sec: 3413.0, 300 sec: 3526.6). Total num frames: 5300224. Throughput: 0: 861.0. Samples: 1323648. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:19:13,866][03043] Avg episode reward: [(0, '18.649')] |
|
[2025-04-29 03:19:18,864][03043] Fps is (10 sec: 3275.0, 60 sec: 3412.4, 300 sec: 3526.6). Total num frames: 5316608. Throughput: 0: 854.6. Samples: 1328640. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:19:18,865][03043] Avg episode reward: [(0, '20.423')] |
|
[2025-04-29 03:19:18,903][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001298_5316608.pth... |
|
[2025-04-29 03:19:18,959][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001198_4907008.pth |
|
[2025-04-29 03:19:23,854][03043] Fps is (10 sec: 3280.4, 60 sec: 3412.5, 300 sec: 3526.8). Total num frames: 5332992. Throughput: 0: 856.0. Samples: 1333888. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:19:23,855][03043] Avg episode reward: [(0, '20.446')] |
|
[2025-04-29 03:19:28,857][03043] Fps is (10 sec: 3689.2, 60 sec: 3481.7, 300 sec: 3526.9). Total num frames: 5353472. Throughput: 0: 863.4. Samples: 1336608. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:19:28,858][03043] Avg episode reward: [(0, '21.230')] |
|
[2025-04-29 03:19:33,862][03043] Fps is (10 sec: 3683.5, 60 sec: 3483.5, 300 sec: 3526.5). Total num frames: 5369856. Throughput: 0: 850.4. Samples: 1341440. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:19:33,863][03043] Avg episode reward: [(0, '22.626')] |
|
[2025-04-29 03:19:38,844][03043] Fps is (10 sec: 3281.1, 60 sec: 3413.4, 300 sec: 3513.1). Total num frames: 5386240. Throughput: 0: 864.1. Samples: 1346816. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:19:38,845][03043] Avg episode reward: [(0, '21.916')] |
|
[2025-04-29 03:19:43,864][03043] Fps is (10 sec: 3276.2, 60 sec: 3413.3, 300 sec: 3512.7). Total num frames: 5402624. Throughput: 0: 862.8. Samples: 1349600. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:19:43,868][03043] Avg episode reward: [(0, '22.305')] |
|
[2025-04-29 03:19:48,847][03043] Fps is (10 sec: 3275.6, 60 sec: 3413.8, 300 sec: 3499.1). Total num frames: 5419008. Throughput: 0: 851.7. Samples: 1354400. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:19:48,848][03043] Avg episode reward: [(0, '21.785')] |
|
[2025-04-29 03:19:53,889][03043] Fps is (10 sec: 3677.3, 60 sec: 3479.3, 300 sec: 3512.4). Total num frames: 5439488. Throughput: 0: 862.7. Samples: 1359744. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:19:53,890][03043] Avg episode reward: [(0, '21.223')] |
|
[2025-04-29 03:19:58,859][03043] Fps is (10 sec: 3272.9, 60 sec: 3414.2, 300 sec: 3498.9). Total num frames: 5451776. Throughput: 0: 854.2. Samples: 1362080. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:19:58,860][03043] Avg episode reward: [(0, '21.420')] |
|
[2025-04-29 03:20:03,863][03043] Fps is (10 sec: 3285.1, 60 sec: 3414.0, 300 sec: 3498.8). Total num frames: 5472256. Throughput: 0: 856.2. Samples: 1367168. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:20:03,864][03043] Avg episode reward: [(0, '21.047')] |
|
[2025-04-29 03:20:08,871][03043] Fps is (10 sec: 3681.9, 60 sec: 3412.6, 300 sec: 3498.7). Total num frames: 5488640. Throughput: 0: 854.4. Samples: 1372352. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:20:08,873][03043] Avg episode reward: [(0, '23.851')] |
|
[2025-04-29 03:20:13,865][03043] Fps is (10 sec: 3276.3, 60 sec: 3413.3, 300 sec: 3485.0). Total num frames: 5505024. Throughput: 0: 847.5. Samples: 1374752. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:20:13,866][03043] Avg episode reward: [(0, '23.630')] |
|
[2025-04-29 03:20:18,842][03043] Fps is (10 sec: 3286.5, 60 sec: 3414.6, 300 sec: 3471.1). Total num frames: 5521408. Throughput: 0: 861.5. Samples: 1380192. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:20:18,843][03043] Avg episode reward: [(0, '23.601')] |
|
[2025-04-29 03:20:18,888][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001348_5521408.pth... |
|
[2025-04-29 03:20:18,935][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001248_5111808.pth |
|
[2025-04-29 03:20:23,847][03043] Fps is (10 sec: 3282.6, 60 sec: 3413.7, 300 sec: 3471.2). Total num frames: 5537792. Throughput: 0: 846.2. Samples: 1384896. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:20:23,849][03043] Avg episode reward: [(0, '23.898')] |
|
[2025-04-29 03:20:28,859][03043] Fps is (10 sec: 3680.1, 60 sec: 3413.2, 300 sec: 3471.6). Total num frames: 5558272. Throughput: 0: 844.9. Samples: 1387616. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:20:28,860][03043] Avg episode reward: [(0, '22.446')] |
|
[2025-04-29 03:20:33,844][03043] Fps is (10 sec: 3687.7, 60 sec: 3414.4, 300 sec: 3471.2). Total num frames: 5574656. Throughput: 0: 857.7. Samples: 1392992. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:20:33,848][03043] Avg episode reward: [(0, '20.841')] |
|
[2025-04-29 03:20:38,857][03043] Fps is (10 sec: 3277.5, 60 sec: 3412.6, 300 sec: 3470.9). Total num frames: 5591040. Throughput: 0: 847.5. Samples: 1397856. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:20:38,858][03043] Avg episode reward: [(0, '19.701')] |
|
[2025-04-29 03:20:43,856][03043] Fps is (10 sec: 3272.7, 60 sec: 3413.8, 300 sec: 3457.2). Total num frames: 5607424. Throughput: 0: 856.9. Samples: 1400640. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:20:43,857][03043] Avg episode reward: [(0, '18.498')] |
|
[2025-04-29 03:20:48,886][03043] Fps is (10 sec: 3267.5, 60 sec: 3411.2, 300 sec: 3457.0). Total num frames: 5623808. Throughput: 0: 859.3. Samples: 1405856. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:20:48,887][03043] Avg episode reward: [(0, '17.218')] |
|
[2025-04-29 03:20:53,894][03043] Fps is (10 sec: 3672.6, 60 sec: 3413.0, 300 sec: 3457.1). Total num frames: 5644288. Throughput: 0: 857.2. Samples: 1410944. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:20:53,895][03043] Avg episode reward: [(0, '18.373')] |
|
[2025-04-29 03:20:58,839][03043] Fps is (10 sec: 3703.6, 60 sec: 3482.8, 300 sec: 3457.6). Total num frames: 5660672. Throughput: 0: 863.1. Samples: 1413568. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:20:58,840][03043] Avg episode reward: [(0, '19.544')] |
|
[2025-04-29 03:21:03,855][03043] Fps is (10 sec: 3289.6, 60 sec: 3413.8, 300 sec: 3457.4). Total num frames: 5677056. Throughput: 0: 848.8. Samples: 1418400. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:21:03,856][03043] Avg episode reward: [(0, '20.798')] |
|
[2025-04-29 03:21:08,865][03043] Fps is (10 sec: 3268.2, 60 sec: 3413.7, 300 sec: 3443.2). Total num frames: 5693440. Throughput: 0: 866.5. Samples: 1423904. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:21:08,867][03043] Avg episode reward: [(0, '21.197')] |
|
[2025-04-29 03:21:13,864][03043] Fps is (10 sec: 3273.7, 60 sec: 3413.4, 300 sec: 3443.4). Total num frames: 5709824. Throughput: 0: 865.3. Samples: 1426560. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:21:13,866][03043] Avg episode reward: [(0, '21.389')] |
|
[2025-04-29 03:21:18,882][03043] Fps is (10 sec: 3680.1, 60 sec: 3479.3, 300 sec: 3443.0). Total num frames: 5730304. Throughput: 0: 856.2. Samples: 1431552. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:21:18,883][03043] Avg episode reward: [(0, '21.815')] |
|
[2025-04-29 03:21:18,925][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001399_5730304.pth... |
|
[2025-04-29 03:21:18,982][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001298_5316608.pth |
|
[2025-04-29 03:21:23,850][03043] Fps is (10 sec: 3691.7, 60 sec: 3481.4, 300 sec: 3443.6). Total num frames: 5746688. Throughput: 0: 864.1. Samples: 1436736. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:21:23,851][03043] Avg episode reward: [(0, '21.277')] |
|
[2025-04-29 03:21:28,860][03043] Fps is (10 sec: 3284.0, 60 sec: 3413.3, 300 sec: 3443.2). Total num frames: 5763072. Throughput: 0: 854.7. Samples: 1439104. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:21:28,862][03043] Avg episode reward: [(0, '20.418')] |
|
[2025-04-29 03:21:33,864][03043] Fps is (10 sec: 3272.1, 60 sec: 3412.2, 300 sec: 3429.4). Total num frames: 5779456. Throughput: 0: 855.9. Samples: 1444352. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:21:33,865][03043] Avg episode reward: [(0, '21.048')] |
|
[2025-04-29 03:21:38,850][03043] Fps is (10 sec: 3280.2, 60 sec: 3413.7, 300 sec: 3416.6). Total num frames: 5795840. Throughput: 0: 857.0. Samples: 1449472. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:21:38,851][03043] Avg episode reward: [(0, '20.744')] |
|
[2025-04-29 03:21:43,843][03043] Fps is (10 sec: 3283.8, 60 sec: 3414.1, 300 sec: 3415.7). Total num frames: 5812224. Throughput: 0: 849.0. Samples: 1451776. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:21:43,844][03043] Avg episode reward: [(0, '21.032')] |
|
[2025-04-29 03:21:48,844][03043] Fps is (10 sec: 3688.7, 60 sec: 3484.0, 300 sec: 3429.6). Total num frames: 5832704. Throughput: 0: 862.8. Samples: 1457216. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:21:48,845][03043] Avg episode reward: [(0, '22.817')] |
|
[2025-04-29 03:21:53,846][03043] Fps is (10 sec: 3275.8, 60 sec: 3347.7, 300 sec: 3415.9). Total num frames: 5844992. Throughput: 0: 846.6. Samples: 1461984. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:21:53,847][03043] Avg episode reward: [(0, '22.910')] |
|
[2025-04-29 03:21:58,854][03043] Fps is (10 sec: 3273.6, 60 sec: 3412.5, 300 sec: 3415.6). Total num frames: 5865472. Throughput: 0: 845.7. Samples: 1464608. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:21:58,854][03043] Avg episode reward: [(0, '22.796')] |
|
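Editor's note: the "Policy #0 lag" figures summarise how many policy versions behind the learner the experience in the current batch was collected. Below is a toy reconstruction of that (min, avg, max) summary, assuming each rollout carries the version of the policy that generated it; the function name and example values are illustrative only.

```python
# Illustrative computation of the (min, avg, max) policy-lag summary in each status line.
def policy_lag_summary(current_version, batch_policy_versions):
    """batch_policy_versions: policy versions that produced each rollout in the batch."""
    lags = [current_version - v for v in batch_policy_versions]
    return {
        "min": float(min(lags)),
        "avg": sum(lags) / len(lags),
        "max": float(max(lags)),
    }

# Example: learner at version 1399, rollouts from versions 1398-1399.
print(policy_lag_summary(1399, [1399, 1399, 1398, 1399]))
# -> {'min': 0.0, 'avg': 0.25, 'max': 1.0}
```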
[2025-04-29 03:22:03,847][03043] Fps is (10 sec: 3685.9, 60 sec: 3413.8, 300 sec: 3415.8). Total num frames: 5881856. Throughput: 0: 860.4. Samples: 1470240. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:22:03,852][03043] Avg episode reward: [(0, '21.689')] |
|
[2025-04-29 03:22:08,838][03043] Fps is (10 sec: 3281.7, 60 sec: 3414.9, 300 sec: 3415.9). Total num frames: 5898240. Throughput: 0: 855.0. Samples: 1475200. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:22:08,839][03043] Avg episode reward: [(0, '21.357')] |
|
[2025-04-29 03:22:13,852][03043] Fps is (10 sec: 3684.7, 60 sec: 3482.3, 300 sec: 3429.4). Total num frames: 5918720. Throughput: 0: 861.3. Samples: 1477856. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:22:13,853][03043] Avg episode reward: [(0, '21.803')] |
|
[2025-04-29 03:22:18,932][03043] Fps is (10 sec: 3246.4, 60 sec: 3342.3, 300 sec: 3414.9). Total num frames: 5931008. Throughput: 0: 855.6. Samples: 1482912. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:22:18,933][03043] Avg episode reward: [(0, '21.135')] |
|
[2025-04-29 03:22:18,949][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001449_5935104.pth... |
|
[2025-04-29 03:22:18,995][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001348_5521408.pth |
|
[2025-04-29 03:22:23,861][03043] Fps is (10 sec: 3273.8, 60 sec: 3412.7, 300 sec: 3429.5). Total num frames: 5951488. Throughput: 0: 856.0. Samples: 1488000. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:22:23,862][03043] Avg episode reward: [(0, '21.169')] |
|
[2025-04-29 03:22:28,858][03043] Fps is (10 sec: 3714.1, 60 sec: 3413.5, 300 sec: 3415.6). Total num frames: 5967872. Throughput: 0: 864.4. Samples: 1490688. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:22:28,865][03043] Avg episode reward: [(0, '21.324')] |
|
[2025-04-29 03:22:33,858][03043] Fps is (10 sec: 3277.8, 60 sec: 3413.7, 300 sec: 3415.4). Total num frames: 5984256. Throughput: 0: 851.6. Samples: 1495552. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:22:33,859][03043] Avg episode reward: [(0, '20.744')] |
|
[2025-04-29 03:22:38,841][03043] Fps is (10 sec: 3282.2, 60 sec: 3413.8, 300 sec: 3415.8). Total num frames: 6000640. Throughput: 0: 865.5. Samples: 1500928. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:22:38,842][03043] Avg episode reward: [(0, '20.317')] |
|
[2025-04-29 03:22:43,842][03043] Fps is (10 sec: 3282.1, 60 sec: 3413.4, 300 sec: 3415.9). Total num frames: 6017024. Throughput: 0: 867.8. Samples: 1503648. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:22:43,845][03043] Avg episode reward: [(0, '21.171')] |
|
[2025-04-29 03:22:48,844][03043] Fps is (10 sec: 3685.3, 60 sec: 3413.3, 300 sec: 3429.5). Total num frames: 6037504. Throughput: 0: 849.8. Samples: 1508480. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:22:48,845][03043] Avg episode reward: [(0, '20.777')] |
|
[2025-04-29 03:22:53,865][03043] Fps is (10 sec: 3678.0, 60 sec: 3480.5, 300 sec: 3415.8). Total num frames: 6053888. Throughput: 0: 857.8. Samples: 1513824. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:22:53,866][03043] Avg episode reward: [(0, '21.017')] |
|
[2025-04-29 03:22:58,855][03043] Fps is (10 sec: 3273.4, 60 sec: 3413.3, 300 sec: 3415.9). Total num frames: 6070272. Throughput: 0: 852.6. Samples: 1516224. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:22:58,856][03043] Avg episode reward: [(0, '21.043')] |
|
[2025-04-29 03:23:03,853][03043] Fps is (10 sec: 3280.9, 60 sec: 3413.0, 300 sec: 3415.5). Total num frames: 6086656. Throughput: 0: 857.7. Samples: 1521440. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:23:03,854][03043] Avg episode reward: [(0, '21.508')] |
|
[2025-04-29 03:23:08,862][03043] Fps is (10 sec: 3274.3, 60 sec: 3412.0, 300 sec: 3415.6). Total num frames: 6103040. Throughput: 0: 857.6. Samples: 1526592. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:23:08,871][03043] Avg episode reward: [(0, '20.713')] |
|
[2025-04-29 03:23:13,922][03043] Fps is (10 sec: 3660.9, 60 sec: 3409.3, 300 sec: 3428.7). Total num frames: 6123520. Throughput: 0: 847.8. Samples: 1528896. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:23:13,923][03043] Avg episode reward: [(0, '20.623')] |
|
[2025-04-29 03:23:18,848][03043] Fps is (10 sec: 3691.7, 60 sec: 3486.5, 300 sec: 3429.4). Total num frames: 6139904. Throughput: 0: 859.2. Samples: 1534208. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:23:18,849][03043] Avg episode reward: [(0, '19.535')] |
|
[2025-04-29 03:23:18,890][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001499_6139904.pth... |
|
[2025-04-29 03:23:18,944][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001399_5730304.pth |
|
[2025-04-29 03:23:23,850][03043] Fps is (10 sec: 3300.8, 60 sec: 3414.0, 300 sec: 3429.6). Total num frames: 6156288. Throughput: 0: 848.9. Samples: 1539136. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:23:23,851][03043] Avg episode reward: [(0, '18.311')] |
|
[2025-04-29 03:23:28,867][03043] Fps is (10 sec: 3270.4, 60 sec: 3412.8, 300 sec: 3429.9). Total num frames: 6172672. Throughput: 0: 845.0. Samples: 1541696. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:23:28,869][03043] Avg episode reward: [(0, '18.149')] |
|
[2025-04-29 03:23:33,843][03043] Fps is (10 sec: 3279.1, 60 sec: 3414.2, 300 sec: 3415.7). Total num frames: 6189056. Throughput: 0: 851.9. Samples: 1546816. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:23:33,848][03043] Avg episode reward: [(0, '16.724')] |
|
[2025-04-29 03:23:38,863][03043] Fps is (10 sec: 3278.1, 60 sec: 3412.1, 300 sec: 3415.6). Total num frames: 6205440. Throughput: 0: 837.0. Samples: 1551488. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:23:38,864][03043] Avg episode reward: [(0, '17.134')] |
|
[2025-04-29 03:23:43,867][03043] Fps is (10 sec: 3268.9, 60 sec: 3411.9, 300 sec: 3415.5). Total num frames: 6221824. Throughput: 0: 845.3. Samples: 1554272. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:23:43,868][03043] Avg episode reward: [(0, '17.788')] |
|
[2025-04-29 03:23:48,859][03043] Fps is (10 sec: 3278.3, 60 sec: 3344.2, 300 sec: 3415.5). Total num frames: 6238208. Throughput: 0: 841.8. Samples: 1559328. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:23:48,860][03043] Avg episode reward: [(0, '18.132')] |
|
[2025-04-29 03:23:53,854][03043] Fps is (10 sec: 3281.0, 60 sec: 3345.7, 300 sec: 3415.9). Total num frames: 6254592. Throughput: 0: 838.6. Samples: 1564320. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:23:53,855][03043] Avg episode reward: [(0, '19.695')] |
|
[2025-04-29 03:23:58,843][03043] Fps is (10 sec: 3692.3, 60 sec: 3414.0, 300 sec: 3416.0). Total num frames: 6275072. Throughput: 0: 849.9. Samples: 1567072. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:23:58,844][03043] Avg episode reward: [(0, '19.961')] |
|
[2025-04-29 03:24:03,862][03043] Fps is (10 sec: 3683.6, 60 sec: 3412.8, 300 sec: 3415.6). Total num frames: 6291456. Throughput: 0: 837.4. Samples: 1571904. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:24:03,863][03043] Avg episode reward: [(0, '20.531')] |
|
[2025-04-29 03:24:08,866][03043] Fps is (10 sec: 3269.3, 60 sec: 3413.1, 300 sec: 3415.6). Total num frames: 6307840. Throughput: 0: 846.6. Samples: 1577248. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:24:08,867][03043] Avg episode reward: [(0, '20.818')] |
|
[2025-04-29 03:24:13,852][03043] Fps is (10 sec: 3279.8, 60 sec: 3349.0, 300 sec: 3415.8). Total num frames: 6324224. Throughput: 0: 850.8. Samples: 1579968. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:24:13,854][03043] Avg episode reward: [(0, '22.367')] |
|
[2025-04-29 03:24:18,851][03043] Fps is (10 sec: 3281.8, 60 sec: 3344.9, 300 sec: 3415.7). Total num frames: 6340608. Throughput: 0: 845.4. Samples: 1584864. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:24:18,852][03043] Avg episode reward: [(0, '23.333')] |
|
[2025-04-29 03:24:18,899][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001548_6340608.pth... |
|
[2025-04-29 03:24:18,958][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001449_5935104.pth |
|
[2025-04-29 03:24:23,857][03043] Fps is (10 sec: 3684.8, 60 sec: 3412.9, 300 sec: 3415.6). Total num frames: 6361088. Throughput: 0: 859.9. Samples: 1590176. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:24:23,858][03043] Avg episode reward: [(0, '22.695')] |
|
[2025-04-29 03:24:28,855][03043] Fps is (10 sec: 3275.5, 60 sec: 3345.8, 300 sec: 3401.8). Total num frames: 6373376. Throughput: 0: 852.1. Samples: 1592608. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:24:28,856][03043] Avg episode reward: [(0, '21.499')] |
|
[2025-04-29 03:24:33,840][03043] Fps is (10 sec: 3282.2, 60 sec: 3413.5, 300 sec: 3415.7). Total num frames: 6393856. Throughput: 0: 853.0. Samples: 1597696. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:24:33,841][03043] Avg episode reward: [(0, '23.510')] |
|
[2025-04-29 03:24:38,843][03043] Fps is (10 sec: 3690.9, 60 sec: 3414.5, 300 sec: 3415.9). Total num frames: 6410240. Throughput: 0: 867.1. Samples: 1603328. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:24:38,847][03043] Avg episode reward: [(0, '23.644')] |
|
[2025-04-29 03:24:43,853][03043] Fps is (10 sec: 3681.5, 60 sec: 3482.4, 300 sec: 3429.5). Total num frames: 6430720. Throughput: 0: 860.2. Samples: 1605792. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:24:43,855][03043] Avg episode reward: [(0, '22.523')] |
|
[2025-04-29 03:24:48,851][03043] Fps is (10 sec: 4092.7, 60 sec: 3550.3, 300 sec: 3430.0). Total num frames: 6451200. Throughput: 0: 886.3. Samples: 1611776. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:24:48,852][03043] Avg episode reward: [(0, '20.529')] |
|
[2025-04-29 03:24:53,843][03043] Fps is (10 sec: 3690.2, 60 sec: 3550.5, 300 sec: 3443.6). Total num frames: 6467584. Throughput: 0: 882.9. Samples: 1616960. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:24:53,844][03043] Avg episode reward: [(0, '22.119')] |
|
[2025-04-29 03:24:58,872][03043] Fps is (10 sec: 3269.9, 60 sec: 3479.9, 300 sec: 3429.4). Total num frames: 6483968. Throughput: 0: 886.4. Samples: 1619872. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:24:58,873][03043] Avg episode reward: [(0, '21.449')] |
|
[2025-04-29 03:25:03,850][03043] Fps is (10 sec: 3684.0, 60 sec: 3550.6, 300 sec: 3443.7). Total num frames: 6504448. Throughput: 0: 903.1. Samples: 1625504. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:25:03,855][03043] Avg episode reward: [(0, '21.423')] |
|
[2025-04-29 03:25:08,855][03043] Fps is (10 sec: 3692.7, 60 sec: 3550.5, 300 sec: 3443.5). Total num frames: 6520832. Throughput: 0: 898.9. Samples: 1630624. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:25:08,856][03043] Avg episode reward: [(0, '22.339')] |
|
[2025-04-29 03:25:13,850][03043] Fps is (10 sec: 3686.3, 60 sec: 3618.3, 300 sec: 3457.2). Total num frames: 6541312. Throughput: 0: 908.2. Samples: 1633472. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:25:13,851][03043] Avg episode reward: [(0, '22.181')] |
|
[2025-04-29 03:25:18,844][03043] Fps is (10 sec: 3690.3, 60 sec: 3618.5, 300 sec: 3457.3). Total num frames: 6557696. Throughput: 0: 918.7. Samples: 1639040. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:25:18,850][03043] Avg episode reward: [(0, '22.754')] |
|
[2025-04-29 03:25:18,901][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001601_6557696.pth... |
|
[2025-04-29 03:25:18,977][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001499_6139904.pth |
|
[2025-04-29 03:25:23,900][03043] Fps is (10 sec: 3668.3, 60 sec: 3615.6, 300 sec: 3456.8). Total num frames: 6578176. Throughput: 0: 910.5. Samples: 1644352. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:25:23,905][03043] Avg episode reward: [(0, '22.135')] |
|
[2025-04-29 03:25:28,838][03043] Fps is (10 sec: 3688.5, 60 sec: 3687.4, 300 sec: 3457.4). Total num frames: 6594560. Throughput: 0: 921.9. Samples: 1647264. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:25:28,839][03043] Avg episode reward: [(0, '20.171')] |
|
[2025-04-29 03:25:33,845][03043] Fps is (10 sec: 3294.7, 60 sec: 3617.8, 300 sec: 3457.4). Total num frames: 6610944. Throughput: 0: 903.2. Samples: 1652416. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:25:33,846][03043] Avg episode reward: [(0, '20.142')] |
|
[2025-04-29 03:25:38,855][03043] Fps is (10 sec: 3680.1, 60 sec: 3685.6, 300 sec: 3471.2). Total num frames: 6631424. Throughput: 0: 915.0. Samples: 1658144. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:25:38,856][03043] Avg episode reward: [(0, '20.303')] |
|
[2025-04-29 03:25:43,837][03043] Fps is (10 sec: 3689.3, 60 sec: 3619.1, 300 sec: 3471.8). Total num frames: 6647808. Throughput: 0: 915.2. Samples: 1661024. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:25:43,839][03043] Avg episode reward: [(0, '20.448')] |
|
[2025-04-29 03:25:48,851][03043] Fps is (10 sec: 3688.1, 60 sec: 3618.1, 300 sec: 3471.7). Total num frames: 6668288. Throughput: 0: 902.4. Samples: 1666112. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:25:48,852][03043] Avg episode reward: [(0, '20.697')] |
|
[2025-04-29 03:25:53,862][03043] Fps is (10 sec: 3677.3, 60 sec: 3617.0, 300 sec: 3470.9). Total num frames: 6684672. Throughput: 0: 914.3. Samples: 1671776. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:25:53,863][03043] Avg episode reward: [(0, '20.858')] |
|
[2025-04-29 03:25:58,862][03043] Fps is (10 sec: 3273.1, 60 sec: 3618.7, 300 sec: 3471.1). Total num frames: 6701056. Throughput: 0: 907.1. Samples: 1674304. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:25:58,863][03043] Avg episode reward: [(0, '22.631')] |
|
[2025-04-29 03:26:03,854][03043] Fps is (10 sec: 3689.4, 60 sec: 3617.9, 300 sec: 3485.2). Total num frames: 6721536. Throughput: 0: 905.8. Samples: 1679808. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:26:03,855][03043] Avg episode reward: [(0, '22.621')] |
|
[2025-04-29 03:26:08,844][03043] Fps is (10 sec: 4103.6, 60 sec: 3687.1, 300 sec: 3499.2). Total num frames: 6742016. Throughput: 0: 914.9. Samples: 1685472. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:26:08,845][03043] Avg episode reward: [(0, '22.353')] |
|
[2025-04-29 03:26:13,841][03043] Fps is (10 sec: 3691.1, 60 sec: 3618.7, 300 sec: 3485.6). Total num frames: 6758400. Throughput: 0: 901.6. Samples: 1687840. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:26:13,842][03043] Avg episode reward: [(0, '23.953')] |
|
[2025-04-29 03:26:18,875][03043] Fps is (10 sec: 3674.8, 60 sec: 3684.5, 300 sec: 3498.7). Total num frames: 6778880. Throughput: 0: 914.6. Samples: 1693600. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:26:18,876][03043] Avg episode reward: [(0, '23.999')] |
|
[2025-04-29 03:26:18,886][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001655_6778880.pth... |
|
[2025-04-29 03:26:18,939][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001548_6340608.pth |
|
[2025-04-29 03:26:23,848][03043] Fps is (10 sec: 3274.6, 60 sec: 3552.9, 300 sec: 3485.2). Total num frames: 6791168. Throughput: 0: 899.0. Samples: 1698592. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:26:23,849][03043] Avg episode reward: [(0, '23.875')] |
|
[2025-04-29 03:26:28,849][03043] Fps is (10 sec: 3285.4, 60 sec: 3617.5, 300 sec: 3499.1). Total num frames: 6811648. Throughput: 0: 896.5. Samples: 1701376. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:26:28,850][03043] Avg episode reward: [(0, '22.864')] |
|
[2025-04-29 03:26:33,867][03043] Fps is (10 sec: 4088.0, 60 sec: 3685.0, 300 sec: 3512.6). Total num frames: 6832128. Throughput: 0: 909.9. Samples: 1707072. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:26:33,869][03043] Avg episode reward: [(0, '22.212')] |
|
[2025-04-29 03:26:38,859][03043] Fps is (10 sec: 3682.7, 60 sec: 3617.9, 300 sec: 3512.6). Total num frames: 6848512. Throughput: 0: 898.9. Samples: 1712224. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:26:38,864][03043] Avg episode reward: [(0, '21.215')] |
|
[2025-04-29 03:26:43,928][03043] Fps is (10 sec: 3664.2, 60 sec: 3680.8, 300 sec: 3511.8). Total num frames: 6868992. Throughput: 0: 906.0. Samples: 1715136. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:26:43,930][03043] Avg episode reward: [(0, '20.034')] |
|
[2025-04-29 03:26:48,859][03043] Fps is (10 sec: 3686.5, 60 sec: 3617.7, 300 sec: 3526.6). Total num frames: 6885376. Throughput: 0: 905.1. Samples: 1720544. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:26:48,860][03043] Avg episode reward: [(0, '20.505')] |
|
[2025-04-29 03:26:53,848][03043] Fps is (10 sec: 3303.3, 60 sec: 3619.0, 300 sec: 3512.9). Total num frames: 6901760. Throughput: 0: 899.5. Samples: 1725952. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:26:53,849][03043] Avg episode reward: [(0, '21.754')] |
|
[2025-04-29 03:26:58,844][03043] Fps is (10 sec: 3691.8, 60 sec: 3687.5, 300 sec: 3526.8). Total num frames: 6922240. Throughput: 0: 910.2. Samples: 1728800. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:26:58,845][03043] Avg episode reward: [(0, '22.413')] |
|
[2025-04-29 03:27:03,843][03043] Fps is (10 sec: 3688.2, 60 sec: 3618.8, 300 sec: 3526.7). Total num frames: 6938624. Throughput: 0: 895.9. Samples: 1733888. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:27:03,844][03043] Avg episode reward: [(0, '23.041')] |
|
[2025-04-29 03:27:08,841][03043] Fps is (10 sec: 3687.7, 60 sec: 3618.3, 300 sec: 3526.9). Total num frames: 6959104. Throughput: 0: 913.9. Samples: 1739712. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:27:08,841][03043] Avg episode reward: [(0, '22.875')] |
|
[2025-04-29 03:27:13,850][03043] Fps is (10 sec: 3684.0, 60 sec: 3617.6, 300 sec: 3541.6). Total num frames: 6975488. Throughput: 0: 915.9. Samples: 1742592. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:27:13,851][03043] Avg episode reward: [(0, '23.802')] |
|
[2025-04-29 03:27:18,848][03043] Fps is (10 sec: 3274.3, 60 sec: 3551.5, 300 sec: 3526.9). Total num frames: 6991872. Throughput: 0: 901.4. Samples: 1747616. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:27:18,849][03043] Avg episode reward: [(0, '22.445')] |
|
[2025-04-29 03:27:18,888][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001707_6991872.pth... |
|
[2025-04-29 03:27:18,936][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001601_6557696.pth |
|
[2025-04-29 03:27:23,842][03043] Fps is (10 sec: 3689.3, 60 sec: 3686.8, 300 sec: 3540.8). Total num frames: 7012352. Throughput: 0: 914.1. Samples: 1753344. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:27:23,843][03043] Avg episode reward: [(0, '22.338')] |
|
[2025-04-29 03:27:28,851][03043] Fps is (10 sec: 3685.4, 60 sec: 3618.0, 300 sec: 3540.7). Total num frames: 7028736. Throughput: 0: 906.8. Samples: 1755872. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:27:28,852][03043] Avg episode reward: [(0, '21.895')] |
|
[2025-04-29 03:27:33,846][03043] Fps is (10 sec: 3684.8, 60 sec: 3619.4, 300 sec: 3554.4). Total num frames: 7049216. Throughput: 0: 910.5. Samples: 1761504. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:27:33,847][03043] Avg episode reward: [(0, '23.125')] |
|
[2025-04-29 03:27:38,862][03043] Fps is (10 sec: 3682.5, 60 sec: 3618.0, 300 sec: 3554.3). Total num frames: 7065600. Throughput: 0: 914.9. Samples: 1767136. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:27:38,867][03043] Avg episode reward: [(0, '23.592')] |
|
[2025-04-29 03:27:43,843][03043] Fps is (10 sec: 3687.5, 60 sec: 3623.3, 300 sec: 3554.5). Total num frames: 7086080. Throughput: 0: 906.0. Samples: 1769568. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:27:43,844][03043] Avg episode reward: [(0, '22.542')] |
|
[2025-04-29 03:27:48,848][03043] Fps is (10 sec: 3691.5, 60 sec: 3618.8, 300 sec: 3554.7). Total num frames: 7102464. Throughput: 0: 922.9. Samples: 1775424. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:27:48,849][03043] Avg episode reward: [(0, '24.201')] |
|
[2025-04-29 03:27:53,867][03043] Fps is (10 sec: 3269.2, 60 sec: 3617.0, 300 sec: 3554.4). Total num frames: 7118848. Throughput: 0: 905.4. Samples: 1780480. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:27:53,868][03043] Avg episode reward: [(0, '24.201')] |
|
[2025-04-29 03:27:58,843][03043] Fps is (10 sec: 3688.4, 60 sec: 3618.2, 300 sec: 3568.5). Total num frames: 7139328. Throughput: 0: 905.4. Samples: 1783328. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:27:58,844][03043] Avg episode reward: [(0, '26.204')] |
|
[2025-04-29 03:27:58,883][03043] Saving new best policy, reward=26.204! |
|
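Editor's note: the "Saving new best policy" messages appear whenever the reported average episode reward exceeds the best value seen so far in the run. A minimal sketch of that comparison follows, assuming a simple tracker object; the class name and printed message are illustrative, not the library's API.

```python
# Illustrative best-policy tracker (names are assumptions, not Sample Factory's API).
from dataclasses import dataclass

@dataclass
class BestPolicyTracker:
    best_reward: float = float("-inf")

    def update(self, avg_episode_reward):
        """Return True when a new best reward is observed, mirroring the log message."""
        if avg_episode_reward > self.best_reward:
            self.best_reward = avg_episode_reward
            print(f"Saving new best policy, reward={avg_episode_reward:.3f}!")
            return True
        return False

tracker = BestPolicyTracker()
tracker.update(26.204)  # -> True, prints the "new best" message
tracker.update(25.518)  # -> False, reward did not improve
```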
[2025-04-29 03:28:03,872][03043] Fps is (10 sec: 4093.8, 60 sec: 3684.6, 300 sec: 3582.1). Total num frames: 7159808. Throughput: 0: 919.0. Samples: 1788992. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:28:03,873][03043] Avg episode reward: [(0, '25.518')] |
|
[2025-04-29 03:28:08,860][03043] Fps is (10 sec: 3680.0, 60 sec: 3617.0, 300 sec: 3569.1). Total num frames: 7176192. Throughput: 0: 905.6. Samples: 1794112. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:28:08,861][03043] Avg episode reward: [(0, '26.285')] |
|
[2025-04-29 03:28:08,898][03043] Saving new best policy, reward=26.285! |
|
[2025-04-29 03:28:13,860][03043] Fps is (10 sec: 3280.8, 60 sec: 3617.5, 300 sec: 3568.2). Total num frames: 7192576. Throughput: 0: 912.9. Samples: 1796960. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:28:13,861][03043] Avg episode reward: [(0, '26.807')] |
|
[2025-04-29 03:28:13,900][03043] Saving new best policy, reward=26.807! |
|
[2025-04-29 03:28:18,936][03043] Fps is (10 sec: 3251.9, 60 sec: 3612.8, 300 sec: 3567.3). Total num frames: 7208960. Throughput: 0: 905.6. Samples: 1802336. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:28:18,937][03043] Avg episode reward: [(0, '24.856')] |
|
[2025-04-29 03:28:18,950][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001761_7213056.pth... |
|
[2025-04-29 03:28:18,995][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001655_6778880.pth |
|
[2025-04-29 03:28:23,840][03043] Fps is (10 sec: 3693.5, 60 sec: 3618.2, 300 sec: 3582.6). Total num frames: 7229440. Throughput: 0: 901.4. Samples: 1807680. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:28:23,841][03043] Avg episode reward: [(0, '23.871')] |
|
[2025-04-29 03:28:28,853][03043] Fps is (10 sec: 4130.3, 60 sec: 3686.3, 300 sec: 3596.0). Total num frames: 7249920. Throughput: 0: 910.7. Samples: 1810560. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:28:28,854][03043] Avg episode reward: [(0, '22.353')] |
|
[2025-04-29 03:28:33,849][03043] Fps is (10 sec: 3683.2, 60 sec: 3618.0, 300 sec: 3596.3). Total num frames: 7266304. Throughput: 0: 893.8. Samples: 1815648. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:28:33,850][03043] Avg episode reward: [(0, '21.813')] |
|
[2025-04-29 03:28:38,844][03043] Fps is (10 sec: 3280.0, 60 sec: 3619.2, 300 sec: 3596.4). Total num frames: 7282688. Throughput: 0: 902.9. Samples: 1821088. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:28:38,845][03043] Avg episode reward: [(0, '21.269')] |
|
[2025-04-29 03:28:43,843][03043] Fps is (10 sec: 3278.7, 60 sec: 3549.9, 300 sec: 3596.3). Total num frames: 7299072. Throughput: 0: 894.6. Samples: 1823584. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:28:43,845][03043] Avg episode reward: [(0, '21.509')] |
|
[2025-04-29 03:28:48,838][03043] Fps is (10 sec: 3278.8, 60 sec: 3550.5, 300 sec: 3596.3). Total num frames: 7315456. Throughput: 0: 873.2. Samples: 1828256. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:28:48,839][03043] Avg episode reward: [(0, '20.544')] |
|
[2025-04-29 03:28:53,851][03043] Fps is (10 sec: 3274.3, 60 sec: 3550.8, 300 sec: 3582.2). Total num frames: 7331840. Throughput: 0: 879.8. Samples: 1833696. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:28:53,852][03043] Avg episode reward: [(0, '20.498')] |
|
[2025-04-29 03:28:58,862][03043] Fps is (10 sec: 3268.9, 60 sec: 3480.5, 300 sec: 3582.3). Total num frames: 7348224. Throughput: 0: 866.8. Samples: 1835968. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:28:58,863][03043] Avg episode reward: [(0, '21.650')] |
|
[2025-04-29 03:29:03,886][03043] Fps is (10 sec: 3673.5, 60 sec: 3480.8, 300 sec: 3595.9). Total num frames: 7368704. Throughput: 0: 863.5. Samples: 1841152. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:29:03,887][03043] Avg episode reward: [(0, '23.009')] |
|
[2025-04-29 03:29:08,857][03043] Fps is (10 sec: 3688.2, 60 sec: 3481.8, 300 sec: 3596.1). Total num frames: 7385088. Throughput: 0: 860.1. Samples: 1846400. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:29:08,861][03043] Avg episode reward: [(0, '23.199')] |
|
[2025-04-29 03:29:13,854][03043] Fps is (10 sec: 3287.4, 60 sec: 3482.0, 300 sec: 3596.1). Total num frames: 7401472. Throughput: 0: 848.4. Samples: 1848736. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:29:13,854][03043] Avg episode reward: [(0, '22.632')] |
|
[2025-04-29 03:29:18,848][03043] Fps is (10 sec: 3279.7, 60 sec: 3486.7, 300 sec: 3582.4). Total num frames: 7417856. Throughput: 0: 854.1. Samples: 1854080. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:29:18,849][03043] Avg episode reward: [(0, '23.812')] |
|
[2025-04-29 03:29:18,889][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001811_7417856.pth... |
|
[2025-04-29 03:29:18,938][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001707_6991872.pth |
|
[2025-04-29 03:29:23,839][03043] Fps is (10 sec: 3281.6, 60 sec: 3413.4, 300 sec: 3596.3). Total num frames: 7434240. Throughput: 0: 839.9. Samples: 1858880. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:29:23,840][03043] Avg episode reward: [(0, '24.106')] |
|
[2025-04-29 03:29:28,862][03043] Fps is (10 sec: 3272.3, 60 sec: 3344.6, 300 sec: 3582.0). Total num frames: 7450624. Throughput: 0: 843.7. Samples: 1861568. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:29:28,863][03043] Avg episode reward: [(0, '23.756')] |
|
[2025-04-29 03:29:33,861][03043] Fps is (10 sec: 3678.2, 60 sec: 3412.6, 300 sec: 3595.9). Total num frames: 7471104. Throughput: 0: 860.7. Samples: 1867008. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:29:33,867][03043] Avg episode reward: [(0, '20.741')] |
|
[2025-04-29 03:29:38,853][03043] Fps is (10 sec: 3689.6, 60 sec: 3412.8, 300 sec: 3582.3). Total num frames: 7487488. Throughput: 0: 849.0. Samples: 1871904. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:29:38,854][03043] Avg episode reward: [(0, '20.249')] |
|
[2025-04-29 03:29:43,859][03043] Fps is (10 sec: 3277.5, 60 sec: 3412.4, 300 sec: 3568.3). Total num frames: 7503872. Throughput: 0: 857.7. Samples: 1874560. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:29:43,860][03043] Avg episode reward: [(0, '18.557')] |
|
[2025-04-29 03:29:48,870][03043] Fps is (10 sec: 3271.1, 60 sec: 3411.5, 300 sec: 3568.1). Total num frames: 7520256. Throughput: 0: 859.3. Samples: 1879808. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:29:48,872][03043] Avg episode reward: [(0, '18.169')] |
|
[2025-04-29 03:29:53,848][03043] Fps is (10 sec: 3280.5, 60 sec: 3413.5, 300 sec: 3568.7). Total num frames: 7536640. Throughput: 0: 852.1. Samples: 1884736. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:29:53,849][03043] Avg episode reward: [(0, '16.689')] |
|
[2025-04-29 03:29:58,857][03043] Fps is (10 sec: 3691.3, 60 sec: 3481.9, 300 sec: 3568.3). Total num frames: 7557120. Throughput: 0: 859.0. Samples: 1887392. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:29:58,858][03043] Avg episode reward: [(0, '16.594')] |
|
[2025-04-29 03:30:03,846][03043] Fps is (10 sec: 3277.3, 60 sec: 3347.3, 300 sec: 3554.6). Total num frames: 7569408. Throughput: 0: 849.8. Samples: 1892320. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:30:03,847][03043] Avg episode reward: [(0, '16.464')] |
|
[2025-04-29 03:30:08,862][03043] Fps is (10 sec: 3275.2, 60 sec: 3413.1, 300 sec: 3554.4). Total num frames: 7589888. Throughput: 0: 864.3. Samples: 1897792. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:30:08,863][03043] Avg episode reward: [(0, '16.928')] |
|
[2025-04-29 03:30:13,846][03043] Fps is (10 sec: 3686.6, 60 sec: 3413.8, 300 sec: 3554.5). Total num frames: 7606272. Throughput: 0: 868.6. Samples: 1900640. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:30:13,847][03043] Avg episode reward: [(0, '16.224')] |
|
[2025-04-29 03:30:18,851][03043] Fps is (10 sec: 3280.4, 60 sec: 3413.2, 300 sec: 3541.2). Total num frames: 7622656. Throughput: 0: 854.2. Samples: 1905440. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:30:18,854][03043] Avg episode reward: [(0, '16.317')] |
|
[2025-04-29 03:30:18,897][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001861_7622656.pth... |
|
[2025-04-29 03:30:18,952][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001761_7213056.pth |
|
[2025-04-29 03:30:23,839][03043] Fps is (10 sec: 3688.8, 60 sec: 3481.6, 300 sec: 3554.5). Total num frames: 7643136. Throughput: 0: 865.7. Samples: 1910848. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:30:23,840][03043] Avg episode reward: [(0, '18.651')] |
|
[2025-04-29 03:30:28,869][03043] Fps is (10 sec: 3679.9, 60 sec: 3481.2, 300 sec: 3554.2). Total num frames: 7659520. Throughput: 0: 858.1. Samples: 1913184. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:30:28,870][03043] Avg episode reward: [(0, '19.105')] |
|
[2025-04-29 03:30:33,852][03043] Fps is (10 sec: 3272.7, 60 sec: 3413.9, 300 sec: 3540.7). Total num frames: 7675904. Throughput: 0: 856.5. Samples: 1918336. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:30:33,853][03043] Avg episode reward: [(0, '19.847')] |
|
[2025-04-29 03:30:38,854][03043] Fps is (10 sec: 3281.7, 60 sec: 3413.3, 300 sec: 3540.4). Total num frames: 7692288. Throughput: 0: 861.8. Samples: 1923520. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:30:38,859][03043] Avg episode reward: [(0, '19.998')] |
|
[2025-04-29 03:30:43,845][03043] Fps is (10 sec: 3278.8, 60 sec: 3414.1, 300 sec: 3526.8). Total num frames: 7708672. Throughput: 0: 857.1. Samples: 1925952. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:30:43,846][03043] Avg episode reward: [(0, '20.030')] |
|
[2025-04-29 03:30:48,865][03043] Fps is (10 sec: 3682.1, 60 sec: 3481.9, 300 sec: 3540.6). Total num frames: 7729152. Throughput: 0: 866.5. Samples: 1931328. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:30:48,866][03043] Avg episode reward: [(0, '20.744')] |
|
[2025-04-29 03:30:53,849][03043] Fps is (10 sec: 3275.6, 60 sec: 3413.2, 300 sec: 3526.9). Total num frames: 7741440. Throughput: 0: 851.4. Samples: 1936096. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:30:53,855][03043] Avg episode reward: [(0, '20.513')] |
|
[2025-04-29 03:30:58,844][03043] Fps is (10 sec: 3283.8, 60 sec: 3414.1, 300 sec: 3526.8). Total num frames: 7761920. Throughput: 0: 847.7. Samples: 1938784. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:30:58,845][03043] Avg episode reward: [(0, '21.198')] |
|
[2025-04-29 03:31:03,855][03043] Fps is (10 sec: 3684.4, 60 sec: 3481.1, 300 sec: 3512.7). Total num frames: 7778304. Throughput: 0: 861.8. Samples: 1944224. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:31:03,856][03043] Avg episode reward: [(0, '21.118')] |
|
[2025-04-29 03:31:08,862][03043] Fps is (10 sec: 3270.9, 60 sec: 3413.3, 300 sec: 3512.6). Total num frames: 7794688. Throughput: 0: 846.5. Samples: 1948960. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:31:08,863][03043] Avg episode reward: [(0, '20.286')] |
|
[2025-04-29 03:31:13,850][03043] Fps is (10 sec: 3278.3, 60 sec: 3413.1, 300 sec: 3499.3). Total num frames: 7811072. Throughput: 0: 858.0. Samples: 1951776. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:31:13,851][03043] Avg episode reward: [(0, '21.190')] |
|
[2025-04-29 03:31:18,874][03043] Fps is (10 sec: 3273.0, 60 sec: 3412.1, 300 sec: 3512.5). Total num frames: 7827456. Throughput: 0: 857.2. Samples: 1956928. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:31:18,875][03043] Avg episode reward: [(0, '20.692')] |
|
[2025-04-29 03:31:18,936][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001911_7827456.pth... |
|
[2025-04-29 03:31:19,009][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001811_7417856.pth |
|
[2025-04-29 03:31:23,860][03043] Fps is (10 sec: 3682.7, 60 sec: 3412.1, 300 sec: 3512.7). Total num frames: 7847936. Throughput: 0: 852.5. Samples: 1961888. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:31:23,861][03043] Avg episode reward: [(0, '20.994')] |
|
[2025-04-29 03:31:28,864][03043] Fps is (10 sec: 3690.0, 60 sec: 3413.6, 300 sec: 3499.0). Total num frames: 7864320. Throughput: 0: 860.1. Samples: 1964672. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:31:28,865][03043] Avg episode reward: [(0, '21.861')] |
|
[2025-04-29 03:31:33,867][03043] Fps is (10 sec: 3274.5, 60 sec: 3412.4, 300 sec: 3498.9). Total num frames: 7880704. Throughput: 0: 846.9. Samples: 1969440. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:31:33,868][03043] Avg episode reward: [(0, '22.532')] |
|
[2025-04-29 03:31:38,851][03043] Fps is (10 sec: 3280.9, 60 sec: 3413.5, 300 sec: 3486.0). Total num frames: 7897088. Throughput: 0: 861.1. Samples: 1974848. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:31:38,853][03043] Avg episode reward: [(0, '24.163')] |
|
[2025-04-29 03:31:43,863][03043] Fps is (10 sec: 3278.2, 60 sec: 3412.3, 300 sec: 3485.0). Total num frames: 7913472. Throughput: 0: 858.7. Samples: 1977440. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:31:43,864][03043] Avg episode reward: [(0, '23.541')] |
|
[2025-04-29 03:31:48,861][03043] Fps is (10 sec: 3273.7, 60 sec: 3345.3, 300 sec: 3484.9). Total num frames: 7929856. Throughput: 0: 844.7. Samples: 1982240. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:31:48,862][03043] Avg episode reward: [(0, '23.252')] |
|
[2025-04-29 03:31:53,851][03043] Fps is (10 sec: 3690.8, 60 sec: 3481.5, 300 sec: 3485.0). Total num frames: 7950336. Throughput: 0: 859.9. Samples: 1987648. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:31:53,852][03043] Avg episode reward: [(0, '23.283')] |
|
[2025-04-29 03:31:58,851][03043] Fps is (10 sec: 3689.9, 60 sec: 3412.9, 300 sec: 3485.0). Total num frames: 7966720. Throughput: 0: 850.5. Samples: 1990048. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:31:58,852][03043] Avg episode reward: [(0, '22.751')] |
|
[2025-04-29 03:32:03,865][03043] Fps is (10 sec: 3272.4, 60 sec: 3412.8, 300 sec: 3470.9). Total num frames: 7983104. Throughput: 0: 849.2. Samples: 1995136. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:32:03,866][03043] Avg episode reward: [(0, '22.475')] |
|
[2025-04-29 03:32:08,843][03043] Fps is (10 sec: 3279.6, 60 sec: 3414.4, 300 sec: 3471.3). Total num frames: 7999488. Throughput: 0: 857.9. Samples: 2000480. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:32:08,850][03043] Avg episode reward: [(0, '21.652')] |
|
[2025-04-29 03:32:13,852][03043] Fps is (10 sec: 3280.8, 60 sec: 3413.2, 300 sec: 3471.1). Total num frames: 8015872. Throughput: 0: 845.7. Samples: 2002720. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:32:13,853][03043] Avg episode reward: [(0, '21.346')] |
|
[2025-04-29 03:32:18,853][03043] Fps is (10 sec: 3682.8, 60 sec: 3482.8, 300 sec: 3471.1). Total num frames: 8036352. Throughput: 0: 860.0. Samples: 2008128. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:32:18,854][03043] Avg episode reward: [(0, '19.964')] |
|
[2025-04-29 03:32:18,895][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001962_8036352.pth... |
|
[2025-04-29 03:32:18,949][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001861_7622656.pth |
|
[2025-04-29 03:32:23,846][03043] Fps is (10 sec: 3278.8, 60 sec: 3345.8, 300 sec: 3457.4). Total num frames: 8048640. Throughput: 0: 846.3. Samples: 2012928. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:32:23,847][03043] Avg episode reward: [(0, '19.591')] |
|
[2025-04-29 03:32:28,861][03043] Fps is (10 sec: 3273.9, 60 sec: 3413.5, 300 sec: 3457.1). Total num frames: 8069120. Throughput: 0: 845.5. Samples: 2015488. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:32:28,862][03043] Avg episode reward: [(0, '21.152')] |
|
[2025-04-29 03:32:33,859][03043] Fps is (10 sec: 3681.8, 60 sec: 3413.8, 300 sec: 3457.3). Total num frames: 8085504. Throughput: 0: 861.9. Samples: 2021024. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:32:33,861][03043] Avg episode reward: [(0, '19.593')] |
|
[2025-04-29 03:32:38,867][03043] Fps is (10 sec: 3275.0, 60 sec: 3412.5, 300 sec: 3443.1). Total num frames: 8101888. Throughput: 0: 848.8. Samples: 2025856. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:32:38,868][03043] Avg episode reward: [(0, '20.975')] |
|
[2025-04-29 03:32:43,836][03043] Fps is (10 sec: 3284.3, 60 sec: 3414.9, 300 sec: 3443.6). Total num frames: 8118272. Throughput: 0: 854.3. Samples: 2028480. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:32:43,837][03043] Avg episode reward: [(0, '20.197')] |
|
[2025-04-29 03:32:48,875][03043] Fps is (10 sec: 3274.2, 60 sec: 3412.5, 300 sec: 3443.3). Total num frames: 8134656. Throughput: 0: 858.8. Samples: 2033792. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:32:48,879][03043] Avg episode reward: [(0, '20.938')] |
|
[2025-04-29 03:32:53,845][03043] Fps is (10 sec: 3683.0, 60 sec: 3413.7, 300 sec: 3443.4). Total num frames: 8155136. Throughput: 0: 854.0. Samples: 2038912. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:32:53,846][03043] Avg episode reward: [(0, '21.107')] |
|
[2025-04-29 03:32:58,851][03043] Fps is (10 sec: 3695.2, 60 sec: 3413.3, 300 sec: 3429.8). Total num frames: 8171520. Throughput: 0: 864.0. Samples: 2041600. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:32:58,852][03043] Avg episode reward: [(0, '20.491')] |
|
[2025-04-29 03:33:03,842][03043] Fps is (10 sec: 3278.0, 60 sec: 3414.6, 300 sec: 3429.7). Total num frames: 8187904. Throughput: 0: 852.1. Samples: 2046464. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:33:03,843][03043] Avg episode reward: [(0, '22.223')] |
|
[2025-04-29 03:33:08,917][03043] Fps is (10 sec: 3662.2, 60 sec: 3477.3, 300 sec: 3442.7). Total num frames: 8208384. Throughput: 0: 865.5. Samples: 2051936. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:33:08,919][03043] Avg episode reward: [(0, '20.934')] |
|
[2025-04-29 03:33:13,863][03043] Fps is (10 sec: 3678.7, 60 sec: 3481.0, 300 sec: 3444.3). Total num frames: 8224768. Throughput: 0: 868.2. Samples: 2054560. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:33:13,865][03043] Avg episode reward: [(0, '21.188')] |
|
[2025-04-29 03:33:18,841][03043] Fps is (10 sec: 3301.9, 60 sec: 3414.0, 300 sec: 3429.5). Total num frames: 8241152. Throughput: 0: 851.5. Samples: 2059328. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:33:18,842][03043] Avg episode reward: [(0, '21.506')] |
|
[2025-04-29 03:33:18,880][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002012_8241152.pth... |
|
[2025-04-29 03:33:18,932][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001911_7827456.pth |
|
[2025-04-29 03:33:23,857][03043] Fps is (10 sec: 3278.7, 60 sec: 3481.0, 300 sec: 3415.6). Total num frames: 8257536. Throughput: 0: 862.8. Samples: 2064672. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:33:23,858][03043] Avg episode reward: [(0, '21.354')] |
|
[2025-04-29 03:33:28,862][03043] Fps is (10 sec: 3270.0, 60 sec: 3413.3, 300 sec: 3415.5). Total num frames: 8273920. Throughput: 0: 859.2. Samples: 2067168. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:33:28,863][03043] Avg episode reward: [(0, '22.080')] |
|
[2025-04-29 03:33:33,863][03043] Fps is (10 sec: 3274.6, 60 sec: 3413.1, 300 sec: 3415.4). Total num frames: 8290304. Throughput: 0: 855.0. Samples: 2072256. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:33:33,865][03043] Avg episode reward: [(0, '21.268')] |
|
[2025-04-29 03:33:38,907][03043] Fps is (10 sec: 3669.8, 60 sec: 3479.3, 300 sec: 3428.8). Total num frames: 8310784. Throughput: 0: 857.8. Samples: 2077568. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:33:38,910][03043] Avg episode reward: [(0, '21.795')] |
|
[2025-04-29 03:33:43,856][03043] Fps is (10 sec: 3279.2, 60 sec: 3412.2, 300 sec: 3415.4). Total num frames: 8323072. Throughput: 0: 846.8. Samples: 2079712. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:33:43,857][03043] Avg episode reward: [(0, '24.113')] |
|
[2025-04-29 03:33:48,865][03043] Fps is (10 sec: 3290.5, 60 sec: 3482.2, 300 sec: 3429.4). Total num frames: 8343552. Throughput: 0: 855.7. Samples: 2084992. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:33:48,866][03043] Avg episode reward: [(0, '24.787')] |
|
[2025-04-29 03:33:53,870][03043] Fps is (10 sec: 3272.2, 60 sec: 3343.7, 300 sec: 3415.6). Total num frames: 8355840. Throughput: 0: 842.1. Samples: 2089792. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:33:53,872][03043] Avg episode reward: [(0, '24.387')] |
|
[2025-04-29 03:33:58,886][03043] Fps is (10 sec: 3270.0, 60 sec: 3411.3, 300 sec: 3415.6). Total num frames: 8376320. Throughput: 0: 835.1. Samples: 2092160. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:33:58,887][03043] Avg episode reward: [(0, '23.485')] |
|
[2025-04-29 03:34:03,847][03043] Fps is (10 sec: 3694.9, 60 sec: 3413.0, 300 sec: 3415.8). Total num frames: 8392704. Throughput: 0: 848.2. Samples: 2097504. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:34:03,848][03043] Avg episode reward: [(0, '23.173')] |
|
[2025-04-29 03:34:08,853][03043] Fps is (10 sec: 3287.7, 60 sec: 3348.6, 300 sec: 3415.7). Total num frames: 8409088. Throughput: 0: 837.8. Samples: 2102368. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:34:08,854][03043] Avg episode reward: [(0, '22.873')] |
|
[2025-04-29 03:34:13,842][03043] Fps is (10 sec: 3278.5, 60 sec: 3346.2, 300 sec: 3415.7). Total num frames: 8425472. Throughput: 0: 846.6. Samples: 2105248. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:34:13,843][03043] Avg episode reward: [(0, '21.842')] |
|
[2025-04-29 03:34:18,854][03043] Fps is (10 sec: 3276.6, 60 sec: 3344.4, 300 sec: 3415.5). Total num frames: 8441856. Throughput: 0: 850.7. Samples: 2110528. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:34:18,855][03043] Avg episode reward: [(0, '20.813')] |
|
[2025-04-29 03:34:18,908][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002061_8441856.pth... |
|
[2025-04-29 03:34:18,990][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000001962_8036352.pth |
|
[2025-04-29 03:34:23,842][03043] Fps is (10 sec: 3686.6, 60 sec: 3414.2, 300 sec: 3429.8). Total num frames: 8462336. Throughput: 0: 845.3. Samples: 2115552. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:34:23,843][03043] Avg episode reward: [(0, '21.961')] |
|
[2025-04-29 03:34:28,854][03043] Fps is (10 sec: 3686.2, 60 sec: 3413.8, 300 sec: 3415.7). Total num frames: 8478720. Throughput: 0: 853.4. Samples: 2118112. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:34:28,855][03043] Avg episode reward: [(0, '22.794')] |
|
[2025-04-29 03:34:33,861][03043] Fps is (10 sec: 3270.4, 60 sec: 3413.5, 300 sec: 3415.6). Total num frames: 8495104. Throughput: 0: 843.5. Samples: 2122944. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:34:33,862][03043] Avg episode reward: [(0, '22.947')] |
|
[2025-04-29 03:34:38,839][03043] Fps is (10 sec: 3281.8, 60 sec: 3348.9, 300 sec: 3415.9). Total num frames: 8511488. Throughput: 0: 856.1. Samples: 2128288. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:34:38,840][03043] Avg episode reward: [(0, '24.679')] |
|
[2025-04-29 03:34:43,861][03043] Fps is (10 sec: 3277.0, 60 sec: 3413.1, 300 sec: 3415.8). Total num frames: 8527872. Throughput: 0: 865.2. Samples: 2131072. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:34:43,865][03043] Avg episode reward: [(0, '25.430')] |
|
[2025-04-29 03:34:48,865][03043] Fps is (10 sec: 3268.3, 60 sec: 3345.1, 300 sec: 3415.4). Total num frames: 8544256. Throughput: 0: 853.7. Samples: 2135936. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:34:48,866][03043] Avg episode reward: [(0, '24.772')] |
|
[2025-04-29 03:34:53,865][03043] Fps is (10 sec: 3684.9, 60 sec: 3481.9, 300 sec: 3415.6). Total num frames: 8564736. Throughput: 0: 863.8. Samples: 2141248. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:34:53,866][03043] Avg episode reward: [(0, '24.935')] |
|
[2025-04-29 03:34:58,869][03043] Fps is (10 sec: 3684.9, 60 sec: 3414.3, 300 sec: 3429.3). Total num frames: 8581120. Throughput: 0: 856.4. Samples: 2143808. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:34:58,870][03043] Avg episode reward: [(0, '23.800')] |
|
[2025-04-29 03:35:03,860][03043] Fps is (10 sec: 3278.3, 60 sec: 3412.6, 300 sec: 3415.7). Total num frames: 8597504. Throughput: 0: 849.7. Samples: 2148768. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:35:03,861][03043] Avg episode reward: [(0, '22.729')] |
|
[2025-04-29 03:35:08,861][03043] Fps is (10 sec: 3279.4, 60 sec: 3412.9, 300 sec: 3415.5). Total num frames: 8613888. Throughput: 0: 855.8. Samples: 2154080. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:35:08,865][03043] Avg episode reward: [(0, '21.856')] |
|
[2025-04-29 03:35:13,854][03043] Fps is (10 sec: 3278.8, 60 sec: 3412.7, 300 sec: 3415.6). Total num frames: 8630272. Throughput: 0: 846.9. Samples: 2156224. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:35:13,855][03043] Avg episode reward: [(0, '21.493')] |
|
[2025-04-29 03:35:18,852][03043] Fps is (10 sec: 3690.0, 60 sec: 3481.7, 300 sec: 3415.5). Total num frames: 8650752. Throughput: 0: 861.3. Samples: 2161696. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:35:18,854][03043] Avg episode reward: [(0, '22.497')] |
|
[2025-04-29 03:35:18,897][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002112_8650752.pth... |
|
[2025-04-29 03:35:18,950][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002012_8241152.pth |
|
[2025-04-29 03:35:23,871][03043] Fps is (10 sec: 3271.2, 60 sec: 3343.4, 300 sec: 3401.7). Total num frames: 8663040. Throughput: 0: 848.5. Samples: 2166496. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:35:23,880][03043] Avg episode reward: [(0, '21.303')] |
|
[2025-04-29 03:35:28,857][03043] Fps is (10 sec: 3275.2, 60 sec: 3413.2, 300 sec: 3415.6). Total num frames: 8683520. Throughput: 0: 844.9. Samples: 2169088. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:35:28,857][03043] Avg episode reward: [(0, '21.474')] |
|
[2025-04-29 03:35:33,837][03043] Fps is (10 sec: 3699.2, 60 sec: 3414.7, 300 sec: 3415.8). Total num frames: 8699904. Throughput: 0: 859.6. Samples: 2174592. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:35:33,838][03043] Avg episode reward: [(0, '22.439')] |
|
[2025-04-29 03:35:38,853][03043] Fps is (10 sec: 3278.0, 60 sec: 3412.5, 300 sec: 3415.6). Total num frames: 8716288. Throughput: 0: 847.2. Samples: 2179360. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:35:38,854][03043] Avg episode reward: [(0, '22.976')] |
|
[2025-04-29 03:35:43,867][03043] Fps is (10 sec: 3267.0, 60 sec: 3413.0, 300 sec: 3401.7). Total num frames: 8732672. Throughput: 0: 853.4. Samples: 2182208. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:35:43,868][03043] Avg episode reward: [(0, '22.224')] |
|
[2025-04-29 03:35:48,837][03043] Fps is (10 sec: 3692.3, 60 sec: 3483.2, 300 sec: 3429.7). Total num frames: 8753152. Throughput: 0: 859.5. Samples: 2187424. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:35:48,839][03043] Avg episode reward: [(0, '21.811')] |
|
[2025-04-29 03:35:53,838][03043] Fps is (10 sec: 3697.0, 60 sec: 3414.9, 300 sec: 3415.7). Total num frames: 8769536. Throughput: 0: 850.2. Samples: 2192320. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:35:53,839][03043] Avg episode reward: [(0, '24.073')] |
|
[2025-04-29 03:35:58,850][03043] Fps is (10 sec: 3272.5, 60 sec: 3414.4, 300 sec: 3415.7). Total num frames: 8785920. Throughput: 0: 863.4. Samples: 2195072. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:35:58,851][03043] Avg episode reward: [(0, '23.725')] |
|
[2025-04-29 03:36:03,862][03043] Fps is (10 sec: 3269.0, 60 sec: 3413.2, 300 sec: 3415.6). Total num frames: 8802304. Throughput: 0: 848.2. Samples: 2199872. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:36:03,863][03043] Avg episode reward: [(0, '24.186')] |
|
[2025-04-29 03:36:08,849][03043] Fps is (10 sec: 3277.2, 60 sec: 3414.0, 300 sec: 3415.7). Total num frames: 8818688. Throughput: 0: 858.7. Samples: 2205120. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:36:08,850][03043] Avg episode reward: [(0, '23.659')] |
|
[2025-04-29 03:36:13,844][03043] Fps is (10 sec: 3282.9, 60 sec: 3413.9, 300 sec: 3416.0). Total num frames: 8835072. Throughput: 0: 860.7. Samples: 2207808. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:36:13,850][03043] Avg episode reward: [(0, '24.304')] |
|
[2025-04-29 03:36:18,841][03043] Fps is (10 sec: 3279.4, 60 sec: 3345.7, 300 sec: 3402.0). Total num frames: 8851456. Throughput: 0: 846.9. Samples: 2212704. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:36:18,842][03043] Avg episode reward: [(0, '24.933')] |
|
[2025-04-29 03:36:18,885][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002161_8851456.pth... |
|
[2025-04-29 03:36:18,935][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002061_8441856.pth |
|
[2025-04-29 03:36:23,850][03043] Fps is (10 sec: 3683.9, 60 sec: 3482.8, 300 sec: 3415.8). Total num frames: 8871936. Throughput: 0: 858.4. Samples: 2217984. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:36:23,852][03043] Avg episode reward: [(0, '24.305')] |
|
[2025-04-29 03:36:28,891][03043] Fps is (10 sec: 3668.0, 60 sec: 3411.4, 300 sec: 3415.4). Total num frames: 8888320. Throughput: 0: 851.4. Samples: 2220544. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:36:28,892][03043] Avg episode reward: [(0, '23.689')] |
|
[2025-04-29 03:36:33,839][03043] Fps is (10 sec: 3280.6, 60 sec: 3413.2, 300 sec: 3415.8). Total num frames: 8904704. Throughput: 0: 844.1. Samples: 2225408. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:36:33,840][03043] Avg episode reward: [(0, '23.666')] |
|
[2025-04-29 03:36:38,836][03043] Fps is (10 sec: 3295.0, 60 sec: 3414.3, 300 sec: 3416.0). Total num frames: 8921088. Throughput: 0: 854.8. Samples: 2230784. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:36:38,837][03043] Avg episode reward: [(0, '23.806')] |
|
[2025-04-29 03:36:43,866][03043] Fps is (10 sec: 3267.9, 60 sec: 3413.4, 300 sec: 3415.6). Total num frames: 8937472. Throughput: 0: 843.1. Samples: 2233024. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:36:43,867][03043] Avg episode reward: [(0, '23.350')] |
|
[2025-04-29 03:36:48,930][03043] Fps is (10 sec: 3246.4, 60 sec: 3339.9, 300 sec: 3400.9). Total num frames: 8953856. Throughput: 0: 852.8. Samples: 2238304. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:36:48,932][03043] Avg episode reward: [(0, '22.394')] |
|
[2025-04-29 03:36:53,931][03043] Fps is (10 sec: 3662.7, 60 sec: 3408.1, 300 sec: 3414.7). Total num frames: 8974336. Throughput: 0: 850.4. Samples: 2243456. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:36:53,933][03043] Avg episode reward: [(0, '22.505')] |
|
[2025-04-29 03:36:58,861][03043] Fps is (10 sec: 3712.0, 60 sec: 3412.7, 300 sec: 3415.7). Total num frames: 8990720. Throughput: 0: 848.0. Samples: 2245984. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:36:58,862][03043] Avg episode reward: [(0, '21.744')] |
|
[2025-04-29 03:37:03,864][03043] Fps is (10 sec: 3298.8, 60 sec: 3413.2, 300 sec: 3415.4). Total num frames: 9007104. Throughput: 0: 863.6. Samples: 2251584. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:37:03,865][03043] Avg episode reward: [(0, '20.977')] |
|
[2025-04-29 03:37:08,865][03043] Fps is (10 sec: 3275.3, 60 sec: 3412.4, 300 sec: 3415.5). Total num frames: 9023488. Throughput: 0: 859.4. Samples: 2256672. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:37:08,866][03043] Avg episode reward: [(0, '20.343')] |
|
[2025-04-29 03:37:13,842][03043] Fps is (10 sec: 3694.5, 60 sec: 3481.7, 300 sec: 3415.8). Total num frames: 9043968. Throughput: 0: 868.5. Samples: 2259584. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:37:13,843][03043] Avg episode reward: [(0, '20.672')] |
|
[2025-04-29 03:37:18,838][03043] Fps is (10 sec: 4107.4, 60 sec: 3550.1, 300 sec: 3443.5). Total num frames: 9064448. Throughput: 0: 886.1. Samples: 2265280. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:37:18,845][03043] Avg episode reward: [(0, '19.455')] |
|
[2025-04-29 03:37:18,894][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002213_9064448.pth... |
|
[2025-04-29 03:37:18,971][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002112_8650752.pth |
|
[2025-04-29 03:37:23,858][03043] Fps is (10 sec: 3680.3, 60 sec: 3481.1, 300 sec: 3429.6). Total num frames: 9080832. Throughput: 0: 879.2. Samples: 2270368. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:37:23,859][03043] Avg episode reward: [(0, '19.787')] |
|
[2025-04-29 03:37:28,882][03043] Fps is (10 sec: 3670.0, 60 sec: 3550.4, 300 sec: 3443.1). Total num frames: 9101312. Throughput: 0: 893.5. Samples: 2273248. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:37:28,883][03043] Avg episode reward: [(0, '19.692')] |
|
[2025-04-29 03:37:33,866][03043] Fps is (10 sec: 3683.7, 60 sec: 3548.3, 300 sec: 3443.4). Total num frames: 9117696. Throughput: 0: 894.4. Samples: 2278496. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:37:33,867][03043] Avg episode reward: [(0, '19.877')] |
|
[2025-04-29 03:37:38,863][03043] Fps is (10 sec: 3283.3, 60 sec: 3548.3, 300 sec: 3443.1). Total num frames: 9134080. Throughput: 0: 902.3. Samples: 2284000. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:37:38,864][03043] Avg episode reward: [(0, '20.405')] |
|
[2025-04-29 03:37:43,846][03043] Fps is (10 sec: 3693.8, 60 sec: 3619.4, 300 sec: 3457.6). Total num frames: 9154560. Throughput: 0: 909.1. Samples: 2286880. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:37:43,850][03043] Avg episode reward: [(0, '21.267')] |
|
[2025-04-29 03:37:48,839][03043] Fps is (10 sec: 3695.3, 60 sec: 3623.6, 300 sec: 3443.5). Total num frames: 9170944. Throughput: 0: 899.3. Samples: 2292032. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:37:48,840][03043] Avg episode reward: [(0, '20.891')] |
|
[2025-04-29 03:37:53,858][03043] Fps is (10 sec: 3682.1, 60 sec: 3622.5, 300 sec: 3457.2). Total num frames: 9191424. Throughput: 0: 916.1. Samples: 2297888. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:37:53,859][03043] Avg episode reward: [(0, '21.456')] |
|
[2025-04-29 03:37:58,846][03043] Fps is (10 sec: 3683.8, 60 sec: 3619.0, 300 sec: 3457.3). Total num frames: 9207808. Throughput: 0: 912.3. Samples: 2300640. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:37:58,847][03043] Avg episode reward: [(0, '22.536')] |
|
[2025-04-29 03:38:03,836][03043] Fps is (10 sec: 3283.9, 60 sec: 3619.8, 300 sec: 3444.4). Total num frames: 9224192. Throughput: 0: 901.0. Samples: 2305824. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:38:03,837][03043] Avg episode reward: [(0, '22.448')] |
|
[2025-04-29 03:38:08,844][03043] Fps is (10 sec: 3687.1, 60 sec: 3687.7, 300 sec: 3457.5). Total num frames: 9244672. Throughput: 0: 914.8. Samples: 2311520. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:38:08,845][03043] Avg episode reward: [(0, '22.352')] |
|
[2025-04-29 03:38:13,836][03043] Fps is (10 sec: 3686.4, 60 sec: 3618.5, 300 sec: 3457.4). Total num frames: 9261056. Throughput: 0: 904.8. Samples: 2313920. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:38:13,837][03043] Avg episode reward: [(0, '22.547')] |
|
[2025-04-29 03:38:18,851][03043] Fps is (10 sec: 3683.8, 60 sec: 3617.3, 300 sec: 3471.3). Total num frames: 9281536. Throughput: 0: 916.2. Samples: 2319712. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:38:18,852][03043] Avg episode reward: [(0, '23.362')] |
|
[2025-04-29 03:38:18,889][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002266_9281536.pth... |
|
[2025-04-29 03:38:18,940][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002161_8851456.pth |
|
[2025-04-29 03:38:23,867][03043] Fps is (10 sec: 3675.1, 60 sec: 3617.6, 300 sec: 3471.1). Total num frames: 9297920. Throughput: 0: 913.0. Samples: 2325088. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:38:23,868][03043] Avg episode reward: [(0, '23.414')] |
|
[2025-04-29 03:38:28,841][03043] Fps is (10 sec: 3690.0, 60 sec: 3620.6, 300 sec: 3485.3). Total num frames: 9318400. Throughput: 0: 906.1. Samples: 2327648. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:38:28,842][03043] Avg episode reward: [(0, '21.478')] |
|
[2025-04-29 03:38:33,858][03043] Fps is (10 sec: 3689.6, 60 sec: 3618.6, 300 sec: 3471.8). Total num frames: 9334784. Throughput: 0: 919.1. Samples: 2333408. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:38:33,859][03043] Avg episode reward: [(0, '21.235')] |
|
[2025-04-29 03:38:38,901][03043] Fps is (10 sec: 3257.4, 60 sec: 3615.8, 300 sec: 3484.5). Total num frames: 9351168. Throughput: 0: 903.7. Samples: 2338592. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:38:38,902][03043] Avg episode reward: [(0, '20.721')] |
|
[2025-04-29 03:38:43,852][03043] Fps is (10 sec: 3688.6, 60 sec: 3617.7, 300 sec: 3485.2). Total num frames: 9371648. Throughput: 0: 905.8. Samples: 2341408. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:38:43,853][03043] Avg episode reward: [(0, '21.816')] |
|
[2025-04-29 03:38:48,842][03043] Fps is (10 sec: 4120.0, 60 sec: 3686.2, 300 sec: 3513.2). Total num frames: 9392128. Throughput: 0: 917.9. Samples: 2347136. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:38:48,844][03043] Avg episode reward: [(0, '20.796')] |
|
[2025-04-29 03:38:53,837][03043] Fps is (10 sec: 3691.9, 60 sec: 3619.3, 300 sec: 3499.5). Total num frames: 9408512. Throughput: 0: 907.5. Samples: 2352352. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:38:53,838][03043] Avg episode reward: [(0, '20.994')] |
|
[2025-04-29 03:38:58,848][03043] Fps is (10 sec: 3274.8, 60 sec: 3618.0, 300 sec: 3498.9). Total num frames: 9424896. Throughput: 0: 917.1. Samples: 2355200. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:38:58,850][03043] Avg episode reward: [(0, '21.469')] |
|
[2025-04-29 03:39:03,848][03043] Fps is (10 sec: 3273.5, 60 sec: 3617.4, 300 sec: 3499.0). Total num frames: 9441280. Throughput: 0: 901.0. Samples: 2360256. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:39:03,849][03043] Avg episode reward: [(0, '21.827')] |
|
[2025-04-29 03:39:08,850][03043] Fps is (10 sec: 3685.7, 60 sec: 3617.7, 300 sec: 3512.7). Total num frames: 9461760. Throughput: 0: 900.6. Samples: 2365600. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:39:08,851][03043] Avg episode reward: [(0, '21.712')] |
|
[2025-04-29 03:39:13,854][03043] Fps is (10 sec: 3683.9, 60 sec: 3617.0, 300 sec: 3512.8). Total num frames: 9478144. Throughput: 0: 905.7. Samples: 2368416. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:39:13,860][03043] Avg episode reward: [(0, '22.469')] |
|
[2025-04-29 03:39:18,841][03043] Fps is (10 sec: 3689.8, 60 sec: 3618.7, 300 sec: 3512.8). Total num frames: 9498624. Throughput: 0: 893.5. Samples: 2373600. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:39:18,842][03043] Avg episode reward: [(0, '22.406')] |
|
[2025-04-29 03:39:18,880][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002319_9498624.pth... |
|
[2025-04-29 03:39:18,928][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002213_9064448.pth |
|
[2025-04-29 03:39:23,836][03043] Fps is (10 sec: 3693.2, 60 sec: 3620.0, 300 sec: 3513.1). Total num frames: 9515008. Throughput: 0: 905.8. Samples: 2379296. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:39:23,837][03043] Avg episode reward: [(0, '22.597')] |
|
[2025-04-29 03:39:28,871][03043] Fps is (10 sec: 3267.0, 60 sec: 3548.1, 300 sec: 3512.7). Total num frames: 9531392. Throughput: 0: 900.6. Samples: 2381952. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:39:28,872][03043] Avg episode reward: [(0, '22.683')] |
|
[2025-04-29 03:39:33,859][03043] Fps is (10 sec: 3678.0, 60 sec: 3618.1, 300 sec: 3526.5). Total num frames: 9551872. Throughput: 0: 892.1. Samples: 2387296. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:39:33,860][03043] Avg episode reward: [(0, '21.199')] |
|
[2025-04-29 03:39:38,844][03043] Fps is (10 sec: 4107.4, 60 sec: 3689.9, 300 sec: 3540.8). Total num frames: 9572352. Throughput: 0: 908.7. Samples: 2393248. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:39:38,845][03043] Avg episode reward: [(0, '22.297')] |
|
[2025-04-29 03:39:43,845][03043] Fps is (10 sec: 3691.7, 60 sec: 3618.6, 300 sec: 3540.9). Total num frames: 9588736. Throughput: 0: 898.9. Samples: 2395648. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:39:43,846][03043] Avg episode reward: [(0, '21.375')] |
|
[2025-04-29 03:39:48,868][03043] Fps is (10 sec: 3677.4, 60 sec: 3616.6, 300 sec: 3540.6). Total num frames: 9609216. Throughput: 0: 912.6. Samples: 2401344. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:39:48,872][03043] Avg episode reward: [(0, '21.247')] |
|
[2025-04-29 03:39:53,849][03043] Fps is (10 sec: 3684.9, 60 sec: 3617.5, 300 sec: 3540.9). Total num frames: 9625600. Throughput: 0: 913.8. Samples: 2406720. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:39:53,850][03043] Avg episode reward: [(0, '21.850')] |
|
[2025-04-29 03:39:58,859][03043] Fps is (10 sec: 3279.9, 60 sec: 3617.5, 300 sec: 3540.6). Total num frames: 9641984. Throughput: 0: 908.7. Samples: 2409312. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:39:58,860][03043] Avg episode reward: [(0, '21.560')] |
|
[2025-04-29 03:40:03,856][03043] Fps is (10 sec: 3683.6, 60 sec: 3685.9, 300 sec: 3554.6). Total num frames: 9662464. Throughput: 0: 918.4. Samples: 2414944. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:40:03,857][03043] Avg episode reward: [(0, '19.692')] |
|
[2025-04-29 03:40:08,860][03043] Fps is (10 sec: 3685.8, 60 sec: 3617.5, 300 sec: 3554.4). Total num frames: 9678848. Throughput: 0: 904.0. Samples: 2420000. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:40:08,861][03043] Avg episode reward: [(0, '19.747')] |
|
[2025-04-29 03:40:13,853][03043] Fps is (10 sec: 3687.6, 60 sec: 3686.5, 300 sec: 3554.5). Total num frames: 9699328. Throughput: 0: 909.9. Samples: 2422880. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:40:13,854][03043] Avg episode reward: [(0, '20.984')] |
|
[2025-04-29 03:40:18,843][03043] Fps is (10 sec: 3692.9, 60 sec: 3618.0, 300 sec: 3568.7). Total num frames: 9715712. Throughput: 0: 919.8. Samples: 2428672. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:40:18,844][03043] Avg episode reward: [(0, '21.038')] |
|
[2025-04-29 03:40:18,892][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002372_9715712.pth... |
|
[2025-04-29 03:40:18,969][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002266_9281536.pth |
|
[2025-04-29 03:40:23,841][03043] Fps is (10 sec: 3280.6, 60 sec: 3617.8, 300 sec: 3554.7). Total num frames: 9732096. Throughput: 0: 899.6. Samples: 2433728. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:40:23,842][03043] Avg episode reward: [(0, '22.274')] |
|
[2025-04-29 03:40:28,864][03043] Fps is (10 sec: 3678.5, 60 sec: 3686.9, 300 sec: 3568.0). Total num frames: 9752576. Throughput: 0: 909.1. Samples: 2436576. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:40:28,866][03043] Avg episode reward: [(0, '22.898')] |
|
[2025-04-29 03:40:33,839][03043] Fps is (10 sec: 3687.3, 60 sec: 3619.3, 300 sec: 3568.5). Total num frames: 9768960. Throughput: 0: 901.6. Samples: 2441888. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:40:33,840][03043] Avg episode reward: [(0, '24.030')] |
|
[2025-04-29 03:40:38,843][03043] Fps is (10 sec: 3694.3, 60 sec: 3618.2, 300 sec: 3582.6). Total num frames: 9789440. Throughput: 0: 903.9. Samples: 2447392. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:40:38,844][03043] Avg episode reward: [(0, '23.349')] |
|
[2025-04-29 03:40:43,855][03043] Fps is (10 sec: 3680.3, 60 sec: 3617.5, 300 sec: 3568.2). Total num frames: 9805824. Throughput: 0: 910.3. Samples: 2450272. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:40:43,860][03043] Avg episode reward: [(0, '23.254')] |
|
[2025-04-29 03:40:48,884][03043] Fps is (10 sec: 3671.2, 60 sec: 3617.2, 300 sec: 3581.7). Total num frames: 9826304. Throughput: 0: 899.7. Samples: 2455456. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:40:48,885][03043] Avg episode reward: [(0, '23.345')] |
|
[2025-04-29 03:40:53,854][03043] Fps is (10 sec: 3686.8, 60 sec: 3617.8, 300 sec: 3582.2). Total num frames: 9842688. Throughput: 0: 918.2. Samples: 2461312. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:40:53,855][03043] Avg episode reward: [(0, '23.790')] |
|
[2025-04-29 03:40:58,857][03043] Fps is (10 sec: 3285.6, 60 sec: 3618.2, 300 sec: 3582.3). Total num frames: 9859072. Throughput: 0: 913.0. Samples: 2463968. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:40:58,858][03043] Avg episode reward: [(0, '24.600')] |
|
[2025-04-29 03:41:03,851][03043] Fps is (10 sec: 3687.7, 60 sec: 3618.5, 300 sec: 3596.1). Total num frames: 9879552. Throughput: 0: 893.7. Samples: 2468896. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:41:03,856][03043] Avg episode reward: [(0, '24.611')] |
|
[2025-04-29 03:41:08,845][03043] Fps is (10 sec: 3691.0, 60 sec: 3619.1, 300 sec: 3596.1). Total num frames: 9895936. Throughput: 0: 900.2. Samples: 2474240. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:41:08,846][03043] Avg episode reward: [(0, '26.067')] |
|
[2025-04-29 03:41:13,849][03043] Fps is (10 sec: 3277.5, 60 sec: 3550.1, 300 sec: 3596.1). Total num frames: 9912320. Throughput: 0: 884.9. Samples: 2476384. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:41:13,850][03043] Avg episode reward: [(0, '27.012')] |
|
[2025-04-29 03:41:13,886][03043] Saving new best policy, reward=27.012! |
|
[2025-04-29 03:41:18,852][03043] Fps is (10 sec: 3274.4, 60 sec: 3549.3, 300 sec: 3582.2). Total num frames: 9928704. Throughput: 0: 887.2. Samples: 2481824. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:41:18,853][03043] Avg episode reward: [(0, '27.151')] |
|
[2025-04-29 03:41:18,895][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002424_9928704.pth... |
|
[2025-04-29 03:41:18,949][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002319_9498624.pth |
|
[2025-04-29 03:41:18,957][03043] Saving new best policy, reward=27.151! |
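
Alongside the rolling checkpoints, the learner also tracks the best smoothed episode reward seen so far and writes a separate "best policy" snapshot whenever it improves, as the two "Saving new best policy" lines above show. A minimal sketch of that bookkeeping (not the library's actual code):

def maybe_save_best(avg_episode_reward, best_so_far, save_fn):
    """Persist a 'best policy' snapshot when the smoothed reward improves."""
    if avg_episode_reward > best_so_far:
        save_fn()
        print(f"Saving new best policy, reward={avg_episode_reward:.3f}!")
        return avg_episode_reward
    return best_so_far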
|
[2025-04-29 03:41:23,877][03043] Fps is (10 sec: 3267.6, 60 sec: 3547.8, 300 sec: 3582.4). Total num frames: 9945088. Throughput: 0: 872.6. Samples: 2486688. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:41:23,880][03043] Avg episode reward: [(0, '26.892')] |
|
[2025-04-29 03:41:28,853][03043] Fps is (10 sec: 3276.4, 60 sec: 3482.2, 300 sec: 3582.1). Total num frames: 9961472. Throughput: 0: 863.3. Samples: 2489120. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:41:28,854][03043] Avg episode reward: [(0, '24.371')] |
|
[2025-04-29 03:41:33,847][03043] Fps is (10 sec: 3697.4, 60 sec: 3549.4, 300 sec: 3596.0). Total num frames: 9981952. Throughput: 0: 870.4. Samples: 2494592. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:41:33,849][03043] Avg episode reward: [(0, '23.474')] |
|
[2025-04-29 03:41:38,904][03043] Fps is (10 sec: 3260.2, 60 sec: 3409.9, 300 sec: 3581.8). Total num frames: 9994240. Throughput: 0: 845.3. Samples: 2499392. Policy #0 lag: (min: 0.0, avg: 0.0, max: 1.0) |
|
[2025-04-29 03:41:38,905][03043] Avg episode reward: [(0, '21.784')] |
|
[2025-04-29 03:41:41,256][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002443_10006528.pth... |
|
[2025-04-29 03:41:41,308][03043] Removing /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002372_9715712.pth |
|
[2025-04-29 03:41:41,353][03043] Stopping RolloutWorker_w0... |
|
[2025-04-29 03:41:41,384][03043] Stopping RolloutWorker_w6... |
|
[2025-04-29 03:41:41,385][03043] Stopping Batcher_0... |
|
[2025-04-29 03:41:41,388][03043] Stopping InferenceWorker_p0-w0... |
|
[2025-04-29 03:41:41,423][03043] Stopping RolloutWorker_w5... |
|
[2025-04-29 03:41:41,458][03043] Stopping RolloutWorker_w2... |
|
[2025-04-29 03:41:41,491][03043] Stopping RolloutWorker_w7... |
|
[2025-04-29 03:41:41,523][03043] Stopping RolloutWorker_w1... |
|
[2025-04-29 03:41:41,555][03043] Stopping RolloutWorker_w4... |
|
[2025-04-29 03:41:41,589][03043] Stopping RolloutWorker_w3... |
|
[2025-04-29 03:41:41,590][03043] Component RolloutWorker_w0 stopped! |
|
[2025-04-29 03:41:41,591][03043] Component RolloutWorker_w6 stopped! |
|
[2025-04-29 03:41:41,592][03043] Saving /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002443_10006528.pth... |
|
[2025-04-29 03:41:41,672][03043] Stopping LearnerWorker_p0... |
|
[2025-04-29 03:41:41,675][03043] Component Batcher_0 stopped! |
|
[2025-04-29 03:41:41,676][03043] Component InferenceWorker_p0-w0 stopped! |
|
[2025-04-29 03:41:41,677][03043] Component RolloutWorker_w5 stopped! |
|
[2025-04-29 03:41:41,678][03043] Component RolloutWorker_w2 stopped! |
|
[2025-04-29 03:41:41,679][03043] Component RolloutWorker_w7 stopped! |
|
[2025-04-29 03:41:41,680][03043] Component RolloutWorker_w1 stopped! |
|
[2025-04-29 03:41:41,681][03043] Component RolloutWorker_w4 stopped! |
|
[2025-04-29 03:41:41,682][03043] Component RolloutWorker_w3 stopped! |
|
[2025-04-29 03:41:41,682][03043] Component LearnerWorker_p0 stopped! |
|
[2025-04-29 03:41:41,683][03043] Batcher 0 profile tree view: |
|
batching: 18.8170, releasing_batches: 0.0323 |
|
[2025-04-29 03:41:41,684][03043] InferenceWorker_p0-w0 profile tree view: |
|
update_model: 3.3793 |
|
one_step: 0.0032 |
|
handle_policy_step: 275.5466 |
|
deserialize: 17.2195, stack: 1.9625, obs_to_device_normalize: 107.2380, forward: 111.1421, send_messages: 4.6593 |
|
prepare_outputs: 25.5853 |
|
to_cpu: 12.7354 |
|
[2025-04-29 03:41:41,685][03043] Learner 0 profile tree view: |
|
misc: 0.0045, prepare_batch: 22.6921 |
|
train: 84.6235 |
|
epoch_init: 0.0075, minibatch_init: 0.0113, losses_postprocess: 0.3744, kl_divergence: 0.4841, after_optimizer: 27.6353 |
|
calculate_losses: 41.0321 |
|
losses_init: 0.0053, forward_head: 1.4959, bptt_initial: 29.2698, tail: 1.0775, advantages_returns: 0.3053, losses: 6.3956 |
|
bptt: 2.1823 |
|
bptt_forward_core: 2.0914 |
|
update: 14.4153 |
|
clip: 1.6325 |
|
[2025-04-29 03:41:41,686][03043] RolloutWorker_w0 profile tree view: |
|
wait_for_trajectories: 0.2023, enqueue_policy_requests: 12.2841, env_step: 256.7210, overhead: 12.3883, complete_rollouts: 0.2076 |
|
save_policy_outputs: 15.7590 |
|
split_output_tensors: 6.1371 |
|
[2025-04-29 03:41:41,687][03043] RolloutWorker_w7 profile tree view: |
|
wait_for_trajectories: 0.1921, enqueue_policy_requests: 12.1777, env_step: 252.0383, overhead: 12.1271, complete_rollouts: 0.1978 |
|
save_policy_outputs: 14.5914 |
|
split_output_tensors: 5.9437 |
|
[2025-04-29 03:41:41,688][03043] Loop Runner_EvtLoop terminating... |
|
[2025-04-29 03:41:41,689][03043] Runner profile tree view: |
|
main_loop: 2897.3218 |
|
[2025-04-29 03:41:41,690][03043] Collected {0: 10006528}, FPS: 3453.7 |
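
The final throughput figure is consistent with total collected frames divided by the main-loop wall time reported just above: 10006528 / 2897.3218 ≈ 3453.7 FPS. A quick sanity check:

total_frames = 10_006_528        # from "Collected {0: 10006528}"
main_loop_seconds = 2897.3218    # from "main_loop: 2897.3218"
print(f"{total_frames / main_loop_seconds:.1f}")  # -> 3453.7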
|
[2025-04-29 03:41:57,796][03043] Loading existing experiment configuration from /content/train_dir/default_experiment/config.json |
|
[2025-04-29 03:41:57,798][03043] Overriding arg 'num_workers' with value 1 passed from command line |
|
[2025-04-29 03:41:57,799][03043] Adding new argument 'no_render'=True that is not in the saved config file! |
|
[2025-04-29 03:41:57,800][03043] Adding new argument 'save_video'=True that is not in the saved config file! |
|
[2025-04-29 03:41:57,801][03043] Adding new argument 'video_frames'=1000000000.0 that is not in the saved config file! |
|
[2025-04-29 03:41:57,802][03043] Adding new argument 'video_name'=None that is not in the saved config file! |
|
[2025-04-29 03:41:57,803][03043] Adding new argument 'max_num_frames'=1000000000.0 that is not in the saved config file! |
|
[2025-04-29 03:41:57,804][03043] Adding new argument 'max_num_episodes'=10 that is not in the saved config file! |
|
[2025-04-29 03:41:57,805][03043] Adding new argument 'push_to_hub'=False that is not in the saved config file! |
|
[2025-04-29 03:41:57,805][03043] Adding new argument 'hf_repository'=None that is not in the saved config file! |
|
[2025-04-29 03:41:57,806][03043] Adding new argument 'policy_index'=0 that is not in the saved config file! |
|
[2025-04-29 03:41:57,807][03043] Adding new argument 'eval_deterministic'=False that is not in the saved config file! |
|
[2025-04-29 03:41:57,808][03043] Adding new argument 'train_script'=None that is not in the saved config file! |
|
[2025-04-29 03:41:57,809][03043] Adding new argument 'enjoy_script'=None that is not in the saved config file! |
|
[2025-04-29 03:41:57,810][03043] Using frameskip 1 and render_action_repeat=4 for evaluation |
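
The "Overriding arg" and "Adding new argument" messages above describe how the evaluation run merges command-line values into the saved training config: values already present in config.json are replaced, and values missing from the file are added with a warning. A minimal sketch of that merge behaviour, assuming the saved config is plain JSON (an illustration, not the library's own code):

import json

def merge_config(config_path, overrides):
    """Merge command-line overrides into a saved experiment config."""
    with open(config_path) as f:
        cfg = json.load(f)
    for key, value in overrides.items():
        if key in cfg:
            print(f"Overriding arg '{key}' with value {value!r} passed from command line")
        else:
            print(f"Adding new argument '{key}'={value!r} that is not in the saved config file!")
        cfg[key] = value
    return cfg

# e.g. merge_config("/content/train_dir/default_experiment/config.json",
#                   {"num_workers": 1, "no_render": True, "save_video": True})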
|
[2025-04-29 03:41:57,820][03043] RunningMeanStd input shape: (3, 72, 128) |
|
[2025-04-29 03:41:57,821][03043] RunningMeanStd input shape: (1,) |
|
[2025-04-29 03:41:57,831][03043] ConvEncoder: input_channels=3 |
|
[2025-04-29 03:41:57,863][03043] Conv encoder output size: 512 |
|
[2025-04-29 03:41:57,863][03043] Policy head output size: 512 |
|
[2025-04-29 03:41:57,880][03043] Loading state from checkpoint /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002443_10006528.pth... |
|
[2025-04-29 03:41:57,883][03043] Could not load from checkpoint, attempt 0 |
|
Traceback (most recent call last): |
|
File "/usr/local/lib/python3.11/dist-packages/sample_factory/algo/learning/learner.py", line 281, in load_checkpoint |
|
checkpoint_dict = torch.load(latest_checkpoint, map_location=device) |
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
File "/usr/local/lib/python3.11/dist-packages/torch/serialization.py", line 1470, in load |
|
raise pickle.UnpicklingError(_get_wo_message(str(e))) from None |
|
_pickle.UnpicklingError: Weights only load failed. This file can still be loaded, to do so you have two options, do those steps only if you trust the source of the checkpoint. |
|
(1) In PyTorch 2.6, we changed the default value of the `weights_only` argument in `torch.load` from `False` to `True`. Re-running `torch.load` with `weights_only` set to `False` will likely succeed, but it can result in arbitrary code execution. Do it only if you got the file from a trusted source. |
|
(2) Alternatively, to load with `weights_only=True` please check the recommended steps in the following error message. |
|
WeightsUnpickler error: Unsupported global: GLOBAL numpy.core.multiarray.scalar was not an allowed global by default. Please use `torch.serialization.add_safe_globals([scalar])` or the `torch.serialization.safe_globals([scalar])` context manager to allowlist this global if you trust this class/function. |
|
|
|
Check the documentation of torch.load to learn more about types accepted by default with weights_only https://pytorch.org/docs/stable/generated/torch.load.html. |
|
[2025-04-29 03:41:57,885][03043] Loading state from checkpoint /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002443_10006528.pth... |
|
[2025-04-29 03:41:57,887][03043] Could not load from checkpoint, attempt 1 |
|
Traceback (most recent call last): |
|
File "/usr/local/lib/python3.11/dist-packages/sample_factory/algo/learning/learner.py", line 281, in load_checkpoint |
|
checkpoint_dict = torch.load(latest_checkpoint, map_location=device) |
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
File "/usr/local/lib/python3.11/dist-packages/torch/serialization.py", line 1470, in load |
|
raise pickle.UnpicklingError(_get_wo_message(str(e))) from None |
|
_pickle.UnpicklingError: Weights only load failed. This file can still be loaded, to do so you have two options, do those steps only if you trust the source of the checkpoint. |
|
(1) In PyTorch 2.6, we changed the default value of the `weights_only` argument in `torch.load` from `False` to `True`. Re-running `torch.load` with `weights_only` set to `False` will likely succeed, but it can result in arbitrary code execution. Do it only if you got the file from a trusted source. |
|
(2) Alternatively, to load with `weights_only=True` please check the recommended steps in the following error message. |
|
WeightsUnpickler error: Unsupported global: GLOBAL numpy.core.multiarray.scalar was not an allowed global by default. Please use `torch.serialization.add_safe_globals([scalar])` or the `torch.serialization.safe_globals([scalar])` context manager to allowlist this global if you trust this class/function. |
|
|
|
Check the documentation of torch.load to learn more about types accepted by default with weights_only https://pytorch.org/docs/stable/generated/torch.load.html. |
|
[2025-04-29 03:41:57,888][03043] Loading state from checkpoint /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002443_10006528.pth... |
|
[2025-04-29 03:41:57,889][03043] Could not load from checkpoint, attempt 2 |
|
Traceback (most recent call last): |
|
File "/usr/local/lib/python3.11/dist-packages/sample_factory/algo/learning/learner.py", line 281, in load_checkpoint |
|
checkpoint_dict = torch.load(latest_checkpoint, map_location=device) |
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ |
|
File "/usr/local/lib/python3.11/dist-packages/torch/serialization.py", line 1470, in load |
|
raise pickle.UnpicklingError(_get_wo_message(str(e))) from None |
|
_pickle.UnpicklingError: Weights only load failed. This file can still be loaded, to do so you have two options, do those steps only if you trust the source of the checkpoint. |
|
(1) In PyTorch 2.6, we changed the default value of the `weights_only` argument in `torch.load` from `False` to `True`. Re-running `torch.load` with `weights_only` set to `False` will likely succeed, but it can result in arbitrary code execution. Do it only if you got the file from a trusted source. |
|
(2) Alternatively, to load with `weights_only=True` please check the recommended steps in the following error message. |
|
WeightsUnpickler error: Unsupported global: GLOBAL numpy.core.multiarray.scalar was not an allowed global by default. Please use `torch.serialization.add_safe_globals([scalar])` or the `torch.serialization.safe_globals([scalar])` context manager to allowlist this global if you trust this class/function. |
|
|
|
Check the documentation of torch.load to learn more about types accepted by default with weights_only https://pytorch.org/docs/stable/generated/torch.load.html. |
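
The three failed attempts above all hit the same PyTorch 2.6 change: torch.load now defaults to weights_only=True and refuses to unpickle numpy.core.multiarray.scalar. The error text itself names two workarounds for checkpoints from a trusted source; a sketch of both (this is not necessarily what was done before the later, successful load):

import torch
from numpy.core.multiarray import scalar  # the global named in the error

checkpoint_path = "/content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002443_10006528.pth"

# (1) Keep the safe weights-only loader but allowlist the offending global.
torch.serialization.add_safe_globals([scalar])
ckpt = torch.load(checkpoint_path, map_location="cpu", weights_only=True)

# (2) Or fall back to the pre-2.6 behaviour; this runs arbitrary pickle code,
#     so use it only for checkpoints you trust.
ckpt = torch.load(checkpoint_path, map_location="cpu", weights_only=False)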
|
[2025-04-29 03:42:27,294][03043] Loading existing experiment configuration from /content/train_dir/default_experiment/config.json |
|
[2025-04-29 03:42:27,295][03043] Overriding arg 'num_workers' with value 1 passed from command line |
|
[2025-04-29 03:42:27,295][03043] Adding new argument 'no_render'=True that is not in the saved config file! |
|
[2025-04-29 03:42:27,296][03043] Adding new argument 'save_video'=True that is not in the saved config file! |
|
[2025-04-29 03:42:27,297][03043] Adding new argument 'video_frames'=1000000000.0 that is not in the saved config file! |
|
[2025-04-29 03:42:27,298][03043] Adding new argument 'video_name'=None that is not in the saved config file! |
|
[2025-04-29 03:42:27,299][03043] Adding new argument 'max_num_frames'=1000000000.0 that is not in the saved config file! |
|
[2025-04-29 03:42:27,300][03043] Adding new argument 'max_num_episodes'=10 that is not in the saved config file! |
|
[2025-04-29 03:42:27,301][03043] Adding new argument 'push_to_hub'=False that is not in the saved config file! |
|
[2025-04-29 03:42:27,302][03043] Adding new argument 'hf_repository'=None that is not in the saved config file! |
|
[2025-04-29 03:42:27,303][03043] Adding new argument 'policy_index'=0 that is not in the saved config file! |
|
[2025-04-29 03:42:27,304][03043] Adding new argument 'eval_deterministic'=False that is not in the saved config file! |
|
[2025-04-29 03:42:27,304][03043] Adding new argument 'train_script'=None that is not in the saved config file! |
|
[2025-04-29 03:42:27,305][03043] Adding new argument 'enjoy_script'=None that is not in the saved config file! |
|
[2025-04-29 03:42:27,306][03043] Using frameskip 1 and render_action_repeat=4 for evaluation |
|
[2025-04-29 03:42:27,314][03043] RunningMeanStd input shape: (3, 72, 128) |
|
[2025-04-29 03:42:27,315][03043] RunningMeanStd input shape: (1,) |
|
[2025-04-29 03:42:27,327][03043] ConvEncoder: input_channels=3 |
|
[2025-04-29 03:42:27,360][03043] Conv encoder output size: 512 |
|
[2025-04-29 03:42:27,361][03043] Policy head output size: 512 |
|
[2025-04-29 03:42:27,379][03043] Loading state from checkpoint /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002443_10006528.pth... |
|
[2025-04-29 03:42:27,844][03043] Num frames 100... |
|
[2025-04-29 03:42:27,982][03043] Num frames 200... |
|
[2025-04-29 03:42:28,105][03043] Num frames 300... |
|
[2025-04-29 03:42:28,234][03043] Num frames 400... |
|
[2025-04-29 03:42:28,364][03043] Num frames 500... |
|
[2025-04-29 03:42:28,491][03043] Num frames 600... |
|
[2025-04-29 03:42:28,620][03043] Num frames 700... |
|
[2025-04-29 03:42:28,748][03043] Num frames 800... |
|
[2025-04-29 03:42:28,835][03043] Avg episode rewards: #0: 21.240, true rewards: #0: 8.240 |
|
[2025-04-29 03:42:28,836][03043] Avg episode reward: 21.240, avg true_objective: 8.240 |
|
[2025-04-29 03:42:28,948][03043] Num frames 900... |
|
[2025-04-29 03:42:29,074][03043] Num frames 1000... |
|
[2025-04-29 03:42:29,201][03043] Num frames 1100... |
|
[2025-04-29 03:42:29,327][03043] Num frames 1200... |
|
[2025-04-29 03:42:29,454][03043] Num frames 1300... |
|
[2025-04-29 03:42:29,583][03043] Num frames 1400... |
|
[2025-04-29 03:42:29,718][03043] Avg episode rewards: #0: 15.820, true rewards: #0: 7.320 |
|
[2025-04-29 03:42:29,719][03043] Avg episode reward: 15.820, avg true_objective: 7.320 |
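
The "Avg episode rewards" figure appears to be a running mean over the evaluation episodes completed so far, so individual episode returns can be recovered from consecutive averages. For example, from episode 1 (21.240) and the mean after episode 2 (15.820):

ep1 = 21.240
mean_after_2 = 15.820
ep2 = 2 * mean_after_2 - ep1
print(f"{ep2:.3f}")  # 10.400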
|
[2025-04-29 03:42:29,768][03043] Num frames 1500... |
|
[2025-04-29 03:42:29,894][03043] Num frames 1600... |
|
[2025-04-29 03:42:30,037][03043] Num frames 1700... |
|
[2025-04-29 03:42:30,163][03043] Num frames 1800... |
|
[2025-04-29 03:42:30,294][03043] Num frames 1900... |
|
[2025-04-29 03:42:30,366][03043] Avg episode rewards: #0: 13.707, true rewards: #0: 6.373 |
|
[2025-04-29 03:42:30,367][03043] Avg episode reward: 13.707, avg true_objective: 6.373 |
|
[2025-04-29 03:42:30,481][03043] Num frames 2000... |
|
[2025-04-29 03:42:30,612][03043] Num frames 2100... |
|
[2025-04-29 03:42:30,742][03043] Num frames 2200... |
|
[2025-04-29 03:42:30,868][03043] Num frames 2300... |
|
[2025-04-29 03:42:31,006][03043] Num frames 2400... |
|
[2025-04-29 03:42:31,138][03043] Num frames 2500... |
|
[2025-04-29 03:42:31,265][03043] Num frames 2600... |
|
[2025-04-29 03:42:31,392][03043] Num frames 2700... |
|
[2025-04-29 03:42:31,504][03043] Avg episode rewards: #0: 14.110, true rewards: #0: 6.860 |
|
[2025-04-29 03:42:31,505][03043] Avg episode reward: 14.110, avg true_objective: 6.860 |
|
[2025-04-29 03:42:31,576][03043] Num frames 2800... |
|
[2025-04-29 03:42:31,702][03043] Num frames 2900... |
|
[2025-04-29 03:42:31,833][03043] Num frames 3000... |
|
[2025-04-29 03:42:31,961][03043] Num frames 3100... |
|
[2025-04-29 03:42:32,098][03043] Num frames 3200... |
|
[2025-04-29 03:42:32,225][03043] Num frames 3300... |
|
[2025-04-29 03:42:32,354][03043] Num frames 3400... |
|
[2025-04-29 03:42:32,483][03043] Num frames 3500... |
|
[2025-04-29 03:42:32,613][03043] Num frames 3600... |
|
[2025-04-29 03:42:32,743][03043] Num frames 3700... |
|
[2025-04-29 03:42:32,871][03043] Num frames 3800... |
|
[2025-04-29 03:42:32,999][03043] Num frames 3900... |
|
[2025-04-29 03:42:33,146][03043] Num frames 4000... |
|
[2025-04-29 03:42:33,325][03043] Num frames 4100... |
|
[2025-04-29 03:42:33,497][03043] Num frames 4200... |
|
[2025-04-29 03:42:33,674][03043] Num frames 4300... |
|
[2025-04-29 03:42:33,838][03043] Num frames 4400... |
|
[2025-04-29 03:42:34,012][03043] Num frames 4500... |
|
[2025-04-29 03:42:34,198][03043] Num frames 4600... |
|
[2025-04-29 03:42:34,363][03043] Num frames 4700... |
|
[2025-04-29 03:42:34,544][03043] Num frames 4800... |
|
[2025-04-29 03:42:34,677][03043] Avg episode rewards: #0: 22.488, true rewards: #0: 9.688 |
|
[2025-04-29 03:42:34,678][03043] Avg episode reward: 22.488, avg true_objective: 9.688 |
|
[2025-04-29 03:42:34,777][03043] Num frames 4900... |
|
[2025-04-29 03:42:34,950][03043] Num frames 5000... |
|
[2025-04-29 03:42:35,127][03043] Num frames 5100... |
|
[2025-04-29 03:42:35,311][03043] Num frames 5200... |
|
[2025-04-29 03:42:35,437][03043] Num frames 5300... |
|
[2025-04-29 03:42:35,601][03043] Avg episode rewards: #0: 19.980, true rewards: #0: 8.980 |
|
[2025-04-29 03:42:35,603][03043] Avg episode reward: 19.980, avg true_objective: 8.980 |
|
[2025-04-29 03:42:35,622][03043] Num frames 5400... |
|
[2025-04-29 03:42:35,749][03043] Num frames 5500... |
|
[2025-04-29 03:42:35,877][03043] Num frames 5600... |
|
[2025-04-29 03:42:36,006][03043] Num frames 5700... |
|
[2025-04-29 03:42:36,134][03043] Num frames 5800... |
|
[2025-04-29 03:42:36,275][03043] Num frames 5900... |
|
[2025-04-29 03:42:36,413][03043] Avg episode rewards: #0: 18.520, true rewards: #0: 8.520 |
|
[2025-04-29 03:42:36,414][03043] Avg episode reward: 18.520, avg true_objective: 8.520 |
|
[2025-04-29 03:42:36,465][03043] Num frames 6000... |
|
[2025-04-29 03:42:36,594][03043] Num frames 6100... |
|
[2025-04-29 03:42:36,726][03043] Num frames 6200... |
|
[2025-04-29 03:42:36,860][03043] Num frames 6300... |
|
[2025-04-29 03:42:36,988][03043] Num frames 6400... |
|
[2025-04-29 03:42:37,140][03043] Avg episode rewards: #0: 17.095, true rewards: #0: 8.095 |
|
[2025-04-29 03:42:37,141][03043] Avg episode reward: 17.095, avg true_objective: 8.095 |
|
[2025-04-29 03:42:37,175][03043] Num frames 6500... |
|
[2025-04-29 03:42:37,316][03043] Num frames 6600... |
|
[2025-04-29 03:42:37,447][03043] Num frames 6700... |
|
[2025-04-29 03:42:37,578][03043] Num frames 6800... |
|
[2025-04-29 03:42:37,708][03043] Num frames 6900... |
|
[2025-04-29 03:42:37,840][03043] Num frames 7000... |
|
[2025-04-29 03:42:38,107][03043] Num frames 7100... |
|
[2025-04-29 03:42:38,315][03043] Num frames 7200... |
|
[2025-04-29 03:42:38,444][03043] Num frames 7300... |
|
[2025-04-29 03:42:38,572][03043] Num frames 7400... |
|
[2025-04-29 03:42:38,653][03043] Avg episode rewards: #0: 17.578, true rewards: #0: 8.244 |
|
[2025-04-29 03:42:38,654][03043] Avg episode reward: 17.578, avg true_objective: 8.244 |
|
[2025-04-29 03:42:38,815][03043] Num frames 7500... |
|
[2025-04-29 03:42:39,068][03043] Num frames 7600... |
|
[2025-04-29 03:42:39,183][03043] Avg episode rewards: #0: 16.144, true rewards: #0: 7.644 |
|
[2025-04-29 03:42:39,184][03043] Avg episode reward: 16.144, avg true_objective: 7.644 |
|
[2025-04-29 03:43:23,435][03043] Replay video saved to /content/train_dir/default_experiment/replay.mp4! |
|
[2025-04-29 03:44:16,284][03043] Loading existing experiment configuration from /content/train_dir/default_experiment/config.json |
|
[2025-04-29 03:44:16,285][03043] Overriding arg 'num_workers' with value 1 passed from command line |
|
[2025-04-29 03:44:16,286][03043] Adding new argument 'no_render'=True that is not in the saved config file! |
|
[2025-04-29 03:44:16,287][03043] Adding new argument 'save_video'=True that is not in the saved config file! |
|
[2025-04-29 03:44:16,288][03043] Adding new argument 'video_frames'=1000000000.0 that is not in the saved config file! |
|
[2025-04-29 03:44:16,289][03043] Adding new argument 'video_name'=None that is not in the saved config file! |
|
[2025-04-29 03:44:16,290][03043] Adding new argument 'max_num_frames'=100000 that is not in the saved config file! |
|
[2025-04-29 03:44:16,291][03043] Adding new argument 'max_num_episodes'=10 that is not in the saved config file! |
|
[2025-04-29 03:44:16,292][03043] Adding new argument 'push_to_hub'=True that is not in the saved config file! |
|
[2025-04-29 03:44:16,293][03043] Adding new argument 'hf_repository'='s94lopez/rl_course_vizdoom_health_gathering_supreme' that is not in the saved config file! |
|
[2025-04-29 03:44:16,295][03043] Adding new argument 'policy_index'=0 that is not in the saved config file! |
|
[2025-04-29 03:44:16,296][03043] Adding new argument 'eval_deterministic'=False that is not in the saved config file! |
|
[2025-04-29 03:44:16,297][03043] Adding new argument 'train_script'=None that is not in the saved config file! |
|
[2025-04-29 03:44:16,299][03043] Adding new argument 'enjoy_script'=None that is not in the saved config file! |
|
[2025-04-29 03:44:16,301][03043] Using frameskip 1 and render_action_repeat=4 for evaluation |
|
[2025-04-29 03:44:16,309][03043] RunningMeanStd input shape: (3, 72, 128) |
|
[2025-04-29 03:44:16,311][03043] RunningMeanStd input shape: (1,) |
|
[2025-04-29 03:44:16,322][03043] ConvEncoder: input_channels=3 |
|
[2025-04-29 03:44:16,361][03043] Conv encoder output size: 512 |
|
[2025-04-29 03:44:16,362][03043] Policy head output size: 512 |
|
[2025-04-29 03:44:16,382][03043] Loading state from checkpoint /content/train_dir/default_experiment/checkpoint_p0/checkpoint_000002443_10006528.pth... |
|
[2025-04-29 03:44:16,850][03043] Num frames 100... |
|
[2025-04-29 03:44:16,979][03043] Num frames 200... |
|
[2025-04-29 03:44:17,106][03043] Num frames 300... |
|
[2025-04-29 03:44:17,234][03043] Num frames 400... |
|
[2025-04-29 03:44:17,367][03043] Num frames 500... |
|
[2025-04-29 03:44:17,496][03043] Num frames 600... |
|
[2025-04-29 03:44:17,635][03043] Num frames 700... |
|
[2025-04-29 03:44:17,764][03043] Num frames 800... |
|
[2025-04-29 03:44:17,902][03043] Num frames 900... |
|
[2025-04-29 03:44:18,039][03043] Num frames 1000... |
|
[2025-04-29 03:44:18,172][03043] Num frames 1100... |
|
[2025-04-29 03:44:18,310][03043] Num frames 1200... |
|
[2025-04-29 03:44:18,442][03043] Num frames 1300... |
|
[2025-04-29 03:44:18,573][03043] Num frames 1400... |
|
[2025-04-29 03:44:18,713][03043] Num frames 1500... |
|
[2025-04-29 03:44:18,845][03043] Num frames 1600... |
|
[2025-04-29 03:44:18,975][03043] Num frames 1700... |
|
[2025-04-29 03:44:19,109][03043] Num frames 1800... |
|
[2025-04-29 03:44:19,249][03043] Num frames 1900... |
|
[2025-04-29 03:44:19,397][03043] Num frames 2000... |
|
[2025-04-29 03:44:19,548][03043] Num frames 2100... |
|
[2025-04-29 03:44:19,600][03043] Avg episode rewards: #0: 58.999, true rewards: #0: 21.000 |
|
[2025-04-29 03:44:19,601][03043] Avg episode reward: 58.999, avg true_objective: 21.000 |
|
[2025-04-29 03:44:19,746][03043] Num frames 2200... |
|
[2025-04-29 03:44:19,880][03043] Num frames 2300... |
|
[2025-04-29 03:44:20,015][03043] Num frames 2400... |
|
[2025-04-29 03:44:20,145][03043] Num frames 2500... |
|
[2025-04-29 03:44:20,281][03043] Num frames 2600... |
|
[2025-04-29 03:44:20,412][03043] Num frames 2700... |
|
[2025-04-29 03:44:20,541][03043] Num frames 2800... |
|
[2025-04-29 03:44:20,681][03043] Avg episode rewards: #0: 37.339, true rewards: #0: 14.340 |
|
[2025-04-29 03:44:20,682][03043] Avg episode reward: 37.339, avg true_objective: 14.340 |
|
[2025-04-29 03:44:20,738][03043] Num frames 2900... |
|
[2025-04-29 03:44:20,874][03043] Num frames 3000... |
|
[2025-04-29 03:44:21,002][03043] Num frames 3100... |
|
[2025-04-29 03:44:21,130][03043] Num frames 3200... |
|
[2025-04-29 03:44:21,259][03043] Num frames 3300... |
|
[2025-04-29 03:44:21,400][03043] Num frames 3400... |
|
[2025-04-29 03:44:21,575][03043] Num frames 3500... |
|
[2025-04-29 03:44:21,767][03043] Num frames 3600... |
|
[2025-04-29 03:44:21,957][03043] Num frames 3700... |
|
[2025-04-29 03:44:22,140][03043] Num frames 3800... |
|
[2025-04-29 03:44:22,330][03043] Num frames 3900... |
|
[2025-04-29 03:44:22,499][03043] Num frames 4000... |
|
[2025-04-29 03:44:22,663][03043] Num frames 4100... |
|
[2025-04-29 03:44:22,834][03043] Num frames 4200... |
|
[2025-04-29 03:44:22,971][03043] Avg episode rewards: #0: 36.146, true rewards: #0: 14.147 |
|
[2025-04-29 03:44:22,973][03043] Avg episode reward: 36.146, avg true_objective: 14.147 |
|
[2025-04-29 03:44:23,075][03043] Num frames 4300... |
|
[2025-04-29 03:44:23,248][03043] Num frames 4400... |
|
[2025-04-29 03:44:23,430][03043] Num frames 4500... |
|
[2025-04-29 03:44:23,608][03043] Num frames 4600... |
|
[2025-04-29 03:44:23,748][03043] Num frames 4700... |
|
[2025-04-29 03:44:23,880][03043] Num frames 4800... |
|
[2025-04-29 03:44:24,013][03043] Num frames 4900... |
|
[2025-04-29 03:44:24,140][03043] Num frames 5000... |
|
[2025-04-29 03:44:24,272][03043] Avg episode rewards: #0: 32.647, true rewards: #0: 12.647 |
|
[2025-04-29 03:44:24,273][03043] Avg episode reward: 32.647, avg true_objective: 12.647 |
|
[2025-04-29 03:44:24,327][03043] Num frames 5100... |
|
[2025-04-29 03:44:24,453][03043] Num frames 5200... |
|
[2025-04-29 03:44:24,578][03043] Num frames 5300... |
|
[2025-04-29 03:44:24,702][03043] Num frames 5400... |
|
[2025-04-29 03:44:24,827][03043] Num frames 5500... |
|
[2025-04-29 03:44:24,968][03043] Num frames 5600... |
|
[2025-04-29 03:44:25,096][03043] Num frames 5700... |
|
[2025-04-29 03:44:25,227][03043] Num frames 5800... |
|
[2025-04-29 03:44:25,357][03043] Num frames 5900... |
|
[2025-04-29 03:44:25,489][03043] Num frames 6000... |
|
[2025-04-29 03:44:25,616][03043] Num frames 6100... |
|
[2025-04-29 03:44:25,749][03043] Num frames 6200... |
|
[2025-04-29 03:44:25,879][03043] Num frames 6300... |
|
[2025-04-29 03:44:26,019][03043] Num frames 6400... |
|
[2025-04-29 03:44:26,078][03043] Avg episode rewards: #0: 33.206, true rewards: #0: 12.806 |
|
[2025-04-29 03:44:26,079][03043] Avg episode reward: 33.206, avg true_objective: 12.806 |
|
[2025-04-29 03:44:26,205][03043] Num frames 6500... |
|
[2025-04-29 03:44:26,338][03043] Num frames 6600... |
|
[2025-04-29 03:44:26,466][03043] Num frames 6700... |
|
[2025-04-29 03:44:26,593][03043] Num frames 6800... |
|
[2025-04-29 03:44:26,749][03043] Num frames 6900... |
|
[2025-04-29 03:44:26,880][03043] Num frames 7000... |
|
[2025-04-29 03:44:27,019][03043] Num frames 7100... |
|
[2025-04-29 03:44:27,148][03043] Num frames 7200... |
|
[2025-04-29 03:44:27,276][03043] Num frames 7300... |
|
[2025-04-29 03:44:27,408][03043] Num frames 7400... |
|
[2025-04-29 03:44:27,540][03043] Num frames 7500... |
|
[2025-04-29 03:44:27,668][03043] Num frames 7600... |
|
[2025-04-29 03:44:27,798][03043] Num frames 7700... |
|
[2025-04-29 03:44:27,931][03043] Num frames 7800... |
|
[2025-04-29 03:44:28,073][03043] Num frames 7900... |
|
[2025-04-29 03:44:28,210][03043] Num frames 8000... |
|
[2025-04-29 03:44:28,343][03043] Num frames 8100... |
|
[2025-04-29 03:44:28,483][03043] Avg episode rewards: #0: 34.771, true rewards: #0: 13.605 |
|
[2025-04-29 03:44:28,485][03043] Avg episode reward: 34.771, avg true_objective: 13.605 |
|
[2025-04-29 03:44:28,536][03043] Num frames 8200... |
|
[2025-04-29 03:44:28,668][03043] Num frames 8300... |
|
[2025-04-29 03:44:28,798][03043] Num frames 8400... |
|
[2025-04-29 03:44:28,928][03043] Num frames 8500... |
|
[2025-04-29 03:44:29,071][03043] Num frames 8600... |
|
[2025-04-29 03:44:29,203][03043] Num frames 8700... |
|
[2025-04-29 03:44:29,340][03043] Num frames 8800... |
|
[2025-04-29 03:44:29,473][03043] Num frames 8900... |
|
[2025-04-29 03:44:29,616][03043] Num frames 9000... |
|
[2025-04-29 03:44:29,748][03043] Num frames 9100... |
|
[2025-04-29 03:44:29,878][03043] Num frames 9200... |
|
[2025-04-29 03:44:30,006][03043] Num frames 9300... |
|
[2025-04-29 03:44:30,148][03043] Num frames 9400... |
|
[2025-04-29 03:44:30,276][03043] Num frames 9500... |
|
[2025-04-29 03:44:30,409][03043] Num frames 9600... |
|
[2025-04-29 03:44:30,541][03043] Num frames 9700... |
|
[2025-04-29 03:44:30,673][03043] Num frames 9800... |
|
[2025-04-29 03:44:30,806][03043] Num frames 9900... |
|
[2025-04-29 03:44:30,866][03043] Avg episode rewards: #0: 36.860, true rewards: #0: 14.146 |
|
[2025-04-29 03:44:30,867][03043] Avg episode reward: 36.860, avg true_objective: 14.146 |
|
[2025-04-29 03:44:30,996][03043] Num frames 10000... |
|
[2025-04-29 03:44:31,145][03043] Num frames 10100... |
|
[2025-04-29 03:44:31,277][03043] Num frames 10200... |
|
[2025-04-29 03:44:31,406][03043] Num frames 10300... |
|
[2025-04-29 03:44:31,536][03043] Num frames 10400... |
|
[2025-04-29 03:44:31,665][03043] Num frames 10500... |
|
[2025-04-29 03:44:31,794][03043] Num frames 10600... |
|
[2025-04-29 03:44:31,919][03043] Num frames 10700... |
|
[2025-04-29 03:44:32,051][03043] Num frames 10800... |
|
[2025-04-29 03:44:32,192][03043] Num frames 10900... |
|
[2025-04-29 03:44:32,326][03043] Num frames 11000... |
|
[2025-04-29 03:44:32,460][03043] Num frames 11100... |
|
[2025-04-29 03:44:32,590][03043] Num frames 11200... |
|
[2025-04-29 03:44:32,720][03043] Num frames 11300... |
|
[2025-04-29 03:44:32,850][03043] Num frames 11400... |
|
[2025-04-29 03:44:32,980][03043] Num frames 11500... |
|
[2025-04-29 03:44:33,107][03043] Num frames 11600... |
|
[2025-04-29 03:44:33,246][03043] Num frames 11700... |
|
[2025-04-29 03:44:33,382][03043] Num frames 11800... |
|
[2025-04-29 03:44:33,512][03043] Num frames 11900... |
|
[2025-04-29 03:44:33,646][03043] Num frames 12000... |
|
[2025-04-29 03:44:33,707][03043] Avg episode rewards: #0: 38.877, true rewards: #0: 15.003 |
|
[2025-04-29 03:44:33,708][03043] Avg episode reward: 38.877, avg true_objective: 15.003 |
|
[2025-04-29 03:44:33,879][03043] Num frames 12100... |
|
[2025-04-29 03:44:34,061][03043] Num frames 12200... |
|
[2025-04-29 03:44:34,238][03043] Num frames 12300... |
|
[2025-04-29 03:44:34,411][03043] Num frames 12400... |
|
[2025-04-29 03:44:34,586][03043] Num frames 12500... |
|
[2025-04-29 03:44:34,753][03043] Num frames 12600... |
|
[2025-04-29 03:44:34,917][03043] Num frames 12700... |
|
[2025-04-29 03:44:35,091][03043] Num frames 12800... |
|
[2025-04-29 03:44:35,233][03043] Avg episode rewards: #0: 36.829, true rewards: #0: 14.273 |
|
[2025-04-29 03:44:35,235][03043] Avg episode reward: 36.829, avg true_objective: 14.273 |
|
[2025-04-29 03:44:35,343][03043] Num frames 12900... |
|
[2025-04-29 03:44:35,527][03043] Num frames 13000... |
|
[2025-04-29 03:44:35,708][03043] Num frames 13100... |
|
[2025-04-29 03:44:35,862][03043] Num frames 13200... |
|
[2025-04-29 03:44:35,990][03043] Num frames 13300... |
|
[2025-04-29 03:44:36,115][03043] Num frames 13400... |
|
[2025-04-29 03:44:36,247][03043] Num frames 13500... |
|
[2025-04-29 03:44:36,389][03043] Num frames 13600... |
|
[2025-04-29 03:44:36,518][03043] Num frames 13700... |
|
[2025-04-29 03:44:36,669][03043] Avg episode rewards: #0: 34.774, true rewards: #0: 13.774 |
|
[2025-04-29 03:44:36,670][03043] Avg episode reward: 34.774, avg true_objective: 13.774 |
|
[2025-04-29 03:45:55,051][03043] Replay video saved to /content/train_dir/default_experiment/replay.mp4! |
|
|