IDAgents Developer committed on
Commit 346e39a · 1 Parent(s): 99d0bbd

Add rate limiter testing script and documentation


- Created comprehensive test script: scripts/test_rate_limiters.py
* Tests Serper API with 15 concurrent requests
* Tests NCBI API with 15 concurrent requests (requires API key)
* Tests cache effectiveness (measures speedup from caching)
* Detects HTTP 429 errors to verify rate limiting works
* Gracefully handles missing API keys (for local testing)
* Provides detailed success/failure analysis

- Created testing guide: docs/TESTING_RATE_LIMITERS.md
* Instructions for running tests on HF Space terminal
* Option to add test button to Gradio interface
* Troubleshooting guide for HTTP 429 errors
* Performance benchmarks and expected results
* Next steps based on test outcomes

Run on HF Space: python scripts/test_rate_limiters.py

docs/TESTING_RATE_LIMITERS.md ADDED
@@ -0,0 +1,194 @@
# Testing Rate Limiters on Hugging Face Space

## Overview
This guide explains how to test the rate limiters directly on your HF Space where the NCBI API key is configured.

---

## Option 1: Run via HF Space Terminal (Recommended)

### Steps:

1. **Open your HF Space**:
   - Go to: https://huggingface.co/spaces/John-jero/IDWeekAgents

2. **Click "⋮" menu → "Open Terminal"**:
   - This opens a terminal directly on the Space

3. **Run the test script**:
   ```bash
   python scripts/test_rate_limiters.py
   ```

4. **Review the results**:
   - ✅ Success: No HTTP 429 errors = rate limiters working
   - ❌ Failure: HTTP 429 errors = rate limiters need adjustment

---

## Option 2: Add Test Button to Gradio Interface

### Create a test interface in `app.py`:

```python
def test_rate_limiters():
    """Run rate limiter tests"""
    import subprocess
    result = subprocess.run(
        ["python", "scripts/test_rate_limiters.py"],
        capture_output=True,
        text=True
    )
    return result.stdout + "\n" + result.stderr

# Add to Gradio interface
with gr.Tab("Admin Tests"):
    test_btn = gr.Button("Test Rate Limiters")
    test_output = gr.Textbox(label="Test Results", lines=30)
    test_btn.click(test_rate_limiters, outputs=test_output)
```
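
Note that `subprocess.run` blocks until the script finishes, so the button will appear busy for the full test run (roughly 15-30 seconds given the benchmarks below) before the combined stdout/stderr appears in the textbox.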

---

## Option 3: Run Locally with API Keys

If you want to test locally, add both API keys to your `.env` file:

```bash
SERPER_API_KEY=your_serper_key_here
NCBI_API_KEY=your_ncbi_key_here
```

Then run:
```bash
python scripts/test_rate_limiters.py
```

---

## What the Test Does

### 1. **Serper API Test** (15 concurrent requests)
- Tests rate limiting at 50 req/s (Dev tier)
- Checks for HTTP 429 errors
- Measures response times

### 2. **NCBI API Test** (15 concurrent requests)
- Tests rate limiting at 8 req/s (with API key)
- Checks for HTTP 429 errors
- Measures response times

### 3. **Cache Effectiveness Test**
- Runs same query twice
- Verifies 2nd request is faster (cache hit)
- Tests both Serper and NCBI caches

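These checks only pass if the wrappers space requests out instead of letting bursts reach the APIs. For reference, here is a minimal sketch of the throttle-plus-cache pattern the `core/utils` wrappers are assumed to follow (names and structure are illustrative, not the actual module API):

```python
import asyncio
import time

class SimpleRateLimiter:
    """Allow at most `rate` requests per second across concurrent tasks."""

    def __init__(self, rate: float):
        self.min_interval = 1.0 / rate      # 1/50 s for Serper, 1/8 s for NCBI
        self._lock = asyncio.Lock()
        self._next_slot = 0.0

    async def acquire(self) -> None:
        async with self._lock:
            now = time.monotonic()
            wait = self._next_slot - now
            self._next_slot = max(now, self._next_slot) + self.min_interval
        if wait > 0:
            await asyncio.sleep(wait)       # excess requests are delayed, never rejected with a 429

_cache: dict = {}                           # simple in-memory cache keyed by query
_limiter = SimpleRateLimiter(rate=8)        # e.g. 8 req/s for NCBI with an API key

async def cached_throttled_search(query: str, do_request):
    """Return a cached result instantly, otherwise wait for a rate-limit slot and call the API."""
    if query in _cache:
        return _cache[query]
    await _limiter.acquire()
    _cache[query] = await do_request(query)
    return _cache[query]
```

Because excess requests are delayed rather than rejected, concurrency shows up as extra latency (the "throttled request" delays listed under Performance Benchmarks) instead of HTTP 429 errors.
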
---

## Expected Results

### ✅ **Success:**
```
📊 Overall Success Rates:
  Serper API: 100.0%
  NCBI API: 100.0%

🎉 SUCCESS: No HTTP 429 errors detected!
  ✅ Rate limiters are working correctly
  ✅ Ready for 150-user workshop
```

### ⚠️ **Warning:**
```
⚠️ WARNING: HTTP 429 errors detected:
  - Serper API: 3 rate limit errors
  - NCBI API: 5 rate limit errors
  ⚠️ Rate limiters may need adjustment
```

---

## Troubleshooting

### If you see HTTP 429 errors:

1. **Check API Keys**:
   ```python
   import os
   print("Serper:", os.getenv("SERPER_API_KEY")[:10] if os.getenv("SERPER_API_KEY") else "Missing")
   print("NCBI:", os.getenv("NCBI_API_KEY")[:10] if os.getenv("NCBI_API_KEY") else "Missing")
   ```

2. **Check Rate Limiter Config**:
   - Serper: `core/utils/serper_rate_limited.py` line 53 (should be 50 req/s)
   - NCBI: `core/utils/ncbi_rate_limited.py` lines 57-58 (should be 8 req/s with key)

3. **Check HF Space Secrets**:
   - Go to: Settings → Repository secrets
   - Verify `SERPER_API_KEY` and `NCBI_API_KEY` are set

4. **Restart HF Space**:
   - Environment variable changes sometimes only take effect after a restart
   - Go to: Settings → Factory reboot

---

## Performance Benchmarks

### Expected Response Times:

**Serper API:**
- First request (no cache): 500-800ms
- Cached request: <50ms
- Throttled request: 1-2s delay (when at limit)

**NCBI API:**
- First request (no cache): 200-400ms
- Cached request: <50ms
- Throttled request: 1-2s delay (when at limit)

### Expected Throughput:

**15 Concurrent Requests:**
- Serper: ~3-5 seconds total (throttled to 50 req/s)
- NCBI: ~2-3 seconds total (throttled to 8 req/s)

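These figures follow from the configured limits. A rough sanity check of the floor imposed by the throttle alone (illustrative only; it ignores caching and assumes every request hits the API):

```python
def min_total_time(n_requests: int, rate_per_s: float, latency_s: float) -> float:
    """Rough lower bound: spacing enforced by the throttle plus one request's latency."""
    return (n_requests - 1) / rate_per_s + latency_s

print(f"Serper: {min_total_time(15, 50, 0.8):.1f}s floor")  # ~1.1s; the observed 3-5s is dominated by API latency
print(f"NCBI:   {min_total_time(15, 8, 0.4):.1f}s floor")   # ~2.2s, in line with the 2-3s estimate
```
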
---

## Next Steps After Testing

### ✅ If All Tests Pass:
1. You're ready for the workshop! 🎉
2. Optional: Run a larger test (50-100 concurrent requests; see the sketch below)
3. Optional: Set HF Space sleep timer for cost savings

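For the larger test, the tester class in `scripts/test_rate_limiters.py` can be driven directly with a higher request count. A sketch (run from the repository root so the `scripts` import resolves):

```python
import asyncio
from scripts.test_rate_limiters import RateLimiterTester

async def stress_test():
    tester = RateLimiterTester()
    await tester.test_serper_concurrent(num_requests=50)  # same per-request analysis as the standard run
    await asyncio.sleep(2)
    await tester.test_ncbi_concurrent(num_requests=50)

asyncio.run(stress_test())
```
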
### ⚠️ If Tests Fail:
1. Review error messages carefully
2. Check API key configuration
3. Verify rate limiter settings
4. Contact support if issues persist

---

## Additional Test: Manual Concurrent Testing

You can also test manually by having multiple people use the app:

1. Share your HF Space link with 5-10 colleagues
2. Have everyone search at the same time
3. Monitor for errors or slowdowns
4. Check HF Space logs for HTTP 429 errors

---

## Contact & Support

- **HF Space**: https://huggingface.co/spaces/John-jero/IDWeekAgents
- **GitHub**: Your repository
- **Test Script**: `scripts/test_rate_limiters.py`
- **Documentation**: `docs/RATE_LIMITER_INTEGRATION.md`

---

**Last Updated**: October 12, 2025
**Status**: Ready for testing on HF Space
scripts/test_rate_limiters.py ADDED
@@ -0,0 +1,379 @@
"""
Test Rate Limiters on HF Space
================================
Tests both Serper and NCBI rate limiters with concurrent requests
to verify they work correctly and prevent HTTP 429 errors.

Usage:
    python scripts/test_rate_limiters.py
"""

import asyncio
import time
import os
from datetime import datetime

# Import the rate-limited wrappers
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(__file__)))

from core.utils.serper_rate_limited import rate_limited_serper_search
from core.utils.ncbi_rate_limited import rate_limited_pubmed_search


# Test queries
SERPER_TEST_QUERIES = [
    "antibiotic resistance mechanisms",
    "COVID-19 treatment guidelines",
    "hospital acquired infections prevention",
    "sepsis diagnosis criteria",
    "antimicrobial stewardship programs"
]

NCBI_TEST_QUERIES = [
    "antibiotic resistance",
    "hospital infection control",
    "sepsis management",
    "antimicrobial stewardship",
    "infectious disease epidemiology"
]


class RateLimiterTester:
    """Test rate limiters with concurrent requests"""

    def __init__(self):
        self.serper_results = []
        self.ncbi_results = []
        self.serper_api_key = os.getenv("SERPER_API_KEY")
        self.ncbi_api_key = os.getenv("NCBI_API_KEY")

    async def test_serper_single(self, query: str, request_id: int):
        """Test a single Serper API request"""
        start_time = time.time()
        try:
            result = await rate_limited_serper_search(query, self.serper_api_key, num_results=3)
            elapsed = time.time() - start_time

            if result and "organic" in result:
                num_results = len(result.get("organic", []))
                self.serper_results.append({
                    "request_id": request_id,
                    "query": query,
                    "status": "success",
                    "elapsed": elapsed,
                    "num_results": num_results
                })
                print(f" ✅ Serper #{request_id}: {query[:40]}... ({elapsed:.2f}s, {num_results} results)")
            else:
                self.serper_results.append({
                    "request_id": request_id,
                    "query": query,
                    "status": "no_results",
                    "elapsed": elapsed
                })
                print(f" ⚠️ Serper #{request_id}: No results ({elapsed:.2f}s)")
        except Exception as e:
            elapsed = time.time() - start_time
            self.serper_results.append({
                "request_id": request_id,
                "query": query,
                "status": "error",
                "elapsed": elapsed,
                "error": str(e)
            })
            print(f" ❌ Serper #{request_id}: Error - {e} ({elapsed:.2f}s)")

    async def test_ncbi_single(self, query: str, request_id: int):
        """Test a single NCBI API request"""
        start_time = time.time()
        try:
            result = await rate_limited_pubmed_search(query, self.ncbi_api_key, max_results=5)
            elapsed = time.time() - start_time

            if result and "esearchresult" in result:
                idlist = result["esearchresult"].get("idlist", [])
                num_results = len(idlist)
                self.ncbi_results.append({
                    "request_id": request_id,
                    "query": query,
                    "status": "success",
                    "elapsed": elapsed,
                    "num_results": num_results
                })
                print(f" ✅ NCBI #{request_id}: {query[:40]}... ({elapsed:.2f}s, {num_results} articles)")
            else:
                self.ncbi_results.append({
                    "request_id": request_id,
                    "query": query,
                    "status": "no_results",
                    "elapsed": elapsed
                })
                print(f" ⚠️ NCBI #{request_id}: No results ({elapsed:.2f}s)")
        except Exception as e:
            elapsed = time.time() - start_time
            self.ncbi_results.append({
                "request_id": request_id,
                "query": query,
                "status": "error",
                "elapsed": elapsed,
                "error": str(e)
            })
            print(f" ❌ NCBI #{request_id}: Error - {e} ({elapsed:.2f}s)")

    async def test_serper_concurrent(self, num_requests: int = 10):
        """Test Serper API with concurrent requests"""
        print(f"\n{'='*70}")
        print(f"🔍 Testing Serper API Rate Limiter ({num_requests} concurrent requests)")
        print(f"{'='*70}")

        if not self.serper_api_key:
            print("❌ ERROR: SERPER_API_KEY not found in environment")
            return

        print(f"✅ Serper API Key found: {self.serper_api_key[:10]}...")
        print(f"⏳ Starting {num_requests} concurrent requests...\n")

        start_time = time.time()

        # Create tasks - use queries cyclically
        tasks = []
        for i in range(num_requests):
            query = SERPER_TEST_QUERIES[i % len(SERPER_TEST_QUERIES)]
            tasks.append(self.test_serper_single(query, i + 1))

        # Execute all tasks concurrently
        await asyncio.gather(*tasks)

        total_time = time.time() - start_time

        # Analyze results
        print(f"\n{'='*70}")
        print(f"📊 Serper API Test Results")
        print(f"{'='*70}")

        success = [r for r in self.serper_results if r["status"] == "success"]
        no_results = [r for r in self.serper_results if r["status"] == "no_results"]
        errors = [r for r in self.serper_results if r["status"] == "error"]

        print(f"Total Requests: {num_requests}")
        print(f"Successful: {len(success)} ({len(success)/num_requests*100:.1f}%)")
        print(f"No Results: {len(no_results)} ({len(no_results)/num_requests*100:.1f}%)")
        print(f"Errors: {len(errors)} ({len(errors)/num_requests*100:.1f}%)")
        print(f"Total Time: {total_time:.2f}s")
        print(f"Avg Throughput: {num_requests/total_time:.2f} req/s")

        if success:
            avg_time = sum(r["elapsed"] for r in success) / len(success)
            min_time = min(r["elapsed"] for r in success)
            max_time = max(r["elapsed"] for r in success)
            print(f"\nResponse Times (successful):")
            print(f" Average: {avg_time:.2f}s")
            print(f" Min: {min_time:.2f}s")
            print(f" Max: {max_time:.2f}s")

        if errors:
            print(f"\n⚠️ Errors found:")
            for err in errors[:5]:  # Show first 5 errors
                print(f" - Request #{err['request_id']}: {err.get('error', 'Unknown')}")

        # Check for HTTP 429 errors
        http_429_errors = [e for e in errors if "429" in str(e.get("error", ""))]
        if http_429_errors:
            print(f"\n❌ CRITICAL: {len(http_429_errors)} HTTP 429 (Rate Limit) errors detected!")
            print(f" Rate limiter may not be working correctly.")
        else:
            print(f"\n✅ SUCCESS: No HTTP 429 errors - Rate limiter working!")

    async def test_ncbi_concurrent(self, num_requests: int = 10):
        """Test NCBI API with concurrent requests"""
        print(f"\n{'='*70}")
        print(f"🔬 Testing NCBI API Rate Limiter ({num_requests} concurrent requests)")
        print(f"{'='*70}")

        if self.ncbi_api_key:
            print(f"✅ NCBI API Key found: {self.ncbi_api_key[:10]}...")
            print(f" Rate limit: 10 req/s (using 8 req/s throttle)")
        else:
            print(f"⚠️ No NCBI API Key found - SKIPPING NCBI TESTS")
            print(f" This is expected if running locally without API key")
            print(f" NCBI tests will run on HF Space where key is configured")
            return

        print(f"⏳ Starting {num_requests} concurrent requests...\n")

        start_time = time.time()

        # Create tasks - use queries cyclically
        tasks = []
        for i in range(num_requests):
            query = NCBI_TEST_QUERIES[i % len(NCBI_TEST_QUERIES)]
            tasks.append(self.test_ncbi_single(query, i + 1))

        # Execute all tasks concurrently
        await asyncio.gather(*tasks)

        total_time = time.time() - start_time

        # Analyze results
        print(f"\n{'='*70}")
        print(f"📊 NCBI API Test Results")
        print(f"{'='*70}")

        success = [r for r in self.ncbi_results if r["status"] == "success"]
        no_results = [r for r in self.ncbi_results if r["status"] == "no_results"]
        errors = [r for r in self.ncbi_results if r["status"] == "error"]

        print(f"Total Requests: {num_requests}")
        print(f"Successful: {len(success)} ({len(success)/num_requests*100:.1f}%)")
        print(f"No Results: {len(no_results)} ({len(no_results)/num_requests*100:.1f}%)")
        print(f"Errors: {len(errors)} ({len(errors)/num_requests*100:.1f}%)")
        print(f"Total Time: {total_time:.2f}s")
        print(f"Avg Throughput: {num_requests/total_time:.2f} req/s")

        if success:
            avg_time = sum(r["elapsed"] for r in success) / len(success)
            min_time = min(r["elapsed"] for r in success)
            max_time = max(r["elapsed"] for r in success)
            print(f"\nResponse Times (successful):")
            print(f" Average: {avg_time:.2f}s")
            print(f" Min: {min_time:.2f}s")
            print(f" Max: {max_time:.2f}s")

        if errors:
            print(f"\n⚠️ Errors found:")
            for err in errors[:5]:  # Show first 5 errors
                print(f" - Request #{err['request_id']}: {err.get('error', 'Unknown')}")

        # Check for HTTP 429 errors
        http_429_errors = [e for e in errors if "429" in str(e.get("error", ""))]
        if http_429_errors:
            print(f"\n❌ CRITICAL: {len(http_429_errors)} HTTP 429 (Rate Limit) errors detected!")
            print(f" Rate limiter may not be working correctly.")
        else:
            print(f"\n✅ SUCCESS: No HTTP 429 errors - Rate limiter working!")

    async def test_cache_effectiveness(self):
        """Test cache by running same queries twice"""
        print(f"\n{'='*70}")
        print(f"💾 Testing Cache Effectiveness")
        print(f"{'='*70}")

        if not self.serper_api_key:
            print("⚠️ Serper API key not found - skipping cache test")
            return

        test_query = "antibiotic resistance mechanisms"

        # First request (should hit API)
        print(f"\n1️⃣ First request (should hit API):")
        start1 = time.time()
        result1 = await rate_limited_serper_search(test_query, self.serper_api_key, num_results=3)
        time1 = time.time() - start1
        print(f" Time: {time1:.3f}s")

        # Wait a moment
        await asyncio.sleep(0.5)

        # Second request (should hit cache)
        print(f"\n2️⃣ Second request (should hit cache):")
        start2 = time.time()
        result2 = await rate_limited_serper_search(test_query, self.serper_api_key, num_results=3)
        time2 = time.time() - start2
        print(f" Time: {time2:.3f}s")

        # Analysis
        print(f"\n📊 Cache Analysis:")
        if time2 < time1 * 0.3:  # Second request should be <30% of first
            print(f" ✅ Cache HIT detected! (2nd request {time2/time1*100:.1f}% of 1st)")
            print(f" Speedup: {time1/time2:.1f}x faster")
        else:
            print(f" ⚠️ Cache may not be working (2nd: {time2:.3f}s vs 1st: {time1:.3f}s)")

        # NCBI cache test (only if API key available)
        if self.ncbi_api_key:
            print(f"\n3️⃣ Testing NCBI cache:")
            ncbi_query = "sepsis diagnosis"

            start3 = time.time()
            result3 = await rate_limited_pubmed_search(ncbi_query, self.ncbi_api_key, max_results=5)
            time3 = time.time() - start3
            print(f" First request: {time3:.3f}s")

            await asyncio.sleep(0.5)

            start4 = time.time()
            result4 = await rate_limited_pubmed_search(ncbi_query, self.ncbi_api_key, max_results=5)
            time4 = time.time() - start4
            print(f" Second request: {time4:.3f}s")

            if time4 < time3 * 0.3:
                print(f" ✅ Cache HIT detected! (2nd request {time4/time3*100:.1f}% of 1st)")
                print(f" Speedup: {time3/time4:.1f}x faster")
            else:
                print(f" ⚠️ Cache may not be working (2nd: {time4:.3f}s vs 1st: {time3:.3f}s)")
        else:
            print(f"\n3️⃣ NCBI cache test skipped (no API key)")

    async def run_all_tests(self):
        """Run all rate limiter tests"""
        print(f"\n{'='*70}")
        print(f"🚀 IDWeek Agents - Rate Limiter Test Suite")
        print(f"{'='*70}")
        print(f"Start Time: {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}")
        print(f"Environment: {'Production' if 'SPACE_ID' in os.environ else 'Local'}")

        # Test 1: Serper concurrent requests
        await self.test_serper_concurrent(num_requests=15)

        # Wait between tests
        await asyncio.sleep(2)

        # Test 2: NCBI concurrent requests
        await self.test_ncbi_concurrent(num_requests=15)

        # Wait between tests
        await asyncio.sleep(2)

        # Test 3: Cache effectiveness
        await self.test_cache_effectiveness()

        # Final summary
        print(f"\n{'='*70}")
        print(f"✅ All Tests Complete!")
        print(f"{'='*70}")

        # Overall analysis
        serper_success_rate = len([r for r in self.serper_results if r["status"] == "success"]) / len(self.serper_results) * 100 if self.serper_results else 0
        ncbi_success_rate = len([r for r in self.ncbi_results if r["status"] == "success"]) / len(self.ncbi_results) * 100 if self.ncbi_results else 0

        print(f"\n📊 Overall Success Rates:")
        print(f" Serper API: {serper_success_rate:.1f}%")
        print(f" NCBI API: {ncbi_success_rate:.1f}%")

        # Check for HTTP 429 errors
        serper_429 = len([r for r in self.serper_results if r["status"] == "error" and "429" in str(r.get("error", ""))])
        ncbi_429 = len([r for r in self.ncbi_results if r["status"] == "error" and "429" in str(r.get("error", ""))])

        if serper_429 == 0 and ncbi_429 == 0:
            print(f"\n🎉 SUCCESS: No HTTP 429 errors detected!")
            print(f" ✅ Rate limiters are working correctly")
            print(f" ✅ Ready for 150-user workshop")
        else:
            print(f"\n⚠️ WARNING: HTTP 429 errors detected:")
            if serper_429 > 0:
                print(f" - Serper API: {serper_429} rate limit errors")
            if ncbi_429 > 0:
                print(f" - NCBI API: {ncbi_429} rate limit errors")
            print(f" ⚠️ Rate limiters may need adjustment")


async def main():
    """Main test execution"""
    tester = RateLimiterTester()
    await tester.run_all_tests()


if __name__ == "__main__":
    print("Starting rate limiter tests...")
    asyncio.run(main())