feat: fpw plugins, validation/crawl perf, WS stats, test DB isolation
- Add Free_Proxy_Website-style fpw_* plugins and register them
- Per-plugin crawl timeout (crawl_timeout_seconds=120); remove global crawl_timeout setting
- Validator: fix connect vs total timeout on save; SOCKS session LRU cache; drop redundant semaphore
- Validation handler uses single DB connection; batch upsert after crawl; WorkerPool put_nowait
- Remove unused max_retries from settings API/UI; settings maintenance SQL + init_db cleanup of deprecated keys
- WebSocket dashboard stats; ProxyList pool_filter and API alignment
- POST /api/proxies/delete-one for IPv6-safe deletes; task poll stops on 404
- pytest uses PROXYPOOL_DB_PATH=db/proxies.test.sqlite so tests do not wipe production DB
- .gitignore: explicit proxies.test.sqlite patterns; fix plugin_service ValidationException import

Made-with: Cursor
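The test-DB isolation bullet hinges on pinning `PROXYPOOL_DB_PATH` before any application module reads its settings. A minimal sketch of the idea — the `get_db_path` accessor and the production fallback path are assumptions, not code from this repo:

```python
import os

# What the pytest setup does conceptually: point the app at a throwaway
# SQLite file *before* app modules are imported, so module-level settings
# resolve to the test path instead of the production database.
os.environ["PROXYPOOL_DB_PATH"] = "db/proxies.test.sqlite"


def get_db_path() -> str:
    # Hypothetical settings accessor: the env var wins; a production
    # default (assumed name) is only used when the var is unset.
    return os.environ.get("PROXYPOOL_DB_PATH", "db/proxies.sqlite")


print(get_db_path())  # db/proxies.test.sqlite
```

Because the variable is set at interpreter startup (e.g. in a conftest imported before the app), every connection opened during the test run lands in the disposable file covered by the new `.gitignore` patterns.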
22
tests/task_utils.py
Normal file
@@ -0,0 +1,22 @@
"""Async task polling helper for tests."""
import asyncio
from typing import Any, Dict, Optional


async def poll_task_until_terminal(
    client,
    task_id: str,
    *,
    max_rounds: int,
    interval: float,
) -> Optional[Dict[str, Any]]:
    """Poll a task until it reaches a terminal state or the round limit. Returns the last task data seen."""
    task_data = None
    for _ in range(max_rounds):
        await asyncio.sleep(interval)
        res = await client.get(f"/api/tasks/{task_id}")
        assert res.status_code == 200
        task_data = res.json()["data"]
        if task_data["status"] in ("completed", "failed", "cancelled"):
            break
    return task_data
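The helper can be exercised against a stub client that flips a task from "running" to "completed"; the stub classes below are illustrative inventions (the real tests use the app's HTTP test client), and the helper body is reproduced so the snippet runs standalone:

```python
import asyncio
from typing import Any, Dict, Optional


class _FakeResponse:
    """Minimal stand-in for an HTTP response object."""

    def __init__(self, payload: Dict[str, Any]) -> None:
        self.status_code = 200
        self._payload = payload

    def json(self) -> Dict[str, Any]:
        return self._payload


class _FakeClient:
    """Hypothetical client: reports 'running' once, then 'completed'."""

    def __init__(self) -> None:
        self._calls = 0

    async def get(self, url: str) -> _FakeResponse:
        self._calls += 1
        status = "running" if self._calls < 2 else "completed"
        return _FakeResponse({"data": {"status": status}})


async def poll_task_until_terminal(
    client, task_id: str, *, max_rounds: int, interval: float
) -> Optional[Dict[str, Any]]:
    # Same logic as the helper in the diff above.
    task_data = None
    for _ in range(max_rounds):
        await asyncio.sleep(interval)
        res = await client.get(f"/api/tasks/{task_id}")
        assert res.status_code == 200
        task_data = res.json()["data"]
        if task_data["status"] in ("completed", "failed", "cancelled"):
            break
    return task_data


data = asyncio.run(
    poll_task_until_terminal(_FakeClient(), "t-1", max_rounds=5, interval=0.01)
)
print(data["status"])  # completed
```

The keyword-only `max_rounds`/`interval` split keeps call sites explicit about how long a test is willing to wait, which matters once the suite runs against the isolated test database rather than a mocked scheduler.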