feat: JSON config, quality scoring and dashboard, plus settings and crawl flow

- Backend switched to config/app.json; pytest uses config/app.test.json via set_config_file, no longer depending on environment variables; removed pydantic-settings.
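A minimal sketch of the loader described above. The commit names `set_config_file` and the two JSON paths; everything else here (the caching, the function bodies) is an assumption, not the project's actual code:

```python
# Hypothetical sketch: the backend reads config/app.json by default, and
# pytest calls set_config_file("config/app.test.json") instead of relying
# on environment variables or pydantic-settings.
import json
from pathlib import Path

_config_file = Path("config/app.json")  # default config path
_config_cache = None


def set_config_file(path):
    """Point the loader at another config file (used by the test suite)."""
    global _config_file, _config_cache
    _config_file = Path(path)
    _config_cache = None  # force a reload on next access


def get_config():
    """Load and cache the JSON config as a plain dict."""
    global _config_cache
    if _config_cache is None:
        _config_cache = json.loads(_config_file.read_text(encoding="utf-8"))
    return _config_cache
```

Because the override simply swaps the path and drops the cache, tests stay hermetic without any environment-variable plumbing.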

- Frontend API/WebSocket endpoints are injected from config/webui.json via Vite define.

- Proxy scores are computed from latency and random-pick use count; added use_count and proxy_scoring; saving settings now starts/stops the scheduler in sync.
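The commit says scores combine latency and use count but does not show the formula, so the weights and shape below are assumptions; only the idea (faster and less-used proxies rank higher) comes from the message:

```python
# Hypothetical scoring sketch; the real proxy_scoring formula and weights
# in this commit are not shown, so these numbers are illustrative only.
def proxy_score(latency_ms: float, use_count: int,
                max_latency_ms: float = 5000.0) -> float:
    """Return a score in [0, 1]; lower latency and fewer picks score higher."""
    latency_part = max(0.0, 1.0 - latency_ms / max_latency_ms)
    usage_part = 1.0 / (1.0 + use_count)  # decays as the proxy is reused
    return 0.7 * latency_part + 0.3 * usage_part
```

A score like this can then drive weighted random selection, which is consistent with the "random pick" wording: heavily used or slow proxies are still selectable, just less often.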

- Dashboard gains dual pie charts (available / pending-validation, by protocol); settings page drops the scheduler start/stop buttons and relocates "validate now"; after all crawls finish, a full validation is submitted automatically.

- Removed script/settings_maintain.py (previously marked for deletion).

Made-with: Cursor
祀梦
2026-04-05 16:08:32 +08:00
parent 07248ff4ee
commit 7bc6d4e4de
31 changed files with 643 additions and 280 deletions


@@ -4,7 +4,8 @@ from pydantic import BaseModel
 from app.services.plugin_service import PluginService
 from app.services.plugin_runner import PluginRunner
-from app.core.execution import JobExecutor, CrawlJob
+from app.core.execution import JobExecutor, CrawlJob, ValidateAllJob
 from app.core.log import logger
 from app.core.exceptions import PluginNotFoundException
 from app.api.deps import get_plugin_service, get_plugin_runner, get_executor
 from app.api.common import success_response, format_plugin
@@ -106,7 +107,7 @@ async def crawl_all(
 def _create_crawl_all_aggregator(job_ids, executor):
-    """Create a simple aggregator Job that summarizes the status of all child Jobs."""
+    """Create a simple aggregator Job that summarizes the status of all child Jobs; on normal completion, automatically submit one full validation."""
     from app.core.execution.job import Job
     import asyncio
@@ -177,6 +178,13 @@ def _create_crawl_all_aggregator(job_ids, executor):
             }
             if self.is_cancelled:
                 result["cancelled"] = True
+            else:
+                v_job = ValidateAllJob(validator_pool=executor.worker_pool)
+                result["validate_all_task_id"] = executor.submit_job(v_job)
+                logger.info(
+                    "Crawl-all finished; submitted ValidateAllJob %s",
+                    result["validate_all_task_id"],
+                )
             return result
     return CrawlAllAggregator()