Full architecture refactor: layered architecture and a highly extensible plugin system

Backend refactor:
- Added a layered architecture: API Routes -> Services -> Repositories -> Infrastructure
- Removed all global singletons in favor of FastAPI dependency injection (see the sketch after this list)
- Added an api/ directory with routes split by domain (proxies, plugins, scheduler, settings, stats)
- Added a services/ business-logic layer: ProxyService, PluginService, SchedulerService, ValidatorService, SettingsService
- Added a repositories/ data-access layer: ProxyRepository, SettingsRepository, PluginSettingsRepository
- Added a models/ layer: Pydantic schemas + domain models
- Rewrote core/config.py to manage configuration with Pydantic Settings
- Added core/db.py: asynccontextmanager-based connection management, with database migration support
- Added core/exceptions.py: a unified business exception hierarchy
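
To make the dependency-injection layering concrete, here is a minimal sketch of a route wired through the service and repository layers with FastAPI's Depends. Only the layer and class names come from this commit; the bodies and function names below are illustrative assumptions, not the repo's actual code.

    from fastapi import APIRouter, Depends, FastAPI

    class ProxyRepository:
        """Data-access layer: the only layer that touches the database."""
        async def list_all(self) -> list[dict]:
            # Stand-in for a real query against the proxies table.
            return [{"ip": "1.2.3.4", "port": 8080, "protocol": "http"}]

    class ProxyService:
        """Business-logic layer: composes repositories, knows nothing about HTTP."""
        def __init__(self, repo: ProxyRepository) -> None:
            self.repo = repo

        async def list_proxies(self) -> list[dict]:
            return await self.repo.list_all()

    def get_proxy_repository() -> ProxyRepository:
        return ProxyRepository()

    def get_proxy_service(
        repo: ProxyRepository = Depends(get_proxy_repository),
    ) -> ProxyService:
        # Built per request by the DI container; no module-level singletons.
        return ProxyService(repo)

    router = APIRouter(prefix="/api/proxies")

    @router.get("")
    async def list_proxies(service: ProxyService = Depends(get_proxy_service)):
        return await service.list_proxies()

    app = FastAPI()
    app.include_router(router)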

Plugin system refactor (the core change):
- Added core/plugin_system/: BaseCrawlerPlugin + PluginRegistry (sketched after this list)
- Explicit registration (decorator + plugins/__init__.py): type-safe and test-friendly
- Added plugins/base.py: BaseHTTPPlugin, a generic base class for HTTP crawlers
- Migrated all 7 plugins to the new architecture (fate0, proxylist_download, ip3366, ip89, kuaidaili, speedx, yundaili)
- Plugin state is now persisted to the plugin_settings table
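
A minimal sketch of what an explicit decorator-based registry can look like. The commit names BaseCrawlerPlugin and PluginRegistry; the method names and the ProxyRaw shape below are assumptions (ProxyRaw is modeled on how the fate0 diff constructs it positionally).

    from typing import Dict, List, NamedTuple, Type

    class ProxyRaw(NamedTuple):
        # Assumed shape, matching ProxyRaw(ip, int(port), protocol) in the diff.
        ip: str
        port: int
        protocol: str = "http"

    class BaseCrawlerPlugin:
        name: str = ""
        display_name: str = ""

        async def crawl(self) -> List[ProxyRaw]:
            raise NotImplementedError

    class PluginRegistry:
        def __init__(self) -> None:
            self._plugins: Dict[str, Type[BaseCrawlerPlugin]] = {}

        def register(self, cls: Type[BaseCrawlerPlugin]) -> Type[BaseCrawlerPlugin]:
            # Used as a decorator; returns the class unchanged so type
            # checkers and tests can still see the plugin class directly.
            self._plugins[cls.name] = cls
            return cls

        def all(self) -> List[Type[BaseCrawlerPlugin]]:
            return list(self._plugins.values())

    registry = PluginRegistry()

    @registry.register
    class ExamplePlugin(BaseCrawlerPlugin):
        name = "example"
        display_name = "Example Source"

        async def crawl(self) -> List[ProxyRaw]:
            return [ProxyRaw("1.2.3.4", 8080)]

Because registration happens at import time, plugins/__init__.py only needs to import each plugin module once; nothing is discovered by filesystem scanning.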

Task scheduling refactor:
- Added core/tasks/queue.py: ValidationQueue + WorkerPool (see the sketch after this list)
- Decoupled crawling from validation: crawlers only crawl; proxies are submitted to the queue and validated asynchronously by workers
- The scheduler periodically pulls existing proxies from the database and feeds them into the validation queue in batches
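
A sketch of the queue/worker split on top of asyncio. The class names come from the commit; the method names and the validate() stub are invented here for illustration.

    import asyncio
    from typing import List

    async def validate(proxy) -> None:
        # Hypothetical stand-in for the real ValidatorService check.
        await asyncio.sleep(0)

    class ValidationQueue:
        """Crawlers submit raw proxies here instead of validating inline."""
        def __init__(self, maxsize: int = 10_000) -> None:
            self._queue: asyncio.Queue = asyncio.Queue(maxsize=maxsize)

        async def submit(self, proxy) -> None:
            await self._queue.put(proxy)

        async def get(self):
            return await self._queue.get()

        def task_done(self) -> None:
            self._queue.task_done()

    class WorkerPool:
        def __init__(self, queue: ValidationQueue, size: int = 50) -> None:
            self.queue = queue
            self.size = size
            self._tasks: List[asyncio.Task] = []

        async def _worker(self) -> None:
            while True:
                proxy = await self.queue.get()
                try:
                    await validate(proxy)
                finally:
                    self.queue.task_done()

        def start(self) -> None:
            # Must be called from within a running event loop.
            self._tasks = [
                asyncio.create_task(self._worker()) for _ in range(self.size)
            ]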

Frontend changes:
- Added a frontend/src/services/ layer to split out API-call logic
- Updated stores/ and views/ to go through the service layer
- API compatibility is preserved, so pages needed no major changes

Other:
- Added main.py as the new entry point
- Added DESIGN.md, an architecture design document
- Updated requirements.txt to add pydantic-settings (a config sketch follows this list)
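
For reference, a minimal pydantic-settings config module in the spirit of the rewritten core/config.py. The field names and env prefix here are illustrative assumptions, not taken from the repo.

    from pydantic_settings import BaseSettings, SettingsConfigDict

    class Settings(BaseSettings):
        # Every field can be overridden via environment variables,
        # e.g. PROXYPOOL_VALIDATOR_WORKERS=100 (the prefix is an assumption).
        model_config = SettingsConfigDict(env_prefix="PROXYPOOL_", env_file=".env")

        db_path: str = "data/proxypool.db"
        validator_workers: int = 50
        crawl_interval_minutes: int = 30

    settings = Settings()
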
Author: 祀梦
Date: 2026-04-02 11:55:05 +08:00
Commit: 209a744d94 (parent: a79f78b338)
56 changed files with 2891 additions and 2095 deletions


@@ -1,66 +1,38 @@
-import sys
-import os
-sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
-from core.crawler import BasePlugin
-from core.log import logger
-import json
-import asyncio
-
-
-class Fate0Plugin(BasePlugin):
-    def __init__(self):
-        super().__init__()
-        self.name = "Fate0 Aggregated Source"
-        # A continuously updated, high-quality aggregated proxy list
-        self.urls = ["https://raw.githubusercontent.com/fate0/proxylist/master/proxy.list"]
-
-    async def parse(self, html):
-        if not html:
-            return
-        count = 0
-        # fate0's format is one JSON object per line
-        for line in html.split('\n'):
-            if not line.strip():
-                continue
-            try:
-                data = json.loads(line)
-                ip = data.get('host')
-                port = data.get('port')
-                protocol = data.get('type', 'http')
-                # Normalize the protocol
-                protocol = protocol.lower().strip()
-                if protocol not in ('http', 'https', 'socks4', 'socks5'):
-                    protocol = 'http'
-                if ip and port:
-                    yield ip, int(port), protocol
-                    count += 1
-            except Exception:
-                continue
-        if count > 0:
-            logger.info(f"{self.name} parsing finished, got {count} candidate proxies")
-
-
-if __name__ == "__main__":
-    async def test_plugin():
-        plugin = Fate0Plugin()
-        print(f"========== Testing {plugin.name} ==========")
-        print(f"Target URL count: {len(plugin.urls)}")
-        print(f"Starting crawl...\n")
-        proxies = await plugin.run()
-        print(f"\n========== Crawl results ==========")
-        print(f"Got {len(proxies)} proxies in total:")
-        print("-" * 60)
-        for idx, (ip, port, protocol) in enumerate(proxies, 1):
-            print(f"{idx:3d}. {ip:15s} : {str(port):5s} | {protocol}")
-        print("-" * 60)
-        print(f"Done! {len(proxies)} proxies~")
-    asyncio.run(test_plugin())
+import json
+from typing import List
+
+from core.plugin_system import ProxyRaw
+from plugins.base import BaseHTTPPlugin
+from core.log import logger
+
+
+class Fate0Plugin(BaseHTTPPlugin):
+    name = "fate0"
+    display_name = "Fate0 Aggregated Source"
+    description = "A high-quality proxy aggregation list, continuously updated on GitHub"
+    urls = ["https://raw.githubusercontent.com/fate0/proxylist/master/proxy.list"]
+
+    async def crawl(self) -> List[ProxyRaw]:
+        results = []
+        for url in self.urls:
+            html = await self.fetch(url, timeout=30)
+            if not html:
+                continue
+            for line in html.split("\n"):
+                line = line.strip()
+                if not line:
+                    continue
+                try:
+                    data = json.loads(line)
+                    ip = data.get("host")
+                    port = data.get("port")
+                    protocol = data.get("type", "http")
+                    if ip and port:
+                        results.append(ProxyRaw(ip, int(port), protocol))
+                except Exception:
+                    continue
+        if results:
+            logger.info(f"{self.display_name} parsing finished, got {len(results)} candidate proxies")
+        return results
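
The old __main__ test harness was dropped in this rewrite. To exercise the migrated plugin by hand, something like the following should work, assuming the file lives at plugins/fate0.py, ProxyRaw exposes ip/port/protocol attributes, and BaseHTTPPlugin.fetch manages its own HTTP session.

    import asyncio

    from plugins.fate0 import Fate0Plugin

    async def main() -> None:
        plugin = Fate0Plugin()
        proxies = await plugin.crawl()
        for idx, proxy in enumerate(proxies, 1):
            print(f"{idx:3d}. {proxy.ip}:{proxy.port} | {proxy.protocol}")
        print(f"Total: {len(proxies)} proxies")

    if __name__ == "__main__":
        asyncio.run(main())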