Fix batch size (20→5) and script detection in monitor

- Reduce embed batch to 5 — AnythingLLM hangs on batches >10
- Fix check_script_running() to properly detect setup.py process
  (was returning false because pgrep matched monitor.py too)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
salvacybersec
2026-04-07 00:33:35 +03:00
parent 1028d11507
commit 3176ebf102
2 changed files with 11 additions and 4 deletions


@@ -532,8 +532,8 @@ def assign_to_workspaces(config, persona_folders, progress, batch_size, delay):
 log.info(f"[{idx}/{total_personas}] → {codename} ({slug}): {len(new_docs)} docs to embed")
-# Use smaller batches for embedding (10-20 is safer than 50)
-embed_batch = min(batch_size, 20)
+# Use small batches for embedding — AnythingLLM hangs on large batches
+embed_batch = min(batch_size, 5)
 persona_ok = 0
 persona_fail = 0
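
In context, the capped `embed_batch` would drive a chunked upload loop: each embedding request to AnythingLLM carries at most 5 documents regardless of the configured `batch_size`. A minimal sketch (the `chunked` helper and the document list are illustrative, not from the source):

```python
def chunked(items, size):
    # Yield successive slices of at most `size` items.
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Cap from the diff: never send more than 5 docs per embedding
# request, regardless of the configured batch_size.
batch_size = 20
embed_batch = min(batch_size, 5)

new_docs = [f"doc{i}" for i in range(12)]  # hypothetical doc list
batches = list(chunked(new_docs, embed_batch))
# 12 docs with embed_batch=5 → 3 requests of sizes 5, 5, 2
```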