chore(prompt): enforce single tool call per message and remove stop word usage

0xallam
2026-01-14 19:21:10 -08:00
committed by Ahmed Allam
parent 31baa0dfc0
commit f6475cec07
2 changed files with 4 additions and 3 deletions

@@ -308,9 +308,9 @@ Tool calls use XML format:
 CRITICAL RULES:
 0. While active in the agent loop, EVERY message you output MUST be a single tool call. Do not send plain text-only responses.
-1. One tool call per message
+1. Exactly one tool call per message — never include more than one <function>...</function> block in a single LLM message.
 2. Tool call must be last in message
-3. EVERY tool call MUST end with </function>. This is MANDATORY. Never omit the closing tag. The </function> tag is your stop word - end your response immediately after it.
+3. EVERY tool call MUST end with </function>. This is MANDATORY. Never omit the closing tag. End your response immediately after </function>.
 4. Use ONLY the exact XML format shown above. NEVER use JSON/YAML/INI or any other syntax for tools or parameters.
 5. Tool names must match exactly the tool "name" defined (no module prefixes, dots, or variants).
 - Correct: <function=think> ... </function>

@@ -142,6 +142,8 @@ class LLM:
                     : accumulated.find("</function>") + len("</function>")
                 ]
                 yield LLMResponse(content=accumulated)
+                break
+            yield LLMResponse(content=accumulated)
         if chunks:
             self._update_usage_stats(stream_chunk_builder(chunks))
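The added break and trailing yield implement client-side truncation at the closing tag, which is what allows the provider-side stop word to be dropped in the hunk below: the tag stays in the output instead of being stripped by the API. A standalone sketch of the idea (function name and plain-string chunks are hypothetical; the real code yields LLMResponse objects):

```python
from typing import Iterable, Iterator

CLOSE_TAG = "</function>"

def stream_until_close(chunks: Iterable[str]) -> Iterator[str]:
    """Accumulate streamed text and stop once </function> appears.

    The accumulated text is cut just past the closing tag and
    yielded, then the loop breaks. Unlike a stop word, the tag
    itself is preserved in the yielded content.
    """
    accumulated = ""
    for chunk in chunks:
        accumulated += chunk
        if CLOSE_TAG in accumulated:
            # Keep everything up to and including the closing tag.
            accumulated = accumulated[
                : accumulated.find(CLOSE_TAG) + len(CLOSE_TAG)
            ]
            yield accumulated
            break
    else:
        # Stream ended without a closing tag; yield what we have.
        yield accumulated

list(stream_until_close(["<function=think>", "hi</funct", "ion> junk"]))
# -> ["<function=think>hi</function>"]
```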
@@ -189,7 +191,6 @@ class LLM:
"messages": messages, "messages": messages,
"timeout": self.config.timeout, "timeout": self.config.timeout,
"stream_options": {"include_usage": True}, "stream_options": {"include_usage": True},
"stop": ["</function>"],
} }
if api_key := Config.get("llm_api_key"): if api_key := Config.get("llm_api_key"):