How to disable OpenAI parallel function calling a.k.a. "multi_tool_use.parallel"
When using OpenAI's function calling (tool use), the model may decide to call several tools at once. When it does, the response has the following characteristics (see the sketch after this list):

- `multi_tool_use.parallel` is present
- `tool_uses` is a list of the requested tool calls, each containing a `recipient_name` with the name of one of the tools you provided
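For illustration, here is a rough sketch of the shape such a parallel call can take. Only `multi_tool_use.parallel`, `tool_uses`, and `recipient_name` come from the behavior described above; the tool name `get_weather`, its parameters, and the surrounding keys are hypothetical placeholders.

```python
# Hypothetical sketch of a parallel tool call (placeholder tool name and arguments).
parallel_call = {
    "name": "multi_tool_use.parallel",
    "arguments": {
        "tool_uses": [
            {"recipient_name": "get_weather", "parameters": {"city": "Paris"}},
            {"recipient_name": "get_weather", "parameters": {"city": "Berlin"}},
        ]
    },
}
```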
While this feature can help reduce latency, the documentation includes an important caveat:

> When the model outputs multiple function calls via parallel function calling, model outputs may not match strict schemas supplied in tools.
This lack of schema enforcement can be problematic. To disable parallel function calling, add the `parallel_tool_calls` parameter to your API request:
```python
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model=model,
    messages=messages,
    temperature=temperature,
    tools=tools,
    tool_choice=tool_choice,
    # Add this to disable parallel function calling
    parallel_tool_calls=False,
)
```
By setting `parallel_tool_calls` to `False`, you ensure that the model will not call multiple tools in parallel, thus preserving strict schema enforcement.
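As a quick follow-up, with parallel calls disabled you can expect at most one entry in `tool_calls` per assistant turn. The sketch below assumes the `response` object from the request above and a hypothetical `dispatch_tool` helper for running the requested tool:

```python
import json

message = response.choices[0].message

if message.tool_calls:
    # With parallel_tool_calls=False there should be at most one call to handle.
    tool_call = message.tool_calls[0]
    arguments = json.loads(tool_call.function.arguments)
    result = dispatch_tool(tool_call.function.name, arguments)  # hypothetical helper
```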