Prompt examples for an AI Agent
When you need to give the AI Agent additional context about a company so it can respond to user queries, use the AI Agent node.

Prompt
Redirect the user
When you need to redirect the user to a part of the flow based on their intent. If routing is the only purpose (like a router), you can use an AI Task. If the redirect is part of an interaction, such as sending the user to a flow when they request it, you can use an AI Agent combined with a conditional node. Save the AI's response, which you can then access with {{$memory.variableName}}.
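The conditional step described above can be sketched in code. This is an illustrative Python sketch, not platform code: the flow names and the `variableName` key are hypothetical, standing in for the value the AI saved to memory.

```python
# Illustrative sketch: how a conditional node might route on the intent
# the AI saved in {{$memory.variableName}}. Flow names are hypothetical.

def route(memory: dict) -> str:
    """Pick the next flow based on the intent saved by the AI."""
    intent = memory.get("variableName", "").strip().lower()
    routes = {
        "billing": "billing_flow",
        "support": "support_flow",
    }
    # Unknown or empty intents fall back to a default flow.
    return routes.get(intent, "fallback_flow")

print(route({"variableName": "Billing"}))  # billing_flow
```

The point of the sketch is that the AI only classifies the intent; the branching itself lives in the conditional node.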
Prompt
Request data from the user
For registration processes or flows where information must be requested from the user.

Prompt
Use {{$memory.variableName}} to save the collected information in a Datum node.
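Before saving collected data, it usually pays to validate it. The following is a hedged Python sketch, assuming a hypothetical registration flow with `name` and `email` fields; the actual fields depend on your flow.

```python
import re

# Illustrative sketch: validate data collected from the user before
# saving it (e.g. into a Datum node). Field names are hypothetical.

def validate_registration(data: dict) -> list[str]:
    """Return a list of validation errors; empty means the data is OK."""
    errors = []
    if not data.get("name"):
        errors.append("name is required")
    email = data.get("email", "")
    # Minimal email shape check: something@something.something
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        errors.append("email is invalid")
    return errors

print(validate_registration({"name": "Ana", "email": "ana@example.com"}))  # []
```

Only data that passes validation needs to reach the Datum node; invalid input can loop back to the user.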
Prompt examples for an AI Task
AI Tasks are requests to OpenAI or your preferred LLM. Their prompt is easier to write and does not require termination keywords such as end_function. Their goal is always to fulfill what the prompt says and save the response in JSON format.
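Because an AI Task's result is saved as JSON, downstream logic can parse it directly. This is an illustrative Python sketch, assuming a hypothetical `intent` field in the response; models occasionally return malformed JSON, so a safe fallback is included.

```python
import json

# Illustrative sketch: parse an AI Task's JSON response defensively.
# The "intent" field name is hypothetical.

def parse_task_response(raw: str) -> dict:
    """Parse the task's JSON output, falling back to a safe default."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # Invalid JSON from the model: return a neutral default instead
        # of crashing the flow.
        return {"intent": "unknown"}

print(parse_task_response('{"intent": "redirect", "target": "billing"}'))
```

Specifying the exact JSON keys in the prompt (as recommended below) makes this parsing step predictable.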
Redirect the user
In AI Tasks, end_function is not used.
Best Practices Summary
| Aspect | Recommendation |
|---|---|
| Clarity | Use direct, structured prompts |
| Language | Be consistent with terms (e.g., “tool”) |
| JSON usage | Clearly specify the expected format |
| Termination | In AI Agents, use end_function correctly; in AI Tasks, just specify the expected JSON |
| Context | Indicate how to evaluate inputs or relevant references |
Content types and format in prompts
- Maintain consistent language throughout the prompt.
- Clearly specify the output format (JSON recommended).
- If the flow needs to terminate, explicitly define the use of end_function.
Intelligent Design of Conditional Prompts
To get better results, it is key to minimize the number of decisions the AI must make, especially when they are not directly related to its main objective. Instead of loading the prompt with multiple conditional instructions, it is preferable to pre-process the logic outside the AI Agent, keeping the prompt clean, focused, and more effective.

What to avoid
Avoid writing prompts like this:
If you have data in this variable {{$memory.variable}}, then do this, otherwise follow these other instructions.
This type of logic adds ambiguity and noise to the prompt, which can hinder the model’s comprehension and reduce the accuracy of the response.
Recommended alternative
Use prior conditional logic in a code-type node to define which instructions the AI should follow before sending it the prompt.

Benefits
- Reduces the model’s cognitive load
- Decreases the use of unnecessary tokens
- Increases prompt clarity
- Improves response consistency
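The recommended pattern can be sketched as follows. This is an illustrative Python sketch of what a code-type node might do; the memory variable and instruction texts are hypothetical.

```python
# Illustrative sketch of the recommended pattern: resolve the conditional
# logic in a code-type node, then send the AI a single, unconditional
# instruction. The "variable" key and instruction texts are hypothetical.

def build_instructions(memory: dict) -> str:
    """Decide BEFORE calling the AI which instructions apply."""
    if memory.get("variable"):
        return ("The user is already registered. "
                "Greet them by name and ask how you can help.")
    return "The user is not registered. Ask for their name and email."

# The resulting string replaces the if/else that would otherwise live
# inside the prompt itself.
prompt = build_instructions({"variable": "Ana"})
print(prompt)
```

The AI Agent then receives only the branch that applies, instead of being asked to evaluate the condition itself.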
For considerations about models (GPT-4.1, long contexts, CoT, and agent reminders), see the General Guide for AI Agents.
Prompt Validation Tool
To make it easier to create effective prompts, you can use a beta tool that acts as a testing assistant. If your prompt is not working as expected or you need a quick guide to get started, go to the following link, write your prompt or objective, and follow the assistant's suggestions: ChatGPT - AI Agents Prompt Corrector. This tool is in beta and still under development, but it is useful for detecting common errors, refining instructions, and validating whether the prompt is well-structured before implementing it in Brain Studio.

Additional resources
- OpenAI Platform - Explore developer resources, tutorials, API documentation, and dynamic examples
- AI Playground - Compare AI models side by side: OpenAI GPT, Anthropic Claude, Google Gemini, Llama, Mistral, and more