ConversationChain
⚠️ DEPRECATION WARNING
This component is deprecated and will be removed in a future version of Nappai. Please migrate to the recommended alternative components.
The ConversationChain component lets you chat with an AI model while keeping track of what was said before. It remembers past messages so the AI can give more relevant answers.
How it Works
When you use this component, it connects to a language model (LLM) that generates responses. It also uses a memory store to keep a record of the conversation history. Each time you send a new message, the component adds it to the memory, then asks the LLM to produce a reply that takes the whole history into account. The component also emits callback events so the dashboard can show progress and log activity in real time.
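The flow above can be sketched in plain Python. This is an illustrative stand-in, not Nappai's actual implementation: `fake_llm` replaces the real language model, and the `history` list plays the role of the memory store.

```python
def fake_llm(prompt: str) -> str:
    """Stand-in for a real language model: echoes the last user line."""
    last = prompt.splitlines()[-1]
    return f"You said: {last.removeprefix('Human: ')}"

class ConversationChain:
    def __init__(self, llm):
        self.llm = llm
        self.history = []  # the "memory": alternating human/AI turns

    def run(self, user_input: str) -> str:
        self.history.append(f"Human: {user_input}")
        prompt = "\n".join(self.history)  # the whole history goes to the LLM
        reply = self.llm(prompt)
        self.history.append(f"AI: {reply}")
        return reply

chain = ConversationChain(fake_llm)
chain.run("Hello")
print(chain.run("How are you?"))  # the second turn sees both prior messages
print(len(chain.history))        # → 4
```

Note that the prompt grows with every turn, which is why long conversations consume more tokens over time.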
Inputs
Input Fields
- Model: The AI model that will generate responses. You must connect a language model component (e.g., OpenAI, Anthropic) to this input.
- Memory: A memory component that stores past messages (e.g., ConversationBufferMemory). This is optional; if omitted, the conversation will not remember previous turns.
- Input: The text you want to send to the AI. This is the user’s current question or statement.
Outputs
- Text: The AI’s reply as a Message object. This is the main output you’ll use in your workflow.
- Runnable: A Runnable chain that can be executed elsewhere in the workflow. This is useful if you want to reuse the same conversation logic in multiple places.
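A short sketch of the distinction between the two outputs, using an illustrative toy class rather than Nappai's real types: the Text output corresponds to one generated reply, while the Runnable output is the chain object itself, which can be invoked again from other parts of a workflow.

```python
class ToyChain:
    """Illustrative stand-in for a Runnable conversation chain."""
    def __init__(self):
        self.turns = []

    def invoke(self, text: str) -> str:
        self.turns.append(text)
        return f"reply #{len(self.turns)} to: {text}"

runnable = ToyChain()          # the "Runnable" output: the chain object
text = runnable.invoke("Hi")   # the "Text" output: one generated reply

# Because the runnable is an object, two workflow branches can share it,
# and the conversation state (self.turns) carries over between calls.
print(runnable.invoke("Second question"))  # → "reply #2 to: Second question"
```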
Usage Example
- Add a Language Model – Drag a language model component (e.g., OpenAI) onto the canvas and configure your API key.
- Add a Memory Store – Drag a memory component (e.g., ConversationBufferMemory) onto the canvas.
- Add ConversationChain – Connect the model to the Model input, the memory to the Memory input, and type a question into the Input field.
- Run – Click “Run” and watch the AI reply appear in the Text output. The memory stores each exchange automatically, so the AI will remember the context when you ask follow-up questions.
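The four steps above can be mirrored in code with stand-in objects for the canvas components. The class names here (StubModel, BufferMemory) are illustrative, not Nappai's real classes:

```python
class StubModel:                       # step 1: the language model
    def generate(self, prompt: str) -> str:
        return f"({len(prompt.splitlines())} lines of context seen)"

class BufferMemory:                    # step 2: the memory store
    def __init__(self):
        self.messages = []

def run_chain(model, memory, user_input):   # step 3: ConversationChain wiring
    memory.messages.append(f"Human: {user_input}")
    reply = model.generate("\n".join(memory.messages))
    memory.messages.append(f"AI: {reply}")
    return reply                             # step 4: the Text output

model, memory = StubModel(), BufferMemory()
run_chain(model, memory, "What is Nappai?")
print(run_chain(model, memory, "Tell me more"))  # context from turn 1 retained
```

The second call sees three lines of context (the first question, the first reply, and the new question), which is exactly the behavior the memory component provides in the dashboard.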
Related Components
- ChatPromptTemplate – Build custom prompts for chat models.
- LLMChain – Run a single prompt through a language model.
- ConversationBufferMemory – Store conversation history in memory.
- ConversationSummaryMemory – Keep a summarized version of the conversation.
Tips and Best Practices
- Use a dedicated memory component to keep the conversation context; otherwise, the AI will treat each input as a new conversation.
- Keep the input concise; long messages can increase token usage and cost.
- Monitor the callback logs to see how the component is interacting with the model in real time.
- Plan for deprecation: Since this component is legacy, consider switching to newer conversation components in future projects.
Security Considerations
- API Keys: Store your language model API keys securely in Nappai’s secret manager.
- Data Privacy: Be aware that all conversation text is sent to the external LLM provider. If you handle sensitive data, ensure compliance with your organization’s data‑handling policies.
- Rate Limits: Monitor usage to avoid exceeding provider limits, which could interrupt your workflow.