AI Filter
The AI Filter component lets you sift through data by asking a question in plain English.
You give it some data, tell it what you’re looking for, and it returns only the parts that match your request.
How It Works
The component uses a large language model (LLM) to interpret your natural‑language query.
It splits the incoming data into chunks, asks the LLM to decide which chunks match the query, and then stitches the matching chunks back together.
The result is a new set of data that contains only the information you asked for.
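The chunk → classify → stitch loop described above can be sketched in a few lines of Python. This is a minimal, illustrative sketch: `llm_matches` and `ai_filter` are hypothetical names, and a plain substring check stands in for the real LLM call so the example stays runnable.

```python
from typing import Optional

def llm_matches(chunk: str, query: str) -> bool:
    """Stand-in for asking the LLM whether a chunk satisfies the query;
    a substring check keeps this sketch self-contained and runnable."""
    return query.lower() in chunk.lower()

def ai_filter(data: str, query: str, max_chunks: Optional[int] = None) -> str:
    """Split data into chunks, keep the ones the model approves, rejoin them."""
    chunks = data.split("\n")          # one chunk per line, for simplicity
    if max_chunks is not None:
        chunks = chunks[:max_chunks]   # honor the Max Chunks limit
    kept = [c for c in chunks if llm_matches(c, query)]
    return "\n".join(kept)             # stitch the matching chunks back together
```

For example, `ai_filter("budget report\nholiday plans\nbudget review", "budget")` keeps only the two lines that mention "budget".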
Inputs
- Data: The information you want to filter. It can be a structured dataset, plain text, or a message.
- Model: The language model that will interpret your query. This input is required.
- Max Chunks: The maximum number of data chunks the component will examine. If you leave it blank, all chunks are processed.
- Query: A natural‑language question or statement that tells the component what to keep. For example, “Show me all sales records from 2023” or “Find any mention of ‘budget’.”
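Taken together, the four inputs might be assembled like this. The key names here are illustrative, not the component's actual field names:

```python
# Hypothetical input configuration for the AI Filter component.
filter_config = {
    "data": "sales records loaded from an upstream component",  # Data input
    "model": "gpt-4",            # LLM that interprets the query
    "max_chunks": None,          # None = process every chunk
    "query": "Show me all sales records from 2023",
}
```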
Outputs
- Filter: The filtered data that matches your query. It can be used directly in other components or displayed in the dashboard.
- Tool: A reusable tool that can be added to the LLM’s toolbox, allowing the model to call the filter operation directly in future conversations.
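The Tool output follows the common LLM tool-calling pattern: a named, described callable that the model can invoke on demand. A minimal sketch, with illustrative names and a keyword match standing in for the LLM-backed filter:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    """A named, described callable an LLM can add to its toolbox."""
    name: str
    description: str
    run: Callable[[str, str], str]

def simple_filter(data: str, query: str) -> str:
    """Keyword stand-in for the real LLM-backed filter operation."""
    return "\n".join(line for line in data.split("\n")
                     if query.lower() in line.lower())

filter_tool = Tool(
    name="ai_filter",
    description="Keep only the parts of the data that match a plain-English query.",
    run=simple_filter,
)
```

An LLM given this tool could then call `filter_tool.run(data, query)` whenever a conversation requires filtering.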
Usage Example
- Add the AI Filter component to your workflow.
- Connect the Data input to the output of a data‑loading component (e.g., a CSV reader).
- Select a Model (e.g., OpenAI GPT‑4).
- Set Max Chunks if you want to limit the amount of data processed.
- Enter a Query such as “Show me all records where the status is ‘completed’.”
- Run the workflow.
- The Filter output will contain only the rows that match the query, ready to be passed to a chart, report, or another processing step.
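The steps above can be approximated in plain Python. Here the standard-library `csv` module plays the role of the CSV reader, and an exact status match stands in for the LLM's judgment:

```python
import csv
import io

# Stand-in for the output of a CSV-reading component (the Data input).
raw = "id,status\n1,completed\n2,pending\n3,completed\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# From the Query: "Show me all records where the status is 'completed'."
filtered = [r for r in rows if r["status"] == "completed"]

# `filtered` is the Filter output: only rows 1 and 3, ready for the next step.
```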
Related Components
- Data Loader – Bring data into the dashboard from files or databases.
- Data Visualizer – Create charts and tables from filtered data.
- Text Summarizer – Summarize the filtered text for quick insights.
- Condition Checker – Use the filtered data to trigger alerts or actions.
Tips and Best Practices
- Keep your queries short and clear; the LLM interprets them best when they’re concise.
- Use Max Chunks to speed up processing on very large datasets.
- Combine the Tool output with other LLM components to build conversational workflows that can filter data on the fly.
- Test the component with a small sample of data first to ensure the query behaves as expected.
Security Considerations
- The component sends data to the chosen LLM, so sensitive information should be handled according to your organization’s data‑privacy policies.
- If you’re using a private or on‑prem LLM, the data stays within your infrastructure.
- Always review the LLM’s privacy policy and data‑handling practices before connecting it to the component.