After receiving a Crisp message, the system first attempts to match Keyword Replies. If no match is found, it proceeds to the AI Smart Reply.
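The exact dispatch logic is internal to the system, but the sketch below illustrates the order described above: keyword rules are checked first, and the AI is only called when nothing matches. The names (`KeywordRule`, `handleIncomingMessage`, `generateAiReply`) are hypothetical and not part of the Crisp API.

```ts
// Hypothetical sketch of the reply order: Keyword Replies first, AI second.
interface KeywordRule {
  keywords: string[]; // keywords to look for in the incoming message
  reply: string;      // canned reply sent when a keyword matches
}

async function handleIncomingMessage(
  text: string,
  keywordRules: KeywordRule[],
  generateAiReply: (text: string) => Promise<string>
): Promise<string> {
  const lower = text.toLowerCase();

  // 1. Try to match a Keyword Reply.
  for (const rule of keywordRules) {
    if (rule.keywords.some((k) => lower.includes(k.toLowerCase()))) {
      return rule.reply;
    }
  }

  // 2. No keyword matched: fall back to the AI Smart Reply.
  return generateAiReply(text);
}
```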
Please note: the AI constructs its response based on the following data points:
- User Profile (reported by your website or the user).
- Geographic Location.
- Operating System & Browser.
- Additional Data (custom data attributes reported by your site; see the sketch after this list).
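For the User Profile and Additional Data points, a website typically reports them through the Crisp chatbox JavaScript SDK. A minimal sketch, assuming the standard `$crisp` queue is loaded on the page; the attribute names and values are examples only:

```ts
// Assumes the Crisp chatbox JavaScript SDK is loaded, exposing the global `$crisp` queue.
declare const $crisp: { push: (args: unknown[]) => void };

// User Profile: basic identity fields.
$crisp.push(["set", "user:nickname", ["Jane Doe"]]);
$crisp.push(["set", "user:email", ["jane@example.com"]]);

// Additional Data: custom attributes (keys and values here are just examples).
$crisp.push(["set", "session:data", [[
  ["plan", "pro"],
  ["signup_date", "2024-01-15"],
]]]);
```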
Effective prompts can significantly improve communication efficiency. Below is a recommended template:
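The exact wording depends on your product; the following is only an illustrative sketch of such a template, with the product name and rules as placeholders:

```
You are the customer support assistant for [Product Name].
Answer only questions about [Product Name], using the knowledge provided to you.
If you are unsure of an answer, ask the visitor to leave their email so a human agent can follow up.
Reply in the visitor's language, keep answers concise, and never invent features, prices, or policies.
```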
You can convert your documentation into embeddings for the Knowledge Base. Each entry only needs a "Title" and "Content." This process consumes a very small portion of your balance but greatly improves AI accuracy.
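Conceptually, each knowledge base entry is just a title/content pair. The shape below is only an illustration of that structure; the interface name and sample entries are hypothetical:

```ts
// Illustrative shape of a knowledge base entry: a title and the content to be embedded.
interface KnowledgeEntry {
  title: string;   // e.g. the page or FAQ heading
  content: string; // the body text that gets vectorized
}

const entries: KnowledgeEntry[] = [
  {
    title: "Refund policy",
    content: "Orders can be refunded within 14 days of purchase...",
  },
  {
    title: "Supported browsers",
    content: "The dashboard supports the latest versions of Chrome, Firefox, and Safari...",
  },
];
```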
You can choose to use Prompts only without a Knowledge Base. However, if you use the Knowledge Base, you must also configure a Prompt so the AI understands its identity and how to use the retrieved information.
| Aspect | Knowledge Base (Embeddings) | Pure Prompt Replies |
|---|---|---|
| Logic | Vectorizes data. Retrieves relevant snippets first, then sends them to the AI with the prompt. | Writes the entire knowledge base into the prompt. AI references the full context every time. |
| Cost | Only consumes tokens for Question + Specific Snippets. More cost-effective for large datasets. | Consumes tokens for the Entire Dataset per message. Costs increase significantly as the base grows. |
| Speed | Extra retrieval step, but smaller context leads to faster model inference. | No retrieval step, but a massive context can significantly slow down model processing. |
| Quality | Responses are focused and relevant, staying strictly on topic based on matched data. | Provides comprehensive info, but prone to redundancy, rambling, or "hallucinations." |
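To make the Logic and Cost rows concrete, the sketch below contrasts the two approaches: the embedding path sends only the question plus the top-matching snippets, while the pure-prompt path sends the whole knowledge base with every message. The helpers (`embed`, `topKByCosineSimilarity`, `callModel`) are hypothetical placeholders, not part of any specific API.

```ts
// Hypothetical helpers: an embedding call, a similarity search, and a chat-model call.
declare function embed(text: string): Promise<number[]>;
declare function topKByCosineSimilarity(
  query: number[],
  entries: { title: string; content: string; vector: number[] }[],
  k: number
): { title: string; content: string }[];
declare function callModel(systemPrompt: string, question: string): Promise<string>;

// Knowledge Base (Embeddings): retrieve first, then prompt with only the matched snippets.
async function embeddingReply(
  question: string,
  prompt: string,
  entries: { title: string; content: string; vector: number[] }[]
): Promise<string> {
  const queryVector = await embed(question);
  const snippets = topKByCosineSimilarity(queryVector, entries, 3);
  const context = snippets.map((s) => `${s.title}: ${s.content}`).join("\n");
  // Token cost ≈ prompt + question + a few snippets.
  return callModel(`${prompt}\n\nRelevant knowledge:\n${context}`, question);
}

// Pure Prompt Replies: the entire knowledge base rides along with every message.
async function purePromptReply(
  question: string,
  prompt: string,
  fullKnowledgeBase: string
): Promise<string> {
  // Token cost ≈ prompt + the whole knowledge base + question, every time.
  return callModel(`${prompt}\n\n${fullKnowledgeBase}`, question);
}
```

This is also why the embedding approach stays affordable as the knowledge base grows: only the few matched snippets accompany each question, regardless of how large the full dataset becomes.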