Description
In May, we introduced enhancements to LivePerson’s Prompt Library, enabling prompt engineers to better manage Conversational Cloud prompts. These updates are now also available in Conversation Assist.
Additionally, we've added two new prompt settings:
- Max. tokens: Set the maximum number of output tokens from the LLM, useful for controlling response length and costs.
- Temperature: Control the randomness of responses (0 yields consistent responses; 1 yields more varied, human-like ones).
These updates provide greater flexibility and control over prompt creation.
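For readers who want to see how these two settings typically behave when calling an LLM directly, here's a minimal sketch using the OpenAI Python SDK. The model name and prompt text are illustrative assumptions, not LivePerson specifics; the Prompt Library exposes the same two knobs in its UI.

```python
# Illustrative sketch only: model and prompt are assumptions,
# not LivePerson defaults.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[
        {"role": "user", "content": "Summarize this conversation for the agent."}
    ],
    max_tokens=150,    # cap output length: shorter answers, lower cost
    temperature=0.2,   # near 0: consistent, deterministic-leaning responses
)
print(response.choices[0].message.content)
```

Raising temperature toward 1 trades consistency for variety, while lowering the max token cap trims both response length and cost.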
Related Links
Check out the Release Notes here.
Status
Live