Description

In May, we introduced enhancements to LivePerson’s Prompt Library, enabling prompt engineers to better manage Conversational Cloud prompts. These updates are now also available in Conversation Assist.

Additionally, we've added two new prompt settings:

  • Max. tokens: Set the maximum number of output tokens from the LLM, useful for controlling response length and costs.
  • Temperature: Control the randomness of responses (values near 0 yield consistent responses; values near 1 yield more varied, human-like responses).

These updates provide greater flexibility and control over prompt creation.
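
For context, here is a minimal sketch of what these two settings control in a typical LLM call. Note that in Conversational Cloud you configure them through the Prompt Library UI; the sketch below uses the OpenAI Python SDK as a stand-in, and the model name and prompt contents are assumptions for illustration only.

  # Illustrative sketch only: in Conversational Cloud these settings are
  # configured in the Prompt Library UI. The OpenAI Python SDK is used
  # here as a stand-in to show what the two parameters control.
  from openai import OpenAI

  client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

  response = client.chat.completions.create(
      model="gpt-4o-mini",  # hypothetical model choice
      messages=[
          {"role": "system", "content": "You are a concise support assistant."},
          {"role": "user", "content": "Summarize the customer's billing issue."},
      ],
      max_tokens=150,   # caps output length, which also caps per-response cost
      temperature=0.2,  # near 0 = consistent answers; near 1 = more varied
  )
  print(response.choices[0].message.content)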

Related Links

Check out the Release Notes for full details.

Status

Live
