New updates: Voice bots, KnowledgeAI, Conversation Assist & Conversation Rollover enhancements
➡️ Exact delivery dates may vary, and brands may therefore not have immediate access to all features on the date of publication. Please contact your LivePerson account team for the exact dates on which you will have access to the features.
🛑 The timing and scope of these features or functionalities remain at the sole discretion of LivePerson and are subject to change.
Conversation Builder
Features
Voice bots: Transferring calls to agents? Setup is more intuitive
We’ve just given the SIP Transfer interaction a facelift to make working with it more intuitive during bot building.
You still use the interaction to transfer the conversation to a human agent in a third-party contact center, by making either:
- A SIP call over a SIP trunk
- A call to the contact center’s E.164 number over the public switched telephone network (PSTN)
So what’s changed specifically?
- The interaction is now called “Transfer Call.” That’s a simple and clear name that accounts for both types of calls that you can make.
- There’s a new Transfer type dropdown. Use this to specify the type of call: SIP or E.164. (Both target formats are illustrated in the sketch after this list.)
- There’s a new Use caller’s phone number setting. Turn this on if you want the caller’s phone number to be shown to the recipient as the calling number when the call is transferred. Turn this off to use the Voice bot’s phone number instead. For new bots, the default is On. For existing bots, the default is Off; this maintains backwards compatibility. And it means there’s no change to the runtime experience of your existing Voice bots.
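For reference, here’s a tiny sketch of the two kinds of transfer targets in practice. The values below are illustrative examples only, not Conversation Builder configuration code:

```python
# Illustrative examples of the two transfer-target formats; not Conversation Builder code.
transfer_type = "SIP"  # or "E.164", per the new Transfer type dropdown

if transfer_type == "SIP":
    # A SIP URI, reached over your SIP trunk (example address)
    target = "sip:support-queue@contact-center.example.com"
else:
    # An E.164-formatted number, reached over the PSTN (example number)
    target = "+14155550100"

print(f"Transferring the call to {target}")
```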
We hope you find these changes make setup more intuitive.
Voice bots: Transferring calls to Messaging? Setup is a whole lot easier
We understand the pain point: During an automated voice conversation, in some bot flows you want to transfer the call to a messaging channel. But this hasn’t been easy. It’s required not one…not two…but three integration interactions to get it done.
We’re delighted to say that there’s good news on this front. We’ve just added a new Transfer to Messaging integration interaction. This powerhouse interaction makes all three necessary Connect-to-Messaging (C2M) API calls for you.
You’ll need to build out the bot flows that handle failures from these API calls. You’ll also need to build out the flow that handles the automated voice call.
Depending on your needs, the voice and messaging conversations can run in parallel as two separate conversations.
There are several important use cases where you might want to transfer automated voice conversations to messaging:
- Your consumers prefer messaging. You interact with them over voice and messaging, but you aim to always redirect the conversation to their channel of choice.
- You want to reduce the volume of calls handled by your agents. High call volume can overwhelm agents and lead to lower call answer rates. You intend to redirect calls to messaging to alleviate these pain points and raise consumer satisfaction (CSAT).
- You mostly or solely communicate with your consumers over messaging. In this case, the primary and perhaps only purpose of your voice bot is to redirect your consumers to a messaging channel.
Take advantage of the new Transfer to Messaging interaction to reach these goals. Setup is now a whole lot simpler and quicker.
Messaging bots: Capture and transmit sensitive consumer data securely
Using secure forms to collect sensitive consumer data (credit card info, etc.) is important for many reasons, such as:
- Data privacy: Secure forms help maintain data privacy and confidentiality because the data is encrypted during transmission.
- Data integrity: Encryption prevents unauthorized modifications, ensuring the data submitted by consumers remains unchanged during transmission.
- Prevention of data breaches: Without encryption, data transmitted over the Internet can be intercepted by malicious actors, leading to potential data leaks and breaches.
- Trust and reputation: Consumers are more likely to trust a website or organization that prioritizes their data security.
- Compliance with regulations: Depending on your jurisdiction, there might be data protection laws that require you to implement appropriate security measures when handling consumer data.
Conversation Builder is thrilled to announce a new Secure Forms interaction that you can use in Messaging bots (only).
Consumer experience
1 - During an automated conversation, the secure form invitation is presented to the consumer. Once the consumer clicks the invitation link, the form is presented in a slideout window:
2 - The consumer enters their info and submits the form.
3 - The data is saved to a PCI-compliant vault, so it can be used for a short period of time. Also, the bot is notified of the successful form submission.
4 - Most commonly, the conversation is then transferred to an agent that’s trained in handling sensitive info.
5 - The agent then handles the info as required.
Setup
1 - Request that your LivePerson representative activate 1) the Secure Form feature and 2) the Secure Form Studio for your account. This requires that we flip several switches under the hood. Learn about account setup and necessary permissions in the Knowledge Center.
2 - In LivePerson’s self-service Secure Form Studio, create the secure form and customize the styling for your brand. You can access the studio via the Management Console in Conversational Cloud.
3 - In the Conversation Builder bot, add the Secure Form interaction and configure it per your requirements.
In most cases, after the consumer enters their info and submits the secure form, you’ll want to transfer the conversation to an agent who’s trained to handle sensitive data. The interaction does this for you, transferring the conversation to the skill or agent that you specify in its settings. But this success flow isn’t required. Handle both the success and failure flows as you require.
Keep in mind that, currently, the bot cannot access the data provided via the secure form, so you can’t use the data to drive or otherwise inform the bot flow.
4 - Deploy the bot as normal.
Get started today: Take advantage of the new Secure Forms interaction for Conversation Builder bots to offer a safe and reliable online environment for your consumers and your brand.
Learn more about LivePerson secure forms.
KnowledgeAI
Features
AI Search for everyone: Return great answers in 50+ languages
Is your Conversational AI solution using a non-English knowledge base to return answers to consumer queries? If Yes, its performance just got much better. Automatically.
Before this release, only English-language knowledge bases could take advantage of our powerful and easy-to-use AI Search method for answer retrieval. But now AI Search can be used with knowledge bases in over 50 languages.
Call to action
For most brands, there’s no call to action.
Conversation Assist uses the “KnowledgeAI” search offering to retrieve answers from KnowledgeAI, so you’re good to go here. You’re reaping the benefits already, as this search offering includes AI Search.
In a Conversation Builder bot that uses a language other than English and that uses a KnowledgeAI integration:
- If the KnowledgeAI integration uses the “KnowledgeAI” search offering (recommended), you’re good to go here too. You’re reaping the benefits already. Because again, this search offering includes AI Search.
- If your KnowledgeAI integration uses the “Intents only” search offering, that’s fine too. But in this case you’re not taking advantage of AI Search. We recommend that you switch to the “KnowledgeAI” search offering. It uses AI Search as a fallback after the Intents search, so this is a powerful combination.
LivePerson has thoroughly tested AI Search in English and Spanish. As always, test answer retrieval thoroughly in your language before deploying.
Not familiar with AI Search? It’s our context-aware, phrasing-agnostic search method that uses language embedding models to understand the intent in the consumer’s query and match it to content in your knowledge base. No setup required. Read our blog.
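Curious how this works under the hood? Here’s a minimal conceptual sketch of embedding-based retrieval. It’s not LivePerson’s implementation; it uses the open-source sentence-transformers library and a public multilingual model purely as stand-ins for the models behind AI Search:

```python
# Conceptual sketch of embedding-based retrieval, NOT LivePerson's implementation.
# sentence-transformers and the model below are stand-ins for the real AI Search models.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

articles = [
    "How do I reset my password?",
    "What is your refund policy?",
    "How can I track my order?",
]
article_vectors = model.encode(articles, convert_to_tensor=True)

# The consumer's phrasing doesn't match any article word-for-word.
query = "I forgot my login credentials"
query_vector = model.encode(query, convert_to_tensor=True)

# Cosine similarity scores the articles by meaning, not by shared keywords.
scores = util.cos_sim(query_vector, article_vectors)[0]
print(articles[int(scores.argmax())])  # "How do I reset my password?"
```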
Generative AI: Our solution shifts from monolingual to multilingual
The feature enhancements discussed below are experimental only. Don’t hesitate to get started using them in your demo solutions to explore the capabilities of Generative AI. Learn alongside us. And share your feedback! As always, proceed with care: Test thoroughly before rolling out to Production.
The world of Generative AI is changing quickly. And at LivePerson, we continue to innovate on our solution. We’re thrilled to announce that our latest feature enhancements center on language support. You can now:
- Get answers enriched via Generative AI in 50+ languages, not just English
- Support cross-lingual queries and use mixed-language knowledge bases
Answers enriched via Generative AI in 50+ languages
LLM-enriched answers are smarter, warmer, and better. And now you can get them in 50+ languages. French. Italian. Japanese. The list goes on.
Use KnowledgeAI’s LLM-based “answer enrichment service” to safely and productively take advantage of the unparalleled capabilities of Generative AI within our trusted Conversational AI platform. You can offer LLM-enriched answer recommendations to your agents within Conversation Assist. And you can automate enriched answers within Conversation Builder bots. In 50+ languages.
Get started today:
1 - Learn about our trustworthy Generative AI solution.
2 - Activate our Generative AI features and understand next steps.
Support for cross-lingual queries and mixed-language knowledge bases
Many of our brands have a global presence or a presence in countries with multiple national languages. So naturally they want their Conversational AI solution to support multilingual scenarios. If this is you, now is the time to start exploring. KnowledgeAI offers experimental support for:
- Cross-lingual queries
- Mixed-language knowledge bases
Let’s define terms.
Cross-lingual queries are those where the consumer’s query is in one language, but the knowledge article is in another. For example:
- The consumer’s query is in Spanish, but the article is in English.
- The consumer’s query is in Italian, but the article is in German.
Mixed-language knowledge bases are those that contain content in two or more languages. By their nature, mixed-language knowledge bases can lead to cross-lingual queries.
Both of these features are intended only for solutions that also use answers enriched via Generative AI. Why? Consider the following scenario:
1 - The consumer sends a query in Spanish.
2 - An English-language article in the knowledge base is matched and sent to the LLM service for enrichment.
3 - The enriched answer is returned from the LLM service.
In what language is the enriched answer? Our early testing indicates that the LLM service is likely to generate an answer in the same language as the consumer’s query. We’re researching this now and working to strengthen this outcome.
Again, support for cross-lingual queries and mixed-language knowledge bases is experimental. Explore these features in your demo messaging bots (there's no support in voice bots) and in your Conversation Assist solution. Learn alongside us. And share your feedback with your LivePerson representative! As always, proceed with care: Test thoroughly before rolling out to Production.
Setup includes just one step: Configure the Language setting for the knowledge base as follows:
- If the content in the knowledge base is in a language other than English, the choice here is easy. Select the primary language of the content. Behind the scenes, when searching the knowledge base for articles, a multilingual embedding model is used.
- If the content in the knowledge base is in English, and you only need to support English-language consumer queries, the choice here is easy too. Select the variant of English from the dropdown. In this case, an English-language embedding model is used.
- If the content in the knowledge base is entirely in English and you need to support cross-lingual queries, or if it’s in English and one or more other languages (which also means you need to support cross-lingual queries), select “Other” from the dropdown. In this case, the multilingual embedding model is used. The multilingual model is required to support cross-lingual queries.
The multilingual embedding model performs very well. However, the performance of the English-language model is even better. So, if your content is mostly in English and the queries of your consumers are mostly in English, we recommend that you select a variant of English from the dropdown. Your choice will depend on your use case and priorities.
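To recap the guidance above in a few lines, here’s a short sketch of the decision logic. The function and labels are our own illustration, not product code:

```python
# A sketch of the Language-setting guidance above; the function name and labels are hypothetical.
def recommended_language_setting(kb_languages: set[str], cross_lingual_queries: bool) -> str:
    english_only = kb_languages == {"English"}
    if english_only and not cross_lingual_queries:
        return "a variant of English"  # English embedding model: best performance
    if english_only or len(kb_languages) > 1:
        return "Other"                 # multilingual embedding model, required for cross-lingual queries
    return next(iter(kb_languages))    # single non-English language: select it as the primary language

print(recommended_language_setting({"English"}, cross_lingual_queries=False))           # a variant of English
print(recommended_language_setting({"English", "German"}, cross_lingual_queries=True))  # Other
print(recommended_language_setting({"French"}, cross_lingual_queries=False))            # French
```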
Mixed-language knowledge bases shouldn’t be common. Use them when you have some content that needs to go in one language and other content that needs to go in another language. Don’t include the same content in two or more languages. This isn’t efficient, as the LLM will translate the answer during enrichment.
Conversation Assist
Features
Trustworthy Generative AI: Automatic hallucination masking empowers agents to send the right info
Hallucinations in LLM-generated responses happen from time to time, so a Generative AI solution that’s trustworthy requires smart and efficient ways to handle them. In this release, we’re delighted to introduce hallucination masking to Conversation Assist.
Recently, we enhanced our LLM Gateway so that it can mark (not remove) hallucinated URLs, phone numbers, and email addresses. We did this so that, in turn, client applications like Conversation Assist can take advantage of this functionality.
Now, when Conversation Assist receives a recommended answer that contains a marked hallucination (URL, phone number, or email address), it automatically masks the hallucination and replaces it with a placeholder for the right info. These placeholders are visually highlighted for agents, so they can quickly see where to take action and fill in the right info.
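To make the mechanics concrete, here’s a rough sketch of the masking step. The `<hallucination>` markup and the placeholder wording are invented for illustration; they aren’t the LLM Gateway’s actual format:

```python
import re

# Hypothetical markup and placeholder wording, for illustration only.
MARKED = re.compile(r'<hallucination type="(url|phone|email)">.*?</hallucination>')

PLACEHOLDERS = {
    "url": "[AGENT: insert the correct link]",
    "phone": "[AGENT: insert the correct phone number]",
    "email": "[AGENT: insert the correct email address]",
}

def mask_hallucinations(answer: str) -> str:
    """Replace each marked hallucination with a highlighted placeholder for the agent."""
    return MARKED.sub(lambda m: PLACEHOLDERS[m.group(1)], answer)

raw = ('You can check your balance at '
       '<hallucination type="url">https://example.com/balance</hallucination>.')
print(mask_hallucinations(raw))
# You can check your balance at [AGENT: insert the correct link].
```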
Check out our animated example below: A hallucinated URL has been detected and masked. The agent sees the placeholder, enters the right URL, and sends the fixed response to the consumer.
To make quick work of filling in placeholders, make your contact info available as predefined content. This exposes the content on the Replies tab in the On-Demand Recommendations widget. The agent can copy the info with a single click and paste it where needed. We’ve illustrated this in our animation above.
Conversation Rollover
Features
New support for Manual Rollover and Rollback
We are excited to unveil the Inter-Account Conversation Transfer feature, a game-changing update designed to elevate your brand interactions. This release introduces various capabilities to enhance customer experiences and optimize agent workflows across your brand accounts.
Key Capabilities
Cross-Account Conversation Transfer:
Agents and bots can effortlessly transfer messaging conversations between brand accounts across all supported channels (web, Apple Messages for Business, SMS, etc.). Maintain a consistent customer experience while utilizing the full scope of LivePerson's messaging channels.
Agent Notes and Warm Transfers:
Agents can include agent notes and warm transfer messages when passing conversations to other agents or skills. This ensures smooth knowledge transfer and context preservation.
Smart Transfer Timing:
Transfers can occur only to destination skills during their working hours. A fallback option is available if the destination agents are offline or unavailable, ensuring uninterrupted customer support.
Two-Way Transfer Capability:
Agents can seamlessly transfer conversations back to the originating agent or skill from the sending account. This bi-directional transfer flexibility facilitates efficient resolution and support.
Customer Benefits
Consistency Redefined:
Deliver a uniform experience to customers engaging with your various brand accounts. Agents can seamlessly pick up conversations, eliminating the need for customers to repeat information.
Enhanced Agent Productivity:
Streamline agent workflows with automated conversation transfers. Agents can focus on issue resolution instead of manual account management, resulting in faster response times.
Brand-Centric Engagement:
Maintain distinct brand identities while still facilitating inter-account transfers. Showcase your brands' uniqueness without compromising on efficient support.
Manual Rollover for messaging requires backend enablement. Please get in touch with your LivePerson account team for more information.
User Walkthrough
Step 1: Identify Transfer Opportunity
While conversing with a customer, as an agent, you might realize that the discussion belongs to another account of the same brand. This is the perfect opportunity to provide seamless support by transferring the conversation to the appropriate skill in the other account.
Step 2: Initiate Transfer
1. Look for the "Transfer Conversation" Call-to-Action (CTA) within the conversation interface.
2. Click the "Transfer Conversation" button to begin the transfer process.
Step 3: Choose the Rollover Option
1. After clicking "Transfer Conversation," a new window will appear, showing transfer options.
2. Click "Rollover" to transfer the conversation to a skill in the other account.
3. Note: You can only transfer to the skills that were configured when manual rollover was enabled.
Step 4: Select Skill and Account
1. Choose the appropriate skill from the dropdown menu. This is the skill within the other account where you want to transfer the conversation.
2. Confirm that you have selected the correct skill and account.
Step 5: Confirm and Execute Transfer
1. Double-check the transfer details, including the chosen skill and account.
2. If everything is accurate, click "Transfer" to initiate the transfer.
Step 6: Conversation Successfully Transferred
The customer will now be seamlessly connected to the skill in the new account.
Step 7: Rollover and Rollback Options in the New Account
1. In the new account where the conversation has been transferred, agents can roll over the conversation to another account if necessary.
2. Similarly, they can roll back the conversation to the original account if the situation requires it.