Introduction
If you’ve invested in an in-house LLM, you can use it to power the Generative AI features in your Conversational Cloud solution. This lets you align the solution with your brand’s overall LLM strategy.
Key benefits
- Enhanced customization: Use a model you’ve fine-tuned on your brand’s data.
- Security and compliance: Configure your LLM per your security and compliance needs.
- Increased efficiency and optimized performance: Establish your own SLAs and ensure consistent performance that’s tailored to your needs.
- Cost transparency: Gain full clarity on the model’s traffic and cost.
- Flexibility (Coming soon!): Bringing your own LLM won’t mean you have to use it everywhere in your Conversational Cloud solution. Soon, you’ll be able to use your in-house LLM for some Conversational AI use cases while using the LLMs available via LivePerson for others.
Get started
For more information, including onboarding details, supported LLMs, and best practices, see our in-depth article in the Developer Center.