LLM Cache

The Settings > Configuration > LLM Cache window lets you set LLM cache parameters for your tenant. These settings apply to any bot you create in your Aisera tenant.

Tenant Configuration Options When Using a Large Language Model (LLM)

You can change the following settings when configuring your tenant/platform instance.

| Label | Type | Default | Description |
| --- | --- | --- | --- |
| Cache Length | Integer | 1000 |  |
| Cache Temperature | Decimal | 0.1 |  |
| Semantic Similarity Threshold | Decimal | 0.95 |  |
| Lower Semantic Similarity Threshold | Decimal | 0.75 |  |
| Keyword Similarity Threshold | Decimal | 0.999 |  |
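
The table above lists only the types and default values of these settings. As a rough illustration of how such thresholds commonly interact in a semantic LLM cache, the sketch below shows one plausible lookup flow: a query whose embedding similarity meets the Semantic Similarity Threshold is treated as a cache hit, while a borderline match above the Lower Semantic Similarity Threshold must also pass the Keyword Similarity Threshold. The constant names, the `CacheEntry` structure, and this two-tier rule are assumptions made for illustration only, not Aisera's actual implementation; Cache Temperature is omitted because its role is not described here.

```python
# Hypothetical sketch of a semantic-cache lookup using thresholds like those above.
# Names and logic are illustrative assumptions, not Aisera's API.
from dataclasses import dataclass

CACHE_LENGTH = 1000                        # maximum number of cached entries
SEMANTIC_SIMILARITY_THRESHOLD = 0.95       # confident semantic match
LOWER_SEMANTIC_SIMILARITY_THRESHOLD = 0.75 # borderline match, needs extra evidence
KEYWORD_SIMILARITY_THRESHOLD = 0.999       # keyword check for borderline matches

@dataclass
class CacheEntry:
    query: str
    response: str
    embedding: list[float]

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def keyword_similarity(a: str, b: str) -> float:
    # Jaccard overlap of word sets as a simple stand-in keyword score.
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def lookup(query: str, query_embedding: list[float],
           cache: list[CacheEntry]) -> str | None:
    """Return a cached response, or None to fall through to the LLM."""
    for entry in cache:
        score = cosine_similarity(query_embedding, entry.embedding)
        if score >= SEMANTIC_SIMILARITY_THRESHOLD:
            return entry.response  # confident semantic match
        if (score >= LOWER_SEMANTIC_SIMILARITY_THRESHOLD
                and keyword_similarity(query, entry.query) >= KEYWORD_SIMILARITY_THRESHOLD):
            return entry.response  # borderline match backed by keyword overlap
    return None                    # cache miss
```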
