LLM Cache

The Settings > Configuration > LLM Cache window lets you set LLM cache parameters for your Aisera platform (tenant) instance. These settings apply to every bot you create on this Aisera instance.

Tenant Configuration Options if Using a Large Language Model (LLM)

You can change the following settings when configuring your tenant/platform instance.

| Label | Field Parameter | Type | Default | Description |
| --- | --- | --- | --- | --- |
| Cache Length | | Integer | 1000 | |
| Cache Temperature | | Decimal | 0.1 | |
| Semantic Similarity Threshold | | Decimal | 0.95 | |
| Lower Semantic Similarity Threshold | | Decimal | 0.75 | |
| Keyword Similarity Threshold | | Decimal | 0.999 | |
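
The two semantic thresholds plus a keyword threshold suggest a tiered matching strategy: a new prompt can reuse a cached response when its semantic similarity to a stored prompt clears the higher bar, or when a weaker semantic match is backed by a near-exact keyword match. The Python sketch below illustrates that idea under stated assumptions; the class name `LLMCache`, the similarity functions, and the matching logic are illustrative placeholders rather than Aisera's actual implementation, and the Cache Temperature parameter is not modeled.

```python
# Minimal sketch of a semantic LLM response cache using the parameters above.
# Not Aisera's implementation; the similarity measures are simple stand-ins.

from collections import OrderedDict
from difflib import SequenceMatcher


class LLMCache:
    def __init__(self, cache_length=1000,
                 semantic_similarity_threshold=0.95,
                 lower_semantic_similarity_threshold=0.75,
                 keyword_similarity_threshold=0.999):
        self.cache_length = cache_length
        self.semantic_threshold = semantic_similarity_threshold
        self.lower_semantic_threshold = lower_semantic_similarity_threshold
        self.keyword_threshold = keyword_similarity_threshold
        self._entries = OrderedDict()  # prompt -> response, oldest first

    def _keyword_similarity(self, a: str, b: str) -> float:
        # Stand-in for a real keyword/lexical similarity measure.
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    def _semantic_similarity(self, a: str, b: str) -> float:
        # Stand-in for cosine similarity between prompt embeddings
        # (here: token-set Jaccard overlap).
        tokens_a, tokens_b = set(a.lower().split()), set(b.lower().split())
        if not tokens_a or not tokens_b:
            return 0.0
        return len(tokens_a & tokens_b) / len(tokens_a | tokens_b)

    def get(self, prompt: str):
        for cached_prompt, response in self._entries.items():
            semantic = self._semantic_similarity(prompt, cached_prompt)
            # Confident hit: semantic similarity alone clears the high bar.
            if semantic >= self.semantic_threshold:
                return response
            # Borderline semantic match still counts if the keyword overlap
            # is nearly exact.
            if (semantic >= self.lower_semantic_threshold
                    and self._keyword_similarity(prompt, cached_prompt)
                    >= self.keyword_threshold):
                return response
        return None  # cache miss: the caller falls through to the LLM

    def put(self, prompt: str, response: str):
        self._entries[prompt] = response
        # Evict the oldest entries once the cache exceeds its configured length.
        while len(self._entries) > self.cache_length:
            self._entries.popitem(last=False)
```

With the default values above, a prompt that is nearly identical to a cached one (semantic similarity of at least 0.95) would return the cached response, while a moderately similar prompt (at least 0.75) would be served from the cache only if its keyword similarity is at least 0.999.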
