Interface: GenerationConfig
Defined in: types/llm.ts:327
Object configuring generation settings.
Properties
batchTimeInterval?
optional batchTimeInterval: number
Defined in: types/llm.ts:331
Upper limit on the time interval between consecutive token batches.
outputTokenBatchSize?
optional outputTokenBatchSize: number
Defined in: types/llm.ts:330
Soft upper limit on the number of tokens in each token batch. In some cases a batch can contain more tokens, e.g. when it would otherwise end with a special emoji join character.
temperature?
optional temperature: number
Defined in: types/llm.ts:328
Scales the output logits by the inverse of the temperature. Controls the randomness / creativity of the generated text.
topp?
optional topp: number
Defined in: types/llm.ts:329
Samples only from the smallest set of tokens whose cumulative probability exceeds topp (nucleus sampling).
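As a sketch of how these properties fit together, the following redeclares the interface as described above and builds a sample config. The field names match the reference; the concrete values and units are illustrative assumptions, not library defaults.

```typescript
// Sketch of GenerationConfig per the reference above; all fields optional.
interface GenerationConfig {
  temperature?: number;          // scales logits by 1/temperature
  topp?: number;                 // nucleus-sampling cumulative-probability cutoff
  outputTokenBatchSize?: number; // soft cap on tokens per emitted batch
  batchTimeInterval?: number;    // upper bound on time between batches
}

// Illustrative values (assumptions, not documented defaults):
const config: GenerationConfig = {
  temperature: 0.7,          // < 1 sharpens the distribution, > 1 flattens it
  topp: 0.9,                 // sample from smallest token set with cumulative p > 0.9
  outputTokenBatchSize: 16,  // may be exceeded, e.g. around emoji join characters
  batchTimeInterval: 100,    // units depend on the library's convention
};

console.log(config.temperature, config.topp);
```

Every property is optional, so an empty object `{}` is also a valid `GenerationConfig` that falls back to the library's defaults.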