Interface: GenerationConfig
Defined in: packages/react-native-executorch/src/types/llm.ts:247
Object configuring text generation settings.
Properties
batchTimeInterval?
optional batchTimeInterval: number
Defined in: packages/react-native-executorch/src/types/llm.ts:251
Upper limit on the time interval between consecutive token batches.
outputTokenBatchSize?
optional outputTokenBatchSize: number
Defined in: packages/react-native-executorch/src/types/llm.ts:250
Soft upper limit on the number of tokens in each token batch (in some cases a batch can contain more tokens, e.g. when it would otherwise end on a special emoji join character).
temperature?
optional temperature: number
Defined in: packages/react-native-executorch/src/types/llm.ts:248
Scales output logits by the inverse of the temperature (i.e. divides them by it), which controls the randomness / creativity of text generation: lower values make the output more deterministic, higher values more varied.
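To make the scaling concrete, here is an illustrative sketch of temperature applied to logits before sampling; this is not the library's internal implementation, only a demonstration of the effect the parameter has.

```typescript
// Illustrative only: temperature rescales logits before the softmax used for sampling.
function applyTemperature(logits: number[], temperature: number): number[] {
  // temperature < 1 sharpens the distribution (more deterministic),
  // temperature > 1 flattens it (more random / creative).
  return logits.map((logit) => logit / temperature);
}

// With temperature 0.5 the gaps between logits double,
// so the most likely token becomes even more likely after softmax.
const scaled = applyTemperature([2.0, 1.0, 0.1], 0.5); // [4.0, 2.0, 0.2]
```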
topp?
optional topp: number
Defined in: packages/react-native-executorch/src/types/llm.ts:249
Nucleus (top-p) sampling: only samples from the smallest set of tokens whose cumulative probability exceeds topp.
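For reference, a minimal sketch of assembling a GenerationConfig object. The import path assumes the type is re-exported from the package root, and how the object is ultimately passed to the model (e.g. via a generate call or hook options) should be checked against the library's LLM documentation.

```typescript
import type { GenerationConfig } from 'react-native-executorch';

// All fields are optional.
const generationConfig: GenerationConfig = {
  temperature: 0.7,        // slightly more deterministic than temperature 1.0
  topp: 0.9,               // nucleus sampling over the top 90% of probability mass
  outputTokenBatchSize: 8, // soft cap on tokens per emitted batch
  batchTimeInterval: 100,  // upper bound between batches (unit assumed; verify in the source)
};
```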