Comment on Ask ChatGPT to pick a number between 1 and 100

kciwsnurb@aussie.zone ⁨7⁩ ⁨months⁩ ago

The temperature scale, I think. You divide the logit output by the temperature before feeding it to the softmax function. A larger temperature yields a higher-entropy (flatter) distribution, and a smaller temperature yields a lower-entropy (peakier) one.
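
A minimal sketch of what that looks like, with made-up logit values (any real model would produce its own):

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def temperature_softmax(logits, temperature):
    # Divide each logit by the temperature before the softmax
    return softmax([x / temperature for x in logits])

def entropy(probs):
    # Shannon entropy in nats
    return -sum(p * math.log(p) for p in probs if p > 0)

logits = [2.0, 1.0, 0.5, -1.0]  # hypothetical logit outputs
cold = temperature_softmax(logits, 0.5)  # sharper, lower entropy
hot = temperature_softmax(logits, 2.0)   # flatter, higher entropy
```

Dividing by a temperature above 1 shrinks the gaps between logits, so the softmax output is flatter (higher entropy); a temperature below 1 widens the gaps and concentrates probability on the largest logit.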
