• Resolved Navida

    (@landscape113)


    When using Custom Mode and GPT-4 with a maximum token limit of 8000, the output is still capped at roughly 1,100 tokens. This cuts the article off, leaving an output of approximately 700 words.

  • Plugin Author senols

    (@senols)

    Hello,

    An 8000-token limit doesn’t guarantee that the output will use all 8000 tokens; it is a ceiling, not a target. You can experiment with the same prompt in the OpenAI playground to see the actual word count the model generates.

    Keep in mind that GPT-4 may not always generate content up to the maximum token limit. It can reach a natural conclusion in the generated text and stop well before using the entire token allowance.
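    The distinction above can be sketched with a toy generation loop (this is illustrative only, not the plugin's or OpenAI's actual code): the model stops at whichever comes first, its own natural end-of-sequence token or the `max_tokens` cap.

    ```python
    # Toy sketch: max_tokens is a ceiling on generation, not a target length.
    # A model stops either when it emits its end-of-sequence marker or when
    # the cap is reached, whichever happens first.
    def generate(model_tokens, max_tokens, eos="<eos>"):
        out = []
        for tok in model_tokens:
            if len(out) >= max_tokens:  # cap reached: output gets truncated here
                break
            if tok == eos:              # model ends naturally, cap never reached
                break
            out.append(tok)
        return out

    # Even with max_tokens=8000, the model "decides" to stop after 3 tokens.
    print(generate(["The", "quick", "fox", "<eos>", "jumps"], max_tokens=8000))
    ```

    In the reported case the reverse applies: the model stopped around ~1,100 tokens on its own, well under the 8000-token ceiling.
    
    
    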

  • The topic ‘Bug: Max Tokens in GPT-4’ is closed to new replies.