• The app is working great (and thank you for the continuous updates; they are amazing). The only thing bothering me is that when I ask a question, the response takes a long time to appear: it shows "… thinking …" for a while before the answer is displayed. It feels slow, and I'd like it to respond quickly, like the GPT-4 version on chat.openai.com. My current settings are as follows:

    Mode = gpt-4
    Temperature = 0.5
    Max Tokens = 1024
    Top P = 0.5
    Best Of = 1
    Frequency Penalty = 0.6
    Presence Penalty = 0.6

Viewing 1 reply (of 1 total)
  • Plugin Author senols

    (@senols)

    Hello,

    In my experience, GPT-4 responses tend to be slower than those of the Turbo model.

    I would suggest using the comparison tool under “Content Writer – Comparison Tool” to compare each model’s response time and then decide which one best suits your needs.

    Additionally, it’s important to note that several factors can affect response time:

    1. Save Logs: If enabled, this option saves your logs, which takes a little extra time and can slow down responses.
    2. Save Prompts: The same is true for the “Save Prompts” option.
    3. Using Moderation: If activated, this feature adds extra time to check and moderate the outputs.
    4. Network: Your own network speed also affects how quickly responses arrive.
    5. Connection: The connection between your server and OpenAI’s servers influences response speed.
    6. Pinecone: If you’re using Pinecone, the communication latency between your server and Pinecone’s servers may also add to the response time.
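To see how much of the delay comes from the model itself rather than the factors above, you can time requests per model. This is a standalone sketch: `ask()` is a hypothetical stand-in for whatever client call the plugin makes, stubbed here so the timing harness runs on its own.

```python
import time

def ask(model: str, prompt: str) -> str:
    # Stand-in for a real API call; replace with your actual client call.
    return f"[{model}] response to: {prompt}"

def timed_ask(model: str, prompt: str):
    # Measure wall-clock time for a single request.
    start = time.perf_counter()
    reply = ask(model, prompt)
    elapsed = time.perf_counter() - start
    return reply, elapsed

for model in ("gpt-4", "gpt-3.5-turbo"):
    reply, seconds = timed_ask(model, "Hello")
    print(f"{model}: {seconds:.3f}s")
```

Running the same prompt through both models a few times and averaging the elapsed times gives a rough version of what the comparison tool does.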


The topic ‘App (thinking) slow response’ is closed to new replies.