Hi. I’ve tested this locally, and it works as expected: I get a “Something went wrong” error if I enter a wrong API key. Could this be the issue in your case? Did you check that the API key is working?
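If it helps to rule the key itself in or out, here is a minimal sketch for checking it outside WordPress, assuming Python 3 with only the standard library and the standard https://api.openai.com/v1/models endpoint (the placeholder key is just an example to replace with your own):

import json
import urllib.request
import urllib.error

API_KEY = "sk-..."  # replace with the key generated at https://platform.openai.com/api-keys

# Listing the available models is a cheap way to test authentication:
# a valid key returns HTTP 200, a bad or revoked key returns HTTP 401.
req = urllib.request.Request(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
try:
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
        print("Key accepted; models visible:", len(data.get("data", [])))
except urllib.error.HTTPError as err:
    # The response body usually states the exact reason (invalid key, quota, etc.)
    print("Request failed:", err.code, err.read().decode())

If that script reports 401, the problem is the key itself rather than the plugin.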
Thanks for your reply.
This is my first time using an API key, so…
I went to https://platform.openai.com/api-keys, generated a key with all permissions, and then copied/pasted it into the OpenAI settings.
Is that right?
I’ll generate a new one, put it back in, and come back with the result.
Do you have an example question you’d suggest for a parallel test?
To be continued
I did some digging, and I think that OpenAI doesn’t have a free tier anymore, so that might be the cause. On your OpenAI account, do you have a credit card set up?
In any case, I’m preparing an update to the plugin to allow users to add their own provider URL and model, so that it isn’t necessary to be tied to OpenAI.
Hello, thank you for looking into this.
I have a pro subscription, with a bank card on file and access to usage, so a priori it isn’t a credit problem.
Hi!
I’m not sure why this is failing in your case. Perhaps there is a conflict with the OpenAI model, because when this feature was implemented, it supported a specific model. In any case, version 2.8.6 added more flexibility there, and you can go to the plugin settings and set a specific API provider and AI Model.
Thank you very much for your research. I will test it and keep you informed. Have a nice day.
Hi,
As promised, here are the rest of the tests:
With OpenAI
"ai_provider": "https://api.openai.com/v1/chat/completions",
"ai_model": "gpt-3.5-turbo",
"ai_api_key": "sk-XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
Error message: Something went wrong
With Gemini
"ai_provider": "https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-pro:generateContent",
"ai_model": "gemini-1.5-pro",
"ai_api_key": "AIzaSyDXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
Error message: Something went wrong
So I’ll be using it without AI then; too bad. Thanks for the plugin anyway.
For the OpenAI test it’s strange, because I’ve been using it myself with the same configuration, https://api.openai.com/v1/chat/completions and gpt-3.5-turbo.
For Gemini, though, that doesn’t seem to be the correct endpoint; can you try the URL https://generativelanguage.googleapis.com/v1beta/chat/completions? Also, in my tests I’ve used gemini-1.5-flash, which worked as expected.
On both, I’ve used their free versions, and tested both locally and on a demo server.
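If it still fails after that change, one way to see what the generic “Something went wrong” message is hiding is to replay the same request outside WordPress. Here is a minimal sketch, assuming Python 3 with only the standard library, a provider that accepts OpenAI-style chat completions requests, and the same provider URL, model, and key you enter in the plugin settings:

import json
import urllib.request
import urllib.error

# Use the same values as in the plugin settings.
PROVIDER_URL = "https://api.openai.com/v1/chat/completions"
MODEL = "gpt-3.5-turbo"
API_KEY = "sk-..."  # placeholder, replace with your own key

payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Reply with the single word: pong"}],
}
req = urllib.request.Request(
    PROVIDER_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
try:
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
        print(body["choices"][0]["message"]["content"])
except urllib.error.HTTPError as err:
    # The raw response body explains the failure (wrong endpoint, unknown model, billing, etc.)
    print("HTTP", err.code, err.read().decode())

The error printed there should tell us whether the problem is the endpoint, the model name, or the account.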
Thanks for the Gemini clarification; I’ll test it soon.
Could this be due to the site running on WPMU (multisite)?
I’ll also test it with OAI on a standalone site to see.
To be continued