Viewing 1 replies (of 1 total)
  • Plugin Author Jordy Meow (@tigroumeow)

    Hi @newtonvibe,

    What do you mean by your own API? If you have your own LLM running locally or on your server, you can indeed generate images (via Ollama, for example). Could you tell me more about the API and model you’re running, and how you set it up?

    Also worth mentioning: I’m shipping a new provider in AI Engine to make self-hosted LLMs easier to use. It’ll be in the next version, released tomorrow.
